Hello there,

I am new to pymc and have an issue with a toy model with computes a value Z given a vector w and a matrix M.

Z = wM

Ultimately I would like this to be a complex expression

Z = jwM

however, I only need the absolute value of this expression and am quite happy to flatten the matrix so I have a single vector observation which I could compare to a similarly shaped likelihood function.

In the following code, I am trying to infer the values m1 and m2 in the matrix M. The model and inference run successfully (I have to separate the two as the sampler hangs forever if I donât - any ideas why??) but the trace values indicate that the sample space is not being explored, with nearly identical values for m1, m2 and crazy high values for sigma.

This will ultimately form a basis for more complex models, but I hope this gets me over this current issue and gets me moving.

```
import pytensor
import numpy
import pytensor.tensor as pt
from pytensor import function
import numpy as np
import pymc as pm
import matplotlib.pyplot as plt
import arviz as az
from pymc import pytensorf
plt.close('all')
#Generate synthetic data
f = np.linspace(0.01, 10, 1000) # frequencies for calculation
w = 2*np.pi*f # angular frequencies
m1_true = 1
m2_true = 2
M_true = np.array([[m1_true,0],[0,m2_true]])
#observation data Z = jwm
Z_true = np.zeros((2,2,len(f)))
for i in range(len(f)):
Z_true[:,:,i] = np.abs(1 * w[i] * M_true)#.reshape(-1, 1)
#add noise to the observation
n_std = 5
Z_obs = np.abs(Z_true + np.random.normal(0, n_std, Z_true.shape))
# Create a PyMC3 model to sample masses and sigma
with pm.Model() as model:
# Priors for unknown model parameters
mass1 = pm.Uniform("mass1", 0, 5)
mass2 = pm.Uniform("mass2", 0, 5)
sigma = pm.HalfNormal("sigma", 10)
#symbolic variables and matrix creation
m1 = pt.dscalar('m1')
m2 = pt.dscalar('m2')
mass_mat = pt.stacklists([[mass1, 0], [0, mass2]])
#loop step function over w to calculate/predict Z with priors
def step(w, mass_mat):
return np.abs(1* w *mass_mat)#/(1j*w_vals))
# store results using scan instead of for looping
results, _ = pytensor.scan(fn=step, sequences=[w], outputs_info=None, non_sequences=[mass_mat])
# Likelihood term
likelihood = likelihood = pm.Normal("likelihood", mu=results.flatten(), sigma=sigma, observed=Z_obs.flatten())
#%%
with model: # note - i have to implement this way otherwise the sampler hangs indefinitely
trace = pm.sample(1000, tune=1000)
#%%
# Plotting
az.plot_trace(trace, combined = True)
```