Okay, so after struggling with the dimensions for a bit, I created a simple example to test whether the convolution works:
```python
import numpy as np
import pymc3 as pm
import theano.tensor.signal.conv
from scipy import signal

# synthetic data: convolve the input with a known filter (A = 2.0)
xtest = np.linspace(0, 10, 100)
filt = 2.0 * xtest
conv = signal.convolve(xtest, filt, mode='full')

with pm.Model() as model:
    A = pm.Uniform('A', lower=0.0, upper=4.0)  # alternatively: pm.Normal('A', mu=2.0, sigma=1.0)
    filtt = A * xtest
    convol = theano.tensor.signal.conv.conv2d(
        xtest[None, :], filtt[None, :], (1, 100), (1, 100), border_mode='full')
    likelihood = pm.Normal('conv', mu=convol[0], observed=conv)
    trace = pm.sample(1000)
```
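As a quick sanity check outside the model: the 2-D 'full' convolution of `(1, N)` arrays should reduce to the ordinary 1-D convolution, which is what the `conv2d` call above relies on. A small numpy/scipy sketch (my own check, not part of the model):

```python
import numpy as np
from scipy import signal

xtest = np.linspace(0, 10, 100)
filt = 2.0 * xtest

# 2-D convolution of single-row arrays vs. plain 1-D convolution
c2d = signal.convolve2d(xtest[None, :], filt[None, :], mode='full')
c1d = signal.convolve(xtest, filt, mode='full')

# the single row of the 2-D result matches the 1-D convolution
assert np.allclose(c2d[0], c1d)
assert c2d.shape == (1, 199)  # 'full' length is N + M - 1 = 199
```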
I tried both a uniform and a normal prior to see how robust the solution was. While the convolution seems to work, I am surprised that the sampler reports a high acceptance probability and a small number of effective samples:
```
The acceptance probability does not match the target. It is 0.9979773108121525,
but should be close to 0.8. Try to increase the number of tuning steps.
The estimated number of effective samples is smaller than 200 for some parameters.
```
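To get a feel for what a small `n_eff` means, here is a crude, textbook-style sketch of estimating effective sample size as N / (1 + 2·Σρ_k), truncating the autocorrelation sum at the first negative value. This is *not* PyMC3's estimator, just an illustration of why an autocorrelated chain of 1000 draws can be worth only ~95 effective samples:

```python
import numpy as np

def ess(chain):
    """Crude effective sample size: N / (1 + 2 * sum of autocorrelations),
    truncating the sum at the first negative autocorrelation."""
    n = len(chain)
    c = chain - chain.mean()
    # autocorrelation at lags 0..n-1, normalised so acf[0] == 1
    acf = np.correlate(c, c, mode='full')[n - 1:] / (np.arange(n, 0, -1) * c.var())
    rho = acf[1:]
    cut = int(np.argmax(rho < 0)) if np.any(rho < 0) else len(rho)
    return n / (1.0 + 2.0 * rho[:cut].sum())

rng = np.random.default_rng(1)
white = rng.standard_normal(1000)               # nearly independent draws
sticky = np.cumsum(rng.standard_normal(1000))   # strongly autocorrelated chain

print(ess(white))   # close to 1000
print(ess(sticky))  # far smaller, like the n_eff reported above
```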
Though the statistics seem to be fine for the constant:
| |mean|sd|mc_error|hpd_2.5|hpd_97.5|n_eff|Rhat|
|---|---|---|---|---|---|---|---|
|A|2.000464|0.004782|0.000403|1.999902|2.000096|95.187888|1.012574|
Now I just need a way to reproduce the 'same' border mode that the scipy and numpy convolve functions have. My idea is to take the middle of the 'full' convolution output, giving an array of the same length as the input. The trouble will be writing it in a way Theano understands.
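For reference, the 'same' output really is just a centred slice of the 'full' output: drop `(M - 1) // 2` samples from the front (M being the filter length) and keep `len(input)` samples. A numpy sketch verifying the slice indices:

```python
import numpy as np

xtest = np.linspace(0, 10, 100)
filt = 2.0 * xtest

full = np.convolve(xtest, filt, mode='full')   # length 2*100 - 1 = 199
same = np.convolve(xtest, filt, mode='same')   # length 100

# 'same' is the centred slice of 'full'
start = (len(filt) - 1) // 2
assert np.allclose(full[start:start + len(xtest)], same)
```

Since Theano tensors support basic slicing, something like `convol[0][start:start + 100]` before passing it as `mu` should carry over directly, though I have not tested that part in Theano.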