Hello @gmingas!
I have quickly gone through the MLDA PR. If I understood the usage correctly, some simple FEniCS code with the MLDA sampler would look something like this:
```python
from fenics import *
from fenics_adjoint import *
import pymc3 as pm
from fenics_pymc3 import create_fenics_theano_op

coarse_mesh1 = UnitSquareMesh(5, 5)
coarse_mesh2 = UnitSquareMesh(10, 10)
fine_mesh = UnitSquareMesh(20, 20)

# Define appropriate FunctionSpace objects etc.

def coarse_solve1(...):
    # write the solver here
    ...

def coarse_solve2(...):
    # write the solver here
    ...

def fine_solve(...):
    # write the solver here
    ...

# Wrap the solver functions above with a Theano Op
theano_coarse_solve1 = create_fenics_theano_op(...)
# ...

with pm.Model() as coarse_model1:
    y = pm.Deterministic("y", theano_coarse_solve1(...))
    obs = pm.Normal("obs", mu=y, sigma=1.0, observed=data)

with pm.Model() as coarse_model2:
    y = pm.Deterministic("y", theano_coarse_solve2(...))
    obs = pm.Normal("obs", mu=y, sigma=1.0, observed=data)

with pm.Model() as model:
    y = pm.Deterministic("y", theano_fine_solve(...))
    obs = pm.Normal("obs", mu=y, sigma=1.0, observed=data)

with model:
    trace = pm.sample(10, step=pm.MLDA(coarse_models=[coarse_model1, coarse_model2]))
```
So fenics-pymc3 can now be combined with the multi-level sampler by defining the needed coarse systems separately.
> What kind of information has to be in this custom Op?
Currently, `create_fenics_theano_op` is a functional wrapper for creating a `FenicsOp`. `FenicsOp` defines a `perform` method that is itself a wrapper around the `evaluate_primal` function from the numpy-fenics-adjoint package. What `evaluate_primal` does is transform the NumPy array inputs into FEniCS types, call the user-defined function with those FEniCS inputs, and transform the FEniCS output back into a NumPy array.
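To make the data flow concrete, here is a minimal sketch of that round-trip pattern in plain Python/NumPy. The `to_fenics`/`from_fenics` helpers and `evaluate_primal_sketch` are made up for illustration and only stand in for the real conversions in numpy-fenics-adjoint:

```python
import numpy as np

# Hypothetical stand-ins for the real NumPy <-> FEniCS conversions;
# in numpy-fenics-adjoint these would produce Function/Constant objects.
def to_fenics(array, template):
    return list(array)  # placeholder "FEniCS type"

def from_fenics(obj):
    return np.asarray(obj)

def evaluate_primal_sketch(fenics_function, templates, *numpy_inputs):
    """Mimic the evaluate_primal flow: NumPy in -> FEniCS call -> NumPy out."""
    fenics_inputs = [to_fenics(a, t) for a, t in zip(numpy_inputs, templates)]
    fenics_output = fenics_function(*fenics_inputs)
    return from_fenics(fenics_output)

# Example "solver": elementwise sum of two fields
def solver(u, v):
    return [ui + vi for ui, vi in zip(u, v)]

out = evaluate_primal_sketch(solver, [None, None],
                             np.array([1.0, 2.0]), np.array([3.0, 4.0]))
# out -> array([4., 6.])
```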
Of course, `FenicsOp` can be taken as a template for any further work that requires additional computations/operations in `perform`, but does it all have to be in one place? It is important to keep each Op as one unit of computation so that the `grad` or `L_op` methods stay simple.
If there are any specific questions about the usage of the fenics-pymc3 interface, feel free to ask here or open a GitHub issue.