Information regarding GSoC 2019

Hello People,
I was going through the GSoC project list for PyMC and found the ODE solvers project quite interesting. Could the mentors elaborate on the scope of the project and the topics they plan to cover over the span of this project for GSoC 2019?
Link - [ODE Solvers](https://github.com/pymc-devs/pymc3/wiki/GSoC-2018-projects#ode-solvers)

Edit: @junpenglao Can you help me out?

I think it is a bit too early to plan for GSoC 2019… As for the project scope, I am not sure who is currently working on this (@aseyboldt is doing some work linking the Julia ODE solvers to PyMC3, and @michaelosthege has also implemented a version of an ODE solver). Overall, the plan is to implement an ODE solver with gradients for the parameters (which is the reason we cannot just use the ODE solvers in SciPy).


Are there any updates regarding the scope of the project?
Also, I am confused about the ODE solver; could you please elaborate on what is meant by "implement an ODE solver with gradients for the parameters"?
Ping @aseyboldt @michaelosthege @junpenglao

For inference, it is preferable to be able to compute gradients with respect to the parameters. Most ODE solvers such as scipy.odeint don’t give gradients, which means one can’t use Hamiltonian Monte Carlo (NUTS) right away. There are multiple workarounds:

  • using a less sophisticated sampling algorithm (e.g. DEMetropolis), which works as long as there aren’t too many dimensions
  • doing the ODE solving in Theano itself, to make it differentiable (see this example); this works, but the performance is poor
  • a Theano Op that uses scipy.odeint for the forward-pass and an analytical (manually derived) gradient (works if you’re good with maths)
  • a Theano Op that uses scipy.odeint in the forward-pass and adjoints for the gradients
  • a Theano Op that wraps the famous JuliaDiffEq package (which can do gradients)

I’ve done the first two and would love to see a comparison, or even a generalizable implementation of the last two.
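
To make the first option above concrete, here is a minimal, untested sketch: a black-box wrapper around scipy.odeint sampled with DEMetropolis. The exponential-decay model, the fake data, and all parameter values are made up for illustration; the point is only that no gradient is defined, so NUTS would not work on this model.

    import numpy as np
    import pymc3 as pm
    import theano.tensor as tt
    from scipy.integrate import odeint
    from theano.compile.ops import as_op

    times = np.linspace(0, 10, 50)
    y_obs = np.exp(-0.5 * times) + 0.05 * np.random.randn(times.size)  # fake data

    # Black-box Op: fine for gradient-free samplers such as DEMetropolis,
    # but unusable with NUTS because no gradient is implemented.
    @as_op(itypes=[tt.dscalar], otypes=[tt.dvector])
    def solve_ode(k):
        y = odeint(lambda y, t: -k * y, 1.0, times)  # dy/dt = -k*y, y(0) = 1
        return y.flatten()

    with pm.Model():
        k = pm.HalfNormal("k", sd=1.0)
        mu = solve_ode(k)
        pm.Normal("obs", mu=mu, sd=0.05, observed=y_obs)
        trace = pm.sample(step=pm.DEMetropolis())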


Are there any research papers for the last two methods?
I mean these two:

  • a Theano Op that uses scipy.odeint for the forward-pass and an analytical (manually derived) gradient (works if you’re good with maths)
  • a Theano Op that uses scipy.odeint in the forward-pass and adjoints for the gradients

The adjoint method was recently used in a very popular deep learning paper called “Neural Ordinary Differential Equations”. There are explanations of the method here and here.
In the original paper, they added a Python implementation in Appendix D.
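
For reference, the key equations of the adjoint method as stated in that paper, for a state z(t) with dz/dt = f(z, t, θ) and a scalar loss L evaluated at the end of the trajectory, are:

$$
a(t) = \frac{\partial L}{\partial z(t)}, \qquad
\frac{d a(t)}{d t} = -a(t)^\top \frac{\partial f(z(t), t, \theta)}{\partial z}, \qquad
\frac{d L}{d \theta} = -\int_{t_1}^{t_0} a(t)^\top \frac{\partial f(z(t), t, \theta)}{\partial \theta}\, dt
$$

So the parameter gradients come from one additional backward-in-time ODE solve of an augmented system rather than from differentiating through the individual solver steps, which is exactly why it combines well with a black-box forward solver like scipy.odeint.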

This is the Python interface to JuliaDiffEq.
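
In case it helps, calling it from Python looks roughly like the sketch below (an exponential-decay toy problem in the style of the diffeqpy README; treat the exact signatures as something to double-check against the README):

    from diffeqpy import de

    # toy problem: du/dt = a*u with a single parameter a
    def f(u, p, t):
        a, = p
        return a * u

    u0 = 0.5
    tspan = (0.0, 1.0)
    p = [-1.0]

    prob = de.ODEProblem(f, u0, tspan, p)
    sol = de.solve(prob)
    # sol.t holds the time points, sol.u the solution values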

You can see how I wrapped the scipy.odeint here: https://gist.github.com/michaelosthege/6953a2af7417da6ebdd41771a9e7e7a8
An analytical gradient computation could be implemented in the grad method, as shown in the Theano documentation.
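
As a toy illustration of that pattern (simpler than the gist above), here is a rough, untested sketch of an Op whose perform calls scipy.odeint for the forward pass, while grad returns a manually derived sensitivity. It only works because dy/dt = -k*y has the closed-form solution y(t) = y0*exp(-k*t), hence dy/dk = -t*y0*exp(-k*t); for a real model you would have to derive the sensitivities yourself (or use adjoints).

    import numpy as np
    import theano
    import theano.tensor as tt
    from scipy.integrate import odeint

    class DecayOp(theano.Op):
        """Solve dy/dt = -k*y on a fixed time grid, with an analytical gradient."""
        itypes = [tt.dscalar]   # k
        otypes = [tt.dvector]   # y evaluated at self.times

        def __init__(self, y0, times):
            self.y0 = y0
            self.times = times

        def perform(self, node, inputs, output_storage):
            (k,) = inputs
            y = odeint(lambda y, t: -k * y, self.y0, self.times)
            output_storage[0][0] = y.flatten()

        def grad(self, inputs, output_grads):
            (k,) = inputs
            (g,) = output_grads
            t = tt.constant(self.times)
            dy_dk = -t * self.y0 * tt.exp(-k * t)  # manually derived d y(t) / d k
            return [tt.dot(g, dy_dk)]

    # hypothetical usage: gradients now flow through the Op, so NUTS can use it
    k = tt.dscalar("k")
    y = DecayOp(1.0, np.linspace(0.0, 5.0, 20))(k)
    dy_sum_dk = theano.grad(y.sum(), k)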


Thanks!! I will have a look and post more queries as I proceed.
Regarding contributions for GSoC, how many issues do I need to solve?

Oh hey, I was just pointed here. If you want to use diffeqpy and run into any issues, just ping/email/etc. me and I’ll chime in. All of the docs are on this page: http://docs.juliadiffeq.org/latest/analysis/sensitivity.html#Adjoint-Sensitivity-Analysis-1 . You can also take a look at how the @grad was written to make this work with Flux.jl: https://github.com/JuliaDiffEq/DiffEqFlux.jl/blob/master/src/Flux/layers.jl#L60-L93. Basically you just

    sol = solve(_prob, args...; save_start=true, kwargs...)

and then, using that solution, call

    du0, dp = adjoint_sensitivities_u0(sol,args...,df,ts;
                    sensealg=sensealg,
                    kwargs...)

to get the gradients with respect to u0 and the parameters. Then work that into whatever backpass code you need. The DiffEqFlux code shows how to work it into code that expects an array as output. If you just add the de. prefix in front of everything, it should work with diffeqpy.
