Simulated Annealing equivalent in PyMC3?

I’m wondering, is there something analogous to simulated annealing in PyMC3?

The SciPy implementation is deprecated, and the algorithm is effectively Metropolis-Hastings applied to combinatorial optimization problems. For the details, see the Wikipedia entry.

Generally speaking, the algorithm takes its name from metallurgy: the problem is "hot" in early iterations, meaning large changes to the parameters are accepted; as the problem "cools", only small changes are accepted.
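To make the hot/cool intuition concrete, here is a minimal, generic simulated-annealing sketch in plain Python (not a PyMC3 API; the function names, the toy objective, and the geometric cooling schedule are all illustrative assumptions):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=10.0, cooling=0.995,
                        steps=5000, seed=0):
    """Sketch of SA: Metropolis acceptance with a shrinking temperature.

    At high temperature, moves that worsen the cost are often accepted
    (broad exploration); as the temperature cools, only improving or
    barely-worse moves pass (local refinement).
    """
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        # Metropolis rule: always accept improvements; accept a worse
        # move with probability exp(-(fy - fx) / t).
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule (an assumption)
    return best, fbest

# Toy example: a 1-D multimodal function with many local minima.
cost = lambda x: x * x + 10 * math.sin(3 * x)
neighbor = lambda x, rng: x + rng.gauss(0, 1.0)
x_best, f_best = simulated_annealing(cost, neighbor, x0=4.0)
```

The random-walk proposal (`neighbor`) and the cooling schedule are exactly the pieces the post below suggests gradient information might replace.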

With PyMC3's use of HMC/NUTS, I thought it might be possible to use gradient information to explore the "reward space" more efficiently. This might get rid of the random-walk nature of SA and remove the need for a dynamic/shrinking proposal distribution.

As an aside: If this isn’t a thing in the PyMC3 ecosystem, does anyone see value in it? Would anyone like to collaborate with me on such a solution?



There is Sequential Monte Carlo, which works similarly (i.e., it is based on importance sampling), though we don't use it to do optimization.
I guess the question of whether it would be useful for the PyMC3 ecosystem depends on the use case: what kind of problems are uniquely suited to this type of algorithm?

To your question: any sort of combinatorial optimization problem is high-impact for most businesses, especially as it relates to logistics. Skip down to the "Specific Problems" section of the Wikipedia entry for a quick glance at what this looks like.

I see, so it is mostly discrete models, right? I think those would indeed be very useful, as inference on discrete models is currently pretty limited.


There is a `dual_annealing` function in `scipy.optimize` now.
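For anyone landing here later, a quick sketch of how that function is called (the Rastrigin benchmark and the bounds are just illustrative choices, not anything from this thread):

```python
import numpy as np
from scipy.optimize import dual_annealing

def rastrigin(x):
    # Classic multimodal benchmark: global minimum of 0 at the origin.
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

# dual_annealing takes the objective and box bounds; by default it also
# runs a local search, so it usually lands very close to the optimum.
bounds = [(-5.12, 5.12)] * 2
result = dual_annealing(rastrigin, bounds, seed=42)
print(result.x, result.fun)  # a point near the origin
```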
