**Setup**

I have a Potts model with spins s_i \in \{1, ..., q\}:

P(s_1, ..., s_n) \propto \exp(\sum_i h_i(s_i) + \sum_{i < j} J_{ij}(s_i, s_j)),

where P denotes the probability of finding the configuration (s_1, ..., s_n). I am given fixed values of h_i(s_i) for each s_i and of J_{ij}(s_i, s_j) for each pair (s_i, s_j) (so I am *not* doing Bayesian inference to learn h_i or J_{ij}).
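For concreteness, here is a minimal numpy sketch of the unnormalized log-probability I have in mind (the names `potts_logp`, `h`, and `J` are my own; I store the fields as an (n, q) array and the couplings as an (n, n, q, q) array, using 0-based spins for indexing):

```python
import numpy as np

def potts_logp(s, h, J):
    """Unnormalized log-probability of a Potts configuration.

    s : (n,) int array of spins in {0, ..., q-1} (0-based for indexing)
    h : (n, q) array of fields h_i(s_i)
    J : (n, n, q, q) array of couplings J_ij(s_i, s_j); only i < j is used
    """
    n = len(s)
    logp = h[np.arange(n), s].sum()          # sum_i h_i(s_i)
    for i in range(n):
        for j in range(i + 1, n):            # sum_{i < j} J_ij(s_i, s_j)
            logp += J[i, j, s[i], s[j]]
    return logp
```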

**What I would like to do**

Start from a random initial configuration and run a number of Monte Carlo steps to equilibrate, yielding a configuration C with high probability. Then, starting from that single configuration C, run many independent Monte Carlo trajectories (all initialized at C), each for a fixed number of steps.

My main question is: how can I do this in pymc?

**Notes**

- I understand that I am not using most of the machinery of pymc (i.e., Bayesian inference). I have a probability distribution that I want to sample from using Markov chain Monte Carlo, with all trajectories starting from the same point C. I want to use pymc instead of writing it myself because pymc includes optimized versions of HMC, NUTS, etc. that would take me a long time to write myself.
- I looked around this forum and found this post showing that I can sample from a distribution without doing the full Bayesian inference. This is great because I have no data; I just want to sample from a given Potts distribution.
- If I have the above working for Metropolis-Hastings, what are the steps I need to get Hamiltonian Monte Carlo to work?
- One question I have is how to specify a common starting point C for a bunch of independent Monte Carlo trajectories.
- I am new to pymc so any advice you can provide will be super helpful.
- Many resources I see are about using the full power of pymc (Bayesian inference), as opposed to simply sampling from a specified distribution. For example, many use standard distributions such as the Gaussian; how can I instead specify the Potts model above with given (fixed) values of h_i and J_{ij}?
- The reason I ask the previous question is that the pymc overview here talks about how one can define new functions (which I need to do, since there is no pre-defined Potts model, unlike the Gaussian distribution), but it says that if one defines new functions, it is not possible to compute the gradient, so I cannot use the HMC or NUTS samplers. I would like to use not just Metropolis-Hastings but also the HMC/NUTS samplers.

Thank you!