How to solve bad initial energy (mixture of von Mises distributions)

Hello, I’m a student majoring in biology in Japan.
I am very new to statistics and Python, so this might be too basic a question…

I am trying to fit my experimental data (~200,000 angles, ranging from 0 to +360 degrees, probably bimodal with modes around mu = 0 (= 360) and mu = 180) with a mixture of von Mises distributions.
I want to estimate the parameters (mu, kappa) and the number of components.

I can’t figure out how to get rid of the “Bad initial energy” error.
Is this caused by inappropriate prior distributions?

The following is my code.

```python
import numpy as np
import pymc3 as pm
import pymc3.distributions.transforms as tr

with pm.Model() as model:
    # Use non-informative distributions as priors
    mu_1 = pm.Uniform('mu_1', 0, 360)
    kappa_1 = pm.Uniform('kappa', 0, 10)
    mu_2 = pm.Uniform('mu_2', 0, 360)
    kappa_2 = pm.Uniform('kappa_2', 0, 10)

    component = pm.VonMises.dist(mu=mu_1, kappa=kappa_1)
    component1 = pm.VonMises.dist(mu=mu_2, kappa=kappa_2)

    # mixture weights for the two components
    w = pm.Dirichlet('w', np.ones(2), shape=2)
    vm = pm.Mixture('vm', w=w, comp_dists=[component, component1],
                    transform=tr.circular, observed=mydata)

print(model.check_test_point())

with model:
    trace = pm.sample()
```

I get the following output.

```
Auto-assigning NUTS sampler...
mu_1_interval__      -1.39
kappa_interval__     -1.39
mu_2_interval__      -1.39
kappa_2_interval__   -1.39
w_stickbreaking__    -1.39
vm                     NaN
Name: Log-probability of test_point, dtype: float64
Sequential sampling (2 chains in 1 job)
NUTS: [w, kappa_2, mu_2, kappa, mu_1]
0%|          | 0/7500 [00:00<?, ?it/s]
```

From the other questions, I’m thinking that

```
vm NaN
```

is my problem.

It might work by converting the original data to radians.

You do need to make sure the observed data is in the right domain.
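For concreteness: `pm.VonMises` works in radians on the interval (-π, π], so degree-valued data should be converted and wrapped first. A minimal sketch, assuming the original angles are in degrees (the `deg` array below is just illustrative):

```python
import numpy as np

# Illustrative angles in degrees on [0, 360)
deg = np.array([0.0, 90.0, 180.0, 270.0, 359.0])

# Convert to radians, then wrap into (-pi, pi], the support of pm.VonMises
rad = np.deg2rad(deg)
wrapped = np.arctan2(np.sin(rad), np.cos(rad))

print(wrapped)  # e.g. 270 degrees becomes -pi/2
```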

Also, a von Mises mixture is really difficult to do inference on. I have a small notebook you might find helpful:

Basically, unless you have a lot of data, it is difficult to estimate the mixture weight.
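To illustrate that point with a toy simulation of my own (not from the notebook): even a crude nearest-mode estimate of the weight of a 50/50 mixture with modes at 0 and π is much noisier at small sample sizes:

```python
import numpy as np
from scipy.stats import vonmises

rng = np.random.default_rng(0)

def naive_weight_estimate(n, w=0.5, kappa=2.0):
    """Simulate a two-component von Mises mixture (modes at 0 and pi)
    and estimate the first component's weight by nearest-mode assignment."""
    z = rng.random(n) < w
    x = np.where(z,
                 vonmises.rvs(kappa, loc=0.0, size=n, random_state=rng),
                 vonmises.rvs(kappa, loc=np.pi, size=n, random_state=rng))
    x = np.arctan2(np.sin(x), np.cos(x))   # wrap into (-pi, pi]
    return np.mean(np.abs(x) < np.pi / 2)  # fraction closest to the mode at 0

small = [naive_weight_estimate(200) for _ in range(20)]
large = [naive_weight_estimate(20_000) for _ in range(20)]
print(np.std(small), np.std(large))  # the spread shrinks as n grows
```

This nearest-mode assignment is only a heuristic, not proper inference, but it gives a feel for why the posterior on the weight stays wide unless n is large.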

I appreciate your kind response. (I was just watching your video to study.)
The notebook helped me a lot.

Running MCMC sampling worked after fixing my data.
Convergence of the posterior distribution and the parameter estimates were not bad.

However, I came across another problem.
I’m using Google Colab as my Python environment, but it always crashes when calculating pm.waic or pm.loo.
I believe this is due to a RAM shortage.

In general, are reducing the data by random sampling or using a better PC with more RAM the only options for solving this kind of problem?
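If subsampling is acceptable for your analysis, `numpy.random.Generator.choice` does it in one line; a quick sketch (the `mydata` array here is a random stand-in for the real angles):

```python
import numpy as np

rng = np.random.default_rng(42)
mydata = rng.uniform(-np.pi, np.pi, size=200_000)  # stand-in for the real data

# Draw 20,000 of the 200,000 angles without replacement
subsample = rng.choice(mydata, size=20_000, replace=False)
print(subsample.shape)  # (20000,)
```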

Unfortunately, yes. By the way, if you are using Colab you can also connect it to your local kernel; in most cases that will have more memory than the free Colab kernel.

Using a local kernel worked in my case.
I had never been conscious of the RAM limit…
I really appreciate it.
