Part-worth estimation causes "bad energy" warning on some observations

Hey Alex,

Thank you for your quick reply! I really appreciate it!

To give you an idea of the data, here is a small part:

In the rows are the different tasks, and in the columns each participant's answer to each task (15 tasks in total).

There are no "zeros" in the dataset, but pandas reports the dtypes as "object":

but visually the corresponding (and troublesome) participant (530) looks like an integer column.

I also tried casting the "selection" to int before passing it to the sampler:

observed = selection[na_mask.values].astype(int).values

That did not work either.
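For reference, here is roughly how I inspect and coerce the columns (the frame below is a toy stand-in with made-up column names; an "object" dtype usually means the numbers are actually stored as strings):

```python
import pandas as pd

# Toy frame mimicking the structure: tasks in rows, participants in columns.
df = pd.DataFrame({"530": ["1", "5", "3"], "531": ["2", "2", "4"]})
print(df.dtypes)  # both columns show up as "object"

# Coerce every column to a numeric dtype; pd.to_numeric raises on any cell
# that is not actually a number, which helps locate a hidden bad value.
df_num = df.apply(pd.to_numeric)
print(df_num.dtypes)  # int64
```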

The participants' answers are not aggregated: the number indicates which product they selected in the task (e.g. 1 = product one, 5 = product five, and so on).
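Since each answer is a single pick out of 20 alternatives, my understanding is that a Multinomial with n = 1 expects the observation as a one-hot count vector per task rather than the raw product index. A quick sketch of that conversion in plain numpy (assuming my 1-based coding of the products):

```python
import numpy as np

# Raw answers per task, coded 1..20 (the index of the selected product).
choices = np.array([1, 5, 20, 3])
n_alternatives = 20

# One-hot counts: row t has a single 1 in the column of the chosen product.
# Subtract 1 because numpy indexing is 0-based.
one_hot = np.zeros((len(choices), n_alternatives), dtype=int)
one_hot[np.arange(len(choices)), choices - 1] = 1

print(one_hot.sum(axis=1))  # each row sums to 1 (one choice per task)
```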

I tried your “pm.Multinomial” suggestion with:

temp = pm.Multinomial("Obs_%s" % selection.name,
                      n=finalSelection.shape[0],
                      p=softmax,
                      observed=selection[nan_mask.values].values)

but it raises another error message during compilation:

The shape[1] = 20 can only refer to the alternative choices within a task: I show twenty product configurations on one "shelf", and the participant has to choose one of the 20 alternatives.

When I delete the ".T" after

stack = tt.stack(to_be_stacked, axis=0)

it surprisingly compiles, but all participants come out as "-inf".
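One thing I still want to rule out (assuming a categorical-style likelihood under the hood): if the observed values are coded 1..20 but the distribution expects categories 0..19, the value 20 falls outside the support and the log-probability becomes -inf. A minimal check with plain numpy:

```python
import numpy as np

p = np.full(20, 1 / 20)   # uniform choice probabilities over 20 products
choice = 20               # a raw answer coded 1..20

# Log-probability of a categorical outcome is log(p[index]); with 1-based
# coding, index 20 is out of range for a length-20 probability vector.
def cat_logp(idx, p):
    if idx < 0 or idx >= len(p):
        return -np.inf    # outside the support
    return np.log(p[idx])

print(cat_logp(choice, p))      # -inf with the raw 1-based code
print(cat_logp(choice - 1, p))  # finite once shifted to 0-based
```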

Any ideas?

Off-topic question:
Is there a good overview and explanation of the different probability distributions somewhere? If I understood correctly, choosing the appropriate distribution is key in Bayesian inference. In psychology, on the other hand, you normally only get to know and use the Gaussian normal distribution…

Thank you for your help,
Daniel