I am interested in using `TP`, but it currently does not support observations (well, technically it might via kwargs, but `conditional` would not work correctly). I've been reading the Shah paper and I don't see any mathematical obstacle to adding this, so it seems straightforward to add a `MarginalTP`. Any interest in this idea?
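To make the proposal concrete, here is a rough sketch of how I would imagine using it, mirroring the existing `Marginal` API. Everything involving `MarginalTP` below is hypothetical; it's just the interface I have in mind, not existing code:

```python
import numpy as np
import pymc3 as pm

X = np.linspace(0, 10, 50)[:, None]
y = np.sin(X).ravel() + 0.3 * np.random.randn(50)

with pm.Model() as model:
    ell = pm.Gamma("ell", alpha=2, beta=1)
    cov_func = pm.gp.cov.ExpQuad(1, ls=ell)

    # Hypothetical class: like pm.gp.Marginal, but the marginal likelihood
    # would be an MvStudentT (following Shah et al.) instead of an MvNormal.
    tp = pm.gp.MarginalTP(cov_func=cov_func, nu=5)
    y_obs = tp.marginal_likelihood("y_obs", X=X, y=y, noise=0.1)

    # conditional would then use the Student-t posterior predictive,
    # analogous to Marginal.conditional.
    Xnew = np.linspace(0, 12, 30)[:, None]
    f_pred = tp.conditional("f_pred", Xnew=Xnew)
```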
While investigating this I’ve come across a couple of more general questions:
- This one is probably for @bwengals, and forgive me if I'm being dense here. What is the reason for splitting processes into `Latent` and `Marginal` versions? Why not just have `GP` and `TP` classes with a `prior` method that can take `y` as an argument to determine whether or not it is observed? Then `conditional` would check for `y` and use it if it exists. This would save a fair amount of code, especially if `MarginalTP` gets implemented. I understand that this may be too big of a change, but I'm just curious! (A rough sketch of what I mean is included after this list.)
- This one is in regards to the `TP` and `MvStudentT` arguments. It seems like `cov_func` and `cov` are misleading parameter names, since they aren't actually the covariances of the distributions. Rather, they refer to the `Sigma` parameter of `MvStudentT`, which is related to the covariance by `nu * Sigma / (nu - 2)`. Thus, here, for example, it looks like apples and oranges are being compared when using the same `cov_func` in a GP and a TP. Can these arguments be changed so that they really are covariances, or at least be better documented so that people know they aren't what they seem to be? (A quick numerical check of this scaling is included below.)
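For the first question, this is roughly the kind of unified interface I'm picturing. It's a pure sketch: none of these names exist in pymc3, and the `_`-prefixed helpers are placeholders for the logic that currently lives in `Latent` and `Marginal`. The point is just the dispatch on `y`:

```python
# Purely illustrative sketch of a unified GP class; a TP class could
# dispatch on y the same way.

class UnifiedGP:
    def __init__(self, mean_func, cov_func):
        self.mean_func = mean_func
        self.cov_func = cov_func
        self.X = None
        self.y = None

    def prior(self, name, X, y=None, **kwargs):
        """If y is given, treat it as observed; otherwise build a latent prior."""
        self.X, self.y = X, y
        if y is None:
            return self._latent_prior(name, X, **kwargs)        # current Latent.prior
        return self._marginal_likelihood(name, X, y, **kwargs)  # current Marginal.marginal_likelihood

    def conditional(self, name, Xnew, **kwargs):
        """Condition on y if it was supplied to prior, otherwise on the latent f."""
        if self.y is not None:
            return self._conditional_on_y(name, Xnew, **kwargs)  # Marginal-style conditional
        return self._conditional_on_f(name, Xnew, **kwargs)      # Latent-style conditional

    # Hypothetical helpers; in practice these would wrap the existing implementations.
    def _latent_prior(self, name, X, **kwargs): ...
    def _marginal_likelihood(self, name, X, y, **kwargs): ...
    def _conditional_on_y(self, name, Xnew, **kwargs): ...
    def _conditional_on_f(self, name, Xnew, **kwargs): ...
```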
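And to illustrate the second point numerically: if the same matrix (say, a `cov_func` evaluated on some `X`) is used as the covariance of an `MvNormal` and as the `Sigma` of an `MvStudentT`, the actual covariances differ by the factor `nu / (nu - 2)`. A quick sanity check with SciPy (this uses `scipy.stats.multivariate_t`, available in SciPy >= 1.6, not PyMC):

```python
import numpy as np
from scipy.stats import multivariate_normal, multivariate_t

# Pretend this is the output of the same cov_func evaluated on some X.
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
nu = 5.0
rng = np.random.default_rng(0)

mvn_draws = multivariate_normal(mean=np.zeros(2), cov=Sigma).rvs(200_000, random_state=rng)
mvt_draws = multivariate_t(loc=np.zeros(2), shape=Sigma, df=nu).rvs(200_000, random_state=rng)

print(np.cov(mvn_draws.T))    # ~ Sigma
print(np.cov(mvt_draws.T))    # ~ (nu / (nu - 2)) * Sigma, i.e. about 1.67 * Sigma here
print(nu / (nu - 2) * Sigma)  # analytic covariance of the MvStudentT
```

So a GP and a TP built from the same `cov_func` have different prior covariances unless `Sigma` is rescaled by `(nu - 2) / nu`.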