@jessegrabowski The discussion in the golf question is helpful for my general understanding, but there's a significant difference in how I'm measuring success. The problem is that I don't care much about individual coordinates being classified correctly unless they're contextualized by a metric such as Intersection over Union (IoU). For example, if I count how many coordinates are correctly classified as burning or not, then simply extending the geographical area under consideration will drastically improve the model's apparent performance, because every new coordinate is just non-burning.

Ideally I'd still use IoU in the data assimilation process, but I don't understand how to do it. I've tried passing the IoUs to the log-pdf of a Beta distribution, taking the sum, and using that as a likelihood factor with pm.Potential(). The code runs, but the sampler can't seem to find any acceptable samples. Is this likelihood just too non-smooth, creating a posterior that's too difficult to sample?
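For concreteness, here's a minimal sketch of the likelihood term I'm building (the masks, Beta parameters, and clipping are illustrative placeholders, not my actual model; the resulting sum is what I hand to pm.Potential()):

```python
import numpy as np
from scipy import stats

def iou(pred, obs):
    """Intersection over union of two boolean burn masks."""
    inter = np.logical_and(pred, obs).sum()
    union = np.logical_or(pred, obs).sum()
    return inter / union if union > 0 else 1.0

# Toy masks: observed vs. predicted burned area on a 10x10 grid
obs = np.zeros((10, 10), dtype=bool)
obs[2:6, 2:6] = True            # 16 cells observed burning
pred = np.zeros((10, 10), dtype=bool)
pred[3:7, 3:7] = True           # 16 predicted, 9 overlapping

score = iou(pred, obs)          # 9 / 23

# Sum of Beta log-pdfs over per-observation IoUs; clipping keeps
# the log-pdf finite when an IoU is exactly 0 or 1.
ious = np.clip(np.array([score]), 1e-6, 1 - 1e-6)
log_lik = stats.beta.logpdf(ious, a=8.0, b=2.0).sum()
```

Inside the PyMC model the same sum would be computed with tensor ops on the simulated mask and wrapped as `pm.Potential("iou_lik", log_lik)`.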