I just finished an overnight run with ten different models I want to compare. I created a dictionary and saved each trace and model as a new entry, then pickled the whole thing at the end and saved it to my server, from which I transfer it to my local machine. My issue is that the dictionary is over 10GB. Is there a convenient way to reduce the memory footprint of the traces? For example, saving them as 16- or 32-bit floats instead of 64? Any other hints on shrinking traces?
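One low-effort option, assuming the dictionary entries hold plain NumPy sample arrays (the names below are hypothetical, not from your run), is to downcast every float64 array to float32 before pickling, which roughly halves the on-disk size at some precision cost:

```python
import numpy as np
import pickle

def downcast_traces(results):
    """Return a copy of a {model: {var: array}} dict with float64 arrays cast to float32."""
    slim = {}
    for model_name, arrays in results.items():
        slim[model_name] = {
            var: a.astype(np.float32) if a.dtype == np.float64 else a
            for var, a in arrays.items()
        }
    return slim

# Toy example: one "model" with 2000 draws of a 100-dimensional parameter.
results = {"model_a": {"beta": np.random.randn(2000, 100)}}
big = pickle.dumps(results)
small = pickle.dumps(downcast_traces(results))
print(len(small) / len(big))  # roughly 0.5
```

Whether float32 (or float16) is acceptable depends on what you compute from the samples; posterior means and quantiles are usually fine, but keep a float64 copy if you are unsure.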
How many samples are you taking? Depending on the effective sample size you are aiming for, reducing the number of samples is the first thing I would try.
I was taking 2000 samples via NUTS for each of 12 models. I should get a good n_eff since I am not estimating hyperparameters. I cut the sample count in half and am now running only 9 models. These are just preliminary runs to estimate sensitivity to the hyperparameters, so hopefully I'll have n_eff near 1000, which should be good enough. I wonder whether I need to save all parameters to get WAIC estimates. I probably don't need the deterministic ones, since those are redundant.
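For context, a quick back-of-envelope calculation (the parameter count here is a hypothetical placeholder, not from the thread) shows why halving draws and halving precision together cut storage by 4x:

```python
# Rough arithmetic: storage for one model's trace of float64 samples.
draws, n_params = 2000, 50_000          # hypothetical model size
bytes_f64 = draws * n_params * 8         # 8 bytes per float64 sample
bytes_f32 = (draws // 2) * n_params * 4  # half the draws at float32
print(bytes_f64 / 1e9, bytes_f32 / 1e9)  # 0.8 GB vs 0.2 GB
```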
Yes, if you don’t wrap your intermediate deterministic variables in Deterministic objects, they will not be saved.