I have a model that implements a set of latent Gaussian processes. It runs fine on small datasets (I run it in parallel across several cores), but when I increase their size the model never starts: there is simply no output, and it sits consuming 100% of a single CPU core. I am using ADVI for the fitting.
What could be happening?
How much data are you running this on? GPs do not scale well with dataset size — exact inference is typically O(n³) in the number of points.
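For scale, a quick back-of-envelope sketch of what exact GP inference costs: the covariance matrix is n × n, and factorizing it (e.g. a Cholesky) is O(n³) flops. The 2,000 × 300 figures below are just the sizes mentioned later in this thread, not a general rule:

```python
# Rough cost of exact GP inference. Each GP needs an n x n covariance
# matrix and a Cholesky factorization of it (~n^3 / 3 flops).
n_points = 300   # points per GP (figure from this thread)
n_gps = 2_000    # number of independent latent GPs (figure from this thread)

# Memory for one dense float64 covariance matrix
bytes_per_cov = n_points ** 2 * 8
print(f"one covariance matrix: {bytes_per_cov / 1e6:.2f} MB")  # 0.72 MB

# Total covariance storage if all GPs are materialized at once
total_gb = n_gps * bytes_per_cov / 1e9
print(f"all {n_gps} matrices:  {total_gb:.2f} GB")             # 1.44 GB

# Rough flop count for one Cholesky factorization
flops_per_chol = n_points ** 3 / 3
print(f"flops per Cholesky:    {flops_per_chol:.2e}")          # 9.00e+06
```

So the per-GP cost is small here; the interesting question is what building 2,000 of them at once does to the computation graph.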
Thank you for the feedback. It uses many (up to 2,000) smallish GPs (<300 points each).
The strange thing is that I do not get any error; it simply hangs. Even stranger, I can run it on a desktop machine with 8 cores but not on a server with 56 cores. Is there any way to debug it? It seems to be stuck in model building, because it never reaches the fitting stage (the progress bar never appears). It also does not use a significant amount of memory or CPU, which seems odd to me. Any ideas?
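One generic way to see where a hung Python process is stuck is the standard-library `faulthandler` module: register it on a signal near the top of your script, then send that signal to the hung process to get a traceback of every thread. A minimal sketch (the choice of SIGUSR1 and the log filename are arbitrary, and signal registration is Unix-only):

```python
import faulthandler
import signal

# Near the top of your script: dump all thread tracebacks to stderr
# whenever the process receives SIGUSR1. When the run hangs, run
#   kill -USR1 <pid>
# from another shell to see exactly which line it is stuck on.
faulthandler.register(signal.SIGUSR1)

# You can also dump a traceback programmatically, e.g. to a log file:
with open("hang_traceback.log", "w") as f:
    faulthandler.dump_traceback(file=f)
```

If you cannot modify the script, an external profiler such as `py-spy` (`py-spy dump --pid <PID>`) can show the same information without touching the code. Either way, the traceback should tell you whether the 100%-CPU thread is inside graph construction or somewhere else.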