MCMC vs. VI in Handling Problematic Priors

Hi All,

I have recently been using both MCMC and variational inference (VI) to estimate posterior distributions in my model, and I have a question I'd like your opinions on.

In general, MCMC outperforms VI in terms of accuracy of the estimated posterior distribution. However, when I use a problematic prior such as Gamma(0.001, 0.001), which is nearly flat over a wide range but has an extreme density spike concentrating mass near zero, MCMC (e.g., NUTS) may not work well and gives me divergences because the shape of the prior is very difficult to sample from. This means my MCMC cannot explore the posterior space sufficiently, which leads to unreliable posterior estimates. If I could run the MCMC for a very long time, it might eventually recover the true posterior; however, that may not be feasible in some cases.
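To make the "spike near zero" point concrete, here is a small sketch (my own illustration, not from any particular PPL) that evaluates the Gamma(0.001, 0.001) CDF near zero using a power series for the regularized lower incomplete gamma function. It shows that almost all of the prior mass sits below 1e-6, even though the prior has mean 1 and huge variance, which is exactly the geometry that gives NUTS trouble:

```python
import math

def gamma_cdf_small_x(a, x):
    # Regularized lower incomplete gamma P(a, x) via its power series,
    # which converges quickly in the small-x region we care about here.
    term = 1.0 / a
    total = term
    n = 0
    while term >= 1e-16 * total:
        n += 1
        term *= x / (a + n)
        total += term
    # Multiply the series by x^a * e^{-x} / Gamma(a) in log space for stability.
    return total * math.exp(-x + a * math.log(x) - math.lgamma(a))

# Gamma(0.001, 0.001) in shape/rate parameterization: mean 1, variance 1000.
a, b = 0.001, 0.001
mass_below_1e_6 = gamma_cdf_small_x(a, b * 1e-6)  # P(X < 1e-6)
print(mass_below_1e_6)  # roughly 0.98: almost all prior mass piles up near zero
```

So "flat" is misleading: on the raw scale the sampler has to resolve both a density spike at zero and an extremely long tail, which is a classic source of divergences.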

In this case, I feel like VI may more efficiently cover the regions that MCMC cannot explore well, because it starts from a continuous family of distributions and optimizes their shape. May I ask whether this is a reasonable thought? Or, even in this case, will MCMC still give us a more accurate posterior estimate?

Thanks,

Jay