The Metropolis sampler is kind of the only thing that “works” out of the box right now. You can also try SMC, but that will require some tweaking. My suggestion for this kind of model (calling an external function, with gradients not available) is to try likelihood-free inference, e.g., http://elfi.readthedocs.io/en/latest/
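A minimal sketch of forcing the Metropolis step (assuming PyMC3; the priors and variable names are placeholders, and the black-box likelihood wrapper for your external function is left as a comment):

```python
import pymc3 as pm

with pm.Model() as model:
    # placeholder prior -- replace with the parameters of your simulator
    theta = pm.Normal("theta", mu=0.0, sigma=1.0, shape=3)

    # your external log-likelihood would be added here, e.g. through
    # pm.Potential or pm.DensityDist wrapping the external function

    step = pm.Metropolis()  # random-walk Metropolis, no gradients needed
    trace = pm.sample(10000, step=step, tune=5000, chains=4)
```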
The problem is that in high dimensions, random-walk MCMC is highly inefficient: the sampler often appears to get stuck, as nearly all proposals are rejected. This is the problem you see initially, and it often happens with Metropolis as well. You can force the sampler to take smaller steps, but that results in high autocorrelation and a low effective sample size. It will explore the posterior so slowly that you will get biased results in finite time.
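You can see both failure modes in the sampler diagnostics. A rough sketch, assuming the `trace` from the Metropolis run above and ArviZ installed:

```python
import arviz as az

# fraction of accepted proposals; values near 0 mean the sampler is stuck,
# while tiny steps push acceptance up but leave the chain highly autocorrelated
accepted = trace.get_sampler_stats("accepted")
print("acceptance rate:", accepted.mean())

# effective sample size -- high autocorrelation shows up as ESS
# much smaller than the number of draws
print(az.ess(trace))
```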
So you can keep using Metropolis for these models, but be very careful: always validate your model on simulated data (where you know the true parameters) and check the resulting fit using posterior predictive checks.
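For the posterior predictive check, a rough sketch (assuming the same PyMC3 `model` and `trace` as above, fit to data you simulated with known parameters):

```python
import arviz as az
import pymc3 as pm

# draw replicated datasets from the fitted model
with model:
    ppc = pm.sample_posterior_predictive(trace)

# overlay the replicated data on the observed data
idata = az.from_pymc3(trace=trace, posterior_predictive=ppc)
az.plot_ppc(idata)
```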