Downside Bias in the Stochastic Volatility Model?

I have a question regarding an observation I made when trying out the stochastic volatility model (described here). In short, I’m not sure if what I’m observing is to be expected, or if it could be a symptom of convergence issues. My notebook can be found here.

What I’ve tried to do is use the posterior samples from the stochastic volatility model to forecast the price of the index, with bands for credible intervals. At first, my result seemed reasonable:
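The forecasting step can be sketched roughly like this (a minimal numpy sketch, not the notebook’s actual code; the shapes and the simple-return assumption are mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n_draws, horizon = 1000, 250

# hypothetical posterior-predictive samples of daily simple returns;
# in the real notebook these would come from the fitted model
returns = rng.standard_t(df=5, size=(n_draws, horizon)) * 0.01

last_price = 100.0
# each sampled return path implies a price path via a cumulative product
paths = last_price * np.cumprod(1.0 + returns, axis=1)

# credible bands: percentiles across draws at each forecast step
lower, median, upper = np.percentile(paths, [2.5, 50.0, 97.5], axis=0)
```

Plotting `median` with the `lower`/`upper` band against the held-out prices gives the kind of figure shown above.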

After splitting into training and testing data, the result still seemed pretty reasonable (given that the pandemic is a tail event):

However, when I moved the train/test split earlier so the forecast covered a longer interval of time, I noticed a strong downside bias that I can’t explain. For example:

The way I see it, this could be the result of one of three things: (1) a bug in my code that calculates the implied price series from the samples of future daily returns; (2) the model itself learning the skew of the return distribution; or (3) a convergence issue. I don’t believe the issue is (1), since I’ve tested that code, and I don’t think it’s (3) either. However, the stochastic volatility model in question uses a symmetric return distribution (Student’s t), so it shouldn’t be possible for the model to learn a skew in the daily return distribution… should it?

Does anyone have any insight as to what could be going on here? Am I mistaken in thinking that this stochastic volatility model can’t learn a skewed return distribution?


Could it be because symmetric multiplicative returns imply an expected loss? E.g. 1.1 * 0.9 = 0.99 < 1
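This effect is easy to check with a quick simulation (illustrative only, using simple ±10% daily returns): even though the returns are perfectly symmetric around zero, the typical price path drifts down, because every matched up/down pair multiplies wealth by 1.1 × 0.9 = 0.99.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, horizon = 50_000, 250

# perfectly symmetric zero-mean simple returns: +10% or -10% each day
r = rng.choice([0.10, -0.10], size=(n_paths, horizon))
terminal = np.prod(1.0 + r, axis=1)

# mean terminal wealth stays near 1 (the moves cancel in expectation),
# but the median path loses money: each +10%/-10% pair is a net -1%
mean_wealth = terminal.mean()
median_wealth = np.median(terminal)
```

So a zero-mean symmetric distribution on the *returns* still produces a downward-skewed distribution on the *prices*, which looks exactly like the bias in the plots.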


Hi Dan, thanks for pointing that out! I’ll try adding a mu parameter for the distribution’s mean and see if that changes anything (it was set to 0 originally).

For the sake of completeness of this thread, here’s the resulting model with an inferred (nonzero) mean for the Student T distribution:

The inferred mean was positive. As you can see, this makes a huge difference!