# Bayesian linear regression with SVGD

Hi, I am using Stein variational gradient descent (SVGD) to solve a simple Bayesian linear regression problem, but the result for the variance parameter seems to be off. My code does not use PyMC3. Can anyone provide sample code in PyMC3 for this problem, so I can verify whether this is my own mistake or a limitation of the SVGD algorithm?
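For reference, the standard SVGD update (Liu & Wang, 2016) transports a set of particles $\{x_i\}_{i=1}^n$ toward the posterior $p$ by iterating

$$
x_i \leftarrow x_i + \epsilon\,\hat{\phi}(x_i),
\qquad
\hat{\phi}(x) = \frac{1}{n}\sum_{j=1}^{n}\Big[\,k(x_j, x)\,\nabla_{x_j}\log p(x_j) + \nabla_{x_j} k(x_j, x)\Big],
$$

where $k$ is a kernel (commonly an RBF with a median-heuristic bandwidth) and $\epsilon$ is a step size; with too few particles this update can underestimate the posterior variance, which may be related to what I am seeing for the variance parameter.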

Hi, I’m not familiar with SVGD myself, but have you checked out the examples or tutorials from our gallery?

- tutorials_notebooks (PyMC3 3.10.0 documentation), section Tutorials/Basics/Variational API Quickstart
- examples_notebooks (PyMC3 3.10.0 documentation), section Examples/Variational Inference
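In short, the variational API those notebooks cover boils down to `pm.fit`. A minimal sketch, assuming `model` is a PyMC3 model you have already defined (`'advi'` is the default method, and `'svgd'` is also accepted):

```
import pymc3 as pm

with model:  # `model` is assumed to be an existing PyMC3 model
    approx = pm.fit(n=10000, method="advi")  # fit a mean-field ADVI approximation
    trace = approx.sample(1000)  # draw samples from the fitted approximation
```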

Hi Michael, I have checked those examples and tried the following ADVI example, but it does not work:

```
import matplotlib.pyplot as plt
import numpy as np
import pymc3 as pm
from pymc3 import *

size = 200
true_intercept = 1
true_slope = 2

x = np.linspace(0, 1, size)

# y = a + b*x
true_regression_line = true_intercept + true_slope * x
y = true_regression_line + np.random.normal(scale=0.5, size=size)

data = dict(x=x, y=y)

# show the data
fig = plt.figure(figsize=(7, 7))
ax = fig.add_subplot(111, xlabel="x", ylabel="y", title="Generated data and underlying model")
ax.plot(x, y, "x", label="sampled data")
ax.plot(x, true_regression_line, label="true regression line", lw=2.0)
plt.legend(loc=0)
plt.show()

with Model() as model:  # model specifications in PyMC3 are wrapped in a with-statement
    # Define priors
    sigma = HalfNormal("sigma", sigma=1)
    intercept = Normal("Intercept", 0, sigma=1)
    x_coeff = Normal("x", 0, sigma=1)

    # Define likelihood
    likelihood = Normal("y", mu=intercept + x_coeff * x, sigma=sigma, observed=y)

    # Inference!
    trace = sample(3000, cores=2)  # draw 3000 posterior samples using NUTS sampling

traceplot(trace)  # plot the marginal posteriors for Intercept, x, and sigma
plt.show()
```
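Note that the snippet above actually draws samples with NUTS via `sample`, not ADVI. To exercise SVGD on the same model, PyMC3's variational API can be swapped in for the `sample` call. A sketch (the particle and iteration counts are placeholders, not tuned values):

```
with model:
    # SVGD with 200 particles; pm.fit returns an Empirical approximation
    approx = pm.fit(n=20000, method=pm.SVGD(n_particles=200))
    svgd_trace = approx.sample(2000)  # draw samples from the particle approximation

traceplot(svgd_trace)  # compare the SVGD posterior for sigma against NUTS
plt.show()
```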

Edit: For some reason the image is shown with a black background, hiding the labels. From left to right, these are the posteriors for `Intercept`, `x`, and `sigma`.