Is there any way to use a ReLU activation for a Bayesian neural network in the form presented on twiecki.io/blog?
Sure, you can replace the function pm.math.tanh with tt.nnet.relu.
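For concreteness, here is a minimal sketch of what that swap looks like in a PyMC3 model written in the style of the blog post. The toy data, layer sizes, and variable names (X, y, n_hidden, w_in_1, and so on) are illustrative assumptions, not taken from the post; the only real point is the tt.nnet.relu-for-pm.math.tanh substitution:

```python
import numpy as np
import pymc3 as pm
import theano
import theano.tensor as tt

# Toy data; shapes and names here are assumptions for illustration.
X = np.random.randn(100, 2).astype(theano.config.floatX)
y = (X[:, 0] * X[:, 1] > 0).astype(theano.config.floatX)
n_hidden = 5

ann_input = theano.shared(X)
ann_output = theano.shared(y)

with pm.Model() as neural_network:
    # Gaussian priors on the weights of each layer.
    w_in_1 = pm.Normal("w_in_1", 0, sd=1, shape=(X.shape[1], n_hidden))
    w_1_2 = pm.Normal("w_1_2", 0, sd=1, shape=(n_hidden, n_hidden))
    w_2_out = pm.Normal("w_2_out", 0, sd=1, shape=(n_hidden,))

    # ReLU activations where the blog post used pm.math.tanh.
    act_1 = tt.nnet.relu(pm.math.dot(ann_input, w_in_1))
    act_2 = tt.nnet.relu(pm.math.dot(act_1, w_1_2))
    act_out = pm.math.sigmoid(pm.math.dot(act_2, w_2_out))

    # Bernoulli likelihood for binary classification.
    out = pm.Bernoulli("out", act_out, observed=ann_output)

# Fit with ADVI, as in the blog post:
# with neural_network:
#     approx = pm.fit(n=30000, method=pm.ADVI())
```

One practical caveat: ReLU units whose pre-activations start out negative everywhere produce zero gradient and can go "dead", so it can help to initialize the weight random variables with sensible test values (the blog post does this via testval= with random-normal initializations).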
Ahh, thanks!