Is advi_minibatch deprecated?

Hi,
I used your Bayesian Neural Network with Lasagne example. For the function

def run_advi(likelihood, advi_iters=50000):
    # Train on train data
    input_var.set_value(X_train[:500, ...])
    target_var.set_value(y_train[:500, ...])
    
    v_params = pm.variational.advi_minibatch(
        n=advi_iters, minibatch_tensors=minibatch_tensors, 
        minibatch_RVs=[likelihood], minibatches=minibatches, 
        total_size=total_size, learning_rate=1e-2, epsilon=1.0
    )
    trace = pm.variational.sample_vp(v_params, draws=500)
    
    # Predict on test data
    input_var.set_value(X_test)
    target_var.set_value(y_test)
    
    ppc = pm.sample_ppc(trace, samples=100)
    y_pred = mode(ppc['out'], axis=0).mode[0, :]
    
    return v_params, trace, ppc, y_pred

I get the following error:

AttributeError: module 'pymc3.variational' has no attribute 'advi_minibatch'

PyMC3 version: 3.5

Is the method deprecated?
What would I use now?

Please see the up-to-date example at http://docs.pymc.io/notebooks/bayesian_neural_network_advi.html
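
In short, advi_minibatch and sample_vp were removed in favour of the OPVI interface (pm.ADVI / pm.fit). A minimal sketch of the replacement, reusing the names from your snippet (you may also need to pass total_size to the observed RV so the minibatch likelihood is scaled correctly):

minibatch_x = pm.Minibatch(X_train, batch_size=500)
minibatch_y = pm.Minibatch(y_train, batch_size=500)

with neural_network:
    inference = pm.ADVI()
    approx = pm.fit(n=50000, method=inference,
                    more_replacements={input_var: minibatch_x,
                                       target_var: minibatch_y})

# Sample from the fitted approximation instead of sample_vp
trace = approx.sample(draws=500)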

Thanks for your reply.

I tried:

minibatch_x = pm.Minibatch(X_train, batch_size=500, dtype='float64')
minibatch_y = pm.Minibatch(y_train, batch_size=500, dtype='float64')

with neural_network:
    inference = pm.ADVI()
    approx = pm.fit(400, more_replacements={input_var: minibatch_x, target_var: minibatch_y}, method=inference)

Even though I can now fit the model, when I try to sample I get:

ValueError: rng_mrg cpu-implementation does not support more than (2**31 -1) samples

I think the dimension of your random variables is too large. What is the shape of your PyMC3 RVs when you define them?
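
For reference, one quick way to inspect them (a sketch; free_RVs and tag.test_value are internals of the Theano-backed PyMC3):

for rv in neural_network.free_RVs:
    # the test value carries the RV's initial value, so its shape is the RV's shape
    print(rv.name, rv.tag.test_value.shape)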

The model I am using is the same as the one in the example:

def build_ann(init, input_var, target_var):
    l_in = lasagne.layers.InputLayer(shape=(None, 1, 28, 28),
                                     input_var=input_var)
    # Add a fully-connected layer of 800 units with a tanh nonlinearity;
    # weights and biases get the PyMC3 priors supplied via `init`:
    with pm.Model() as neural_network:
        n_hid1 = 800
        l_hid1 = lasagne.layers.DenseLayer(
            l_in, num_units=n_hid1,
            nonlinearity=lasagne.nonlinearities.tanh,
            b=init,
            W=init
        )
        n_hid2 = 800
        # Another 800-unit layer:
        l_hid2 = lasagne.layers.DenseLayer(
            l_hid1, num_units=n_hid2,
            nonlinearity=lasagne.nonlinearities.tanh,
            b=init,
            W=init
        )
        # Finally, we'll add the fully-connected output layer, of 10 softmax units:
        l_out = lasagne.layers.DenseLayer(
            l_hid2, num_units=10,
            nonlinearity=lasagne.nonlinearities.softmax,
            b=init,
            W=init
        )
        prediction = lasagne.layers.get_output(l_out)
        # 10 discrete output classes -> pymc3 categorical distribution
        out = pm.Categorical('out',
                             p=prediction,
                             observed=target_var)
    return neural_network
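
For completeness, the surrounding setup mirrors the notebook (a sketch; init stands in for the notebook's prior-based weight initializer, and the shared variables are what the set_value calls above act on):

import theano

# Hypothetical wiring, as in the notebook: Theano shared variables
# let the data be swapped out later via set_value
input_var = theano.shared(X_train[:500, ...])
target_var = theano.shared(y_train[:500, ...])
neural_network = build_ann(init, input_var, target_var)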

And I am sampling from it using:

trace = approx.sample(draws=500)
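
As a rough check on whether the 500 draws are what push past the MRG cap, draws times the total parameter count has to stay below 2**31 - 1 (a sketch using the same test-value internals):

import numpy as np

# Total number of scalar parameters across all free RVs
total_params = sum(int(np.prod(rv.tag.test_value.shape))
                   for rv in neural_network.free_RVs)
# approx.sample draws all parameters at once, so the generator is
# asked for roughly draws * total_params random numbers
print(total_params, 500 * total_params < 2**31 - 1)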