Unsure how to select sd depending on criteria using ADVI with mini batch

I’m trying to change the standard deviation used for the output of my Bayesian neural net based on the cluster that the input belongs to, using ADVI with minibatches. However, I’m a little puzzled about how to implement this.

I know which cluster each input belongs to, and I want to select a specific sd for the output based on that cluster, so that I get different uncertainties depending on the cluster.

Currently my code looks something like this:

with pm.Model() as nn:
    #weights and activation functions here
    sds = []
    for i in range(n_clusters):
        sds.append(pm.HalfNormal('sd' + str(i), sd=1, shape=(1, 2)))
    sds = np.array(sds)
    labels = input_var[:, 2:]
    output = pm.Normal('out', mu=act_out, sd=?, observed=target_var, total_size=10000, shape=(1,2))

This is where I’m stuck. I tried writing a mapping function that maps each label to its sd and returns an array of the corresponding standard deviations, but that doesn’t work: you can’t pass a numpy array of random variables as sd, and since I don’t know the index ahead of time, I can’t pinpoint which variable it should use.

Any help is greatly appreciated!

If you know the true labels, you could use theano.tensor.stack instead of creating a numpy array of sd objects. Then you can index into the stacked tensor with the inverse indices that np.unique returns for the labels:

sds = theano.tensor.stack(sds, axis=0)
ulabels, inds = np.unique(labels, return_inverse=True, axis=0)
output = pm.Normal('out', mu=act_out, sd=sds[inds], ...)
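To illustrate just the indexing trick, here is a minimal plain-NumPy sketch. The one-hot labels are made up, and the floats stand in for the stacked HalfNormal variables; one caveat worth knowing is that np.unique sorts the unique rows, so the sd array has to be ordered to match ulabels, not necessarily your original cluster order.

```python
import numpy as np

# Made-up one-hot cluster labels for 8 inputs (as in labels = input_var[:, 2:]).
labels = np.array([[1, 0], [0, 1], [1, 0], [0, 1],
                   [0, 1], [1, 0], [1, 0], [0, 1]])

# Stand-ins for the per-cluster sds; in the model these would be the stacked
# HalfNormal variables.  Ordered to match the *sorted* unique rows:
# ulabels[0] = [0, 1] -> 0.5, ulabels[1] = [1, 0] -> 2.0.
sds = np.array([0.5, 2.0])

# return_inverse gives, for each row of labels, the index of its unique row.
ulabels, inds = np.unique(labels, return_inverse=True, axis=0)
inds = np.asarray(inds).ravel()  # guard against inverse-shape differences across NumPy versions

per_input_sd = sds[inds]  # one sd per input, selected by its cluster
```

In the model, the same sds[inds] expression works because indexing a stacked theano tensor with an integer numpy array produces a symbolic tensor of per-observation sds.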