Custom update is automatically replaced by ADAM update?

So I’ve (locally) added an implementation of Noisy Natural Gradient in pm.variational.updates which I’ve termed adanoise():

def adanoise(loss_or_grads=None, params=None, datasize=None, learning_rate=0.002, beta1=0.9,
             beta2=0.999, epsilon=1e-8, kl_lambda=1., gammax=0.1, gammah=0.1):
    """Noisy Adam updates implemented as in [Zhang2018 - arXiv:1712.02390v2],
    equations (7) and (8).
...

However, when I attempt to use this via

advi = pm.ADVI(beta=1e-4, start=start_dict)
tracker = pm.callbacks.Tracker(
    mean=advi.approx.mean.eval,  # callable that returns mean
    std=advi.approx.std.eval  # callable that returns std
)
advi_fit = advi.fit(N_FIT, callbacks=[tracker],
                    obj_optimizer=pm.variational.updates.adanoise(datasize=X_train.shape[0],
                                                                  kl_lambda=1e-4,
                                                                  learning_rate=2e-3),
                    obj_n_mc=1)

somehow the step function being called is still adam() (?!)

--> 257                 more_obj_params))
    258         if self.op.returns_loss:
    259             updates.loss = obj_target

TypeError: adam() got an unexpected keyword argument 'datasize'

What exactly is going on here?
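For context, the optimizers in pm.variational.updates are meant to support being called with hyperparameters only, in which case they return a partially applied version that fit() later calls with the actual loss and parameters; that is what the obj_optimizer=... call above relies on. A minimal standalone sketch of that convention (stand-in names and values, not the real PyMC3 internals):

from functools import partial

def _get_call_kwargs(call_locals):
    # stand-in for pm.variational.updates._get_call_kwargs:
    # keep the hyperparameters, drop the (empty) loss/params slots
    return {k: v for k, v in call_locals.items()
            if k not in ("loss_or_grads", "params")}

def adanoise(loss_or_grads=None, params=None, datasize=None, learning_rate=0.002):
    if loss_or_grads is None and params is None:
        # called with hyperparameters only: hand back a curried version
        return partial(adanoise, **_get_call_kwargs(locals()))
    # ... the real implementation would build and return the update dict here
    return {"datasize": datasize, "learning_rate": learning_rate}

opt = adanoise(datasize=50000, learning_rate=2e-3)  # returns a partial, no updates yet
updates = opt("dummy_loss", "dummy_params")         # roughly what fit() does with obj_optimizer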

Paging @ferrine for some help

Hi, I’ll look into it today or tomorrow

From an initial glance I can’t find where Adam comes from. Is it possible to see the sources?

Ah you’re right. This was my fault. I was focused on the body of the algorithm, but the first line is

if loss_or_grads is None and params is None:
    return partial(adam, **_get_call_kwargs(locals()))

instead of

if loss_or_grads is None and params is None:
    return partial(adanoise, **_get_call_kwargs(locals()))

And – of course – because I was looking for adam\(, I never found this line with grep.
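For anyone hitting the same thing: with that typo, the guard bound all of adanoise’s keyword arguments (including datasize) onto adam, and the resulting partial blew up as soon as fit() called it with the loss and parameters. In miniature (stand-in code and values, not the actual PyMC3 source):

from functools import partial

def adam(loss_or_grads=None, params=None, learning_rate=0.002):
    return {"learning_rate": learning_rate}

# what the buggy guard effectively returned:
opt = partial(adam, datasize=50000, kl_lambda=1e-4, learning_rate=2e-3)

# ... which fails as soon as fit() calls it:
opt("dummy_loss", "dummy_params")
# TypeError: adam() got an unexpected keyword argument 'datasize'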