# Question on Aesara tutorials

Hello all, I'm still trying to wrap my head around these Ops, but looking at the example provided in the Aesara docs, I have a few questions that hopefully someone can answer:

In the following example, it seems like `x` is the only variable, so I thought `grad()` would return `[a * output_grads[0]]` and not `[a * output_grads[0] + b]`.

Also, if I wanted `a`, `x`, and `b` all to be variables, what would `grad()` return?

```python
import aesara
from aesara.graph.op import Op
from aesara.graph.basic import Apply


class AXPBOp(Op):
    """
    This creates an Op that takes x to a*x+b.
    """

    __props__ = ("a", "b")

    def __init__(self, a, b):
        self.a = a
        self.b = b
        super().__init__()

    def make_node(self, x):
        x = aesara.tensor.as_tensor_variable(x)
        return Apply(self, [x], [x.type()])

    def perform(self, node, inputs, output_storage):
        x = inputs[0]
        z = output_storage[0]
        z[0] = self.a * x + self.b

    def infer_shape(self, fgraph, node, i0_shapes):
        return i0_shapes

    def grad(self, inputs, output_grads):
        return [a * output_grads[0] + b]
```

Note the missing `self` in your original call. Also, the constant `b` is dropped when you take the derivative.
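You can sanity-check that the constant `b` really drops out of the gradient with a quick finite-difference check in plain Python (a hedged sketch, no Aesara needed; the function and helper names here are just for illustration):

```python
# Finite-difference sanity check: for f(x) = a*x + b, df/dx = a,
# so the constant b does not appear in the gradient.
def f(x, a=3.0, b=5.0):
    return a * x + b

def numeric_grad(f, x, eps=1e-6):
    # Central difference approximation of df/dx at x.
    return (f(x + eps) - f(x - eps)) / (2 * eps)

g = numeric_grad(f, 2.0)
print(g)  # approximately 3.0, i.e. a; b has vanished
```

Changing `b` leaves the result unchanged, which is exactly why it does not belong in the `grad` return value.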

If the input order is (x, a, b), I think you are correct.
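For the three-variable case, the partial derivatives of f = a*x + b are a, x, and 1, so with input order (x, a, b) the grad would return one gradient per input. A quick numeric check of those three partials (again a plain-Python sketch with illustrative helper names):

```python
# For f(x, a, b) = a*x + b the partial derivatives are:
#   df/dx = a,  df/da = x,  df/db = 1
# so with input order (x, a, b) grad would return
#   [a * g, x * g, g]   where g = output_grads[0].
def f(x, a, b):
    return a * x + b

def partial(f, args, i, eps=1e-6):
    # Central difference in the i-th argument.
    hi, lo = list(args), list(args)
    hi[i] += eps
    lo[i] -= eps
    return (f(*hi) - f(*lo)) / (2 * eps)

x, a, b = 2.0, 3.0, 5.0
grads = [partial(f, (x, a, b), i) for i in range(3)]
print(grads)  # approximately [3.0, 2.0, 1.0], i.e. [a, x, 1]
```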


Oh, I only realized now that your example is in the docs! Do you mind opening an issue in our repo? [aesara-devs/aesara](https://github.com/aesara-devs/aesara)

@ricardoV94

Thank you for the response!

That clears things up.

I’ll open up an issue on the aesara repo.

That helped me, too. Thank you!