You cannot sample variables (transformed or not) directly using the transformed hyperparameters, because the forward graph (for prior and prior predictive sampling) is defined on the original space. The usual strategy is to transform the hyperparameters back to the original space, run the forward sample, and apply additional transformations if needed. The good news is that you don't need to do all of this manually in Python: thanks to the graph nature of Aesara/Theano, you can replace the inputs of the forward graph (in the original space) with clones built from the transformed values.
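For intuition, here is a minimal toy sketch (not the notebook code, just the same idea): the forward graph is written in terms of a positive-space `sigma`, and we swap it for `exp(log_sigma)` so the compiled function takes the unconstrained value.

```python
import aesara
import aesara.tensor as at

# Toy forward graph defined on the original (positive) space
sigma = at.scalar("sigma")
x = 2.0 * sigma + 1.0

# Unconstrained (log-space) input, mapped back to the original space
log_sigma = at.scalar("log_sigma")
x_clone = aesara.clone_replace(x, {sigma: at.exp(log_sigma)})

# The compiled function now takes the transformed value directly
f = aesara.function([log_sigma], x_clone)
f(0.0)  # == 2.0 * exp(0.0) + 1.0 == 3.0
```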
If you look at Planet_Sakaar_Data_Science/discourse_8528 (MUSE).ipynb at main · junpenglao/Planet_Sakaar_Data_Science · GitHub, I did the following:
```python
import aesara

# Flatten and concat θ into 1-D tensors
(replace_theta,
 flatten_theta,
 init_theta,
 ) = create_flatten_replace_var(theta_val, name='theta')

# Map each variable back to the original space: if a transform is attached,
# apply its backward transform to the replacement variable, otherwise use
# the replacement variable directly.
replace_theta_org = {}
for org_var, input_var, replace_var in zip(theta, input_theta, replace_theta):
    if hasattr(input_var.tag, "transform"):
        replace_theta_org[org_var] = input_var.tag.transform.backward(
            replace_var, *org_var.owner.inputs)
    else:
        replace_theta_org[org_var] = replace_var

# Clone the forward graph of x with the inputs swapped, and compile a
# function to sample x conditioned on the flattened transformed θ
x_clone = aesara.clone_replace(x, replace_theta_org)
sample_x: callable = aesara.function([flatten_theta], x_clone)
```
In this case, you have [θ, z] → x from the PyMC model and you want [θ_transformed, z_transformed] → x. replace_theta_org is a Python dict mapping the original-space variables to their counterparts rebuilt from the transformed inputs, roughly {θ: backward(θ_transformed), z: backward(z_transformed)}; after cloning and compiling, you get the new function [θ_transformed, z_transformed] → x_clone.
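From there, sampling x at a point in the transformed space is just a function call. A hedged usage sketch, assuming `init_theta` returned by `create_flatten_replace_var` is the flattened θ in transformed space as a 1-D numpy array:

```python
import numpy as np

# Assumption: init_theta is a 1-D numpy array of the flattened transformed θ
x_draw = sample_x(init_theta)

# A draw conditioned on a perturbed point in the unconstrained space
rng = np.random.default_rng(0)
x_draw_perturbed = sample_x(init_theta + 0.1 * rng.standard_normal(init_theta.shape))
```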