@ricardoV94 That seems easy to implement; I'll try it out. Do you maybe have any other suggestions, @junpenglao?
```python
step1 = pm.Slice([post_from_regresion_step1])  # custom stochastic variable
step2 = pm.NUTS([sigma_likelihood, mu])        # free variables
trace = pm.sample(draws=1 * 10**4, step=[step1, step2])
```
From the PyMC docs for Compound Steps:

> The concern with mixing discrete and continuous sampling is that the change in discrete parameters will affect the continuous distribution's geometry so that the adaptation (i.e., the tuned mass matrix and step size) may be inappropriate for the Hamiltonian Monte Carlo sampling. HMC/NUTS is hypersensitive to its tuning parameters (mass matrix and step size). Another issue is that we also don't know how many iterations we have to run to get a decent sample when the discrete parameters change. Though it hasn't been fully evaluated, it seems that if the discrete parameter is in low dimensions (e.g., 2-class mixture models, outlier detection with explicit discrete labeling), the mixing of discrete sampling with HMC/NUTS works OK. However, it is much less efficient than marginalizing out the discrete parameters.