Custom sampler getting overridden by CompoundStep for discrete parameters

Hi,

I’m implementing a custom sampler to perform RJMCMC.
In the sampler’s current design, it steps across both discrete and continuous parameters.
However, when calling


step = RJMCMC(delta_parameters, configurations_to_subspaces, jumps)
trace = pm.sample(ndraws, tune=nburn, step=step, discard_tuned_samples=True, cores=1)

it seems to instantiate BinaryGibbsMetropolis on these discrete parameters, despite them being included in RJMCMC.vars:

Only 10 samples in chain.
Sequential sampling (2 chains in 1 job)
CompoundStep
>RJMCMC: [k_1_0, k_0_1, delta_1_0, k_1_0, delta_0_1, k_0_1]
>BinaryGibbsMetropolis: [delta_0_1, delta_1_0]

Is there something special I need to set somewhere to prevent this from happening?

src for RJMCMC stepper:


class RJMCMC:
    """
    Largely based on the structure of CompoundStep
    """
    def __init__(self, delta_variables, configurations_to_subspaces, jumps):
        """
        Arguments:
            - delta_variables: [delta_1, ..., delta_m] ordered collection of marker variables
            - configurations_to_subspaces: {(d_1, ..., d_m): {theta_1, ..., theta_n}} mapping from configuration tuples to model variables
                                            ex: (1, 0) -> delta_1 = 1, delta_2 = 0 (the order is as given in the delta_variables parameter)
            - jumps: {(d_1, ..., d_m): {(d'_1, ..., d'_m): step_function}} nested dictionary referring to the step functions that map between the configuration subspaces

        """
        self.delta_variables = delta_variables
        self.configurations_to_subspaces = configurations_to_subspaces
        self.jumps = jumps

        # Create the intra-subspace stepper functions
        self.intraspace_steppers = {config: pm.NUTS(list(subspace)) for config, subspace in configurations_to_subspaces.items()}

        # We need a reference to all the steppers in the collections
        self.methods = [x for y in self.jumps.values() for x in y.values()] + list(self.intraspace_steppers.values())

        # Determine if we generate states (from CompoundStep)
        self.generates_stats = any(method.generates_stats for method in self.methods)
        self.stats_dtypes = []
        for method in self.methods:
            if method.generates_stats:
                self.stats_dtypes.extend(method.stats_dtypes)

    def step(self, point):
        # Randomly pick a move type: inter-subspace (jump) or intra-subspace.
        # Since spending a bit more time in each model is probably better than
        # zig-zagging, bias towards staying in the current subspace.
        jumping_probability = 0.1

        # Read the current configuration off the point
        current_config = tuple(int(point[str(x)]) for x in self.delta_variables)

        if np.random.random() < jumping_probability:
            # Jump: randomly select a new subspace to move to
            next_space = random.choice(list(self.jumps[current_config].keys()))
            method = self.jumps[current_config][next_space]
        else:
            # Stay in the current subspace
            method = self.intraspace_steppers[current_config]

        if self.generates_stats:
            if method.generates_stats:
                point, stats = method.step(point)
            else:
                # This method produces no stats; return an empty list so the
                # caller still receives a (point, stats) pair
                point = method.step(point)
                stats = []
            return point, stats
        else:
            return method.step(point)

    def warnings(self):
        """From CompoundStep"""
        warns = []
        for method in self.methods:
            if hasattr(method, "warnings"):
                warns.extend(method.warnings())
        return warns

    def stop_tuning(self):
        """From CompoundStep"""
        for method in self.methods:
            method.stop_tuning()

    def reset_tuning(self):
        """From CompoundStep"""
        for method in self.methods:
            if hasattr(method, "reset_tuning"):
                method.reset_tuning()

    @property
    def vars(self):
        # TODO check if this needs to be properly ordered or something for some sort of guarantee
        return list({var for method in self.methods for var in method.vars})
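For reference, the configuration keys used by step() are tuples of the current delta values, read straight off the point dict. A standalone illustration (the variable names are made-up stand-ins for the model's deltas):

```python
# Hypothetical point dict as pm.sample would pass it to step()
point = {"delta_1_0": 1, "delta_0_1": 0, "k_1_0": 0.3}
delta_names = ["delta_1_0", "delta_0_1"]

# Same lookup as in step(): the tuple of delta values is the key into
# configurations_to_subspaces / jumps
current_config = tuple(int(point[name]) for name in delta_names)
print(current_config)  # (1, 0) -> delta_1_0 = 1, delta_0_1 = 0
```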

I believe you need to give your sampler object an attribute .vars that stores the variables it’s assigned to. Try adding self.vars = delta_variables in the __init__ method definition.
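For what it’s worth, the assignment logic is roughly: any free (value) variable not covered by the union of the supplied steps’ .vars gets a default step method (e.g. BinaryGibbsMetropolis for binary variables), which is how the extra sampler appears. A simplified sketch of that selection, not the actual PyMC source (names are illustrative):

```python
def uncovered_vars(model_value_vars, steps):
    """Return the value variables that no user-supplied step claims.

    PyMC assigns default step methods to exactly these leftovers.
    """
    covered = set()
    for step in steps:
        covered.update(getattr(step, "vars", []))
    return [v for v in model_value_vars if v not in covered]


class FakeStep:
    def __init__(self, vars):
        self.vars = vars


# If the custom step only claims the continuous k's, the deltas are
# left over and receive a default sampler.
leftovers = uncovered_vars(
    ["k_0_1", "k_1_0", "delta_0_1", "delta_1_0"],
    [FakeStep(["k_0_1", "k_1_0"])],
)
print(leftovers)  # ['delta_0_1', 'delta_1_0']
```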


But I do have the vars @property declared, and it returns [k_1_0_interval__, delta_0_1, k_0_1, k_1_0, k_0_1_interval__, delta_1_0]. I’m not sure why the transformed RVs are in there either… but the discrete parameters delta_1_0 and delta_0_1 are both included.

I tried replacing the property with an assignment in __init__, but the problem persists.

Ah, I found my mistake: I was putting the RVs in self.vars instead of the value variables (value_vars).