So, to properly answer your question: the problem is the name assigned to the LKJ prior. When you call model.initial_point(), you get the internal (transformed) name:
init_pts = model.initial_point()
init_pts
Out[39]: {'chol_cholesky-cov-packed__': array([0., 0., 0.]), 'mu': array([0., 0.])}
Although the documentation states that the names of the transformed variables should be used, it actually works with the name of the assigned variable:
init_pts = {'chol': np.array([1, 1, 1]), 'mu': np.array([1, 1])}  # note: I use 1 as the initial value; with 0 it may break
with model:
    idata = pm.sample(
        initvals=init_pts,
        idata_kwargs={"dims": {"chol_stds": ["axis"],
                               "chol_corr": ["axis", "axis_bis"]}},
    )
Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (4 chains in 4 jobs)
NUTS: [chol, mu]
|████████████| 100.00% [8000/8000 00:54<00:00 Sampling 4 chains, 0 divergences]
Sampling 4 chains for 1_000 tune and 1_000 draw iterations (4_000 + 4_000 draws total) took 74 seconds.