That is correct, and it is also the main reason we have so many shape-related headaches right now.
It will be correct; consider a subgraph in a Bayesian graph (i.e., a Markov blanket):
… → A ~ distribution_1 → B ~ distribution_3(A, C) → …
                         ↑
    C ~ distribution_2 ──┘
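For concreteness, the forward (sampling) pass of this subgraph might look like the following minimal sketch; the concrete distributions (Normal, HalfNormal) and the torch.distributions API are just illustrative assumptions:

```python
# Forward (sampling) pass of the subgraph above. A minimal sketch:
# the distribution choices (Normal, HalfNormal) and the
# torch.distributions API are illustrative assumptions.
import torch.distributions as dist

def forward_sample():
    A = dist.Normal(0.0, 1.0).sample()   # A ~ distribution_1
    C = dist.HalfNormal(1.0).sample()    # C ~ distribution_2
    # distribution_3 is parameterized by the *sampled* values of A and C:
    B = dist.Normal(A, C).sample()       # B ~ distribution_3(A, C)
    return A, B, C
```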
For the log_prob computation, you substitute the input value of A everywhere it appears in the graph, so that:
… + distribution_1.log_prob(input_A) + distribution_2.log_prob(input_C) + distribution_3(input_A, input_C).log_prob(input_B) + …
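As a runnable sketch of that sum (again assuming torch.distributions and the same illustrative distributions as in the sampling sketch above):

```python
# The same subgraph scored at user-supplied values: every occurrence of
# A (and C) is replaced by the input, both where it is scored and where
# it parameterizes distribution_3.
import torch
import torch.distributions as dist

def joint_log_prob(input_A, input_C, input_B):
    lp = dist.Normal(0.0, 1.0).log_prob(input_A)            # distribution_1
    lp += dist.HalfNormal(1.0).log_prob(input_C)            # distribution_2
    lp += dist.Normal(input_A, input_C).log_prob(input_B)   # distribution_3(A, C)
    return lp

joint_log_prob(torch.tensor(0.3), torch.tensor(1.2), torch.tensor(-0.5))
```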
As you can see, A is replaced by the user/programmatic input both as a value and inside the parameterization of distribution_3; so as long as the inputs are transformed into the constrained spaces defined by distribution_1 and distribution_2, the computation is valid. Note how this differs from the forward computation: there, distribution_1 does not depend on the value of A, whereas in the inverse (log_prob) computation it does.
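To illustrate the "transformed into the constrained space" point: if the caller supplies unconstrained reals, they must be mapped into each distribution's support before any log_prob is evaluated. The exp transform (for the positive support in this example) and the function name below are illustrative assumptions, building on the joint_log_prob sketch above:

```python
# Map unconstrained inputs into each distribution's support before
# scoring; the exp transform and the helper name are illustrative.
import torch

def log_prob_from_unconstrained(z_A, z_C, input_B):
    input_A = z_A              # distribution_1's support here is all reals
    input_C = torch.exp(z_C)   # distribution_2's support here is (0, inf)
    lp = joint_log_prob(input_A, input_C, input_B)  # from the sketch above
    # If z_C is the free variable, include the log|det Jacobian| of the
    # exp transform so the density is consistent in z-space:
    return lp + z_C            # log|d exp(z)/dz| = z
```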