No, it is not correct. As you suspected, the logp is not connected to the graph, which means it is not being evaluated during sampling/inference — right now you are sampling from the prior. You can use a Potential to add the logp to the model:
with pm.Model() as model:
    ...
    pm.Potential('logp', logp(A, B, C, D, E))
It is allowed, as pymc3 will sample with a Metropolis-within-Gibbs sampler (i.e., a compound step), but you should just call sample() without specifying a step method, as Metropolis is actually not suitable for DiscreteUniform (it would generate invalid proposals that go out of bounds). For more information, see http://docs.pymc.io/notebooks/sampling_compound_step.html, especially the last paragraph.
Lastly, I think you can rewrite your logp in Theano: Theano has random number generators, and the histogram function could be rewritten in Theano as well.