The first approach produces a Python list (which is probably not what you want). I’m not sure exactly what the second does, but the Python list ([1]) probably doesn’t play well with the tensor. @ricardoV94 would know more, but you might pack the constant value into a pm.ConstantData object? See here and here. Just a guess.
I think the issue isn’t with concatenating; it’s the flat prior. Can you tell me what error you get? In practice, a very wide normal or a uniform prior typically works just as well, and outside of some unusual circumstances it’s generally a good idea to avoid flat priors entirely.
I believe your code works well for concatenating with other priors. For reference, here’s an example colab and gist showing the syntax for doing the concatenation in 1 and 2 dimensions.