Data before entering the activation function

I was studying the BNN example in the documentation.

I am not sure if I missed something… the docs suggest normalizing the input data with `skl.scalar`, but before the data enters act_1, the pre-activation values (the dot product with weights_in_1) can still fall outside the functional range of x. Is it not suggested to transform the data again before it enters tanh(x)?

I am not sure I understand what you mean by that.

As long as the final output is within the range of the observations, your operation is valid (in terms of mapping the input to the output). Intermediate outputs do not need to share the same support (range of values) as either the input or the output.
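To make that concrete, here is a minimal sketch (plain Python, illustrative numbers) of the point above: tanh is defined on all of ℝ, so pre-activation values outside [-1, 1] are perfectly valid inputs, and the output is always squashed into (-1, 1) regardless of scale — no second scaling pass is needed before the activation.

```python
import math

# tanh accepts any real-valued pre-activation; its *domain* is unbounded,
# only its *range* is restricted to (-1, 1).
for pre_act in [-10.0, -2.0, 0.0, 2.0, 10.0]:
    out = math.tanh(pre_act)
    assert -1.0 < out < 1.0  # output always lands inside (-1, 1)
    print(f"{pre_act:6.1f} -> {out:+.4f}")
```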

Hm… I just happened to have a look at those values (the dot product with weights_in_1) and found that many of them reach 1 or -1. In that case, I am not sure the training is still meaningful…
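A quick sketch (plain Python, values are illustrative, not from the actual model) of why this concern matters: when pre-activations are large in magnitude, tanh pins near ±1 and its derivative, 1 - tanh(x)², collapses toward zero, so the gradient flowing back through a saturated unit nearly vanishes. Keeping the pre-activations moderate (e.g. via input standardization and reasonable weight priors) is what keeps the units out of this regime.

```python
import math

def tanh_grad(x):
    """Derivative of tanh: 1 - tanh(x)^2."""
    t = math.tanh(x)
    return 1.0 - t * t

# Moderate pre-activation: tanh is far from saturation, gradient is healthy.
print(math.tanh(0.5), tanh_grad(0.5))   # ~0.462, ~0.786

# Large pre-activation: the output pins near 1 and the gradient nearly vanishes.
print(math.tanh(5.0), tanh_grad(5.0))   # ~0.9999, ~0.0002
```

Note that tanh outputs close to ±1 only indicate a problem if the corresponding gradients are effectively zero for most of the data; a few saturated units on some inputs are normal.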