I have trained a model with a linear activation function for the last dense layer, but I have a constraint that forbids negative values: the target is a continuous positive value.

Can I use ReLU as the activation of the output layer? I'm hesitant to try it, since ReLU is generally used as a rectifier in hidden layers. I'm using Keras.
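For concreteness, here is a minimal sketch of what I have in mind (layer sizes, input shape, and the dummy data are placeholders, not my real setup): the only change from my current model would be swapping the linear output activation for ReLU so the prediction can never go below zero.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical regression model: hidden layers as before,
# but the output layer uses ReLU to clamp predictions at >= 0.
model = keras.Sequential([
    layers.Dense(64, activation="relu", input_shape=(10,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="relu"),  # instead of activation="linear"
])

model.compile(optimizer="adam", loss="mse")

# Placeholder data just to show the model trains end to end.
x = np.random.rand(256, 10)
y = np.abs(np.random.rand(256, 1))  # continuous, strictly non-negative target
model.fit(x, y, epochs=2, verbose=0)
```

Is there anything wrong with this approach, or is another output activation better suited for a strictly positive target?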

