Can we use ReLU activation function as the output layer's non-linearity?
01-11-2019

Question
I have trained a model with a linear activation function for the last dense layer, but I have a constraint that forbids negative predictions: the target is a continuous, strictly positive value.

Can I use ReLU as the activation of the output layer? I am hesitant to try it, since ReLU is generally used as a rectifier in the hidden layers. I'm using Keras.
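For concreteness, here is a minimal sketch of what I am considering (the layer sizes, input dimension, and toy data are placeholders, not my real model):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Regression model whose predictions are forced to be non-negative
# by using ReLU on the final Dense layer instead of a linear output.
model = keras.Sequential([
    keras.Input(shape=(10,)),            # 10 input features (placeholder)
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="relu"),  # clips predictions at zero
])
model.compile(optimizer="adam", loss="mse")

# Toy data with strictly positive targets, just to check that it trains.
X = np.random.rand(256, 10).astype("float32")
y = np.abs(np.random.randn(256, 1)).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)

print(model.predict(X[:5]))  # all outputs should be >= 0
```

Part of my hesitation is that if the pre-activation of the output unit drifts negative, the gradient through the final ReLU is zero, so I worry the output could get stuck at 0 during training.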
No correct solution