How does the cost function change with the choice of activation function (ReLU, sigmoid, softmax)?
02-11-2019
I am new to ML and currently taking courses in DL. I am wondering: does the choice of activation function for the last layer (sigmoid, ReLU, or softmax) change the formula used for the cost function?
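To make the question concrete, here is a minimal NumPy sketch of the two output-layer pairings I am asking about (all numbers are arbitrary; the function names are my own, not from any course material):

```python
import numpy as np

def sigmoid(z):
    """Squash a logit into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    """Turn a logit vector into a probability distribution."""
    e = np.exp(z - z.max())  # shift by the max for numerical stability
    return e / e.sum()

# Binary case: sigmoid output, scored with binary cross-entropy.
z = 0.8                   # arbitrary logit
y = 1.0                   # true binary label
p = sigmoid(z)
bce = -(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

# Multi-class case: softmax output, scored with categorical cross-entropy.
logits = np.array([2.0, 1.0, 0.1])  # arbitrary logits for 3 classes
target = np.array([1.0, 0.0, 0.0])  # one-hot true label
probs = softmax(logits)
cce = -np.sum(target * np.log(probs))

print(bce, cce)
```

So the question is whether these are really two different cost formulas, or one formula that just looks different depending on the output activation.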
I would be grateful for any good reply. Have a nice day! :)