Question

Since the default activation for a Conv2D layer is linear if we do not declare one, is it correct to write:

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, LeakyReLU

model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), input_shape=(380, 380, 1)))
model.add(LeakyReLU(alpha=0.01))

In other words, with these lines, is the activation function for the Conv2D layer now effectively LeakyReLU or not?
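(For reference, a minimal sketch, assuming TensorFlow 2.x Keras, of how one could inspect the built model: the Conv2D layer itself keeps the default linear activation, and LeakyReLU sits after it as a separate layer applied to its output.)

import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Conv2D, LeakyReLU

model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), input_shape=(380, 380, 1)))
model.add(LeakyReLU(alpha=0.01))

# Print each layer's activation where one exists; the Conv2D layer
# reports 'linear', while LeakyReLU is a standalone layer of its own.
for layer in model.layers:
    if hasattr(layer, "activation"):
        print(layer.name, layer.activation.__name__)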

Further, I want to know what the best alpha value is. I couldn't find any resources analyzing it.

No correct solution

Licensed under: CC-BY-SA with attribution