Question

I have a question about some basics of CNNs. I have come across various CNN architectures like AlexNet, GoogLeNet and LeNet.

I have read in a lot of places that AlexNet has 3 fully connected (FC) layers with 4096, 4096, and 1000 nodes respectively. The layer containing 1000 nodes is the classification layer, and each neuron represents one class.
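
To make sure I am picturing this correctly, here is a minimal PyTorch sketch of what I understand those 3 FC layers to be. It assumes the usual 6x6x256 feature map coming out of AlexNet's last conv/pool stage and leaves out dropout for brevity:

```python
import torch
import torch.nn as nn

# Sketch of an AlexNet-style fully connected head (dropout omitted).
fc_head = nn.Sequential(
    nn.Flatten(),                  # 6x6x256 feature map -> 9216 features
    nn.Linear(256 * 6 * 6, 4096),  # FC layer 1: 4096 nodes
    nn.ReLU(inplace=True),
    nn.Linear(4096, 4096),         # FC layer 2: 4096 nodes
    nn.ReLU(inplace=True),
    nn.Linear(4096, 1000),         # FC layer 3: 1000 class scores
)

x = torch.randn(1, 256, 6, 6)      # dummy conv feature map
print(fc_head(x).shape)            # torch.Size([1, 1000])
```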

Now I have come across GoogLeNet. I read about its architecture in http://cs231n.stanford.edu/slides/2017/cs231n_2017_lecture9.pdf, which says that GoogLeNet has 0 FC layers. However, you still need a 1000-node layer at the end with a softmax activation for the classification task. So is that final layer not counted as an FC layer here?
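
For contrast, this is how I picture the GoogLeNet-style head described in the slides: global average pooling followed by a single 1000-way linear layer and a softmax. This is only a rough sketch, assuming 1024 feature maps of size 7x7 at the end of the network:

```python
import torch
import torch.nn as nn

# Sketch of a GoogLeNet-style classification head: global average
# pooling collapses each feature map to one number, then a single
# 1000-way linear layer produces the class scores.
gap_head = nn.Sequential(
    nn.AdaptiveAvgPool2d(1),  # global average pooling: 7x7x1024 -> 1x1x1024
    nn.Flatten(),             # -> 1024 features
    nn.Linear(1024, 1000),    # the one remaining 1000-node layer
)

x = torch.randn(1, 1024, 7, 7)          # dummy final feature map
probs = torch.softmax(gap_head(x), dim=1)  # softmax for classification
print(probs.shape)                      # torch.Size([1, 1000])
```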

Also, how many FC layers does LeNet-5 have?

A bit confused. Any help or leads would be greatly appreciated.

