In a CNN, if the previous layer outputs a feature map of size H × W × C, a fully connected layer with n outputs can be viewed as n convolution kernels, each of size H × W × C, producing n output values.
The convolutional layers extract local features; the role of the fully connected layer is to integrate these local features, condensing each feature map into a single value and improving translation invariance.
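As a minimal sketch of this equivalence (PyTorch is assumed here, and the feature map size and output count are made up for illustration), a fully connected layer with n outputs computes the same thing as a convolution whose kernels cover the entire feature map:

```python
import torch
import torch.nn as nn

# Hypothetical sizes: a 7 x 7 feature map with 512 channels, and n = 10 output values.
H, W, C, n = 7, 7, 512, 10
x = torch.randn(1, C, H, W)  # one feature map from the previous layer

# Fully connected layer on the flattened feature map.
fc = nn.Linear(C * H * W, n)
# Convolution with n kernels, each of size H x W x C (each kernel covers the whole map).
conv = nn.Conv2d(C, n, kernel_size=(H, W))

# Copy the FC weights into the conv kernels so both compute the same function.
conv.weight.data = fc.weight.data.view(n, C, H, W)
conv.bias.data = fc.bias.data

out_fc = fc(x.flatten(1))        # shape (1, n)
out_conv = conv(x).flatten(1)    # shape (1, n) after flattening the 1 x 1 spatial output
print(torch.allclose(out_fc, out_conv, atol=1e-4))  # True: the two are equivalent
```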
Here, a picture of a kitten is used to explain why the fully connected layer is needed, which makes the idea quite intuitive.
However, fully connected layers have clear drawbacks. The number of parameters increases sharply, and in most networks the fully connected layers account for the largest share of the parameters. In addition, because the number of neurons in a fully connected layer is fixed during training, the size of the input image is also restricted.
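To make the parameter blow-up concrete, here is a rough count for a VGG-16-style layout, where the last convolutional stage outputs a 7 × 7 × 512 feature map that is fully connected to 4096 neurons (the exact sizes are only an illustration):

```python
# Rough parameter count for the first fully connected layer of a VGG-16-style network:
# a 7 x 7 x 512 feature map fully connected to 4096 neurons.
h, w, c, n = 7, 7, 512, 4096
fc_params = h * w * c * n + n      # weights + biases
print(f"{fc_params:,}")            # 102,764,544 -- over 100M parameters in a single layer
```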
A 1 × 1 convolution, as the name implies, uses kernels of size 1 × 1. Each kernel has as many channels as the input feature map, but the number of kernels can be chosen freely, which makes it possible to reduce or increase the channel dimension.
The number of channels in each kernel equals the number of input feature channels; the number of kernels determines the number of output channels.
A fully connected layer, by contrast, destroys the original spatial structure of the features, because it flattens them before further processing.
After a 1 × 1 convolution, the spatial resolution of the feature map is unchanged; only the number of channels changes.
The advantages of 1 × 1 convolution are:
- It does not change the spatial size of the feature map, and it places no restriction on the input size.
- It greatly reduces the number of parameters: the channel dimension can be reduced first and then expanded, achieving the same effect as larger convolutions while cutting the amount of computation (see the sketch below).
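A minimal sketch of both points (again assuming PyTorch, with made-up channel counts): a 1 × 1 convolution keeps the spatial resolution while changing the channel count, and a reduce-then-expand bottleneck uses far fewer parameters than a direct 3 × 3 convolution of the same width:

```python
import torch
import torch.nn as nn

x = torch.randn(1, 256, 28, 28)                 # 28 x 28 feature map with 256 channels

# 1 x 1 convolution: spatial size unchanged, channels 256 -> 64.
reduce = nn.Conv2d(256, 64, kernel_size=1)
print(reduce(x).shape)                          # torch.Size([1, 64, 28, 28])

def n_params(m):
    return sum(p.numel() for p in m.parameters())

# Direct 3 x 3 convolution at full width: 256 -> 256.
direct = nn.Conv2d(256, 256, kernel_size=3, padding=1)

# Bottleneck: reduce with 1 x 1, apply 3 x 3 at low width, expand back with 1 x 1.
bottleneck = nn.Sequential(
    nn.Conv2d(256, 64, kernel_size=1),          # reduce 256 -> 64
    nn.Conv2d(64, 64, kernel_size=3, padding=1),
    nn.Conv2d(64, 256, kernel_size=1),          # expand 64 -> 256
)

print(n_params(direct))                         # 590,080
print(n_params(bottleneck))                     # 70,016 -- roughly 8x fewer parameters
```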
Global average pooling averages the pixel values of each channel of the feature map and uses this single value to represent the corresponding feature map.
The resulting vector is then fed directly into the softmax layer (illustrated in the sketch after the list below).
Its advantages are:
- It reduces the number of parameters and can replace fully connected layers.
- It reduces over-fitting.
- It gives each channel a direct, concrete category meaning.
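A minimal sketch of this idea (PyTorch assumed; the channel and class counts are made up): a 1 × 1 convolution maps the features to one channel per class, global average pooling collapses each channel to a single value, and the result goes straight into softmax:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

num_classes = 10
x = torch.randn(1, 512, 7, 7)                       # final feature map

# Map features to one channel per class with a 1 x 1 convolution,
# then collapse each channel to a single value with global average pooling.
to_classes = nn.Conv2d(512, num_classes, kernel_size=1)
feat = to_classes(x)                                # (1, 10, 7, 7)
gap = feat.mean(dim=(2, 3))                         # (1, 10): one value per channel/class

probs = F.softmax(gap, dim=1)                       # class probabilities, no FC layer needed
print(gap.shape, probs.sum().item())                # torch.Size([1, 10]), probabilities sum to ~1
```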