Dense (Fully Connected) Layers Explained
Dense layers are called ‘fully connected’ because each neuron in a dense layer is connected to every neuron in the previous layer. This is in contrast to convolutional layers, where each neuron is connected only to a small region of the input. You can learn more about convolutional layers in my previous article on the topic.
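As a rough sketch of what ‘fully connected’ means in practice, a dense layer is just a matrix multiplication plus a bias: every output neuron has one weight for every input value. The snippet below uses plain NumPy (not any particular deep-learning framework), and the layer sizes are purely illustrative:

```python
# Minimal sketch of a dense layer: every output neuron is a weighted sum
# of ALL inputs, plus a bias, followed by an activation.
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_neurons = 128, 64                    # illustrative sizes
x = rng.standard_normal(n_inputs)                # input vector from the previous layer
W = rng.standard_normal((n_neurons, n_inputs))   # one weight per (neuron, input) pair
b = np.zeros(n_neurons)                          # one bias per neuron

z = W @ x + b                                    # each neuron sees every input -> "fully connected"
a = np.maximum(z, 0)                             # ReLU activation, just as an example
print(a.shape)                                   # (64,)
```

Note that a single dense layer with 128 inputs and 64 neurons already has 128 × 64 weights plus 64 biases, which is why flattening a large feature map before a dense layer can add many parameters.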
But before we proceed, we need to understand the concept of ‘flattening’ or ‘reshaping’.
In the context of CNNs, flattening (or reshaping) is the process of converting a multi-dimensional tensor into a one-dimensional array.
You may also want to read my previous article about pooling layers.
Suppose the output of a previous pooling layer is an 11x11x98 tensor. To prepare this data for the dense layer, we need to flatten it into a one-dimensional vector while preserving the total number of elements. In this case, flattening gives 11x11x98 = 11,858 elements, so we obtain a vector of 11,858 values that will be the input to the dense layer.
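As a minimal sketch of that reshaping step in NumPy (the 11x11x98 tensor here is just a zero-filled placeholder standing in for the pooling output):

```python
# Flattening a multi-dimensional tensor into a one-dimensional vector.
import numpy as np

pooled = np.zeros((11, 11, 98))   # placeholder for the pooling layer output
flat = pooled.reshape(-1)         # equivalently: pooled.flatten()

print(pooled.size)                # 11858 elements before flattening
print(flat.shape)                 # (11858,) -- same elements, now a 1-D vector
```

In a framework such as Keras, this is exactly the job of a Flatten layer placed between the last pooling layer and the first Dense layer.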