Fully connected layers
The layers of neurons that make up the ANNs we saw earlier are commonly called densely connected layers, fully connected (FC) layers, or simply linear layers. Some deep learning libraries, such as Caffe, treat them simply as a dot product operation that may or may not be followed by a nonlinearity layer. The main parameter of a fully connected layer is its output size, which is the number of neurons in its output.
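As a quick reminder of what this dot-product view means in practice, here is a minimal sketch of the computation, activation(Wx + b), written out by hand in TensorFlow 1.x (the variable names and sizes here are illustrative, not part of any library API):

import tensorflow as tf

# Illustrative sizes: flattened input of 784 features, 1,024 output neurons.
input_size, output_size = 784, 1024
x = tf.placeholder(tf.float32, shape=[None, input_size])

# A dense layer is just a weight matrix and a bias vector...
W = tf.get_variable("weights", shape=[input_size, output_size])
b = tf.get_variable("bias", shape=[output_size],
                    initializer=tf.zeros_initializer())

logits = tf.matmul(x, W) + b    # ...the dot product operation
activated = tf.nn.relu(logits)  # ...optionally followed by a nonlinearity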
In Chapter 1, Setup and Introduction to TensorFlow, we created our own dense layer, but you can create one more easily using tf.layers, as follows:
dense_layer = tf.layers.dense(inputs=some_input_layer, units=1024, activation=tf.nn.relu)
Here, we define a fully connected layer with 1,024 outputs, followed by a ReLU activation.
It is important to note that the input to this layer must have just two dimensions (batch size and feature size), so if your input is a spatial tensor, for example an image of shape [28, 28, 3], you will have to reshape it into a vector before feeding it in:
reshaped_input_to_dense_layer = tf.reshape(spatial_tensor_in, [-1, 28 * 28 * 3])
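Putting the two calls together, a short end-to-end sketch might look like the following (the placeholder name and image shape are assumptions chosen for illustration):

import tensorflow as tf

# Hypothetical input: a batch of 28x28 RGB images.
spatial_tensor_in = tf.placeholder(tf.float32, shape=[None, 28, 28, 3])

# Flatten each image into a vector so the dense layer sees a 2D tensor;
# -1 lets TensorFlow infer the batch dimension.
reshaped_input_to_dense_layer = tf.reshape(spatial_tensor_in, [-1, 28 * 28 * 3])

# Fully connected layer with 1,024 outputs and a ReLU activation.
dense_layer = tf.layers.dense(inputs=reshaped_input_to_dense_layer,
                              units=1024, activation=tf.nn.relu)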