Neural Network Components
Neural networks are built from connected layers of artificial neurons.
These layers are coupled by weighted connections that are adjusted during learning so as to minimize the network's prediction error, using error-gradient backpropagation, an optimization method published in 1986 by David Rumelhart, Geoffrey Hinton, and Ronald Williams.
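The learning loop described above can be sketched in a few lines. This is a minimal illustration, not the library's actual training code: a single linear neuron `y = w * x` is fitted to one target value by repeatedly following the error gradient downhill (all names and values here are assumptions for the example).

```python
# Minimal sketch of gradient-based learning on one linear neuron y = w * x.
# The weight is adjusted to minimize the squared prediction error.
x, target = 2.0, 6.0     # single training example (assumed values)
w = 0.5                  # initial connection weight
lr = 0.1                 # learning rate

for _ in range(100):
    y = w * x                      # forward pass: weighted connection
    grad = 2 * (y - target) * x    # dLoss/dw for loss (y - target)^2
    w -= lr * grad                 # gradient descent update

# w converges toward target / x = 3.0
```

A full backpropagation pass applies this same chain-rule gradient computation layer by layer, from the loss back to every weight in the network.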
A practical way to build neural networks is to define the following basic building blocks, which can be combined into networks of different structure and arbitrary complexity:
Connection layer
Implement weighted sums over the incoming connections.
- Densely (fully) connected layer
- Convolution layer
- UpConvolution (transposed convolution) layer
- Pooling layer
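The simplest of these, the densely connected layer, can be sketched as follows. The class and method names are assumptions for illustration, not this library's API; the forward pass is just the weighted sum of the inputs plus a bias per output neuron.

```python
import numpy as np

# Hypothetical sketch of a densely connected layer: every input is
# connected to every output neuron by one weight.
class Dense:
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, size=(n_in, n_out))  # connection weights
        self.b = np.zeros(n_out)                           # one bias per neuron

    def forward(self, x):
        return x @ self.W + self.b                         # weighted sums

layer = Dense(4, 3)
out = layer.forward(np.ones(4))   # 4 inputs -> 3 output sums
```

Convolution and pooling layers follow the same pattern but share weights over local input windows instead of connecting everything to everything.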
Function layer
Implement neuron activation functions.
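Two common activation functions, written as plain functions for illustration (the names are assumptions, not this library's API):

```python
import numpy as np

# Function layers apply an elementwise nonlinearity to the weighted sums.
def relu(z):
    return np.maximum(0.0, z)          # rectified linear unit

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))    # squashes values into (0, 1)

z = np.array([-2.0, 0.0, 2.0])
```

Without such nonlinearities, a stack of connection layers would collapse into a single linear map, which is why function layers are kept as a separate building block.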
Loss layer
Implement error loss functions.
- RMS loss (= L2 Norm loss)
- L1 Norm loss
- Cross Entropy loss
- Binary Cross Entropy loss
- Kullback-Leibler loss
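Two of the listed losses, sketched as plain functions (names and the clipping constant are assumptions for the example):

```python
import numpy as np

# L2-norm loss: mean squared difference between prediction and target.
def l2_loss(pred, target):
    return np.mean((pred - target) ** 2)

# Binary cross-entropy: compares predicted probabilities with 0/1 targets.
# Predictions are clipped away from 0 and 1 to keep the logs finite.
def binary_cross_entropy(pred, target, eps=1e-12):
    p = np.clip(pred, eps, 1.0 - eps)
    return -np.mean(target * np.log(p) + (1.0 - target) * np.log(1.0 - p))
```

The loss layer sits at the end of the network: its scalar output is the quantity whose gradient backpropagation pushes back through all preceding layers.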
Complex layer
- Sequential layer
- Inception layer
- Latent and Sample layer
- LSTM layer
- GRU layer
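The sequential layer is the simplest complex layer: it chains sub-layers so that each one's output feeds the next one's input. A minimal sketch, with assumed names and two toy stand-in layers:

```python
import numpy as np

# Hypothetical sequential container: forward() pipes the input
# through each sub-layer in order.
class Sequential:
    def __init__(self, *layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)          # output of one layer is input to the next
        return x

# Toy sub-layers standing in for connection/function layers.
model = Sequential(lambda x: x * 2.0, lambda x: x + 1.0)
result = model.forward(np.array([1.0, 2.0]))   # -> [3.0, 5.0]
```

Inception, LSTM, and GRU layers are built the same way, but wire their sub-layers into branches or recurrent loops rather than a single chain.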
Optimizer
Adjust the network parameters during learning.
- An excellent overview by Sebastian Ruder
- Stochastic gradient descent
- RMSprop
- Adagrad
- Adadelta
- Adam
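Two of the listed optimizers expressed as parameter-update rules, for illustration only (class and hyperparameter names are assumptions; default values follow the common conventions for these methods):

```python
import numpy as np

# Plain stochastic gradient descent: step against the gradient.
def sgd_step(w, grad, lr=0.01):
    return w - lr * grad

# Adam: per-parameter adaptive learning rates from running moment estimates.
class Adam:
    def __init__(self, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
        self.lr, self.b1, self.b2, self.eps = lr, b1, b2, eps
        self.m = 0.0   # running mean of gradients (first moment)
        self.v = 0.0   # running mean of squared gradients (second moment)
        self.t = 0     # step counter for bias correction

    def step(self, w, grad):
        self.t += 1
        self.m = self.b1 * self.m + (1 - self.b1) * grad
        self.v = self.b2 * self.v + (1 - self.b2) * grad ** 2
        m_hat = self.m / (1 - self.b1 ** self.t)   # bias-corrected moments
        v_hat = self.v / (1 - self.b2 ** self.t)
        return w - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```

The adaptive methods (RMSprop, Adagrad, Adadelta, Adam) all refine plain SGD by scaling each parameter's step with statistics of its past gradients; Ruder's overview linked above compares them in detail.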