A multilayer perceptron (MLP) is a class of feedforward artificial neural network.
An MLP consists of at least three layers of nodes: an input layer, one or more hidden layers, and an output layer. Except for the input nodes, each node is a neuron that uses a nonlinear activation function.
MLPs are trained with a supervised learning technique called backpropagation. Their multiple layers and nonlinear activations distinguish an MLP from a linear perceptron: an MLP can distinguish data that is not linearly separable.
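The ideas above can be illustrated with a minimal sketch (not from the original article): a one-hidden-layer MLP with sigmoid activations, trained by backpropagation on XOR, the classic example of a dataset that is not linearly separable. All hyperparameters (hidden size, learning rate, epoch count) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: not linearly separable, so a single-layer perceptron cannot learn it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Input -> hidden and hidden -> output weights (illustrative sizes).
W1 = rng.normal(0.0, 1.0, (2, 4))
b1 = np.zeros((1, 4))
W2 = rng.normal(0.0, 1.0, (4, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for epoch in range(5000):
    # Forward pass through both layers.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output
    losses.append(np.mean((out - y) ** 2))

    # Backpropagation: chain rule, using sigmoid'(z) = s * (1 - s).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

A linear model cannot reduce the XOR error below chance, while the hidden layer's nonlinear activations let the MLP carve a non-linear decision boundary.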
MLPs are sometimes colloquially referred to as "vanilla" artificial neural networks, especially when they have a single hidden layer.