Rectified Linear Unit

Overview#

Rectified Linear Unit (ReLU) applies a threshold operation to each element of the input: any value less than zero is set to zero.

Rectified Linear Unit in Artificial Neural networks#

In the context of Artificial Neural networks, the Rectified Linear Unit is an Activation Function defined as the positive part of its argument:
a = f(x) = max(0,x)
where x is the input to an Artificial Neuron. This is also known as a ramp function and is analogous to half-wave rectification in electrical engineering.
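
For example, the ramp function applied element-wise to a few sample values (a minimal sketch; the NumPy dependency and the sample numbers are illustrative assumptions, not part of the original page):

import numpy as np

x = np.array([-3.0, -1.0, 0.0, 2.0, 5.0])
a = np.maximum(0, x)  # ReLU: negative inputs map to 0, positive inputs pass through unchanged
print(a)              # [0. 0. 0. 2. 5.]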

The Rectified Linear Unit Activation Function was first introduced to a dynamical network by Hahnloser et al. in a 2000 paper in Nature, with strong biological motivations and mathematical justifications.

The Rectified Linear Unit has been used in convolutional networks[3] more effectively than the widely used logistic sigmoid function (which is inspired by probability theory) and its more practical counterpart, the hyperbolic tangent. The rectifier is, as of 2017, the most popular activation function for Deep Neural networks.

An Artificial Neuron employing the rectifier is also called a Rectified Linear Unit.

A smooth approximation to the rectifier is the analytic function f(x) = ln(1 + e^x), known as the softplus function.
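
The following sketch shows how closely the softplus tracks the rectifier on a few points (NumPy and the sample values are assumptions chosen only for illustration):

import numpy as np

x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
rectifier = np.maximum(0, x)      # hard rectifier: max(0, x)
softplus = np.log1p(np.exp(x))    # smooth approximation: ln(1 + e^x)
print(rectifier)  # [0. 0. 0. 1. 4.]
print(softplus)   # approximately [0.018 0.313 0.693 1.313 4.018]

The two curves nearly coincide for inputs far from zero; the softplus simply rounds off the kink at the origin.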

Rectified Linear Unit in Python#

def relu(x, derivative=False):
    """Element-wise ReLU (or its derivative) for a 2-D list of numbers."""
    if derivative:
        # Derivative of max(0, x): 1 for positive inputs, 0 elsewhere
        return [[1.0 if value > 0 else 0.0 for value in row] for row in x]
    # ReLU itself: keep positive values, replace the rest with 0
    return [[value if value > 0 else 0.0 for value in row] for row in x]
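
A quick usage sketch for the function above, assuming a small 2-D list of inputs (values chosen only for illustration):

x = [[-2.0, -0.5, 0.0], [0.5, 1.0, 3.0]]
print(relu(x))                   # [[0.0, 0.0, 0.0], [0.5, 1.0, 3.0]]
print(relu(x, derivative=True))  # [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]]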
