!!! Overview
[{$pagename}] ([ReLU]) performs a threshold operation on each element of its input: any value less than zero is set to zero.

!! [{$pagename}] in [Artificial Neural networks]
In the context of [Artificial Neural networks], the [{$pagename}] is an [Activation Function] defined as the positive part of its argument:
%%prettify 
{{{
a = f(x) = max(0,x)
}}} 
/%
where x is the input to an [Artificial Neuron]. This is also known as a ramp function and is analogous to half-wave rectification in electrical engineering.
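As a quick illustration, the formula maps every negative input to zero and leaves non-negative inputs unchanged. The minimal sketch below uses [NumPy]'s element-wise maximum, which is just one convenient way to evaluate it:
%%prettify 
{{{
import numpy as np

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])

# max(0, x) applied element-wise: negative entries become 0,
# non-negative entries pass through unchanged
print(np.maximum(0, x))
}}} 
/%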

The [{$pagename}] [Activation Function] was first introduced to a dynamical network by Hahnloser et al. in a 2000 paper in Nature, with strong biological motivations and mathematical justifications.

[{$pagename}] has been used in convolutional networks more effectively than the widely used logistic [sigmoid function] (which is inspired by probability theory) and its more practical counterpart, the [hyperbolic tangent]. The rectifier is, as of 2017, the most popular activation function for [Deep Neural networks].

An [Artificial Neuron] employing the rectifier is also called a [{$pagename}].

A smooth approximation to the rectifier is the analytic softplus function, whose derivative is the logistic [sigmoid function]:
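%%prettify 
{{{
f(x) = ln(1 + exp(x))
}}} 
/%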

!! [{$pagename}] in [Python]

%%prettify 
{{{
def relu(x, derivative=False):
    """Apply the rectifier (or its derivative) element-wise to a 2-D list, in place."""
    for i in range(len(x)):
        for k in range(len(x[i])):
            if derivative:
                # Derivative of max(0, x): 1 where x > 0, otherwise 0
                x[i][k] = 1 if x[i][k] > 0 else 0
            else:
                # Forward pass: max(0, x)
                x[i][k] = max(0, x[i][k])
    return x
}}} 
/%
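
A quick usage sketch of the function above (assuming a 2-D list input, as the implementation expects; the row copies are only there to avoid mutating the original in place):
%%prettify 
{{{
scores = [[-1.0, 2.0], [0.5, -3.0]]

# Forward pass: negative entries are clipped to zero
print(relu([row[:] for row in scores]))                   # [[0, 2.0], [0.5, 0]]

# Derivative: 1 where the input is positive, 0 elsewhere
print(relu([row[:] for row in scores], derivative=True))  # [[0, 1], [1, 0]]
}}} 
/%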

!! More Information
There might be more information for this subject on one of the following:
[{ReferringPagesPlugin before='*' after='\n' }]
----
* [#1] - [Rectifier_(neural_networks)|Wikipedia:Rectifier_(neural_networks)|target='_blank'] - based on information obtained 2017-12-10