a = f(x) = max(0,x)
The rectified linear unit (ReLU) activation function was first introduced to a dynamical network by Hahnloser et al. in a 2000 paper in Nature, with strong biological motivations and mathematical justifications.
The rectified linear unit has been used in convolutional networks[3] more effectively than the widely used logistic sigmoid function (which is inspired by probability theory) and its more practical counterpart, the hyperbolic tangent. As of 2017, the rectifier is the most popular activation function for deep neural networks.
An artificial neuron employing the rectifier is also called a rectified linear unit.
A smooth approximation to the rectifier is the analytic function f(x) = ln(1 + e^x), known as the softplus function; its derivative is the logistic sigmoid.
def relu(x, derivative=False):
    # x is a 2-D list (matrix) of values; the operation is applied in place.
    if derivative:
        # Derivative of the rectifier: 1 for positive inputs, 0 otherwise.
        for i in range(len(x)):
            for k in range(len(x[i])):
                x[i][k] = 1 if x[i][k] > 0 else 0
        return x
    # Forward pass: negative values are clipped to 0, positive values pass through unchanged.
    for i in range(len(x)):
        for k in range(len(x[i])):
            if x[i][k] < 0:
                x[i][k] = 0
    return x
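The softplus approximation mentioned above can be sketched in a few lines as well. The following is a minimal illustration, not part of the original implementation; it assumes NumPy rather than the plain nested lists used in the code above.

import numpy as np

def softplus(x, derivative=False):
    # Accepts any array-like input and works elementwise.
    x = np.asarray(x, dtype=float)
    if derivative:
        # The derivative of softplus is the logistic sigmoid.
        return 1.0 / (1.0 + np.exp(-x))
    # log1p(exp(x)) computes ln(1 + e^x), the softplus function itself.
    return np.log1p(np.exp(x))

# Example: softplus closely tracks max(0, x) away from zero.
print(softplus([-2.0, 0.0, 2.0]))        # approx. [0.127, 0.693, 2.127]
print(softplus([-2.0, 0.0, 2.0], True))  # approx. [0.119, 0.5, 0.881]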