Rectified Linear Unit

Description

Rectified Linear Unit (ReLU) is an activation function commonly used in neural networks. It maps negative inputs to zero and leaves positive inputs unchanged.

\[\htmlClass{sdt-0000000051}{\sigma}(\htmlClass{sdt-0000000103}{u}) = \max(0,\htmlClass{sdt-0000000103}{u})\]

Symbols Used:

\( \htmlClass{sdt-0000000051}{\sigma} \)

This symbol represents the activation function. It maps real values to other real values in a non-linear way.

\( u \)

This symbol denotes the input to a model.
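
To make the definition concrete, here is a minimal sketch of ReLU in Python using NumPy. The function name relu and the sample inputs are illustrative choices, not part of the definition above.

import numpy as np

def relu(u):
    # Element-wise max(0, u): negative entries become 0, positive entries pass through unchanged.
    return np.maximum(0, u)

print(relu(np.array([-3.0, 0.0, 5.0])))  # prints [0. 0. 5.]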

Derivation

For positive inputs, we want to leave the value unchanged, while negative inputs should be mapped to 0. Since every positive value is larger than 0 and every negative value is smaller than 0, we can define ReLU using the maximum function:

\[\htmlClass{sdt-0000000051}{\sigma}(\htmlClass{sdt-0000000103}{u}) = \max(0,\htmlClass{sdt-0000000103}{u})\]
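
Equivalently, since \(\max(0, \htmlClass{sdt-0000000103}{u})\) returns \(\htmlClass{sdt-0000000103}{u}\) whenever \(\htmlClass{sdt-0000000103}{u}\) is positive and \(0\) otherwise, ReLU can be written as a piecewise function:

\[\htmlClass{sdt-0000000051}{\sigma}(\htmlClass{sdt-0000000103}{u}) = \begin{cases} \htmlClass{sdt-0000000103}{u} & \text{if } \htmlClass{sdt-0000000103}{u} > 0 \\ 0 & \text{if } \htmlClass{sdt-0000000103}{u} \le 0 \end{cases}\]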

The function is visualized below: it is equal to 0 for all negative inputs and increases linearly with slope 1 for positive inputs.

ReLU Plot
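
A plot like the one above can be reproduced with a few lines of code. The sketch below uses Matplotlib, and the input range \([-5, 5]\) is an arbitrary choice for illustration.

import numpy as np
import matplotlib.pyplot as plt

u = np.linspace(-5, 5, 200)   # sample inputs over an arbitrary range
sigma = np.maximum(0, u)      # ReLU: max(0, u)

plt.plot(u, sigma)
plt.xlabel("u")
plt.ylabel("sigma(u)")
plt.title("ReLU")
plt.show()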

Example

Suppose the input \( \htmlClass{sdt-0000000103}{u} \) has the value \(5\). If we pass \( \htmlClass{sdt-0000000103}{u} \) through the ReLU function, we get:

\(\htmlClass{sdt-0000000051}{\sigma}(5) = \max(0,5) = 5 \).

Now suppose \( \htmlClass{sdt-0000000103}{u} \) is \(-3\). Passing that value through the ReLU function, we get:
\(\htmlClass{sdt-0000000051}{\sigma}(-3) = \max(0,-3) = 0 \).
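
Both cases can be checked with the relu sketch given earlier (assuming that definition is in scope):

print(relu(5))    # 5: positive inputs pass through unchanged
print(relu(-3))   # 0: negative inputs are zeroed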