Rectified Linear Unit


A unit in a neural network that uses the rectifier function as its activation function. The ReLU function outputs its input unchanged when the input is positive, and zero otherwise; equivalently, f(x) = max(0, x). It is widely used in deep learning models because it is cheap to compute and helps networks converge faster by mitigating the vanishing gradient problem: its gradient is 1 for positive inputs, so gradients do not shrink as they pass through active units.
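As a minimal sketch of the function itself (using NumPy purely for illustration; the function name `relu` is our own):

```python
import numpy as np

def relu(x):
    """Rectified linear unit: returns x where x > 0, else 0."""
    return np.maximum(0, x)

# Positive inputs pass through unchanged; negatives (and zero) map to zero.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# [0.  0.  0.  1.5 3. ]
```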