Learning Rate


A scalar hyperparameter used by the gradient descent algorithm during the training phase of an Artificial Neural Network. It determines the step size at each iteration: the gradient is multiplied by the learning rate, which controls how much the model's weights are adjusted in response to the error. Choosing a proper learning rate is crucial for effective training, since it governs both convergence speed and stability — too small a value makes training slow, while too large a value can cause the loss to oscillate or diverge.
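The update rule can be sketched in a few lines. This is a minimal illustration (the function names `gradient` and `train` are hypothetical, not from any library) that minimizes the toy loss f(w) = (w − 3)², where the learning rate scales each step:

```python
def gradient(w):
    # Derivative of the toy loss f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

def train(w0, learning_rate, steps):
    w = w0
    for _ in range(steps):
        # Core gradient descent update: w <- w - learning_rate * gradient
        w = w - learning_rate * gradient(w)
    return w

# A moderate learning rate converges toward the minimum at w = 3
print(train(w0=0.0, learning_rate=0.1, steps=100))
```

With `learning_rate=0.1` the weight converges toward 3; with a rate above 1.0 on this loss, each step overshoots and the iterates diverge, illustrating the stability trade-off described above.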