Gradient Descent

In machine learning, gradient descent is an optimization technique that adjusts a model's parameters by iteratively stepping opposite the gradient of its loss function, the direction of steepest decrease. For example, in linear regression, gradient descent refines the slope and intercept of the best-fit line to minimize prediction error.
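To make the update rule concrete, here is a minimal Python sketch that fits a line's slope and intercept by gradient descent on a mean-squared-error loss. The toy data, learning rate, and step count are illustrative assumptions, not values from any particular library or dataset.

```python
# A minimal sketch of gradient descent for simple linear regression.
# The data, learning rate, and iteration count are illustrative choices.

# Toy data roughly following y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 7.1, 8.8]

m, b = 0.0, 0.0          # slope and intercept, initialized to zero
learning_rate = 0.01
n = len(xs)

for step in range(1000):
    # Prediction errors under the current parameters
    errors = [(m * x + b) - y for x, y in zip(xs, ys)]

    # Gradients of the mean-squared-error loss with respect to m and b
    grad_m = (2 / n) * sum(e * x for e, x in zip(errors, xs))
    grad_b = (2 / n) * sum(errors)

    # Step opposite the gradient: the direction of steepest decrease
    m -= learning_rate * grad_m
    b -= learning_rate * grad_b

print(f"slope = {m:.3f}, intercept = {b:.3f}")
```

Each iteration computes how the loss changes with respect to each parameter and nudges that parameter a small amount in the opposite direction, so the fitted line gradually approaches the least-squares solution.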