Gradient descent

An optimization algorithm used to train models by iteratively adjusting parameters to minimize a loss function. It computes the gradient (slope) of the loss with respect to the parameters and steps the parameters in the opposite direction, scaled by a learning rate, so each update is roughly new_params = old_params − learning_rate × gradient. Stochastic Gradient Descent (SGD) and its variants (e.g. Adam) are standard for training neural networks and other models. In trading model training, gradient descent updates the model weights over repeated passes (epochs) through historical data, typically one batch at a time, seeking to reduce prediction error. For example, backpropagation in a neural network computes these gradients so that gradient descent can minimize the loss between predicted and actual forex prices.
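
Below is a minimal sketch of batch gradient descent on a simple linear model with a mean-squared-error loss, using plain NumPy. The data is synthetic and purely illustrative; the feature matrix X, target y, and the learning rate value are assumptions standing in for whatever lagged-price features and horizon a real trading model would use.

```python
import numpy as np

# Illustrative setup: X holds toy "lagged return" features, y the next-step return.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
true_w = np.array([0.5, -0.2, 0.1])
y = X @ true_w + rng.normal(scale=0.05, size=500)

w = np.zeros(3)          # model parameters (weights), initialized at zero
learning_rate = 0.1      # step size (assumed value for this toy example)

for step in range(200):
    pred = X @ w                        # forward pass: model predictions
    error = pred - y
    loss = np.mean(error ** 2)          # MSE loss
    grad = 2 * X.T @ error / len(y)     # gradient of the loss w.r.t. w
    w -= learning_rate * grad           # step opposite the gradient

print("learned weights:", w)            # converges toward true_w
```

The same update rule drives SGD and Adam; they differ mainly in computing the gradient on small random batches and in how the step size is adapted per parameter.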