Overfitting – When a model or strategy is tuned so tightly to historical data that it fails to generalize to new data. In practice, an overfitted trading robot shows excellent backtest results but poor live or out-of-sample performance: it has effectively “memorized” the noise in the training data. In ML terms, overfitting means the model predicts well on the training set but poorly on unseen data. Mitigating it calls for out-of-sample testing (e.g. cross-validation) to expose the problem, regularization, and keeping model complexity modest relative to the amount of data available.
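A minimal sketch of how this shows up in practice, assuming scikit-learn and NumPy are available (the synthetic data and decision-tree models are illustrative choices, not from the original text): an unconstrained model fits the training data almost perfectly, but cross-validation reveals much weaker out-of-sample performance, while a simpler model generalizes better.

```python
# Illustrative sketch (assumes scikit-learn + NumPy): detecting overfitting by
# comparing in-sample fit with cross-validated performance.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Synthetic "returns": mostly noise with a weak linear signal in one feature.
X = rng.normal(size=(200, 5))
y = 0.1 * X[:, 0] + rng.normal(scale=1.0, size=200)

# High-capacity model relative to the amount of data: memorizes the noise.
deep = DecisionTreeRegressor(max_depth=None, random_state=0).fit(X, y)
print("Deep tree, in-sample R^2:", deep.score(X, y))  # close to 1.0
print("Deep tree, cross-validated R^2:",
      cross_val_score(DecisionTreeRegressor(max_depth=None, random_state=0),
                      X, y, cv=5).mean())             # typically near zero or negative

# Simpler (effectively regularized) model: worse in-sample fit, better generalization.
print("Shallow tree, cross-validated R^2:",
      cross_val_score(DecisionTreeRegressor(max_depth=2, random_state=0),
                      X, y, cv=5).mean())
```

The gap between the in-sample score and the cross-validated score is the practical symptom of overfitting the entry describes; shrinking model capacity (or adding regularization) narrows it.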