A type of neural network designed to process sequential data by maintaining a hidden state that carries information from previous time steps. In an RNN, the hidden state produced at one step feeds into the computation at the next, which makes the architecture well suited to time-series data. Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks are popular RNN variants that mitigate training issues such as vanishing gradients. In forex forecasting, RNNs (especially LSTMs) are widely used to capture temporal dependencies: they can remember past price patterns and use that memory to inform predictions, making them a common choice for sequence modeling in automated trading strategies.
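
As an illustration, the sketch below shows a minimal LSTM-based forecaster in PyTorch that maps a window of past values to the next value. The model name, layer sizes, window length, and the synthetic sine-wave data are assumptions chosen for demonstration, not a reference implementation for price prediction.

```python
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """Minimal LSTM that maps a window of past values to the next value."""
    def __init__(self, input_size=1, hidden_size=32, num_layers=1):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):
        # x: (batch, seq_len, input_size); the LSTM's hidden state carries
        # information from earlier time steps forward through the sequence.
        out, _ = self.lstm(x)
        # Use the hidden state at the final time step to predict the next value.
        return self.head(out[:, -1, :])

# Toy usage: predict the next point of a noisy sine wave (a stand-in for a
# price series) from the previous 20 observations.
torch.manual_seed(0)
t = torch.linspace(0, 20, 400)
series = torch.sin(t) + 0.05 * torch.randn_like(t)

window = 20
X = torch.stack([series[i:i + window] for i in range(len(series) - window)]).unsqueeze(-1)
y = series[window:].unsqueeze(-1)

model = LSTMForecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
```

In practice the same structure applies to currency data: each training sample is a sliding window of past prices (or returns), and the recurrent hidden state is what lets the model use earlier points in the window when predicting the next one.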