
Recurrent Neural Network

A neural network architecture with connections that loop back, allowing it to process sequences and maintain memory of past inputs.

Recurrent Neural Networks (RNNs) process sequences one step at a time, maintaining a hidden state that carries information from previous steps. This made them the default choice for language, speech, and time-series tasks before transformers arrived.
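The one-step-at-a-time update can be sketched in a few lines. This is a minimal vanilla (Elman) RNN cell in NumPy; the weight names, sizes, and random initialization are illustrative assumptions, not a reference implementation:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One vanilla RNN step: the new hidden state mixes the current
    input with the previous hidden state through a tanh nonlinearity."""
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

# Tiny example: a sequence of 5 inputs with 3 features, hidden size 4.
rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 3, 4, 5
W_xh = rng.normal(scale=0.1, size=(input_size, hidden_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)  # initial hidden state
for x_t in rng.normal(size=(seq_len, input_size)):
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)  # h carries context forward
```

Note that each step reuses the same weights and depends on the previous hidden state, which is what gives the network memory but also forces strictly sequential computation.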

Standard RNNs struggle with long sequences because gradients vanish (or explode) as they are backpropagated through many time steps. Variants like LSTM and GRU added gating mechanisms that let the network learn when to preserve or overwrite its state, keeping long-range information intact.
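To make the gating idea concrete, here is a minimal GRU step in NumPy. The parameter names and dictionary layout are assumptions for illustration (bias terms omitted for brevity):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x_t, h_prev, p):
    # Update gate z: how much of the previous state to carry forward.
    z = sigmoid(x_t @ p["W_xz"] + h_prev @ p["W_hz"])
    # Reset gate r: how much history feeds the candidate state.
    r = sigmoid(x_t @ p["W_xr"] + h_prev @ p["W_hr"])
    # Candidate state from the current input and (reset-gated) history.
    h_cand = np.tanh(x_t @ p["W_xh"] + (r * h_prev) @ p["W_hh"])
    # When z is near 1, h_prev passes through almost unchanged,
    # which is what lets information (and gradients) survive long spans.
    return z * h_prev + (1 - z) * h_cand

# Illustrative random parameters: input size 3, hidden size 4.
rng = np.random.default_rng(1)
d_in, d_h = 3, 4
p = {name: rng.normal(scale=0.1, size=shape)
     for name, shape in [("W_xz", (d_in, d_h)), ("W_hz", (d_h, d_h)),
                         ("W_xr", (d_in, d_h)), ("W_hr", (d_h, d_h)),
                         ("W_xh", (d_in, d_h)), ("W_hh", (d_h, d_h))]}

h = np.zeros(d_h)
for x_t in rng.normal(size=(6, d_in)):
    h = gru_step(x_t, h, p)
```

The key design point is the convex combination in the return line: the gate interpolates between keeping the old state and writing a new one, rather than always overwriting it as a vanilla RNN does.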

Historical significance: RNNs powered early neural machine translation, speech recognition, and text generation systems before being largely replaced by transformers.

Today RNNs are mostly a legacy architecture in NLP, but they are still used in some low-resource or streaming applications, where their constant per-step memory and compute cost is an advantage over attention's growing context.
