This document provides an overview of recurrent neural network (RNN) models, including long short-term memory (LSTM) networks and sequence-to-sequence (seq2seq) models. RNNs maintain information about previous computations through feedback connections, making them well suited for sequence-processing tasks. LSTMs address the vanishing gradient problem of standard RNNs through a gated cell state. Seq2seq models consist of an encoder RNN that compresses the input sequence into a fixed-length vector and a decoder RNN that generates the output sequence from that vector. The document includes a TensorFlow code example of an RNN trained to predict the next character in a sequence.
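Since the document's actual listing is not reproduced in this overview, the following is a minimal sketch of such a character-level next-character predictor using the TensorFlow Keras API. The toy corpus, vocabulary size, sequence length, and layer sizes are illustrative assumptions, not values taken from the document's example.

```python
# Minimal character-level RNN sketch in TensorFlow/Keras.
# Corpus and hyperparameters (seq_len, embedding/LSTM sizes) are illustrative
# assumptions, not values from the document's own example.
import numpy as np
import tensorflow as tf

text = "hello world, hello recurrent networks"  # toy corpus
chars = sorted(set(text))
vocab_size = len(chars)
char_to_id = {c: i for i, c in enumerate(chars)}

seq_len = 10
# Build (input, target) pairs: each target is the input shifted by one character.
ids = np.array([char_to_id[c] for c in text])
inputs = np.stack([ids[i:i + seq_len] for i in range(len(ids) - seq_len)])
targets = np.stack([ids[i + 1:i + seq_len + 1] for i in range(len(ids) - seq_len)])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 32),
    tf.keras.layers.LSTM(64, return_sequences=True),  # gated cell state carries long-range context
    tf.keras.layers.Dense(vocab_size),                 # logits over the character vocabulary
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(inputs, targets, epochs=5, verbose=0)

# Predict the next character after a seed sequence.
seed = inputs[:1]
logits = model(seed)
next_id = int(tf.argmax(logits[0, -1]).numpy())
print("next char:", chars[next_id])
```

The LSTM layer stands in for a plain RNN cell here; swapping in `tf.keras.layers.SimpleRNN` would match a vanilla RNN but would be more prone to the vanishing gradient problem noted above.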