A Recurrent Neural Network (RNN) with LSTM cells for text generation, trained on a dataset of Donald Trump tweets with contextual labels; it can generate realistic tweets from random noise.
Supports Bidirectional RNNs, as well as techniques such as attention weighting and skip embedding.
CuDNN-backed implementation for training the RNNs on an NVIDIA GPU.
Figure 1: The recurrent neural network takes a sequence of words as input and outputs a matrix of probabilities, one per dictionary word, of being the next word in the given sequence.
The model also learns how similar words or characters are to one another, calculates a probability for each, and uses those probabilities to predict or generate the next word or character in the sequence.
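As a minimal NumPy sketch of the prediction step described above (assuming a trained network has already produced a row of output-layer logits; the 5-word vocabulary and logit values here are purely illustrative):

```python
import numpy as np

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical tiny dictionary and logits from the network's final layer.
vocab = ["make", "america", "great", "again", "the"]
logits = np.array([0.2, 1.5, 3.0, 0.1, -0.5])

probs = softmax(logits)                   # one row of the probability matrix
next_word = vocab[int(np.argmax(probs))]  # greedy choice of the next word
```

In practice the generator would sample from `probs` (often with a temperature parameter) rather than always taking the argmax, which yields more varied text.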
Figure 2: Representative diagram of the Bidirectional LSTM network architecture.
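The bidirectional idea can be sketched in a few lines of NumPy: run one recurrent pass left-to-right and another right-to-left, then concatenate the hidden states at each timestep. A plain tanh RNN cell stands in for the LSTM here to keep the toy self-contained; all shapes and weights are made up for illustration.

```python
import numpy as np

def rnn_pass(xs, W_x, W_h):
    # Plain tanh RNN cell standing in for an LSTM to keep the sketch short.
    h = np.zeros(W_h.shape[0])
    states = []
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h)
        states.append(h)
    return np.stack(states)

rng = np.random.default_rng(0)
T, d_in, d_h = 4, 3, 5                    # toy sequence length and dimensions
xs = rng.normal(size=(T, d_in))
W_x = rng.normal(size=(d_h, d_in))
W_h = rng.normal(size=(d_h, d_h))

fwd = rnn_pass(xs, W_x, W_h)              # left-to-right hidden states
bwd = rnn_pass(xs[::-1], W_x, W_h)[::-1]  # right-to-left, realigned in time
bi = np.concatenate([fwd, bwd], axis=1)   # (T, 2*d_h) bidirectional output
```

Each timestep's output thus sees context from both directions, which is what makes bidirectional layers useful when the whole input sequence is available at once.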