This is a group of related models used to produce word embeddings. Word embeddings are dense vector representations of words in which semantically related words receive nearby vectors; compared with older bag-of-words encoding schemes, they substantially improve model performance on many natural-language tasks and give models a more useful representation of textual data.
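The contrast can be illustrated with a minimal sketch. The vocabulary, the two-dimensional embedding values, and the `cosine` helper below are all hypothetical toy choices (trained models typically use vectors with hundreds of dimensions): one-hot vectors, as used in bag-of-words encodings, make every pair of distinct words orthogonal, while dense embeddings can place related words close together.

```python
from math import sqrt

# Toy vocabulary with one-hot (bag-of-words style) vectors:
# every distinct pair of words is orthogonal, so no similarity is captured.
one_hot = {
    "king":  [1, 0, 0],
    "queen": [0, 1, 0],
    "apple": [0, 0, 1],
}

# Hypothetical dense embeddings (hand-picked toy values, not from a trained model).
embedding = {
    "king":  [0.8, 0.6],
    "queen": [0.7, 0.7],
    "apple": [-0.5, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# One-hot vectors: "king" and "queen" look completely unrelated.
print(cosine(one_hot["king"], one_hot["queen"]))      # 0.0
# Dense embeddings: the same pair comes out as highly similar,
# while an unrelated word like "apple" does not.
print(cosine(embedding["king"], embedding["queen"]))
print(cosine(embedding["king"], embedding["apple"]))
```

In a real embedding model these coordinates are learned from text corpora rather than assigned by hand, but the similarity computation works the same way.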