BERT
BERT (Bidirectional Encoder Representations from Transformers) is a deep-learning-based language model developed and open-sourced by Google Research. It is a bidirectional, unsupervised language representation model, pre-trained on a plain-text corpus (Wikipedia). Its multilingual variant produces high-quality word embeddings for 102 languages, which we use as part of our algorithms.
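As an illustration, contextual embeddings like those described above can be extracted with the Hugging Face `transformers` library. This is a minimal sketch: the model checkpoint (`bert-base-multilingual-cased`) and the use of the last hidden layer are illustrative assumptions, not details from this document.

```python
# Sketch: extract contextual word embeddings from multilingual BERT.
# Assumes the `transformers` and `torch` packages are installed; the model
# name below is an illustrative choice, not necessarily the one used here.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")

sentence = "BERT makes multilingual word embeddings."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional contextual vector per (sub)word token.
embeddings = outputs.last_hidden_state.squeeze(0)
print(embeddings.shape)
```

In practice, the per-token vectors are often pooled (e.g. averaged, or the `[CLS]` vector taken) to get a single sentence-level embedding.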