Paper on phrase2vec
We don’t have a publication on the technical details of our phrase2vec sequence-embedding algorithm. In short, it is like word2vec, but for multi-word questions and commands: each word in the sentence is mapped to a vector with GloVe, and a bidirectional LSTM is then applied to the resulting sequence. In this way, the whole word sequence is mapped to a single 300-dimensional vector.
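Since the details are unpublished, the following is only a minimal PyTorch sketch of the pipeline as described above; the class name, the 150-unit hidden size per direction, and the choice to concatenate the two final directional states into the 300-dimensional output are all assumptions for illustration, not the actual implementation.

```python
import torch
import torch.nn as nn

class Phrase2Vec(nn.Module):
    """Hypothetical sketch: GloVe word vectors fed to a BiLSTM.

    Assumes 300-d GloVe inputs and a 150-d hidden state per
    direction, so the concatenated final states are 300-d.
    """

    def __init__(self, glove_dim=300, hidden_dim=150):
        super().__init__()
        self.lstm = nn.LSTM(
            input_size=glove_dim,
            hidden_size=hidden_dim,
            bidirectional=True,
            batch_first=True,
        )

    def forward(self, glove_vectors):
        # glove_vectors: (batch, seq_len, 300), one GloVe vector per word
        _, (h_n, _) = self.lstm(glove_vectors)
        # h_n: (2, batch, hidden_dim) -- final state of each direction
        # Concatenate forward and backward states into one 300-d vector
        return torch.cat([h_n[0], h_n[1]], dim=-1)

# Usage: encode a 5-word sentence (random tensors stand in for
# real GloVe lookups here)
encoder = Phrase2Vec()
sentence = torch.randn(1, 5, 300)
vector = encoder(sentence)
print(vector.shape)  # torch.Size([1, 300])
```

How the BiLSTM's outputs are pooled into the fixed-size vector (final states, mean pooling, or something else) is not specified in the description; concatenating the final hidden states is just one common choice.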