What is the difference between word2vec and glove?
Both are used to train word embeddings, and both provide the same core output: one vector per word, with the vectors in a useful arrangement - relative distances and directions that roughly correspond to our ideas of overall word relatedness, and even to relatedness along certain salient semantic dimensions.
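To make "relatedness as distance/direction" concrete, here is a toy sketch using cosine similarity, the usual relatedness measure for word vectors. The vectors below are hand-written 3-d illustrations (real embeddings are typically 100-300 dimensions and learned from a corpus, never hand-written):

```python
import math

# Hypothetical toy vectors for illustration only.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "apple": [-0.7, 0.1, 0.1],
}

def cosine(u, v):
    """Cosine similarity: 1.0 = same direction, 0 = unrelated, < 0 = opposed."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(vectors["king"], vectors["queen"]))  # relatively high
print(cosine(vectors["king"], vectors["apple"]))  # relatively low
```

With either word2vec or GloVe vectors, the same function is what powers "most similar word" queries.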
Word2Vec does incremental, 'sparse' training of a shallow neural network, repeatedly iterating over a training corpus: each training example predicts a word from its neighbors (or vice versa) and updates only the few vectors involved in that example.
GloVe instead first builds a giant word co-occurrence matrix from the corpus, then fits vectors to model that matrix.
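The first stage, counting co-occurrences, can be sketched in a few lines of plain Python (toy corpus and window size are illustrative choices). GloVe then fits word and context vectors so that their dot products, plus per-word biases, approximate the logarithms of these counts, via weighted least squares:

```python
from collections import Counter

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
]
window = 2  # count context words up to 2 positions away

cooc = Counter()  # (word, context_word) -> co-occurrence count
for sentence in corpus:
    for i, word in enumerate(sentence):
        lo = max(0, i - window)
        hi = min(len(sentence), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                cooc[(word, sentence[j])] += 1

print(cooc[("sat", "the")])  # "sat" sees "the" twice in each sentence
```

Note the contrast: word2vec streams over (word, context) pairs one at a time, while GloVe collapses the whole corpus into these global counts before any vector fitting happens.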
When trained on the same corpus, producing word vectors of the same dimensionality, and given the same attention to meta-optimizations, the quality of their resulting word vectors will be roughly similar.
Word2Vec works on the following process:

1. Initialize a random vector for every word in the vocabulary.
2. Slide a context window over the corpus; for each position, form (center word, context word) training pairs.
3. Use a shallow network to predict context words from the center word (skip-gram) or the center word from its context (CBOW), and nudge the vectors by gradient descent - typically with negative sampling or hierarchical softmax to keep each update cheap.
4. Repeat over the corpus for several epochs; then discard the network and keep the vectors.
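A minimal pure-Python sketch of that loop, in the skip-gram-with-negative-sampling flavor. The corpus and hyperparameters are toy assumptions, and negatives are drawn uniformly for simplicity (real word2vec samples negatives from a smoothed unigram distribution, subsamples frequent words, and decays the learning rate):

```python
import math
import random

random.seed(0)

corpus = "the cat sat on the mat the dog sat on the log".split()
vocab = sorted(set(corpus))
dim, window, lr, epochs, neg = 10, 2, 0.05, 50, 3  # toy hyperparameters

# Two tables, as in word2vec: input (word) vectors and output (context) vectors.
W = {w: [random.uniform(-0.5, 0.5) for _ in range(dim)] for w in vocab}
C = {w: [random.uniform(-0.5, 0.5) for _ in range(dim)] for w in vocab}

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

for _ in range(epochs):
    for i, center in enumerate(corpus):
        lo, hi = max(0, i - window), min(len(corpus), i + window + 1)
        for j in range(lo, hi):
            if j == i:
                continue
            # One observed (positive) context word plus `neg` random negatives.
            pairs = [(corpus[j], 1.0)]
            pairs += [(random.choice(vocab), 0.0) for _ in range(neg)]
            for ctx, label in pairs:
                w, c = W[center], C[ctx]
                score = sigmoid(sum(a * b for a, b in zip(w, c)))
                g = lr * (label - score)  # gradient of the logistic loss
                for k in range(dim):
                    # 'Sparse' update: only this word/context pair is touched.
                    w[k], c[k] = w[k] + g * c[k], c[k] + g * w[k]
```

After training, `C` is discarded and `W` holds the word vectors. In practice you would use an optimized library rather than this loop; the sketch only shows why each step is cheap and why training can stream over the corpus incrementally.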