What are word embeddings in NLP?
Word embeddings are numerical representations of text that let a machine work with what a human means to convey: each word is mapped to a vector in an n-dimensional space. A word-embedding algorithm converts plain text into numbers because most machine learning and deep learning algorithms cannot operate on raw text directly. Word embeddings fall into two broad types (a short sketch of each follows the list):
- Frequency-based embedding
- Prediction-based embedding
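As a rough illustration of the two families, the sketch below builds a frequency-based representation with scikit-learn's CountVectorizer and a prediction-based one with gensim's Word2Vec. The toy corpus and every parameter value here are illustrative assumptions, not canonical settings.

```python
# A minimal sketch of both embedding families on a toy corpus (an assumption,
# not a standard dataset). Requires scikit-learn and gensim.
from sklearn.feature_extraction.text import CountVectorizer
from gensim.models import Word2Vec

corpus = [
    "word embeddings map text to vectors",
    "machines understand numbers not text",
]

# Frequency-based: a count vector gives each document one dimension
# per vocabulary word, filled with raw term counts.
vectorizer = CountVectorizer()
count_matrix = vectorizer.fit_transform(corpus)
print(vectorizer.get_feature_names_out())  # vocabulary learned from the corpus
print(count_matrix.toarray())              # term counts per document

# Prediction-based: Word2Vec learns dense vectors by training a model
# to predict a word from its neighbours (or vice versa).
tokenized = [doc.split() for doc in corpus]
model = Word2Vec(tokenized, vector_size=50, window=2, min_count=1, epochs=20)
print(model.wv["text"][:5])                   # first 5 dimensions for "text"
print(model.wv.most_similar("text", topn=2))  # nearest words in embedding space
```

Note the contrast the output makes visible: the count matrix is sparse and as wide as the vocabulary, while the Word2Vec vectors are dense and of a fixed size you choose (50 dimensions in this sketch).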