What are word embeddings in NLP?

Asked by TristaBrigman in Data Science, Dec 23, 2019
Answered by Trista Brigman

Word embeddings are representations of text as vectors in an n-dimensional space, used so that a machine can capture what a human means by a piece of text. Word embedding algorithms convert plain text into numbers, because most machine learning and deep learning algorithms cannot work with raw text directly. Word embeddings fall into two broad types:

  1. Frequency-based embeddings (e.g., count vectors, TF-IDF)
  2. Prediction-based embeddings (e.g., Word2Vec's CBOW and skip-gram models)
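As a minimal sketch of the first type, the snippet below builds frequency-based (bag-of-words) vectors over a tiny illustrative corpus, so each document becomes an n-dimensional vector of word counts; the corpus, vocabulary, and `count_vector` helper are assumptions for illustration, not part of any library. Prediction-based embeddings such as Word2Vec are instead learned by a neural model and are not shown here.

```python
from collections import Counter

# Tiny illustrative corpus (an assumption for this sketch).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]

# Build a shared vocabulary from the whole corpus, sorted for a stable index order.
vocab = sorted({word for doc in corpus for word in doc.split()})

def count_vector(doc):
    """Map a document to an n-dimensional vector of raw word frequencies."""
    counts = Counter(doc.split())
    return [counts[word] for word in vocab]

# Each document is now a list of numbers, one slot per vocabulary word,
# which downstream machine learning algorithms can consume directly.
vectors = [count_vector(doc) for doc in corpus]
```

In practice, libraries such as scikit-learn (`CountVectorizer`, `TfidfVectorizer`) implement the frequency-based variants, while gensim's `Word2Vec` covers the prediction-based ones.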



