What is an embedding in machine learning?
I just came across the term "embedding" in a deep learning paper. The context is "multimodal embedding".
My guess: to embed something is to extract some of its features to form a vector.
I couldn't find an explicit definition of the term, and that keeps me from fully understanding the author's idea and the mechanism of the model.
I checked a dictionary and searched online, but the explanations focus on the everyday meaning of the word rather than its meaning as a machine learning term.
This raises a more general question I run into frequently: when you find a machine learning term you don't understand well, where do you go for an answer? Is there a specific way to google it, a machine learning group to join, or should you ask a question on Stack Exchange?
In machine learning and NLP, an embedding is a technique that maps words to vectors so that you can analyze and relate them more effectively. For example, "toyota" and "honda" are hard to relate as raw words, but in the vector space they can be placed very close together under some similarity measure. The geometry of the space can also capture relationships between words, as in the classic analogy king - man + woman ≈ queen. As a toy example, we could place "boy" at (1, 0) and "girl" at (-1, 0) to show that they lie along the same dimension with opposite meanings, and all noun pairs that differ only in gender could then be roughly parallel in that space.

My initial guess that embedding means extracting features from something was close, but not specific enough.

As for my last point, I still haven't found a good way to quickly get at the essential meaning of jargon from a specialized area. A website that explains the jargon of a given field would save a lot of time.
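Here is a minimal sketch in Python/NumPy of the idea above. The 2-dimensional vectors and their values are made up by hand purely for illustration; real embeddings are learned from data and typically have hundreds of dimensions.

```python
import numpy as np

# Hypothetical 2-dimensional embeddings, hand-picked for illustration only.
vectors = {
    "king":   np.array([0.9,  0.8]),
    "queen":  np.array([0.9, -0.8]),
    "man":    np.array([0.1,  0.8]),
    "woman":  np.array([0.1, -0.8]),
    "toyota": np.array([-0.7, 0.1]),
    "honda":  np.array([-0.7, 0.2]),
}

def cosine(a, b):
    """Cosine similarity: close to 1.0 means the vectors point the same way."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words end up close together in the vector space.
print(cosine(vectors["toyota"], vectors["honda"]))   # ~0.99

# The analogy king - man + woman lands nearest to queen.
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max(vectors, key=lambda w: cosine(vectors[w], target))
print(best)  # queen
```

The same cosine-similarity and vector-arithmetic pattern works with real pretrained embeddings (e.g. word2vec or GloVe); only the lookup table changes.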