Hacker News

Does anyone know the provenance of the term: when did vectors start being called embeddings?


In an NLP context, the earliest use I could find was ICML 2008:

http://machinelearning.org/archive/icml2008/papers/391.pdf

I'm sure there are earlier instances, though - the strict mathematical definition of an embedding is surely much older.

(Interestingly, the word2vec papers don't use the term either, so I guess it didn't enter "common" usage until the mid-to-late 2010s.)


I think it was due to GloVe embeddings back then: I don't recall them ever being called GloVe vectors, although the "Ve" does stand for "vectors", so "GloVe vectors" would have been RAS syndrome.


>> https://nlp.stanford.edu/projects/glove/

A quick scan of the project website yields zero uses of "embedding" and 23 of "vector".
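A quick scan like that is easy to reproduce. A minimal sketch, assuming the page text has been saved locally (e.g. via `curl https://nlp.stanford.edu/projects/glove/`); the `sample` string here just stands in for the real page text:

```python
import re

def count_term(text: str, term: str) -> int:
    # Case-insensitive whole-word match; the optional trailing "s"
    # counts plurals like "vectors" under "vector".
    return len(re.findall(rf"\b{re.escape(term)}s?\b", text, flags=re.IGNORECASE))

# Stand-in for the page text; fetch the real page to reproduce the counts above.
sample = ("GloVe is an unsupervised learning algorithm for obtaining "
          "vector representations for words.")

print(count_term(sample, "vector"))     # -> 1
print(count_term(sample, "embedding"))  # -> 0
```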


That's how I remember it from when I was working with them back in the day (as "word embeddings"): I could be wrong.



