Embeddings
Embeddings are dense, lower-dimensional vector representations of words, sentences, or other linguistic units in a continuous space, where geometric closeness reflects semantic and contextual similarity. They are typically learned from large corpora, often with self-supervised objectives such as predicting neighboring words, and are used throughout natural language processing for tasks like search, classification, and clustering.
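A minimal sketch of the idea, using made-up toy vectors rather than a trained model: semantically related words get vectors that point in similar directions, which can be measured with cosine similarity.

```python
import math

# Toy 4-dimensional embeddings with illustrative, hand-picked values.
# Real embeddings are learned from data and have hundreds of dimensions.
embeddings = {
    "king":  [0.80, 0.65, 0.10, 0.05],
    "queen": [0.78, 0.70, 0.12, 0.04],
    "apple": [0.05, 0.10, 0.90, 0.70],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words score high; unrelated words score low.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```

In a real system the vectors would come from a trained model (e.g. word2vec or a transformer encoder), but the similarity computation works the same way.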