Sébastien De Greef committed on
Commit
0d660dc
1 Parent(s): 91d4350

feat: Add additional resources on embeddings in LLMs

Files changed (1)
  1. src/llms/embeddings.qmd +4 -1
src/llms/embeddings.qmd CHANGED
@@ -4,7 +4,10 @@ title: Embeddings
 
 Embeddings in Large Language Models (LLMs) are a foundational component in the field of natural language processing (NLP). These embeddings transform words, phrases, or even longer texts into a vector space, capturing the semantic meaning that enables LLMs to perform a variety of language-based tasks with remarkable proficiency. This article focuses on the role of embeddings in LLMs, how they are generated, and their impact on the performance of these models.
 
-[HuggingFace Embeddings Leaderboard](https://huggingface.co/spaces/mteb/leaderboard)
+* [Andrew NG: amazing series about embeddings](https://www.youtube.com/playlist?list=PLhWB2ZsrULv-wEM8JDKA1zk8_2Lc88I-s)
+* [Linus Lee: The Hidden Life of Embeddings](https://www.youtube.com/watch?v=YvobVu1l7GI)
+* [OpenAI Embeddings and Vector Databases Crash Course](https://www.youtube.com/watch?v=ySus5ZS0b94)
+* [HuggingFace Embedding Models Leaderboard](https://huggingface.co/spaces/mteb/leaderboard)
 
 ## What are Embeddings in LLMs?
 In the context of LLMs, embeddings are dense vector representations of text. Each vector aims to encapsulate aspects of linguistic meaning such as syntax, semantics, and context. Unlike simpler models that might use one-hot encoding, LLM embeddings map words or tokens to vectors in a way that reflects their semantic and contextual relationships.
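The contrast the changed section draws between one-hot encoding and dense embeddings can be sketched with a toy example. The vocabulary and the hand-picked embedding values below are illustrative assumptions, not from the article; real LLM embeddings are learned and have hundreds or thousands of dimensions.

```python
import numpy as np

vocab = ["king", "queen", "man", "woman"]

# One-hot encoding: each token is a sparse vector with a single 1.
# Every pair of distinct words is orthogonal, so no similarity is captured.
one_hot = np.eye(len(vocab))

# Dense embeddings (toy, hand-picked 3-d values for illustration):
# semantically related words get nearby vectors.
emb = {
    "king":  np.array([0.9, 0.80, 0.1]),
    "queen": np.array([0.9, 0.75, 0.9]),
    "man":   np.array([0.1, 0.80, 0.1]),
    "woman": np.array([0.1, 0.75, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# One-hot vectors of distinct words are always orthogonal.
print(cosine(one_hot[0], one_hot[1]))  # 0.0

# Dense vectors expose graded similarity: "king" is closer to
# "queen" than to "woman" in this toy space.
print(cosine(emb["king"], emb["queen"]))
print(cosine(emb["king"], emb["woman"]))
```

This is what "reflects their semantic and contextual relationships" means in practice: distances and angles between dense vectors carry meaning, whereas one-hot vectors are all equidistant.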