Introduction - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/1?fw=pt
## [](#introduction)Introduction

[![Ask a Question](https://img.shields.io/badge/Ask%20a%20question-ffcb4c.svg)](https://discuss.huggingface.co/t/chapter-1-questions)

## [](#welcome-to-the-course)Welcome to the 🤗 Course!

This course will teach you about natural language processing (NLP) using libraries from the [Hugging Face](https://huggingface.co/) ecosystem — [🤗 Transformers](https://github.com/huggingface/transformers), [🤗 Datasets](https://github.com/huggingface/datasets), [🤗 Tokenizers](https://github.com/huggingface/tokenizers), and [🤗 Accelerate](https://github.com/huggingface/accelerate) — as well as the [Hugging Face Hub](https://huggingface.co/models). It’s completely free and without ads.

## [](#what-to-expect)What to expect?

Here is a brief overview of the course:

![Brief overview of the chapters of the course.](https://huggingface.co/datasets/huggingface-course/documentation-images/resolve/main/en/chapter1/summary.svg)

- Chapters 1 to 4 provide an introduction to the main concepts of the 🤗 Transformers library. By the end of this part of the course, you will be familiar with how Transformer models work and will know how to use a model from the [Hugging Face Hub](https://huggingface.co/models), fine-tune it on a dataset, and share your results on the Hub!
- Chapters 5 to 8 teach the basics of 🤗 Datasets and 🤗 Tokenizers before diving into classic NLP tasks. By the end of this part, you will be able to tackle the most common NLP problems by yourself.
- Chapters 9 to 12 go beyond NLP and explore how Transformer models can be used to tackle tasks in speech processing and computer vision. Along the way, you’ll learn how to build and share demos of your models, and optimize them for production environments. By the end of this part, you will be ready to apply 🤗 Transformers to (almost) any machine learning problem!
This course:

- Requires a good knowledge of Python
- Is better taken after an introductory deep learning course, such as [fast.ai’s](https://www.fast.ai/) [Practical Deep Learning for Coders](https://course.fast.ai/) or one of the programs developed by [DeepLearning.AI](https://www.deeplearning.ai/)
- Does not expect prior [PyTorch](https://pytorch.org/) or [TensorFlow](https://www.tensorflow.org/) knowledge, though some familiarity with either of them will help

After you’ve completed this course, we recommend checking out DeepLearning.AI’s [Natural Language Processing Specialization](https://www.coursera.org/specializations/natural-language-processing?utm_source=deeplearning-ai&utm_medium=institutions&utm_campaign=20211011-nlp-2-hugging_face-page-nlp-refresh), which covers a wide range of traditional NLP models like naive Bayes and LSTMs that are well worth knowing about!

## [](#who-are-we)Who are we?

About the authors:

[**Abubakar Abid**](https://huggingface.co/abidlabs) completed his PhD at Stanford in applied machine learning. During his PhD, he founded [Gradio](https://github.com/gradio-app/gradio), an open-source Python library that has been used to build over 600,000 machine learning demos. Gradio was acquired by Hugging Face, where Abubakar now serves as a machine learning team lead.

[**Matthew Carrigan**](https://huggingface.co/Rocketknight1) is a Machine Learning Engineer at Hugging Face. He lives in Dublin, Ireland, and previously worked as an ML engineer at Parse.ly and before that as a post-doctoral researcher at Trinity College Dublin. He does not believe we’re going to get to AGI by scaling existing architectures, but has high hopes for robot immortality regardless.

[**Lysandre Debut**](https://huggingface.co/lysandre) is a Machine Learning Engineer at Hugging Face and has been working on the 🤗 Transformers library since the very early development stages. His aim is to make NLP accessible to everyone by developing tools with a very simple API.

[**Sylvain Gugger**](https://huggingface.co/sgugger) is a Research Engineer at Hugging Face and one of the core maintainers of the 🤗 Transformers library. Previously he was a Research Scientist at fast.ai, and he co-wrote _[Deep Learning for Coders with fastai and PyTorch](https://learning.oreilly.com/library/view/deep-learning-for/9781492045519/)_ with Jeremy Howard. The main focus of his research is on making deep learning more accessible, by designing and improving techniques that allow models to train fast on limited resources.

[**Dawood Khan**](https://huggingface.co/dawoodkhan82) is a Machine Learning Engineer at Hugging Face. He’s from NYC and graduated from New York University studying Computer Science. After working as an iOS Engineer for a few years, Dawood quit to start Gradio with his fellow co-founders. Gradio was eventually acquired by Hugging Face.

[**Merve Noyan**](https://huggingface.co/merve) is a developer advocate at Hugging Face, working on developing tools and building content around them to democratize machine learning for everyone.

[**Lucile Saulnier**](https://huggingface.co/SaulLu) is a machine learning engineer at Hugging Face, developing and supporting the use of open-source tools. She is also actively involved in many research projects in the field of Natural Language Processing, such as collaborative training and BigScience.
[**Lewis Tunstall**](https://huggingface.co/lewtun) is a machine learning engineer at Hugging Face, focused on developing open-source tools and making them accessible to the wider community. He is also a co-author of the O’Reilly book [Natural Language Processing with Transformers](https://www.oreilly.com/library/view/natural-language-processing/9781098136789/).

[**Leandro von Werra**](https://huggingface.co/lvwerra) is a machine learning engineer in the open-source team at Hugging Face and also a co-author of the O’Reilly book [Natural Language Processing with Transformers](https://www.oreilly.com/library/view/natural-language-processing/9781098136789/). He has several years of industry experience bringing NLP projects to production by working across the whole machine learning stack.

## [](#faq)FAQ

Here are some answers to frequently asked questions:

- **Does taking this course lead to a certification?** Currently we do not have any certification for this course. However, we are working on a certification program for the Hugging Face ecosystem — stay tuned!
- **How much time should I spend on this course?** Each chapter in this course is designed to be completed in 1 week, with approximately 6-8 hours of work per week. However, you can take as much time as you need to complete the course.
- **Where can I ask a question if I have one?** If you have a question about any section of the course, just click on the “_Ask a question_” banner at the top of the page to be automatically redirected to the right section of the [Hugging Face forums](https://discuss.huggingface.co/): ![Link to the Hugging Face forums](https://huggingface.co/datasets/huggingface-course/documentation-images/resolve/main/en/chapter1/forum-button.png) Note that a list of [project ideas](https://discuss.huggingface.co/c/course/course-event/25) is also available on the forums if you wish to practice more once you have completed the course.
- **Where can I get the code for the course?** For each section, click on the banner at the top of the page to run the code in either Google Colab or Amazon SageMaker Studio Lab: ![Link to the Hugging Face course notebooks](https://huggingface.co/datasets/huggingface-course/documentation-images/resolve/main/en/chapter1/notebook-buttons.png) The Jupyter notebooks containing all the code from the course are hosted on the [`huggingface/notebooks`](https://github.com/huggingface/notebooks) repo. If you wish to generate them locally, check out the instructions in the [`course`](https://github.com/huggingface/course#-jupyter-notebooks) repo on GitHub.
- **How can I contribute to the course?** There are many ways to contribute to the course! If you find a typo or a bug, please open an issue on the [`course`](https://github.com/huggingface/course) repo. If you would like to help translate the course into your native language, check out the instructions [here](https://github.com/huggingface/course#translating-the-course-into-your-language).
- **What were the choices made for each translation?** Each translation has a glossary and a `TRANSLATING.txt` file that details the choices that were made for machine learning jargon, etc. You can find an example for German [here](https://github.com/huggingface/course/blob/main/chapters/de/TRANSLATING.txt).
- **Can I reuse this course?** Of course! The course is released under the permissive [Apache 2 license](https://www.apache.org/licenses/LICENSE-2.0.html).
This means that you must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. If you would like to cite the course, please use the following BibTeX:

```
@misc{huggingfacecourse,
  author = {Hugging Face},
  title = {The Hugging Face Course, 2022},
  howpublished = "\url{https://huggingface.co/course}",
  year = {2022},
  note = "[Online; accessed <today>]"
}
```

## [](#lets-go)Let's Go

Are you ready to roll? In this chapter, you will learn:

- How to use the `pipeline()` function to solve NLP tasks such as text generation and classification (see the sketch below)
- About the Transformer architecture
- How to distinguish between encoder, decoder, and encoder-decoder architectures and use cases
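As a quick taste of what is coming, here is a minimal sketch of the `pipeline()` API from 🤗 Transformers. The input sentences and the outputs shown in the comments are illustrative assumptions: the default checkpoints that get downloaded (and therefore the exact predictions and scores) depend on your library version.

```python
# Minimal sketch of pipeline() usage (requires `pip install transformers`).
# Default models are downloaded on first use; outputs in comments are illustrative.
from transformers import pipeline

# Text classification (sentiment analysis by default)
classifier = pipeline("sentiment-analysis")
print(classifier("I've been waiting for a course like this my whole life!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Text generation
generator = pipeline("text-generation")
print(generator("In this course, we will teach you how to"))
# e.g. [{'generated_text': 'In this course, we will teach you how to ...'}]
```

Chapter 1 explains what happens behind this one-line interface, and later chapters show how to swap in your own fine-tuned models.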
This course:

- Requires a good knowledge of Python
- Is better taken after an introductory deep learning course, such as [fast.ai’s](https://www.fast.ai/) [Practical Deep Learning for Coders](https://course.fast.ai/) or one of the programs developed by [DeepLearning.AI](https://www.deeplearning.ai/)
- Does not expect prior [PyTorch](https://pytorch.org/) or [TensorFlow](https://www.tensorflow.org/) knowledge, though some familiarity with either of those will help

After you’ve completed this course, we recommend checking out DeepLearning.AI’s [Natural Language Processing Specialization](https://www.coursera.org/specializations/natural-language-processing?utm_source=deeplearning-ai&utm_medium=institutions&utm_campaign=20211011-nlp-2-hugging_face-page-nlp-refresh), which covers a wide range of traditional NLP models like naive Bayes and LSTMs that are well worth knowing about!

## [](#who-are-we)Who are we?

About the authors:

[**Abubakar Abid**](https://huggingface.co/abidlabs) completed his PhD at Stanford in applied machine learning. During his PhD, he founded [Gradio](https://github.com/gradio-app/gradio), an open-source Python library that has been used to build over 600,000 machine learning demos. Gradio was acquired by Hugging Face, which is where Abubakar now serves as a machine learning team lead.

[**Matthew Carrigan**](https://huggingface.co/Rocketknight1) is a Machine Learning Engineer at Hugging Face. He lives in Dublin, Ireland and previously worked as an ML engineer at Parse.ly and before that as a post-doctoral researcher at Trinity College Dublin. He does not believe we’re going to get to AGI by scaling existing architectures, but has high hopes for robot immortality regardless.

[**Lysandre Debut**](https://huggingface.co/lysandre) is a Machine Learning Engineer at Hugging Face and has been working on the 🤗 Transformers library since the very early development stages. His aim is to make NLP accessible for everyone by developing tools with a very simple API.

[**Sylvain Gugger**](https://huggingface.co/sgugger) is a Research Engineer at Hugging Face and one of the core maintainers of the 🤗 Transformers library. Previously he was a Research Scientist at fast.ai, and he co-wrote [*Deep Learning for Coders with fastai and PyTorch*](https://learning.oreilly.com/library/view/deep-learning-for/9781492045519/) with Jeremy Howard. The main focus of his research is on making deep learning more accessible, by designing and improving techniques that allow models to train fast on limited resources.

[**Dawood Khan**](https://huggingface.co/dawoodkhan82) is a Machine Learning Engineer at Hugging Face. He’s from NYC and graduated from New York University studying Computer Science. After working as an iOS Engineer for a few years, Dawood quit to start Gradio with his fellow co-founders. Gradio was eventually acquired by Hugging Face.

[**Merve Noyan**](https://huggingface.co/merve) is a developer advocate at Hugging Face, working on developing tools and building content around them to democratize machine learning for everyone.

[**Lucile Saulnier**](https://huggingface.co/SaulLu) is a machine learning engineer at Hugging Face, developing and supporting the use of open source tools. She is also actively involved in many research projects in the field of Natural Language Processing such as collaborative training and BigScience.

[**Lewis Tunstall**](https://huggingface.co/lewtun) is a machine learning engineer at Hugging Face, focused on developing open-source tools and making them accessible to the wider community. He is also a co-author of the O’Reilly book [Natural Language Processing with Transformers](https://www.oreilly.com/library/view/natural-language-processing/9781098136789/).

[**Leandro von Werra**](https://huggingface.co/lvwerra) is a machine learning engineer in the open-source team at Hugging Face and also a co-author of the O’Reilly book [Natural Language Processing with Transformers](https://www.oreilly.com/library/view/natural-language-processing/9781098136789/). He has several years of industry experience bringing NLP projects to production by working across the whole machine learning stack.

## [](#faq)FAQ

Here are some answers to frequently asked questions:

- **Does taking this course lead to a certification?** Currently we do not have any certification for this course. However, we are working on a certification program for the Hugging Face ecosystem — stay tuned!

- **How much time should I spend on this course?** Each chapter in this course is designed to be completed in 1 week, with approximately 6-8 hours of work per week. However, you can take as much time as you need to complete the course.

- **Where can I ask a question if I have one?** If you have a question about any section of the course, just click on the "*Ask a question*" banner at the top of the page to be automatically redirected to the right section of the [Hugging Face forums](https://discuss.huggingface.co/):

![Link to the Hugging Face forums](https://huggingface.co/datasets/huggingface-course/documentation-images/resolve/main/en/chapter1/forum-button.png)

Note that a list of [project ideas](https://discuss.huggingface.co/c/course/course-event/25) is also available on the forums if you wish to practice more once you have completed the course.

- **Where can I get the code for the course?** For each section, click on the banner at the top of the page to run the code in either Google Colab or Amazon SageMaker Studio Lab:

![Link to the Hugging Face course notebooks](https://huggingface.co/datasets/huggingface-course/documentation-images/resolve/main/en/chapter1/notebook-buttons.png)

The Jupyter notebooks containing all the code from the course are hosted on the [`huggingface/notebooks`](https://github.com/huggingface/notebooks) repo. If you wish to generate them locally, check out the instructions in the [`course`](https://github.com/huggingface/course#-jupyter-notebooks) repo on GitHub.

- **How can I contribute to the course?** There are many ways to contribute to the course! If you find a typo or a bug, please open an issue on the [`course`](https://github.com/huggingface/course) repo. If you would like to help translate the course into your native language, check out the instructions [here](https://github.com/huggingface/course#translating-the-course-into-your-language).

- **What were the choices made for each translation?** Each translation has a glossary and `TRANSLATING.txt` file that details the choices that were made for machine learning jargon etc. You can find an example for German [here](https://github.com/huggingface/course/blob/main/chapters/de/TRANSLATING.txt).

- **Can I reuse this course?** Of course! The course is released under the permissive [Apache 2 license](https://www.apache.org/licenses/LICENSE-2.0.html). This means that you must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use. If you would like to cite the course, please use the following BibTeX:

```
@misc{huggingfacecourse,
  author = {Hugging Face},
  title = {The Hugging Face Course, 2022},
  howpublished = "\url{https://huggingface.co/course}",
  year = {2022},
  note = "[Online; accessed <today>]"
}
```

## [](#lets-go)Let's Go

Are you ready to roll? In this chapter, you will learn:

- How to use the `pipeline()` function to solve NLP tasks such as text generation and classification (a minimal sketch follows this list)
- About the Transformer architecture
- How to distinguish between encoder, decoder, and encoder-decoder architectures and use cases
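To make that first bullet concrete, here is a minimal sketch of the `pipeline()` API from 🤗 Transformers. The task names below are standard pipeline identifiers, but the example sentences and the `max_length` setting are illustrative choices, and the default model downloaded for each task depends on the library version.

```python
from transformers import pipeline

# Build a pipeline for a task; a default pretrained model is downloaded on first use.
classifier = pipeline("sentiment-analysis")

# One call handles tokenization, model inference, and post-processing,
# returning a list of {"label": ..., "score": ...} dictionaries.
print(classifier("I've been waiting for a Hugging Face course my whole life!"))

# The same API covers other tasks, such as text generation.
generator = pipeline("text-generation")
print(generator("In this course, we will teach you how to", max_length=30))
```

Chapter 1 explores these tasks, and several others, in much more detail.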
id="nav-lets-go"><wbr>Let's <wbr>Go</a> </nav></div></div></div> <div id="doc-footer"></div></main> </div> <script> import("/front/build/kube-c0d76de/index.js"); window.moonSha = "kube-c0d76de/"; window.hubConfig = JSON.parse(`{"features":{"signupDisabled":false},"sshGitUrl":"git@hf.co","moonHttpUrl":"https://huggingface.co","captchaApiKey":"bd5f2066-93dc-4bdd-a64b-a24646ca3859","stripePublicKey":"pk_live_x2tdjFXBCvXo2FFmMybezpeM00J6gPCAAc"}`); </script> <!-- Stripe --> <script> if (["hf.co", "huggingface.co"].includes(window.location.hostname)) { const script = document.createElement("script"); script.src = "https://js.stripe.com/v3/"; script.async = true; document.head.appendChild(script); } </script> <!-- Google analytics v4 --> <script> if (["hf.co", "huggingface.co"].includes(window.location.hostname)) { const script = document.createElement("script"); script.src = "https://www.googletagmanager.com/gtag/js?id=G-8Q63TH4CSL"; script.async = true; document.head.appendChild(script); window.dataLayer = window.dataLayer || []; function gtag() { if (window.dataLayer !== undefined) { window.dataLayer.push(arguments); } } gtag("js", new Date()); gtag("config", "G-8Q63TH4CSL", { page_path: "/learn/nlp-course/chapter1/1" }); /// ^ See https://developers.google.com/analytics/devguides/collection/gtagjs/pages gtag("consent", "default", { ad_storage: "denied", analytics_storage: "denied" }); /// ^ See https://developers.google.com/tag-platform/gtagjs/reference#consent /// TODO: ask the user for their consent and update this with gtag('consent', 'update') } </script> <!-- Google Analytics v3 (deprecated) --> <script> if (["hf.co", "huggingface.co"].includes(window.location.hostname)) { (function (i, s, o, g, r, a, m) { i["GoogleAnalyticsObject"] = r; (i[r] = i[r] || function () { (i[r].q = i[r].q || []).push(arguments); }), (i[r].l = 1 * new Date()); (a = s.createElement(o)), (m = s.getElementsByTagName(o)[0]); a.async = 1; a.src = g; m.parentNode.insertBefore(a, m); })(window, document, "script", "https://www.google-analytics.com/analytics.js", "ganalytics"); ganalytics("create", "UA-83738774-2", "auto"); ganalytics("send", "pageview", "/learn/nlp-course/chapter1/1"); } </script> <iframe name="__privateStripeMetricsController5750" frameborder="0" allowtransparency="true" scrolling="no" role="presentation" allow="payment *" src="https://js.stripe.com/v3/m-outer-93afeeb17bc37e711759584dbfc50d47.html#url=https%3A%2F%2Fhuggingface.co%2Flearn%2Fnlp-course%2Fchapter1%2F1%3Ffw%3Dpt&amp;title=Introduction%20-%20Hugging%20Face%20NLP%20Course&amp;referrer=&amp;muid=NA&amp;sid=NA&amp;version=6&amp;preview=false" aria-hidden="true" tabindex="-1" style="border: none !important; margin: 0px !important; padding: 0px !important; width: 1px !important; min-width: 100% !important; overflow: hidden !important; display: block !important; visibility: hidden !important; position: fixed !important; height: 1px !important; pointer-events: none !important; user-select: none !important;"></iframe></body></html>
2023-06-27T20:00:02.673Z
Transformers, what can they do? - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/3?fw=pt
"## [](#transformers-what-can-they-do)Transformers, what can they do?\n\n[![Ask a Question](https://(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:03.574Z
Natural Language Processing - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/2?fw=pt
"NLP Course documentation\n\nNatural Language Processing\n\n3\\. Fine-tuning a pretrained model\n\n4(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:03.635Z
Encoder models - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/5?fw=pt
"## [](#encoder-models)Encoder models\n\n[![Ask a Question](https://img.shields.io/badge/Ask%20a%20q(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:04.874Z
How do Transformers work? - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/4?fw=pt
"## [](#how-do-transformers-work)How do Transformers work?\n\n[![Ask a Question](https://img.shields(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:04.982Z
Decoder models - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/6?fw=pt
"## [](#decoder-models)Decoder models\n\n[![Ask a Question](https://img.shields.io/badge/Ask%20a%20q(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:05.751Z
Sequence-to-sequence models - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/7?fw=pt
"## [](#sequencetosequence-modelssequencetosequencemodels)Sequence-to-sequence models\\[sequence-to-(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:05.887Z
Bias and limitations - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/8?fw=pt
"3\\. Fine-tuning a pretrained model\n\n4\\. Sharing models and tokenizers\n\n5\\. The 🤗 Datasets(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:06.452Z
Summary - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/9?fw=pt
"## [](#summary)Summary\n\n[![Ask a Question](https://img.shields.io/badge/Ask%20a%20question-ffcb4c(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:06.604Z
End-of-chapter quiz - Hugging Face NLP Course
https://huggingface.co/learn/nlp-course/chapter1/10?fw=pt
"3\\. Fine-tuning a pretrained model\n\n4\\. Sharing models and tokenizers\n\n5\\. The 🤗 Datasets(...TRUNCATED)
"<!DOCTYPE html><html class=\"\"><head>\n\t\t<meta charset=\"utf-8\">\n\t\t<meta name=\"viewport\" c(...TRUNCATED)
2023-06-27T20:00:06.996Z