End-of-chapter quiz

This chapter covered a lot of ground! Don’t worry if you didn’t grasp all the details; the next chapters will help you understand how things work under the hood.

First, though, let’s test what you learned in this chapter!

1. Explore the Hub and look for the roberta-large-mnli checkpoint. What task does it perform?
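If you prefer to check this programmatically rather than in the browser, here is a minimal sketch that reads the checkpoint's metadata from the Hub and prints the task it is tagged with (assuming the huggingface_hub library is installed):

from huggingface_hub import model_info

# Fetch the Hub metadata for the checkpoint and print its declared task (pipeline tag)
info = model_info("roberta-large-mnli")
print(info.pipeline_tag)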

2. What will the following code return?

from transformers import pipeline

ner = pipeline("ner", grouped_entities=True)
ner("My name is Sylvain and I work at Hugging Face in Brooklyn.")

3. What should replace … in this code sample?

from transformers import pipeline

filler = pipeline("fill-mask", model="bert-base-cased")
result = filler("...")

4. Why will this code fail?

from transformers import pipeline

classifier = pipeline("zero-shot-classification")
result = classifier("This is a course about the Transformers library")

5. What does “transfer learning” mean?

6. True or false? A language model usually does not need labels for its pretraining.

7. Select the sentence that best describes the terms “model”, “architecture”, and “weights”.

8. Which of these types of models would you use for completing prompts with generated text?

9. Which of these types of models would you use for summarizing texts?

10. Which of these types of models would you use for classifying text inputs according to certain labels?

11. What is a possible source of the bias observed in a model?
