Fine-tuning, Check!

That was fun! In the first two chapters you learned about models and tokenizers, and now you know how to fine-tune them for your own data. To recap, in this chapter you:

  • Learned about datasets on the Hub
  • Learned how to load and preprocess datasets, including using dynamic padding and collators (recapped in the first sketch below)
  • Implemented your own fine-tuning and evaluation of a model (second sketch below)
  • Implemented a lower-level training loop
  • Used 🤗 Accelerate to easily adapt your training loop so it works for multiple GPUs or TPUs (third sketch below)
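
As a quick refresher, here is roughly how those pieces fit together in code. The sketches use the MRPC task from the GLUE benchmark and a `bert-base-uncased` checkpoint as stand-ins, so swap in your own dataset and model. First, loading a dataset from the Hub and tokenizing it without padding, leaving padding to a data collator so each batch is padded dynamically to its own longest sequence:

```python
from datasets import load_dataset
from transformers import AutoTokenizer, DataCollatorWithPadding

raw_datasets = load_dataset("glue", "mrpc")
checkpoint = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)


def tokenize_function(example):
    # No padding here: the collator pads each batch to its longest sequence
    return tokenizer(example["sentence1"], example["sentence2"], truncation=True)


tokenized_datasets = raw_datasets.map(tokenize_function, batched=True)
data_collator = DataCollatorWithPadding(tokenizer=tokenizer)
```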
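
Next, a minimal `Trainer` setup that continues from the objects above: it fine-tunes a classification head and evaluates with the matching GLUE metric at the end of every epoch. The output directory and hyperparameters are placeholders.

```python
import numpy as np
import evaluate
from transformers import AutoModelForSequenceClassification, Trainer, TrainingArguments

model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
training_args = TrainingArguments("test-trainer", evaluation_strategy="epoch")

metric = evaluate.load("glue", "mrpc")


def compute_metrics(eval_preds):
    # Convert logits to class predictions before computing the metric
    logits, labels = eval_preds
    predictions = np.argmax(logits, axis=-1)
    return metric.compute(predictions=predictions, references=labels)


trainer = Trainer(
    model,
    training_args,
    train_dataset=tokenized_datasets["train"],
    eval_dataset=tokenized_datasets["validation"],
    data_collator=data_collator,
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
```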
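
Finally, a sketch of the lower-level alternative: a plain PyTorch training loop wrapped with 🤗 Accelerate so the same code runs on one GPU, several GPUs, or TPUs. It reuses `tokenized_datasets` and `data_collator` from the first sketch; the batch size, learning rate, and epoch count are illustrative.

```python
from torch.optim import AdamW
from torch.utils.data import DataLoader
from accelerate import Accelerator
from transformers import AutoModelForSequenceClassification

# Keep only the columns the model expects, named the way it expects them
tokenized_datasets = tokenized_datasets.remove_columns(["sentence1", "sentence2", "idx"])
tokenized_datasets = tokenized_datasets.rename_column("label", "labels")
tokenized_datasets.set_format("torch")

train_dataloader = DataLoader(
    tokenized_datasets["train"], shuffle=True, batch_size=8, collate_fn=data_collator
)

model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
optimizer = AdamW(model.parameters(), lr=5e-5)

accelerator = Accelerator()
# prepare() moves everything to the right device(s) and wraps the dataloader for distributed runs
model, optimizer, train_dataloader = accelerator.prepare(model, optimizer, train_dataloader)

model.train()
for epoch in range(3):
    for batch in train_dataloader:
        outputs = model(**batch)
        loss = outputs.loss
        accelerator.backward(loss)
        optimizer.step()
        optimizer.zero_grad()
```

To actually spread this across several GPUs or TPU cores, you would put the loop in a script and start it with `accelerate launch` after running `accelerate config`.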