🤗 Transformers Notebooks

Here you can find a list of the official notebooks provided by Hugging Face.

We would also like to list interesting content created by the community here. If you wrote a notebook leveraging 🤗 Transformers and would like it to be listed here, please open a Pull Request so it can be included under the Community notebooks.

Hugging Face’s notebooks 🤗

| Notebook | Description |
|---|---|
| Getting Started Tokenizers | How to train and use your very own tokenizer |
| Getting Started Transformers | How to easily start using transformers |
| How to use Pipelines | A simple and efficient way to use state-of-the-art models on downstream tasks through transformers |
| How to fine-tune a model on text classification | Show how to preprocess the data and fine-tune a pretrained model on any GLUE task |
| How to fine-tune a model on language modeling | Show how to preprocess the data and fine-tune a pretrained model on a causal or masked LM task |
| How to fine-tune a model on token classification | Show how to preprocess the data and fine-tune a pretrained model on a token classification task (NER, PoS) |
| How to fine-tune a model on question answering | Show how to preprocess the data and fine-tune a pretrained model on SQuAD |
| How to train a language model from scratch | Highlight all the steps to effectively train a Transformer model on custom data |
| How to generate text | How to use different decoding methods for language generation with transformers |
| How to export a model to ONNX | Highlight how to export and run inference workloads through ONNX |
| How to use Benchmarks | How to benchmark models with transformers |
| Reformer | How Reformer pushes the limits of language modeling |

Community notebooks

More notebooks developed by the community are available here.