🤗 Transformers Notebooks
Here you can find a list of the official notebooks provided by Hugging Face.
We would also like to list interesting content created by the community. If you wrote a notebook leveraging 🤗 Transformers and would like it to be listed here, please open a Pull Request so it can be included under the Community notebooks.
Hugging Face’s notebooks 🤗
| Notebook | Description |
|:---|:---|
| Getting Started Tokenizers | How to train and use your very own tokenizer. |
| Getting Started Transformers | How to easily start using transformers. |
| How to use Pipelines | A simple and efficient way to use state-of-the-art models on downstream tasks through transformers (see the first sketch after this table). |
| How to fine-tune a model on text classification | Shows how to preprocess the data and fine-tune a pretrained model on any GLUE task (see the fine-tuning sketch after this table). |
| How to fine-tune a model on language modeling | Shows how to preprocess the data and fine-tune a pretrained model on a causal or masked LM task. |
| How to fine-tune a model on token classification | Shows how to preprocess the data and fine-tune a pretrained model on a token classification task (NER, PoS). |
| How to fine-tune a model on question answering | Shows how to preprocess the data and fine-tune a pretrained model on SQuAD. |
| How to train a language model from scratch | Highlights all the steps needed to effectively train a Transformer model on custom data. |
| How to generate text | How to use different decoding methods for language generation with transformers (see the generation sketch after this table). |
| How to export a model to ONNX | Highlights how to export models and run inference workloads through ONNX. |
| How to use Benchmarks | How to benchmark models with transformers. |
| Reformer | How Reformer pushes the limits of language modeling. |
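
To give a taste of what the pipelines notebook covers, here is a minimal sketch of the pipeline API; the sentiment-analysis task is just an illustrative choice, and with no model argument a default pretrained checkpoint is downloaded:

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline; without an explicit model,
# a default pretrained checkpoint is downloaded and cached.
classifier = pipeline("sentiment-analysis")

# The pipeline handles tokenization, inference, and post-processing.
print(classifier("We are very happy to show you the 🤗 Transformers library."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```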
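The fine-tuning notebooks all follow the same pattern: preprocess a dataset with a tokenizer, then train with the Trainer API. Below is a minimal sketch for text classification, assuming the 🤗 Datasets library is installed; the MRPC task, the bert-base-uncased checkpoint, and the hyperparameter values are illustrative choices, not recommendations:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Illustrative choices: the MRPC task from GLUE and a BERT base checkpoint.
dataset = load_dataset("glue", "mrpc")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

def preprocess(examples):
    # MRPC is a sentence-pair task, so both sentences are encoded together.
    return tokenizer(examples["sentence1"], examples["sentence2"], truncation=True)

encoded = dataset.map(preprocess, batched=True)

# Hyperparameters here are placeholders, not tuned values.
args = TrainingArguments(
    output_dir="test-glue",
    per_device_train_batch_size=16,
    num_train_epochs=3,
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```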
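The text generation notebook compares several decoding strategies exposed through generate(). The sketch below shows greedy decoding, beam search, and top-k/top-p sampling side by side; the gpt2 checkpoint, the prompt, and the parameter values are illustrative choices:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Today the weather is", return_tensors="pt")

# Greedy decoding: pick the highest-probability token at each step.
greedy = model.generate(**inputs, max_length=30)

# Beam search: keep the 5 most likely partial sequences at each step.
beams = model.generate(**inputs, max_length=30, num_beams=5, early_stopping=True)

# Top-k / top-p sampling: sample from a truncated distribution.
sampled = model.generate(
    **inputs, max_length=30, do_sample=True, top_k=50, top_p=0.95
)

print(tokenizer.decode(greedy[0], skip_special_tokens=True))
```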