Introduction to Optimum Graphcore
Introduce Optimum-Graphcore with a BERT fine-tuning example.
 |
Train an external model |
Show how to train an external model that is not supported by Optimum or Transformers. |
|
Train your language model |
Show how to train a model for causal or masked language modeling from scratch.
|
How to fine-tune a model on text classification |
Show how to preprocess the data and fine-tune a pretrained model on any GLUE task. |
 |
How to fine-tune a model on language modeling |
Show how to preprocess the data and fine-tune a pretrained model on a causal or masked LM task. |
|
How to fine-tune a model on token classification |
Show how to preprocess the data and fine-tune a pretrained model on a token classification task (NER, PoS). |
|
How to fine-tune a model on question answering |
Show how to preprocess the data and fine-tune a pretrained model on SQuAD.
|
How to fine-tune a model on multiple choice |
Show how to preprocess the data and fine-tune a pretrained model on SWAG. |
|
How to fine-tune a model on translation |
Show how to preprocess the data and fine-tune a pretrained model on WMT. |
|
How to fine-tune a model on summarization |
Show how to preprocess the data and fine-tune a pretrained model on XSUM. |
|
How to fine-tune a model on audio classification |
Show how to preprocess the data and fine-tune a pretrained speech model on Keyword Spotting.
|
How to fine-tune a model on image classification
Show how to preprocess the data and fine-tune a pretrained model on image classification. |
 |
wav2vec 2.0 Fine-Tuning on IPU |
How to fine-tune a pretrained wav2vec 2.0 model with PyTorch on the Graphcore IPU-POD16 system.
|
wav2vec 2.0 Inference on IPU |
How to run inference on the wav2vec 2.0 model with PyTorch on the Graphcore IPU-POD16 system. |
|