Optimum documentation

Quickstart

πŸ€— Optimum Graphcore was designed with one goal in mind: to make training and evaluation straightforward for any πŸ€— Transformers user while leveraging the complete power of IPUs.

Installation

To install the latest release of πŸ€— Optimum Graphcore:

pip install optimum-graphcore

Optimum Graphcore is a fast-moving project, so you may want to install it from source to get the latest features:

pip install git+https://github.com/huggingface/optimum-graphcore.git

Environment setup

You need to have the Poplar SDK enabled to use πŸ€— Optimum Graphcore. Refer to the Getting Started guide for your system for details on how to enable the Poplar SDK. Also refer to the Jupyter Quick Start guide for how to set up Jupyter so that you can run notebooks on a remote IPU machine.
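
Once the SDK's enable script has been sourced in your shell, a quick sanity check from Python can confirm that the environment is set up before you start training. This is a minimal sketch: the POPLAR_SDK_ENABLED variable name is an assumption about what the enable script exports, so adapt it to your SDK version.

import os

# The Poplar SDK enable script exports environment variables into the shell;
# POPLAR_SDK_ENABLED is assumed here to be one of them (check your Getting Started guide).
if "POPLAR_SDK_ENABLED" not in os.environ:
    raise RuntimeError("Poplar SDK not detected: source the SDK enable script first.")

# PopTorch ships with the Poplar SDK; importing it confirms the SDK is usable from Python.
import poptorch
print("PopTorch version:", poptorch.__version__)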

Example scripts and Jupyter notebooks

We have a range of example scripts in /examples and Jupyter notebooks in /notebooks which show how specific elements of the Optimum Graphcore library are used.

Using Optimum Graphcore

The main Optimum Graphcore classes are:

  • IPUTrainer: This class handles compiling the model to run on IPUs, as well as performing the training and evaluation. Refer to the section on training for more information.
  • IPUConfig: This class specifies the attributes and configuration parameters used to compile the model and put it on the IPU. Refer to the configuration section for more information; a short usage sketch follows this list.
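
An IPUConfig can be loaded from the Hugging Face Hub by name, exactly as in the training example below. The snippet is a minimal sketch: from_pretrained is shown in this guide, while save_pretrained is an assumption based on the usual πŸ€— configuration API.

from optimum.graphcore import IPUConfig

# Load an IPU configuration hosted on the Hub
ipu_config = IPUConfig.from_pretrained("Graphcore/bert-base-ipu")

# Save it locally so it can be inspected, edited and reloaded later
# (save_pretrained is assumed to follow the standard πŸ€— configuration API)
ipu_config.save_pretrained("./my-ipu-config")
ipu_config = IPUConfig.from_pretrained("./my-ipu-config")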

IPUTrainer is very similar to the πŸ€— Transformers Trainer class, so you can easily adapt a script that currently uses Trainer to make it work with IPUs. For the most part, you simply swap Trainer for IPUTrainer and TrainingArguments for IPUTrainingArguments. This is how most of the Optimum Graphcore example scripts were adapted from the original Hugging Face scripts.

-from transformers import Trainer, TrainingArguments
+from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

-training_args = TrainingArguments(
+training_args = IPUTrainingArguments(
     per_device_train_batch_size=4,
     learning_rate=1e-4,
+    # Any IPUConfig on the Hub or stored locally
+    ipu_config_name="Graphcore/bert-base-ipu",
+)
+
+# Loading the IPUConfig needed by the IPUTrainer to compile and train the model on IPUs
+ipu_config = IPUConfig.from_pretrained(
+    training_args.ipu_config_name,
 )

 # Initialize our Trainer
-trainer = Trainer(
+trainer = IPUTrainer(
     model=model,
+    ipu_config=ipu_config,
     args=training_args,
     train_dataset=train_dataset if training_args.do_train else None,
    ...  # Other arguments
 )
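
Putting the pieces together, a complete fine-tuning script looks roughly like the following. This is a minimal sketch rather than one of the library's example scripts: the checkpoint, dataset and tokenization details are illustrative assumptions, while the IPUTrainingArguments, IPUConfig and IPUTrainer calls mirror the diff above.

from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.graphcore import IPUConfig, IPUTrainer, IPUTrainingArguments

# Illustrative checkpoint; any sequence classification model works the same way
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize a small text classification dataset (SST-2 used purely as an illustration)
dataset = load_dataset("glue", "sst2")

def tokenize(examples):
    return tokenizer(examples["sentence"], truncation=True, padding="max_length", max_length=128)

train_dataset = dataset["train"].map(tokenize, batched=True)

training_args = IPUTrainingArguments(
    output_dir="./outputs",
    per_device_train_batch_size=4,
    learning_rate=1e-4,
    num_train_epochs=1,
    # Any IPUConfig on the Hub or stored locally
    ipu_config_name="Graphcore/bert-base-ipu",
)

# The IPUConfig tells the IPUTrainer how to compile and place the model on IPUs
ipu_config = IPUConfig.from_pretrained(training_args.ipu_config_name)

trainer = IPUTrainer(
    model=model,
    ipu_config=ipu_config,
    args=training_args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)

trainer.train()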

We also support the pipeline API, so you can easily run a model on IPUs on a given input, for example text, image or audio:

->>> from transformers import pipeline
+>>> from optimum.graphcore import pipeline

# Allocate a pipeline for sentiment-analysis
->>> classifier = pipeline('sentiment-analysis', model="distilbert-base-uncased-finetuned-sst-2-english")
+>>> classifier = pipeline('sentiment-analysis', model="distilbert-base-uncased-finetuned-sst-2-english", ipu_config="Graphcore/distilbert-base-ipu")
>>> classifier('We are very happy to introduce pipeline to the transformers repository.')
[{'label': 'POSITIVE', 'score': 0.9996947050094604}]