Single-HPU Training

You are viewing the v1.7.3 documentation. A newer version, v1.19.0, is available.

Training on a single device is as straightforward as in Transformers:

  • You need to replace the Transformers’ Trainer class with the GaudiTrainer class,
  • You need to replace the Transformers’ TrainingArguments class with the GaudiTrainingArguments class and add the following arguments:
    • use_habana to execute your script on an HPU,
    • use_lazy_mode to specify whether to use lazy mode (if disabled, eager mode is used),
    • gaudi_config_name to specify the name of (on the Hub) or the path to (local) your Gaudi configuration file.
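The changes above can be sketched as follows. This is a minimal illustration, assuming a model and datasets have already been prepared as in any Transformers training script; the output directory and the Gaudi configuration name ("Habana/bert-base-uncased") are illustrative assumptions, not values prescribed by this guide.

```python
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

# GaudiTrainingArguments replaces transformers.TrainingArguments
# and adds the HPU-specific arguments described above.
training_args = GaudiTrainingArguments(
    output_dir="./results",                        # illustrative path
    use_habana=True,                               # execute on an HPU
    use_lazy_mode=True,                            # lazy mode; False means eager mode
    gaudi_config_name="Habana/bert-base-uncased",  # Hub name or local path (assumed example)
)

# GaudiTrainer replaces transformers.Trainer; the rest of the
# training loop is unchanged. `model`, `train_dataset`, and
# `eval_dataset` are assumed to be defined earlier in your script.
trainer = GaudiTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
)
trainer.train()
```

Running this sketch requires a machine with an HPU and the optimum-habana package installed; on other hardware it will not execute.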

You can find several examples for various tasks in the official repository.

To go further, we invite you to read our guides about accelerating training and pretraining.