πŸ€— Optimum Habana

πŸ€— Optimum Habana is the interface between the πŸ€— Transformers and πŸ€— Diffusers libraries and Habana’s Gaudi processor (HPU). It provides a set of tools that enable simple model loading, training and inference on single- and multi-HPU settings for various downstream tasks.
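In practice, the workflow mirrors πŸ€— Transformers: the `Trainer` and `TrainingArguments` classes are swapped for their Gaudi counterparts, and a Gaudi configuration tells the runtime how to execute on the HPU. The sketch below is illustrative only: the model name and argument values are placeholders, and it falls back gracefully when `optimum-habana` is not installed (e.g. on a machine without an HPU), so it should not be read as a complete training script.

```python
# Sketch: building a GaudiTrainer as a drop-in replacement for
# transformers.Trainer. GaudiTrainer / GaudiTrainingArguments come from
# the optimum-habana package; the import is guarded so the structure can
# be inspected on machines without Habana hardware or the library.
def build_trainer():
    try:
        from optimum.habana import GaudiTrainer, GaudiTrainingArguments
    except ImportError:
        # optimum-habana not installed -- nothing to build on this machine.
        return None

    from transformers import AutoModelForSequenceClassification

    # Placeholder model; any architecture from the support table works.
    model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")
    args = GaudiTrainingArguments(
        output_dir="out",
        use_habana=True,       # run on the HPU
        use_lazy_mode=True,    # Gaudi lazy execution mode
        gaudi_config_name="Habana/bert-base-uncased",  # HPU-specific config (e.g. mixed precision)
    )
    return GaudiTrainer(model=model, args=args)


trainer = build_trainer()
print("trainer ready" if trainer is not None else "optimum-habana unavailable")
```

From there, `trainer.train()` behaves like the standard πŸ€— Transformers training loop, which is why most existing Transformers examples need only these small substitutions to run on Gaudi.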

The table below shows which model architectures, tasks and device distributions are currently supported for πŸ€— Optimum Habana:

| Architecture | Text Classification | Question Answering | Language Modeling | Summarization | Translation | Image Classification | Audio Classification | Speech Recognition | Single Card | Multi Card | DeepSpeed |
|---|---|---|---|---|---|---|---|---|---|---|---|
| BERT | βœ… | βœ… | βœ… | ❌ | ❌ | ❌ | ❌ | ❌ | βœ… | βœ… | βœ… |
| RoBERTa | ❌ | βœ… | βœ… | ❌ | ❌ | ❌ | ❌ | ❌ | βœ… | βœ… | βœ… |
| ALBERT | ❌ | βœ… | βœ… | ❌ | ❌ | ❌ | ❌ | ❌ | βœ… | βœ… | βœ… |
| DistilBERT | ❌ | βœ… | βœ… | ❌ | ❌ | ❌ | ❌ | ❌ | βœ… | βœ… | βœ… |
| GPT2 | ❌ | ❌ | βœ… | ❌ | ❌ | ❌ | ❌ | ❌ | βœ… | βœ… | βœ… |
| T5 | ❌ | ❌ | ❌ | βœ… | βœ… | ❌ | ❌ | ❌ | βœ… | βœ… | βœ… |
| ViT | ❌ | ❌ | ❌ | ❌ | ❌ | βœ… | ❌ | ❌ | βœ… | βœ… | βœ… |
| Swin | ❌ | ❌ | ❌ | ❌ | ❌ | βœ… | ❌ | ❌ | βœ… | βœ… | βœ… |
| Wav2Vec2 | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | βœ… | βœ… | βœ… | βœ… | βœ… |
| Stable Diffusion | – | – | – | – | – | – | – | – | βœ… | ❌ | ❌ |

Other models and tasks supported by the πŸ€— Transformers library may also work. You can refer to the Quickstart for examples of how to use them with πŸ€— Optimum Habana. In addition, this page explains how to modify any example from the πŸ€— Transformers library so that it works with πŸ€— Optimum Habana.