🤗 Optimum Habana is the interface between the 🤗 Transformers and 🤗 Diffusers libraries and Habana’s Gaudi processor (HPU). It provides a set of tools that enable simple model loading, training and inference on single- and multi-HPU settings for various downstream tasks.
The table below shows which model architectures, tasks and device distributions are currently supported for 🤗 Optimum Habana:
- Tasks: Text Classification, Question Answering, Language Modeling, Summarization, Translation, Image Classification, Audio Classification, Speech Recognition
- Device distributions: Single Card, Multi Card, DeepSpeed
Other models and tasks supported by the 🤗 Transformers library may also work. You can refer to the Quickstart for examples of using them with 🤗 Optimum Habana. This page also explains how to adapt any example from the 🤗 Transformers library so that it works with 🤗 Optimum Habana.
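As a minimal sketch of the adaptation pattern described above, training code written for 🤗 Transformers can usually be ported by swapping `Trainer` and `TrainingArguments` for their Gaudi counterparts. The checkpoint name and Gaudi configuration below are illustrative assumptions, and running this requires Habana hardware with the `optimum-habana` package installed:

```python
# Sketch: adapting a 🤗 Transformers training script for Gaudi (HPU).
# Assumes Habana hardware and the optimum-habana package are available;
# the checkpoint and Gaudi config names below are illustrative.
from transformers import AutoModelForSequenceClassification
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# GaudiTrainingArguments replaces TrainingArguments and enables HPU execution
training_args = GaudiTrainingArguments(
    output_dir="./results",
    use_habana=True,        # run on HPU instead of CPU/GPU
    use_lazy_mode=True,     # lazy-mode graph execution on Gaudi
    gaudi_config_name="Habana/bert-base-uncased",  # Gaudi config from the Hub
)

# GaudiTrainer replaces Trainer; the rest of the script is unchanged
trainer = GaudiTrainer(
    model=model,
    args=training_args,
    # train_dataset=..., eval_dataset=..., tokenizer=... as usual
)
trainer.train()
```

Aside from these two substitutions, the training loop, datasets, and tokenizer setup remain exactly as in the original 🤗 Transformers script.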
Learn the basics and become familiar with training transformers on HPUs with 🤗 Optimum. Start here if you are using 🤗 Optimum Habana for the first time!
Practical guides to help you achieve a specific goal. Take a look at these guides to learn how to use 🤗 Optimum Habana to solve real-world problems.
High-level explanations to help you build a better understanding of important topics such as HPUs.
Technical descriptions of how the Habana classes and methods of 🤗 Optimum Habana work.