Commit 49793c7
regisss (HF staff) committed
1 Parent(s): 3b2e653

Update README.md

Files changed (1): README.md (+19, -9)
README.md CHANGED
@@ -6,7 +6,7 @@ license: apache-2.0
 Learn more about how to take advantage of the power of Habana HPUs to train Transformers models at [hf.co/Habana](https://huggingface.co/Habana).
 
 
-# RoBERTa Base model HPU configuration
+## RoBERTa Base model HPU configuration
 
 This model contains just the `GaudiConfig` file for running the [roberta-base](https://huggingface.co/roberta-base) model on Habana's Gaudi processors (HPU).
 
@@ -25,17 +25,27 @@ This enables to specify:
 ## Usage
 
 The model is instantiated the same way as in the Transformers library.
-The only difference is that a Gaudi configuration associated to this model has to be loaded and provided to the trainer.
+The only difference is that the Gaudi configuration has to be loaded and provided to the trainer:
 
 ```
-from transformers import RobertaTokenizer, RobertaModel
-from optimum.habana import GaudiTrainer, GaudiTrainingArguments, GaudiConfig
+from optimum.habana import GaudiConfig, GaudiTrainer, GaudiTrainingArguments
+from transformers import RobertaModel, RobertaTokenizer
 
-tokenizer = RobertaTokenizer.from_pretrained('roberta-base')
-model = RobertaModel.from_pretrained('roberta-base')
-gaudi_config = GaudiConfig.from_pretrained("Habana/roberta-base")
-args = GaudiTrainingArguments(output_dir=path_to_my_output_dir, use_habana=True, use_lazy_mode=True)
 
-trainer = GaudiTrainer(model=model, gaudi_config=gaudi_config, args=args, tokenizer=tokenizer)
+tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
+model = RobertaModel.from_pretrained("roberta-base")
+gaudi_config = GaudiConfig.from_pretrained("Habana/roberta-base")
+args = GaudiTrainingArguments(
+    output_dir="/tmp/output_dir",
+    use_habana=True,
+    use_lazy_mode=True,
+)
+
+trainer = GaudiTrainer(
+    model=model,
+    gaudi_config=gaudi_config,
+    args=args,
+    tokenizer=tokenizer,
+)
 trainer.train()
 ```
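
`GaudiTrainer` follows the same API as `transformers.Trainer`, so a training set is passed through the usual `train_dataset` argument. The sketch below is a minimal end-to-end illustration of the committed snippet, not taken from this repository: the IMDb dataset, the classification head (`RobertaForSequenceClassification` instead of the bare `RobertaModel` above), and the hyperparameters are assumptions chosen so the trainer has a loss to optimize.

```
# Minimal sketch (illustrative assumptions: dataset, task head, hyperparameters).
from datasets import load_dataset
from optimum.habana import GaudiConfig, GaudiTrainer, GaudiTrainingArguments
from transformers import RobertaForSequenceClassification, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
# A classification head gives the trainer a loss to minimize.
model = RobertaForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

# Gaudi-specific training settings are loaded from this model repository.
gaudi_config = GaudiConfig.from_pretrained("Habana/roberta-base")

# Small slice of a text-classification dataset, tokenized to a fixed length.
train_dataset = load_dataset("imdb", split="train[:1%]").map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128),
    batched=True,
)

args = GaudiTrainingArguments(
    output_dir="/tmp/output_dir",
    use_habana=True,      # run on HPU
    use_lazy_mode=True,   # HPU lazy-mode execution
    per_device_train_batch_size=8,
    num_train_epochs=1,
)

trainer = GaudiTrainer(
    model=model,
    gaudi_config=gaudi_config,
    args=args,
    train_dataset=train_dataset,  # same argument as in transformers.Trainer
    tokenizer=tokenizer,
)
trainer.train()
```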