regisss (HF staff) committed on
Commit
53a787c
1 Parent(s): ea3256c

Update README.md


Remove GaudiConfig from the usage example because it is not mandatory anymore

Files changed (1)
  1. README.md +3 -4
README.md CHANGED
@@ -23,24 +23,23 @@ This enables to specify:
 ## Usage
 
 The model is instantiated the same way as in the Transformers library.
-The only difference is that the Gaudi configuration has to be loaded and provided to the trainer:
+The only difference is that there are a few new training arguments specific to HPUs:
 
 ```
-from optimum.habana import GaudiConfig, GaudiTrainer, GaudiTrainingArguments
+from optimum.habana import GaudiTrainer, GaudiTrainingArguments
 from transformers import BertTokenizer, BertModel
 
 tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
 model = BertModel.from_pretrained("bert-base-uncased")
-gaudi_config = GaudiConfig.from_pretrained("Habana/bert-base-uncased")
 args = GaudiTrainingArguments(
     output_dir="/tmp/output_dir",
     use_habana=True,
     use_lazy_mode=True,
+    gaudi_config_name="Habana/bert-base-uncased",
 )
 
 trainer = GaudiTrainer(
     model=model,
-    gaudi_config=gaudi_config,
     args=args,
     tokenizer=tokenizer,
 )
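For reference, here is a minimal sketch of how the updated usage example reads once this commit is applied, with explanatory comments added; the comments are not part of the README snippet itself.

```
# After this change, no GaudiConfig object is created explicitly:
# passing `gaudi_config_name` to GaudiTrainingArguments is enough.
from optimum.habana import GaudiTrainer, GaudiTrainingArguments
from transformers import BertTokenizer, BertModel

# Load the tokenizer and model exactly as with the Transformers library
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# HPU-specific training arguments: run on Habana devices, enable lazy mode,
# and point to the Gaudi configuration stored in this repository
args = GaudiTrainingArguments(
    output_dir="/tmp/output_dir",
    use_habana=True,
    use_lazy_mode=True,
    gaudi_config_name="Habana/bert-base-uncased",
)

# GaudiTrainer mirrors transformers.Trainer; in a real run, train/eval
# datasets and a data collator would be passed here as usual
trainer = GaudiTrainer(
    model=model,
    args=args,
    tokenizer=tokenizer,
)
```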