regisss committed
Commit: f5a9c6f
Parent: 011e0e6

Update broken link

Files changed (1): README.md (+2, -2)
README.md CHANGED
@@ -7,13 +7,13 @@ Learn more about how to take advantage of the power of Habana HPUs to train Tran
 
 ## ALBERT XXLarge model HPU configuration
 
-This model contains just the `GaudiConfig` file for running the [albert-xxlarge-v1](https://huggingface.co/albert-xxlarge-v1) model on Habana's Gaudi processors (HPU).
+This model only contains the `GaudiConfig` file for running the [albert-xxlarge-v1](https://huggingface.co/albert-xxlarge-v1) model on Habana's Gaudi processors (HPU).
 
 **This model contains no model weights, only a GaudiConfig.**
 
 This enables to specify:
 - `use_habana_mixed_precision`: whether to use Habana Mixed Precision (HMP)
-- `hmp_opt_level`: optimization level for HMP, see [here](https://docs.habana.ai/en/latest/PyTorch/PyTorch_User_Guide/PT_Mixed_Precision.html#configuration-options) for a detailed explanation
+- `hmp_opt_level`: optimization level for HMP, see [here](https://docs.habana.ai/en/latest/PyTorch/PyTorch_Mixed_Precision/PT_Mixed_Precision.html#configuration-options) for a detailed explanation
 - `hmp_bf16_ops`: list of operators that should run in bf16
 - `hmp_fp32_ops`: list of operators that should run in fp32
 - `hmp_is_verbose`: verbosity
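
For reference, a `GaudiConfig` like the one this README describes is typically loaded through the `optimum-habana` library. Below is a minimal sketch, assuming the configuration lives in a Hub repository named `Habana/albert-xxlarge-v1` (that repository id is an assumption for illustration, not something stated in this commit):

```python
# Minimal sketch: inspecting a GaudiConfig with optimum-habana.
# The repository id below is an assumption for illustration.
from optimum.habana import GaudiConfig

gaudi_config = GaudiConfig.from_pretrained("Habana/albert-xxlarge-v1")

# Fields described in the README above:
print(gaudi_config.use_habana_mixed_precision)  # whether HMP is enabled
print(gaudi_config.hmp_bf16_ops)                # operators that run in bf16
print(gaudi_config.hmp_fp32_ops)                # operators that run in fp32
```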