fxmarty and maanavdalal committed
Commit ba0ac1d • 1 Parent(s): 1433ab1

Fixed description wording for clarity and typos (#2)


- Fixed description wording for clarity and typos (ded6a788cadca08944cb99c2c9271dfa7db740ad)


Co-authored-by: Maanav D <maanavdalal@users.noreply.huggingface.co>

Files changed (1)
  1. app.py +4 -4
app.py CHANGED
@@ -91,17 +91,17 @@ TITLE = """
 
 # for some reason https://huggingface.co/settings/tokens is not showing as a link by default?
 DESCRIPTION = """
-This Space allows to automatically convert to ONNX 🤗 transformers PyTorch models hosted on the Hugging Face Hub. It opens a PR on the target model, and it is up to the owner of the original model
+This Space allows you to automatically convert 🤗 transformers PyTorch models hosted on the Hugging Face Hub to [ONNX](https://onnx.ai/). It opens a PR on the target model, and it is up to the owner of the original model
 to merge the PR to allow people to leverage the ONNX standard to share and use the model on a wide range of devices!
 
-Once converted, the model can for example be used in the [🤗 Optimum](https://huggingface.co/docs/optimum/) library following closely the transormers API.
+Once converted, the model can, for example, be used in the [🤗 Optimum](https://huggingface.co/docs/optimum/) library closely following the transformers API.
 Check out [this guide](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models) to see how!
 
-The steps are the following:
+The steps are as follows:
 - Paste a read-access token from [https://huggingface.co/settings/tokens](https://huggingface.co/settings/tokens). Read access is enough given that we will open a PR against the source repo.
 - Input a model id from the Hub (for example: [textattack/distilbert-base-cased-CoLA](https://huggingface.co/textattack/distilbert-base-cased-CoLA))
 - Click "Convert to ONNX"
-- That's it! You'll get feedback if it works or not, and if it worked, you'll get the URL of the opened PR!
+- That's it! You'll get feedback on whether the conversion was successful, and if it was, you'll get the URL of the opened PR!
 
 Note: in case the model to convert is larger than 2 GB, it will be saved in a subfolder called `onnx/`. To load it from Optimum, the argument `subfolder="onnx"` should be provided.
 """