Spaces: onnx

Felix Marty committed
Commit 04e8a16 • 1 parent: c9b76af

rename to export

Files changed (2):
  1. README.md +2 -2
  2. app.py +8 -8
README.md CHANGED
@@ -1,6 +1,6 @@
 ---
-title: Convert to ONNX
-emoji: 👀
+title: Export to ONNX
+emoji: 🏎️
 colorFrom: green
 colorTo: purple
 sdk: gradio
app.py CHANGED
@@ -59,7 +59,7 @@ def onnx_export(token: str, model_id: str, task: str, opset: Union[int, str]) ->
         commit_url = repo.push_to_hub()
         print("[dataset]", commit_url)
 
-        return f"#### Success 🔥 Yay! This model was successfully converted and a PR was open using your token, here: [{commit_info.pr_url}]({commit_info.pr_url})"
+        return f"#### Success 🔥 Yay! This model was successfully exported and a PR was opened using your token, here: [{commit_info.pr_url}]({commit_info.pr_url})"
     except Exception as e:
         return f"#### Error: {e}"
 
@@ -89,26 +89,26 @@ TITLE = """
 "
 >
     <h1 style="font-weight: 900; margin-bottom: 10px; margin-top: 10px;">
-    Convert transformers model to ONNX with 🤗 Optimum exporters 🏎️ (Beta)
+    Export transformers model to ONNX with 🤗 Optimum exporters 🏎️ (Beta)
     </h1>
 </div>
 """
 
 # for some reason https://huggingface.co/settings/tokens is not showing as a link by default?
 DESCRIPTION = """
-This Space allows you to automatically convert 🤗 transformers PyTorch models hosted on the Hugging Face Hub to [ONNX](https://onnx.ai/). It opens a PR on the target model, and it is up to the owner of the original model
+This Space allows you to automatically export 🤗 transformers PyTorch models hosted on the Hugging Face Hub to [ONNX](https://onnx.ai/). It opens a PR on the target model, and it is up to the owner of the original model
 to merge the PR to allow people to leverage the ONNX standard to share and use the model on a wide range of devices!
 
-Once converted, the model can, for example, be used in the [🤗 Optimum](https://huggingface.co/docs/optimum/) library closely following the transformers API.
+Once exported, the model can, for example, be used in the [🤗 Optimum](https://huggingface.co/docs/optimum/) library closely following the transformers API.
 Check out [this guide](https://huggingface.co/docs/optimum/main/en/onnxruntime/usage_guides/models) to see how!
 
 The steps are as following:
 - Paste a read-access token from [https://huggingface.co/settings/tokens](https://huggingface.co/settings/tokens). Read access is enough given that we will open a PR against the source repo.
 - Input a model id from the Hub (for example: [textattack/distilbert-base-cased-CoLA](https://huggingface.co/textattack/distilbert-base-cased-CoLA))
-- Click "Convert to ONNX"
-- That's it! You'll get feedback on if the conversion was successful or not, and if it was, you'll get the URL of the opened PR!
+- Click "Export to ONNX"
+- That's it! You'll get feedback on whether the export was successful, and if it was, you'll get the URL of the opened PR!
 
-Note: in case the model to convert is larger than 2 GB, it will be saved in a subfolder called `onnx/`. To load it from Optimum, the argument `subfolder="onnx"` should be provided.
+Note: in case the model to export is larger than 2 GB, it will be saved in a subfolder called `onnx/`. To load it from Optimum, the argument `subfolder="onnx"` should be provided.
 """
 
 with gr.Blocks() as demo:
@@ -140,7 +140,7 @@ with gr.Blocks() as demo:
         label="ONNX opset (optional, can be left blank)",
     )
 
-    btn = gr.Button("Convert to ONNX")
+    btn = gr.Button("Export to ONNX")
     output = gr.Markdown(label="Output")
 
     btn.click(