Adding ONNX file of this model

#1
by sappho192 - opened

Beep boop, I am the ONNX export bot 🤖🏎️. On behalf of sappho192, I would like to add the model converted to ONNX to this repository.

What is ONNX? It stands for "Open Neural Network Exchange", and is the most commonly used open standard for machine learning interoperability. You can find out more at onnx.ai!

The exported ONNX model can then be consumed by various backends such as TensorRT or TVM, or simply be used in a few lines with 🤗 Optimum through ONNX Runtime; check out how here!

This Space allows you to automatically export πŸ€— transformers PyTorch models hosted on the Hugging Face Hub to ONNX. It opens a PR on the target model, and it is up to the owner of the original model to merge the PR to allow people to leverage the ONNX standard to share and use the model on a wide range of devices!

Once exported, the model can, for example, be used in the 🤗 Optimum library, which closely follows the transformers API. Check out this guide to see how!

The steps are as follows:

1. Paste a read-access token from https://huggingface.co/settings/tokens. Read access is enough, since we only open a PR against the source repo.
2. Input a model id from the Hub (for example: textattack/distilbert-base-cased-CoLA).
3. Click "Export to ONNX".
That's it! You'll get feedback on whether the export was successful, and if it was, you'll get the URL of the opened PR!
Note: if the model to export is larger than 2 GB, it will be saved in a subfolder called onnx/. To load it with Optimum, the argument subfolder="onnx" must be provided.
Success πŸ”₯ Yay! This model was successfully exported and a PR was open using your token, here: https://huggingface.co/sappho192/ffxiv-ja-ko-translator/discussions/1. If you would like to use the exported model without waiting for the PR to be approved, head to https://huggingface.co/sappho192/ffxiv-ja-ko-translator/tree/refs%2Fpr%2F1
sappho192 changed pull request status to merged
