ONNX version of intfloat/multilingual-e5-large
This is a sentence-transformers model: it maps sentences and paragraphs to a 1024-dimensional dense vector space and can be used for tasks like clustering or semantic search.
The model conversion was done with the onnx-convert tool, using the following parameters:
python convert.sh --model_id intfloat/multilingual-e5-large --quantize QInt8 --optimize 0
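For context, the QInt8 option corresponds to dynamic 8-bit integer quantization of the weights. The snippet below is only a minimal sketch of that step using onnxruntime's own quantization API, not the convert tool itself; the input/output file names are illustrative.

```python
# Sketch of QInt8 dynamic quantization with onnxruntime's quantization API.
# File names are illustrative; the onnx-convert tool's internals may differ.
from onnxruntime.quantization import quantize_dynamic, QuantType

quantize_dynamic(
    model_input="model.onnx",              # Float32 export
    model_output="model_opt0_QInt8.onnx",  # quantized output
    weight_type=QuantType.QInt8,           # signed 8-bit integer weights
)
```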
There are two versions of the model available:
- model.onnx: Float32 version, with optimize=0
- model_opt0_QInt8.onnx: QInt8 quantized version, with optimize=0
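A hedged usage sketch for either file is shown below: it loads the quantized graph with onnxruntime and the original tokenizer, then applies the average pooling, L2 normalization, and "query:"/"passage:" prefixes described in the upstream intfloat/multilingual-e5-large card. Those conventions, and the assumption that the exported graph takes input_ids and attention_mask and returns last_hidden_state first, are not stated in this card and may need adjusting.

```python
# Sketch: compute E5 embeddings from the ONNX files listed above.
import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("intfloat/multilingual-e5-large")
session = ort.InferenceSession("model_opt0_QInt8.onnx")  # or "model.onnx" for Float32

# "query:" / "passage:" prefixes follow the upstream E5 usage (assumption here).
texts = [
    "query: how much protein should a female eat",
    "passage: Proteins are essential nutrients for the human body.",
]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="np")

# Assumes the exported graph takes input_ids/attention_mask and returns
# last_hidden_state as its first output.
last_hidden = session.run(None, dict(inputs))[0]

# Average pooling over non-padding tokens, then L2 normalization.
mask = inputs["attention_mask"][..., None].astype(last_hidden.dtype)
embeddings = (last_hidden * mask).sum(axis=1) / mask.sum(axis=1)
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)

print(embeddings.shape)  # (2, 1024) for the large model
```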
Compared to the base/small versions of the model, this one is not optimized due to a bug in ONNX Runtime: https://github.com/microsoft/onnxruntime/issues/15563
License
Apache 2.0