
This is the ONNX variant of the bge-large-en-v1.5 embedding model, created with the DeepSparse Optimum integration.

To replicate the ONNX export, first install the DeepSparse Optimum integration:

```bash
pip install git+https://github.com/neuralmagic/optimum-deepsparse.git
```

Then run:

```python
from pathlib import Path

from optimum.deepsparse import DeepSparseModelForFeatureExtraction
from transformers.onnx.utils import get_preprocessor

model_id = "BAAI/bge-large-en-v1.5"

# load the model and convert it to ONNX
model = DeepSparseModelForFeatureExtraction.from_pretrained(model_id, export=True)
tokenizer = get_preprocessor(model_id)

# save the ONNX checkpoint and tokenizer
onnx_path = Path("bge-large-en-v1.5-dense")
model.save_pretrained(onnx_path)
tokenizer.save_pretrained(onnx_path)
```
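
To sanity-check the exported checkpoint, the saved directory can be loaded back for inference. The snippet below is a minimal sketch, not part of the original export script: it assumes the optimum-style API where the model's forward pass returns `last_hidden_state`, and it uses CLS-token pooling with L2 normalization, the pooling commonly recommended for the BGE models.

```python
import torch
from optimum.deepsparse import DeepSparseModelForFeatureExtraction
from transformers import AutoTokenizer

# load the exported ONNX checkpoint and its tokenizer from disk
# ("bge-large-en-v1.5-dense" is the directory created by the export script above)
onnx_path = "bge-large-en-v1.5-dense"
model = DeepSparseModelForFeatureExtraction.from_pretrained(onnx_path)
tokenizer = AutoTokenizer.from_pretrained(onnx_path)

sentences = ["DeepSparse runs transformer models efficiently on CPUs."]
inputs = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

outputs = model(**inputs)

# CLS pooling followed by L2 normalization (assumed pooling for BGE embeddings)
embeddings = outputs.last_hidden_state[:, 0]
embeddings = torch.nn.functional.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # (1, 1024) for bge-large-en-v1.5
```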