Compatible with Transformers.js or ONNX Runtime

#2
by agibrat - opened

Is it possible to make this model compatible with Transformers.js or ONNX Runtime? My goal is to run it in the browser.

I made an attempt at https://huggingface.co/gmarcFG/tech-keywords-extractor-onnx

but we get an error when we try to run it in the browser with Transformers.js:

```
onnxruntime/core/graph/model.cc:146 onnxruntime::Model::Model(ModelProto &&, const PathString &, const IOnnxRuntimeOpSchemaRegistryList *, const logging::Logger &, const ModelOptions &) Unsupported model IR version: 9, max supported IR version: 8
```


I've also had some issues when converting to ONNX:

```
python3 -m scripts.convert --quantize --model_id ilsilfverskiold/tech-keywords-extractor
```

The ONNX export succeeded with a warning, but quantization then failed:

```
The maximum absolute difference between the output of the reference model and the ONNX exported model is not within the set tolerance 1e-05:
  - logits: max diff = 6.866455078125e-05.
The exported model was saved at: models/ilsilfverskiold/tech-keywords-extractor
Quantizing:   0%|          | 0/4 [00:09<?, ?it/s]
...
TypeError: quantize_dynamic() got an unexpected keyword argument 'optimize_model'
```
