BUG - "Could not load model" error
I am getting the following error:
Could not load model akdeniz27/bert-base-turkish-cased-ner-quantized with any of the following classes: (<class 'transformers.models.bert.modeling_bert.BertForTokenClassification'>, <class 'transformers.models.bert.modeling_tf_bert.TFBertForTokenClassification'>).
It should work if "optimum[onnxruntime]" is installed and "ORTModelForTokenClassification" is imported:
# Install Optimum with ONNX Runtime support
!pip install "optimum[onnxruntime]"

from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForTokenClassification

# Load the quantized ONNX model and its tokenizer from the Hub
model = ORTModelForTokenClassification.from_pretrained("akdeniz27/bert-base-turkish-cased-ner-quantized", file_name="model_quantized.onnx")
tokenizer = AutoTokenizer.from_pretrained("akdeniz27/bert-base-turkish-cased-ner-quantized")

# Build a token-classification pipeline and run it
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="first")
ner("your text here")
Thanks for the support. It worked flawlessly, as you specified.
It also works when the "file_name" parameter is omitted.
Glad to hear it's working. By the way, as you've pointed out, the "file_name" parameter is no longer needed as long as the model file's extension is ".onnx".
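For reference, the simplified loading from the replies above would look like this. This is a sketch based on the behavior described in this thread (automatic discovery of a ".onnx" file in the repository); exact optimum version requirements are not stated here, and running it downloads the model from the Hub:

```python
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForTokenClassification

model_id = "akdeniz27/bert-base-turkish-cased-ner-quantized"

# No file_name argument: per the discussion above, the ONNX file is
# found automatically when its extension is ".onnx".
model = ORTModelForTokenClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="first")
print(ner("your text here"))
```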