
```python
from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
```
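The imports above assume both `transformers` and the ONNX Runtime extra of `optimum` are installed; a typical install line (extra name taken from the optimum packaging) is:

```shell
pip install "optimum[onnxruntime]" transformers
```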

Convert to an ONNX model

```python
def convert(path, onnx_path):
    # Export the Transformers checkpoint to ONNX
    # (newer optimum releases use export=True instead of from_transformers=True)
    onnx_model = ORTModelForSequenceClassification.from_pretrained(path, from_transformers=True)
    tokenizer = AutoTokenizer.from_pretrained(path)

    onnx_model.save_pretrained(onnx_path)
    tokenizer.save_pretrained(onnx_path)
```

Load the model and wrap it in a pipeline

```python
def load_model(model_name):
    model = AutoModelForSequenceClassification.from_pretrained(model_name)
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    text_classification_pipeline = pipeline("text-classification", model=model, tokenizer=tokenizer)
    # Sample input: "This is a simple demo, kept here as a reminder"
    print(text_classification_pipeline('这是一个简单的demo,用来防止忘记'))
    return text_classification_pipeline
```

Load the ONNX model and wrap it in a pipeline

```python
def load_onnx_model(onnx_path):
    lang_tokenizer = AutoTokenizer.from_pretrained(onnx_path)
    lang_model = ORTModelForSequenceClassification.from_pretrained(onnx_path)
    lang_detecter = pipeline("text-classification", model=lang_model, tokenizer=lang_tokenizer, truncation=True)
    # Sample input: "This is a simple demo, kept here as a reminder"
    print(lang_detecter('这是一个简单的demo,用来防止忘记'))
    return lang_detecter
```
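A `text-classification` pipeline call returns a list of dicts with `label` and `score` keys. A minimal sketch of consuming such a result; the label values here are illustrative and not the model's actual label set:

```python
# Example shape of a text-classification pipeline result
# (labels and scores below are made up for illustration).
result = [{"label": "zh", "score": 0.98}, {"label": "en", "score": 0.02}]

# Pick the highest-scoring label.
best = max(result, key=lambda item: item["score"])
print(best["label"])  # → zh
```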
