Error: Exception during initialization

#1 opened by Noahloghman

Hi @Xenova
I get this error when I run this code: Error: Exception during initialization: D:\a_work\1\s\onnxruntime\core\optimizer\initializer.cc:31 onnxruntime::Initializer::Initializer !model_path.IsEmpty() was false. model_path must not be empty. Ensure that a path is provided when the model is created or loaded.
Did you add or modify some lines inside the .onnx file to make it work?

Am I missing something?

Hi there! Are you using this from transformers.js? Could you provide the piece of code which resulted in this issue?

Yes, I'm using the transformers.js library.

When I call your repo_id, everything works fine:
import { pipeline } from '@xenova/transformers';
const pipe = await pipeline('text-generation', 'Xenova/gpt2-large-conversational');
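
For completeness, running that pipe would look something like this (the prompt and options are just placeholders):

// Generate text with the loaded pipeline.
const output = await pipe('Hello, how are you?', { max_new_tokens: 30 });
console.log(output); // [{ generated_text: '...' }]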

But when I call the same files from your repo id (all the .onnx and .onnx_data files inside the onnx folder, and the rest outside it), I get the error above:
import { pipeline } from '@xenova/transformers';
const pipe = await pipeline('text-generation', '<my-repo-id>');

I added the transformers.js library and granted it access.
I ran these tests because I'm training my own model and hitting the same error; that's why I tried with yours, to understand what I'm missing.
Maybe this is what I'm missing: in Python you can call ORTModelForCausalLM from optimum.onnxruntime inside a pipeline; does an equivalent exist in transformers.js?
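
For reference, the closest equivalent in transformers.js appears to be AutoModelForCausalLM, which loads and runs the ONNX weights much like optimum's ORTModelForCausalLM does in Python. A minimal sketch, assuming the public Xenova/gpt2-large-conversational repo (the prompt is a placeholder):

import { AutoModelForCausalLM, AutoTokenizer } from '@xenova/transformers';

// Rough counterpart of Python's ORTModelForCausalLM.from_pretrained(...)
const tokenizer = await AutoTokenizer.from_pretrained('Xenova/gpt2-large-conversational');
const model = await AutoModelForCausalLM.from_pretrained('Xenova/gpt2-large-conversational');

// pipeline('text-generation', ...) wires up this same pair internally.
const { input_ids } = await tokenizer('Hello, how are you?'); // placeholder prompt
const output = await model.generate(input_ids, { max_new_tokens: 20 });
console.log(tokenizer.decode(output[0], { skip_special_tokens: true }));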

Does your repo id have more privileges than mine? I uploaded the same files into my new repo_id for testing and got the same error.

Since your repo is private, are you correctly passing environment variables to allow access to the model? See this guide for more info: https://huggingface.co/docs/transformers.js/guides/private
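
A minimal sketch of passing the token in Node, assuming it is read from process.env.HF_ACCESS_TOKEN as described in that guide (the token value and repo id are placeholders):

// Set the access token before the library is imported so it is picked up.
process.env.HF_ACCESS_TOKEN = 'hf_...'; // placeholder: your read-access token

const { pipeline } = await import('@xenova/transformers');
const pipe = await pipeline('text-generation', '<my-private-repo-id>');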

Yes, I tried that and passed the environment variable correctly (process.env.HF_ACCESS_TOKEN = 'hf_...'); I even made the repo public, but I still get the same error message.

Thank you

Otherwise, with transformers.js, can we load our model from the local machine, or only from Hugging Face?
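
From the transformers.js docs, models can also be loaded from the local file system via the env settings; a minimal sketch, assuming the model files live under ./models/<model-id>/ (paths and the folder name are placeholders):

import { env, pipeline } from '@xenova/transformers';

// Look for models on disk instead of fetching them from the Hugging Face Hub.
env.localModelPath = './models/';  // expects ./models/my-local-gpt2/... (placeholder layout)
env.allowRemoteModels = false;     // error out rather than fall back to the Hub

const pipe = await pipeline('text-generation', 'my-local-gpt2'); // placeholder folder name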

Noahloghman changed discussion status to closed
