Error converting gemma3n

#21
by r4yd3n - opened

Hi there!
New to this community 🤗
I was trying to use the convert-to-onnx Space (https://huggingface.co/spaces/onnx-community/convert-to-onnx) to convert google/gemma-3n-E4B-it-litert-preview.

It isn't authorized without an access token since it's a gated model, so I provided one after obtaining access to the repo.
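(In case it helps, the token itself can be sanity-checked outside the Space with a quick huggingface_hub call. A minimal sketch, assuming huggingface_hub is installed and "hf_..." stands for a real access token:

from huggingface_hub import model_info

# Raises an HTTP error (401/403) if the token does not grant access to the gated repo.
info = model_info("google/gemma-3n-E4B-it-litert-preview", token="hf_...")
print(info.id)
)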


With the token, the converter gets past the 403 error but then fails with the traceback below, which is kind of weird. Any thoughts on what may be happening here would be much appreciated!


Conversion failed: Traceback (most recent call last):
  File "/usr/local/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/user/app/transformers.js/scripts/convert.py", line 456, in <module>
    main()
  File "/home/user/app/transformers.js/scripts/convert.py", line 242, in main
    raise e
  File "/home/user/app/transformers.js/scripts/convert.py", line 235, in main
    tokenizer = AutoTokenizer.from_pretrained(tokenizer_id, **from_pretrained_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 963, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2036, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'google/gemma-3n-E4B-it-litert-preview'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'google/gemma-3n-E4B-it-litert-preview' is the correct path to a directory containing all relevant files for a GemmaTokenizerFast tokenizer.
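If it's useful for debugging, the failing call can be reproduced outside the Space. A minimal sketch, assuming transformers and huggingface_hub are installed and "hf_..." stands for a real access token:

from huggingface_hub import list_repo_files
from transformers import AutoTokenizer

repo_id = "google/gemma-3n-E4B-it-litert-preview"
token = "hf_..."  # access token with permission for the gated repo

# List what the repo actually contains. If there is no tokenizer.json /
# tokenizer_config.json (e.g. if the litert-preview repo only ships LiteRT
# artifacts), AutoTokenizer has nothing to load.
print(list_repo_files(repo_id, token=token))

# Mirrors the call that fails at convert.py line 235; hitting the same OSError
# here would suggest the repo is missing Transformers-format tokenizer files
# rather than the converter itself being at fault.
tokenizer = AutoTokenizer.from_pretrained(repo_id, token=token)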
