Charmap issue

#1
opened by VertexMachine

Not sure if this is a problem with the model or with how I'm using it, but when I select it in oobabooga I get:

Traceback (most recent call last):
  File "F:\LLMs\oobabooga\installer_files\env\Lib\site-packages\gradio\queueing.py", line 407, in call_prediction
    output = await route_utils.call_process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\installer_files\env\Lib\site-packages\gradio\route_utils.py", line 226, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1550, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\installer_files\env\Lib\site-packages\gradio\blocks.py", line 1185, in call_function
    prediction = await anyio.to_thread.run_sync(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\installer_files\env\Lib\site-packages\anyio\to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\installer_files\env\Lib\site-packages\anyio\_backends\_asyncio.py", line 807, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\installer_files\env\Lib\site-packages\gradio\utils.py", line 661, in wrapper
    response = f(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\modules\models_settings.py", line 216, in apply_model_settings_to_state
    model_settings = get_model_metadata(model)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\modules\models_settings.py", line 105, in get_model_metadata
    metadata = json.loads(open(path, 'r').read())
                          ^^^^^^^^^^^^^^^^^^^^^^
  File "F:\LLMs\oobabooga\installer_files\env\Lib\encodings\cp1252.py", line 23, in decode
    return codecs.charmap_decode(input,self.errors,decoding_table)[0]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
UnicodeDecodeError: 'charmap' codec can't decode byte 0x81 in position 125: character maps to <undefined>
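
For context: Python's open() without an explicit encoding falls back to the locale codec, which is cp1252 on many Windows installs, and 0x81 is one of the bytes cp1252 leaves undefined. A minimal sketch of the same failure, using made-up JSON content:

# UTF-8 for 'Ё' (U+0401) is b'\xd0\x81'; 0x81 has no mapping in cp1252,
# the same byte the traceback above complains about.
data = '{"chat_template": "Ё"}'.encode('utf-8')

print(data.decode('utf-8'))   # decodes fine as UTF-8

try:
    data.decode('cp1252')     # what a cp1252-locale open() effectively does
except UnicodeDecodeError as e:
    print(e)                  # 'charmap' codec can't decode byte 0x81 ...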

Model works fine for me under ooba. Please make sure your ooba and exllamav2 loader are up-to-date.

I updated earlier today (with update_windows.bat), but I just checked again now:

(F:\LLMs\oobabooga\installer_files\env) F:\LLMs\oobabooga>pip list | grep exll
exllama                   0.0.18+cu121
exllamav2                 0.0.10+cu121

and

(F:\LLMs\oobabooga\installer_files\env) F:\LLMs\oobabooga>git pull
Already up to date.

The stack trace seems to indicate an issue reading the model's config. I don't know whether it's ooba's config.yaml or the model's config JSON that it's having trouble with. You can put a print(path) statement in modules/models_settings.py just before line 105 to see which file it isn't happy with.
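
Something like this, right above the failing line (a sketch; the surrounding code is elided):

print(path)  # temporary debug: shows which file fails to decode
metadata = json.loads(open(path, 'r').read())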

It was complaining about models\LoneStriker_deepseek-coder-33b-instruct-4.65bpw-h6-exl2\tokenizer_config.json

...

It's a bug in ooba.

After I added encoding='utf-8' to

metadata = json.loads(open(path, 'r', encoding='utf-8').read())

it works.
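
Side note: an equivalent and slightly more idiomatic form of the same fix uses a context manager, so the file handle is closed deterministically:

with open(path, 'r', encoding='utf-8') as f:
    metadata = json.load(f)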

Thanks for your help!

VertexMachine changed discussion status to closed

Which ooba file did you add this to: "metadata = json.loads(open(path, 'r', encoding='utf-8').read())"?

Thank you :)
