mlx-vlm 0.0.6 not yet released

#1
by cmgzy - opened

mlx-vlm 0.0.6 has not been released yet, and 0.0.5 does not support model type paligemma.
Is there a git version ready to use?

MLX Community org

Yes, you can try it using this branch:
https://github.com/Blaizzy/mlx-vlm/tree/pc/quantise-irregular

WIP PR:
https://github.com/Blaizzy/mlx-vlm/pull/24

It works, but there are a few bugs I'm still ironing out :)
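If you want to try it before the release, pip can install straight from that branch (standard pip VCS syntax, assuming git is available):

pip install git+https://github.com/Blaizzy/mlx-vlm.git@pc/quantise-irregular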

Still doesn't work with mlx-vlm 0.0.6, mlx-lm 0.14.0, and mlx 0.14.0:

python -m mlx_vlm.generate --model mlx-community/paligemma-3b-mix-224-8bit --max-tokens 100 --temp 0.0 --image ~/Pictures/yy.png --prompt "What's my current speed?"
Fetching 9 files: 100%|████████████████████████| 9/9 [00:00<00:00, 37190.87it/s]

No chat template is defined for this tokenizer - using a default chat template that implements the ChatML format (without BOS/EOS tokens!). If the default is not appropriate for your model, please set `tokenizer.chat_template` to an appropriate template. See https://huggingface.co/docs/transformers/main/chat_templating for more information.

Traceback (most recent call last):
  File "/Users/chenmi/miniforge3/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Users/chenmi/miniforge3/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/Users/chenmi/miniforge3/lib/python3.10/site-packages/mlx_vlm/generate.py", line 108, in <module>
    main()
  File "/Users/chenmi/miniforge3/lib/python3.10/site-packages/mlx_vlm/generate.py", line 74, in main
    prompt = processor.apply_chat_template(
  File "/Users/chenmi/miniforge3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1791, in apply_chat_template
    rendered_chat = compiled_template.render(
  File "/Users/chenmi/miniforge3/lib/python3.10/site-packages/jinja2/environment.py", line 1304, in render
    self.environment.handle_exception()
  File "/Users/chenmi/miniforge3/lib/python3.10/site-packages/jinja2/environment.py", line 939, in handle_exception
    raise rewrite_traceback_stack(source=source)
  File "<template>", line 2, in top-level template code
jinja2.exceptions.UndefinedError: 'str object' has no attribute 'role'

A similar error occurs with mlx-community/paligemma-3b-mix-448-8bit.
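For context, apply_chat_template renders a Jinja template that iterates over a list of message dicts and reads each message's role, so passing the prompt through as a bare string produces exactly this UndefinedError. A minimal sketch of the failure mode (a generic ChatML-style template for illustration, not the mlx-vlm code path):

from jinja2 import Environment, StrictUndefined

# Generic ChatML-style template, similar to the default fallback mentioned
# in the warning above; the real template lives in transformers.
template = Environment(undefined=StrictUndefined).from_string(
    "{% for message in messages %}"
    "<|im_start|>{{ message['role'] }}\n{{ message['content'] }}<|im_end|>\n"
    "{% endfor %}"
)

# Fails like the traceback: iterating a str yields characters, which have
# no 'role', so Jinja raises UndefinedError.
# template.render(messages="What's my current speed?")

# Works: messages as a list of {"role": ..., "content": ...} dicts.
print(template.render(messages=[{"role": "user", "content": "What's my current speed?"}]))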

MLX Community org

It’s released:

pip install -U mlx-vlm

MLX Community org

And it works fine.

Please double-check your local environment.
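One quick way to confirm which version the active interpreter actually picks up (standard library only); it should report at least 0.0.6:

python -c "from importlib.metadata import version; print(version('mlx-vlm'))"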

prince-canuma changed discussion status to closed
