ValueError: Model type qwen2_moe not supported

#1
by defloration - opened

I have mlx_lm version 0.5.0 and mlx version 0.9.0.
Do I need to use any other version?

Same here:

ModuleNotFoundError: No module named 'mlx_lm.models.qwen2_moe'

MLX Community org

I'm still working on it.
I will update you guys when I'm done.

You can track the progress here: https://github.com/ml-explore/mlx-examples/pull/640

MLX Community org
• edited Apr 3

Done ✅

You can use it now; just update your mlx-lm and enjoy ;)

pip install -U mlx-lm
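
If you want to double-check that the update actually added the new model type (the missing mlx_lm.models.qwen2_moe module from the traceback above), a quick import check like this should do it:

python -c "import mlx_lm.models.qwen2_moe; print('qwen2_moe is supported')"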

Btw, don't use the base model with a chat template; it produces bad results. Instead, use the instruction-tuned chat model here:

https://huggingface.co/mlx-community/Qwen1.5-MoE-A2.7B-Chat-4bit
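
For reference, here is a minimal generation sketch with the Python API; it assumes the usual mlx_lm load/generate helpers and that the returned tokenizer exposes the Hugging Face apply_chat_template method:

from mlx_lm import load, generate

# Load the instruction-tuned 4-bit MoE checkpoint (downloaded from the Hub on first use)
model, tokenizer = load("mlx-community/Qwen1.5-MoE-A2.7B-Chat-4bit")

# Build the prompt through the chat template rather than passing raw text
messages = [{"role": "user", "content": "Give me a one-line summary of MLX."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
print(response)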
