how did you make mlx_lm support glm-4?

by oldhu - opened

When I try to convert using mlx_lm.convert, it says chatglm is not supported.

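Something along these lines, with placeholder paths (the exact checkpoint and output directory are assumptions, not taken from this thread):

```python
# Placeholder reproduction: the model and output paths are assumptions.
# convert() only knows architectures that have a module under mlx_lm/models/,
# so an unknown model_type fails with something like
# "Model type chatglm not supported."
from mlx_lm import convert

convert(
    hf_path="THUDM/glm-4-9b-chat",   # assumed source checkpoint
    mlx_path="glm-4-9b-chat-mlx",    # assumed output directory
    quantize=True,
)
```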

I rewrote the chatglm model in mlx (chatglm.py), which makes the parameter conversion possible. However, there are still some problems with inference that I am debugging.

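In case it helps anyone attempting the same, below is a minimal sketch of the layout mlx_lm conventionally expects from a model module: a ModelArgs dataclass built from config.json and a Model class whose parameter tree mirrors the checkpoint. This is not the chatglm.py referenced above; the block structure, names, and sizes are illustrative placeholders, and attention and KV caching are omitted.

```python
# Minimal sketch only (not the chatglm.py from this thread). Layer names and
# sizes are illustrative assumptions, not GLM-4's real configuration.
from dataclasses import dataclass

import mlx.core as mx
import mlx.nn as nn


@dataclass
class ModelArgs:
    model_type: str = "chatglm"
    vocab_size: int = 65024
    hidden_size: int = 4096
    num_layers: int = 2              # kept tiny for the sketch
    ffn_hidden_size: int = 13696
    layernorm_epsilon: float = 1e-5


class Block(nn.Module):
    # Stand-in block: norm + gated MLP only. A real GLM block also needs
    # multi-query attention with partial rotary embeddings.
    def __init__(self, args: ModelArgs):
        super().__init__()
        self.norm = nn.RMSNorm(args.hidden_size, eps=args.layernorm_epsilon)
        self.up = nn.Linear(args.hidden_size, args.ffn_hidden_size, bias=False)
        self.down = nn.Linear(args.ffn_hidden_size, args.hidden_size, bias=False)

    def __call__(self, x: mx.array) -> mx.array:
        return x + self.down(nn.silu(self.up(self.norm(x))))


class Model(nn.Module):
    def __init__(self, args: ModelArgs):
        super().__init__()
        self.args = args
        self.embedding = nn.Embedding(args.vocab_size, args.hidden_size)
        self.blocks = [Block(args) for _ in range(args.num_layers)]
        self.final_norm = nn.RMSNorm(args.hidden_size, eps=args.layernorm_epsilon)
        self.output = nn.Linear(args.hidden_size, args.vocab_size, bias=False)

    def __call__(self, tokens: mx.array, cache=None) -> mx.array:
        # KV cache handling is omitted in this sketch.
        x = self.embedding(tokens)
        for block in self.blocks:
            x = block(x)
        return self.output(self.final_norm(x))

    @property
    def layers(self):
        return self.blocks
```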