Quantization Error

#1
by ch4rL - opened

When I try to use the model via mlx-lm, I get the following error:

```
ValueError: [quantize] All dimensions should be divisible by 32 for now
```

MLX Community org

Update mlx to the latest version: `pip install -U mlx`
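For reference, a minimal sequence to upgrade and then confirm which version is installed (the version check assumes the `mlx` Python package is importable as `mlx.core`):

```shell
# Upgrade mlx and mlx-lm to their latest releases
pip install -U mlx mlx-lm

# Confirm the installed mlx version
python -c "import mlx.core; print(mlx.core.__version__)"
```

If the error persists after upgrading, re-check that the environment running mlx-lm is the same one that was upgraded.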

MLX Community org

@ch4rL have you been able to run this with latest mlx version?
