---
language:
- code
license: other
tags:
- code
- mlx
inference: false
license_name: mnpl
license_link: https://mistral.ai/licences/MNPL-0.1.md
---

# mlx-community/Codestral-22B-v0.1-4bit

The Model [mlx-community/Codestral-22B-v0.1-4bit](https://huggingface.co/mlx-community/Codestral-22B-v0.1-4bit) was converted to MLX format from [bullerwins/Codestral-22B-v0.1-hf](https://huggingface.co/bullerwins/Codestral-22B-v0.1-hf) using mlx-lm version **0.14.0**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Codestral-22B-v0.1-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
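Codestral-22B-v0.1 is an instruct-tuned Mistral model, so instruction prompts generally follow the Mistral `[INST] ... [/INST]` format. When the model is loaded, `tokenizer.apply_chat_template` is the canonical way to build such prompts; the sketch below (with a hypothetical helper name, `format_mistral_prompt`) only illustrates the string shape for a single-turn prompt, without requiring the model download.

```python
def format_mistral_prompt(user_message: str) -> str:
    # Illustrative sketch of the Mistral instruct format used by Codestral.
    # In practice, prefer tokenizer.apply_chat_template(messages, ...) after
    # load(), which applies the model's own chat template (and handles the
    # BOS token during tokenization).
    return f"<s>[INST] {user_message} [/INST]"

prompt = format_mistral_prompt("Write a Python function that reverses a string.")
# The formatted prompt can then be passed to generate(model, tokenizer, prompt=prompt).
```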