---
language:
- fr
- it
- de
- es
- en
license: apache-2.0
tags:
- mlx
inference:
  parameters:
    temperature: 0.5
widget:
- messages:
  - role: user
    content: What is your favorite condiment?
---

# mlx-community/Mixtral-8x7B-Instruct-v0.1

The model [mlx-community/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mlx-community/Mixtral-8x7B-Instruct-v0.1) was converted to MLX format from [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) using mlx-lm version **0.12.0**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Load the converted weights and tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/Mixtral-8x7B-Instruct-v0.1")

response = generate(model, tokenizer, prompt="hello", verbose=True)
```
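
Since this is an instruction-tuned model, it generally responds better when the prompt is wrapped in Mixtral's `[INST] ... [/INST]` chat format rather than passed as raw text. A minimal sketch, assuming the tokenizer returned by `load` forwards the Hugging Face `apply_chat_template` method (mlx-lm's tokenizer wrapper does this for models that ship a chat template):

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mixtral-8x7B-Instruct-v0.1")

# Format the conversation with the model's built-in chat template,
# which wraps user turns in [INST] ... [/INST] markers for Mixtral.
messages = [{"role": "user", "content": "What is your favorite condiment?"}]
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,  # return the formatted prompt as a string
    add_generation_prompt=True,
)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```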