---
title: README
emoji: 📚
colorFrom: green
colorTo: indigo
sdk: static
pinned: false
---

# MLX Community

A community organization for model weights compatible with [mlx-examples](https://github.com/ml-explore/mlx-examples), powered by [MLX](https://github.com/ml-explore/mlx). The weights are pre-converted and ready to use with the example scripts.

# Quick start

Clone the MLX examples repo:

```
git clone git@github.com:ml-explore/mlx-examples.git
```

Install the requirements:

```
cd mlx-examples/hf_llm
pip install -r requirements.txt
```

Generate text:

```
python generate.py --hf-path mistralai/Mistral-7B-v0.1 --prompt "hello"
```

To convert and upload a new model (for example, a 4-bit quantized Mistral-7B), run:

```
python convert.py --hf-path mistralai/Mistral-7B-v0.1 -q --upload-name mistral-v0.1-4bit
```
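As a rough sketch of where the upload step publishes weights: models converted for this org appear on the Hugging Face Hub under the `mlx-community` namespace, so the name passed to `--upload-name` becomes the repo name. The helper below is hypothetical (it is not part of `convert.py`); it only illustrates the resulting repo id:

```python
# Hypothetical helper (assumption, not part of convert.py): converted weights
# uploaded for this org live under the mlx-community namespace on the Hub.
def community_repo_id(upload_name: str, org: str = "mlx-community") -> str:
    """Return the Hub repo id where the converted weights would be published."""
    return f"{org}/{upload_name}"

# For the convert.py invocation above:
print(community_repo_id("mistral-v0.1-4bit"))  # mlx-community/mistral-v0.1-4bit
```

The resulting id is what you would later pass as `--hf-path` to `generate.py` to run the quantized model.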