---
language:
- en
license: apache-2.0
tags:
- llava
- multimodal
- qwen
- mlx
---

# mlx-community/nanoLLaVA-4bit

This model was converted to MLX format from [`qnguyen3/nanoLLaVA`](https://huggingface.co/qnguyen3/nanoLLaVA) using mlx-vlm version **0.0.3**.

Refer to the [original model card](https://huggingface.co/qnguyen3/nanoLLaVA) for more details on the model.

## Use with mlx

```bash
pip install -U mlx-vlm
```

```bash
python -m mlx_vlm.generate --model mlx-community/nanoLLaVA-4bit --max-tokens 100 --temp 0.0
```