---
language:
- en
license: other
library_name: transformers
tags:
- causal-lm
- code
- mlx
datasets:
- tiiuae/falcon-refinedweb
- bigcode/the-stack-github-issues
- bigcode/commitpackft
- bigcode/starcoderdata
- EleutherAI/proof-pile-2
- meta-math/MetaMathQA
metrics:
- code_eval
model-index:
- name: stable-code-3b
  results:
  - task:
      type: text-generation
    dataset:
      name: MultiPL-HumanEval (Python)
      type: nuprl/MultiPL-E
    metrics:
    - type: pass@1
      value: 32.4
      name: pass@1
      verified: false
    - type: pass@1
      value: 30.9
      name: pass@1
      verified: false
    - type: pass@1
      value: 32.1
      name: pass@1
      verified: false
    - type: pass@1
      value: 32.1
      name: pass@1
      verified: false
    - type: pass@1
      value: 24.2
      name: pass@1
      verified: false
    - type: pass@1
      value: 23.0
      name: pass@1
      verified: false
---

# mlx-community/stable-code-3b-4bit

This model was converted to MLX format from [`stabilityai/stable-code-3b`](https://huggingface.co/stabilityai/stable-code-3b).
Refer to the [original model card](https://huggingface.co/stabilityai/stable-code-3b) for more details on the model.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Download the 4-bit MLX weights from the Hugging Face Hub and load them.
model, tokenizer = load("mlx-community/stable-code-3b-4bit")

# Generate a completion for the given prompt; verbose=True streams the output.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
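Since stable-code-3b is trained primarily on code, a completion-style prompt is more representative than plain text. Below is a minimal sketch building on the snippet above; the prompt and the `max_tokens` limit are illustrative choices, not values from the original card:

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/stable-code-3b-4bit")

# Illustrative code-completion prompt; the model continues the function body.
prompt = "import math\n\ndef primes_up_to(n):\n"

# max_tokens caps the length of the generated continuation (chosen arbitrarily here).
response = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
print(response)
```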