---
language:
  - en
license: other
library_name: transformers
tags:
  - causal-lm
  - code
  - mlx
datasets:
  - tiiuae/falcon-refinedweb
  - bigcode/the-stack-github-issues
  - bigcode/commitpackft
  - bigcode/starcoderdata
  - EleutherAI/proof-pile-2
  - meta-math/MetaMathQA
metrics:
  - code_eval
model-index:
  - name: StarCoderBase-3B
    results:
      - task:
          type: text-generation
        dataset:
          name: MultiPL-HumanEval (Python)
          type: nuprl/MultiPL-E
        metrics:
          - type: pass@1
            value: 32.4
            name: pass@1
            verified: false
          - type: pass@1
            value: 30.9
            name: pass@1
            verified: false
          - type: pass@1
            value: 32.1
            name: pass@1
            verified: false
          - type: pass@1
            value: 32.1
            name: pass@1
            verified: false
          - type: pass@1
            value: 24.2
            name: pass@1
            verified: false
          - type: pass@1
            value: 23
            name: pass@1
            verified: false
---

# mlx-community/stable-code-3b-mlx

This model was converted to MLX format from [stabilityai/stable-code-3b](https://huggingface.co/stabilityai/stable-code-3b). Refer to the original model card for more details on the model.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/stable-code-3b-mlx")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
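The `"hello"` prompt above is only a smoke test. stable-code-3b is a code model, and the original model card documents fill-in-middle (FIM) completion using the special tokens `<fim_prefix>`, `<fim_suffix>`, and `<fim_middle>`. A minimal sketch of building such a prompt — the `fim_prompt` helper name is ours, and FIM behavior of this MLX conversion should be verified against the original card:

```python
def fim_prompt(prefix: str, suffix: str) -> str:
    """Build a fill-in-middle prompt: the model is asked to generate
    the code that belongs between `prefix` and `suffix`."""
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

prompt = fim_prompt(
    prefix="def fib(n):\n    ",
    suffix="\n    return fib(n - 1) + fib(n - 2)",
)
```

Pass `prompt` to the `generate` call shown above; the generated text is the middle span (here, presumably the base case of `fib`).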