---
language:
  - en
license: other
library_name: transformers
tags:
  - causal-lm
  - code
  - mlx
datasets:
  - tiiuae/falcon-refinedweb
  - bigcode/the-stack-github-issues
  - bigcode/commitpackft
  - bigcode/starcoderdata
  - EleutherAI/proof-pile-2
  - meta-math/MetaMathQA
metrics:
  - code_eval
model-index:
  - name: StarCoderBase-3B
    results:
      - task:
          type: text-generation
        dataset:
          name: MultiPL-HumanEval (Python)
          type: nuprl/MultiPL-E
        metrics:
          - type: pass@1
            value: 32.4
            name: pass@1
            verified: false
          - type: pass@1
            value: 30.9
            name: pass@1
            verified: false
          - type: pass@1
            value: 32.1
            name: pass@1
            verified: false
          - type: pass@1
            value: 32.1
            name: pass@1
            verified: false
          - type: pass@1
            value: 24.2
            name: pass@1
            verified: false
          - type: pass@1
            value: 23
            name: pass@1
            verified: false
---

# mlx-community/stable-code-3b-4bit

This model was converted to MLX format from [stabilityai/stable-code-3b](https://huggingface.co/stabilityai/stable-code-3b). Refer to the original model card for more details on the model.
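A conversion like this can be reproduced with the `mlx_lm.convert` utility. The exact flags used for this repo are not recorded here, so the command below is a sketch: `-q` enables 4-bit quantization (the default), and `--upload-repo` is optional and assumes you have write access to the target repo.

```shell
# Install the MLX LM toolkit (Apple Silicon required)
pip install mlx-lm

# Quantize stabilityai/stable-code-3b to 4-bit MLX format;
# omit --upload-repo to keep the converted weights local
python -m mlx_lm.convert \
  --hf-path stabilityai/stable-code-3b \
  -q \
  --upload-repo mlx-community/stable-code-3b-4bit
```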

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Download (on first use) and load the 4-bit model and its tokenizer
model, tokenizer = load("mlx-community/stable-code-3b-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
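For quick experiments without writing Python, `mlx-lm` also ships a command-line generator. The prompt and token budget below are illustrative placeholders, not values from this model card.

```shell
# One-off generation from the command line; downloads the model on first run
python -m mlx_lm.generate \
  --model mlx-community/stable-code-3b-4bit \
  --prompt "def fibonacci(n):" \
  --max-tokens 100
```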