---
license: apache-2.0
library_name: transformers
tags:
- code
- mlx
datasets:
- codeparrot/github-code-clean
- bigcode/starcoderdata
- open-web-math/open-web-math
- math-ai/StackMathQA
metrics:
- code_eval
pipeline_tag: text-generation
inference: false
model-index:
- name: granite-3b-code-base
  results:
  - task:
      type: text-generation
    dataset:
      name: MBPP
      type: mbpp
    metrics:
    - type: pass@1
      value: 36.0
      name: pass@1
  - task:
      type: text-generation
    dataset:
      name: MBPP+
      type: evalplus/mbppplus
    metrics:
    - type: pass@1
      value: 45.1
      name: pass@1
  - task:
      type: text-generation
    dataset:
      name: HumanEvalSynthesis(Python)
      type: bigcode/humanevalpack
    metrics:
    - type: pass@1
      value: 36.6
      name: pass@1
    - type: pass@1
      value: 37.2
      name: pass@1
    - type: pass@1
      value: 40.9
      name: pass@1
    - type: pass@1
      value: 26.2
      name: pass@1
    - type: pass@1
      value: 35.4
      name: pass@1
    - type: pass@1
      value: 22.0
      name: pass@1
    - type: pass@1
      value: 25.0
      name: pass@1
    - type: pass@1
      value: 18.9
      name: pass@1
    - type: pass@1
      value: 29.9
      name: pass@1
    - type: pass@1
      value: 17.1
      name: pass@1
    - type: pass@1
      value: 26.8
      name: pass@1
    - type: pass@1
      value: 14.0
      name: pass@1
    - type: pass@1
      value: 18.3
      name: pass@1
    - type: pass@1
      value: 23.2
      name: pass@1
    - type: pass@1
      value: 29.9
      name: pass@1
    - type: pass@1
      value: 24.4
      name: pass@1
    - type: pass@1
      value: 16.5
      name: pass@1
    - type: pass@1
      value: 3.7
      name: pass@1
---

# mlx-community/granite-3b-code-base-4bit

The model [mlx-community/granite-3b-code-base-4bit](https://huggingface.co/mlx-community/granite-3b-code-base-4bit) was converted to MLX format from [ibm-granite/granite-3b-code-base](https://huggingface.co/ibm-granite/granite-3b-code-base) using mlx-lm version **0.12.0**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Load the 4-bit quantized model and its tokenizer from the Hugging Face Hub.
model, tokenizer = load("mlx-community/granite-3b-code-base-4bit")

# Generate a completion for the given prompt; verbose=True streams the output.
response = generate(model, tokenizer, prompt="hello", verbose=True)
```