---
language:
- code
license: llama2
tags:
- llama-2
- mlx
pipeline_tag: text-generation
widget:
- text: |
    <s>Source: system
    You are a helpful and honest code assistant <step> Source: user
    Print a hello world in Python <step> Source: assistant
    Destination: user
inference:
  parameters:
    max_new_tokens: 200
    stop:
    - </s>
    - <step>
---
# DamienDrash/CodeLlama-70B-Instruct
This model was converted to MLX format from `codellama/CodeLlama-70b-Instruct-hf`.
Refer to the original model card for more details on the model.
## Use with mlx
```bash
pip install mlx
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms/hf_llm
python generate.py --model DamienDrash/CodeLlama-70B-Instruct --prompt "My name is"
```
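The `--prompt` flag takes a raw string, so for instruct-style use you need to assemble the `<step>`-delimited format shown in the widget above yourself. Below is a minimal sketch of such a helper; `build_prompt` is a hypothetical function, and its whitespace mirrors the widget example rather than the tokenizer's official chat template, which is the authoritative source for the exact format:

```python
def build_prompt(system: str, user: str) -> str:
    """Assemble a CodeLlama-70B-Instruct-style prompt.

    Follows the <step>-delimited layout from the widget example;
    the official chat template may differ in whitespace details.
    """
    turns = [("system", system), ("user", user)]
    # Each turn: "Source: <role>\n<content>", turns joined by " <step> ".
    parts = [f"Source: {role}\n{content}" for role, content in turns]
    # Trailing "Source: assistant\nDestination: user" cues the model to reply.
    return "<s>" + " <step> ".join(parts) + " <step> Source: assistant\nDestination: user\n"


prompt = build_prompt(
    "You are a helpful and honest code assistant",
    "Print a hello world in Python",
)
```

The resulting string can then be passed to `generate.py` via `--prompt`.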