---
license: llama3
library_name: transformers
tags:
  - openchat
  - llama3
  - C-RLFT
  - mlx
base_model: meta-llama/Meta-Llama-3-8B
pipeline_tag: text-generation
---

# mlx-community/openchat-3.6-8b-20240522-4bit

This model was converted to MLX format from [`openchat/openchat-3.6-8b-20240522`](https://huggingface.co/openchat/openchat-3.6-8b-20240522) using mlx-lm version **0.12.1**. Refer to the [original model card](https://huggingface.co/openchat/openchat-3.6-8b-20240522) for more details on the model.
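
The conversion step itself is not shown on this card; as a rough sketch (the exact arguments used for this repository are an assumption), a 4-bit conversion with mlx-lm can be produced like this:

```python
# Sketch of a 4-bit MLX conversion of the upstream model; the output path is hypothetical
# and the quantization settings are mlx-lm defaults, not necessarily what this repo used.
from mlx_lm import convert

convert(
    "openchat/openchat-3.6-8b-20240522",       # upstream Hugging Face repository
    mlx_path="openchat-3.6-8b-20240522-4bit",  # local output directory (hypothetical name)
    quantize=True,                             # enables group quantization (4-bit by default)
)
```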

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/openchat-3.6-8b-20240522-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
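
The `prompt="hello"` call above sends raw text with no chat formatting. OpenChat 3.6 is a chat model, and the tokenizer loaded by mlx-lm usually carries its chat template; the snippet below is a sketch (not from the original card, and it assumes the template is bundled with the converted weights) of formatting a single user turn with that template before generating:

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/openchat-3.6-8b-20240522-4bit")

# Format one user message with the tokenizer's chat template (assumed present),
# then generate up to 256 new tokens.
messages = [{"role": "user", "content": "Explain 4-bit quantization in one paragraph."}]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)

response = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
```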