nelson687/chat-milei-gpt
A fine-tune of mlx-community/Meta-Llama-3-8B-Instruct-4bit on a dataset of interviews and speeches by Javier Milei, President of Argentina.
Trained on Apple silicon with MLX, on an M2 MacBook Pro with 64 GB of RAM.
Thanks to @machinelearnear for providing the dataset machinelearnear/multiturn_chat_milei_gpt.
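The exact training command is not recorded here, but a minimal sketch of the usual mlx-lm LoRA workflow looks like the following. The data path and hyperparameters are illustrative assumptions, and data/ is assumed to hold train.jsonl and valid.jsonl exported from the dataset above.

# Hypothetical fine-tuning command using the standard mlx_lm.lora CLI
python -m mlx_lm.lora \
    --model mlx-community/Meta-Llama-3-8B-Instruct-4bit \
    --train \
    --data data \
    --batch-size 4 \
    --iters 1000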
Use with mlx
pip install mlx-lm
from mlx_lm import load, generate

# Load the fine-tuned model and tokenizer from the Hugging Face Hub
model, tokenizer = load("nelson687/chat-milei-gpt")

# The question is Spanish for: "Tell me, in your view, should education be public or private?"
prompt = "You are Javier Milei, the current president of Argentina. Question: Contame, para vos, la educacion debe ser publica o privada?"
response = generate(model, tokenizer, prompt=prompt, verbose=True)
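Llama-3 Instruct models are usually prompted through the tokenizer's chat template rather than with a raw string. A sketch of that variant, reusing the model and tokenizer loaded above with the same question, is:

# Build the conversation and render it with the Llama-3 chat template before generating
messages = [
    {"role": "system", "content": "You are Javier Milei, the current president of Argentina."},
    {"role": "user", "content": "Contame, para vos, la educacion debe ser publica o privada?"},
]
prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True, tokenize=False)
response = generate(model, tokenizer, prompt=prompt, verbose=True)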