Model doesn't produce useful output

#2
by depasquale - opened

When I try to run this model with the mlx_lm Python API like so:

from mlx_lm import load, generate
model_id = "mlx-community/Llama-3-8B-Instruct-262k-2bit"
user_message = "Name a color."
model, tokenizer = load(model_id)
prompt = f"<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n{user_message}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
response = generate(model, tokenizer, prompt=prompt, temp=0.5, max_tokens=50, verbose=True)

I get this output:

 a # abyssbyssbyss byss  a #byss with a pile pile #byss # t #byss of # of Col # # # in Cab  of # of of Cab of of of of of  of of of of Cab # of
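One thing worth checking is the prompt format: the Llama 3 instruct template expects an `<|eot_id|>` token at the end of each message before the next header begins. A minimal sketch of building such a prompt by hand (the helper name is illustrative, not part of mlx_lm):

```python
def build_llama3_prompt(user_message: str) -> str:
    # Llama 3 instruct format: each message is closed with <|eot_id|>;
    # the final assistant header is left open so the model completes it.
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user_message}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt("Name a color.")
```

Even with the corrected template, heavy 2-bit quantization alone can degrade output to this degree, so the prompt fix may not fully resolve it.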
