
Error when loading model using MLX

#1
by maorsg1 - opened

I'm getting this error:

"Could not locate the configuration_stablelm_epoch.py inside mlx-community/stable-code-3b-mlx."

Code:
from mlx_lm import load, generate
from transformers import pipeline, AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("mlx-community/stable-code-3b-mlx", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "mlx-community/stable-code-3b-mlx",
    trust_remote_code=True,
)

generation_pipeline = pipeline("text-generation", model=model, tokenizer=tokenizer, trust_remote_code=True)
result = generation_pipeline(full_prompt, max_new_tokens=max_tokens)
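For reference, here is a minimal sketch of loading this repo with mlx-lm directly instead of transformers, which sidesteps the remote-code lookup that the error message points to (transformers could not find configuration_stablelm_epoch.py in the MLX conversion repo). This assumes the mlx-lm package is installed; the prompt and token count below are placeholders standing in for the full_prompt and max_tokens values defined elsewhere in your script:

from mlx_lm import load, generate

# Load the MLX-converted weights and tokenizer straight from the Hub repo.
model, tokenizer = load("mlx-community/stable-code-3b-mlx")

# Placeholder prompt; substitute your own full_prompt / max_tokens values.
full_prompt = "def fibonacci(n):"
result = generate(model, tokenizer, prompt=full_prompt, max_tokens=256)
print(result)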

