Not able to execute

#2
by sintuk - opened
MLX Community org

I tried both of the ways mentioned on this page but was not able to execute the model; I'm getting the issues below. If anyone has faced or solved the same issues, please help me solve them too.

1. Using Terminal

python -m mlx_lm.generate --model mlx-community/dbrx-instruct-4bit --prompt "Hello" --trust-remote-code --max-tokens 500

Issue:
ModuleNotFoundError: No module named 'mlx_lm.models.dbrx'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/Users/sintu/mlx-examples/llms/mlx_lm/generate.py", line 148, in <module>
    main(args)
  File "/Users/sintu/mlx-examples/llms/mlx_lm/generate.py", line 112, in main
    model, tokenizer = load(
  File "/Users/sintu/mlx-examples/llms/mlx_lm/utils.py", line 380, in load
    model = load_model(model_path, lazy)
  File "/Users/sintu/mlx-examples/llms/mlx_lm/utils.py", line 315, in load_model
    model_class, model_args_class = _get_classes(config=config)
  File "/Users/sintu/mlx-examples/llms/mlx_lm/utils.py", line 58, in _get_classes
    raise ValueError(msg)
ValueError: Model type dbrx not supported.
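
For context: mlx_lm picks the model architecture by importing mlx_lm.models.<model_type>, where model_type comes from the model's config.json. Roughly like this (a simplified, paraphrased sketch of _get_classes in mlx_lm/utils.py, not the exact code):

import importlib

def _get_classes(config: dict):
    model_type = config["model_type"]  # "dbrx" for this model
    try:
        arch = importlib.import_module(f"mlx_lm.models.{model_type}")
    except ImportError:
        # The ImportError above is reported, then re-raised as ValueError
        raise ValueError(f"Model type {model_type} not supported.")
    return arch.Model, arch.ModelArgs

So the ValueError here just means the installed copy of mlx_lm has no models/dbrx.py file.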

2. Using code

from mlx_lm import load, generate

model, tokenizer = load(
    "mlx-community/dbrx-instruct-4bit",
    tokenizer_config={"trust_remote_code": True},
)

chat = [
    {"role": "user", "content": "What's the difference between PCA vs UMAP vs t-SNE?"},
    {"role": "assistant", "content": "The "},
]

prompt = tokenizer.apply_chat_template(chat, tokenize=False)

# We need to remove the last <|im_end|> token so that the AI continues generation
prompt = prompt[::-1].replace("<|im_end|>"[::-1], "", 1)[::-1]

response = generate(model, tokenizer, prompt=prompt, verbose=True, temp=0.6, max_tokens=1500)
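
Aside: the string-reversal line removes only the last occurrence of <|im_end|>, because str.replace with count=1 replaces the first match, and reversing the string first makes that match the final one. A quick illustration:

s = "a<|im_end|>b<|im_end|>"
print(s[::-1].replace("<|im_end|>"[::-1], "", 1)[::-1])  # prints: a<|im_end|>b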

Issue:

File ~/mlx-examples/llms/mlx_lm/utils.py:14
     11 from textwrap import dedent
     12 from typing import Any, Callable, Dict, Generator, List, Optional, Tuple, Union
---> 14 import mlx.core as mx
     15 import mlx.nn as nn
     16 from huggingface_hub import snapshot_download

ModuleNotFoundError: No module named 'mlx.core'

MLX Community org
edited Mar 30

Even after updating mlx_lm, I didn't find the dbrx file in my venv. Do you have the file? Otherwise, you may want to download it and integrate it yourself:
https://github.com/ml-explore/mlx-examples/blob/main/llms/mlx_lm/models/dbrx.py
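
If you'd rather do that from Python than by hand, something like this should work (a rough sketch; the URL is the raw-file form of the link above, and the destination assumes mlx_lm is importable in your active environment):

import pathlib
import urllib.request

import mlx_lm.models

# Download dbrx.py straight into the installed package's models folder
dest = pathlib.Path(mlx_lm.models.__path__[0]) / "dbrx.py"
urllib.request.urlretrieve(
    "https://raw.githubusercontent.com/ml-explore/mlx-examples/main/llms/mlx_lm/models/dbrx.py",
    dest,
)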

MLX Community org

Thanks so much @BenBenser. Yeah, I also didn't have the dbrx.py file even after updating the library. I put the file in the models folder, and now I'm getting a second issue:
File "/Users/sintu/mlx-examples/llms/mlx_lm/utils.py", line 14, in
import mlx.core as mx
ModuleNotFoundError: No module named 'mlx' **

I checked the mlx library as well for core.py or any related file, but I was not able to find it. Do you have any info regarding this?

MLX Community org

Hmm, core gets imported by the package's init; you won't find a standalone core.py file. But do you actually have mlx installed as a package, or just mlx_lm? Because you need both packages.

You may also try leaving out the trust_remote_code argument here:

model, tokenizer = load("mlx-community/dbrx-instruct-4bit", tokenizer_config={"trust_remote_code": True})

so actually:

model, tokenizer = load("mlx-community/dbrx-instruct-4bit")

But this shouldn't be the cause of the missing mlx.
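
A quick sanity check for the missing package, run in the same venv you use for mlx_lm (assuming mlx was installed with pip install mlx):

import mlx.core as mx

# Fails with ModuleNotFoundError if the core mlx package is missing;
# otherwise prints array([2, 4, 6], dtype=float32)
print(mx.array([1.0, 2.0, 3.0]) * 2)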

MLX Community org
edited Mar 31

I also have the same issue.

MLX Community org

The latest version doesn't include DBRX (yet); that's why the instructions I added were to install from source. Did you guys install from source?

Ideally first remove the old version:

pip uninstall mlx-lm

then

git clone git@github.com:ml-explore/mlx-examples.git
cd mlx-examples/llms/
python setup.py build
python setup.py install

then you can do

python -m mlx_lm.generate --model mlx-community/dbrx-instruct-4bit --prompt "What's the difference between PCA vs UMAP vs t-SNE?" --trust-remote-code --use-default-chat-template  --max-tokens 1000
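
To confirm the source install picked up the new architecture, this should run without an ImportError:

python -c "import mlx_lm.models.dbrx"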

@eek is correct. Interestingly, after the successful run from that folder, I tried pip install . from within the mlx-examples/llms/ folder. For the second example (importing the libraries and creating generations within another Python script), just don't repeat my mistake of cloning this repo and running the test from within the repo: I ran into a tiktoken conflict because this repo has a file called tiktoken.py. Other than that, everything works, and it's quite exciting to run this model locally, albeit having to close just about everything else, haha.
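
If you hit that kind of shadowing problem, a quick way to see which copy of a package Python is actually importing:

import mlx_lm

# A path inside your mlx-examples checkout (rather than site-packages)
# means local files are shadowing the installed package
print(mlx_lm.__file__)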

MLX Community org

Thanks so much @eek, it worked.

eek changed discussion status to closed
