Oobabooga: "Could not locate the configuration_mpt.py inside models\OccamRazor_mpt-7b-storywriter-4bit-128g"

#5
by ProPatte

I'm trying to use the model with oobabooga, but when I start the webui I get the error: "Could not locate the configuration_mpt.py inside models\OccamRazor_mpt-7b-storywriter-4bit-128g". Is it correct to use the configuration_mpt.py from the mosaicml/mpt-7b-storywriter model?

This comment has been hidden

Why is @bartman081523's comment hidden?

It seems you need to download the files that it says are missing from the mosaicml model. I needed to get: adapt_tokenizer, attention, blocks, configuration_mpt, hf_prefixlm_converter, meta_init_context, modeling_mpt, norm, and param_init_fns. After putting those .py files in the model folder (oobabooga_windows\text-generation-webui\models\OccamRazor_mpt-7b-storywriter-4bit-128g), it worked for me.
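If you'd rather script the download than grab each file by hand, here's a minimal sketch using the huggingface_hub client. The repo ID and file names are the ones listed above; the local folder path is just an example, so adjust it to your own install:

```python
from pathlib import Path

from huggingface_hub import hf_hub_download

# Custom remote-code modules that the MPT config/model classes import.
FILES = [
    "adapt_tokenizer.py", "attention.py", "blocks.py", "configuration_mpt.py",
    "hf_prefixlm_converter.py", "meta_init_context.py", "modeling_mpt.py",
    "norm.py", "param_init_fns.py",
]

# Example path; point this at your own webui models folder.
model_dir = Path("text-generation-webui/models/OccamRazor_mpt-7b-storywriter-4bit-128g")

for name in FILES:
    # Fetch each file from the original MPT repo into the local model folder.
    hf_hub_download(
        repo_id="mosaicml/mpt-7b-storywriter",
        filename=name,
        local_dir=model_dir,
    )
```

Since these modules define custom model classes, the webui still has to load the model with trust_remote_code enabled (as the loader call in the traceback below already does).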

@ProPatte

> Why is @bartman081523's comment hidden?

My post contained wrong info.

I was facing the same issue. I downloaded those files and more, but now I get this AttributeError:

```
2023-08-23 09:45:17 INFO:Loading OccamRazor_mpt-7b-storywriter-4bit-128g...
2023-08-23 09:45:17 ERROR:Failed to load the model.
Traceback (most recent call last):
  File "D:\AI_Stuff\oobabooga\oobabooga-windows\oobabooga-windows\text-generation-webui\server.py", line 69, in load_model_wrapper
    shared.model, shared.tokenizer = load_model(shared.model_name, loader)
  File "D:\AI_Stuff\oobabooga\oobabooga-windows\oobabooga-windows\text-generation-webui\modules\models.py", line 78, in load_model
    output = load_func_map[loader](model_name)
  File "D:\AI_Stuff\oobabooga\oobabooga-windows\oobabooga-windows\text-generation-webui\modules\models.py", line 148, in huggingface_loader
    model = LoaderClass.from_pretrained(Path(f"{shared.args.model_dir}/{model_name}"), low_cpu_mem_usage=True, torch_dtype=torch.bfloat16 if shared.args.bf16 else torch.float16, trust_remote_code=shared.args.trust_remote_code)
  File "D:\AI_Stuff\oobabooga\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 488, in from_pretrained
    return model_class.from_pretrained(
  File "D:\AI_Stuff\oobabooga\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 2629, in from_pretrained
    state_dict = load_state_dict(resolved_archive_file)
  File "D:\AI_Stuff\oobabooga\oobabooga-windows\oobabooga-windows\installer_files\env\lib\site-packages\transformers\modeling_utils.py", line 449, in load_state_dict
    if metadata.get("format") not in ["pt", "tf", "flax"]:
AttributeError: 'NoneType' object has no attribute 'get'
```

I got the same thing! Did you end up finding a solution? I really want to use this model!
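In case anyone lands here later: this AttributeError comes from transformers reading a .safetensors checkpoint whose metadata header is missing, so metadata is None and metadata.get("format") fails. One workaround that has worked for this error is to re-save the checkpoint with the metadata transformers expects. A hedged sketch, assuming a single .safetensors file in the model folder (the path below is hypothetical, so point it at your actual file):

```python
from safetensors.torch import load_file, save_file

# Hypothetical path; replace with the actual .safetensors file in your model folder.
path = "models/OccamRazor_mpt-7b-storywriter-4bit-128g/model.safetensors"

# Load the raw tensors, then write them back with the "format" metadata
# that transformers' load_state_dict() checks for.
state_dict = load_file(path)
save_file(state_dict, path, metadata={"format": "pt"})
```

Also worth noting: the 4bit-128g suffix suggests this is a GPTQ-quantized checkpoint, so it may need a GPTQ-capable loader in the webui rather than the plain Transformers loader the traceback shows.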
