Error when loading the Model in Oobabooga.

#2
by Xeno1X - opened

When I tried to run it with Oobabooga:

Traceback (most recent call last):
File "C:\Users***\Downloads\AI\oobabooga_windows\installer_files\env\lib\site-packages\transformers\models\auto\configuration_auto.py", line 919, in from_pretrained
raise ValueError(
ValueError: Loading models\TehVenom_MPT-7b-WizardLM_Uncensored-Storywriter-Merge requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.

Check with ooba how to load MPT models, since they require support for custom code (trust_remote_code).
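Outside of Oobabooga, the same error shows up when loading the checkpoint directly with transformers; a minimal sketch of the fix (the local path is just an example, adjust it to wherever the model is stored):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Example path -- Oobabooga keeps downloaded models under its models/ folder.
MODEL_PATH = "models/TehVenom_MPT-7b-WizardLM_Uncensored-Storywriter-Merge"

def load_mpt(path=MODEL_PATH):
    # trust_remote_code=True tells transformers to execute the custom
    # model/config code shipped inside the repo (MPT is not a built-in
    # architecture). That code runs arbitrary Python on your machine,
    # so read it before enabling the flag.
    tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(path, trust_remote_code=True)
    return tokenizer, model
```

In Oobabooga itself, the equivalent is starting the web UI with its `--trust-remote-code` flag (or ticking the corresponding option in the model loader).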

TehVenom changed discussion status to closed

Xeno, what did you find?
