ValueError in KoboldAI when loading the model

#66
by JermemyHaschal - opened

"ValueError: Loading F:#Ablage\falcon-40b-instruct requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error."

Does anyone know where I can set "trust_remote_code=True", and which configuration file I'm supposed to execute?
I already tried running configuration_RW.py directly with "python configuration_RW.py", but nothing really happened.

It's a parameter that you pass in when setting up the transformers pipeline (or when calling from_pretrained); you don't execute the configuration file yourself — transformers runs it for you once you opt in with that flag.
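A minimal sketch of what that looks like in plain transformers code (the model path and function name here are just examples — point `model_path` at your local falcon-40b-instruct folder or the hub ID):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

def load_falcon(model_path="tiiuae/falcon-40b-instruct"):
    # trust_remote_code=True tells transformers to execute the custom model
    # code shipped in the repo (e.g. configuration_RW.py / modelling_RW.py).
    # Only set it after you've read that code yourself.
    tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(model_path, trust_remote_code=True)
    return pipeline("text-generation", model=model, tokenizer=tokenizer)
```

If you're loading through KoboldAI rather than your own script, the flag has to be passed wherever KoboldAI calls from_pretrained — there may not be a user-facing setting for it, depending on your version.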
If you're trying to do inference on Falcon-40b-instruct, you could try an API that's running it: https://ai.chainconductor.io/3NN8BnB
