Using this model with oobabooga/text-generation-webui

#1 by latent-variable

I am a bit new to using the webui, but I wanted to give this model a try to see how well it does. Every time I tried to load it, I got an error. It took me a while to realize that I had to update webui.py.

I updated the following line:
```
run_cmd("python server.py --chat --model-menu")  # put your flags here!
```
with
```
run_cmd("python server.py --chat --model oasst-sft-7-llama-30b-4bit --wbits 4 --model_type llama")
```
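
As an aside, a small variant of that edit keeps the flags in one string so future model swaps only need a single change. This is just a sketch of the same `run_cmd` call; the `flags` variable is my own name, not something the webui defines:
```
# Hypothetical refactor of the same launcher line: collect the flags in one
# string (the name "flags" is illustrative, not part of the webui's code).
flags = "--chat --model oasst-sft-7-llama-30b-4bit --wbits 4 --model_type llama"
run_cmd(f"python server.py {flags}")
```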

If anyone is curious, the model requires >18 GB of VRAM.
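
If you have less than that, CPU offloading may be worth a try. This is an untested sketch: I believe `--pre_layer` is the webui's flag for allocating a set number of layers to the GPU with 4-bit models, and the value 30 is just a starting point to tune for your card:
```
# Same launcher line with partial CPU offload (assumptions: your webui
# version supports --pre_layer; tune the layer count for your GPU).
run_cmd("python server.py --chat --model oasst-sft-7-llama-30b-4bit --wbits 4 --model_type llama --pre_layer 30")
```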

I don't have this line in that PY file... "run_cmd("python server.py --chat --model-menu") # put your flags here!"
Ideas?

Are you using Windows or Linux?


I'm using Windows 11
