Works with the current oobabooga version.

#15
by robert1968 - opened

With the current https://github.com/oobabooga/text-generation-webui (Linux) version it works without any patching or magic. Just download and install.
It seems to be using only the CPU; no GPU is used.

Speed on my PC is 2-4 tps.
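
For anyone who wants GPU offload with GGML models: the webui loads them through the llama-cpp-python backend, so the usual trick is to build that backend with GPU (cuBLAS) support and ask it to offload layers. A minimal sketch of the idea using llama-cpp-python directly, assuming it was installed with CUDA support; the model path and layer count are just placeholders:

```python
from llama_cpp import Llama

# Load a GGML model and offload some transformer layers to the GPU.
# n_gpu_layers=0 (the default) keeps everything on the CPU, which is
# why generation sits around 2-4 tokens/s on CPU-only setups.
llm = Llama(
    model_path="path/to/model.ggmlv3.q4_0.bin",  # placeholder path
    n_ctx=2048,        # context window
    n_gpu_layers=35,   # layers to offload; tune this to your VRAM
)

out = llm("Hello, how are you?", max_tokens=140)
print(out["choices"][0]["text"])
```

In the webui itself the same thing is exposed as an n-gpu-layers setting for llama.cpp models (at least in recent builds); leaving it at 0 means only the CPU is used.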

Good to know

@TheBloke Hi.
Please update the README to note that standard oobabooga works with this model. No hacks needed.
Thanks

deleted

For me, the most current version does not work at all with anything; I had to roll back.

Today I installed the Linux version again and it still works. It loads the model a bit faster: 31 seconds vs. 60 seconds yesterday.

The GPU is still not used, and there is still a very long wait (~25 sec) until the first response starts, for each prompt :( but after that the speed is ~2 tps.
Output generated in 64.59 seconds (2.17 tokens/s, 140 tokens, context 82, seed 963111989)
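
That ~25 second wait before the first token is most likely prompt processing (plus model warm-up on the very first request), not generation itself. A rough way to see the split, again just a sketch using the llama-cpp-python backend directly; the path and prompt are illustrative:

```python
import time
from llama_cpp import Llama

llm = Llama(model_path="path/to/model.ggmlv3.q4_0.bin", n_ctx=2048)  # placeholder path

start = time.time()
first_token_at = None
n_tokens = 0

# Stream the completion so the time to first token can be measured
# separately from the rest of the generation.
for chunk in llm("Write a short poem about GPUs.", max_tokens=140, stream=True):
    if first_token_at is None:
        first_token_at = time.time()
    n_tokens += 1

total = time.time() - start
print(f"time to first token: {first_token_at - start:.1f}s")
print(f"generation speed:    {n_tokens / (total - (first_token_at - start)):.2f} tokens/s")
```

If the first number dominates, the bottleneck is prompt evaluation on the CPU rather than token generation.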

I did a complete teardown of the venv and rebuilt it from a snapshot as of 20 minutes ago. It seems to work now with this model too. Something must have gone sideways with my last update, even though rolling back got me back to where I was yesterday... I dunno. It works now; I guess that is all that matters.
