Request for assistance

#1
by Rivaidan - opened

I feel a bit ashamed to have to ask for assistance, because I've used tons of your models in the past without issues, but today's 1.2 version models seem to only output gibberish for me. At first I thought I was using the wrong instruct template, but no matter which one I choose, the output comes out broken. Is there a setting or file I missed somewhere?
[attached screenshot: example.jpg]

I can't seem to get it to work either; gibberish output here too. I also get an error when using the oobabooga webui:

Traceback (most recent call last):
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/gradio/routes.py", line 427, in run_predict
    output = await app.get_blocks().process_api(
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/gradio/blocks.py", line 1323, in process_api
    result = await self.call_function(
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/gradio/blocks.py", line 1067, in call_function
    prediction = await utils.async_iteration(iterator)
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/gradio/utils.py", line 336, in async_iteration
    return await iterator.__anext__()
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/gradio/utils.py", line 329, in __anext__
    return await anyio.to_thread.run_sync(
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/home/shawhu/ooba/installer_files/env/lib/python3.10/site-packages/gradio/utils.py", line 312, in run_sync_iterator_async
    return next(iterator)
  File "/home/shawhu/ooba/text-generation-webui/modules/chat.py", line 332, in generate_chat_reply_wrapper
    for i, history in enumerate(generate_chat_reply(text, shared.history, state, regenerate, _continue, loading_message=True)):
  File "/home/shawhu/ooba/text-generation-webui/modules/chat.py", line 317, in generate_chat_reply
    for history in chatbot_wrapper(text, history, state, regenerate=regenerate, _continue=_continue, loading_message=loading_message):
  File "/home/shawhu/ooba/text-generation-webui/modules/chat.py", line 195, in chatbot_wrapper
    stopping_strings = get_stopping_strings(state)
  File "/home/shawhu/ooba/text-generation-webui/modules/chat.py", line 129, in get_stopping_strings
    state['turn_template'].split('<|user-message|>')[1].split('<|bot|>')[0] + '<|bot|>',
IndexError: list index out of range
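
For what it's worth, the IndexError in the last frame suggests get_stopping_strings assumes the active template's turn_template contains the '<|user-message|>' placeholder; if the template string is empty or uses different placeholders, split() returns a single-element list and indexing [1] fails. A minimal sketch of the failure (the empty template value is just an assumption for illustration, not necessarily what my preset contains):

turn_template = ''  # hypothetical: a preset that leaves turn_template empty
parts = turn_template.split('<|user-message|>')
# parts == [''], a single element, so [1] is out of range
print(parts[1])  # IndexError: list index out of range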

Yep, confirmed gibberish output with GPTQ-for-llama + 4bit + 128g @TheBloke
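
Before blaming the quantization itself, a quick way to rule out a corrupted or truncated download is to open the safetensors file and list its tensors; a damaged header fails right at this step. A minimal sketch, assuming the safetensors package and a placeholder filename (this only catches header/truncation damage, not every flipped byte):

from safetensors import safe_open

path = 'model-4bit-128g.safetensors'  # hypothetical filename; use your actual download
with safe_open(path, framework='pt') as f:
    names = list(f.keys())
print(len(names), 'tensors found, e.g.', names[:3])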

Sorry guys, I will re-generate it.

Sorry for the problems with the gibberish file.

It's been re-generated and re-uploaded - please download it again and give it another try.
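
For anyone worried about grabbing a stale cached copy, you can compare the file's SHA-256 against the hash Hugging Face shows on the repo's "Files and versions" page. A minimal sketch with a placeholder filename:

import hashlib

def sha256sum(path, chunk=1 << 20):
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(chunk), b''):
            h.update(block)
    return h.hexdigest()

print(sha256sum('model-4bit-128g.safetensors'))  # hypothetical filename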

The reupload works, thanks for the quick fix @TheBloke !
