Transformers
llama
text-generation-inference

How do I get it working with ExLlama?

#1
by DQ83 - opened

I get this error in chatbot_wrapper:

stopping_strings = get_stopping_strings(state)
File "E:\oobabooga_windows\text-generation-webui\modules\chat.py", line 160, in get_stopping_strings
state['turn_template'].split('<|user-message|>')[1].split('<|bot|>')[0] + '<|bot|>',
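For reference, the failing line does roughly the following, so an IndexError at `[1]` means the selected turn template does not contain the `<|user-message|>` placeholder at all. The sketch below is a simplified reconstruction based on the traceback, not the actual webui source; the helper name is made up:

```python
# Simplified reconstruction of the split in modules/chat.py
# (hypothetical helper name, not the real webui function).
def bot_prefix_stopping_string(turn_template: str) -> str:
    # Take everything between <|user-message|> and <|bot|>,
    # then re-append <|bot|> itself as the stopping string.
    # Raises IndexError if <|user-message|> is missing from the template.
    return turn_template.split('<|user-message|>')[1].split('<|bot|>')[0] + '<|bot|>'

# A template that contains both placeholders works fine:
template = '<|user|> <|user-message|>\n<|bot|> <|bot-message|>\n'
print(repr(bot_prefix_stopping_string(template)))  # '\n<|bot|>'
```

So the fix is not the code but the template: pick (or define) a turn template that actually includes `<|user-message|>` and `<|bot|>`.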

Which turn template could work?

Ah, I see: ChatML works.
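For anyone landing here later: a ChatML-style instruction template for text-generation-webui looks roughly like the YAML below. This is a sketch from memory, not copied from the repo; check the instruction-templates folder of your webui install for the exact shipped file. The key point is that the `turn_template` contains both `<|user-message|>` and `<|bot|>`, which is what the failing split needs:

```yaml
# ChatML-style template sketch (verify against your webui's bundled ChatML template)
user: "<|im_start|>user"
bot: "<|im_start|>assistant"
turn_template: "<|user|>\n<|user-message|><|im_end|>\n<|bot|>\n<|bot-message|><|im_end|>\n"
context: "<|im_start|>system\n<|system-message|><|im_end|>\n"
```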
