It would be great if you could provide an Instruction template and examples. Also, is the underlying model llama2-base or llama2-chat?

#3
by bash99 - opened

That would make it easier for people who haven't joined the group to start testing.

I've tried
User:
Assistant:
and that works. What's odd is that when calling the model directly,
You are .... assistant. Think it over and answer user question correctly.
User:
Assistant:
also works, but after loading it in the ooba UI with exllama and calling through the API, it instead needs
User: You are .... assistant. Think it over and answer user question correctly. \n
Assistant:
to work.
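
For reference, here is a minimal sketch of the two layouts described above. The system prompt wording and the sample question are placeholders, not an official template:

```python
# Hypothetical sketch of the two prompt layouts described above.
# The system prompt text is a placeholder, not the official OpenBuddy template.

SYSTEM = "You are a helpful assistant. Think it over and answer the user's question correctly."

def prompt_system_on_own_line(user_msg: str) -> str:
    # Layout that worked when calling the model directly:
    # system prompt on its own line, then User:/Assistant: turns.
    return f"{SYSTEM}\nUser: {user_msg}\nAssistant:"

def prompt_system_inside_user_turn(user_msg: str) -> str:
    # Layout that was needed via the ooba UI / exllama API:
    # system prompt folded into the first User: line, ending with a newline.
    return f"User: {SYSTEM} \n{user_msg}\nAssistant:"

print(prompt_system_on_own_line("What is the capital of France?"))
print(prompt_system_inside_user_turn("What is the capital of France?"))
```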

Also, when running domain finetuning tests, I found that the loss decreases very slowly with the dialogue format above as the template; switching to llama2's instruction template, the one with [INST], makes the loss drop much faster.
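
For comparison, the llama2 instruction template mentioned here looks roughly like the sketch below; treat it as an illustration, since details such as BOS/EOS handling depend on the tokenizer and training setup:

```python
def llama2_inst_prompt(system_msg: str, user_msg: str) -> str:
    # llama2-chat style instruction template with [INST] / <<SYS>> tags.
    # BOS/EOS tokens are usually added by the tokenizer, so they are omitted here.
    return (
        f"[INST] <<SYS>>\n{system_msg}\n<</SYS>>\n\n"
        f"{user_msg} [/INST]"
    )

print(llama2_inst_prompt("You are a helpful assistant.", "Summarize this paragraph."))
```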

OpenBuddy org

Hi, for the Instruction template and examples, please see the example on our GitHub: https://github.com/OpenBuddy/OpenBuddy/blob/main/examples/hello.py

During SFT, the System Prompt only needs to be: You are a helpful assistant.
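
A minimal sketch of how such an SFT-style prompt might be assembled is shown below; the exact turn separators should be taken from examples/hello.py in the repo, this is only an illustration:

```python
# Illustrative only: the authoritative template is in examples/hello.py of the OpenBuddy repo.
def build_prompt(history: list[tuple[str, str]], user_msg: str) -> str:
    parts = ["You are a helpful assistant.", ""]
    for user_turn, assistant_turn in history:
        parts.append(f"User: {user_turn}")
        parts.append(f"Assistant: {assistant_turn}")
    parts.append(f"User: {user_msg}")
    parts.append("Assistant:")
    return "\n".join(parts)

print(build_prompt([], "Hello, who are you?"))
```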

OpenBuddy org

Also, the underlying model is llama2-base.
