
RWKV 7B World, focused on reading comprehension

This is an experimental model based on RWKV 7B World.

Why is this model special? It removes the EOD token, adds special tokens, and changes the vocabulary.

This model is intended for QA over large texts and for in-context learning with a knowledge-indexed database.

Training details

The model was trained with the following new format:

<s>User: <sys>xxxx\n\n</sys>xxxxx\n\n</s><s>Assistant: xxxxx\n\n</s><s>User: xxxx\n\n</s><s>Assistant: \n\n</s>
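To make the template concrete, here is a minimal sketch of a helper that assembles a conversation into this format. The special tokens (`<s>`, `</s>`, `<sys>`) come from the template above; the function name and signature are illustrative, not part of the model card.

```python
# Assemble (role, text) turns into the trained prompt format.
# Assumption: the <sys> block is wrapped inside the first User turn,
# as shown in the template.

def build_prompt(turns, system=None):
    """turns: list of (role, text) pairs, role is "User" or "Assistant"."""
    parts = []
    for i, (role, text) in enumerate(turns):
        body = text
        if i == 0 and system is not None and role == "User":
            body = f"<sys>{system}\n\n</sys>{text}"
        parts.append(f"<s>{role}: {body}\n\n</s>")
    return "".join(parts)

prompt = build_prompt(
    [("User", "What is RWKV?"), ("Assistant", "")],
    system="You are a reading-comprehension assistant.",
)
```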

So use User and Assistant as the prefix names. When running inference in RWKV Runner, the following format works:

User: xxxx\n\nAssistant: xxxx\n\n (this is the format used in the test cases).


To use this model with RWKV Runner, some setup is needed: copy the back-python folder to a new folder in the same directory as rwkv-runner.exe (or whichever file you run), then paste rwkv_vocab_v20230424.txt into the rwkv_pip folder to replace the existing vocabulary file.

Run ../py310/python main.py in this new folder, then in the RWKV Runner settings point the API to 127.0.0.1:8000, and open 127.0.0.1:8000/docs to switch the loaded model to this one.
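Once the backend is running, you can query it over HTTP. A minimal sketch follows; the endpoint path (`/chat/completions`) and payload fields are assumptions based on RWKV Runner's OpenAI-style backend API, so check 127.0.0.1:8000/docs for the exact schema.

```python
# Build a request body for the (assumed) OpenAI-style chat endpoint
# served by the RWKV Runner Python backend.
import json

API_URL = "http://127.0.0.1:8000/chat/completions"  # assumed endpoint path

def make_request_body(question, temperature=1.2, top_p=0.5):
    # The backend maps chat roles onto the User/Assistant prefixes
    # that the model was trained with.
    return json.dumps({
        "messages": [{"role": "user", "content": question}],
        "temperature": temperature,
        "top_p": top_p,
        "stream": False,
    })

body = make_request_body("Summarize the abstract above.")
# To actually send it (requires the backend to be running):
# import urllib.request
# req = urllib.request.Request(API_URL, body.encode(),
#                              {"Content-Type": "application/json"})
# print(urllib.request.urlopen(req).read().decode())
```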

Try different temperature and top-p values; temperature 1.2 with top-p 0.5 may work well.
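For readers unfamiliar with these two knobs, here is an illustrative sketch of what temperature and top-p do to the model's next-token distribution. This is a generic sampling routine, not code from this model's inference stack.

```python
# Temperature rescales the logits (>1 flattens, <1 sharpens the
# distribution); top-p then keeps only the smallest set of tokens
# whose cumulative probability reaches top_p before sampling.
import math
import random

def sample(logits, temperature=1.2, top_p=0.5, rng=random.random):
    scaled = [l / temperature for l in logits]
    m = max(scaled)                         # subtract max for stability
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep the highest-probability tokens until their mass reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    kept, mass = [], 0.0
    for i in order:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break
    # Draw one token from the kept set, renormalized to its total mass.
    r = rng() * mass
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]
```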


(Screenshot: sample output at temperature 1.2, top-p 0.6.)

