---
license: apache-2.0
---

This is a full finetune of the RWKV 4 World 7B CHNtuned model, trained on data from Readflow (readflow.com.cn). It was finetuned for a 32k context length with infinite-context (inf-ctx) training (https://github.com/SynthiaDL/TrainChatGalRWKV) and is intended for summarizing long news articles.

You can test the summarization prompt with RWKV Runner in chat mode; see the conversation files in the examples folder.

Discord: https://discord.gg/pWH5MkvtNR

-------------------------------------------------------------------------------------------------

This model was built in cooperation with Readflow (readflow.com.cn) for summarizing very long WeChat articles. It was trained with a 32k context length, so an entire article can be fed in at once for summarization, as in the screenshot below and the examples in the examples folder, where a 23k-token article is summarized in a single pass. The RWKV World tokenizer is highly efficient for Chinese and many other languages: the token-to-character ratio is roughly 1:1, and sometimes even better. Testing is welcome; join QQ group 439087067 to discuss finetuning.

![微信截图_20230720145202.png](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/3prZb97XOKeHhVQyYevA_.png)
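A minimal sketch of how a single-pass summarization prompt for this model might be assembled: the full article is placed inside one chat turn, relying on the 32k context window. The `User:`/`Assistant:` template and the `build_summary_prompt` helper here are assumptions for illustration, not the model's confirmed format; the conversation files in the examples folder show the actual prompts used.

```python
# Hypothetical prompt construction for one-shot article summarization.
# The "User:"/"Assistant:" turn format is an ASSUMPTION modeled on how
# RWKV Runner's chat mode typically renders conversations; consult the
# examples folder for the exact prompt this model was tuned on.

def build_summary_prompt(article: str, instruction: str = "请总结这篇文章") -> str:
    """Wrap an entire article (up to the 32k-token window) in a single chat turn."""
    return f"User: {instruction}\n{article.strip()}\n\nAssistant:"

# Usage: feed the resulting string to the model in one pass.
prompt = build_summary_prompt("...full WeChat article text here...")
```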