---
license: apache-2.0
---
This is a fully finetuned model based on the RWKV 4 World 7B CHNtuned model, trained on data from Readflow (readflow.com.cn).
It was finetuned for a 32k context length and is intended for summarizing news articles.
Training used the inf-ctx method from https://github.com/SynthiaDL/TrainChatGalRWKV, which keeps VRAM usage fixed.
You can test summarization prompts with RWKV Runner (https://github.com/josStorer/RWKV-Runner) in chat mode, and check the conversation files in the examples folder.
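As a minimal sketch, here is how a chat-mode summarization prompt might be assembled before being sent to the model. The `User:`/`Assistant:` template below is an assumption inferred from this card's note that the text after "Assistant:" is the model's output; the conversation files in the examples folder show the exact format the model was trained on.

```python
# Hypothetical prompt builder for chat-mode summarization.
# The "User:"/"Assistant:" template is an assumption based on this card's
# note that "Assistant:" marks the model's output; verify the real format
# against the conversation files in the examples folder.

def build_summary_prompt(article: str) -> str:
    """Wrap a full article (up to the 32k-token context) in one chat turn."""
    return f"User: 请总结以下文章:\n{article}\n\nAssistant:"

prompt = build_summary_prompt("(full article text here)")
# The model generates the summary as the continuation after "Assistant:".
```

Because the model accepts the whole article in one 32k-token context window, no chunking or sliding-window logic is needed here.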
https://discord.gg/pWH5MkvtNR
![QQ图片20230721210758.jpg](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/NTa346CoQzH-k0civqG1Q.jpeg)
-------------------------------------------------------------------------------------------------
This model was built in collaboration with Readflow (readflow.com.cn) for summarizing very long WeChat articles. It was trained with a 32k context length, so an entire article can be fed in at once for summarization, as shown in the image below and in the examples folder:
23k tokens were summarized in a single pass with no increase in VRAM usage. If a cached state is available, results come back in seconds, and you can retry repeatedly to explore different outputs.
The text after "Assistant:" is the model's output.
You can test it with RWKV Runner: https://github.com/josStorer/RWKV-Runner
The RWKV World tokenizer is very efficient for Chinese and many other languages: the token-to-character ratio is roughly 1:1, sometimes even 1:n. Testing is welcome; join the finetuning QQ group 439087067 to discuss.
![微信截图_20230720145202.png](https://cdn-uploads.huggingface.co/production/uploads/6176b32847ee6431f632981e/3prZb97XOKeHhVQyYevA_.png)