
RWKV v4 world 7B 65k context

This model replaces the old RWKV 65k Claude model. It was fine-tuned with a special token and a lower learning rate to preserve the model's earlier abilities, and was trained on a large set of high-quality English textbooks and Chinese novels at a 65k context length.

Running it with RWKV Runner (https://github.com/josStorer/RWKV-Runner) requires only 16 GB of VRAM.

Contributors

@KevinMr @Remixa

Training details

https://wandb.ai/one-/one-rwkv-64k/runs/jn05hyc4


Test cases

https://rwkv-next-web.ai-creator.net/ (temporary)

https://rwkv.ai-creator.net/risu

How to use

Use the vocab files in the RWKV Runner configuration.

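Besides RWKV Runner, the weights can also be loaded directly with the `rwkv` pip package (the ChatRWKV inference library). The sketch below is a minimal, hedged example: the local filename `RWKV-v4-world-7B-one-state-65k.pth` is an assumption (use whatever filename you downloaded), and `"world" `models must use the world vocab (`rwkv_vocab_v20230424`) rather than the older 20B tokenizer.

```python
import os

# Hypothetical local path to the downloaded weights -- adjust to your setup.
MODEL_PATH = "RWKV-v4-world-7B-one-state-65k.pth"

def load_model(path):
    """Load the model with the `rwkv` pip package (pip install rwkv)."""
    from rwkv.model import RWKV
    from rwkv.utils import PIPELINE

    # fp16 on a single GPU fits in roughly 16 GB of VRAM for a 7B model.
    model = RWKV(model=path, strategy="cuda fp16")
    # World-series models require the world vocabulary, not the 20B tokenizer.
    pipeline = PIPELINE(model, "rwkv_vocab_v20230424")
    return model, pipeline

if __name__ == "__main__":
    if os.path.exists(MODEL_PATH):
        model, pipeline = load_model(MODEL_PATH)
        print(pipeline.generate("Hello, ", token_count=32))
    else:
        print("Weights not found; download them from the model repository first.")
```

This is a sketch under the stated assumptions, not the card author's recommended setup; RWKV Runner remains the documented path.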

