- RWKV-7 "Goose" with Expressive Dynamic State Evolution (arXiv:2503.14456)
- BlackGoose Rimer: Harnessing RWKV-7 as a Simple yet Superior Replacement for Transformers in Large-Scale Time Series Modeling (arXiv:2503.06121)
- ARWKV: Pretrain is not what we need, an RNN-Attention-Based Language Model Born from Transformer (arXiv:2501.15570, published Jan 26)