
Model Notes

Linear models offer a promising way to significantly reduce computational costs at scale, particularly for long context lengths. This enables a more than 1000x improvement in inference cost efficiency, opening the door to both O1-style inference-time thinking and wider AI accessibility. We are able to convert any previously trained QKV-attention-based model, such as Qwen or LLaMA, into an RWKV variant without retraining from scratch. This lets us rapidly test and validate the significantly more efficient RWKV linear attention mechanism at larger scale on a much smaller budget, bypassing the need to train from scratch.
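The conversion recipe itself is not detailed in this card, so the following is only a minimal PyTorch sketch of the idea, with every name (ToyQKVAttention, ToyLinearAttention, convert_block, the decay parameter) hypothetical: reuse the parent block's trained Q/K/V/output projections to initialise an RWKV-style linear-attention block, then fine-tune rather than pretrain. Real RWKV time mixing (token shift, bonus terms, multi-head state, and so on) is considerably richer than this toy.

```python
import torch
import torch.nn as nn

class ToyQKVAttention(nn.Module):
    """Stand-in for one trained softmax-attention block (e.g. a Qwen/LLaMA layer)."""
    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim, bias=False)
        self.k_proj = nn.Linear(dim, dim, bias=False)
        self.v_proj = nn.Linear(dim, dim, bias=False)
        self.o_proj = nn.Linear(dim, dim, bias=False)

class ToyLinearAttention(nn.Module):
    """Toy RWKV-style replacement: same projection shapes, but token mixing is a
    recurrence over a fixed-size state instead of a T x T softmax map."""
    def __init__(self, dim: int):
        super().__init__()
        self.r_proj = nn.Linear(dim, dim, bias=False)   # receptance plays the query's role
        self.k_proj = nn.Linear(dim, dim, bias=False)
        self.v_proj = nn.Linear(dim, dim, bias=False)
        self.o_proj = nn.Linear(dim, dim, bias=False)
        self.decay = nn.Parameter(torch.zeros(dim))     # new parameter, trained during conversion

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, D = x.shape
        r = torch.sigmoid(self.r_proj(x))               # (B, T, D)
        k, v = self.k_proj(x), self.v_proj(x)           # (B, T, D) each
        w = torch.exp(-torch.exp(self.decay))           # per-channel decay in (0, 1)
        state = x.new_zeros(B, D, D)                    # fixed-size state, independent of T
        outs = []
        for t in range(T):                              # O(T) total work, constant per token
            state = w.unsqueeze(-1) * state + k[:, t].unsqueeze(-1) * v[:, t].unsqueeze(-2)
            outs.append(r[:, t].unsqueeze(-2) @ state)  # (B, 1, D)
        return self.o_proj(torch.cat(outs, dim=-2))     # (B, T, D)

def convert_block(trained: ToyQKVAttention, dim: int) -> ToyLinearAttention:
    """Initialise the linear block from the parent's trained projections so it
    starts near the parent model instead of from random weights."""
    converted = ToyLinearAttention(dim)
    with torch.no_grad():
        converted.r_proj.weight.copy_(trained.q_proj.weight)
        converted.k_proj.weight.copy_(trained.k_proj.weight)
        converted.v_proj.weight.copy_(trained.v_proj.weight)
        converted.o_proj.weight.copy_(trained.o_proj.weight)
    return converted

# Usage: swap the block in, then fine-tune briefly instead of pretraining from scratch.
block = convert_block(ToyQKVAttention(64), dim=64)
print(block(torch.randn(2, 16, 64)).shape)              # torch.Size([2, 16, 64])
```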

This approach demonstrates the soundness and scalability of the RWKV architecture design, reinforcing the idea that QKV attention is not the sole essential component. One downside of the technique is that the model's knowledge and training data are inherited from its "parent" model. Consequently, unlike previous RWKV models trained on more than 100 languages, the QRWKV model is limited to the roughly 30 languages supported by the Qwen line of models.

It does, however, gain the inference-time performance speedup of a linear model.
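To make that speedup concrete: at decode time, softmax attention must store and scan a KV cache that grows with the context, while an RWKV-style layer carries a fixed-size state, so per-token cost and memory stay flat no matter how long the context gets. A minimal illustration, with all names hypothetical:

```python
import torch

dim, layers, steps = 64, 4, 4096

# Softmax-attention decoding: each layer's KV cache gains one (k, v) pair per token,
# and every new token attends over everything stored so far.
kv_cache = [{"k": [], "v": []} for _ in range(layers)]

# Linear-attention decoding: each layer keeps one fixed-shape state, updated in place.
states = [torch.zeros(dim, dim) for _ in range(layers)]
decay = 0.99                                            # stand-in for a learned decay

for _ in range(steps):
    for layer in range(layers):
        k, v = torch.randn(dim), torch.randn(dim)       # stand-ins for the projections
        kv_cache[layer]["k"].append(k)                  # grows without bound
        kv_cache[layer]["v"].append(v)
        states[layer] = decay * states[layer] + torch.outer(k, v)  # constant-cost update

print("KV-cache entries per layer:", len(kv_cache[0]["k"]))        # 4096 and still growing
print("linear-state shape per layer:", tuple(states[0].shape))     # always (64, 64)
```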

Benchmark Numbers

| Tasks          | Version | Filter | n-shot | Metric     |   |  Value |   | Stderr |
|----------------|--------:|--------|-------:|------------|---|-------:|---|-------:|
| mmlu           |       2 | none   |      0 | acc        | ↑ | 0.7767 | ± | 0.0033 |
| arc_challenge  |       1 | none   |      0 | acc        | ↑ | 0.6152 | ± | 0.0142 |
|                |         | none   |      0 | acc_norm   | ↑ | 0.6297 | ± | 0.0141 |
| arc_easy       |       1 | none   |      0 | acc        | ↑ | 0.8565 | ± | 0.0072 |
|                |         | none   |      0 | acc_norm   | ↑ | 0.8304 | ± | 0.0077 |
| hellaswag      |       1 | none   |      0 | acc        | ↑ | 0.6780 | ± | 0.0047 |
|                |         | none   |      0 | acc_norm   | ↑ | 0.8587 | ± | 0.0035 |
| lambada_openai |       1 | none   |      0 | acc        | ↑ | 0.7502 | ± | 0.0060 |
|                |         | none   |      0 | perplexity | ↓ | 2.9369 | ± | 0.0624 |
| piqa           |       1 | none   |      0 | acc        | ↑ | 0.8237 | ± | 0.0089 |
|                |         | none   |      0 | acc_norm   | ↑ | 0.8368 | ± | 0.0086 |
| winogrande     |       1 | none   |      0 | acc        | ↑ | 0.7806 | ± | 0.0116 |
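These rows match the output format of EleutherAI's lm-evaluation-harness. The snippet below is only a hedged sketch of how such a zero-shot run is typically produced with that harness; the repo id is a placeholder, and the exact harness version and arguments used for the numbers above are not stated in this card.

```python
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                                              # load the checkpoint via transformers
    model_args="pretrained=<this-repo-id>,dtype=bfloat16",   # weights are published in BF16
    tasks=["mmlu", "arc_challenge", "arc_easy", "hellaswag",
           "lambada_openai", "piqa", "winogrande"],
    num_fewshot=0,                                           # every row above is 0-shot
    batch_size="auto",
)
print(results["results"]["arc_easy"]["acc,none"])            # e.g. ~0.8565 in the table above
```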
Model size: 74.3B parameters (Safetensors, BF16)