---
language:
- en
tags:
- pytorch
- text-generation
- causal-lm
- rwkv
license: apache-2.0
datasets:
- the_pile
---

# RWKV-4 7B

## Model Description

RWKV-4 7B is an L32-D4096 causal language model trained on the Pile. See https://github.com/BlinkDL/RWKV-LM for details.

**Note: this is a BF16 model, and it may overflow if you run it in FP16 (probably fixable by rescaling the weights).**

At the moment you have to use my GitHub code (https://github.com/BlinkDL/RWKV-LM) to run it.

Hyperparameters:
* ctx_len = 1024
* n_layer = 32
* n_embd = 4096

Preview checkpoint: RWKV-4-Pile-7B-20221030-6224.pth, trained on the Pile for 257B tokens.
* Pile loss 1.8553
* LAMBADA ppl 4.36, acc 67.42%
* PIQA acc 75.68%
* SC2016 acc 72.58%
* Hellaswag acc_norm 64.87%
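
Since the card only ships the raw `.pth`, here is a minimal sketch for sanity-checking the checkpoint against the numbers above. It assumes you have downloaded `RWKV-4-Pile-7B-20221030-6224.pth` locally; the key names (`emb.weight`, `blocks.<i>.*`) follow the RWKV-LM checkpoint layout.

```python
import torch

# Load the raw state dict on CPU (assumes the .pth has been downloaded).
ckpt = torch.load('RWKV-4-Pile-7B-20221030-6224.pth', map_location='cpu')

print(ckpt['emb.weight'].dtype)   # torch.bfloat16 -- matches the BF16 note
print(ckpt['emb.weight'].shape)   # (vocab_size, 4096) -- n_embd = 4096

# Blocks are stored as blocks.<i>.*; counting distinct indices gives n_layer.
n_layer = len({k.split('.')[1] for k in ckpt if k.startswith('blocks.')})
print(n_layer)                    # 32
```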
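
And a hedged sketch of the "rescaling the weights" fix for FP16 mentioned in the note above. The scheme (divide each block's output projections by a growing power of 2, then halve the residual stream at the same interval during inference) is the one BlinkDL later used in ChatRWKV as `RWKV_RESCALE_LAYER`; the constant and key names here are assumptions, not part of this checkpoint.

```python
import torch

RESCALE_LAYER = 6  # halve the residual stream every 6 blocks (assumed value)

ckpt = torch.load('RWKV-4-Pile-7B-20221030-6224.pth', map_location='cpu')

for k in ckpt:
    v = ckpt[k]
    if 'att.output.weight' in k or 'ffn.value.weight' in k:
        # Shrink the projections that write into the residual stream so
        # FP16 accumulations stay in range; key format is blocks.<i>.*
        layer_id = int(k.split('.')[1])
        v = v / (2 ** (layer_id // RESCALE_LAYER))
    ckpt[k] = v.half()

# The forward pass must compensate, e.g. after block i:
#   if (i + 1) % RESCALE_LAYER == 0:
#       x = x / 2
```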