---
language:
  - en
tags:
  - pytorch
  - text-generation
  - causal-lm
  - rwkv
license: apache-2.0
datasets:
  - the_pile
---

# RWKV-4 1.5B

## Model Description

RWKV-4 1.5B is an L24-D2048 causal language model trained on the Pile. See https://github.com/BlinkDL/RWKV-LM for details.

Use https://github.com/BlinkDL/ChatRWKV to run it.

- ctx_len = 1024
- n_layer = 24
- n_embd = 2048
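
Below is a minimal sketch of loading and sampling from the model with the `rwkv` pip package that ChatRWKV builds on; the checkpoint path, tokenizer file location, and sampling settings are placeholders, so adjust them to your setup.

```python
# Minimal sketch: run the model with the `rwkv` pip package (pip install rwkv),
# which is what ChatRWKV uses under the hood. Paths below are placeholders.
from rwkv.model import RWKV
from rwkv.utils import PIPELINE, PIPELINE_ARGS

# strategy selects device/precision, e.g. 'cuda fp16' or 'cpu fp32'
model = RWKV(model='path/to/RWKV-4-Pile-1B5-20220903-8040', strategy='cpu fp32')

# Pile models use the 20B tokenizer json shipped with the ChatRWKV repo
pipeline = PIPELINE(model, 'path/to/20B_tokenizer.json')

args = PIPELINE_ARGS(temperature=1.0, top_p=0.85)
print(pipeline.generate('The capital of France is', token_count=64, args=args))
```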

New checkpoint: RWKV-4-Pile-1B5-20220929-ctx4096.pth, fine-tuned to ctx_len = 4096.
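
To fetch a checkpoint programmatically, a minimal sketch using `huggingface_hub` is shown below; the repo id is assumed from this model card's name, so adjust it if it differs.

```python
# Minimal sketch: download the ctx4096 checkpoint with huggingface_hub.
# The repo_id is assumed from this model card; adjust if it differs.
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="BlinkDL/rwkv-4-pile-1b5",
    filename="RWKV-4-Pile-1B5-20220929-ctx4096.pth",
)
print(ckpt_path)  # local path to the cached checkpoint file
```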

Final checkpoint: RWKV-4-Pile-1B5-20220903-8040.pth, trained on the Pile for 332B tokens:

- Pile loss 2.0415
- LAMBADA ppl 7.04, acc 56.43%
- PIQA acc 72.36%
- SC2016 acc 68.73%
- Hellaswag acc_norm 52.48%

Note: the 4 / 4a / 4b models are NOT compatible with each other. Use RWKV-4 unless you know what you are doing.