---
license: apache-2.0
---

*An eagle soaring above a transformer robot*

### Eagle 7B - in short

Eagle 7B is a 7.52B parameter model that:

- Is built on the RWKV-v5 architecture (a linear transformer with 10-100x+ lower inference cost)
- Ranks as the world’s greenest 7B model (per token)
- Is trained on 1.1 trillion tokens across 100+ languages
- Outperforms all 7B-class models in multi-lingual benchmarks
- Approaches Falcon (1.5T), LLaMA2 (2T), and Mistral (>2T?) level performance in English evals
- Trades blows with MPT-7B (1T) in English evals
- Does all of this while being an “Attention-Free Transformer”
- Is a foundation model with only a very small instruct tune - further fine-tuning is required for most use cases!

Find out more in our model announcement: https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers

Or our wiki: https://wiki.rwkv.com