---
license: llama3.1
---
| Model | Wiki | C4 | PIQA | ARC-E | ARC-C | HellaSwag | Wino | Avg. |
|---|---|---|---|---|---|---|---|---|
| Unquantized | 2.82 | 7.18 | 82.81 | 85.31 | 59.64 | 67.49 | 82.00 | 75.45 |
| W4G64 | 3.09 | 7.53 | 83.03 | 85.52 | 58.19 | 67.04 | 80.43 | 74.84 |
| W3G64 | 4.29 | 8.91 | 82.04 | 83.29 | 54.78 | 64.99 | 78.14 | 72.65 |

Wiki and C4 report perplexity (lower is better); the remaining columns report zero-shot accuracy in % (higher is better), and Avg. is the mean of the accuracy columns.

Revisions available in this repository (see the loading sketch below):

- `main` (W4G64, learned scales)
- `nfl_w3g64` (W3G64, learned scales)
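
A revision can be selected when loading the checkpoint. Below is a minimal sketch using the standard `transformers` API; the repository id is a placeholder, and any quantization-specific dependencies the checkpoint may require are not shown.

```python
# Minimal sketch: selecting a repository revision when loading.
# "<namespace>/<model-name>" is a placeholder; substitute the actual repository id.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<namespace>/<model-name>"  # placeholder repository id
revision = "nfl_w3g64"  # W3G64 branch; use "main" (or omit) for the default W4G64 weights

tokenizer = AutoTokenizer.from_pretrained(repo_id, revision=revision)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    revision=revision,
    device_map="auto",  # requires `accelerate`
)
```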

Evaluations are provided for the models with learned scales.
Zero-shot benchmark scores are computed with [lm-evaluation-harness](https://github.com/EleutherAI/lm-evaluation-harness).
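
For reference, a hedged sketch of a zero-shot run with the lm-evaluation-harness Python API (v0.4+) is shown below; the repository id, revision, task names, and harness version used for the reported scores are assumptions, not taken from this repository.

```python
# Sketch of a zero-shot evaluation run; repo id, revision, and task list are assumptions.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=<namespace>/<model-name>,revision=nfl_w3g64",
    tasks=["piqa", "arc_easy", "arc_challenge", "hellaswag", "winogrande"],
    num_fewshot=0,  # zero-shot
)
print(results["results"])  # per-task metrics
```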