---
license: apache-2.0
---

# GreenBit LLaMA

This is GreenBitAI's pretrained 2-bit TinyLLaMA model, offering extreme compression while retaining strong performance.

Please refer to our GitHub page for the code to run the model and for more information.
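
The snippet below is a minimal, hypothetical sketch of fetching the checkpoint files from the Hugging Face Hub with `huggingface_hub`; the repository id is a placeholder, and the actual 2-bit inference code is provided in the GitHub repository mentioned above.

```python
# Minimal sketch (assumptions: huggingface_hub is installed and the placeholder
# repo id below is replaced with this model's actual Hub repository id).
from huggingface_hub import snapshot_download

REPO_ID = "GreenBitAI/your-model-repo"  # placeholder, not the real repo id

# Download all checkpoint files locally; the code from the GreenBitAI GitHub
# repository is then used to load and run the 2-bit weights.
local_dir = snapshot_download(repo_id=REPO_ID)
print(f"Checkpoint files downloaded to: {local_dir}")
```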

## Model Description

### Zero-Shot Evaluation

Columns labelled `q2gN` denote 2-bit quantization with group size N; the FP16 columns are the corresponding full-precision baselines.

| Task | Metric | TinyLLaMA 1.1B q2g32 | TinyLLaMA 1.1B q2g8 | LLaMA 3B q2g32 | LLaMA 3B q2g16 | LLaMA 3B q2g8 | LLaMA-1 7B q2g32 | LLaMA-2 7B q2g32 | LLaMA-2 7B q2g8 | TinyLLaMA 1.1B FP16 | LLaMA 3B FP16 | LLaMA-1 7B FP16 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| openbookqa | acc | 0.152 | 0.192 | 0.196 | 0.238 | 0.242 | 0.224 | 0.246 | 0.296 | 0.208 | 0.27 | 0.29 |
| | acc_norm | 0.328 | 0.338 | 0.332 | 0.358 | 0.362 | 0.388 | 0.376 | 0.4 | 0.368 | 0.4 | 0.41 |
| arc_challenge | acc | 0.3268 | 0.2278 | 0.279 | 0.2978 | 0.3148 | 0.3422 | 0.3268 | 0.3618 | 0.243 | 0.34 | 0.39 |
| | acc_norm | 0.3387 | 0.273 | 0.2944 | 0.3319 | 0.3345 | 0.3387 | 0.3387 | 0.372 | 0.288 | 0.37 | 0.41 |
| hellaswag | acc | 0.34 | 0.3769 | 0.4238 | 0.444 | 0.462 | 0.4996 | 0.4961 | 0.5379 | 0.403 | 0.49 | 0.68 |
| | acc_norm | 0.4097 | 0.4711 | 0.5685 | 0.5988 | 0.6242 | 0.6447 | 0.6464 | 0.7014 | 0.503 | 0.67 | 0.73 |
| piqa | acc | 0.6518 | 0.6931 | 0.7024 | 0.716 | 0.7291 | 0.7476 | 0.7503 | 0.7715 | 0.71 | 0.75 | 0.78 |
| | acc_norm | 0.6393 | 0.6812 | 0.7116 | 0.7247 | 0.7312 | 0.7443 | 0.7421 | 0.7568 | 0.688 | 0.76 | 0.78 |
| arc_easy | acc | 0.4411 | 0.5109 | 0.5997 | 0.646 | 0.6528 | 0.6061 | 0.6174 | 0.6254 | 0.533 | 0.69 | 0.68 |
| | acc_norm | 0.3716 | 0.412 | 0.5417 | 0.58 | 0.5972 | 0.4566 | 0.4781 | 0.4958 | 0.43 | 0.65 | 0.52 |
| winogrande | acc | 0.532 | 0.5249 | 0.5683 | 0.5888 | 0.6054 | 0.6283 | 0.6298 | 0.6582 | 0.558 | 0.62 | 0.68 |
| boolq | acc | 0.592 | 0.6174 | 0.6281 | 0.6636 | 0.6327 | 0.6425 | 0.7061 | 0.7242 | 0.583 | 0.68 | 0.75 |
| truthfulqa_mc | mc1 | 0.2338 | 0.2277 | 0.2509 | 0.2118 | 0.2252 | 0.224 | 0.2313 | 0.2399 | 0.228 | 0.22 | 0.21 |
| | mc2 | 0.4211 | 0.406 | 0.3962 | 0.3501 | 0.3625 | 0.3702 | 0.3854 | 0.3795 | 0.401 | 0.35 | 0.34 |
| anli_r1 | acc | 0.363 | 0.336 | 0.337 | 0.334 | 0.344 | 0.331 | 0.333 | 0.363 | 0.354 | 0.33 | 0.35 |
| anli_r2 | acc | 0.331 | 0.346 | 0.335 | 0.332 | 0.331 | 0.326 | 0.349 | 0.347 | 0.341 | 0.32 | 0.34 |
| anli_r3 | acc | 0.3758 | 0.3633 | 0.3358 | 0.3383 | 0.3425 | 0.3417 | 0.36 | 0.3733 | 0.358 | 0.35 | 0.37 |
| wic | acc | 0.5 | 0.5 | 0.4984 | 0.5094 | 0.4969 | 0.4984 | 0.4953 | 0.489 | 0.5 | 0.48 | 0.5 |
| rte | acc | 0.4874 | 0.4874 | 0.5596 | 0.5993 | 0.5632 | 0.639 | 0.6065 | 0.6426 | 0.516 | 0.58 | 0.56 |
| record | f1 | 0.7608 | 0.8023 | 0.8502 | 0.8625 | 0.8687 | 0.8859 | 0.8872 | 0.9037 | 0.82 | 0.88 | 0.91 |
| | em | 0.753 | 0.7934 | 0.8427 | 0.8545 | 0.8612 | 0.8781 | 0.8801 | 0.8959 | 0.818 | 0.89 | 0.91 |
| Average | | 0.438 | 0.4498 | 0.4881 | 0.5037 | 0.5087 | 0.5122 | 0.5181 | 0.5391 | 0.469 | 0.528 | 0.5519 |
| Model size (GiB) | | 0.5 | 0.6 | 1.2 | 1.3 | 1.5 | 2.2 | 2.2 | 2.9 | 4.4 | 6.8 | 12.5 |
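
The task and metric names above match those used by EleutherAI's lm-evaluation-harness. The following is a hedged sketch, not the exact evaluation setup used here, assuming the harness's 0.3.x-style `simple_evaluate` API, the `hf-causal` backend name, and a locally available checkpoint (loading the 2-bit weights requires the code from the GitHub repository linked above).

```python
# Hedged sketch of a zero-shot run with EleutherAI's lm-evaluation-harness.
# Assumptions: lm-eval 0.3.x-style API, "hf-causal" backend, and a checkpoint
# path the harness can load; the 2-bit weights need the GreenBitAI loading code.
from lm_eval import evaluator

results = evaluator.simple_evaluate(
    model="hf-causal",                                  # assumed backend name
    model_args="pretrained=/path/to/local/checkpoint",  # placeholder path
    tasks=[
        "openbookqa", "arc_challenge", "hellaswag", "piqa", "arc_easy",
        "winogrande", "boolq", "truthfulqa_mc", "anli_r1", "anli_r2",
        "anli_r3", "wic", "rte", "record",
    ],
    num_fewshot=0,  # zero-shot, matching the table above
)
print(results["results"])  # per-task acc / acc_norm / mc1 / mc2 / f1 / em
```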