Files changed (1)
README.md +6 -7
README.md CHANGED
@@ -49,13 +49,12 @@ Model evaluation metrics and results.
 
 | Benchmark | Metric | Llama-2-7b-instruct | Llama-2-7b-pruned70-retrained-instruct-quant-ds |
 |------------------------------------------------|---------------|-------------|-------------------------------|
-| [MMLU](https://arxiv.org/abs/2009.03300) | 5-shot, top-1 | xxxx | xxxx |
-| [HellaSwag](https://arxiv.org/abs/1905.07830) | 0-shot | xxxx | xxxx |
-| [WinoGrande](https://arxiv.org/abs/1907.10641) | partial score | xxxx | xxxx |
-| [ARC-c](https://arxiv.org/abs/1911.01547) | | xxxx | xxxx |
-| [TruthfulQA](https://arxiv.org/abs/2109.07958) | 5-shot | xxxx | xxxx |
-| [HumanEval](https://arxiv.org/abs/2107.03374) | pass@1 | xxxx | xxxx |
-| [GSM8K](https://arxiv.org/abs/2110.14168) | maj@1 | xxxx | xxxx |
+| [MMLU](https://arxiv.org/abs/2009.03300) | 5-shot | 48.60% | 41.21% |
+| [HellaSwag](https://arxiv.org/abs/1905.07830) | 10-shot | 79.45% | 76.88% |
+| [WinoGrande](https://arxiv.org/abs/1907.10641) | 5-shot | 75.69% | 70.24% |
+| [ARC-c](https://arxiv.org/abs/1911.01547) | 25-shot | 53.92% | 47.61% |
+| [TruthfulQA](https://arxiv.org/abs/2109.07958) | 0-shot | 43.63% | 42.04% |
+| [GSM8K](https://arxiv.org/abs/2110.14168) | 5-shot | 15.92% | 12.13% |
 
 ## Help
 
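The shot counts added in this diff (MMLU 5-shot, HellaSwag 10-shot, WinoGrande 5-shot, ARC-c 25-shot, TruthfulQA 0-shot, GSM8K 5-shot) match the Open LLM Leaderboard-style setup that is commonly run with EleutherAI's lm-evaluation-harness. The sketch below shows how a comparable evaluation could be reproduced; the harness task names (e.g. `truthfulqa_mc2`), the `MODEL_ID` repo path, and the use of `simple_evaluate` are assumptions for illustration, not details taken from this PR.

```python
# Minimal sketch: Open LLM Leaderboard-style evaluation with lm-evaluation-harness.
# Assumptions: `pip install lm-eval` (v0.4+), a GPU-backed environment, and that
# MODEL_ID is replaced with the actual Hugging Face repo id of the checkpoint.
from lm_eval import simple_evaluate

MODEL_ID = "Llama-2-7b-pruned70-retrained-instruct-quant-ds"  # assumption: full hub id may differ

# (harness task, num_fewshot) pairs mirroring the shot counts in the updated table.
TASKS = [
    ("mmlu", 5),
    ("hellaswag", 10),
    ("winogrande", 5),
    ("arc_challenge", 25),
    ("truthfulqa_mc2", 0),
    ("gsm8k", 5),
]

for task, shots in TASKS:
    results = simple_evaluate(
        model="hf",
        model_args=f"pretrained={MODEL_ID},dtype=auto",
        tasks=[task],
        num_fewshot=shots,
        batch_size=8,
    )
    # Each entry holds the per-task metrics (accuracy, acc_norm, etc.).
    print(task, results["results"][task])
```

Running the same loop against the dense `Llama-2-7b-instruct` baseline gives the left-hand column of the table for comparison.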