---
license: llama2
datasets:
- meta-math/MetaMathQA
---

See our paper at https://arxiv.org/abs/2309.12284

View the project page: https://meta-math.github.io/

## Model Details

MetaMath-Llemma-7B is fully fine-tuned on the MetaMathQA datasets, with the powerful Llemma-7B as the base model. Keeping the MetaMathQA data but switching the base model from LLaMA-2-7B to Llemma-7B boosts MATH performance from 19.8 to **30.0**.

## Installation

```
pip install transformers==4.35.0
pip install torch==2.0.1
pip install sentencepiece==0.1.99
pip install tokenizers==0.13.3
pip install accelerate==0.21.0
pip install bitsandbytes==0.40.0
pip install vllm
pip install fraction
pip install protobuf
```

## Model Usage

Prompting template:

```
"Below is an instruction that describes a task. "
"Write a response that appropriately completes the request.\n\n"
"### Instruction:\n{instruction}\n\n### Response: Let's think step by step."
```

Replace `{instruction}` with your query question.
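
For example, here is a minimal inference sketch with `transformers`, assuming the model is loaded by its Hugging Face repository id (shown here as `meta-math/MetaMath-Llemma-7B`; adjust it to the checkpoint you are actually using):

```python
# Minimal sketch: fill the prompting template above and generate a step-by-step answer.
# The repo id below is an assumption; replace it with the checkpoint you actually use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-math/MetaMath-Llemma-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

question = "James has 3 boxes with 12 pencils each. How many pencils does he have in total?"
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    f"### Instruction:\n{question}\n\n### Response: Let's think step by step."
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=False)
# Print only the newly generated tokens, i.e. the model's answer.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Greedy decoding (`do_sample=False`) is a reasonable default for math questions; the same prompt string also works with vLLM if you prefer batched inference.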

## Experiments

| Model | GSM8k Pass@1 | MATH Pass@1 |
|---------------------|--------------|-------------|
| MPT-7B | 6.8 | 3.0 |
| Falcon-7B | 6.8 | 2.3 |
| LLaMA-1-7B | 11.0 | 2.9 |
| LLaMA-2-7B | 14.6 | 2.5 |
| MPT-30B | 15.2 | 3.1 |
| LLaMA-1-13B | 17.8 | 3.9 |
| GPT-Neo-2.7B | 19.5 | -- |
| Falcon-40B | 19.6 | 2.5 |
| Baichuan-chat-13B | 23.9 | -- |
| Vicuna-v1.3-13B | 27.6 | -- |
| LLaMA-2-13B | 28.7 | 3.9 |
| InternLM-7B | 31.2 | -- |
| ChatGLM-2-6B | 32.4 | -- |
| GPT-J-6B | 34.9 | -- |
| LLaMA-1-33B | 35.6 | 3.9 |
| LLaMA-2-34B | 42.2 | 6.24 |
| RFT-7B | 50.3 | -- |
| LLaMA-1-65B | 50.9 | 10.6 |
| Qwen-7B | 51.6 | -- |
| WizardMath-7B | 54.9 | 10.7 |
| LLaMA-2-70B | 56.8 | 13.5 |
| WizardMath-13B | 63.9 | 14.0 |
| MAmmoTH-7B (COT) | 50.5 | 10.4 |
| MAmmoTH-7B (POT+COT) | 53.6 | 31.5 |
| Arithmo-Mistral-7B | 74.7 | 25.3 |
| MetaMath-7B | 66.5 | 19.8 |
| MetaMath-13B | 72.3 | 22.4 |
| 🔥 **MetaMath-Llemma-7B** | **69.2** | **30.0** |
| 🔥 **MetaMath-Mistral-7B** | **77.7** | **28.2** |

## Citation

```bibtex
@article{yu2023metamath,