---
base_model: AI-MO/NuminaMath-72B-CoT
tags:
- alignment-handbook
- generated_from_trainer
- math
- aimo
widget:
- example_title: Math problem
  messages:
  - role: user
    content: >-
      For how many values of the constant $k$ will the polynomial $x^{2}+kx+36$
      have two distinct integer roots?
  output:
    text: >-
      ### Solution: 

      1- For the polynomial \\( x^2 + kx + 36 \\) to have two distinct integer
      roots, let's denote these roots by \\( r_1 \\) and \\( r_2 \\).



      2- According to Vieta's formulas, the sum of the roots \\( r_1 + r_2 \\)
      is equal to \\(-k\\), and the product of the roots \\( r_1 \\cdot r_2 \\)
      is equal to 36.



      3- To find the distinct integer pairs \\((r_1, r_2)\\) whose product is
      36, we need to determine all pairs of integers \\((r_1, r_2)\\) such that
      \\( r_1 \\cdot r_2 = 36 \\) and \\( r_1 \\neq r_2 \\).



      4- From the equation \\( r_1 + r_2 = -k \\), for each valid pair \\((r_1,
      r_2)\\), we can compute the corresponding value of \\( k \\).



      5- Finally, since we need the polynomial to have two distinct integer
      roots, we need to ensure that \\( r_1 \\) and \\( r_2 \\) are distinct.

      Let's start by finding all pairs \\((r_1, r_2)\\) such that \\( r_1 \\cdot
      r_2 = 36 \\). We'll then determine the values of \\( k \\) and ensure the
      roots are distinct.

      ```python
        import itertools

        # Find all pairs (r1, r2) such that r1 * r2 = 36
        product_36 = 36
        factor_pairs = []

        for i in range(1, product_36 + 1):
            if product_36 % i == 0:
                pair = (i, product_36 // i)
                if pair[0] != pair[1]:  # Ensure distinct pairs
                    factor_pairs.append(pair)

        # Calculate k for each pair and ensure distinct integer roots
        valid_k_values = set()
        for r1, r2 in factor_pairs:
            if r1 != r2:
                k = -(r1 + r2)
                valid_k_values.add(k)

        print((len(valid_k_values), sorted(valid_k_values)))
      ```

      ```output
        (4, [-37, -20, -15, -13])
      ```

      The distinct integer values of \\( k \\) that make the polynomial
      \\( x^2 + kx + 36 \\) have two distinct integer roots are
      \\(-37, -20, -15,\\) and \\(-13\\).

      Therefore, the number of such values of \\( k \\) is:

      \\[ \\boxed{4} \\]
pipeline_tag: text-generation
model-index:
- name: NuminaMath-72B-TIR
  results: []
license: other
license_name: tongyi-qianwen
datasets:
- AI-MO/NuminaMath-TIR
language:
- en
---


<img src="https://huggingface.co/AI-MO/NuminaMath-7B-TIR/resolve/main/thumbnail.png" alt="Numina Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>


# Model Card for NuminaMath 72B TIR

NuminaMath is a series of language models that are trained with two stages of supervised fine-tuning to solve math problems using chain of thought (CoT) and tool-integrated reasoning (TIR):

* **Stage 1:** fine-tune the base model on a large, diverse dataset of natural language math problems and solutions, where each solution is templated with Chain of Thought (CoT) to facilitate reasoning. 
* **Stage 2:** fine-tune the model from Stage 1 on a synthetic dataset of tool-integrated reasoning, where each math problem is decomposed into a sequence of rationales, Python programs, and their outputs.
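
Concretely, a Stage 2 training example interleaves a natural-language rationale, a fenced Python block, and its output within the assistant turn. The sketch below is an assumed illustration of that shape only — the field names and the toy problem are invented here; see the [AI-MO/NuminaMath-TIR](https://huggingface.co/datasets/AI-MO/NuminaMath-TIR) dataset for the actual schema:

```python
# Hypothetical sketch of a Stage 2 (TIR) training example.
# Field names and delimiters are assumptions, not the dataset's real schema.
example = {
    "messages": [
        {"role": "user",
         "content": "What is the sum of the first 100 positive integers?"},
        {"role": "assistant",
         "content": (
             "The sum of the first n positive integers is n(n+1)/2.\n"
             "```python\nn = 100\nprint(n * (n + 1) // 2)\n```\n"
             "```output\n5050\n```\n"
             "The answer is \\boxed{5050}."
         )},
    ]
}

assistant = example["messages"][1]["content"]
```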

## Model description

- **Model type:** A 72B parameter math LLM fine-tuned on a dataset with 860k+ math problem-solution pairs.
- **Language(s) (NLP):** Primarily English
- **License:** Tongyi Qianwen
- **Finetuned from model:** [Qwen/Qwen2-72B](https://huggingface.co/Qwen/Qwen2-72B)

## Model performance

| Benchmark | Setting | NuminaMath-72B-CoT | NuminaMath-72B-TIR | Qwen2-72B-Instruct | Llama3-70B-Instruct | Claude-3.5-Sonnet | GPT-4o-0513 |
| --- | --- | :---: | :---: | :---: | :---: | :---: | :---: |
| **GSM8k**<br>Grade school math | 0-shot | 91.4% | 91.5% | 91.1% | 93.0% | **96.4%** | 95.8% |
| **MATH**<br>Math problem-solving | 0-shot | 68.0% | 75.8% | 59.7% | 50.4% | 71.1% | **76.6%** |
| **AMC 2023**<br>Competition-level math | 0-shot | 21/40 | **24/40** | 19/40 | 13/40 | 17/40 | 20/40 |
| | maj@64 | 24/40 | **34/40** | 21/40 | 13/40 | - | - |
| **AIME 2024**<br>Competition-level math | 0-shot | 1/30 | **5/30** | 3/30 | 0/30 | 2/30 | 2/30 |
| | maj@64 | 3/30 | **12/30** | 4/30 | 2/30 | - | - |

*Table: Comparison of various open weight and proprietary language models on different math benchmarks. All scores except those for NuminaMath-72B-TIR are reported without tool-integrated reasoning.*

### Model Sources


- **Repository:** https://github.com/project-numina/aimo-progress-prize

## Intended uses & limitations

Here's how you can run the model using the `pipeline()` function from 🤗 Transformers:

```python
import re
import torch
from transformers import pipeline

pipe = pipeline("text-generation", model="AI-MO/NuminaMath-72B-TIR", torch_dtype=torch.bfloat16, device_map="auto")

messages = [
    {"role": "user", "content": "For how many values of the constant $k$ will the polynomial $x^{2}+kx+36$ have two distinct integer roots?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

gen_config = {
    "max_new_tokens": 1024,
    "do_sample": False,
    "stop_strings": ["```output"], # Generate until Python code block is complete
    "tokenizer": pipe.tokenizer,
}

outputs = pipe(prompt, **gen_config)
text = outputs[0]["generated_text"]
print(text)

# WARNING: This code will execute the Python code in the string. We show this for educational purposes only.
# Please refer to our full pipeline for a safer way to execute code.
python_code = re.findall(r"```python(.*?)```", text, re.DOTALL)[0]
exec(python_code)
```

The above executes a single step of Python code. For more complex problems, you will want to run the generation and execution logic for several steps to obtain the final solution.
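
A multi-step driver can be sketched as follows. The `generate` callable below is a stub standing in for the real `pipe(...)` call (loading the 72B model is impractical here), and the loop structure is our assumption about how generation and execution alternate, not the project's official pipeline; the same `exec` safety warning applies:

```python
import contextlib
import io
import re

def run_python_block(text):
    """Extract the last ```python block from text and execute it, returning stdout."""
    code = re.findall(r"```python(.*?)```", text, re.DOTALL)[-1]
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, {})  # WARNING: executes model-written code; sandbox this in practice
    return buf.getvalue()

def tir_loop(generate, prompt, max_steps=4):
    """Alternate generation and code execution until the model stops emitting code."""
    text = prompt
    for _ in range(max_steps):
        completion = generate(text)  # real code: pipe(text, **gen_config)
        text += completion
        if not completion.rstrip().endswith("```output"):
            return text  # no more code to run: the model produced a final answer
        result = run_python_block(text)
        text += f"\n{result}```\n"  # feed the program output back to the model
    return text

# Stubbed generator standing in for the 72B pipeline, for illustration only.
steps = iter([
    "Compute 6 * 7 with Python.\n```python\nprint(6 * 7)\n```output",
    "The answer is \\boxed{42}.",
])
transcript = tir_loop(lambda _: next(steps), "What is 6 * 7?\n")
```

The stop string `"```output"` is what signals that a code block needs executing; each execution result is appended to the transcript so the next generation step can condition on it.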

## Bias, Risks, and Limitations


NuminaMath 72B TIR was created to solve problems in the narrow domain of competition-level mathematics. As a result, the model should not be used for general chat applications. With greedy decoding, we find the model is capable of solving problems at the level of [AMC 12](https://artofproblemsolving.com/wiki/index.php/2023_AMC_12A_Problems), but often struggles to generate a valid solution on harder problems at the AIME and Math Olympiad level. The model also struggles to solve geometry problems, likely due to its limited capacity and lack of other modalities like vision.


## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 32
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 4
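
The effective batch size is the product of the per-device batch size, the number of devices, and the gradient accumulation steps (assumed here to be 1, since none is listed):

```python
# Sanity check on the listed hyperparameters.
train_batch_size = 1             # per device
num_devices = 32
gradient_accumulation_steps = 1  # assumed; not listed above

total_train_batch_size = train_batch_size * num_devices * gradient_accumulation_steps
print(total_train_batch_size)  # → 32, matching total_train_batch_size above
```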


### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.1

## Citation

If you find NuminaMath 72B TIR useful in your work, please cite it with:

```
@misc{numina_math_7b,
  author = {Edward Beeching and Shengyi Costa Huang and Albert Jiang and Jia Li and Benjamin Lipkin and Zihan Qina and Kashif Rasul and Ziju Shen and Roman Soletskyi and Lewis Tunstall},
  title = {NuminaMath 7B TIR},
  year = {2024},
  publisher = {Numina & Hugging Face},
  journal = {Hugging Face repository},
  howpublished = {\url{https://huggingface.co/AI-MO/NuminaMath-7B-TIR}}
}
```