---
base_model: deepseek-ai/deepseek-math-7b-base
tags:
  - alignment-handbook
  - generated_from_trainer
datasets:
  - AI-MO/tora-chosen-v0.7
widget:
  - example_title: Math problem
    messages:
      - role: user
        content: >-
          For how many values of the constant $k$ will the polynomial
          $x^{2}+kx+36$ have two distinct integer roots?
    output:
      text: >-
        ### Solution: 1. For the polynomial \\( x^2 + kx + 36 \\) to have two
        distinct integer roots, let's denote these roots by \\( r_1 \\) and \\(
        r_2 \\).\n\n2. According to Vieta's formulas, the sum of the roots \\(
        r_1 + r_2 \\) is equal to \\(-k\\), and the product of the roots \\( r_1
        \\cdot r_2 \\) is equal to 36.\n\n3. To find the distinct integer pairs
        \\((r_1, r_2)\\) whose product is 36, we need to determine all pairs of
        integers \\((r_1, r_2)\\) such that \\( r_1 \\cdot r_2 = 36 \\) and \\(
        r_1 \\neq r_2 \\).\n\n4. From the equation \\( r_1 + r_2 = -k \\), for
        each valid pair \\((r_1, r_2)\\), we can compute the corresponding value
        of \\( k \\).\n\n5. Finally, since we need the polynomial to have two
        distinct integer roots, we need to ensure that \\( r_1 \\) and \\( r_2
        \\) are distinct.\n\nLet's start by finding all pairs \\((r_1, r_2)\\)
        such that \\( r_1 \\cdot r_2 = 36 \\). We'll then determine the values
        of \\( k \\) and ensure the roots are distinct.\n```python\nimport
        itertools\n\n# Find all pairs (r1, r2) such that r1 * r2 =
        36\nproduct_36 = 36\nfactor_pairs = []\n\nfor i in range(1, product_36 +
        1):\n if product_36 % i == 0:\n pair = (i, product_36 // i)\n if pair[0]
        != pair[1]: # Ensure distinct pairs\n factor_pairs.append(pair)\n\n#
        Calculate k for each pair and ensure distinct integer
        roots\nvalid_k_values = set()\nfor r1, r2 in factor_pairs:\n if r1 !=
        r2:\n k = -(r1 + r2)\n
        valid_k_values.add(k)\n\nprint((len(valid_k_values),
        sorted(valid_k_values)))\n```\n```output\n(4, [-37, -20, -15,
        -13])\n```\nThe distinct integer values of \\( k \\) that make the
        polynomial \\( x^2 + kx + 36 \\) have two distinct integer roots are
        \\(-37, -20, -15, \\text{and} -13\\).\n\nTherefore, the number of such
        values of \\( k \\) is:\n\n\\[\n\\boxed{4}\n\\]\n
pipeline_tag: text-generation
model-index:
  - name: Numina-Math-7B
    results: []
---

Model Card for Numina-Math-7B

Numina-Math is a series of language models trained to solve math problems using tool-integrated reasoning (TIR). Numina-Math-7B won the first progress prize of the AI Math Olympiad (AIMO), with a score of 29/50 on the public and private test sets. The model is a fine-tuned version of deepseek-ai/deepseek-math-7b-base, trained in two stages: first on math question-answer data, then on multi-step synthetic generations that use tool-integrated reasoning.

Model description

  • Model type: A 7B-parameter math model fine-tuned in two stages: first on math question-answer data, then on multi-step synthetic generations using tool-integrated reasoning.
  • Language(s) (NLP): Primarily English
  • License: MIT
  • Finetuned from model: deepseek-ai/deepseek-math-7b-base

Model Sources

Intended uses & limitations

Here's how you can run the model using the pipeline() function from 🤗 Transformers:

# Install 🤗 Transformers and Accelerate; the versions used during training are listed under "Framework versions" below.
# pip install transformers accelerate

import torch
from transformers import pipeline

pipe = pipeline("text-generation", model="AI-MO/NuminaMath-7B-TIR", torch_dtype=torch.bfloat16, device_map="auto")

# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
    {
        "role": "user",
        "content": "For how many values of the constant $k$ will the polynomial $x^{2}+kx+36$ have two distinct integer roots?",
    },
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=1024, do_sample=False)
print(outputs[0]["generated_text"])
# The model responds with step-by-step reasoning that interleaves natural-language
# steps with ```python code blocks, and it closes with the final answer in \boxed{...}
# (see the widget example in the metadata above for a full transcript).
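Because the model solves problems with tool-integrated reasoning, a complete answer generally requires executing the Python block it writes and feeding the printed result back into the context. The snippet below is a minimal sketch of that loop under stated assumptions, not the reference inference code: the regex used to pull out the code block is illustrative, and exec() is used only for brevity (a sandboxed interpreter should be used in practice).

import io
import re
import contextlib
import torch
from transformers import pipeline

pipe = pipeline("text-generation", model="AI-MO/NuminaMath-7B-TIR", torch_dtype=torch.bfloat16, device_map="auto")

problem = "For how many values of the constant $k$ will the polynomial $x^{2}+kx+36$ have two distinct integer roots?"
prompt = pipe.tokenizer.apply_chat_template(
    [{"role": "user", "content": problem}], tokenize=False, add_generation_prompt=True
)

# First pass: let the model reason and write a ```python block.
step1 = pipe(prompt, max_new_tokens=1024, do_sample=False, return_full_text=False)[0]["generated_text"]

# Keep only the text up to the point where the model expects a tool result,
# then extract the first ```python ... ``` block it wrote.
head = step1.split("```output")[0]
code = re.findall(r"```python\n(.*?)```", head, re.DOTALL)[0]

# Run the generated code and capture its stdout (unsafe outside a sandbox).
buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    exec(code, {})
tool_output = buffer.getvalue().strip()

# Second pass: append the tool output and let the model finish with the boxed answer.
continuation = prompt + head + "```output\n" + tool_output + "\n```\n"
step2 = pipe(continuation, max_new_tokens=512, do_sample=False, return_full_text=False)[0]["generated_text"]
print(head + "```output\n" + tool_output + "\n```\n" + step2)

In this sketch the second pass simply continues the raw text rather than re-applying the chat template; the project's own inference pipeline may handle the code-execution round trip differently.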

Bias, Risks, and Limitations

Numina-Math-7B was created to solve math problems; the model has not been aligned to preferences beyond the domain of solving math and should not be used in a general chat setting.

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 8
  • total_train_batch_size: 32
  • total_eval_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 4.0
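For readers reconstructing a comparable run with the 🤗 Trainer, the values above map roughly onto a TrainingArguments object as follows. This is a hedged sketch rather than the project's actual training configuration: the output directory, optimizer string, and bf16 flag are assumptions, while the numeric values mirror the list above (4 examples per device across 8 GPUs gives the total train batch size of 32).

from transformers import TrainingArguments

# Illustrative mapping of the listed hyperparameters; values marked "assumed" are not from the card.
training_args = TrainingArguments(
    output_dir="numina-math-7b-sft",      # assumed name
    learning_rate=2e-05,
    per_device_train_batch_size=4,        # 4 per device x 8 GPUs = 32 total
    per_device_eval_batch_size=8,         # 8 per device x 8 GPUs = 64 total
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=4.0,
    optim="adamw_torch",                  # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default
    bf16=True,                            # assumed, consistent with the bfloat16 inference example above
)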

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 0.4295        | 1.0   | 1733 | 0.4313          |
| 0.3638        | 2.0   | 3466 | 0.4332          |
| 0.2951        | 3.0   | 5199 | 0.4704          |
| 0.2225        | 4.0   | 6932 | 0.5302          |

Framework versions

  • Transformers 4.40.1
  • Pytorch 2.3.1
  • Datasets 2.18.0
  • Tokenizers 0.19.1
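To reproduce this environment, the versions listed above can be pinned directly; the exact CUDA build of PyTorch is not recorded in the card, so the plain PyPI wheels below are an assumption.

# pip install transformers==4.40.1 torch==2.3.1 datasets==2.18.0 tokenizers==0.19.1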

Citation

If you find Numina-Math useful in your work, please cite it with:

@misc{beeching2024numina-math,
  title        = {Numina Math},
  author       = {Edward Beeching and Lewis Tunstall and Roman Soletskyi and Kashif Rasul and Shengyi Huang and Jia Li},
  year         = {2024},
  publisher    = {Hugging Face},
  journal      = {Hugging Face repository},
  howpublished = {\url{https://huggingface.co/AI-MO/Numina-Math-7B}}
}