MathCoder2
Collection • 6 items
The MathCoder2 models are created by conducting continued pretraining on MathCode-Pile. They are introduced in the paper MathCoder2: Better Math Reasoning from Continued Pretraining on Model-translated Mathematical Code.
The pretraining dataset pairs mathematical code with natural language reasoning steps, making it well suited for training models on advanced mathematical reasoning tasks.
Base model: meta-llama/Meta-Llama-3-8B