MetaMath-Cybertron

This model is a merge of fblgit/una-cybertron-7b-v2-bf16 and meta-math/MetaMath-Mistral-7B, produced with the SLERP (spherical linear interpolation) merge method.
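For intuition, SLERP interpolates each pair of weight tensors along the arc between them rather than along a straight line. The sketch below shows the per-tensor interpolation step only; the interpolation factor and the fallback threshold are illustrative and not the exact settings used for this merge.

```python
import torch


def slerp(w_a: torch.Tensor, w_b: torch.Tensor, t: float = 0.5, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Flattens both tensors, interpolates along the arc between them, and
    falls back to plain linear interpolation when they are nearly colinear.
    """
    a = w_a.flatten().float()
    b = w_b.flatten().float()

    # Angle between the two (normalized) weight vectors.
    dot = torch.clamp(
        torch.dot(a / (a.norm() + eps), b / (b.norm() + eps)), -1.0, 1.0
    )
    omega = torch.acos(dot)

    if omega.abs() < 1e-4:
        # Nearly parallel vectors: lerp is numerically safer.
        merged = (1.0 - t) * a + t * b
    else:
        sin_omega = torch.sin(omega)
        merged = (torch.sin((1.0 - t) * omega) / sin_omega) * a \
               + (torch.sin(t * omega) / sin_omega) * b

    return merged.reshape(w_a.shape).to(w_a.dtype)
```

In practice a merging tool applies this tensor by tensor across both checkpoints; t = 0.5 gives an even blend of the two parents.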

The model uses the ChatML prompt format.
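A minimal usage sketch with transformers, assuming the standard ChatML special tokens (`<|im_start|>` / `<|im_end|>`); the system prompt and generation parameters are illustrative:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Q-bert/MetaMath-Cybertron"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Build a ChatML-formatted prompt.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWhat is 12 * 7 + 5?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```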

Open LLM Leaderboard Evaluation Results

Detailed results: coming soon.

| Metric | Value |
|---|---|
| Avg. | Coming soon |
| ARC (25-shot) | Coming soon |
| HellaSwag (10-shot) | Coming soon |
| MMLU (5-shot) | Coming soon |
| TruthfulQA (0-shot) | Coming soon |
| Winogrande (5-shot) | Coming soon |
| GSM8K (5-shot) | Coming soon |