FusionNet_34Bx2_MoE_v0.1

A fine-tuned English-language model built with the Mixture of Experts (MoE) method. This is an improved version of FusionNet_34Bx2_MoE.

Model description

FusionNet_34Bx2_MoE_v0.1 is an experiment with the Mixture of Experts (MoE) method, which can significantly improve the performance of the original model. The model has 60.8B parameters and has been fine-tuned. Enjoy!
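
Below is a minimal usage sketch with the Hugging Face transformers library. The repository id, prompt, and memory note are illustrative assumptions rather than details taken from this card; loading a 60.8B-parameter checkpoint in BF16 needs on the order of 120 GB of accelerator memory, so the sketch shards the weights across available devices.

```python
# Illustrative only: the repository path below is an assumption and may differ
# from the actual Hub location of FusionNet_34Bx2_MoE_v0.1.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TomGrc/FusionNet_34Bx2_MoE_v0.1"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the card lists BF16 tensors
    device_map="auto",           # shard the 60.8B parameters across devices
)

prompt = "Explain the Mixture of Experts (MoE) method in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```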

Open LLM Leaderboard Evaluation Results

Detailed results can be found here.

| Metric                            | Value |
|-----------------------------------|-------|
| Avg.                              | 77.38 |
| AI2 Reasoning Challenge (25-shot) | 73.72 |
| HellaSwag (10-shot)               | 86.46 |
| MMLU (5-shot)                     | 76.72 |
| TruthfulQA (0-shot)               | 71.01 |
| Winogrande (5-shot)               | 83.35 |
| GSM8K (5-shot)                    | 73.01 |