---
license: mit
language:
  - en
pipeline_tag: text-generation
tags:
  - moe
---

# FusionNet_7Bx2_MoE_v0.1

An English-language model fine-tuned using the Mixture of Experts (MoE) method. This is an improved version of FusionNet_7Bx2_MoE_14B.

## Model description

FusionNet_7Bx2_MoE_v0.1 is an experiment with the MoE method, which can significantly improve on the performance of the original model. FusionNet has 12.9B parameters and is fine-tuned. Enjoy!
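As a minimal sketch, the model can be loaded for text generation with the Hugging Face `transformers` library. The repository id `TomGrc/FusionNet_7Bx2_MoE_v0.1` is an assumption inferred from the author and model names; check the Hub page for the exact id.

```python
# Hedged usage sketch: loads the model from the Hub and generates text.
# The repo id below is an assumption; verify it on the Hub before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "TomGrc/FusionNet_7Bx2_MoE_v0.1"  # assumed repository id


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Return a greedy-decoded completion of `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("The Mixture of Experts method works by"))
```

Note that a 12.9B-parameter model needs substantial GPU memory at full precision; passing a quantization config or `torch_dtype` to `from_pretrained` is a common way to reduce the footprint.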