OpenMoE
A family of open-sourced Mixture-of-Experts (MoE) Large Language Models
Please see this link for detailed information.
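For readers unfamiliar with the Mixture-of-Experts idea behind OpenMoE, the sketch below shows the general pattern: a learned router picks a small number of experts per token and combines their outputs with the router's weights. This is a minimal, generic illustration only, not OpenMoE's actual implementation; the class and parameter names (SimpleMoE, num_experts, top_k) are assumptions for the example.

```python
# Minimal sketch of a Mixture-of-Experts layer (illustrative only; not OpenMoE's code).
# A learned gate routes each token to its top-k experts, and the experts' outputs
# are combined with the gate's softmax weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(d_model, num_experts)  # router
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, d_model)
        scores = self.gate(x)                               # (num_tokens, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, k] == e                   # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

if __name__ == "__main__":
    moe = SimpleMoE(d_model=64, d_hidden=256)
    tokens = torch.randn(10, 64)
    print(moe(tokens).shape)  # torch.Size([10, 64])
```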