---
license: bigcode-openrail-m
---

# OpenMoE

A family of open-source Mixture-of-Experts (MoE) large language models.
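For readers new to the idea, the sketch below shows the core mechanism behind an MoE layer: a learned router sends each token to a small subset of expert feed-forward networks, so only a fraction of the parameters are active per token. This is a generic illustrative top-2 routing layer, not OpenMoE's actual implementation; all class names, dimensions, and hyperparameters here are made up for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Illustrative top-k Mixture-of-Experts feed-forward layer (not OpenMoE's code)."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int, k: int = 2):
        super().__init__()
        self.k = k
        # Router assigns each token a score per expert.
        self.router = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is an independent feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model)
        logits = self.router(x)                         # (tokens, num_experts)
        weights, indices = logits.topk(self.k, dim=-1)  # keep top-k experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over the selected experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e            # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

# Usage: 8 tokens of width 16 routed through 4 experts, top-2 per token.
moe = TopKMoE(d_model=16, d_ff=32, num_experts=4, k=2)
print(moe(torch.randn(8, 16)).shape)  # torch.Size([8, 16])
```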

Please see this link for detailed information.