---
license: bigcode-openrail-m
---

# OpenMoE

A family of open-source Mixture-of-Experts (MoE) large language models.

Please see this link for detailed information about the project.