🐣Ai2 Releasing OLMoE!
OLMoE-1B-7B-Instruct is a Mixture-of-Experts LLM with 1B active and 7B total parameters. OLMoE is 100% open source: model, codebase, and datasets!
🦖Paper: https://arxiv.org/abs/2409.02060
🤗Model: allenai/OLMoE-1B-7B-0924-Instruct
💾Datasets: allenai/OLMoE-mix-0924