This model is an intermediate training checkpoint during post-training, after the Supervised Fine-Tuning (SFT) step. For best performance, we recommend you use the OLMoE-Instruct version.
Branches:
- main: Instruction tuned / supervised finetuned (SFT) model of https://hf.co/allenai/OLMoE-1B-7B-0924 (main branch)
- load-balancing: Ablation with load balancing loss during SFT
- non-annealed: Ablation starting from the checkpoint prior to annealing (branch step1200000-tokens5033B of https://hf.co/allenai/OLMoE-1B-7B-0924) rather than the annealed checkpoint (branch main of https://hf.co/allenai/OLMoE-1B-7B-0924)
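Each branch can be loaded by passing its name as the revision argument in transformers. The snippet below is a minimal sketch, assuming this SFT checkpoint is hosted as allenai/OLMoE-1B-7B-0924-SFT (substitute this model's actual repository id) and a transformers version recent enough to include the OLMoE architecture.

```python
# Sketch: load a specific branch (revision) of this SFT checkpoint.
# Assumption: the repository id is allenai/OLMoE-1B-7B-0924-SFT; adjust if needed.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "allenai/OLMoE-1B-7B-0924-SFT"
revision = "main"  # or "load-balancing" / "non-annealed" for the ablations

tokenizer = AutoTokenizer.from_pretrained(repo_id, revision=revision)
model = AutoModelForCausalLM.from_pretrained(repo_id, revision=revision)

# Quick generation check on the selected branch.
inputs = tokenizer("What is a mixture-of-experts language model?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```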
Citation:
@misc{muennighoff2024olmoeopenmixtureofexpertslanguage,
title={OLMoE: Open Mixture-of-Experts Language Models},
author={Niklas Muennighoff and Luca Soldaini and Dirk Groeneveld and Kyle Lo and Jacob Morrison and Sewon Min and Weijia Shi and Pete Walsh and Oyvind Tafjord and Nathan Lambert and Yuling Gu and Shane Arora and Akshita Bhagia and Dustin Schwenk and David Wadden and Alexander Wettig and Binyuan Hui and Tim Dettmers and Douwe Kiela and Ali Farhadi and Noah A. Smith and Pang Wei Koh and Amanpreet Singh and Hannaneh Hajishirzi},
year={2024},
eprint={2409.02060},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2409.02060},
}
Base model: allenai/OLMoE-1B-7B-0924