Tags: Safetensors · English · olmoe · Mixture of Experts · olmo
OLMoE Logo.

Model Summary

OLMoE-1B-7B-0924-SFT is a Mixture-of-Experts language model with roughly 1B active and 7B total parameters. This model is an intermediate post-training checkpoint, taken after the Supervised Fine-Tuning (SFT) step. For best performance, we recommend using the OLMoE-Instruct version instead.
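
A minimal usage sketch (not part of the original card): it assumes a `transformers` release recent enough to include OLMoE support, plus `torch` and `accelerate`; the model ID comes from this card, while the prompt and generation settings are illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "allenai/OLMoE-1B-7B-0924-SFT"

# Load tokenizer and model weights from the Hub.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # matches the BF16 tensor type noted below
    device_map="auto",           # requires `accelerate`; drop for CPU-only use
)

# Illustrative prompt and greedy generation.
prompt = "Explain what a mixture-of-experts language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```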

Branches:

Citation

@misc{muennighoff2024olmoeopenmixtureofexpertslanguage,
      title={OLMoE: Open Mixture-of-Experts Language Models}, 
      author={Niklas Muennighoff and Luca Soldaini and Dirk Groeneveld and Kyle Lo and Jacob Morrison and Sewon Min and Weijia Shi and Pete Walsh and Oyvind Tafjord and Nathan Lambert and Yuling Gu and Shane Arora and Akshita Bhagia and Dustin Schwenk and David Wadden and Alexander Wettig and Binyuan Hui and Tim Dettmers and Douwe Kiela and Ali Farhadi and Noah A. Smith and Pang Wei Koh and Amanpreet Singh and Hannaneh Hajishirzi},
      year={2024},
      eprint={2409.02060},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2409.02060}, 
}
Model size: 6.92B params · Tensor type: BF16 · Format: Safetensors

Model tree for allenai/OLMoE-1B-7B-0924-SFT

Finetuned: 3 models (including this one)
Finetunes: 2 models
Quantizations: 1 model
