This OLMo version was used to develop Molmo-O-7B. It was trained on the OLMoE-Mix and uses the Dolma 2 tokenizer.
This model is not intended to be used as-is; it is provided as a research artifact to facilitate reproduction of and research on Molmo. Details about this model family will be presented in an upcoming OLMo manuscript.
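
For reproduction purposes, the checkpoint can be loaded with Hugging Face `transformers` as in the minimal sketch below. The repository ID shown is a placeholder, not the actual model ID for this artifact.

```python
# Minimal sketch of loading the checkpoint with transformers.
# The repository ID is a placeholder; substitute the actual model ID.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-example"  # placeholder repository ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to sanity-check the checkpoint.
inputs = tokenizer("Language modeling is ", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```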