This OLMo version was used to develop Molmo-O-7B. It was trained on the OLMoE-Mix and uses the Dolma 2 tokenizer.

This model is not intended to be used as-is; it is provided as a research artifact to facilitate reproduction of and research on Molmo. Details about this model family will be presented in an upcoming OLMo manuscript.
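For reference, a minimal loading sketch is shown below. It assumes the checkpoint is compatible with the standard OLMo support in `transformers` (an assumption; the card does not specify a library, and older OLMo checkpoints required `trust_remote_code` via the `hf_olmo` package instead).

```python
# Minimal sketch, assuming the checkpoint loads with the standard
# transformers OLMo classes (not confirmed by the card itself).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "allenai/OLMo-7B-1024-preview"

tokenizer = AutoTokenizer.from_pretrained(repo)
# The published weights are stored in float32 (F32 safetensors).
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.float32)

inputs = tokenizer("Language modeling is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```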
