soldni committed
Commit 438d270
1 Parent(s): 089b5a5

Update README.md

Files changed (1): README.md (+35 −2)
@@ -14,14 +14,15 @@ tags:

 # Molmo 7B-D

- Molmo is an open vision-language model developed by the Allen Institute for AI. Molmo models are trained on PixMo, a dataset of 1 million highly curated image-text pairs. It has state-of-the-art performance among multimodal models of similar size while being fully open-source. You can find all models in the Molmo family [here](https://huggingface.co/collections/allenai/molmo-66f379e6fe3b8ef090a8ca19).
+ Molmo is a family of open vision-language models developed by the Allen Institute for AI. Molmo models are trained on PixMo, a dataset of 1 million highly curated image-text pairs. They achieve state-of-the-art performance among multimodal models of similar size while being fully open-source. You can find all models in the Molmo family [here](https://huggingface.co/collections/allenai/molmo-66f379e6fe3b8ef090a8ca19).
+ **Learn more** about the Molmo family [in our announcement blog post](https://molmo.allenai.org/blog).

 Molmo 7B-D is based on [Qwen2-7B](https://huggingface.co/Qwen/Qwen2-7B) and uses [OpenAI CLIP](https://huggingface.co/openai/clip-vit-large-patch14-336) as its vision backbone.
 It performs comfortably between GPT-4V and GPT-4o on both academic benchmarks and human evaluation.

 This checkpoint is a **preview** of the Molmo release. All artifacts used in creating Molmo (the PixMo dataset, training code, evaluations, and intermediate checkpoints) will be made available at a later date, furthering our commitment to open-source AI development and reproducibility.

- **[Sign up here](https://docs.google.com/forms/d/e/1FAIpQLSdML1MhNNBDsCHpgWG65Oydg2SjZzVasyqlP08nBrWjZp_c7A/viewform)** to be the first to know when artifacts are released.
+ [**Sign up here**](https://docs.google.com/forms/d/e/1FAIpQLSdML1MhNNBDsCHpgWG65Oydg2SjZzVasyqlP08nBrWjZp_c7A/viewform) to be the first to know when artifacts are released.

@@ -83,6 +84,38 @@ print(generated_text)

 # wooden deck. The deck's planks, which are a mix of light and dark brown with ...
 ```

+ ## Evaluations
+
+ | Model                       | Average Score on 11 Academic Benchmarks | Human Preference Elo Rating |
+ |-----------------------------|-----------------------------------------|-----------------------------|
+ | Molmo 72B                   | 81.2                                    | 1077                        |
+ | **Molmo 7B-D** (this model) | **77.3**                                | **1056**                    |
+ | Molmo 7B-O                  | 74.6                                    | 1051                        |
+ | MolmoE 1B                   | 68.6                                    | 1032                        |
+ | GPT-4o                      | 78.5                                    | 1079                        |
+ | GPT-4V                      | 71.1                                    | 1041                        |
+ | Gemini 1.5 Pro              | 78.3                                    | 1074                        |
+ | Gemini 1.5 Flash            | 75.1                                    | 1054                        |
+ | Claude 3.5 Sonnet           | 76.7                                    | 1069                        |
+ | Claude 3 Opus               | 66.4                                    | 971                         |
+ | Claude 3 Haiku              | 65.3                                    | 999                         |
+ | Qwen VL2 72B                | 79.4                                    | 1037                        |
+ | Qwen VL2 7B                 | 73.7                                    | 1025                        |
+ | Intern VL2 LLAMA 76B        | 77.1                                    | 1018                        |
+ | Intern VL2 8B               | 69.4                                    | 953                         |
+ | Pixtral 12B                 | 69.5                                    | 1016                        |
+ | Phi3.5-Vision 4B            | 59.7                                    | 982                         |
+ | PaliGemma 3B                | 50.0                                    | 937                         |
+ | LLAVA OneVision 72B         | 76.6                                    | 1051                        |
+ | LLAVA OneVision 7B          | 72.0                                    | 1024                        |
+ | Cambrian-1 34B              | 66.8                                    | 953                         |
+ | Cambrian-1 8B               | 63.4                                    | 952                         |
+ | xGen - MM - Interleave 4B   | 59.5                                    | 979                         |
+ | LLAVA-1.5 13B               | 43.9                                    | 960                         |
+ | LLAVA-1.5 7B                | 40.7                                    | 951                         |
+
+ *Benchmarks: AI2D test, ChartQA test, VQA v2.0 test, DocQA test, InfographicVQA test, TextVQA val, RealWorldQA, MMMU val, MathVista testmini, CountBenchQA, and Flickr Count (a new dataset we collected that is significantly harder than CountBenchQA).*
+
 ## License and Use

 This model is licensed under Apache 2.0. It is intended for research and educational use.
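The Elo ratings added in this diff come from pairwise human preference comparisons. As a rough intuition for how close the models are, a rating gap can be converted into an expected preference rate. This is a minimal sketch assuming the standard Elo expected-score formula; the card does not state which Elo variant was used for the human evaluation, so the exact numbers are illustrative only.

```python
def elo_expected_score(rating_a: float, rating_b: float) -> float:
    """Expected probability that A is preferred over B under the
    standard Elo model: E_A = 1 / (1 + 10 ** ((R_B - R_A) / 400))."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400))

# Ratings taken from the evaluations table above.
molmo_7b_d = 1056
gpt_4v = 1041
gpt_4o = 1079

# A 15-point lead over GPT-4V implies a preference rate just above 50%;
# a 23-point deficit to GPT-4o implies a rate just below 50%.
print(f"Molmo 7B-D vs GPT-4V: {elo_expected_score(molmo_7b_d, gpt_4v):.1%}")
print(f"Molmo 7B-D vs GPT-4o: {elo_expected_score(molmo_7b_d, gpt_4o):.1%}")
```

For ratings this close together, expected preference rates hover near 50%, which is consistent with the card's claim that Molmo 7B-D performs between GPT-4V and GPT-4o on human evaluation.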