Update README.md

README.md (CHANGED)

@@ -55,6 +55,14 @@ Multimodal large language models (MLLMs) play a pivotal role in advancing the qu
</div>

+## Model Checkpoint
+
+| Model Name | Checkpoint | Config |
+| :--------: | :--------: | :----: |
+| STAR-3B | [Link](#) | [Config](star/configs/STAR_Qwen2.5-VL-3B.json) |
+| STAR-7B | [Link](#) | [Config](star/configs/STAR_Qwen2.5-VL-7B.json) |
+| VQ Model | [Link](#) | - |

## Preparation
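
The config files referenced in the checkpoint table ship with the repo, while the checkpoint links are still placeholders (`#`). A minimal sketch for inspecting a config before the weights land, assuming only that the file is plain JSON; the repo's own scripts (e.g. `inference_edit.py` below) remain the authoritative loading path:

```python
# Inspect a released STAR config; assumes nothing beyond the standard library.
import json

with open("star/configs/STAR_Qwen2.5-VL-3B.json") as f:
    cfg = json.load(f)

# Field names depend on the release, so print them rather than guessing.
for key, value in cfg.items():
    print(f"{key}: {value}")
```
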
@@ -181,133 +189,6 @@ python3 inference_edit.py \
-## Performance
-
-### 1. Visual Understanding
-
-| Model | #LLM | MMB | MMStar | MathVista | SEED | MME-P | MMMU | OCRBench | POPE | DocVQA |
-| ----- | ---- | --- | ------ | --------- | ---- | ----- | ---- | -------- | ---- | ------ |
-| Janus-Pro | 7B | 79.2 | 87.4 | - | 72.1 | 1567.1 | 41.0 | - | - | - |
-| BLIP3-o | 8B | 83.5 | - | - | 77.5 | 1682.6 | 50.6 | - | - | - |
-| Show-o2 | 7B | 79.3 | 56.6 | - | 69.8 | 1620.0 | 48.9 | - | - | - |
-| MetaQuery-XL | 7B | 83.5 | - | - | 76.9 | 1685.2 | 58.6 | - | - | - |
-| Bagel | 14B | 85.0 | - | 73.1 | - | 1687.0 | 55.3 | - | - | - |
-| Ovis-U1 | 1.5B | 77.8 | - | 69.4 | - | - | 51.1 | 88.3 | - | - |
-| ILLUME+ | 3B | 80.8 | - | - | 73.3 | 1414.0 | 44.3 | 67.2 | 87.6 | 80.8 |
-| X-Omni | 7B | 74.8 | - | - | 74.1 | - | - | 70.4 | 89.3 | 88.6 |
-| **STAR-3B** | 3B | **80.1** | **55.8** | **62.3** | **74.0** | **1592.3** | **53.1** | **79.7** | **85.9** | **93.9** |
-| **STAR-7B** | 7B | **83.9** | **63.9** | **68.1** | **77.0** | **1690.1** | **58.6** | **86.4** | **86.6** | **95.7** |
-
-### 2. Text-to-Image Generation
-
-#### GenEval
-
-| Model | Single Obj. | Two Obj. | Counting | Colors | Position | Color Attr. | Overall |
-| ----- | ----------- | -------- | -------- | ------ | -------- | ----------- | ------- |
-| **Generation-Only Models** | | | | | | | |
-| SDXL | 0.98 | 0.74 | 0.39 | 0.85 | 0.15 | 0.23 | 0.55 |
-| DALL-E | 0.96 | 0.87 | 0.47 | 0.83 | 0.43 | 0.45 | 0.67 |
-| SD3-medium | 0.99 | 0.94 | 0.72 | 0.89 | 0.33 | 0.60 | 0.74 |
-| FLUX.1-dev | 0.98 | 0.93 | 0.75 | 0.93 | 0.68 | 0.65 | 0.82 |
-| OmniGen2 | 0.99 | 0.96 | 0.74 | 0.98 | 0.72 | 0.75 | 0.86 |
-| **Unified Models** | | | | | | | |
-| Emu3 | 0.99 | 0.81 | 0.42 | 0.80 | 0.49 | 0.45 | 0.66 |
-| ILLUME+ | 0.99 | 0.88 | 0.62 | 0.84 | 0.42 | 0.53 | 0.72 |
-| Janus-Pro | 0.99 | 0.89 | 0.59 | 0.90 | 0.79 | 0.66 | 0.80 |
-| MetaQuery | - | - | - | - | - | - | 0.80 |
-| BLIP3-o | - | - | - | - | - | - | 0.84 |
-| UniWorld-V1 | 0.99 | 0.93 | 0.81 | 0.89 | 0.74 | 0.71 | 0.84 |
-| Mogao | 1.00 | 0.97 | 0.83 | 0.93 | 0.84 | 0.80 | 0.89 |
-| BAGEL | 0.98 | 0.95 | 0.84 | 0.95 | 0.78 | 0.77 | 0.88 |
-| Show-o2 | 1.00 | 0.87 | 0.58 | 0.92 | 0.52 | 0.62 | 0.76 |
-| GPT-4o | 0.99 | 0.92 | 0.85 | 0.92 | 0.75 | 0.61 | 0.84 |
-| X-Omni | 0.98 | 0.95 | 0.75 | 0.91 | 0.71 | 0.68 | 0.83 |
-| Ovis-U1 | 0.98 | 0.98 | 0.90 | 0.92 | 0.79 | 0.75 | 0.89 |
-| **STAR-3B** | 0.98 | 0.87 | 0.85 | 0.91 | 0.79 | 0.76 | **0.86** |
-| **STAR-7B** | 0.98 | 0.94 | 0.90 | 0.92 | 0.91 | 0.80 | **0.91** |
-
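
A quick sanity check on aggregation: the Overall column above is consistent with the unweighted mean of the six category scores, e.g. for the STAR-7B row:

```python
# Reproduce the GenEval "Overall" value as the unweighted mean of the six
# category scores (values copied from the STAR-7B row of the table above).
scores = {
    "single_obj": 0.98, "two_obj": 0.94, "counting": 0.90,
    "colors": 0.92, "position": 0.91, "color_attr": 0.80,
}
overall = sum(scores.values()) / len(scores)
print(round(overall, 2))  # 0.91, matching the table
```
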
-#### DPG-Bench
-
-| Model | Global | Entity | Attr. | Relation | Other | Overall |
-| ----- | ------ | ------ | ----- | -------- | ----- | ------- |
-| **Generation-Only Models** | | | | | | |
-| SDXL | 83.27 | 82.43 | 80.91 | 86.76 | 80.41 | 74.65 |
-| DALL-E | 90.97 | 89.61 | 88.39 | 90.58 | 89.83 | 83.50 |
-| SD3-medium | 87.90 | 91.01 | 88.83 | 80.70 | 88.68 | 84.08 |
-| FLUX.1-dev | 82.10 | 89.50 | 88.70 | 91.10 | 89.40 | 84.00 |
-| OmniGen2 | 88.81 | 88.83 | 90.18 | 89.37 | 90.27 | 83.57 |
-| **Unified Models** | | | | | | |
-| Emu3 | 85.21 | 86.68 | 86.84 | 90.22 | 83.15 | 80.60 |
-| ILLUME+ | - | - | - | - | - | - |
-| Janus-Pro | 86.90 | 88.90 | 89.40 | 89.32 | 89.48 | 84.19 |
-| MetaQuery | - | - | - | - | - | 82.05 |
-| BLIP3-o | - | - | - | - | - | 81.60 |
-| UniWorld-V1 | 83.64 | 88.39 | 88.44 | 89.27 | 87.22 | 81.38 |
-| Mogao | 82.37 | 90.03 | 88.26 | 93.18 | 85.40 | 84.33 |
-| BAGEL | 88.94 | 90.37 | 91.29 | 90.82 | 88.67 | 85.07 |
-| Show-o2 | 89.00 | 91.78 | 89.96 | 91.81 | 91.64 | 86.14 |
-| GPT-4o | 82.27 | 91.27 | 87.67 | 93.85 | 88.71 | 86.23 |
-| X-Omni | 84.80 | 92.59 | 90.63 | 94.75 | 84.20 | 87.65 |
-| Ovis-U1 | 82.37 | 90.08 | 88.68 | 93.35 | 85.20 | 83.72 |
-| **STAR-3B** | 93.00 | 90.49 | 91.71 | 90.72 | 92.75 | **87.30** |
-| **STAR-7B** | 94.97 | 92.91 | 91.62 | 94.30 | 83.82 | **87.44** |
-
-#### WISE (World Knowledge Reasoning)
-
-| Model | Cultural | Time | Space | Biology | Physics | Chemistry | Overall |
-| ----- | -------- | ---- | ----- | ------- | ------- | --------- | ------- |
-| **Generation-Only Models** | | | | | | | |
-| SD-XL | 0.43 | 0.48 | 0.47 | 0.44 | 0.45 | 0.27 | 0.43 |
-| SD-3.5-large | 0.44 | 0.50 | 0.58 | 0.44 | 0.52 | 0.31 | 0.46 |
-| FLUX.1-dev | 0.48 | 0.58 | 0.62 | 0.42 | 0.51 | 0.35 | 0.50 |
-| **Unified Models** | | | | | | | |
-| Emu3 | 0.34 | 0.45 | 0.48 | 0.41 | 0.45 | 0.27 | 0.39 |
-| Janus-Pro-7B | 0.30 | 0.37 | 0.49 | 0.36 | 0.42 | 0.26 | 0.35 |
-| MetaQuery-XL | 0.56 | 0.55 | 0.62 | 0.49 | 0.63 | 0.41 | 0.55 |
-| BLIP3-o | - | - | - | - | - | - | 0.62 |
-| BAGEL | 0.76 | 0.69 | 0.75 | 0.65 | 0.75 | 0.58 | 0.70 |
-| GPT-4o | 0.94 | 0.64 | 0.98 | 0.93 | 0.98 | 0.95 | 0.89 |
-| **STAR-3B** | 0.58 | 0.54 | 0.48 | 0.49 | 0.51 | 0.54 | **0.52** |
-| **STAR-7B** | 0.61 | 0.67 | 0.61 | 0.74 | 0.69 | 0.66 | **0.66** |
-
-### 3. Image Editing
-
-#### MagicBrush
-
-| Model | L1 ↓ | CLIP-I ↑ | DINO ↑ |
-| ----- | ---- | -------- | ------ |
-| MagicBrush | 0.074 | 0.908 | 0.847 |
-| Instruct-Pix2Pix | 0.114 | 0.851 | 0.744 |
-| UltraEdit | 0.066 | 0.904 | 0.852 |
-| ICEdit | 0.060 | 0.928 | 0.853 |
-| OmniGen | 0.116 | 0.863 | 0.821 |
-| UniReal | 0.081 | 0.903 | 0.837 |
-| BAGEL | 0.074 | 0.914 | 0.827 |
-| **STAR-3B** | **0.056** | **0.934** | **0.857** |
-| **STAR-7B** | **0.060** | **0.931** | **0.853** |
-
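
For reference, a minimal sketch of how these edit-fidelity metrics are conventionally computed: L1 is the mean absolute pixel error against the ground-truth edit (lower is better), and CLIP-I is the cosine similarity of CLIP image embeddings (higher is better); DINO is analogous with DINO features. The CLIP variant below is illustrative, and the repo's own evaluation scripts may differ in resizing and normalization details:

```python
# Conventional MagicBrush-style metrics between an edited image and ground truth.
# Assumes `pip install torch transformers pillow`.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-large-patch14")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-large-patch14")

def l1_error(pred: Image.Image, gt: Image.Image) -> float:
    # Mean absolute pixel difference in [0, 1]; lower is better.
    pred = pred.convert("RGB").resize(gt.size)
    a = torch.tensor(list(pred.getdata()), dtype=torch.float32) / 255.0
    b = torch.tensor(list(gt.convert("RGB").getdata()), dtype=torch.float32) / 255.0
    return (a - b).abs().mean().item()

@torch.no_grad()
def clip_i(pred: Image.Image, gt: Image.Image) -> float:
    # Cosine similarity of CLIP image embeddings; higher is better.
    inputs = processor(images=[pred, gt], return_tensors="pt")
    feats = model.get_image_features(**inputs)
    feats = feats / feats.norm(dim=-1, keepdim=True)
    return (feats[0] @ feats[1]).item()
```
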
-#### ImgEdit-Bench
-
-| Model | Add | Adjust | Extract | Replace | Remove | Background | Style | Hybrid | Action | Overall |
-| ----- | --- | ------ | ------- | ------- | ------ | ---------- | ----- | ------ | ------ | ------- |
-| **Editing-Only Models** | | | | | | | | | | |
-| MagicBrush | 2.84 | 1.58 | 1.51 | 1.97 | 1.58 | 1.75 | 2.38 | 1.62 | 1.22 | 1.90 |
-| Instruct-Pix2Pix | 2.45 | 1.83 | 1.44 | 2.01 | 1.50 | 1.44 | 3.55 | 1.20 | 1.46 | 1.88 |
-| AnyEdit | 3.18 | 2.95 | 1.88 | 2.47 | 2.23 | 2.24 | 2.85 | 1.56 | 2.65 | 2.45 |
-| UltraEdit | 3.44 | 2.81 | 2.13 | 2.96 | 1.45 | 2.83 | 3.76 | 1.91 | 2.98 | 2.70 |
-| Step1X-Edit | 3.88 | 3.14 | 1.76 | 3.40 | 2.41 | 3.16 | 4.63 | 2.64 | 2.52 | 3.06 |
-| ICEdit | 3.58 | 3.39 | 1.73 | 3.15 | 2.93 | 3.08 | 3.84 | 2.04 | 3.68 | 3.05 |
-| **Unified Models** | | | | | | | | | | |
-| GPT-4o | 4.61 | 4.33 | 2.90 | 4.35 | 3.66 | 4.57 | 4.93 | 3.96 | 4.89 | 4.20 |
-| OmniGen | 3.47 | 3.04 | 1.71 | 2.94 | 2.43 | 3.21 | 4.19 | 2.24 | 3.38 | 2.96 |
-| BAGEL | 3.56 | 3.31 | 1.70 | 3.30 | 2.62 | 3.24 | 4.49 | 2.38 | 4.17 | 3.20 |
-| UniWorld-V1 | 3.82 | 3.64 | 2.27 | 3.47 | 3.24 | 2.99 | 4.21 | 2.96 | 2.74 | 3.26 |
-| OmniGen2 | 3.57 | 3.06 | 1.77 | 3.74 | 3.20 | 3.57 | 4.81 | 2.52 | 4.68 | 3.44 |
-| Ovis-U1 | 4.13 | 3.62 | 2.98 | 4.45 | 4.06 | 4.22 | 4.69 | 3.45 | 4.61 | 4.00 |
-| **STAR-3B** | **4.26** | **4.06** | **3.78** | **4.46** | **4.34** | **4.19** | **4.53** | **3.29** | **4.38** | **4.14** |
-| **STAR-7B** | **4.33** | **4.19** | **4.19** | **4.59** | **4.58** | **4.36** | **4.59** | **3.67** | **4.60** | **4.34** |

## ✒️ Citation