---
license: cc-by-2.0
dataset_info:
features:
- name: sample_id
dtype: int32
- name: task_instruction
dtype: string
- name: task_instance
struct:
- name: context
dtype: string
- name: images_path
sequence: string
- name: choice_list
sequence: string
- name: combined_1_images
sequence: string
- name: response
dtype: string
splits:
- name: ActionLocalization_test
num_bytes: 291199
num_examples: 200
- name: ActionLocalization_adv
num_bytes: 291199
num_examples: 200
- name: ActionPrediction_test
num_bytes: 255687
num_examples: 200
- name: ActionPrediction_adv
num_bytes: 255687
num_examples: 200
- name: ActionSequence_test
num_bytes: 262234
num_examples: 200
- name: ActionSequence_adv
num_bytes: 262234
num_examples: 200
- name: ALFRED_test
num_bytes: 112715
num_examples: 200
- name: ALFRED_adv
num_bytes: 112715
num_examples: 200
- name: CharacterOrder_test
num_bytes: 274821
num_examples: 200
- name: CharacterOrder_adv
num_bytes: 274821
num_examples: 200
- name: CLEVR_Change_test
num_bytes: 114792
num_examples: 200
- name: CLEVR_Change_adv
num_bytes: 114792
num_examples: 200
- name: CounterfactualInference_test
num_bytes: 129074
num_examples: 200
- name: CounterfactualInference_adv
num_bytes: 129074
num_examples: 200
- name: DocVQA_test
num_bytes: 76660
num_examples: 200
- name: DocVQA_adv
num_bytes: 76660
num_examples: 200
- name: EgocentricNavigation_test
num_bytes: 559193
num_examples: 200
- name: EgocentricNavigation_adv
num_bytes: 559193
num_examples: 200
- name: GPR1200_test
num_bytes: 579624
num_examples: 600
- name: IEdit_test
num_bytes: 50907
num_examples: 200
- name: IEdit_adv
num_bytes: 50907
num_examples: 200
- name: ImageNeedleInAHaystack_test
num_bytes: 303423
num_examples: 320
- name: MMCoQA_test
num_bytes: 344623
num_examples: 200
- name: MMCoQA_adv
num_bytes: 344623
num_examples: 200
- name: MovingAttribute_test
num_bytes: 97299
num_examples: 200
- name: MovingAttribute_adv
num_bytes: 97299
num_examples: 200
- name: MovingDirection_test
num_bytes: 115832
num_examples: 200
- name: MovingDirection_adv
num_bytes: 115832
num_examples: 200
- name: MultiModalQA_test
num_bytes: 87978
num_examples: 200
- name: MultiModalQA_adv
num_bytes: 87978
num_examples: 200
- name: nuscenes_test
num_bytes: 87282
num_examples: 200
- name: nuscenes_adv
num_bytes: 87282
num_examples: 200
- name: ObjectExistence_test
num_bytes: 94139
num_examples: 200
- name: ObjectExistence_adv
num_bytes: 94139
num_examples: 200
- name: ObjectInteraction_test
num_bytes: 264032
num_examples: 200
- name: ObjectInteraction_adv
num_bytes: 264032
num_examples: 200
- name: ObjectShuffle_test
num_bytes: 289186
num_examples: 200
- name: ObjectShuffle_adv
num_bytes: 289186
num_examples: 200
- name: OCR_VQA_test
num_bytes: 80940
num_examples: 200
- name: OCR_VQA_adv
num_bytes: 80940
num_examples: 200
- name: SceneTransition_test
num_bytes: 266203
num_examples: 200
- name: SceneTransition_adv
num_bytes: 266203
num_examples: 200
- name: SlideVQA_test
num_bytes: 89462
num_examples: 200
- name: SlideVQA_adv
num_bytes: 89462
num_examples: 200
- name: Spot_the_Diff_test
num_bytes: 47823
num_examples: 200
- name: Spot_the_Diff_adv
num_bytes: 47823
num_examples: 200
- name: StateChange_test
num_bytes: 286783
num_examples: 200
- name: StateChange_adv
num_bytes: 286783
num_examples: 200
- name: TextNeedleInAHaystack_test
num_bytes: 11140730
num_examples: 320
- name: TQA_test
num_bytes: 92861
num_examples: 200
- name: TQA_adv
num_bytes: 92861
num_examples: 200
- name: WebQA_test
num_bytes: 202682
num_examples: 200
- name: WebQA_adv
num_bytes: 202682
num_examples: 200
- name: WikiVQA_test
num_bytes: 2557847
num_examples: 200
- name: WikiVQA_adv
num_bytes: 2557847
num_examples: 200
download_size: 12035444
dataset_size: 26288285
configs:
- config_name: default
data_files:
- split: ActionLocalization_test
path: preview/ActionLocalization_test-*
- split: ActionLocalization_adv
path: preview/ActionLocalization_adv-*
- split: ActionPrediction_test
path: preview/ActionPrediction_test-*
- split: ActionPrediction_adv
path: preview/ActionPrediction_adv-*
- split: ActionSequence_test
path: preview/ActionSequence_test-*
- split: ActionSequence_adv
path: preview/ActionSequence_adv-*
- split: ALFRED_test
path: preview/ALFRED_test-*
- split: ALFRED_adv
path: preview/ALFRED_adv-*
- split: CharacterOrder_test
path: preview/CharacterOrder_test-*
- split: CharacterOrder_adv
path: preview/CharacterOrder_adv-*
- split: CLEVR_Change_test
path: preview/CLEVR_Change_test-*
- split: CLEVR_Change_adv
path: preview/CLEVR_Change_adv-*
- split: CounterfactualInference_test
path: preview/CounterfactualInference_test-*
- split: CounterfactualInference_adv
path: preview/CounterfactualInference_adv-*
- split: DocVQA_test
path: preview/DocVQA_test-*
- split: DocVQA_adv
path: preview/DocVQA_adv-*
- split: EgocentricNavigation_test
path: preview/EgocentricNavigation_test-*
- split: EgocentricNavigation_adv
path: preview/EgocentricNavigation_adv-*
- split: GPR1200_test
path: preview/GPR1200_test-*
- split: IEdit_test
path: preview/IEdit_test-*
- split: IEdit_adv
path: preview/IEdit_adv-*
- split: ImageNeedleInAHaystack_test
path: preview/ImageNeedleInAHaystack_test-*
- split: MMCoQA_test
path: preview/MMCoQA_test-*
- split: MMCoQA_adv
path: preview/MMCoQA_adv-*
- split: MovingAttribute_test
path: preview/MovingAttribute_test-*
- split: MovingAttribute_adv
path: preview/MovingAttribute_adv-*
- split: MovingDirection_test
path: preview/MovingDirection_test-*
- split: MovingDirection_adv
path: preview/MovingDirection_adv-*
- split: MultiModalQA_test
path: preview/MultiModalQA_test-*
- split: MultiModalQA_adv
path: preview/MultiModalQA_adv-*
- split: nuscenes_test
path: preview/nuscenes_test-*
- split: nuscenes_adv
path: preview/nuscenes_adv-*
- split: ObjectExistence_test
path: preview/ObjectExistence_test-*
- split: ObjectExistence_adv
path: preview/ObjectExistence_adv-*
- split: ObjectInteraction_test
path: preview/ObjectInteraction_test-*
- split: ObjectInteraction_adv
path: preview/ObjectInteraction_adv-*
- split: ObjectShuffle_test
path: preview/ObjectShuffle_test-*
- split: ObjectShuffle_adv
path: preview/ObjectShuffle_adv-*
- split: OCR_VQA_test
path: preview/OCR_VQA_test-*
- split: OCR_VQA_adv
path: preview/OCR_VQA_adv-*
- split: SceneTransition_test
path: preview/SceneTransition_test-*
- split: SceneTransition_adv
path: preview/SceneTransition_adv-*
- split: SlideVQA_test
path: preview/SlideVQA_test-*
- split: SlideVQA_adv
path: preview/SlideVQA_adv-*
- split: Spot_the_Diff_test
path: preview/Spot_the_Diff_test-*
- split: Spot_the_Diff_adv
path: preview/Spot_the_Diff_adv-*
- split: StateChange_test
path: preview/StateChange_test-*
- split: StateChange_adv
path: preview/StateChange_adv-*
- split: TextNeedleInAHaystack_test
path: preview/TextNeedleInAHaystack_test-*
- split: TQA_test
path: preview/TQA_test-*
- split: TQA_adv
path: preview/TQA_adv-*
- split: WebQA_test
path: preview/WebQA_test-*
- split: WebQA_adv
path: preview/WebQA_adv-*
- split: WikiVQA_test
path: preview/WikiVQA_test-*
- split: WikiVQA_adv
path: preview/WikiVQA_adv-*
task_categories:
- visual-question-answering
- question-answering
- text-generation
- image-to-text
- video-classification
language:
- en
tags:
- Long-context
- MLLM
- VLM
- LLM
- Benchmark
pretty_name: MileBench
size_categories:
- 1K<n<10K
---
# MileBench
## Introduction
We introduce MileBench, a pioneering benchmark designed to test the **M**ult**I**modal **L**ong-cont**E**xt capabilities of MLLMs.
This benchmark comprises not only multimodal long contexts but also multiple tasks requiring both comprehension and generation.
We establish two distinct evaluation sets, diagnostic and realistic, to systematically assess MLLMs' long-context adaptation capacity and their ability to complete tasks in long-context scenarios.
<img src="./images/MileBench.png" width="600" alt="MileBench" align="center" />
To construct our evaluation sets, we gather 6,440 multimodal long-context samples from 21 pre-existing or self-constructed datasets,
with an average of 15.2 images and 422.3 words each, as depicted in the figure, and we categorize them into their respective subsets.
<center class="half">
<img src="./images/stat2.png" width="300" alt="stat2"/><img src="./images/stat1.png" width="300" alt="stat1"/>
</center>
## How to use?
Please download MileBench.tar.gz and refer to [Code for MileBench](https://github.com/MileBench/MileBench?tab=readme-ov-file#-dataset-preparation).
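
Each record follows the feature schema declared in the YAML header of this card (`sample_id`, `task_instruction`, a `task_instance` struct, `combined_1_images`, and `response`). The sketch below illustrates that layout with hypothetical field values — the file names and text are invented for demonstration, not taken from the dataset:

```python
# A hypothetical sample shaped like the feature schema in this card.
# All concrete values (paths, text) are illustrative placeholders.
sample = {
    "sample_id": 0,
    "task_instruction": "Answer the question based on the given frames.",
    "task_instance": {
        "context": "Question text with image placeholders ...",
        "images_path": ["frame_0001.jpg", "frame_0002.jpg"],  # one path per image
        "choice_list": ["A", "B", "C", "D"],                  # multiple-choice options
    },
    "combined_1_images": ["combined_0001.jpg"],  # pre-combined image variant
    "response": "A",                             # ground-truth answer
}

def count_images(s: dict) -> int:
    """Number of per-frame images attached to a sample."""
    return len(s["task_instance"]["images_path"])

print(count_images(sample))  # -> 2
```

For actual evaluation, the image paths resolve against the extracted `MileBench.tar.gz` archive; see the linked repository for the full preparation and inference scripts.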
## Links
- **Homepage:** [MileBench Homepage](https://milebench.github.io/)
- **Repository:** [MileBench GitHub](https://github.com/MileBench/MileBench)
- **Paper:** [Arxiv](https://arxiv.org/abs/2404.18532)
- **Point of Contact:** [Dingjie Song](mailto:bbsngg@outlook.com)
## Citation
If you find this project useful in your research, please consider citing:
```BibTeX
@misc{song2024milebench,
title={MileBench: Benchmarking MLLMs in Long Context},
author={Dingjie Song and Shunian Chen and Guiming Hardy Chen and Fei Yu and Xiang Wan and Benyou Wang},
year={2024},
eprint={2404.18532},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@article{song2024milebench,
title={MileBench: Benchmarking MLLMs in Long Context},
author={Song, Dingjie and Chen, Shunian and Chen, Guiming Hardy and Yu, Fei and Wan, Xiang and Wang, Benyou},
journal={arXiv preprint arXiv:2404.18532},
year={2024}
}
```