Dataset preview (first rows only). Each row has an id (string), images (a sequence of two frame paths, the first and last frame of a video clip), and conversations (a two-turn list in which the "human" turn carries the edit prompt and the "gpt" turn stands for the edited target image).

id | images | conversations
---|---|---
000000105143 | ["AURORA/data/something/frames/154202/first.jpg", "AURORA/data/something/frames/154202/last.jpg"] | [{"from": "human", "value": "<image>\nEditing the given image according to the following prompt: turning the camera left while filming camel"}, {"from": "gpt", "value": "<image>"}]
000000105144 | ["AURORA/data/something/frames/195805/first.jpg", "AURORA/data/something/frames/195805/last.jpg"] | [{"from": "human", "value": "<image>\nEditing the given image according to the following prompt: plugging the plug into the socket"}, {"from": "gpt", "value": "<image>"}]
000000105145 | ["AURORA/data/something/frames/190795/first.jpg", "AURORA/data/something/frames/190795/last.jpg"] | [{"from": "human", "value": "<image>\nEditing the given image according to the following prompt: turning the camera downwards while filming bottle"}, {"from": "gpt", "value": "<image>"}]

The remaining preview rows (ids 000000105146 through 000000105242) follow the same pattern, each pairing a first/last frame from a Something-Something clip with a different edit prompt.
AURORA: Learning Action and Reasoning-Centric Image Editing from Videos and Simulations
AURORA (Action Reasoning Object Attribute) enables training an instruction-guided image editing model that can perform action- and reasoning-centric edits, in addition to the "simpler" established object, attribute, or global edits. Here we release 1) the training data, 2) a trained model, 3) a benchmark, and 4) reproducible training and evaluation.
Please reach out to benno.krojer@mila.quebec or raise an issue if anything does not work!
Updates
5th December 2024: uploaded cleaner (actually usable) human ratings on AURORA-Bench. This can be useful for evaluating metrics via human correlation across a wide range of tasks. It includes 2K human ratings on outputs from 5 models.
Data
On the data side, we release three artifacts together with a Datasheet documenting them:
- The training dataset (AURORA)
- A benchmark for testing diverse editing skills (AURORA-Bench): object-centric, action-centric, reasoning-centric, and global edits
- Human ratings on AURORA-Bench, e.g. for other researchers working on image editing metrics
You can also check out our Huggingface dataset.
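For orientation, the rows shown in the preview above follow a simple conversations format: each entry points to a first/last frame pair and a two-turn conversation whose human turn carries the edit prompt. Below is a minimal sketch of reading such a file with the standard library; the file path is an assumption, so point it at whichever JSON file you actually downloaded.

```python
import json

# Hypothetical path to a conversations-format file from the Huggingface dataset;
# adjust to the JSON file you actually have on disk.
path = "data/something/train_conversations.json"

with open(path) as f:
    rows = json.load(f)

for row in rows[:5]:
    first_frame, last_frame = row["images"]          # input and target frame paths
    human_turn = row["conversations"][0]["value"]    # "<image>\nEditing the given image according to ..."
    prompt = human_turn.split("prompt: ", 1)[-1]     # strip the fixed instruction prefix
    print(row["id"], first_frame, "->", last_frame, "|", prompt)
```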
Training Data (AURORA)
The edit instructions are stored as data/TASK/train.json for each of the four tasks.
For the image pairs, you can download them easily via zenodo:
wget https://zenodo.org/record/11552426/files/ag_images.zip
wget https://zenodo.org/record/11552426/files/kubric_images.zip
wget https://zenodo.org/record/11552426/files/magicbrush_images.zip
Now put them into their respective directory data/NAME, rename each archive to images.zip, and unzip it, so that in the end you have a directory such as data/kubric/images for each task.
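If you prefer to script the download and extraction instead of running the commands by hand, a small sketch along these lines should work; it only automates the steps above and is not part of the released codebase.

```python
import urllib.request
import zipfile
from pathlib import Path

# Archive name on zenodo -> task directory under data/
archives = {
    "ag_images.zip": "ag",
    "kubric_images.zip": "kubric",
    "magicbrush_images.zip": "magicbrush",
}

for zip_name, task in archives.items():
    task_dir = Path("data") / task
    task_dir.mkdir(parents=True, exist_ok=True)
    target = task_dir / "images.zip"
    urllib.request.urlretrieve(
        f"https://zenodo.org/record/11552426/files/{zip_name}", target
    )
    with zipfile.ZipFile(target) as zf:
        # Assumes each archive contains an images/ folder, so extraction
        # leaves e.g. data/kubric/images/
        zf.extractall(task_dir)
```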
For Something-Something-Edit, you need to go to the original source, download all the zip files, and put all the videos in a folder named data/something/videos/.
Then run
cd data/something
python extract_frames.py
python filter_keywords.py
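The repository's extract_frames.py handles the frame extraction step. Purely as an illustration of what first/last-frame extraction looks like (this is not the actual script), a minimal OpenCV sketch could be:

```python
import cv2
from pathlib import Path

def extract_first_last(video_path: str, out_dir: str) -> None:
    """Save the first and last frame of a video as first.jpg / last.jpg."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)

    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()
    if not ok:
        raise RuntimeError(f"could not read {video_path}")
    cv2.imwrite(str(out / "first.jpg"), first)

    # Jump to the last frame; the reported frame count can be off by one
    # for some codecs, so clamp at zero.
    n_frames = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
    cap.set(cv2.CAP_PROP_POS_FRAMES, max(n_frames - 1, 0))
    ok, last = cap.read()
    if ok:
        cv2.imwrite(str(out / "last.jpg"), last)
    cap.release()

# e.g. extract_first_last("data/something/videos/154202.webm",
#                         "data/something/frames/154202")
```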
For each sub-dataset of AURORA, an entry would look like this:
[
  {
    "instruction": "Leave the door while standing closer",
    "input": "data/ag/images/1K0SU.mp4_4_left.png",
    "output": "data/ag/images/1K0SU.mp4_4_right.png"
  },
  {"..."}
]
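Assuming the layout above, a quick way to sanity-check a sub-dataset after downloading the images is to load its train.json and verify that every referenced image exists. This is just a helper sketch, not part of the repository:

```python
import json
from pathlib import Path

task = "ag"  # one of: ag, kubric, magicbrush, something
entries = json.loads(Path(f"data/{task}/train.json").read_text())

missing = [
    entry[key]
    for entry in entries
    if "instruction" in entry            # skip placeholder entries like {"..."}
    for key in ("input", "output")
    if not Path(entry[key]).exists()
]
print(f"{task}: {len(entries)} entries, {len(missing)} missing images")
```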
If you are interested in developing your own Kubric data along similar lines, it takes some effort (i.e. a Docker + Blender setup), but we provide some starting code under eq-kubric.
Benchmark: AURORA-Bench
For measuring how well models do on various editing skills (action, reasoning, object/attribute, global), we introduce AURORA-Bench, hosted in this repository under test.json, with the respective images under data/TASK/images/.
Human Ratings
We also release human ratings of image editing outputs on AURORA-Bench examples, which form the basis of our main evaluation in the paper. The output images and associated human ratings can be downloaded from Google Drive and are straightforward to use, e.g. for computing correlations with a new metric: json, image files
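As a rough illustration of the intended use, one could compute the correlation between a new metric and the released human ratings along these lines. The file name and field names below are assumptions; adapt them to the actual JSON from the Google Drive release, and plug in your own metric.

```python
import json
from scipy.stats import spearmanr

def my_metric(input_image: str, output_image: str, instruction: str) -> float:
    """Placeholder: replace with the image-editing metric you want to evaluate."""
    raise NotImplementedError

# Hypothetical file and field names; adapt them to the released ratings JSON.
ratings = json.load(open("aurora_bench_human_ratings.json"))

human_scores, metric_scores = [], []
for r in ratings:
    human_scores.append(r["human_rating"])
    metric_scores.append(my_metric(r["input_image"], r["output_image"], r["instruction"]))

rho, p = spearmanr(human_scores, metric_scores)
print(f"Spearman correlation with human ratings: {rho:.3f} (p = {p:.3g})")
```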
Running stuff
Similar to MagicBrush, we adopt the pix2pix codebase for running and training models.
Inference
Please create a Python environment and install the requirements.txt file (it is unfortunately important to use Python 3.9 due to taming-transformers):
python3.9 -m venv env
source env/bin/activate
pip3 install -r requirements.txt
You can download our trained checkpoint from Google Drive (Link), place it in the main directory, and run our AURORA-trained model on an example image:
python3 edit_cli.py
Training
To reproduce our training, first download an initial checkpoint that is the reproduced MagicBrush model: Google Drive Link
Due to awkward library/Python versioning, you have to go to env/src/taming-transformers/taming/data/utils.py and comment out line 11: from torch._six import string_classes.
Now you can run the train script (hyperparameters can be changed under configs/finetune_magicbrush_ag_something_kubric_15-15-1-1_init-magic.yaml):
python3 main.py --gpus 0,
Specify more GPUs with e.g. --gpus 0,1,2,3.
Reproduce Evaluation
We primarily rely on human evaluation of model outputs on AURORA-Bench. However, our second proposed evaluation is automatic, and here is how to reproduce it.
First, run python3 disc_edit.py --task TASK (e.g. --task whatsup). This generates outputs in a folder called itm_evaluation, which are then evaluated via python3 eval_disc_edit.py.
Citation
@inproceedings{krojer2024aurora,
  author={Benno Krojer and Dheeraj Vattikonda and Luis Lara and Varun Jampani and Eva Portelance and Christopher Pal and Siva Reddy},
  title={{Learning Action and Reasoning-Centric Image Editing from Videos and Simulations}},
  booktitle={NeurIPS},
  year={2024},
  note={Spotlight Paper},
  url={https://arxiv.org/abs/2407.03471}
}
Acknowledgements & License
We use the MIT License.
We want to thank several repositories that made our life much easier on this project:
- The MagicBrush and InstructPix2Pix code bases and datasets; the correspondence with the MagicBrush authors especially helped us a lot.
- The datasets/engines we use to build AURORA: Something-Something v2, Action Genome, and Kubric
- Source code from EQBEN for generating images with the Kubric engine