split stringclasses 1 value | image_id stringlengths 12 25 | file_name stringlengths 16 29 | image_info dict | caption_info dict | mask_annotations listlengths 2 52 | categories listlengths 1 1 |
|---|---|---|---|---|---|---|
train | 000000417619 | 000000417619.jpg | {
"data_source": "COCONut",
"file_name": "000000417619.jpg",
"height": 640,
"id": "000000417619",
"width": 427
} | {
"caption": "A smiling groom in a grey suit and light colored shirt with a blue patterned tie holding a clear umbrella and a bride in a white lace wedding dress also holding another clear umbrella walk together along a wet stone path,. They are surrounded by lush green grass and leafy green trees, with a stone church tower that has a white-faced clock visible in the background against the overcast sky.",
"caption_ann": "A smiling <8:groom in a grey suit and light colored shirt> with a <6:blue patterned tie> holding a <7:clear umbrella> and a <10:bride in a white lace wedding dress> also holding another <9:clear umbrella> walk together along a <4:wet stone path>,. They are surrounded by <3:lush green grass> and <2:leafy green trees>, with a <1:stone church tower> that has a <5:white-faced clock> visible in the background against the <0:overcast sky>.",
"id": 1200,
"image_id": "000000417619",
"label_matched": [
{
"mask_ids": [
8
],
"txt_desc": "groom in a grey suit and light colored shirt"
},
{
"mask_ids": [
6
],
"txt_desc": "blue patterned tie"
},
{
"mask_ids": [
7
],
"txt_desc": "clear umbrella"
},
{
"mask_ids": [
10
],
"txt_desc": "bride in a white lace wedding dress"
},
{
"mask_ids": [
9
],
"txt_desc": "clear umbrella"
},
{
"mask_ids": [
4
],
"txt_desc": "wet stone path"
},
{
"mask_ids": [
3
],
"txt_desc": "lush green grass"
},
{
"mask_ids": [
2
],
"txt_desc": "leafy green trees"
},
{
"mask_ids": [
1
],
"txt_desc": "stone church tower"
},
{
"mask_ids": [
5
],
"txt_desc": "white-faced clock"
},
{
"mask_ids": [
0
],
"txt_desc": "overcast sky"
}
],
"labels": [
"sky-other-merged",
"wall-brick",
"tree-merged",
"grass-merged",
"pavement-merged",
"clock",
"tie",
"umbrella",
"person",
"umbrella",
"person"
]
} | [
{
"area": 4392,
"bbox": [
0,
0,
258,
272
],
"category_id": 187,
"id": 14259,
"image_id": "000000417619",
"iscrowd": 0,
"segmentation": {
"counts": "o72nc011NPd01R\\O0P\\O10Nfc0<L00O2O1O0000001O00jce0J`hYO:^c031000000000000000000001O0000000i]OHf`0i0c^O]O]... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000419777 | 000000419777.jpg | {
"data_source": "COCONut",
"file_name": "000000419777.jpg",
"height": 480,
"id": "000000419777",
"width": 640
} | {
"caption": "During a motocross race on a hilly course made of green grass and a dirt track, a rider in yellow, black, and white gear goes airborne on a dirt bike, while several other competitors, including a rider in a blue and yellow jersey, a rider in green, a rider in red, and a more distant rider, navigate a curve on their respective motorcycles in the background. On the side of the track, in the foreground a rider is partially visible on a motorcycle. The entire scene is bordered by dense green trees.",
"caption_ann": "During a motocross race on a hilly course made of <1:green grass> and a <2:dirt track>, a <11:rider in yellow, black, and white gear> goes airborne on a <3:dirt bike>, while several other competitors, including a <12:rider in a blue and yellow jersey>, a <7:rider in green>, a <10:rider in red>, and a <8:more distant rider>, navigate a curve on their respective <4,14,5,13:motorcycles> in the background. On the side of the track, in the foreground a <9:rider> is partially visible on a <6:motorcycle>. The entire scene is bordered by <0:dense green trees>.",
"id": 1201,
"image_id": "000000419777",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "green grass"
},
{
"mask_ids": [
2
],
"txt_desc": "dirt track"
},
{
"mask_ids": [
11
],
"txt_desc": "rider in yellow, black, and white gear"
},
{
"mask_ids": [
3
],
"txt_desc": "dirt bike"
},
{
"mask_ids": [
12
],
"txt_desc": "rider in a blue and yellow jersey"
},
{
"mask_ids": [
7
],
"txt_desc": "rider in green"
},
{
"mask_ids": [
10
],
"txt_desc": "rider in red"
},
{
"mask_ids": [
8
],
"txt_desc": "more distant rider"
},
{
"mask_ids": [
4,
14,
5,
13
],
"txt_desc": "motorcycles"
},
{
"mask_ids": [
9
],
"txt_desc": "rider"
},
{
"mask_ids": [
6
],
"txt_desc": "motorcycle"
},
{
"mask_ids": [
0
],
"txt_desc": "dense green trees"
}
],
"labels": [
"tree-merged",
"grass-merged",
"dirt-merged",
"motorcycle",
"motorcycle",
"motorcycle",
"motorcycle",
"person",
"person",
"person",
"person",
"person",
"person",
"motorcycle",
"motorcycle"
]
} | [
{
"area": 81984,
"bbox": [
0,
0,
640,
171
],
"category_id": 184,
"id": 14270,
"image_id": "000000419777",
"iscrowd": 0,
"segmentation": {
"counts": "0[5e9000000000000000000O1N22NO1000000O100000000O1O100O1000000001O00O1O100000000000000000000O100O1O1O100O1... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000421322 | 000000421322.jpg | {
"data_source": "COCONut",
"file_name": "000000421322.jpg",
"height": 427,
"id": "000000421322",
"width": 640
} | {
"caption": "A brightly colored Wizz Air airplane is positioned on the pavement of an airport runway. The pink and purple aircraft stands out against the cloudy sky. In the distance, a range of mountains and patches of grass frame the scene. Another white and red airplane at distance is also visible. ",
"caption_ann": "A <4:brightly colored Wizz Air airplane> is positioned on the <1:pavement of an airport runway>. The <4:pink and purple aircraft> stands out against the <0:cloudy sky>. In the distance, a <3:range of mountains> and <2:patches of grass> frame the scene. Another <5:white and red airplane at distance> is also visible. ",
"id": 1202,
"image_id": "000000421322",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "brightly colored Wizz Air airplane"
},
{
"mask_ids": [
1
],
"txt_desc": "pavement of an airport runway"
},
{
"mask_ids": [
4
],
"txt_desc": "pink and purple aircraft"
},
{
"mask_ids": [
0
],
"txt_desc": "cloudy sky"
},
{
"mask_ids": [
3
],
"txt_desc": "range of mountains"
},
{
"mask_ids": [
2
],
"txt_desc": "patches of grass"
},
{
"mask_ids": [
5
],
"txt_desc": "white and red airplane at distance"
}
],
"labels": [
"sky-other-merged",
"pavement-merged",
"grass-merged",
"mountain-merged",
"airplane",
"airplane"
]
} | [
{
"area": 128815,
"bbox": [
0,
0,
640,
268
],
"category_id": 187,
"id": 14285,
"image_id": "000000421322",
"iscrowd": 0,
"segmentation": {
"counts": "0[8P51O00O1O1000000O10000aNoJcJR5]5nJcJR5\\5oJdJQ5\\5oJdJQ5\\5oJdJQ5[5PKeJP5[5PKeJP5[5QKdJo4\\5QKdJo4\\5... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000421588 | 000000421588.jpg | {
"data_source": "COCONut",
"file_name": "000000421588.jpg",
"height": 329,
"id": "000000421588",
"width": 500
} | {
"caption": "A smiling man in a light-colored polo shirt sits with his arms crossed in a black office chair at a wooden desk, where a black keyboard stand and a computer mouse are placed. The background consists of a grey wall with the \"QUARTUS ENGINEERING\" logo.",
"caption_ann": "A <3:smiling man in a light-colored polo shirt> sits with his arms crossed in a <2:black office chair> at a <1:wooden desk>, where a <5:black keyboard stand> and a <4:computer mouse> are placed. The background consists of a <0:grey wall with the \"QUARTUS ENGINEERING\" logo>.",
"id": 1203,
"image_id": "000000421588",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "smiling man in a light-colored polo shirt"
},
{
"mask_ids": [
2
],
"txt_desc": "black office chair"
},
{
"mask_ids": [
1
],
"txt_desc": "wooden desk"
},
{
"mask_ids": [
5
],
"txt_desc": "black keyboard stand"
},
{
"mask_ids": [
4
],
"txt_desc": "computer mouse"
},
{
"mask_ids": [
0
],
"txt_desc": "grey wall with the \"QUARTUS ENGINEERING\" logo"
}
],
"labels": [
"wall-other-merged",
"table-merged",
"chair",
"person",
"mouse",
"keyboard"
]
} | [
{
"area": 120003,
"bbox": [
0,
0,
500,
329
],
"category_id": 199,
"id": 14291,
"image_id": "000000421588",
"iscrowd": 0,
"segmentation": {
"counts": "0a9h000O10000000000000000000000000000000000000000O100000000000000000000000000000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000422130 | 000000422130.jpg | {
"data_source": "COCONut",
"file_name": "000000422130.jpg",
"height": 427,
"id": "000000422130",
"width": 640
} | {
"caption": "Under a grey, overcast sky, a dilapidated wooden shipwreck lies tilted on a sandy shore, partially submerged in the calm water. In the background, a grassy hill is dotted with a few white houses with dark gray roofs and a distant car",
"caption_ann": "Under a <0:grey, overcast sky>, a <5:dilapidated wooden shipwreck> lies tilted on a <2:sandy shore>, partially submerged in the <3:calm water>. In the background, a <4:grassy hill> is dotted with <1:a few white houses with dark gray roofs> and a <6:distant car>",
"id": 1204,
"image_id": "000000422130",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "grey, overcast sky"
},
{
"mask_ids": [
5
],
"txt_desc": "dilapidated wooden shipwreck"
},
{
"mask_ids": [
2
],
"txt_desc": "sandy shore"
},
{
"mask_ids": [
3
],
"txt_desc": "calm water"
},
{
"mask_ids": [
4
],
"txt_desc": "grassy hill"
},
{
"mask_ids": [
1
],
"txt_desc": "a few white houses with dark gray roofs"
},
{
"mask_ids": [
6
],
"txt_desc": "distant car"
}
],
"labels": [
"sky-other-merged",
"house",
"sand",
"sea",
"mountain-merged",
"boat",
"car"
]
} | [
{
"area": 84658,
"bbox": [
0,
0,
640,
154
],
"category_id": 187,
"id": 14297,
"image_id": "000000422130",
"iscrowd": 0,
"segmentation": {
"counts": "0Y4R9000000O100O1001O1OO11O0000001O0000O11O000000O11O0000000000O11OO1000000O100O11O00O100O10000O100O10000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000422782 | 000000422782.jpg | {
"data_source": "COCONut",
"file_name": "000000422782.jpg",
"height": 427,
"id": "000000422782",
"width": 640
} | {
"caption": "On a dark leather couch set against a wall with striped patterns, two women are sharing a moment. The woman on the left, wearing a maroon top and glasses, holds and reads from an open catalog, while the woman on the right, in a grey sweater and glasses looks at the same catalogue and gently pets a partially visible small white dog resting beside her.",
"caption_ann": "On a <5:dark leather couch> set against a <0:wall with striped patterns>, two women are sharing a moment. The <3:woman on the left, wearing a maroon top and glasses,> holds and reads from an <1:open catalog>, while the <4:woman on the right, in a grey sweater and glasses> looks at the same catalogue and gently pets a <2:partially visible small white dog> resting beside her.",
"id": 1205,
"image_id": "000000422782",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "dark leather couch"
},
{
"mask_ids": [
0
],
"txt_desc": "wall with striped patterns"
},
{
"mask_ids": [
3
],
"txt_desc": "woman on the left, wearing a maroon top and glasses,"
},
{
"mask_ids": [
1
],
"txt_desc": "open catalog"
},
{
"mask_ids": [
4
],
"txt_desc": "woman on the right, in a grey sweater and glasses"
},
{
"mask_ids": [
2
],
"txt_desc": "partially visible small white dog"
}
],
"labels": [
"wall-other-merged",
"book",
"dog",
"person",
"person",
"couch"
]
} | [
{
"area": 67103,
"bbox": [
0,
0,
640,
181
],
"category_id": 199,
"id": 14304,
"image_id": "000000422782",
"iscrowd": 0,
"segmentation": {
"counts": "0`5k7O1000000O1000000000000000000000000O1000000000000000000001O00000000000000000000001O000000001O0000001O... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000423337 | 000000423337.jpg | {
"data_source": "COCONut",
"file_name": "000000423337.jpg",
"height": 530,
"id": "000000423337",
"width": 640
} | {
"caption": "A fluffy, ginger cat sleeps peacefully, curled up on a light-colored pillow with one paw draped slightly over a blue book titled 'The Happiness Project', all situated on a grey textured rug.",
"caption_ann": "A <2:fluffy, ginger cat> sleeps peacefully, curled up on a <1:light-colored pillow> with one paw draped slightly over a <3:blue book titled 'The Happiness Project'>, all situated on a <0:grey textured rug>.",
"id": 1206,
"image_id": "000000423337",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "fluffy, ginger cat"
},
{
"mask_ids": [
1
],
"txt_desc": "light-colored pillow"
},
{
"mask_ids": [
3
],
"txt_desc": "blue book titled 'The Happiness Project'"
},
{
"mask_ids": [
0
],
"txt_desc": "grey textured rug"
}
],
"labels": [
"rug-merged",
"pillow",
"cat",
"book"
]
} | [
{
"area": 48224,
"bbox": [
0,
0,
640,
530
],
"category_id": 200,
"id": 14310,
"image_id": "000000423337",
"iscrowd": 0,
"segmentation": {
"counts": "0W[34UUM7H5K4L3M4M2O2N1O2M2O2N1O2M3N1O2N2M2oF`No2b1nLcNn2_1nLfNo2\\1nLfNP3]1mLeNR3^1jLeNT3^1hLeNV3^1gLcNX... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000424002 | 000000424002.jpg | {
"data_source": "COCONut",
"file_name": "000000424002.jpg",
"height": 640,
"id": "000000424002",
"width": 480
} | {
"caption": "On a rocky mountaintop, two hikers and their dogs pause for a photo. A man in a red jacket and striped beanie carries a large green and yellow backpack, while a man in a red jacket and blue beanie carries a green backpack. They are accompanied by a black dog and a black and brown dog, both outfitted with blue saddlebags. In the background, a vast mountain range covered in bare as well as gray leafy trees stretches out under a pale sky.",
"caption_ann": "On a <1:rocky mountaintop>, two hikers and their dogs pause for a photo. A <4:man in a red jacket and striped beanie> carries a <5:large green and yellow backpack>, while a <8:man in a red jacket and blue beanie> carries a <3:green backpack>. They are accompanied by a <6:black dog> and a <9:black and brown dog>, both outfitted with <7,10:blue saddlebags>. In the background, a <2:vast mountain range covered in bare as well as gray leafy trees> stretches out under a <0:pale sky>.",
"id": 1207,
"image_id": "000000424002",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "rocky mountaintop"
},
{
"mask_ids": [
4
],
"txt_desc": "man in a red jacket and striped beanie"
},
{
"mask_ids": [
5
],
"txt_desc": "large green and yellow backpack"
},
{
"mask_ids": [
8
],
"txt_desc": "man in a red jacket and blue beanie"
},
{
"mask_ids": [
3
],
"txt_desc": "green backpack"
},
{
"mask_ids": [
6
],
"txt_desc": "black dog"
},
{
"mask_ids": [
9
],
"txt_desc": "black and brown dog"
},
{
"mask_ids": [
7,
10
],
"txt_desc": "blue saddlebags"
},
{
"mask_ids": [
2
],
"txt_desc": "vast mountain range covered in bare as well as gray leafy trees"
},
{
"mask_ids": [
0
],
"txt_desc": "pale sky"
}
],
"labels": [
"sky-other-merged",
"rock-merged",
"mountain-merged",
"backpack",
"person",
"backpack",
"dog",
"backpack",
"person",
"dog",
"backpack"
]
} | [
{
"area": 106470,
"bbox": [
0,
0,
480,
297
],
"category_id": 187,
"id": 14314,
"image_id": "000000424002",
"iscrowd": 0,
"segmentation": {
"counts": "0Z6f=0000000000O10000001O000000000000000000000000000000000000000000000000000000001O000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000424102 | 000000424102.jpg | {
"data_source": "COCONut",
"file_name": "000000424102.jpg",
"height": 427,
"id": "000000424102",
"width": 640
} | {
"caption": "A man wearing a wide-brimmed hat, sunglasses, and a polo shirt points up at a distance. A large, detailed model of a U.S. Air Force jet is flying low in the sky. The man is likely the operator of the remote-controlled aircraft. The scene takes place in an open, rural area, with a dirt field in the foreground, a field of dry, yellow grass behind it, and a line of green trees in the background, all under a bright, partly cloudy sky.",
"caption_ann": "A <5:man wearing a wide-brimmed hat, sunglasses, and a polo shirt> points up at a distance. A <4:large, detailed model of a U.S. Air Force jet> is flying low in the sky. The man is likely the operator of the remote-controlled aircraft. The scene takes place in an open, rural area, with a <3:dirt field> in the foreground, a field of <2:dry, yellow grass> behind it, and a line of <1:green trees> in the background, all under a <0:bright, partly cloudy sky>.",
"id": 1208,
"image_id": "000000424102",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "man wearing a wide-brimmed hat, sunglasses, and a polo shirt"
},
{
"mask_ids": [
4
],
"txt_desc": "large, detailed model of a U.S. Air Force jet"
},
{
"mask_ids": [
3
],
"txt_desc": "dirt field"
},
{
"mask_ids": [
2
],
"txt_desc": "dry, yellow grass"
},
{
"mask_ids": [
1
],
"txt_desc": "green trees"
},
{
"mask_ids": [
0
],
"txt_desc": "bright, partly cloudy sky"
}
],
"labels": [
"sky-other-merged",
"tree-merged",
"grass-merged",
"dirt-merged",
"airplane",
"person"
]
} | [
{
"area": 121718,
"bbox": [
0,
0,
640,
257
],
"category_id": 187,
"id": 14325,
"image_id": "000000424102",
"iscrowd": 0,
"segmentation": {
"counts": "0b7i51O0000N200LYJeHf5[7500000000001O1O0000O1N21O1O002N0000O13M0YJ`Hc5d7000M\\J^He5a73O100O100000000O100... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000424819 | 000000424819.jpg | {
"data_source": "COCONut",
"file_name": "000000424819.jpg",
"height": 458,
"id": "000000424819",
"width": 640
} | {
"caption": "A smiling woman with brown hair, wearing a bright purple jacket and jeans, sits at a sturdy, concrete picnic table which has two benches and it's placed on a pavemented white platform. The picnic table and benches are situated on a <square shaped concrete platform>. The scenic rest stop is located on a gravel and paved area next to a patch of tall, wild grass. The backdrop is a breathtaking coastal landscape, featuring the choppy, turquoise sea with white-capped waves rolling towards the shore. A dramatic, green and rugged mountain rises steeply from the coastline, all under a moody, overcast sky.",
"caption_ann": "A <6:smiling woman with brown hair, wearing a bright purple jacket and jeans,> sits at a sturdy, <9:concrete picnic table> which has <7,8:two benches> and it's placed on a <5:pavemented white platform>. The <7,8,9:picnic table and benches> are situated on a <square shaped concrete platform>. The scenic rest stop is located on a <4:gravel and paved area> next to a patch of <3:tall, wild grass>. The backdrop is a breathtaking coastal landscape, featuring the <2:choppy, turquoise sea> with white-capped waves rolling towards the shore. A dramatic, <1:green and rugged mountain> rises steeply from the coastline, all under a <0:moody, overcast sky>.",
"id": 1209,
"image_id": "000000424819",
"label_matched": [
{
"mask_ids": [
6
],
"txt_desc": "smiling woman with brown hair, wearing a bright purple jacket and jeans,"
},
{
"mask_ids": [
9
],
"txt_desc": "concrete picnic table"
},
{
"mask_ids": [
7,
8
],
"txt_desc": "two benches"
},
{
"mask_ids": [
5
],
"txt_desc": "pavemented white platform"
},
{
"mask_ids": [
7,
8,
9
],
"txt_desc": "picnic table and benches"
},
{
"mask_ids": [
4
],
"txt_desc": "gravel and paved area"
},
{
"mask_ids": [
3
],
"txt_desc": "tall, wild grass"
},
{
"mask_ids": [
2
],
"txt_desc": "choppy, turquoise sea"
},
{
"mask_ids": [
1
],
"txt_desc": "green and rugged mountain"
},
{
"mask_ids": [
0
],
"txt_desc": "moody, overcast sky"
}
],
"labels": [
"sky-other-merged",
"mountain-merged",
"sea",
"grass-merged",
"gravel",
"pavement-merged",
"person",
"bench",
"bench",
"dining table"
]
} | [
{
"area": 61117,
"bbox": [
0,
0,
640,
191
],
"category_id": 187,
"id": 14331,
"image_id": "000000424819",
"iscrowd": 0,
"segmentation": {
"counts": "0n5\\8000000000000000000000000O11O000000000000000000001OO10000000000001O00O100000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000425555 | 000000425555.jpg | {
"data_source": "COCONut",
"file_name": "000000425555.jpg",
"height": 426,
"id": "000000425555",
"width": 640
} | {
"caption": "On a dark asphalt pavement, an elderly man with white hair, wearing a dark vest and patterned pants, sits thoughtfully in a manual wheelchair. His pose appears like someone who is tired, pensive, or deep in thought. Behind him, on a raised concrete curb, a dark-colored dog sleeps in front of a large, yellow wall with brown trim. On the far right side on pavement stands a red fire hydrant with '26' written on it, and a portion of small white-framed window is visible high on the wall.",
"caption_ann": "On a <2:dark asphalt pavement>, an <5:elderly man with white hair, wearing a dark vest and patterned pants,> sits thoughtfully in a <7:manual wheelchair>. His pose appears like someone who is tired, pensive, or deep in thought. Behind him, on a <0:raised concrete curb>, a <4:dark-colored dog> sleeps in front of a <1:large, yellow wall with brown trim>. On the far right side on <2:pavement> stands a <6:red fire hydrant with '26' written on it>, and a portion of <3:small white-framed window> is visible high on the <1:wall>.",
"id": 1210,
"image_id": "000000425555",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "dark asphalt pavement"
},
{
"mask_ids": [
5
],
"txt_desc": "elderly man with white hair, wearing a dark vest and patterned pants,"
},
{
"mask_ids": [
7
],
"txt_desc": "manual wheelchair"
},
{
"mask_ids": [
0
],
"txt_desc": "raised concrete curb"
},
{
"mask_ids": [
4
],
"txt_desc": "dark-colored dog"
},
{
"mask_ids": [
1
],
"txt_desc": "large, yellow wall with brown trim"
},
{
"mask_ids": [
2
],
"txt_desc": "pavement"
},
{
"mask_ids": [
6
],
"txt_desc": "red fire hydrant with '26' written on it"
},
{
"mask_ids": [
3
],
"txt_desc": "small white-framed window"
},
{
"mask_ids": [
1
],
"txt_desc": "wall"
}
],
"labels": [
"floor-other-merged",
"wall-other-merged",
"pavement-merged",
"window-other",
"dog",
"person",
"fire hydrant",
"chair"
]
} | [
{
"area": 14961,
"bbox": [
141,
218,
499,
37
],
"category_id": 190,
"id": 14341,
"image_id": "000000425555",
"iscrowd": 0,
"segmentation": {
"counts": "_lj12V=3N1O1O2O0O1O1O1O2N2N1O2N2O001M3N2O0101N01O001O00001O00000000UOcCi0^<1000000O10000000000001O0000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000426075 | 000000426075.jpg | {
"data_source": "COCONut",
"file_name": "000000426075.jpg",
"height": 479,
"id": "000000426075",
"width": 640
} | {
"caption": "In a field of dry, brownish grass, a small baby zebra with brownish fur nurses from a large adult zebra while the adult grazes with its head lowered.",
"caption_ann": "In a field of <0:dry, brownish grass>, a <1:small baby zebra with brownish fur> nurses from a <2:large adult zebra> while the adult grazes with its head lowered.",
"id": 1211,
"image_id": "000000426075",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "dry, brownish grass"
},
{
"mask_ids": [
1
],
"txt_desc": "small baby zebra with brownish fur"
},
{
"mask_ids": [
2
],
"txt_desc": "large adult zebra"
}
],
"labels": [
"grass-merged",
"zebra",
"zebra"
]
} | [
{
"area": 216932,
"bbox": [
0,
0,
640,
479
],
"category_id": 193,
"id": 14349,
"image_id": "000000426075",
"iscrowd": 0,
"segmentation": {
"counts": "0kgU1a0aXjNCS<7oCI1h0B]OQ<T1RD0c;m1H7J5O2N1N3N1O2M2N2N3M2nMoKTIS4i6[LjHg3S7gL`HZ3^7HfGWL0O1159T8\\3jIULW... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000438331 | 000000438331.jpg | {
"data_source": "COCONut",
"file_name": "000000438331.jpg",
"height": 427,
"id": "000000438331",
"width": 640
} | {
"caption": "From a high-angle perspective, a man wearing a red shirt and yellow helmet navigates his motorcycle through traffic on a grey road, carrying two children with him. Seated in front of the driver is a child in a light blue jacket, while the child with dark hair, wearing black and white dress sits on the back, holding a pink cartoon backpack in right hand. The surrounding traffic includes a blue scooter carrying two persons whose right legs, shoes, and pants are visible, and another man on a separate motorcycle is visible in the foreground.",
"caption_ann": "From a high-angle perspective, a <6:man wearing a red shirt and yellow helmet> navigates his <10:motorcycle> through traffic on a <0:grey road>, carrying two children with him. Seated in front of the driver is a <7:child in a light blue jacket>, while the <5:child with dark hair, wearing black and white dress> sits on the back, holding a <4:pink cartoon backpack> in right hand. The surrounding traffic includes a <3:blue scooter> carrying <1,2:two persons whose right legs, shoes, and pants are visible>, and another <8:man> on a separate <9:motorcycle> is visible in the foreground.",
"id": 1212,
"image_id": "000000438331",
"label_matched": [
{
"mask_ids": [
6
],
"txt_desc": "man wearing a red shirt and yellow helmet"
},
{
"mask_ids": [
10
],
"txt_desc": "motorcycle"
},
{
"mask_ids": [
0
],
"txt_desc": "grey road"
},
{
"mask_ids": [
7
],
"txt_desc": "child in a light blue jacket"
},
{
"mask_ids": [
5
],
"txt_desc": "child with dark hair, wearing black and white dress"
},
{
"mask_ids": [
4
],
"txt_desc": "pink cartoon backpack"
},
{
"mask_ids": [
3
],
"txt_desc": "blue scooter"
},
{
"mask_ids": [
1,
2
],
"txt_desc": "two persons whose right legs, shoes, and pants are visible"
},
{
"mask_ids": [
8
],
"txt_desc": "man"
},
{
"mask_ids": [
9
],
"txt_desc": "motorcycle"
}
],
"labels": [
"road",
"person",
"person",
"motorcycle",
"backpack",
"person",
"person",
"person",
"person",
"motorcycle",
"motorcycle"
]
} | [
{
"area": 140473,
"bbox": [
0,
0,
640,
427
],
"category_id": 149,
"id": 14352,
"image_id": "000000438331",
"iscrowd": 0,
"segmentation": {
"counts": "0WZm02RSSO4L2N3O0RDHi:6TELn:2QENP;3oDMR;2lD0T;1iD1n0L]82cF4n0K`81`F6n0Jc80]F8n0Ie80ZF;o0Eg81WF=P1Ci82RF?... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000439089 | 000000439089.jpg | {
"data_source": "COCONut",
"file_name": "000000439089.jpg",
"height": 480,
"id": "000000439089",
"width": 640
} | {
"caption": "In a rustic scene, a vintage brown bicycle is propped against a whitewashed stone wall, just below a green-framed window. In the windowsill, a plant with pink flowers sits inside a white pot. In the foreground, three chickens peck around on a narrow pavement strip and a patch of green grass, while another dark potted plant is partially visible at the far left.",
"caption_ann": "In a rustic scene, a <5:vintage brown bicycle> is propped against a <0:whitewashed stone wall>, just below a <3:green-framed window>. In the windowsill, a <4:plant with pink flowers> sits inside a <10:white pot>. In the foreground, <6,7,9:three chickens> peck around on a <2:narrow pavement strip> and a patch of <1:green grass>, while another <8:dark potted plant> is partially visible at the far left.",
"id": 1213,
"image_id": "000000439089",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "vintage brown bicycle"
},
{
"mask_ids": [
0
],
"txt_desc": "whitewashed stone wall"
},
{
"mask_ids": [
3
],
"txt_desc": "green-framed window"
},
{
"mask_ids": [
4
],
"txt_desc": "plant with pink flowers"
},
{
"mask_ids": [
10
],
"txt_desc": "white pot"
},
{
"mask_ids": [
6,
7,
9
],
"txt_desc": "three chickens"
},
{
"mask_ids": [
2
],
"txt_desc": "narrow pavement strip"
},
{
"mask_ids": [
1
],
"txt_desc": "green grass"
},
{
"mask_ids": [
8
],
"txt_desc": "dark potted plant"
}
],
"labels": [
"wall-other-merged",
"gravel",
"pavement-merged",
"window-other",
"potted plant",
"bicycle",
"bird",
"bird",
"potted plant",
"bird",
"vase"
]
} | [
{
"area": 152936,
"bbox": [
0,
0,
640,
343
],
"category_id": 199,
"id": 14363,
"image_id": "000000439089",
"iscrowd": 0,
"segmentation": {
"counts": "0Q8;ZIn4e6RK[Io4e6QK\\In4d6RK[Io4d6RK]Im4c6SK\\In4c6SK\\In4c6SK\\In4d6QK[IQ5e6nJ]IQ5c6nJ^IR5c6lJ^IT5b6kJ... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000443564 | 000000443564.jpg | {
"data_source": "COCONut",
"file_name": "000000443564.jpg",
"height": 319,
"id": "000000443564",
"width": 480
} | {
"caption": "The image shows a young man posing against a dark background. He is wearing a white and green football jersey with the number 6 and the word \\“Panthers\\” printed on it. He holds a black and red baseball bat horizontally across his shoulders, supporting it with both hands. A basketball rests on top of the bat near his left shoulder.",
"caption_ann": "The image shows a <0:young man> posing against a dark background. He is wearing a white and green football jersey with the number 6 and the word \\“Panthers\\” printed on it. He holds a <1:black and red baseball bat> horizontally across his shoulders, supporting it with both hands. A <1:basketball> rests on top of the <2:bat> near his left shoulder.",
"id": 1214,
"image_id": "000000443564",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "young man"
},
{
"mask_ids": [
1
],
"txt_desc": "black and red baseball bat"
},
{
"mask_ids": [
1
],
"txt_desc": "basketball"
},
{
"mask_ids": [
2
],
"txt_desc": "bat"
}
],
"labels": [
"person",
"sports ball",
"baseball bat"
]
} | [
{
"area": 56620,
"bbox": [
92,
38,
369,
280
],
"category_id": 1,
"id": 14374,
"image_id": "000000443564",
"iscrowd": 0,
"segmentation": {
"counts": "kil01S2e1b3\\NiJ1L8e0o1^4iMQK2Ek3R5TLYK[4`4eK`Ka4Z4_KfKh4T4WKlKn4`44L2N0000000000000ZOPK]LP5_3TKaLl4\\3WK... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000445192 | 000000445192.jpg | {
"data_source": "COCONut",
"file_name": "000000445192.jpg",
"height": 428,
"id": "000000445192",
"width": 640
} | {
"caption": "A man wearing a red t-shirt and white shorts is riding a motorcycle on a asphalt road beside a moving blue truck with a yellow rear section. The road is lined with green grass and surrounded by numerous trees. Above, the sky is slightly overcast but bright,",
"caption_ann": "A <5:man wearing a red t-shirt and white shorts> is riding a <6:motorcycle> on a <1:asphalt road> beside a <4:moving blue truck with a yellow rear section>. The <1:road> is lined with <3:green grass> and surrounded by numerous <2:trees>. Above, the <0:sky> is slightly overcast but bright,",
"id": 1215,
"image_id": "000000445192",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "man wearing a red t-shirt and white shorts"
},
{
"mask_ids": [
6
],
"txt_desc": "motorcycle"
},
{
"mask_ids": [
1
],
"txt_desc": "asphalt road"
},
{
"mask_ids": [
4
],
"txt_desc": "moving blue truck with a yellow rear section"
},
{
"mask_ids": [
1
],
"txt_desc": "road"
},
{
"mask_ids": [
3
],
"txt_desc": "green grass"
},
{
"mask_ids": [
2
],
"txt_desc": "trees"
},
{
"mask_ids": [
0
],
"txt_desc": "sky"
}
],
"labels": [
"sky-other-merged",
"road",
"tree-merged",
"grass-merged",
"truck",
"person",
"motorcycle"
]
} | [
{
"area": 12986,
"bbox": [
46,
0,
594,
141
],
"category_id": 187,
"id": 14377,
"image_id": "000000445192",
"iscrowd": 0,
"segmentation": {
"counts": "jWc02k<0WCOO02012f<OXC110N0j<4XC1f<1YCOg<1YCOg<;1OO0O20N2001OO100HWCK0Li<9WCHn<9RCFn<<000DRC6R=JnB6R=JoB... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000448308 | 000000448308.jpg | {
"data_source": "COCONut",
"file_name": "000000448308.jpg",
"height": 612,
"id": "000000448308",
"width": 612
} | {
"caption": "On a white paper napkin, a partially peeled orange sits with its peel cut open like a flower, while a red pocket knife with its blade extended lies beside the orange. The image displays all the objects resting on plain white surface.",
"caption_ann": "On a <0:white paper napkin>, a <2:partially peeled orange> sits with its peel cut open like a flower, while a <1:red pocket knife with its blade extended> lies beside the <2:orange>. The image displays all the objects resting on plain white surface.",
"id": 1216,
"image_id": "000000448308",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "white paper napkin"
},
{
"mask_ids": [
2
],
"txt_desc": "partially peeled orange"
},
{
"mask_ids": [
1
],
"txt_desc": "red pocket knife with its blade extended"
},
{
"mask_ids": [
2
],
"txt_desc": "orange"
}
],
"labels": [
"table-merged",
"knife",
"orange"
]
} | [
{
"area": 234339,
"bbox": [
0,
0,
612,
612
],
"category_id": 189,
"id": 14384,
"image_id": "000000448308",
"iscrowd": 0,
"segmentation": {
"counts": "0Qje2S1]gZM]1jN>J6I6K6J6L4M3M3N2M3M2N3N2M3M3N2M3M2O2N2N2N2N2N2N2N1O2N2N2N2N2N2N1O2N2N2N2N1O1O1O001O1O1O0... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000448376 | 000000448376.jpg | {
"data_source": "COCONut",
"file_name": "000000448376.jpg",
"height": 640,
"id": "000000448376",
"width": 427
} | {
"caption": "On a rainy day, a long, weathered train is stopped at a wet platform alongside the railroad tracks. From an open door of the train, a person's arm holds a large blue umbrella out into the downpour, while in the distance, a faint building is visible through the hazy atmosphere.",
"caption_ann": "On a rainy day, a <5:long, weathered train> is stopped at a <1:wet platform> alongside the <2:railroad tracks>. From an open door of the <5:train>, a <4:person's arm> holds a <3:large blue umbrella> out into the downpour, while in the distance, a <0:faint building> is visible through the hazy atmosphere.",
"id": 1217,
"image_id": "000000448376",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "long, weathered train"
},
{
"mask_ids": [
1
],
"txt_desc": "wet platform"
},
{
"mask_ids": [
2
],
"txt_desc": "railroad tracks"
},
{
"mask_ids": [
5
],
"txt_desc": "train"
},
{
"mask_ids": [
4
],
"txt_desc": "person's arm"
},
{
"mask_ids": [
3
],
"txt_desc": "large blue umbrella"
},
{
"mask_ids": [
0
],
"txt_desc": "faint building"
}
],
"labels": [
"building-other-merged",
"platform",
"railroad",
"umbrella",
"person",
"train"
]
} | [
{
"area": 52084,
"bbox": [
0,
0,
332,
374
],
"category_id": 197,
"id": 14387,
"image_id": "000000448376",
"iscrowd": 0,
"segmentation": {
"counts": "0m9S:1O00001O001O001O001O1O001O001O1O2N1O1O1O1O1O1O1O1O001O001O1O001O1O1O001O1O1O2N1O001O001O001O00001O00... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000448697 | 000000448697.jpg | {
"data_source": "COCONut",
"file_name": "000000448697.jpg",
"height": 640,
"id": "000000448697",
"width": 480
} | {
"caption": "The iconic stone clock tower of Big Ben rises into a pale, overcast sky, it's two ornate clock faces on full display, while a large traffic light showing a red signal and an adjacent smaller traffic light dominate the immediate foreground from a low-angle perspective.",
"caption_ann": "The iconic <1:stone clock tower of Big Ben> rises into a <0:pale, overcast sky>, it's <2,3:two ornate clock faces> on full display, while a <4:large traffic light> showing a red signal and an adjacent <5:smaller traffic light> dominate the immediate foreground from a low-angle perspective.",
"id": 1218,
"image_id": "000000448697",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "stone clock tower of Big Ben"
},
{
"mask_ids": [
0
],
"txt_desc": "pale, overcast sky"
},
{
"mask_ids": [
2,
3
],
"txt_desc": "two ornate clock faces"
},
{
"mask_ids": [
4
],
"txt_desc": "large traffic light"
},
{
"mask_ids": [
5
],
"txt_desc": "smaller traffic light"
}
],
"labels": [
"sky-other-merged",
"building-other-merged",
"clock",
"clock",
"traffic light",
"traffic light"
]
} | [
{
"area": 192574,
"bbox": [
0,
0,
480,
640
],
"category_id": 187,
"id": 14393,
"image_id": "000000448697",
"iscrowd": 0,
"segmentation": {
"counts": "0da0\\2O11ONeM_^O[2ca0000N22N00N22N00O12N00N2002N1ON2BgMP_O^2k`0`MQ_Ok2o`03N2N2O1M3L4JgLb_OO0[3\\`07M3G[... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000448698 | 000000448698.jpg | {
"data_source": "COCONut",
"file_name": "000000448698.jpg",
"height": 480,
"id": "000000448698",
"width": 640
} | {
"caption": "On a vibrant green grassy field, a vintage white airplane with the tail number NC 18130 is parked under a vivid blue sky dotted with clouds. The airfield in the background features several buildings, including a gray barn-like house and a gabled structure, surrounded by trees. In the distance, another yellow airplane is partially visible on a paved area near some low-lying hills.",
"caption_ann": "On a <4:vibrant green grassy field>, a <7:vintage white airplane with the tail number NC 18130> is parked under a <0:vivid blue sky dotted with clouds>. The airfield in the background features several buildings, including a <3:gray barn-like house> and a <2:gabled structure>, surrounded by <5:trees>. In the distance, another <8:yellow airplane> is partially visible on a <1:paved area> near some <6:low-lying hills>.",
"id": 1219,
"image_id": "000000448698",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "vibrant green grassy field"
},
{
"mask_ids": [
7
],
"txt_desc": "vintage white airplane with the tail number NC 18130"
},
{
"mask_ids": [
0
],
"txt_desc": "vivid blue sky dotted with clouds"
},
{
"mask_ids": [
3
],
"txt_desc": "gray barn-like house"
},
{
"mask_ids": [
2
],
"txt_desc": "gabled structure"
},
{
"mask_ids": [
5
],
"txt_desc": "trees"
},
{
"mask_ids": [
8
],
"txt_desc": "yellow airplane"
},
{
"mask_ids": [
1
],
"txt_desc": "paved area"
},
{
"mask_ids": [
6
],
"txt_desc": "low-lying hills"
}
],
"labels": [
"sky-other-merged",
"road",
"building-other-merged",
"house",
"grass-merged",
"tree-merged",
"mountain-merged",
"airplane",
"airplane"
]
} | [
{
"area": 113486,
"bbox": [
0,
0,
640,
200
],
"category_id": 187,
"id": 14399,
"image_id": "000000448698",
"iscrowd": 0,
"segmentation": {
"counts": "0V6j81OO11O1OO10000001OO10000FXG\\Jh8d5XG\\Jh8d5YG\\Jf8e5YGZJh8d5XGUJ08g8e5ZGUJON00h8g5XG\\J2Oj8e5TG\\J3... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000448701 | 000000448701.jpg | {
"data_source": "COCONut",
"file_name": "000000448701.jpg",
"height": 640,
"id": "000000448701",
"width": 546
} | {
"caption": "On a sunlit, snow-covered slope, a person dressed in a light blue and gray jacket, black pants, and a helmet joyfully raises their arms while standing on a two skis beside a dense grove of bare trees.",
"caption_ann": "On a <1:sunlit, snow-covered slope>, a <2:person dressed in a light blue and gray jacket, black pants, and a helmet> joyfully raises their arms while standing on a <3,4:two skis> beside a <0:dense grove of bare trees>.",
"id": 1220,
"image_id": "000000448701",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "sunlit, snow-covered slope"
},
{
"mask_ids": [
2
],
"txt_desc": "person dressed in a light blue and gray jacket, black pants, and a helmet"
},
{
"mask_ids": [
3,
4
],
"txt_desc": "two skis"
},
{
"mask_ids": [
0
],
"txt_desc": "dense grove of bare trees"
}
],
"labels": [
"tree-merged",
"snow",
"person",
"skis",
"skis"
]
} | [
{
"area": 164776,
"bbox": [
0,
0,
546,
526
],
"category_id": 184,
"id": 14408,
"image_id": "000000448701",
"iscrowd": 0,
"segmentation": {
"counts": "0l0Tc01O001O00O12N2N3M001O001OO100L4O1N2L400O1M3M3O1001O001OO11O00O100001O00O13MO11O1OO1O11OO12NO11O0000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000449108 | 000000449108.jpg | {
"data_source": "COCONut",
"file_name": "000000449108.jpg",
"height": 519,
"id": "000000449108",
"width": 640
} | {
"caption": "In a grassy enclosure bordered by a wire fence, a zebra with its head down grazes in the foreground. In the background, a large, dark-feathered ostrich rests on the ground, while a vibrant blue peacock stands near the fence.",
"caption_ann": "In a <0:grassy enclosure> bordered by a <1:wire fence>, a <4:zebra with its head down> grazes in the foreground. In the background, a <2:large, dark-feathered ostrich> rests on the ground, while a <3:vibrant blue peacock> stands near the <1:fence>.",
"id": 1221,
"image_id": "000000449108",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "grassy enclosure"
},
{
"mask_ids": [
1
],
"txt_desc": "wire fence"
},
{
"mask_ids": [
4
],
"txt_desc": "zebra with its head down"
},
{
"mask_ids": [
2
],
"txt_desc": "large, dark-feathered ostrich"
},
{
"mask_ids": [
3
],
"txt_desc": "vibrant blue peacock"
},
{
"mask_ids": [
1
],
"txt_desc": "fence"
}
],
"labels": [
"grass-merged",
"fence-merged",
"bird",
"bird",
"zebra"
]
} | [
{
"area": 196938,
"bbox": [
0,
117,
640,
402
],
"category_id": 193,
"id": 14413,
"image_id": "000000449108",
"iscrowd": 0,
"segmentation": {
"counts": "i:^5f:3O1N2N2N2K5O1M3M3M3K5I7G9F:K5N2M3N2O1O1N2M3N2M3M3N2N2M3L4M3N2K5M3N2N2N200N2O1O100O1O1O1O1000000O... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000449377 | 000000449377.jpg | {
"data_source": "COCONut",
"file_name": "000000449377.jpg",
"height": 357,
"id": "000000449377",
"width": 500
} | {
"caption": "A black-and-white photograph captures an intimate scene of a man and woman sitting together on a long, stone bench in what appears to be a piazza. The woman on the left, wearing a black jacket, leans their head affectionately against the man on the right, who is wearing a relatively lighter jacket. The bench is placed on a paved ground surface in the foreground. Behind them stands an old building with a weathered stone facade. In the center of the building is a large, arched doorway containing a dark, paneled wooden door. On the left and the right of the door,there are two windows with dark grilles - one on each side.",
"caption_ann": "A black-and-white photograph captures an intimate scene of a <4,6:man and woman> sitting together on a <5:long, stone bench> in what appears to be a piazza. The <4:woman on the left, wearing a black jacket>, leans their head affectionately against the <6:man on the right, who is wearing a relatively lighter jacket>. The <5:bench> is placed on a <1:paved ground surface> in the foreground. Behind them stands an <0:old building with a weathered stone facade>. In the center of the building is a large, arched doorway containing a <3:dark, paneled wooden door>. On the left and the right of the door,there are <2:two windows with dark grilles - one on each side>.",
"id": 1222,
"image_id": "000000449377",
"label_matched": [
{
"mask_ids": [
4,
6
],
"txt_desc": "man and woman"
},
{
"mask_ids": [
5
],
"txt_desc": "long, stone bench"
},
{
"mask_ids": [
4
],
"txt_desc": "woman on the left, wearing a black jacket"
},
{
"mask_ids": [
6
],
"txt_desc": "man on the right, who is wearing a relatively lighter jacket"
},
{
"mask_ids": [
5
],
"txt_desc": "bench"
},
{
"mask_ids": [
1
],
"txt_desc": "paved ground surface"
},
{
"mask_ids": [
0
],
"txt_desc": "old building with a weathered stone facade"
},
{
"mask_ids": [
3
],
"txt_desc": "dark, paneled wooden door"
},
{
"mask_ids": [
2
],
"txt_desc": "two windows with dark grilles - one on each side"
}
],
"labels": [
"building-other-merged",
"pavement-merged",
"window-other",
"door-stuff",
"person",
"bench",
"person"
]
} | [
{
"area": 90742,
"bbox": [
0,
0,
500,
224
],
"category_id": 197,
"id": 14418,
"image_id": "000000449377",
"iscrowd": 0,
"segmentation": {
"counts": "0l6Y400000000000000000000000000000000000000000000000000000000000000O10000`IPL:Me4S4jJiKNU1W5d40000O1O1O1O... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000449879 | 000000449879.jpg | {
"data_source": "COCONut",
"file_name": "000000449879.jpg",
"height": 500,
"id": "000000449879",
"width": 375
} | {
"caption": "A man with long, wavy brown hair wearing a black leather jacket and blue jeans sits astride a chrome motorcycle and tenderly cradles a small black kitten against his chest. The man's attire is completed by blue jeans, with their right leg resting on the pavement. The setting is outdoors, in front of a red brick wall that has a grey window shutter and a white window door on it. The scene is framed by green bushes and grass in the background.",
"caption_ann": "A <7:man with long, wavy brown hair wearing a black leather jacket and blue jeans> sits astride a <6:chrome motorcycle> and tenderly cradles a <8:small black kitten> against his chest. The <7:man's> attire is completed by blue jeans, with their right leg resting on the <3:pavement>. The setting is outdoors, in front of a <0:red brick wall> that has a <2:grey window shutter> and a <4:white window door on it>. The scene is framed by <1:green bushes> and <5:grass> in the background.",
"id": 1223,
"image_id": "000000449879",
"label_matched": [
{
"mask_ids": [
7
],
"txt_desc": "man with long, wavy brown hair wearing a black leather jacket and blue jeans"
},
{
"mask_ids": [
6
],
"txt_desc": "chrome motorcycle"
},
{
"mask_ids": [
8
],
"txt_desc": "small black kitten"
},
{
"mask_ids": [
7
],
"txt_desc": "man's"
},
{
"mask_ids": [
3
],
"txt_desc": "pavement"
},
{
"mask_ids": [
0
],
"txt_desc": "red brick wall"
},
{
"mask_ids": [
2
],
"txt_desc": "grey window shutter"
},
{
"mask_ids": [
4
],
"txt_desc": "white window door on it"
},
{
"mask_ids": [
1
],
"txt_desc": "green bushes"
},
{
"mask_ids": [
5
],
"txt_desc": "grass"
}
],
"labels": [
"wall-brick",
"tree-merged",
"window-blind",
"pavement-merged",
"window-other",
"grass-merged",
"motorcycle",
"person",
"cat"
]
} | [
{
"area": 13907,
"bbox": [
25,
0,
350,
109
],
"category_id": 171,
"id": 14425,
"image_id": "000000449879",
"iscrowd": 0,
"segmentation": {
"counts": "bY<1Z?901O00010O1O1O00010O00001O001O001O1O001O001O001O00000000000000O2N1O1O1M3N3N2M_]3MdbL2O100000001N2O... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000450737 | 000000450737.jpg | {
"data_source": "COCONut",
"file_name": "000000450737.jpg",
"height": 640,
"id": "000000450737",
"width": 427
} | {
"caption": "A man with short dark hair and glasses, wearing a olive green graphic t-shirt, takes a bite of a slice of cheese and tomato pizza while holding a white paper napkin in left hand. In the foreground, the top of a clear plastic water bottle is visible. The entire scene is set against the backdrop of a large, ornate building with a green and gray patterned marble facade, captured from a low angle.",
"caption_ann": "A <2:man with short dark hair and glasses, wearing a olive green graphic t-shirt,> takes a bite of a <4:slice of cheese and tomato pizza> while holding a <1:white paper napkin> in left hand. In the foreground, the top of a <3:clear plastic water bottle> is visible. The entire scene is set against the backdrop of a <0:large, ornate building with a green and gray patterned marble facade>, captured from a low angle.",
"id": 1224,
"image_id": "000000450737",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "man with short dark hair and glasses, wearing a olive green graphic t-shirt,"
},
{
"mask_ids": [
4
],
"txt_desc": "slice of cheese and tomato pizza"
},
{
"mask_ids": [
1
],
"txt_desc": "white paper napkin"
},
{
"mask_ids": [
3
],
"txt_desc": "clear plastic water bottle"
},
{
"mask_ids": [
0
],
"txt_desc": "large, ornate building with a green and gray patterned marble facade"
}
],
"labels": [
"building-other-merged",
"paper-merged",
"person",
"bottle",
"pizza"
]
} | [
{
"area": 195317,
"bbox": [
0,
0,
427,
640
],
"category_id": 197,
"id": 14434,
"image_id": "000000450737",
"iscrowd": 0,
"segmentation": {
"counts": "0ba0m1?4L3_^OnMQa0U2j^OnMTa0`2N2O1O1O1N2O1O1O1O1O100O1O1O1O1O1O100O100O100M3@`0O1O1K5I7K5K5L4K5K5M3M3N2N... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000450759 | 000000450759.jpg | {
"data_source": "COCONut",
"file_name": "000000450759.jpg",
"height": 427,
"id": "000000450759",
"width": 640
} | {
"caption": "The image features a textured stone wall that spans the upper portion. In the foreground, a dark green metal bench is positioned, and a person dressed in a white t-shirt and blue jeans is lying on the bench with their head resting on their hand, casting a distinct shadow on the light-coloured pavement below.",
"caption_ann": "The image features a <0:textured stone wall> that spans the upper portion. In the foreground, a <3:dark green metal bench> is positioned, and a <2:person dressed in a white t-shirt and blue jeans> is lying on the <3:bench> with their head resting on their hand, casting a distinct shadow on the <1:light-coloured pavement> below.",
"id": 1225,
"image_id": "000000450759",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "textured stone wall"
},
{
"mask_ids": [
3
],
"txt_desc": "dark green metal bench"
},
{
"mask_ids": [
2
],
"txt_desc": "person dressed in a white t-shirt and blue jeans"
},
{
"mask_ids": [
3
],
"txt_desc": "bench"
},
{
"mask_ids": [
1
],
"txt_desc": "light-coloured pavement"
}
],
"labels": [
"wall-stone",
"pavement-merged",
"person",
"bench"
]
} | [
{
"area": 222832,
"bbox": [
0,
0,
640,
386
],
"category_id": 175,
"id": 14439,
"image_id": "000000450759",
"iscrowd": 0,
"segmentation": {
"counts": "0R<Y10000000000000000000000000000000000000000O1000000000000000000000000000000000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000450840 | 000000450840.jpg | {
"data_source": "COCONut",
"file_name": "000000450840.jpg",
"height": 440,
"id": "000000450840",
"width": 640
} | {
"caption": "The image shows a young man in a light blue t-shirt and plaid shorts asleep on a white sofa, resting his head on a black backpack next to him. A black Sony Vaio laptop is open and resting on his lap while his left arm is extended, holding onto the laptop.",
"caption_ann": "The image shows a <2:young man in a light blue t-shirt and plaid shorts> asleep on a <3:white sofa>, resting his head on a <0:black backpack> next to him. A <1:black Sony Vaio laptop> is open and resting on his lap while his left arm is extended, holding onto the <1:laptop>.",
"id": 1226,
"image_id": "000000450840",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "young man in a light blue t-shirt and plaid shorts"
},
{
"mask_ids": [
3
],
"txt_desc": "white sofa"
},
{
"mask_ids": [
0
],
"txt_desc": "black backpack"
},
{
"mask_ids": [
1
],
"txt_desc": "black Sony Vaio laptop"
},
{
"mask_ids": [
1
],
"txt_desc": "laptop"
}
],
"labels": [
"backpack",
"laptop",
"person",
"couch"
]
} | [
{
"area": 24230,
"bbox": [
0,
102,
165,
224
],
"category_id": 27,
"id": 14443,
"image_id": "000000450840",
"iscrowd": 0,
"segmentation": {
"counts": "j88]=4L4K6H7I7J6I8I6J6K6J6K4K6J6K4L4L3L5L3M4L3M3L4M4L3M3M3M3M4L3N2M3M3M4M2M3M3N2M4M2M3N3L3N2M4M2M3N3L3O2... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000451038 | 000000451038.jpg | {
"data_source": "COCONut",
"file_name": "000000451038.jpg",
"height": 500,
"id": "000000451038",
"width": 388
} | {
"caption": "A black and white image focuses on a person's hand holding a silver knife above a gray and speckled apple. The tip of the knife is positioned to pierce the top of the apple, near its stem. The whole scene is set against a plain, light background.",
"caption_ann": "A black and white image focuses on a <1:person's hand> holding a <2:silver knife> above a <3:gray and speckled apple>. The tip of the <2:knife> is positioned to pierce the top of the <3:apple>, near its stem. The whole scene is set against a <0:plain, light background>.",
"id": 1227,
"image_id": "000000451038",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "person's hand"
},
{
"mask_ids": [
2
],
"txt_desc": "silver knife"
},
{
"mask_ids": [
3
],
"txt_desc": "gray and speckled apple"
},
{
"mask_ids": [
2
],
"txt_desc": "knife"
},
{
"mask_ids": [
3
],
"txt_desc": "apple"
},
{
"mask_ids": [
0
],
"txt_desc": "plain, light background"
}
],
"labels": [
"wall-other-merged",
"person",
"knife",
"apple"
]
} | [
{
"area": 61276,
"bbox": [
0,
0,
388,
500
],
"category_id": 199,
"id": 14447,
"image_id": "000000451038",
"iscrowd": 0,
"segmentation": {
"counts": "P9d6Q9O4L2N1O1O1O1O1O1O1O1O2N1OZJoIj1P6WNRJg1m5YNUJf1j5[NXJc1g5]N\\Ja1c5`N_J^1`5bNbJ]1]5dNdJ\\1Z5eNhJY1W5... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000451420 | 000000451420.jpg | {
"data_source": "COCONut",
"file_name": "000000451420.jpg",
"height": 425,
"id": "000000451420",
"width": 640
} | {
"caption": "The image shows a large goose with a long black and white neck bending its head down to the water's surface, surrounded by six gooselings. A gooseling is near the large goose, also with its head down, as if drinking water from the water body. To the right of the gooseling, another gooseling is resting in the water. A two gooselings is positioned directly to the left of the gooseling. In the top of the image, two gooselings are wading in the water.",
"caption_ann": "The image shows a <1:large goose with a long black and white neck> bending its head down to the <0:water's surface>, surrounded by <2,3,4,5,6,7:six gooselings>. A <2:gooseling> is near the <1:large goose>, also with its head down, as if drinking water from the <0:water body>. To the right of the <2:gooseling>, another <3:gooseling> is resting in the <0:water>. A <4,5:two gooselings> is positioned directly to the left of the <2:gooseling>. In the top of the image, <6,7:two gooselings> are wading in the <0:water>.",
"id": 1228,
"image_id": "000000451420",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "large goose with a long black and white neck"
},
{
"mask_ids": [
0
],
"txt_desc": "water's surface"
},
{
"mask_ids": [
2,
3,
4,
5,
6,
7
],
"txt_desc": "six gooselings"
},
{
"mask_ids": [
2
],
"txt_desc": "gooseling"
},
{
"mask_ids": [
1
],
"txt_desc": "large goose"
},
{
"mask_ids": [
0
],
"txt_desc": "water body"
},
{
"mask_ids": [
2
],
"txt_desc": "gooseling"
},
{
"mask_ids": [
3
],
"txt_desc": "gooseling"
},
{
"mask_ids": [
0
],
"txt_desc": "water"
},
{
"mask_ids": [
4,
5
],
"txt_desc": "two gooselings"
},
{
"mask_ids": [
2
],
"txt_desc": "gooseling"
},
{
"mask_ids": [
6,
7
],
"txt_desc": "two gooselings"
},
{
"mask_ids": [
0
],
"txt_desc": "water"
}
],
"labels": [
"river",
"bird",
"bird",
"bird",
"bird",
"bird",
"bird",
"bird"
]
} | [
{
"area": 170363,
"bbox": [
0,
0,
640,
425
],
"category_id": 148,
"id": 14451,
"image_id": "000000451420",
"iscrowd": 0,
"segmentation": {
"counts": "0W`;2a`D1a;4\\D2\\;4_D0\\;9\\DJa;:\\DGd;o0O2N1O2N1O100O100O2O0O101N101O0O2O0O2O00010O001O01O01O0001O00O1... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000452781 | 000000452781.jpg | {
"data_source": "COCONut",
"file_name": "000000452781.jpg",
"height": 427,
"id": "000000452781",
"width": 640
} | {
"caption": "A young woman in a gray long-sleeved jacket and dark shorts is sitting on a set of stone stairs. The woman is turned slightly to the right, looking down, speaking on the black cell phone while resting her left hand on her head. The stairs lead up to a pair of large, wooden double doors. A portion of a light-colored stone building with a protruding column is visible to the right, while the gray pavement is at the bottom of the image.",
"caption_ann": "A <4:young woman in a gray long-sleeved jacket and dark shorts> is sitting on a set of <1:stone stairs>. The <4:woman> is turned slightly to the right, looking down, speaking on the <5:black cell phone> while resting her left hand on her head. The <1:stairs> lead up to a pair of large, <2:wooden double doors>. A portion of a <0:light-colored stone building with a protruding column> is visible to the right, while the <3:gray pavement> is at the bottom of the image.",
"id": 1229,
"image_id": "000000452781",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "young woman in a gray long-sleeved jacket and dark shorts"
},
{
"mask_ids": [
1
],
"txt_desc": "stone stairs"
},
{
"mask_ids": [
4
],
"txt_desc": "woman"
},
{
"mask_ids": [
5
],
"txt_desc": "black cell phone"
},
{
"mask_ids": [
1
],
"txt_desc": "stairs"
},
{
"mask_ids": [
2
],
"txt_desc": "wooden double doors"
},
{
"mask_ids": [
0
],
"txt_desc": "light-colored stone building with a protruding column"
},
{
"mask_ids": [
3
],
"txt_desc": "gray pavement"
}
],
"labels": [
"wall-other-merged",
"stairs",
"door-stuff",
"pavement-merged",
"person",
"cell phone"
]
} | [
{
"area": 77975,
"bbox": [
386,
0,
254,
391
],
"category_id": 199,
"id": 14459,
"image_id": "000000452781",
"iscrowd": 0,
"segmentation": {
"counts": "gnP52X=g0iBMj;i0100001?@Zj5VO`VJ7I8H<D7I6J6J8H8H6J8H;E6cEnLU:W3M6J:F7I3kFPLi8o3WGULf8j3XGZL6GU8m3fG_L4G... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000452824 | 000000452824.jpg | {
"data_source": "COCONut",
"file_name": "000000452824.jpg",
"height": 500,
"id": "000000452824",
"width": 326
} | {
"caption": "The image features a clear glass vase filled with small, smooth pebbles of various colors, including white, gray, and brown. Four autumn leaves with jagged edges, in shades of yellow and red, are seen. The vase is placed on a dark, flat surface, which contrasts with the white wall with shadows behind it.",
"caption_ann": "The image features a <3:clear glass vase filled with small, smooth pebbles of various colors>, including white, gray, and brown. Four <2:autumn leaves with jagged edges>, in shades of yellow and red, are seen. The <3:vase> is placed on a <1:dark, flat surface>, which contrasts with the <0:white wall with shadows> behind it.",
"id": 1230,
"image_id": "000000452824",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "clear glass vase filled with small, smooth pebbles of various colors"
},
{
"mask_ids": [
2
],
"txt_desc": "autumn leaves with jagged edges"
},
{
"mask_ids": [
3
],
"txt_desc": "vase"
},
{
"mask_ids": [
1
],
"txt_desc": "dark, flat surface"
},
{
"mask_ids": [
0
],
"txt_desc": "white wall with shadows"
}
],
"labels": [
"wall-other-merged",
"table-merged",
"tree-merged",
"vase"
]
} | [
{
"area": 87311,
"bbox": [
0,
0,
326,
432
],
"category_id": 199,
"id": 14465,
"image_id": "000000452824",
"iscrowd": 0,
"segmentation": {
"counts": "0`=T2O100O10000O10000O10000O10000O100O100N200O10000O100O1N200O1O1000000O100YM\\M]He2b7^MjEM0000\\2e2i7fMV... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000452866 | 000000452866.jpg | {
"data_source": "COCONut",
"file_name": "000000452866.jpg",
"height": 612,
"id": "000000452866",
"width": 612
} | {
"caption": "A gray-haired man wearing a striped blue and white shirt and a gray-haired woman wearing a tank top and gold chain with pendant are standing side by side, with the man holding a cake server to cut a cake on a dining table with the assistance of the woman. To the right of the woman is a black office chair. In the background, there is a light gray wall, a portion of a white curtain, and a dark rug is visible at the bottom left of the picture.",
"caption_ann": "A <7:gray-haired man wearing a striped blue and white shirt> and a <6:gray-haired woman wearing a tank top and gold chain with pendant> are standing side by side, with the <7:man> holding a <5:cake server> to cut a <4:cake> on a <3:dining table> with the assistance of the <6:woman>. To the right of the <6:woman> is a <8:black office chair>. In the background, there is a <1:light gray wall>, a portion of a <2:white curtain>, and a <0:dark rug> is visible at the bottom left of the picture.",
"id": 1231,
"image_id": "000000452866",
"label_matched": [
{
"mask_ids": [
7
],
"txt_desc": "gray-haired man wearing a striped blue and white shirt"
},
{
"mask_ids": [
6
],
"txt_desc": "gray-haired woman wearing a tank top and gold chain with pendant"
},
{
"mask_ids": [
7
],
"txt_desc": "man"
},
{
"mask_ids": [
5
],
"txt_desc": "cake server"
},
{
"mask_ids": [
4
],
"txt_desc": "cake"
},
{
"mask_ids": [
3
],
"txt_desc": "dining table"
},
{
"mask_ids": [
6
],
"txt_desc": "woman"
},
{
"mask_ids": [
6
],
"txt_desc": "woman"
},
{
"mask_ids": [
8
],
"txt_desc": "black office chair"
},
{
"mask_ids": [
1
],
"txt_desc": "light gray wall"
},
{
"mask_ids": [
2
],
"txt_desc": "white curtain"
},
{
"mask_ids": [
0
],
"txt_desc": "dark rug"
}
],
"labels": [
"rug-merged",
"wall-other-merged",
"curtain",
"dining table",
"cake",
"knife",
"person",
"person",
"chair"
]
} | [
{
"area": 10579,
"bbox": [
0,
201,
58,
353
],
"category_id": 200,
"id": 14469,
"image_id": "000000452866",
"iscrowd": 0,
"segmentation": {
"counts": "Y6Q;X8K4L3M3M3M5K5K5J4N2L6K5K3M2N3M3M4L<D=Cc0]Ok0UOW1hN2O1O2M2O1O2N1O3L4M4L3M4K4M4L3Ma0_Oa0_O5K4K6K5K4K6... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000454072 | 000000454072.jpg | {
"data_source": "COCONut",
"file_name": "000000454072.jpg",
"height": 480,
"id": "000000454072",
"width": 640
} | {
"caption": "The image features a green Stella Artois bottle and black-handled scissors resting on a brown wooden table next to a baking tray full of ravioli.",
"caption_ann": "The image features a <2:green Stella Artois bottle> and <3:black-handled scissors> resting on a <0:brown wooden table> next to a <1:baking tray full of ravioli>.",
"id": 1232,
"image_id": "000000454072",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "green Stella Artois bottle"
},
{
"mask_ids": [
3
],
"txt_desc": "black-handled scissors"
},
{
"mask_ids": [
0
],
"txt_desc": "brown wooden table"
},
{
"mask_ids": [
1
],
"txt_desc": "baking tray full of ravioli"
}
],
"labels": [
"table-merged",
"food-other-merged",
"bottle",
"scissors"
]
} | [
{
"area": 161868,
"bbox": [
0,
0,
640,
480
],
"category_id": 189,
"id": 14478,
"image_id": "000000454072",
"iscrowd": 0,
"segmentation": {
"counts": "8h>80M3O100000000O10000000000000000000000O100O10000000000000000000000000000000ei81YVG2N3M2N3L3N2N2N2N1N3... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000454449 | 000000454449.jpg | {
"data_source": "COCONut",
"file_name": "000000454449.jpg",
"height": 426,
"id": "000000454449",
"width": 640
} | {
"caption": "Image displays a grey and white owl with bright yellow eye sits in the green grass next to a person wearing gray pants.",
"caption_ann": "Image displays a <1:grey and white owl with bright yellow eye> sits in the <0:green grass> next to a <2:person wearing gray pants>.",
"id": 1233,
"image_id": "000000454449",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "grey and white owl with bright yellow eye"
},
{
"mask_ids": [
0
],
"txt_desc": "green grass"
},
{
"mask_ids": [
2
],
"txt_desc": "person wearing gray pants"
}
],
"labels": [
"grass-merged",
"bird",
"person"
]
} | [
{
"area": 187886,
"bbox": [
0,
0,
628,
426
],
"category_id": 193,
"id": 14482,
"image_id": "000000454449",
"iscrowd": 0,
"segmentation": {
"counts": "0ika13^a^N1N2O1N2O2M2O1N2O1N10O0100O100O0kC^Oc;c0ZDBb;`0\\DC`;`0^DC_;?^DE_;<`DG^;:_DJ_;7_DL_;Q1N2N2N2O1N... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000454543 | 000000454543.jpg | {
"data_source": "COCONut",
"file_name": "000000454543.jpg",
"height": 640,
"id": "000000454543",
"width": 361
} | {
"caption": "Image displays a toddler in a dark blue and yellow top reaches for a blue hairdryer held by a hand of a person over a concrete floor and in a background of a white wall with a white sports ball in front of it.",
"caption_ann": "Image displays a <5:toddler in a dark blue and yellow top> reaches for a <2:blue hairdryer> held by a <4:hand of a person> over a <0:concrete floor> and in a background of a <1:white wall> with a <3:white sports ball> in front of it.",
"id": 1234,
"image_id": "000000454543",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "toddler in a dark blue and yellow top"
},
{
"mask_ids": [
2
],
"txt_desc": "blue hairdryer"
},
{
"mask_ids": [
4
],
"txt_desc": "hand of a person"
},
{
"mask_ids": [
0
],
"txt_desc": "concrete floor"
},
{
"mask_ids": [
1
],
"txt_desc": "white wall"
},
{
"mask_ids": [
3
],
"txt_desc": "white sports ball"
}
],
"labels": [
"floor-other-merged",
"wall-other-merged",
"hair drier",
"sports ball",
"person",
"person"
]
} | [
{
"area": 65391,
"bbox": [
0,
236,
361,
404
],
"category_id": 190,
"id": 14485,
"image_id": "000000454543",
"iscrowd": 0,
"segmentation": {
"counts": "Rb0n1Sb0O2N1O2N3M2N3M3M2N3M2N1O3M2N1O2N1O1O1O1O1O1O1O001O1O001O1O1O1O001O001O001O001O00001O001O1O001O00... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000456446 | 000000456446.jpg | {
"data_source": "COCONut",
"file_name": "000000456446.jpg",
"height": 414,
"id": "000000456446",
"width": 640
} | {
"caption": "Image displays a person in a dark wetsuit holds a large sail with red and black markings while riding a board with a slightly visible person on the choppy sea, with a distant mountain range under a hazy sky and waves crashing near a rocky pier in the background.",
"caption_ann": "Image displays a <4:person in a dark wetsuit> holds a <5:large sail with red and black markings> while riding a board with a <6:slightly visible person> on the <1:choppy sea>, with a <2:distant mountain range> under a <0:hazy sky> and waves crashing near a <3:rocky pier> in the background.",
"id": 1235,
"image_id": "000000456446",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "person in a dark wetsuit"
},
{
"mask_ids": [
5
],
"txt_desc": "large sail with red and black markings"
},
{
"mask_ids": [
6
],
"txt_desc": "slightly visible person"
},
{
"mask_ids": [
1
],
"txt_desc": "choppy sea"
},
{
"mask_ids": [
2
],
"txt_desc": "distant mountain range"
},
{
"mask_ids": [
0
],
"txt_desc": "hazy sky"
},
{
"mask_ids": [
3
],
"txt_desc": "rocky pier"
}
],
"labels": [
"sky-other-merged",
"sea",
"mountain-merged",
"rock-merged",
"person",
"boat",
"person"
]
} | [
{
"area": 49088,
"bbox": [
0,
0,
640,
103
],
"category_id": 187,
"id": 14491,
"image_id": "000000456446",
"iscrowd": 0,
"segmentation": {
"counts": "0^1`;00001O00001O0000001O00001O00001O1O1O1O1O1O001O00O11O000000000000O1001O000000000000000000O10000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000456732 | 000000456732.jpg | {
"data_source": "COCONut",
"file_name": "000000456732.jpg",
"height": 640,
"id": "000000456732",
"width": 480
} | {
"caption": "Image displays a white airplane toilet with the lid and seat up, positioned against a white wall with a flush button and beneath a brightly lit oval window, all on a dark-colored floor.",
"caption_ann": "Image displays a <3:white airplane toilet with the lid and seat up>, positioned against a <1:white wall with a flush button> and beneath a <2:brightly lit oval window>, all on a <0:dark-colored floor>.",
"id": 1236,
"image_id": "000000456732",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "white airplane toilet with the lid and seat up"
},
{
"mask_ids": [
1
],
"txt_desc": "white wall with a flush button"
},
{
"mask_ids": [
2
],
"txt_desc": "brightly lit oval window"
},
{
"mask_ids": [
0
],
"txt_desc": "dark-colored floor"
}
],
"labels": [
"floor-other-merged",
"wall-other-merged",
"window-other",
"toilet"
]
} | [
{
"area": 21802,
"bbox": [
65,
543,
415,
97
],
"category_id": 190,
"id": 14498,
"image_id": "000000456732",
"iscrowd": 0,
"segmentation": {
"counts": "mWY13jc03M3M3N2LDd\\O<Xc0Hg\\O9Wc09M3M3N2L4L4N2M3N2M3N2M3N2M3M3N2L4M3M3M3M3M3M3O1O1N2O1O100O100O1O10000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000457085 | 000000457085.jpg | {
"data_source": "COCONut",
"file_name": "000000457085.jpg",
"height": 426,
"id": "000000457085",
"width": 640
} | {
"caption": "Image displays a long-haired, dark-colored cat lying down on a window next to a fluffy, black-and-white cat that is sitting up, both positioned in front of a window that looks out on a dark-colored car, with the scene set against a burgundy-colored wall.",
"caption_ann": "Image displays a <3:long-haired, dark-colored cat> lying down on a <1:window> next to a <4:fluffy, black-and-white cat> that is sitting up, both positioned in front of a <1:window> that looks out on a <2:dark-colored car>, with the scene set against a <0:burgundy-colored wall>.",
"id": 1237,
"image_id": "000000457085",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "long-haired, dark-colored cat"
},
{
"mask_ids": [
1
],
"txt_desc": "window"
},
{
"mask_ids": [
4
],
"txt_desc": "fluffy, black-and-white cat"
},
{
"mask_ids": [
1
],
"txt_desc": "window"
},
{
"mask_ids": [
2
],
"txt_desc": "dark-colored car"
},
{
"mask_ids": [
0
],
"txt_desc": "burgundy-colored wall"
}
],
"labels": [
"wall-other-merged",
"window-other",
"car",
"cat",
"cat"
]
} | [
{
"area": 59060,
"bbox": [
0,
0,
640,
426
],
"category_id": 199,
"id": 14502,
"image_id": "000000457085",
"iscrowd": 0,
"segmentation": {
"counts": "0cV36_VM5K8aF`0f4FVKk0X4XOfKm1U3TNjLQ3P2PMPN[4QMhJZ3o0DZ6cNhI\\1f6oL\\Hm0Q1S2Q9000001O0000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000458103 | 000000458103.jpg | {
"data_source": "COCONut",
"file_name": "000000458103.jpg",
"height": 427,
"id": "000000458103",
"width": 640
} | {
"caption": "A baseball player in a white and blue uniform with the number 19 leaps into the air, his body parallel to the ground, to make a spectacular catch. He extends his arm high, catching a baseball in his brown leather glove just in front of a tall, green outfield wall. The action takes place on the edge of the grassy baseball field. Beyond the wall, a white house and lush, green palm trees are visible in the background.",
"caption_ann": "A <4:baseball player in a white and blue uniform with the number 19> leaps into the air, his body parallel to the ground, to make a spectacular catch. He extends his arm high, catching a <6:baseball> in his <5:brown leather glove> just in front of a <1:tall, green outfield wall>. The action takes place on the edge of the <3:grassy baseball field>. Beyond the wall, a <0:white house> and <2:lush, green palm trees> are visible in the background.",
"id": 1238,
"image_id": "000000458103",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "baseball player in a white and blue uniform with the number 19"
},
{
"mask_ids": [
6
],
"txt_desc": "baseball"
},
{
"mask_ids": [
5
],
"txt_desc": "brown leather glove"
},
{
"mask_ids": [
1
],
"txt_desc": "tall, green outfield wall"
},
{
"mask_ids": [
3
],
"txt_desc": "grassy baseball field"
},
{
"mask_ids": [
0
],
"txt_desc": "white house"
},
{
"mask_ids": [
2
],
"txt_desc": "lush, green palm trees"
}
],
"labels": [
"house",
"wall-other-merged",
"tree-merged",
"playingfield",
"person",
"baseball glove",
"sports ball"
]
} | [
{
"area": 12166,
"bbox": [
0,
37,
251,
99
],
"category_id": 128,
"id": 14507,
"image_id": "000000458103",
"iscrowd": 0,
"segmentation": {
"counts": "Q28S=1N2O1N1O1O0010O010O10O02O0OhR<FamC2O4M2N3L2O002M2O1O00O00100O10O10001O0O100O10I\\CFb13X9=gFEW97nFIR9... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000458637 | 000000458637.jpg | {
"data_source": "COCONut",
"file_name": "000000458637.jpg",
"height": 640,
"id": "000000458637",
"width": 426
} | {
"caption": "Image displays a snowboarder in a multicolored jacket and orange pants captured mid-air on their snowboard against a clear blue sky, soaring above a large black banner on the snow-covered ground with two vertical banners visible on the right.",
"caption_ann": "Image displays a <4:snowboarder in a multicolored jacket and orange pants> captured mid-air on their <3:snowboard> against a <0:clear blue sky>, soaring above a <1:large black banner> on the <2:snow-covered ground> with <5,6:two vertical banners> visible on the right.",
"id": 1239,
"image_id": "000000458637",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "snowboarder in a multicolored jacket and orange pants"
},
{
"mask_ids": [
3
],
"txt_desc": "snowboard"
},
{
"mask_ids": [
0
],
"txt_desc": "clear blue sky"
},
{
"mask_ids": [
1
],
"txt_desc": "large black banner"
},
{
"mask_ids": [
2
],
"txt_desc": "snow-covered ground"
},
{
"mask_ids": [
5,
6
],
"txt_desc": "two vertical banners"
}
],
"labels": [
"sky-other-merged",
"building-other-merged",
"snow",
"snowboard",
"person",
"snowboard",
"snowboard"
]
} | [
{
"area": 204789,
"bbox": [
0,
0,
426,
606
],
"category_id": 187,
"id": 14514,
"image_id": "000000458637",
"iscrowd": 0,
"segmentation": {
"counts": "0R?n400000000000000000000000000000000000000000000000000000000001O000000000000000000001O000000000000001O0... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000459733 | 000000459733.jpg | {
"data_source": "COCONut",
"file_name": "000000459733.jpg",
"height": 428,
"id": "000000459733",
"width": 640
} | {
"caption": "In the foreground of the calm blue sea, two seagulls are perched on top of large brown rocks, while in the background a person in a black wetsuit paddles a long yellow surfboard with a small black and white dog standing at the front.",
"caption_ann": "In the foreground of the <0:calm blue sea>, <2,3:two seagulls> are perched on top of <1:large brown rocks>, while in the background a <5:person in a black wetsuit> paddles a <6:long yellow surfboard> with a <4:small black and white dog> standing at the front.",
"id": 1240,
"image_id": "000000459733",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "calm blue sea"
},
{
"mask_ids": [
2,
3
],
"txt_desc": "two seagulls"
},
{
"mask_ids": [
1
],
"txt_desc": "large brown rocks"
},
{
"mask_ids": [
5
],
"txt_desc": "person in a black wetsuit"
},
{
"mask_ids": [
6
],
"txt_desc": "long yellow surfboard"
},
{
"mask_ids": [
4
],
"txt_desc": "small black and white dog"
}
],
"labels": [
"sea",
"rock-merged",
"bird",
"bird",
"dog",
"person",
"surfboard"
]
} | [
{
"area": 252019,
"bbox": [
0,
0,
640,
428
],
"category_id": 155,
"id": 14521,
"image_id": "000000459733",
"iscrowd": 0,
"segmentation": {
"counts": "0_i24fcM4L5M2N101N100O100O10O02N1O1O100O100O100O10000O100O2O00000O10000O100O1O100O1O1O1O1O1O2N1O100O1O10... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000459823 | 000000459823.jpg | {
"data_source": "COCONut",
"file_name": "000000459823.jpg",
"height": 467,
"id": "000000459823",
"width": 640
} | {
"caption": "In a tranquil, rural setting, two dogs are out on a walk. A black and white dog on a leash has ventured into the shallow edge of a calm river, creating gentle ripples around its legs. On the grassy bank, a slender, brindle-colored whippet wearing a blue checkered coat stands on a leash and sniffs the ground. The overgrown green bank extends into the distance under a hazy, bright sky.",
"caption_ann": "In a tranquil, rural setting, <3,4:two dogs> are out on a walk. A <3:black and white dog> on a leash has ventured into the shallow edge of a calm <0:river>, creating gentle ripples around its legs. On the <1:grassy bank>, a <4:slender, brindle-colored whippet wearing a blue checkered coat> stands on a leash and sniffs the ground. The <1:overgrown green bank> extends into the distance under a <2:hazy, bright sky>.",
"id": 1241,
"image_id": "000000459823",
"label_matched": [
{
"mask_ids": [
3,
4
],
"txt_desc": "two dogs"
},
{
"mask_ids": [
3
],
"txt_desc": "black and white dog"
},
{
"mask_ids": [
0
],
"txt_desc": "river"
},
{
"mask_ids": [
1
],
"txt_desc": "grassy bank"
},
{
"mask_ids": [
4
],
"txt_desc": "slender, brindle-colored whippet wearing a blue checkered coat"
},
{
"mask_ids": [
1
],
"txt_desc": "overgrown green bank"
},
{
"mask_ids": [
2
],
"txt_desc": "hazy, bright sky"
}
],
"labels": [
"river",
"grass-merged",
"water-other",
"dog",
"dog"
]
} | [
{
"area": 133577,
"bbox": [
0,
45,
515,
422
],
"category_id": 148,
"id": 14528,
"image_id": "000000459823",
"iscrowd": 0,
"segmentation": {
"counts": "_1T=_10O1000000O1000000000000001O0000O1000000000000001O00000000000000001O0000000000001O0000001O00001O00... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000460997 | 000000460997.jpg | {
"data_source": "COCONut",
"file_name": "000000460997.jpg",
"height": 427,
"id": "000000460997",
"width": 640
} | {
"caption": "A young man wearing a black wetsuit is surfing in the sea. He appears to be losing his balance, with one leg kicked high in the air and arms spread wide, as he starts to fall from the surfboard attached to his ankle.",
"caption_ann": "A <1:young man wearing a black wetsuit> is surfing in the <0:sea>. He appears to be losing his balance, with one leg kicked high in the air and arms spread wide, as he starts to fall from the <2:surfboard> attached to his ankle.",
"id": 1242,
"image_id": "000000460997",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "young man wearing a black wetsuit"
},
{
"mask_ids": [
0
],
"txt_desc": "sea"
},
{
"mask_ids": [
2
],
"txt_desc": "surfboard"
}
],
"labels": [
"sea",
"person",
"surfboard"
]
} | [
{
"area": 263134,
"bbox": [
0,
0,
640,
427
],
"category_id": 155,
"id": 14533,
"image_id": "000000460997",
"iscrowd": 0,
"segmentation": {
"counts": "0Unb13P_]N6L2N4M2M2O2O2M7I3N4L3N1N2N2O03M3L2N2N1O2N2N1O2N0O2O0O2O001N2O0O1000O102M4M3L2O1N2N3N1N2O0O100O... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000465740 | 000000465740.jpg | {
"data_source": "COCONut",
"file_name": "000000465740.jpg",
"height": 480,
"id": "000000465740",
"width": 640
} | {
"caption": "The image shows a smiling person wearing white clothing equipped with a red backpack, standing on red and black skis on a snow-covered slope. The terrain around them consists of rocky patches breaking through the snow. In the background, expansive mountains stretch out into the distance under a clear blue sky, creating a scenic, high-altitude view.",
"caption_ann": "The image shows a <6:smiling person wearing white clothing> equipped with a <5:red backpack>, standing on <4:red and black skis> on a <1:snow-covered slope>. The terrain around them consists of <2:rocky patches> breaking through the <1:snow>. In the background, expansive <3:mountains> stretch out into the distance under a <0:clear blue sky>, creating a scenic, high-altitude view.",
"id": 1243,
"image_id": "000000465740",
"label_matched": [
{
"mask_ids": [
6
],
"txt_desc": "smiling person wearing white clothing"
},
{
"mask_ids": [
5
],
"txt_desc": "red backpack"
},
{
"mask_ids": [
4
],
"txt_desc": "red and black skis"
},
{
"mask_ids": [
1
],
"txt_desc": "snow-covered slope"
},
{
"mask_ids": [
2
],
"txt_desc": "rocky patches"
},
{
"mask_ids": [
1
],
"txt_desc": "snow"
},
{
"mask_ids": [
3
],
"txt_desc": "mountains"
},
{
"mask_ids": [
0
],
"txt_desc": "clear blue sky"
}
],
"labels": [
"sky-other-merged",
"snow",
"rock-merged",
"mountain-merged",
"skis",
"backpack",
"person"
]
} | [
{
"area": 24999,
"bbox": [
0,
0,
617,
81
],
"category_id": 187,
"id": 14536,
"image_id": "000000465740",
"iscrowd": 0,
"segmentation": {
"counts": "0m0S>0000000000001O00000000000000000000000000000000O100000000000000001O0000000000001O000000000000001O00000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000469512 | 000000469512.jpg | {
"data_source": "COCONut",
"file_name": "000000469512.jpg",
"height": 640,
"id": "000000469512",
"width": 480
} | {
"caption": "The image shows a small bird perched at the top of a thin tree branch against a clear blue sky. The bird has a light gray and white body, and a black stripe across its eye. ",
"caption_ann": "The image shows a <2:small bird> perched at the top of a <1:thin tree branch> against a <0:clear blue sky>. The <2:bird> has a light gray and white body, and a black stripe across its eye. ",
"id": 1244,
"image_id": "000000469512",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "small bird"
},
{
"mask_ids": [
1
],
"txt_desc": "thin tree branch"
},
{
"mask_ids": [
0
],
"txt_desc": "clear blue sky"
},
{
"mask_ids": [
2
],
"txt_desc": "bird"
}
],
"labels": [
"sky-other-merged",
"tree-merged",
"bird"
]
} | [
{
"area": 262555,
"bbox": [
0,
0,
480,
640
],
"category_id": 187,
"id": 14543,
"image_id": "000000469512",
"iscrowd": 0,
"segmentation": {
"counts": "0Wd`04do_O3L2O1O0O10O1001O1O001O1O010`\\OHWc0a0O1O1O1O001O001O00001O001O1O001O10O01O1O01O01O01O00000O100... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000470032 | 000000470032.jpg | {
"data_source": "COCONut",
"file_name": "000000470032.jpg",
"height": 489,
"id": "000000470032",
"width": 500
} | {
"caption": "The image shows a white parrot with a yellow crest hanging upside down from a wire, gripping it with its claws and beak. The background includes a clear sky and some trees with reddish foliage.",
"caption_ann": "The image shows a <2:white parrot with a yellow crest> hanging upside down from a wire, gripping it with its claws and beak. The background includes a <0:clear sky> and some <1:trees with reddish foliage>.",
"id": 1245,
"image_id": "000000470032",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "white parrot with a yellow crest"
},
{
"mask_ids": [
0
],
"txt_desc": "clear sky"
},
{
"mask_ids": [
1
],
"txt_desc": "trees with reddish foliage"
}
],
"labels": [
"sky-other-merged",
"tree-merged",
"bird"
]
} | [
{
"area": 84423,
"bbox": [
12,
0,
488,
489
],
"category_id": 187,
"id": 14546,
"image_id": "000000470032",
"iscrowd": 0,
"segmentation": {
"counts": "PR64T92gLNU37iLLS38jLKQ39nLG\\LHa4c0ROGYLKb4`0TOFWLMd4>TOEWLNd0Gb1f0c1EUL0f0G_1f0e1CVL0f0I]1d0h1AUL3g0H[... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000470300 | 000000470300.jpg | {
"data_source": "COCONut",
"file_name": "000000470300.jpg",
"height": 480,
"id": "000000470300",
"width": 640
} | {
"caption": "The image shows a yellow Suzuki motorcycle parked partially on a dark asphalt road and partially on a light-colored pavement. A person, visible from the waist down, stands nearby wearing jeans. In the background, a wall and some shaded objects can be seen inside a garage or similar covered area.\n",
"caption_ann": "The image shows a <3:yellow Suzuki motorcycle> parked partially on a <0:dark asphalt road> and partially on a <2:light-colored pavement>. A <4:person, visible from the waist down>, stands nearby wearing jeans. In the background, a <1:wall> and some shaded objects can be seen inside a garage or similar covered area.\n",
"id": 1246,
"image_id": "000000470300",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "yellow Suzuki motorcycle"
},
{
"mask_ids": [
0
],
"txt_desc": "dark asphalt road"
},
{
"mask_ids": [
2
],
"txt_desc": "light-colored pavement"
},
{
"mask_ids": [
4
],
"txt_desc": "person, visible from the waist down"
},
{
"mask_ids": [
1
],
"txt_desc": "wall"
}
],
"labels": [
"road",
"wall-other-merged",
"pavement-merged",
"motorcycle",
"person"
]
} | [
{
"area": 88930,
"bbox": [
0,
209,
640,
271
],
"category_id": 149,
"id": 14549,
"image_id": "000000470300",
"iscrowd": 0,
"segmentation": {
"counts": "g7Y7g701O00000000001O0000001O0000001O000000001O00000000001O0000001O00001O000000001O00000000001O00001O00... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000474206 | 000000474206.jpg | {
"data_source": "COCONut",
"file_name": "000000474206.jpg",
"height": 640,
"id": "000000474206",
"width": 456
} | {
"caption": "In a dramatic, low-angle action shot, a skateboarder in red t-shirt is captured in mid-air, having just launched off the top of a long concrete staircase. The skateboarder and their skateboard are silhouetted against a bright, overcast sky. The wide staircase is flanked by a sturdy metal railing. The scene is set in an urban environment, with a modern, curved brick building visible in the background and a paved area at the bottom of the stairs.",
"caption_ann": "In a dramatic, low-angle action shot, a <5:skateboarder in red t-shirt> is captured in mid-air, having just launched off the top of a long <2:concrete staircase>. The <5:skateboarder> and their <6:skateboard> are silhouetted against a <0:bright, overcast sky>. The wide staircase is flanked by a <4:sturdy metal railing>. The scene is set in an urban environment, with a <1:modern, curved brick building> visible in the background and a <3:paved area> at the bottom of the stairs.",
"id": 1247,
"image_id": "000000474206",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "skateboarder in red t-shirt"
},
{
"mask_ids": [
2
],
"txt_desc": "concrete staircase"
},
{
"mask_ids": [
5
],
"txt_desc": "skateboarder"
},
{
"mask_ids": [
6
],
"txt_desc": "skateboard"
},
{
"mask_ids": [
0
],
"txt_desc": "bright, overcast sky"
},
{
"mask_ids": [
4
],
"txt_desc": "sturdy metal railing"
},
{
"mask_ids": [
1
],
"txt_desc": "modern, curved brick building"
},
{
"mask_ids": [
3
],
"txt_desc": "paved area"
}
],
"labels": [
"sky-other-merged",
"building-other-merged",
"stairs",
"pavement-merged",
"fence-merged",
"person",
"skateboard"
]
} | [
{
"area": 90887,
"bbox": [
0,
0,
456,
240
],
"category_id": 187,
"id": 14554,
"image_id": "000000474206",
"iscrowd": 0,
"segmentation": {
"counts": "0V7j<00O100000000000000O1000000000000000000000000O1000000000000000000O10000000000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000474713 | 000000474713.jpg | {
"data_source": "COCONut",
"file_name": "000000474713.jpg",
"height": 375,
"id": "000000474713",
"width": 500
} | {
"caption": "A brown wooden bench with two connected seats is placed on a wooden deck with a visible grain. The bench sits in front of a wall with reddish-brown wood panelling and beneath a large window with two panes, each reflecting a serene landscape of mountains and a blue sky with white clouds.",
"caption_ann": "A <3:brown wooden bench with two connected seats> is placed on a <0:wooden deck with a visible grain>. The <3:bench> sits in front of a <1:wall with reddish-brown wood panelling> and beneath a <2:large window with two panes>, each reflecting a serene landscape of mountains and a blue sky with white clouds.",
"id": 1248,
"image_id": "000000474713",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "brown wooden bench with two connected seats"
},
{
"mask_ids": [
0
],
"txt_desc": "wooden deck with a visible grain"
},
{
"mask_ids": [
3
],
"txt_desc": "bench"
},
{
"mask_ids": [
1
],
"txt_desc": "wall with reddish-brown wood panelling"
},
{
"mask_ids": [
2
],
"txt_desc": "large window with two panes"
}
],
"labels": [
"floor-wood",
"wall-wood",
"window-other",
"bench"
]
} | [
{
"area": 18596,
"bbox": [
0,
259,
492,
116
],
"category_id": 118,
"id": 14561,
"image_id": "000000474713",
"iscrowd": 0,
"segmentation": {
"counts": "S8d3S8000001O0000001O00000000001O000000001O0000001O0000001O00001O00000000001O000000001O00001O00001O0000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000475043 | 000000475043.jpg | {
"data_source": "COCONut",
"file_name": "000000475043.jpg",
"height": 612,
"id": "000000475043",
"width": 612
} | {
"caption": "A red-haired woman in a black jacket and a floral scarf and a man in a plaid shirt are relaxing and working in a living room. The woman is on a dark brown couch and is taking a sip from a glass bottle. The man is sitting on a dark chair, working on his open silver macbook. The woman has a tablet on her lap, and another silver laptop is visible in the foreground on another person's lap, whose feet are in the foreground, with one foot covered in a striped sock. A small wooden side table between the woman and man holds a dark bowl with a spoon and a clear glass cup. A dark glass bottle is placed on the wood floor, which is partially covered with a patterned rug. A circular mirror is mounted on the white wall behind the man. Another laptop's screen is partially visible at the bottom section of the image.",
"caption_ann": "A <8:red-haired woman in a black jacket and a floral scarf> and a <9:man in a plaid shirt> are relaxing and working in a living room. The <8:woman> is on a <16:dark brown couch> and is taking a sip from a <18:glass bottle>. The <9:man> is sitting on a <11:dark chair>, working on his <10:open silver macbook>. The <8:woman> has a <15:tablet> on her lap, and another <6:silver laptop> is visible in the foreground on another <7:person's> lap, whose feet are in the foreground, with one foot covered in a striped sock. A <3:small wooden side table> between the <8:woman> and <9:man> holds a <13:dark bowl> with a <12:spoon> and a <14:clear glass cup>. A <17:dark glass bottle> is placed on the <0:wood floor>, which is partially covered with a <1:patterned rug>. A <4:circular mirror> is mounted on the <2:white wall> behind the <9:man>. Another <5:laptop's screen> is partially visible at the bottom section of the image.",
"id": 1249,
"image_id": "000000475043",
"label_matched": [
{
"mask_ids": [
8
],
"txt_desc": "red-haired woman in a black jacket and a floral scarf"
},
{
"mask_ids": [
9
],
"txt_desc": "man in a plaid shirt"
},
{
"mask_ids": [
8
],
"txt_desc": "woman"
},
{
"mask_ids": [
16
],
"txt_desc": "dark brown couch"
},
{
"mask_ids": [
18
],
"txt_desc": "glass bottle"
},
{
"mask_ids": [
9
],
"txt_desc": "man"
},
{
"mask_ids": [
11
],
"txt_desc": "dark chair"
},
{
"mask_ids": [
10
],
"txt_desc": "open silver macbook"
},
{
"mask_ids": [
8
],
"txt_desc": "woman"
},
{
"mask_ids": [
15
],
"txt_desc": "tablet"
},
{
"mask_ids": [
6
],
"txt_desc": "silver laptop"
},
{
"mask_ids": [
7
],
"txt_desc": "person's"
},
{
"mask_ids": [
3
],
"txt_desc": "small wooden side table"
},
{
"mask_ids": [
8
],
"txt_desc": "woman"
},
{
"mask_ids": [
9
],
"txt_desc": "man"
},
{
"mask_ids": [
13
],
"txt_desc": "dark bowl"
},
{
"mask_ids": [
12
],
"txt_desc": "spoon"
},
{
"mask_ids": [
14
],
"txt_desc": "clear glass cup"
},
{
"mask_ids": [
17
],
"txt_desc": "dark glass bottle"
},
{
"mask_ids": [
0
],
"txt_desc": "wood floor"
},
{
"mask_ids": [
1
],
"txt_desc": "patterned rug"
},
{
"mask_ids": [
4
],
"txt_desc": "circular mirror"
},
{
"mask_ids": [
2
],
"txt_desc": "white wall"
},
{
"mask_ids": [
9
],
"txt_desc": "man"
},
{
"mask_ids": [
5
],
"txt_desc": "laptop's screen"
}
],
"labels": [
"floor-wood",
"rug-merged",
"wall-other-merged",
"table-merged",
"mirror-stuff",
"laptop",
"laptop",
"person",
"person",
"person",
"laptop",
"chair",
"spoon",
"bowl",
"cup",
"cell phone",
"couch",
"bottle",
"bottle"
]
} | [
{
"area": 3421,
"bbox": [
235,
364,
77,
90
],
"category_id": 118,
"id": 14565,
"image_id": "000000475043",
"iscrowd": 0,
"segmentation": {
"counts": "ki\\47mb0LT]OOmb05O0O10000O10001N102N4L<l]OZORa0k0h^OXOVa0l0e^OWOYa0\\1O1O1N2N2O3L2N4L2N4L3M3M2N1OO1M3N2... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000479134 | 000000479134.jpg | {
"data_source": "COCONut",
"file_name": "000000479134.jpg",
"height": 333,
"id": "000000479134",
"width": 500
} | {
"caption": "The image shows a black dog with its head inside a toilet in a bathroom. A couple of stacked books are placed on top of the toilet tank. To the left, there's a sink above a wood-trimmed white cabinet. The floor is covered with light-colored tiles, and a beige rug is positioned in front of the toilet. In the background, a striped shower curtain hangs along the shower wall.",
"caption_ann": "The image shows a <5:black dog> with its head inside a <6:toilet> in a bathroom. A couple of <7,9:stacked books> are placed on top of the <6:toilet tank>. To the left, there's a <8:sink> above a <3:wood-trimmed white cabinet>. The <0:floor> is covered with light-colored tiles, and a <1:beige rug> is positioned in front of the toilet. In the background, a <4:striped shower curtain> hangs along the <2:shower wall>.",
"id": 1250,
"image_id": "000000479134",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "black dog"
},
{
"mask_ids": [
6
],
"txt_desc": "toilet"
},
{
"mask_ids": [
7,
9
],
"txt_desc": "stacked books"
},
{
"mask_ids": [
6
],
"txt_desc": "toilet tank"
},
{
"mask_ids": [
8
],
"txt_desc": "sink"
},
{
"mask_ids": [
3
],
"txt_desc": "wood-trimmed white cabinet"
},
{
"mask_ids": [
0
],
"txt_desc": "floor"
},
{
"mask_ids": [
1
],
"txt_desc": "beige rug"
},
{
"mask_ids": [
4
],
"txt_desc": "striped shower curtain"
},
{
"mask_ids": [
2
],
"txt_desc": "shower wall"
}
],
"labels": [
"floor-other-merged",
"rug-merged",
"wall-other-merged",
"cabinet-merged",
"curtain",
"dog",
"toilet",
"book",
"sink",
"book"
]
} | [
{
"area": 20324,
"bbox": [
56,
174,
444,
159
],
"category_id": 190,
"id": 14584,
"image_id": "000000479134",
"iscrowd": 0,
"segmentation": {
"counts": "Tab01[:1O1O1O100O1O1O1O1O1O1O1O1O1O100O1O1O1O100O1O1O1O1O1O1O1O100O1O1O1O1O1O100O1O1O1O1O1O100O1O1O1O1... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000479504 | 000000479504.jpg | {
"data_source": "COCONut",
"file_name": "000000479504.jpg",
"height": 640,
"id": "000000479504",
"width": 426
} | {
"caption": "A dark-coloured door with an ornate wrought-iron-style scrollwork panel and a silver letterbox is the focal point of the image. The door is set into a white wall with a blue corrugated metal wall above it. To the left of the door, there is a red and green wooden bench. A large, smooth rock sits on the gravel ground in front of the door, which is surrounded by a small patch of pavement. A round light fixture is mounted on the wall above the door.",
"caption_ann": "A <1:dark-coloured door with an ornate wrought-iron-style scrollwork panel and a silver letterbox> is the focal point of the image. The <1:door> is set into a <0:white wall with a blue corrugated metal wall> above it. To the left of the <1:door>, there is a <6:red and green wooden bench>. A <4:large, smooth rock> sits on the <2:gravel ground> in front of the <1:door>, which is surrounded by a small patch of <3:pavement>. A <5:round light fixture> is mounted on the <0:wall> above the <1:door>.",
"id": 1251,
"image_id": "000000479504",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "dark-coloured door with an ornate wrought-iron-style scrollwork panel and a silver letterbox"
},
{
"mask_ids": [
1
],
"txt_desc": "door"
},
{
"mask_ids": [
0
],
"txt_desc": "white wall with a blue corrugated metal wall"
},
{
"mask_ids": [
1
],
"txt_desc": "door"
},
{
"mask_ids": [
6
],
"txt_desc": "red and green wooden bench"
},
{
"mask_ids": [
4
],
"txt_desc": "large, smooth rock"
},
{
"mask_ids": [
2
],
"txt_desc": "gravel ground"
},
{
"mask_ids": [
1
],
"txt_desc": "door"
},
{
"mask_ids": [
3
],
"txt_desc": "pavement"
},
{
"mask_ids": [
5
],
"txt_desc": "round light fixture"
},
{
"mask_ids": [
0
],
"txt_desc": "wall"
},
{
"mask_ids": [
1
],
"txt_desc": "door"
}
],
"labels": [
"wall-other-merged",
"door-stuff",
"gravel",
"pavement-merged",
"rock-merged",
"light",
"bench"
]
} | [
{
"area": 162833,
"bbox": [
0,
0,
426,
562
],
"category_id": 199,
"id": 14594,
"image_id": "000000479504",
"iscrowd": 0,
"segmentation": {
"counts": "0`<o0fDK9X2Q;QNbDH<W2R;QNbDH=V2Q;RNbDH>U2P;SNbDH>U2P;SNbDH>U2P;SNbDH>U2P;SNbDH>U2P;SNbDH>U2P;SNbDH=V2Q;R... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000479608 | 000000479608.jpg | {
"data_source": "COCONut",
"file_name": "000000479608.jpg",
"height": 480,
"id": "000000479608",
"width": 640
} | {
"caption": "The image contains a large toasted dark bread sandwich filled with layers of corned beef and sauerkraut on a white plate with a pile of bright orange-coloured potato fries. Resting on the plate next to the sandwich and fries are a silver fork on the right, a silver knife on the left, and a silver spoon on top of the knife. The entire meal is set on a dark brown wooden table.",
"caption_ann": "The image contains a <2:large toasted dark bread sandwich filled with layers of corned beef and sauerkraut> on a <0:white plate> with a pile of <1:bright orange-coloured potato fries>. Resting on the plate next to the <2:sandwich> and <1:fries> are a <3:silver fork> on the right, a <5:silver knife> on the left, and a <4:silver spoon> on top of the <5:knife>. The entire meal is set on a <6:dark brown wooden table>.",
"id": 1252,
"image_id": "000000479608",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "large toasted dark bread sandwich filled with layers of corned beef and sauerkraut"
},
{
"mask_ids": [
0
],
"txt_desc": "white plate"
},
{
"mask_ids": [
1
],
"txt_desc": "bright orange-coloured potato fries"
},
{
"mask_ids": [
2
],
"txt_desc": "sandwich"
},
{
"mask_ids": [
1
],
"txt_desc": "fries"
},
{
"mask_ids": [
3
],
"txt_desc": "silver fork"
},
{
"mask_ids": [
5
],
"txt_desc": "silver knife"
},
{
"mask_ids": [
4
],
"txt_desc": "silver spoon"
},
{
"mask_ids": [
5
],
"txt_desc": "knife"
},
{
"mask_ids": [
6
],
"txt_desc": "dark brown wooden table"
}
],
"labels": [
"table-merged",
"food-other-merged",
"sandwich",
"fork",
"spoon",
"knife",
"dining table"
]
} | [
{
"area": 58697,
"bbox": [
0,
0,
640,
480
],
"category_id": 189,
"id": 14601,
"image_id": "000000479608",
"iscrowd": 0,
"segmentation": {
"counts": "o>1o>0000000000000000000000000000000000000000000000000bI`0\\M@_2V1PMjNl2^1PMbNl2f1PMZNk2o1QMQNk2V2RMjMi2]... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000482080 | 000000482080.jpg | {
"data_source": "COCONut",
"file_name": "000000482080.jpg",
"height": 480,
"id": "000000482080",
"width": 640
} | {
"caption": "The image displays a man in a brown suit and pink shirt with a colourful patterned tie, standing and speaking with his hands in his pockets. Behind the man is a red wall with a ceiling and a fluorescent light visible above. On the left side of the image, a large, rectangular TV screen is mounted on the wall, displaying an email inbox. On the right side of the image, a smaller monitor is placed, and a white keyboard can be seen next to the purple office chair.",
"caption_ann": "The image displays a <6:man in a brown suit and pink shirt> with a <7:colourful patterned tie>, standing and speaking with his hands in his pockets. Behind the <6:man> is a <0:red wall> with a <1:ceiling> and a <2:fluorescent light> visible above. On the left side of the image, a <3:large, rectangular TV screen> is mounted on the <0:wall>, displaying an email inbox. On the right side of the image, a smaller <4:monitor> is placed, and a <8:white keyboard> can be seen next to the <5:purple office chair>.",
"id": 1253,
"image_id": "000000482080",
"label_matched": [
{
"mask_ids": [
6
],
"txt_desc": "man in a brown suit and pink shirt"
},
{
"mask_ids": [
7
],
"txt_desc": "colourful patterned tie"
},
{
"mask_ids": [
6
],
"txt_desc": "man"
},
{
"mask_ids": [
0
],
"txt_desc": "red wall"
},
{
"mask_ids": [
1
],
"txt_desc": "ceiling"
},
{
"mask_ids": [
2
],
"txt_desc": "fluorescent light"
},
{
"mask_ids": [
3
],
"txt_desc": "large, rectangular TV screen"
},
{
"mask_ids": [
0
],
"txt_desc": "wall"
},
{
"mask_ids": [
4
],
"txt_desc": "monitor"
},
{
"mask_ids": [
8
],
"txt_desc": "white keyboard"
},
{
"mask_ids": [
5
],
"txt_desc": "purple office chair"
}
],
"labels": [
"wall-other-merged",
"ceiling-merged",
"light",
"tv",
"tv",
"chair",
"person",
"tie",
"keyboard"
]
} | [
{
"area": 139185,
"bbox": [
0,
0,
640,
480
],
"category_id": 199,
"id": 14608,
"image_id": "000000482080",
"iscrowd": 0,
"segmentation": {
"counts": "0a1[9T401O0000001O0000001O000000001O0000001O0000001O0000001O0000001O0000001O0000001O000000001O0000001N10... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000483049 | 000000483049.jpg | {
"data_source": "COCONut",
"file_name": "000000483049.jpg",
"height": 500,
"id": "000000483049",
"width": 371
} | {
"caption": "The image displays a young boy with blond hair and a red and grey puffy jacket over a black t-shirt and blue jeans standing on a skateboard that has a light-coloured deck and wheels. While skateboarding on a paved path, he is in motion, with his body bent forward. In the background, green grass and patches of white snow can be seen.",
"caption_ann": "The image displays a <3:young boy with blond hair and a red and grey puffy jacket over a black t-shirt and blue jeans> standing on a <4:skateboard> that has a light-coloured deck and wheels. While skateboarding on a <2:paved path>, he is in motion, with his body bent forward. In the background, <0:green grass> and patches of <1:white snow> can be seen.",
"id": 1254,
"image_id": "000000483049",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "young boy with blond hair and a red and grey puffy jacket over a black t-shirt and blue jeans"
},
{
"mask_ids": [
4
],
"txt_desc": "skateboard"
},
{
"mask_ids": [
2
],
"txt_desc": "paved path"
},
{
"mask_ids": [
0
],
"txt_desc": "green grass"
},
{
"mask_ids": [
1
],
"txt_desc": "white snow"
}
],
"labels": [
"grass-merged",
"snow",
"pavement-merged",
"person",
"skateboard"
]
} | [
{
"area": 82677,
"bbox": [
0,
0,
371,
500
],
"category_id": 193,
"id": 14617,
"image_id": "000000483049",
"iscrowd": 0,
"segmentation": {
"counts": "l5Y6X2nJd3R5\\LnJe3Q5[LoJe3Q5[LoJe3Q5[LoJe3Q5[LoJe3Q5[LoJe3Q5[LoJe3Q5\\LnJd3R5\\LnJd3R5\\LoJc3Q5\\LQKc3o4... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000483476 | 000000483476.jpg | {
"data_source": "COCONut",
"file_name": "000000483476.jpg",
"height": 640,
"id": "000000483476",
"width": 426
} | {
"caption": "The image displays a long-haired man with a beard and a woman with a white top and blue jeans who are shown with their mouths wide open, seemingly screaming or singing while holding a small white hair dryer that has a long coiled white power cord as if it were a microphone. They are sanding next to the light-coloured wall, while a white curtain and a tiled wall can be seen in the background. The floor is also visible at the bottom of the image.",
"caption_ann": "The image displays a <5:long-haired man with a beard> and a <4:woman with a white top and blue jeans> who are shown with their mouths wide open, seemingly screaming or singing while holding a <6:small white hair dryer that has a long coiled white power cord> as if it were a microphone. They are sanding next to the <1:light-coloured wall>, while a <3:white curtain> and a <2:tiled wall> can be seen in the background. The <0:floor> is also visible at the bottom of the image.",
"id": 1255,
"image_id": "000000483476",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "long-haired man with a beard"
},
{
"mask_ids": [
4
],
"txt_desc": "woman with a white top and blue jeans"
},
{
"mask_ids": [
6
],
"txt_desc": "small white hair dryer that has a long coiled white power cord"
},
{
"mask_ids": [
1
],
"txt_desc": "light-coloured wall"
},
{
"mask_ids": [
3
],
"txt_desc": "white curtain"
},
{
"mask_ids": [
2
],
"txt_desc": "tiled wall"
},
{
"mask_ids": [
0
],
"txt_desc": "floor"
}
],
"labels": [
"floor-other-merged",
"wall-other-merged",
"wall-tile",
"curtain",
"person",
"person",
"hair drier"
]
} | [
{
"area": 2069,
"bbox": [
194,
546,
40,
94
],
"category_id": 190,
"id": 14622,
"image_id": "000000483476",
"iscrowd": 0,
"segmentation": {
"counts": "iki31ic0:C;H6L4L4N2M3M3L4J6L4J6H8K5L4H8N2O1ONo^OWMQa0f2R_OZMn`0d2T_O\\Ml`0a2W_O_Mh`0_2[_OaMe`0]2]_OcMd`0... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000483496 | 000000483496.jpg | {
"data_source": "COCONut",
"file_name": "000000483496.jpg",
"height": 480,
"id": "000000483496",
"width": 640
} | {
"caption": "The image showcases a smiling woman wearing a black tank top, light-colored shorts, and a traditional conical hat sitting in the center of an uncovered wooden boat on a wide, muddy river. The lush green tree-lined bank is visible in the background, with the river curving away to the right. Another small boat is visible on the far right of the image under a bright blue sky with scattered white clouds.",
"caption_ann": "The image showcases a <3:smiling woman wearing a black tank top, light-colored shorts, and a traditional conical hat> sitting in the center of an <4:uncovered wooden boat> on a <2:wide, muddy river>. The <1:lush green tree-lined bank> is visible in the background, with the <2:river> curving away to the right. Another <5:small boat> is visible on the far right of the image under a <0:bright blue sky with scattered white clouds>.",
"id": 1256,
"image_id": "000000483496",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "smiling woman wearing a black tank top, light-colored shorts, and a traditional conical hat"
},
{
"mask_ids": [
4
],
"txt_desc": "uncovered wooden boat"
},
{
"mask_ids": [
2
],
"txt_desc": "wide, muddy river"
},
{
"mask_ids": [
1
],
"txt_desc": "lush green tree-lined bank"
},
{
"mask_ids": [
2
],
"txt_desc": "river"
},
{
"mask_ids": [
5
],
"txt_desc": "small boat"
},
{
"mask_ids": [
0
],
"txt_desc": "bright blue sky with scattered white clouds"
}
],
"labels": [
"sky-other-merged",
"tree-merged",
"river",
"person",
"boat",
"boat"
]
} | [
{
"area": 90184,
"bbox": [
0,
0,
640,
240
],
"category_id": 187,
"id": 14629,
"image_id": "000000483496",
"iscrowd": 0,
"segmentation": {
"counts": "0Z2f<0000000000000000000000000000001O2N3M2N2N002N001O000000000000001O001^DaMa:`2XEWMB`0S;Z2VEmMi:S2QETNn:... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000483871 | 000000483871.jpg | {
"data_source": "COCONut",
"file_name": "000000483871.jpg",
"height": 427,
"id": "000000483871",
"width": 640
} | {
"caption": "The image shows a brown, black, and white Beagle lying on a beige carpet. The dog is facing a mirror with a dark frame, where its reflection is clearly visible, creating the impression of two identical dogs staring at each other. ",
"caption_ann": "The image shows a <2:brown, black, and white Beagle> lying on a <0:beige carpet>. The <2:dog> is facing a <1:mirror with a dark frame>, where its <3:reflection> is clearly visible, creating the impression of two <2,3:identical dogs> staring at each other. ",
"id": 1257,
"image_id": "000000483871",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "brown, black, and white Beagle"
},
{
"mask_ids": [
0
],
"txt_desc": "beige carpet"
},
{
"mask_ids": [
2
],
"txt_desc": "dog"
},
{
"mask_ids": [
1
],
"txt_desc": "mirror with a dark frame"
},
{
"mask_ids": [
3
],
"txt_desc": "reflection"
},
{
"mask_ids": [
2,
3
],
"txt_desc": "identical dogs"
}
],
"labels": [
"rug-merged",
"mirror-stuff",
"dog",
"dog"
]
} | [
{
"area": 110213,
"bbox": [
278,
0,
362,
427
],
"category_id": 200,
"id": 14635,
"image_id": "000000483871",
"iscrowd": 0,
"segmentation": {
"counts": "eZd38k<8J6K5L4E;I7H8K5L4L4I7I7J6J6J6G9H8L4L4J6I7G9J6L4K5J6E;F:L4L4N2J6C=K5M3F:I7J6M3K5I7H8J6K5J6H8J6I7... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000484362 | 000000484362.jpg | {
"data_source": "COCONut",
"file_name": "000000484362.jpg",
"height": 640,
"id": "000000484362",
"width": 426
} | {
"caption": "The image shows a group of people playing a game on a dirt-covered field. A man in a light gray shirt in a throwing stance is holding a frisbee while a player in a dark brown shirt and olive-green cargo pants is defending closely in front of him. In the background, two players with a white tshirt and a younger boy dressed in grey are either watching or moving into position. The setting is outdoors, surrounded by tall green trees under a partly visible sky.",
"caption_ann": "The image shows a <3,4,6,7,8:group of people> playing a game on a <2:dirt-covered field>. A <6:man in a light gray shirt in a throwing stance> is holding a <5:frisbee> while a <3:player in a dark brown shirt and olive-green cargo pants> is defending closely in front of him. In the background, <4,7:two players with a white tshirt> and a <8:younger boy dressed in grey> are either watching or moving into position. The setting is outdoors, surrounded by <1:tall green trees> under a <0:partly visible sky>.",
"id": 1258,
"image_id": "000000484362",
"label_matched": [
{
"mask_ids": [
3,
4,
6,
7,
8
],
"txt_desc": "group of people"
},
{
"mask_ids": [
2
],
"txt_desc": "dirt-covered field"
},
{
"mask_ids": [
6
],
"txt_desc": "man in a light gray shirt in a throwing stance"
},
{
"mask_ids": [
5
],
"txt_desc": "frisbee"
},
{
"mask_ids": [
3
],
"txt_desc": "player in a dark brown shirt and olive-green cargo pants"
},
{
"mask_ids": [
4,
7
],
"txt_desc": "two players with a white tshirt"
},
{
"mask_ids": [
8
],
"txt_desc": "younger boy dressed in grey"
},
{
"mask_ids": [
1
],
"txt_desc": "tall green trees"
},
{
"mask_ids": [
0
],
"txt_desc": "partly visible sky"
}
],
"labels": [
"sky-other-merged",
"tree-merged",
"dirt-merged",
"person",
"person",
"frisbee",
"person",
"person",
"person"
]
} | [
{
"area": 1354,
"bbox": [
167,
0,
113,
41
],
"category_id": 187,
"id": 14639,
"image_id": "000000484362",
"iscrowd": 0,
"segmentation": {
"counts": "P\\X36jc02N1O0000001O1O1ON2001O2N1O1ODb\\O6^c0Hc\\O9]c0Gd\\O8bc00002K`T35\\kL3N003_\\OC[c0b0O0O110O0_OCX]... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000485561 | 000000485561.jpg | {
"data_source": "COCONut",
"file_name": "000000485561.jpg",
"height": 425,
"id": "000000485561",
"width": 640
} | {
"caption": "The image shows a man in a grey t-shirt and black shorts performing an airborne trick, suspended in the light blue sky while holding onto a kitesurfing bar with his body angled slightly sideways and a dark and yellow surfboard attached to his feet.",
"caption_ann": "The image shows a <1:man in a grey t-shirt and black shorts> performing an airborne trick, suspended in the <0:light blue sky> while holding onto a kitesurfing bar with his body angled slightly sideways and a <2:dark and yellow surfboard> attached to his feet.",
"id": 1259,
"image_id": "000000485561",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "man in a grey t-shirt and black shorts"
},
{
"mask_ids": [
0
],
"txt_desc": "light blue sky"
},
{
"mask_ids": [
2
],
"txt_desc": "dark and yellow surfboard"
}
],
"labels": [
"sky-other-merged",
"person",
"surfboard"
]
} | [
{
"area": 263038,
"bbox": [
0,
0,
640,
425
],
"category_id": 187,
"id": 14648,
"image_id": "000000485561",
"iscrowd": 0,
"segmentation": {
"counts": "0]Xh21jTXM2C5YCM\\<f000000O1O1O10000000000001O000000001O000000000O100000000O101O0O1N102M2NZOlC:R<`0N2N2N... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000485632 | 000000485632.jpg | {
"data_source": "COCONut",
"file_name": "000000485632.jpg",
"height": 375,
"id": "000000485632",
"width": 500
} | {
"caption": "The image displays the side of a suburban road, two white porcelain toilets placed on the edge of the concrete pavement. In the background, a patch of green grass with a tree can be seen. A small black car is visible in a driveway on the left, in front of the garage, and a maroon pickup truck is parked on the street on the right. The sky is a hazy blue with the sun casting a glare on the road and pavement.",
"caption_ann": "The image displays the side of a <1:suburban road>, <6,7:two white porcelain toilets> placed on the edge of the <5:concrete pavement>. In the background, a <4:patch of green grass> with a <3:tree> can be seen. A <8:small black car> is visible in a driveway on the left, in front of the <2:garage>, and a <9:maroon pickup truck> is parked on the street on the right. The <0:sky> is a hazy blue with the sun casting a glare on the <1:road> and <5:pavement>.",
"id": 1260,
"image_id": "000000485632",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "suburban road"
},
{
"mask_ids": [
6,
7
],
"txt_desc": "two white porcelain toilets"
},
{
"mask_ids": [
5
],
"txt_desc": "concrete pavement"
},
{
"mask_ids": [
4
],
"txt_desc": "patch of green grass"
},
{
"mask_ids": [
3
],
"txt_desc": "tree"
},
{
"mask_ids": [
8
],
"txt_desc": "small black car"
},
{
"mask_ids": [
2
],
"txt_desc": "garage"
},
{
"mask_ids": [
9
],
"txt_desc": "maroon pickup truck"
},
{
"mask_ids": [
0
],
"txt_desc": "sky"
},
{
"mask_ids": [
1
],
"txt_desc": "road"
},
{
"mask_ids": [
5
],
"txt_desc": "pavement"
}
],
"labels": [
"sky-other-merged",
"road",
"house",
"tree-merged",
"grass-merged",
"pavement-merged",
"toilet",
"toilet",
"car",
"truck"
]
} | [
{
"area": 5893,
"bbox": [
71,
0,
429,
99
],
"category_id": 187,
"id": 14651,
"image_id": "000000485632",
"iscrowd": 0,
"segmentation": {
"counts": "TPj01e;2O01Ng;3UDM`D3`;M`D6];KcD7Z;301O00000000O11O1O1OOjDCT;`00K5O1O10000O11O000000HdDO01_;0cDO^;0aD1bn^3... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000485789 | 000000485789.jpg | {
"data_source": "COCONut",
"file_name": "000000485789.jpg",
"height": 640,
"id": "000000485789",
"width": 480
} | {
"caption": "A young woman wearing a black tank top and a short skirt is standing in a living room, smiling at the camera, gesturing with her thumb up and index finger pointed out while brushing her teeth with a pink toothbrush. Behind her, a potted plant with large green leaves is placed between a brown armchair and an large, light brown leather couch on a brown rug. A dark brown football is placed on the brown armchair, next to a white wall. A part of the ceiling and a white window blind are partially visible in the top left corner.",
"caption_ann": "A <4:young woman wearing a black tank top and a short skirt> is standing in a living room, smiling at the camera, gesturing with her thumb up and index finger pointed out while brushing her teeth with a <5:pink toothbrush>. Behind her, a <9:potted plant with large green leaves> is placed between a <7:brown armchair> and an <8:large, light brown leather couch> on a <0:brown rug>. A <6:dark brown football> is placed on the <7:brown armchair>, next to a <1:white wall>. A part of the <2:ceiling> and a <3:white window blind> are partially visible in the top left corner.",
"id": 1261,
"image_id": "000000485789",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "young woman wearing a black tank top and a short skirt"
},
{
"mask_ids": [
5
],
"txt_desc": "pink toothbrush"
},
{
"mask_ids": [
9
],
"txt_desc": "potted plant with large green leaves"
},
{
"mask_ids": [
7
],
"txt_desc": "brown armchair"
},
{
"mask_ids": [
8
],
"txt_desc": "large, light brown leather couch"
},
{
"mask_ids": [
0
],
"txt_desc": "brown rug"
},
{
"mask_ids": [
6
],
"txt_desc": "dark brown football"
},
{
"mask_ids": [
7
],
"txt_desc": "brown armchair"
},
{
"mask_ids": [
1
],
"txt_desc": "white wall"
},
{
"mask_ids": [
2
],
"txt_desc": "ceiling"
},
{
"mask_ids": [
3
],
"txt_desc": "white window blind"
}
],
"labels": [
"rug-merged",
"wall-other-merged",
"ceiling-merged",
"window-blind",
"person",
"toothbrush",
"sports ball",
"chair",
"couch",
"potted plant"
]
} | [
{
"area": 5911,
"bbox": [
0,
557,
471,
83
],
"category_id": 200,
"id": 14661,
"image_id": "000000485789",
"iscrowd": 0,
"segmentation": {
"counts": "]a0c2]a001O1O1O001O1O1O1O1O1O1O1O2N1O1O1O1O1O001O1O1O001O1O001O1O1O00O10000000000O100O10000O1O1000000O100... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000486066 | 000000486066.jpg | {
"data_source": "COCONut",
"file_name": "000000486066.jpg",
"height": 480,
"id": "000000486066",
"width": 640
} | {
"caption": "A large collection of three backpacks and two trolley bags with a handbag are piled together on a light-coloured concrete ground, close to a light colored wall. In the background, a person's lower body are partially visible, as they stand on the top right corner of the image.",
"caption_ann": "A large collection of <3,4,5:three backpacks> and <6,7:two trolley bags> with a <8:handbag> are piled together on a <0:light-coloured concrete ground>, close to a <1:light colored wall>. In the background, a <2:person's lower body> are partially visible, as they stand on the top right corner of the image.",
"id": 1262,
"image_id": "000000486066",
"label_matched": [
{
"mask_ids": [
3,
4,
5
],
"txt_desc": "three backpacks"
},
{
"mask_ids": [
6,
7
],
"txt_desc": "two trolley bags"
},
{
"mask_ids": [
8
],
"txt_desc": "handbag"
},
{
"mask_ids": [
0
],
"txt_desc": "light-coloured concrete ground"
},
{
"mask_ids": [
1
],
"txt_desc": "light colored wall"
},
{
"mask_ids": [
2
],
"txt_desc": "person's lower body"
}
],
"labels": [
"floor-other-merged",
"wall-other-merged",
"person",
"backpack",
"backpack",
"backpack",
"suitcase",
"suitcase",
"handbag"
]
} | [
{
"area": 87354,
"bbox": [
0,
0,
640,
480
],
"category_id": 190,
"id": 14671,
"image_id": "000000486066",
"iscrowd": 0,
"segmentation": {
"counts": "W6i8W60001O0000001O00001O0000001O0000001O00001O001O00001O001O001O0bNiISJX6l5jIRJV6n5jIRJV6n5jIRJW6l5kISJU... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000486564 | 000000486564.jpg | {
"data_source": "COCONut",
"file_name": "000000486564.jpg",
"height": 360,
"id": "000000486564",
"width": 640
} | {
"caption": "A seagull is standing on top of a large, smooth, round rock in the foreground, with several other rocks visible in the sea around it. The sea is moving with small waves, and the wet sand in the foreground reflects the pale colours of the sky. In the background, a low sea mountain is visible on the horizon, with the sea stretching out to meet it.",
"caption_ann": "A <4:seagull> is standing on top of a <2:large, smooth, round rock> in the foreground, with several other <2:rocks> visible in the <1:sea> around it. The <1:sea> is moving with small waves, and the wet sand in the foreground reflects the pale colours of the <0:sky>. In the background, a <3:low sea mountain> is visible on the horizon, with the <1:sea> stretching out to meet it.",
"id": 1263,
"image_id": "000000486564",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "seagull"
},
{
"mask_ids": [
2
],
"txt_desc": "large, smooth, round rock"
},
{
"mask_ids": [
2
],
"txt_desc": "rocks"
},
{
"mask_ids": [
1
],
"txt_desc": "sea"
},
{
"mask_ids": [
1
],
"txt_desc": "sea"
},
{
"mask_ids": [
0
],
"txt_desc": "sky"
},
{
"mask_ids": [
3
],
"txt_desc": "low sea mountain"
},
{
"mask_ids": [
1
],
"txt_desc": "sea"
}
],
"labels": [
"sky-other-merged",
"sea",
"rock-merged",
"mountain-merged",
"bird"
]
} | [
{
"area": 51674,
"bbox": [
0,
0,
640,
97
],
"category_id": 187,
"id": 14680,
"image_id": "000000486564",
"iscrowd": 0,
"segmentation": {
"counts": "0a2g8000000000000000000000000000000O10000000000000000000000000000000000000000000000000000000000O1000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000486793 | 000000486793.jpg | {
"data_source": "COCONut",
"file_name": "000000486793.jpg",
"height": 481,
"id": "000000486793",
"width": 640
} | {
"caption": "A large, dark-brown cow walks along a dark, pebbly beach, followed closely by a brown calf nestled next to it and another light-brown calf. In the foreground, there is a patch of dry, brownish-yellow grass. In the background, the dark blue sea stretches out towards a small, rocky island in the distance under an overcast, greyish-blue sky.",
"caption_ann": "A <7:large, dark-brown cow> walks along a <1:dark, pebbly beach>, followed closely by a <6:brown calf nestled next to it> and another <5:light-brown calf>. In the foreground, there is a patch of <2:dry, brownish-yellow grass>. In the background, the <3:dark blue sea> stretches out towards a <4:small, rocky island in the distance> under an <0:overcast, greyish-blue sky>.",
"id": 1264,
"image_id": "000000486793",
"label_matched": [
{
"mask_ids": [
7
],
"txt_desc": "large, dark-brown cow"
},
{
"mask_ids": [
1
],
"txt_desc": "dark, pebbly beach"
},
{
"mask_ids": [
6
],
"txt_desc": "brown calf nestled next to it"
},
{
"mask_ids": [
5
],
"txt_desc": "light-brown calf"
},
{
"mask_ids": [
2
],
"txt_desc": "dry, brownish-yellow grass"
},
{
"mask_ids": [
3
],
"txt_desc": "dark blue sea"
},
{
"mask_ids": [
4
],
"txt_desc": "small, rocky island in the distance"
},
{
"mask_ids": [
0
],
"txt_desc": "overcast, greyish-blue sky"
}
],
"labels": [
"sky-other-merged",
"rock-merged",
"grass-merged",
"sea",
"mountain-merged",
"cow",
"cow",
"cow"
]
} | [
{
"area": 125537,
"bbox": [
0,
0,
640,
201
],
"category_id": 187,
"id": 14685,
"image_id": "000000486793",
"iscrowd": 0,
"segmentation": {
"counts": "0U6l800000000000000000000O100001O000000001O1O00000000001O0000001O00000000000000000000000000000000000000O... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000487502 | 000000487502.jpg | {
"data_source": "COCONut",
"file_name": "000000487502.jpg",
"height": 333,
"id": "000000487502",
"width": 500
} | {
"caption": "A person with white hair wearing a pink visor and a light-colored jacket stands on a gray gray road, looking at two men in sailors' uniforms. The man in a white sailor uniform walks alongside a man in a white sailor uniform who is on the phone, as they cross a black road. To the right, a man with short gray hair wearing a plaid shirt and a dark blue garment tied around his waist. In the background, a gray SUV is parked in front of a pavement near a building with a beige and blue painted wall, next to a white truck and the white partially visible truck. Above the two trucks, the leaves of a green tree are visible.",
"caption_ann": "A <4:person with white hair wearing a pink visor and a light-colored jacket> stands on a gray <0:gray road>, looking at <5,7:two men in sailors' uniforms>. The <5:man in a white sailor uniform> walks alongside a <7:man in a white sailor uniform> who is on the phone, as they cross a black <0:road>. To the right, a <6:man with short gray hair wearing a plaid shirt and a dark blue garment tied around his waist>. In the background, a <8:gray SUV> is parked in front of a <3:pavement> near a <1:building with a beige and blue painted wall>, next to a <9:white truck> and the <10:white partially visible truck>. Above the <9,10:two trucks>, <2:the leaves of a green tree> are visible.",
"id": 1265,
"image_id": "000000487502",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "person with white hair wearing a pink visor and a light-colored jacket"
},
{
"mask_ids": [
0
],
"txt_desc": "gray road"
},
{
"mask_ids": [
5,
7
],
"txt_desc": "two men in sailors' uniforms"
},
{
"mask_ids": [
5
],
"txt_desc": "man in a white sailor uniform"
},
{
"mask_ids": [
7
],
"txt_desc": "man in a white sailor uniform"
},
{
"mask_ids": [
0
],
"txt_desc": "road"
},
{
"mask_ids": [
6
],
"txt_desc": "man with short gray hair wearing a plaid shirt and a dark blue garment tied around his waist"
},
{
"mask_ids": [
8
],
"txt_desc": "gray SUV"
},
{
"mask_ids": [
3
],
"txt_desc": "pavement"
},
{
"mask_ids": [
1
],
"txt_desc": "building with a beige and blue painted wall"
},
{
"mask_ids": [
9
],
"txt_desc": "white truck"
},
{
"mask_ids": [
10
],
"txt_desc": "white partially visible truck"
},
{
"mask_ids": [
9,
10
],
"txt_desc": "two trucks"
},
{
"mask_ids": [
2
],
"txt_desc": "the leaves of a green tree"
}
],
"labels": [
"road",
"building-other-merged",
"tree-merged",
"pavement-merged",
"person",
"person",
"person",
"person",
"car",
"truck",
"truck"
]
} | [
{
"area": 11872,
"bbox": [
141,
209,
266,
124
],
"category_id": 149,
"id": 14693,
"image_id": "000000487502",
"iscrowd": 0,
"segmentation": {
"counts": "RS^19U:5K5K4K8H4L7I7I9G5K5K5K4L4L0O1O100O1H8J6M3O1O10000002N1O000000000000O1000000000000001O000000001... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000489799 | 000000489799.jpg | {
"data_source": "COCONut",
"file_name": "000000489799.jpg",
"height": 425,
"id": "000000489799",
"width": 640
} | {
"caption": "On a green grass soccer field, a goalkeeper in a yellow jersey and black shorts leaps to save a white soccer ball in mid-air, as a player in a white jersey and blue shorts runs nearby and another player in a black jersey with the number 33 looks on. To the left is a white soccer goal, and in the background, a low metal fence and a mowed strip of grass separate the field from a large house with a dark roof and dark green foliage of several trees under an overcast sky.",
"caption_ann": "On a <3:green grass soccer field>, a <9:goalkeeper in a yellow jersey and black shorts> leaps to save a <7:white soccer ball in mid-air>, as a <10:player in a white jersey and blue shorts> runs nearby and another <8:player in a black jersey with the number 33> looks on. To the left is a <5:white soccer goal>, and in the background, a <6:low metal fence> and a <4:mowed strip of grass> separate the field from a <1:large house with a dark roof> and <2:dark green foliage of several trees> under an <0:overcast sky>.",
"id": 1266,
"image_id": "000000489799",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "green grass soccer field"
},
{
"mask_ids": [
9
],
"txt_desc": "goalkeeper in a yellow jersey and black shorts"
},
{
"mask_ids": [
7
],
"txt_desc": "white soccer ball in mid-air"
},
{
"mask_ids": [
10
],
"txt_desc": "player in a white jersey and blue shorts"
},
{
"mask_ids": [
8
],
"txt_desc": "player in a black jersey with the number 33"
},
{
"mask_ids": [
5
],
"txt_desc": "white soccer goal"
},
{
"mask_ids": [
6
],
"txt_desc": "low metal fence"
},
{
"mask_ids": [
4
],
"txt_desc": "mowed strip of grass"
},
{
"mask_ids": [
1
],
"txt_desc": "large house with a dark roof"
},
{
"mask_ids": [
2
],
"txt_desc": "dark green foliage of several trees"
},
{
"mask_ids": [
0
],
"txt_desc": "overcast sky"
}
],
"labels": [
"sky-other-merged",
"house",
"tree-merged",
"grass-merged",
"playingfield",
"net",
"fence-merged",
"sports ball",
"person",
"person",
"person"
]
} | [
{
"area": 22712,
"bbox": [
159,
0,
481,
217
],
"category_id": 187,
"id": 14704,
"image_id": "000000489799",
"iscrowd": 0,
"segmentation": {
"counts": "RVR2130Q=0:0K0O11O00O0g_20cYM12ON21NO30M04OL16MJ36MJ37LI47LI47LI48cIHR60;8KH58KH58KH57LI41_I330Q6Lb03[I... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000494690 | 000000494690.jpg | {
"data_source": "COCONut",
"file_name": "000000494690.jpg",
"height": 480,
"id": "000000494690",
"width": 640
} | {
"caption": "A large black pickup truck is parked on the dirt and pine needle-covered ground in a sunlit forest with tall trees. In the bed of the truck sits a blue four-wheeled ATV, and the truck is towing a dual-axle flatbed trailer that holds a yellow and black utility vehicle with a roof in front of a red all-terrain vehicle.",
"caption_ann": "A <3:large black pickup truck> is parked on the <1:dirt and pine needle-covered ground> in a <0:sunlit forest with tall trees>. In the bed of the truck sits a <2:blue four-wheeled ATV>, and the truck is towing a <6:dual-axle flatbed trailer> that holds a <4:yellow and black utility vehicle with a roof> in front of a <5:red all-terrain vehicle>.",
"id": 1267,
"image_id": "000000494690",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "large black pickup truck"
},
{
"mask_ids": [
1
],
"txt_desc": "dirt and pine needle-covered ground"
},
{
"mask_ids": [
0
],
"txt_desc": "sunlit forest with tall trees"
},
{
"mask_ids": [
2
],
"txt_desc": "blue four-wheeled ATV"
},
{
"mask_ids": [
6
],
"txt_desc": "dual-axle flatbed trailer"
},
{
"mask_ids": [
4
],
"txt_desc": "yellow and black utility vehicle with a roof"
},
{
"mask_ids": [
5
],
"txt_desc": "red all-terrain vehicle"
}
],
"labels": [
"tree-merged",
"dirt-merged",
"motorcycle",
"truck",
"motorcycle",
"motorcycle",
"truck"
]
} | [
{
"area": 147472,
"bbox": [
0,
0,
640,
306
],
"category_id": 184,
"id": 14715,
"image_id": "000000494690",
"iscrowd": 0,
"segmentation": {
"counts": "0`9`5000000000000000000001O1O00O100O10000nNcJ`H^5]7fJYHc5c7bJTHf5i7]JVHd5g7`JXH`5f7bJZH^5f7bJXH`5h7`JWHa... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000496239 | 000000496239.jpg | {
"data_source": "COCONut",
"file_name": "000000496239.jpg",
"height": 428,
"id": "000000496239",
"width": 640
} | {
"caption": "A young girl with light brown hair wearing a pink floral dress stands holding a white rectangular remote in front of a dark brown fabric couch. In the background, a large window is covered by a white horizontal window blind, and a dark red curtain is visible on the right.",
"caption_ann": "A <3:young girl with light brown hair wearing a pink floral dress> stands holding a <4:white rectangular remote> in front of a <2:dark brown fabric couch>. In the background, a large window is covered by a <0:white horizontal window blind>, and a <1:dark red curtain> is visible on the right.",
"id": 1268,
"image_id": "000000496239",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "young girl with light brown hair wearing a pink floral dress"
},
{
"mask_ids": [
4
],
"txt_desc": "white rectangular remote"
},
{
"mask_ids": [
2
],
"txt_desc": "dark brown fabric couch"
},
{
"mask_ids": [
0
],
"txt_desc": "white horizontal window blind"
},
{
"mask_ids": [
1
],
"txt_desc": "dark red curtain"
}
],
"labels": [
"window-blind",
"curtain",
"couch",
"person",
"remote"
]
} | [
{
"area": 107345,
"bbox": [
0,
0,
555,
260
],
"category_id": 180,
"id": 14722,
"image_id": "000000496239",
"iscrowd": 0,
"segmentation": {
"counts": "0T8X50000000000000000000000000000000000000000000000O10000000000O10000O1000000O10000O1O1O100O10000O10000O... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000496373 | 000000496373.jpg | {
"data_source": "COCONut",
"file_name": "000000496373.jpg",
"height": 426,
"id": "000000496373",
"width": 640
} | {
"caption": "A person wearing a yellow and white plaid shirt holds a large, multi-colored umbrella while looking out at a large cruise ship on the calm, grey sea under a foggy grey sky.",
"caption_ann": "A <3:person wearing a yellow and white plaid shirt> holds a <2:large, multi-colored umbrella> while looking out at a <4:large cruise ship> on the <1:calm, grey sea> under a <0:foggy grey sky>.",
"id": 1269,
"image_id": "000000496373",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "person wearing a yellow and white plaid shirt"
},
{
"mask_ids": [
2
],
"txt_desc": "large, multi-colored umbrella"
},
{
"mask_ids": [
4
],
"txt_desc": "large cruise ship"
},
{
"mask_ids": [
1
],
"txt_desc": "calm, grey sea"
},
{
"mask_ids": [
0
],
"txt_desc": "foggy grey sky"
}
],
"labels": [
"sky-other-merged",
"sea",
"umbrella",
"person",
"boat"
]
} | [
{
"area": 113505,
"bbox": [
0,
0,
640,
300
],
"category_id": 187,
"id": 14727,
"image_id": "000000496373",
"iscrowd": 0,
"segmentation": {
"counts": "0l8^40000001O00O100000000O1000000000000000000000000000000O10000000000O1000000000000001O00000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000498443 | 000000498443.jpg | {
"data_source": "COCONut",
"file_name": "000000498443.jpg",
"height": 480,
"id": "000000498443",
"width": 640
} | {
"caption": "A white dalmatian with black spots and a white dog with brown patches walk through the shallow sea under a partly cloudy sky.",
"caption_ann": "A <3:white dalmatian with black spots> and a <2:white dog with brown patches> walk through the shallow <1:sea> under a <0:partly cloudy sky>.",
"id": 1270,
"image_id": "000000498443",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "white dalmatian with black spots"
},
{
"mask_ids": [
2
],
"txt_desc": "white dog with brown patches"
},
{
"mask_ids": [
1
],
"txt_desc": "sea"
},
{
"mask_ids": [
0
],
"txt_desc": "partly cloudy sky"
}
],
"labels": [
"sky-other-merged",
"sea",
"dog",
"dog"
]
} | [
{
"area": 75963,
"bbox": [
0,
0,
640,
124
],
"category_id": 187,
"id": 14732,
"image_id": "000000498443",
"iscrowd": 0,
"segmentation": {
"counts": "0d3\\;000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000498555 | 000000498555.jpg | {
"data_source": "COCONut",
"file_name": "000000498555.jpg",
"height": 640,
"id": "000000498555",
"width": 427
} | {
"caption": "On a white windowsill in front of a window frame and a white curtain, a white vase decorated with a painted blue and yellow bird sits next to a blurry pink and white floral vase.",
"caption_ann": "On a <0:white windowsill> in front of a <1:window frame> and a <2:white curtain>, a <3:white vase> decorated with a painted <5:blue and yellow bird> sits next to a <4:blurry pink and white floral vase>.",
"id": 1271,
"image_id": "000000498555",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "white windowsill"
},
{
"mask_ids": [
1
],
"txt_desc": "window frame"
},
{
"mask_ids": [
2
],
"txt_desc": "white curtain"
},
{
"mask_ids": [
3
],
"txt_desc": "white vase"
},
{
"mask_ids": [
5
],
"txt_desc": "blue and yellow bird"
},
{
"mask_ids": [
4
],
"txt_desc": "blurry pink and white floral vase"
}
],
"labels": [
"table-merged",
"window-other",
"curtain",
"vase",
"vase",
"bird"
]
} | [
{
"area": 97519,
"bbox": [
0,
260,
427,
380
],
"category_id": 189,
"id": 14736,
"image_id": "000000498555",
"iscrowd": 0,
"segmentation": {
"counts": "`=`6`=0O100O100O100O100O100O100O100O100O100O100O100O1O100O100O1O100O100O100O100O100O100O100O100O100O1O1... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000498769 | 000000498769.jpg | {
"data_source": "COCONut",
"file_name": "000000498769.jpg",
"height": 640,
"id": "000000498769",
"width": 427
} | {
"caption": "A bighorn sheep stands on rocky, yellowish terrain overlooking a vast, tree-covered mountainside and a deep blue lake, with a large gray rock sitting on the edge of the cliff.",
"caption_ann": "A <4:bighorn sheep> stands on <1:rocky, yellowish terrain> overlooking a <0:vast, tree-covered mountainside> and a <2:deep blue lake>, with a <3:large gray rock> sitting on the edge of the cliff.",
"id": 1272,
"image_id": "000000498769",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "bighorn sheep"
},
{
"mask_ids": [
1
],
"txt_desc": "rocky, yellowish terrain"
},
{
"mask_ids": [
0
],
"txt_desc": "vast, tree-covered mountainside"
},
{
"mask_ids": [
2
],
"txt_desc": "deep blue lake"
},
{
"mask_ids": [
3
],
"txt_desc": "large gray rock"
}
],
"labels": [
"mountain-merged",
"gravel",
"river",
"rock-merged",
"sheep"
]
} | [
{
"area": 204080,
"bbox": [
0,
0,
427,
564
],
"category_id": 192,
"id": 14742,
"image_id": "000000498769",
"iscrowd": 0,
"segmentation": {
"counts": "0e?[400000000000000000000000000001O000000000000000000000000001O000000001O00000000001O00000000O1000000O10... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000499621 | 000000499621.jpg | {
"data_source": "COCONut",
"file_name": "000000499621.jpg",
"height": 418,
"id": "000000499621",
"width": 640
} | {
"caption": "A large red beach umbrella stands in the sand between two empty lounge chairs, with the calm blue sea and a cloudy sky in the background.",
"caption_ann": "A <3:large red beach umbrella> stands in the <1:sand> between <4,5:two empty lounge chairs>, with the <2:calm blue sea> and a <0:cloudy sky> in the background.",
"id": 1273,
"image_id": "000000499621",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "large red beach umbrella"
},
{
"mask_ids": [
1
],
"txt_desc": "sand"
},
{
"mask_ids": [
4,
5
],
"txt_desc": "two empty lounge chairs"
},
{
"mask_ids": [
2
],
"txt_desc": "calm blue sea"
},
{
"mask_ids": [
0
],
"txt_desc": "cloudy sky"
}
],
"labels": [
"sky-other-merged",
"sand",
"sea",
"umbrella",
"chair",
"chair"
]
} | [
{
"area": 120843,
"bbox": [
0,
0,
640,
204
],
"category_id": 187,
"id": 14747,
"image_id": "000000499621",
"iscrowd": 0,
"segmentation": {
"counts": "0\\6f6O100001O0000O100000000000000001OO10000001O0000O11O000000000000000000O11OO11O0000O11O00000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000501085 | 000000501085.jpg | {
"data_source": "COCONut",
"file_name": "000000501085.jpg",
"height": 640,
"id": "000000501085",
"width": 480
} | {
"caption": "A tall, ornate stone clock tower with a round clock face stands on a corner next to a road and pavement, with a tall streetlight to its left and a traffic light and some green trees to its right, all under a cloudy sky.",
"caption_ann": "A <2:tall, ornate stone clock tower> with a <6:round clock face> stands on a corner next to a <1:road> and <4:pavement>, with a <5:tall streetlight> to its left and a <7:traffic light> and some <3:green trees> to its right, all under a <0:cloudy sky>.",
"id": 1274,
"image_id": "000000501085",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "tall, ornate stone clock tower"
},
{
"mask_ids": [
6
],
"txt_desc": "round clock face"
},
{
"mask_ids": [
1
],
"txt_desc": "road"
},
{
"mask_ids": [
4
],
"txt_desc": "pavement"
},
{
"mask_ids": [
5
],
"txt_desc": "tall streetlight"
},
{
"mask_ids": [
7
],
"txt_desc": "traffic light"
},
{
"mask_ids": [
3
],
"txt_desc": "green trees"
},
{
"mask_ids": [
0
],
"txt_desc": "cloudy sky"
}
],
"labels": [
"sky-other-merged",
"road",
"building-other-merged",
"tree-merged",
"pavement-merged",
"light",
"clock",
"traffic light"
]
} | [
{
"area": 166680,
"bbox": [
0,
0,
480,
558
],
"category_id": 187,
"id": 14753,
"image_id": "000000501085",
"iscrowd": 0,
"segmentation": {
"counts": "0k>U51O1O003M2N1O3M00N2O1O100N2O11OO1000000003QKWAa4T?01O001O001O001O001O001O001O0000K5MeKg@U4c?O00001O0... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000501176 | 000000501176.jpg | {
"data_source": "COCONut",
"file_name": "000000501176.jpg",
"height": 333,
"id": "000000501176",
"width": 500
} | {
"caption": "On a paved road next to a concrete boundary wall with a grassy plant on its edge, a shirtless man rides a small blue bicycle as a shirtless young boy runs beside him. A person with a backpack flies a blue and white kite, and another person attends to three black backpacks near the concrete boundary wall. In the background, blue sea and distant green mountains are visible under a hazy sky.",
"caption_ann": "On a <1:paved road> next to a <2:concrete boundary wall> with a <5:grassy plant> on its edge, a <15:shirtless man> rides a <7:small blue bicycle> as a <6:shirtless young boy> runs beside him. A <14:person> with a <9:backpack> flies a <8:blue and white kite>, and another <10:person> attends to <11,12,13:three black backpacks> near the <2:concrete boundary wall>. In the background, <4:blue sea> and <3:distant green mountains> are visible under a <0:hazy sky>.",
"id": 1275,
"image_id": "000000501176",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "paved road"
},
{
"mask_ids": [
2
],
"txt_desc": "concrete boundary wall"
},
{
"mask_ids": [
5
],
"txt_desc": "grassy plant"
},
{
"mask_ids": [
15
],
"txt_desc": "shirtless man"
},
{
"mask_ids": [
7
],
"txt_desc": "small blue bicycle"
},
{
"mask_ids": [
6
],
"txt_desc": "shirtless young boy"
},
{
"mask_ids": [
14
],
"txt_desc": "person"
},
{
"mask_ids": [
9
],
"txt_desc": "backpack"
},
{
"mask_ids": [
8
],
"txt_desc": "blue and white kite"
},
{
"mask_ids": [
10
],
"txt_desc": "person"
},
{
"mask_ids": [
11,
12,
13
],
"txt_desc": "three black backpacks"
},
{
"mask_ids": [
2
],
"txt_desc": "concrete boundary wall"
},
{
"mask_ids": [
4
],
"txt_desc": "blue sea"
},
{
"mask_ids": [
3
],
"txt_desc": "distant green mountains"
},
{
"mask_ids": [
0
],
"txt_desc": "hazy sky"
}
],
"labels": [
"sky-other-merged",
"road",
"wall-other-merged",
"mountain-merged",
"sea",
"tree-merged",
"person",
"bicycle",
"kite",
"backpack",
"person",
"backpack",
"backpack",
"backpack",
"person",
"person"
]
} | [
{
"area": 10191,
"bbox": [
0,
0,
500,
32
],
"category_id": 187,
"id": 14761,
"image_id": "000000501176",
"iscrowd": 0,
"segmentation": {
"counts": "0f0g900O1O10000O100O100O100O100O100O10000000000000000001O001O001O001O1O1O001O1O00001O00001O00001O00001O001... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000501652 | 000000501652.jpg | {
"data_source": "COCONut",
"file_name": "000000501652.jpg",
"height": 480,
"id": "000000501652",
"width": 640
} | {
"caption": "Inside a car, a young child wearing blue fleece clothes sits in a car seat, holding a pink toy cell phone to their ear while looking at a blue-colored toy, with a pink handbag resting beside them.",
"caption_ann": "Inside a <0:car>, a <1:young child wearing blue fleece clothes> sits in a car seat, holding a <2:pink toy cell phone> to their ear while looking at a blue-colored toy, with a <3:pink handbag> resting beside them.",
"id": 1276,
"image_id": "000000501652",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "car"
},
{
"mask_ids": [
1
],
"txt_desc": "young child wearing blue fleece clothes"
},
{
"mask_ids": [
2
],
"txt_desc": "pink toy cell phone"
},
{
"mask_ids": [
3
],
"txt_desc": "pink handbag"
}
],
"labels": [
"car",
"person",
"cell phone",
"handbag"
]
} | [
{
"area": 235741,
"bbox": [
0,
0,
639,
479
],
"category_id": 3,
"id": 14777,
"image_id": "000000501652",
"iscrowd": 0,
"segmentation": {
"counts": "0o>1000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000502971 | 000000502971.jpg | {
"data_source": "COCONut",
"file_name": "000000502971.jpg",
"height": 427,
"id": "000000502971",
"width": 640
} | {
"caption": "Against a clear blue sky, a person in green and black hangs off the side of their motorcycle, a person in a yellow and red uniform sits upright on their motorcycle, and a person in red is doing a trick on their motorcycle, all flying above the roof of a building, and a small, partially visible flag appears on the right.",
"caption_ann": "Against a <0:clear blue sky>, a <7:person in green and black> hangs off the side of their <8:motorcycle>, a <5:person in a yellow and red uniform> sits upright on their <4:motorcycle>, and a <6:person in red> is doing a trick on their <3:motorcycle>, all flying above the <1:roof of a building>, and a <2:small, partially visible flag> appears on the right.",
"id": 1277,
"image_id": "000000502971",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "clear blue sky"
},
{
"mask_ids": [
7
],
"txt_desc": "person in green and black"
},
{
"mask_ids": [
8
],
"txt_desc": "motorcycle"
},
{
"mask_ids": [
5
],
"txt_desc": "person in a yellow and red uniform"
},
{
"mask_ids": [
4
],
"txt_desc": "motorcycle"
},
{
"mask_ids": [
6
],
"txt_desc": "person in red"
},
{
"mask_ids": [
3
],
"txt_desc": "motorcycle"
},
{
"mask_ids": [
1
],
"txt_desc": "roof of a building"
},
{
"mask_ids": [
2
],
"txt_desc": "small, partially visible flag"
}
],
"labels": [
"sky-other-merged",
"building-other-merged",
"banner",
"motorcycle",
"motorcycle",
"person",
"person",
"person",
"motorcycle"
]
} | [
{
"area": 239074,
"bbox": [
0,
0,
640,
427
],
"category_id": 187,
"id": 14781,
"image_id": "000000502971",
"iscrowd": 0,
"segmentation": {
"counts": "0_72h53O001O000UCL_<5_CO^<1bC1]<NbC4]<MbC4]<LdC4[<LeC4[<LeC5Z<LfC2\\<MdC3\\<NdC2[<NcC5\\<>O1O001O1O01O01... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000509822 | 000000509822.jpg | {
"data_source": "COCONut",
"file_name": "000000509822.jpg",
"height": 425,
"id": "000000509822",
"width": 640
} | {
"caption": "The image shows a black bicycle with thin tires suspended against a large, textured green door with square panels and metal studs. Below the bicycle, a concrete pavement runs along the base of the door.\n",
"caption_ann": "The image shows a <2:black bicycle with thin tires> suspended against a <0:large, textured green door with square panels and metal studs>. Below the <2:bicycle>, a <1:concrete pavement> runs along the base of the <0:door>.\n",
"id": 1278,
"image_id": "000000509822",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "black bicycle with thin tires"
},
{
"mask_ids": [
0
],
"txt_desc": "large, textured green door with square panels and metal studs"
},
{
"mask_ids": [
2
],
"txt_desc": "bicycle"
},
{
"mask_ids": [
1
],
"txt_desc": "concrete pavement"
},
{
"mask_ids": [
0
],
"txt_desc": "door"
}
],
"labels": [
"door-stuff",
"pavement-merged",
"bicycle"
]
} | [
{
"area": 176969,
"bbox": [
0,
0,
640,
389
],
"category_id": 112,
"id": 14790,
"image_id": "000000509822",
"iscrowd": 0,
"segmentation": {
"counts": "0i;`100000000000000000000000000000000000000000000000000000000000000000000000000001O000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000512918 | 000000512918.jpg | {
"data_source": "COCONut",
"file_name": "000000512918.jpg",
"height": 480,
"id": "000000512918",
"width": 640
} | {
"caption": "A person's hand pets a small black pony with a long mane that is standing in a grassy field behind a low stone wall.",
"caption_ann": "A <3:person's hand> pets a <2:small black pony with a long mane> that is standing in a <0:grassy field> behind a <1:low stone wall>.",
"id": 1279,
"image_id": "000000512918",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "person's hand"
},
{
"mask_ids": [
2
],
"txt_desc": "small black pony with a long mane"
},
{
"mask_ids": [
0
],
"txt_desc": "grassy field"
},
{
"mask_ids": [
1
],
"txt_desc": "low stone wall"
}
],
"labels": [
"grass-merged",
"rock-merged",
"horse",
"person"
]
} | [
{
"area": 118337,
"bbox": [
0,
0,
640,
480
],
"category_id": 193,
"id": 14793,
"image_id": "000000512918",
"iscrowd": 0,
"segmentation": {
"counts": "0Vj>_1oUAkNg<P2C?Dc0nChL\\;c3I>A;F8fE_Kh9U5G<C5L4L4K4J7^OdIWHc6e7?L4L4L4M2M3N1O1O2N1O2N100O2O0O2O0O101N1... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000513359 | 000000513359.jpg | {
"data_source": "COCONut",
"file_name": "000000513359.jpg",
"height": 332,
"id": "000000513359",
"width": 500
} | {
"caption": "On a snow-covered slope under a clear blue sky, a person wearing a yellow jacket, brown pants, snow goggles and black helmet rides a black snowboard, while a person sits in the snow behind with a red snowboard near a white flag. A person in a striped jacket and a person in a blue jacket with a black backpack on a snowboard stand in the background.",
"caption_ann": "On a <1:snow-covered slope> under a <0:clear blue sky>, a <7:person wearing a yellow jacket, brown pants, snow goggles and black helmet> rides a <6:black snowboard>, while a <5:person> sits in the snow behind with a <4:red snowboard> near a <2:white flag>. A <3:person in a striped jacket> and a <9:person in a blue jacket> with a <10:black backpack> on a <8:snowboard> stand in the background.",
"id": 1280,
"image_id": "000000513359",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "snow-covered slope"
},
{
"mask_ids": [
0
],
"txt_desc": "clear blue sky"
},
{
"mask_ids": [
7
],
"txt_desc": "person wearing a yellow jacket, brown pants, snow goggles and black helmet"
},
{
"mask_ids": [
6
],
"txt_desc": "black snowboard"
},
{
"mask_ids": [
5
],
"txt_desc": "person"
},
{
"mask_ids": [
4
],
"txt_desc": "red snowboard"
},
{
"mask_ids": [
2
],
"txt_desc": "white flag"
},
{
"mask_ids": [
3
],
"txt_desc": "person in a striped jacket"
},
{
"mask_ids": [
9
],
"txt_desc": "person in a blue jacket"
},
{
"mask_ids": [
10
],
"txt_desc": "black backpack"
},
{
"mask_ids": [
8
],
"txt_desc": "snowboard"
}
],
"labels": [
"sky-other-merged",
"snow",
"banner",
"person",
"snowboard",
"person",
"snowboard",
"person",
"skis",
"person",
"backpack"
]
} | [
{
"area": 61301,
"bbox": [
0,
0,
500,
184
],
"category_id": 187,
"id": 14797,
"image_id": "000000513359",
"iscrowd": 0,
"segmentation": {
"counts": "0h5d400O1000000O1]N]K]Mc4d2]KZMd4g2]KWMc4j2]KUMc4l2^KQMc4P3]KoLc4R3_KdLJXOh4V4^KfLb4[3^KcLc4^3]KaLc4`3]K^... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000513918 | 000000513918.jpg | {
"data_source": "COCONut",
"file_name": "000000513918.jpg",
"height": 375,
"id": "000000513918",
"width": 500
} | {
"caption": "A white horse stands peacefully in a sunlit field, grazing on the green short grass. The scene is framed by a forest, with trees lining the background.",
"caption_ann": "A <2:white horse> stands peacefully in a <1:sunlit field>, grazing on the <1:green short grass>. The scene is framed by a <0:forest, with trees> lining the background.",
"id": 1281,
"image_id": "000000513918",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "white horse"
},
{
"mask_ids": [
1
],
"txt_desc": "sunlit field"
},
{
"mask_ids": [
1
],
"txt_desc": "green short grass"
},
{
"mask_ids": [
0
],
"txt_desc": "forest, with trees"
}
],
"labels": [
"tree-merged",
"grass-merged",
"horse"
]
} | [
{
"area": 89070,
"bbox": [
0,
0,
500,
198
],
"category_id": 184,
"id": 14808,
"image_id": "000000513918",
"iscrowd": 0,
"segmentation": {
"counts": "0U6b51O00000000N2L_JQJb5P6200O12N0000001O1O00000000O10000000000F^J_Jb5`5_J`Ja5`5_JYJN1g5c5:00001O1O2N2N00... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000521203 | 000000521203.jpg | {
"data_source": "COCONut",
"file_name": "000000521203.jpg",
"height": 640,
"id": "000000521203",
"width": 426
} | {
"caption": "Three people are aboard a boat sailing on the sea under a clear sky. One of the boat’s sails is opened outward, marked with the number \\\"CAN 11165\\\" and an image of a horse.",
"caption_ann": "Three <2,3:people> are aboard a <4:boat> sailing on the <1:sea> under a <0:clear sky>. One of the <4:boat>’s sails is opened outward, marked with the number \\\"CAN 11165\\\" and an image of a horse.",
"id": 1282,
"image_id": "000000521203",
"label_matched": [
{
"mask_ids": [
2,
3
],
"txt_desc": "people"
},
{
"mask_ids": [
4
],
"txt_desc": "boat"
},
{
"mask_ids": [
1
],
"txt_desc": "sea"
},
{
"mask_ids": [
0
],
"txt_desc": "clear sky"
},
{
"mask_ids": [
4
],
"txt_desc": "boat"
}
],
"labels": [
"sky-other-merged",
"sea",
"person",
"person",
"boat"
]
} | [
{
"area": 112509,
"bbox": [
0,
0,
426,
393
],
"category_id": 187,
"id": 14811,
"image_id": "000000521203",
"iscrowd": 0,
"segmentation": {
"counts": "0l;T81O0000000000O10000001O000000000000000000000000000000001O0000000000000000001OO1001O00001O00O10000O10... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000524027 | 000000524027.jpg | {
"data_source": "COCONut",
"file_name": "000000524027.jpg",
"height": 427,
"id": "000000524027",
"width": 640
} | {
"caption": "The image shows a man wearing a loose gray sleeveless shirt with printed text and black shorts while playing basketball indoors. He is holding and dribbling an orange basketball with black seams in his right hand. The background features a plain beige wall.\n",
"caption_ann": "The image shows a <1:man wearing a loose gray sleeveless shirt with printed text and black shorts> while playing basketball indoors. He is holding and dribbling an <2:orange basketball with black seams> in his right hand. The background features a <0:plain beige wall>.\n",
"id": 1283,
"image_id": "000000524027",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "man wearing a loose gray sleeveless shirt with printed text and black shorts"
},
{
"mask_ids": [
2
],
"txt_desc": "orange basketball with black seams"
},
{
"mask_ids": [
0
],
"txt_desc": "plain beige wall"
}
],
"labels": [
"wall-other-merged",
"person",
"sports ball"
]
} | [
{
"area": 222663,
"bbox": [
0,
0,
640,
427
],
"category_id": 199,
"id": 14816,
"image_id": "000000524027",
"iscrowd": 0,
"segmentation": {
"counts": "0So[13S^dN6K8D6L7K4M3L5L3F9K5N2N1O2N2N1O1O00001O00001O1O001O00001N1N3N1O1N3N1O11O00O2O0001O5J2O00001O0O1... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000528112 | 000000528112.jpg | {
"data_source": "COCONut",
"file_name": "000000528112.jpg",
"height": 426,
"id": "000000528112",
"width": 640
} | {
"caption": "A red truck carrying a large load of hay drives across a concrete bridge with a guardrail, while below on the calm river, a person sails a small boat with a bright orange sail, with three boats moored on the bank and a grassy field visible in the distance under a clear sky.",
"caption_ann": "A <6:red truck carrying a large load of hay> drives across a <2:concrete bridge> with a <4:guardrail>, while below on the <1:calm river>, a <7:person> sails a <5:small boat with a bright orange sail>, with <8,9,10:three boats> moored on the bank and a <3:grassy field> visible in the distance under a <0:clear sky>.",
"id": 1284,
"image_id": "000000528112",
"label_matched": [
{
"mask_ids": [
6
],
"txt_desc": "red truck carrying a large load of hay"
},
{
"mask_ids": [
2
],
"txt_desc": "concrete bridge"
},
{
"mask_ids": [
4
],
"txt_desc": "guardrail"
},
{
"mask_ids": [
1
],
"txt_desc": "calm river"
},
{
"mask_ids": [
7
],
"txt_desc": "person"
},
{
"mask_ids": [
5
],
"txt_desc": "small boat with a bright orange sail"
},
{
"mask_ids": [
8,
9,
10
],
"txt_desc": "three boats"
},
{
"mask_ids": [
3
],
"txt_desc": "grassy field"
},
{
"mask_ids": [
0
],
"txt_desc": "clear sky"
}
],
"labels": [
"sky-other-merged",
"river",
"bridge",
"grass-merged",
"fence-merged",
"boat",
"truck",
"person",
"boat",
"boat",
"boat"
]
} | [
{
"area": 96493,
"bbox": [
0,
0,
640,
173
],
"category_id": 187,
"id": 14819,
"image_id": "000000528112",
"iscrowd": 0,
"segmentation": {
"counts": "0j4`80000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000528225 | 000000528225.jpg | {
"data_source": "COCONut",
"file_name": "000000528225.jpg",
"height": 360,
"id": "000000528225",
"width": 640
} | {
"caption": "In a picturesque mountain landscape, a light-colored cow with a bell around its neck walks through a large, sloping pasture of vibrant green grass. Further in the background, a small cluster of traditional-style houses sits at the base of a hill that is covered with a dense forest. The peak of a mountain can be seen rising above the trees against the overcast, gray sky. Five other cows are scattered in the distance, grazing on the expansive field.",
"caption_ann": "In a picturesque <4:mountain landscape>, a <5:light-colored cow with a bell around its neck> walks through a <2:large, sloping pasture of vibrant green grass>. Further in the background, a <1:small cluster of traditional-style houses> sits at the base of a hill that is covered with a <3:dense forest>. The peak of a <4:mountain> can be seen rising above the trees against the <0:overcast, gray sky>. Five other <6,7,8,9,10:cows> are scattered in the distance, grazing on the <2:expansive field>.",
"id": 1285,
"image_id": "000000528225",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "mountain landscape"
},
{
"mask_ids": [
5
],
"txt_desc": "light-colored cow with a bell around its neck"
},
{
"mask_ids": [
2
],
"txt_desc": "large, sloping pasture of vibrant green grass"
},
{
"mask_ids": [
1
],
"txt_desc": "small cluster of traditional-style houses"
},
{
"mask_ids": [
3
],
"txt_desc": "dense forest"
},
{
"mask_ids": [
4
],
"txt_desc": "mountain"
},
{
"mask_ids": [
0
],
"txt_desc": "overcast, gray sky"
},
{
"mask_ids": [
6,
7,
8,
9,
10
],
"txt_desc": "cows"
},
{
"mask_ids": [
2
],
"txt_desc": "expansive field"
}
],
"labels": [
"sky-other-merged",
"house",
"grass-merged",
"tree-merged",
"mountain-merged",
"cow",
"cow",
"cow",
"cow",
"cow",
"cow"
]
} | [
{
"area": 20120,
"bbox": [
0,
0,
640,
47
],
"category_id": 187,
"id": 14830,
"image_id": "000000528225",
"iscrowd": 0,
"segmentation": {
"counts": "0b0f:002NO]E^O01_:a0aEA_:e0O1bEXOZ:m0N1O000000KiEVOZ:h04001O1OO10cEWO[:k00OQFVOG0f9i0gFWO@0i9j0ZFVO34e9f0U... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000528707 | 000000528707.jpg | {
"data_source": "COCONut",
"file_name": "000000528707.jpg",
"height": 480,
"id": "000000528707",
"width": 640
} | {
"caption": "The image shows two elephants used for carrying people, each equipped with wooden benches strapped to their backs. The elephant on the right carries two tourists, both seated and looking back, with one woman taking a photo. The scene is set in a forested area with trees and vegetation, and the animals are walking along a cleared dirt pathway that contrasts with the surroundings. On the left, a wooden structure with a raised floor is also visible, likely part of a gazebo or shelter just outside the frame.",
"caption_ann": "The image shows <3,7:two elephants> used for carrying people, each equipped with <5,6:wooden benches> strapped to their backs. The <7:elephant on the right> carries <4,8:two tourists>, both seated and looking back, with one <4:woman taking a photo>. The scene is set in a forested area with <1:trees and vegetation>, and the <3,7:animals> are walking along a <2:cleared dirt pathway> that contrasts with the surroundings. On the left, a wooden structure with a <0:raised floor> is also visible, likely part of a gazebo or shelter just outside the frame.",
"id": 1286,
"image_id": "000000528707",
"label_matched": [
{
"mask_ids": [
3,
7
],
"txt_desc": "two elephants"
},
{
"mask_ids": [
5,
6
],
"txt_desc": "wooden benches"
},
{
"mask_ids": [
7
],
"txt_desc": "elephant on the right"
},
{
"mask_ids": [
4,
8
],
"txt_desc": "two tourists"
},
{
"mask_ids": [
4
],
"txt_desc": "woman taking a photo"
},
{
"mask_ids": [
1
],
"txt_desc": "trees and vegetation"
},
{
"mask_ids": [
3,
7
],
"txt_desc": "animals"
},
{
"mask_ids": [
2
],
"txt_desc": "cleared dirt pathway"
},
{
"mask_ids": [
0
],
"txt_desc": "raised floor"
}
],
"labels": [
"floor-other-merged",
"tree-merged",
"dirt-merged",
"elephant",
"person",
"bench",
"bench",
"elephant",
"person"
]
} | [
{
"area": 5457,
"bbox": [
139,
217,
159,
64
],
"category_id": 190,
"id": 14841,
"image_id": "000000528707",
"iscrowd": 0,
"segmentation": {
"counts": "S\\Q2;e>0O2O1O1N10OaAB^>=dAC[>;hAEV>;lAES>9PBGo=8SBHl=7UBKh=4XBOg=0YB3e=K[B8d=G[B<d=B\\B`0Q>2M2N1O01O10... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000537685 | 000000537685.jpg | {
"data_source": "COCONut",
"file_name": "000000537685.jpg",
"height": 640,
"id": "000000537685",
"width": 564
} | {
"caption": "On a dark wooden windowsill in front of a window with a white lace curtain, a white teddy bear with a red bow and a dark brown teddy bear sit on two couch-shaped bookends holding up eight books.",
"caption_ann": "On a <0:dark wooden windowsill> in front of a <2:window> with a <1:white lace curtain>, a <7:white teddy bear with a red bow> and a <5:dark brown teddy bear> sit on <4,6:two couch-shaped bookends> holding up <3,8,9,10,11,12,13,14:eight books>.",
"id": 1287,
"image_id": "000000537685",
"label_matched": [
{
"mask_ids": [
0
],
"txt_desc": "dark wooden windowsill"
},
{
"mask_ids": [
2
],
"txt_desc": "window"
},
{
"mask_ids": [
1
],
"txt_desc": "white lace curtain"
},
{
"mask_ids": [
7
],
"txt_desc": "white teddy bear with a red bow"
},
{
"mask_ids": [
5
],
"txt_desc": "dark brown teddy bear"
},
{
"mask_ids": [
4,
6
],
"txt_desc": "two couch-shaped bookends"
},
{
"mask_ids": [
3,
8,
9,
10,
11,
12,
13,
14
],
"txt_desc": "eight books"
}
],
"labels": [
"table-merged",
"curtain",
"window-other",
"paper-merged",
"couch",
"teddy bear",
"couch",
"teddy bear",
"book",
"book",
"book",
"book",
"book",
"book",
"book"
]
} | [
{
"area": 81577,
"bbox": [
0,
395,
564,
245
],
"category_id": 189,
"id": 14850,
"image_id": "000000537685",
"iscrowd": 0,
"segmentation": {
"counts": "o<Q7o<00000000000000000000000000000000000000000000000000000000000000000000000O10000000000000000]OoHmC[7... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000539830 | 000000539830.jpg | {
"data_source": "COCONut",
"file_name": "000000539830.jpg",
"height": 429,
"id": "000000539830",
"width": 640
} | {
"caption": "The image displays an outdoor, daytime shot showing a man in a white helmet and yellow t-shirt, smiling towards the camera as he rides a wooden longboard down the road. To his right, a partially visible front of a red bus rides beside the man. In the background, dense green foliage and trees line the side of a concrete sidewalk pavement.",
"caption_ann": "The image displays an outdoor, daytime shot showing a <3:man in a white helmet and yellow t-shirt>, smiling towards the camera as he rides a <4:wooden longboard> down the <0:road>. To his right, a <5:partially visible front of a red bus> rides beside the <3:man>. In the background, <1:dense green foliage and trees> line the side of a <2:concrete sidewalk pavement>.",
"id": 1288,
"image_id": "000000539830",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "man in a white helmet and yellow t-shirt"
},
{
"mask_ids": [
4
],
"txt_desc": "wooden longboard"
},
{
"mask_ids": [
0
],
"txt_desc": "road"
},
{
"mask_ids": [
5
],
"txt_desc": "partially visible front of a red bus"
},
{
"mask_ids": [
3
],
"txt_desc": "man"
},
{
"mask_ids": [
1
],
"txt_desc": "dense green foliage and trees"
},
{
"mask_ids": [
2
],
"txt_desc": "concrete sidewalk pavement"
}
],
"labels": [
"road",
"tree-merged",
"pavement-merged",
"person",
"skateboard",
"bus"
]
} | [
{
"area": 20282,
"bbox": [
0,
385,
640,
44
],
"category_id": 149,
"id": 14865,
"image_id": "000000539830",
"iscrowd": 0,
"segmentation": {
"counts": "Q<\\1Q<0000000000000001O000000000000000000000000000000000000000000000000000000001O0000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000540868 | 000000540868.jpg | {
"data_source": "COCONut",
"file_name": "000000540868.jpg",
"height": 480,
"id": "000000540868",
"width": 640
} | {
"caption": "A young boy with blond hair wearing a striped shirt looks up while holding a chocolate frosted donut with sprinkles, standing on a tan and brown striped rug next to a green couch where a striped pillow and two remote controls are resting.",
"caption_ann": "A <5:young boy with blond hair wearing a striped shirt> looks up while holding a <2:chocolate frosted donut with sprinkles>, standing on a <0:tan and brown striped rug> next to a <6:green couch> where a <1:striped pillow> and <3,4:two remote controls> are resting.",
"id": 1289,
"image_id": "000000540868",
"label_matched": [
{
"mask_ids": [
5
],
"txt_desc": "young boy with blond hair wearing a striped shirt"
},
{
"mask_ids": [
2
],
"txt_desc": "chocolate frosted donut with sprinkles"
},
{
"mask_ids": [
0
],
"txt_desc": "tan and brown striped rug"
},
{
"mask_ids": [
6
],
"txt_desc": "green couch"
},
{
"mask_ids": [
1
],
"txt_desc": "striped pillow"
},
{
"mask_ids": [
3,
4
],
"txt_desc": "two remote controls"
}
],
"labels": [
"rug-merged",
"pillow",
"donut",
"remote",
"remote",
"person",
"couch"
]
} | [
{
"area": 85027,
"bbox": [
301,
0,
339,
480
],
"category_id": 200,
"id": 14871,
"image_id": "000000540868",
"iscrowd": 0,
"segmentation": {
"counts": "oa]41k>4O1N2N2N2M3N2N2L4H8M3M3N2N2N2O1O1M3L4L4N2N2N2N2N2N2N2N2N2N20000000000001O00000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000542054 | 000000542054.jpg | {
"data_source": "COCONut",
"file_name": "000000542054.jpg",
"height": 500,
"id": "000000542054",
"width": 333
} | {
"caption": "A black and white cat with its back arched stands on top of a cream colored shelf against a plain white wall, looking down into a red cup with a yellow and light blue flower design, while a row of books sits on the shelf below.",
"caption_ann": "A <2:black and white cat with its back arched> stands on top of a <1:cream colored shelf> against a <0:plain white wall>, looking down into a <3:red cup with a yellow and light blue flower design>, while a row of <4:books> sits on the shelf below.",
"id": 1290,
"image_id": "000000542054",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "black and white cat with its back arched"
},
{
"mask_ids": [
1
],
"txt_desc": "cream colored shelf"
},
{
"mask_ids": [
0
],
"txt_desc": "plain white wall"
},
{
"mask_ids": [
3
],
"txt_desc": "red cup with a yellow and light blue flower design"
},
{
"mask_ids": [
4
],
"txt_desc": "books"
}
],
"labels": [
"wall-other-merged",
"shelf",
"cat",
"cup",
"book"
]
} | [
{
"area": 58025,
"bbox": [
0,
0,
333,
500
],
"category_id": 199,
"id": 14878,
"image_id": "000000542054",
"iscrowd": 0,
"segmentation": {
"counts": "0k9i50000O1O100O100000000O100O100O100O1O1O1N20000O10000O11O0000000000WNPJnIP6R6YJeIg5[6cJ[I]5e6lJPIV5P7TK... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000542969 | 000000542969.jpg | {
"data_source": "COCONut",
"file_name": "000000542969.jpg",
"height": 360,
"id": "000000542969",
"width": 640
} | {
"caption": "The image shows a bird with a reddish-orange chest, dark gray wings, and a yellow beak perched on the top of some trees with different foliage and colors. The background features a clear blue sky in an outdoor setting.",
"caption_ann": "The image shows a <2:bird with a reddish-orange chest, dark gray wings, and a yellow beak> perched on the top of some <1:trees with different foliage and colors>. The background features a <0:clear blue sky> in an outdoor setting.",
"id": 1291,
"image_id": "000000542969",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "bird with a reddish-orange chest, dark gray wings, and a yellow beak"
},
{
"mask_ids": [
1
],
"txt_desc": "trees with different foliage and colors"
},
{
"mask_ids": [
0
],
"txt_desc": "clear blue sky"
}
],
"labels": [
"sky-other-merged",
"tree-merged",
"bird"
]
} | [
{
"area": 120327,
"bbox": [
0,
0,
640,
360
],
"category_id": 187,
"id": 14883,
"image_id": "000000542969",
"iscrowd": 0,
"segmentation": {
"counts": "0_8Q1dGd0\\8\\OdGd0\\8\\OdGd0\\8T11O1UNcGi0]8WOdGh0]8VOeGi0\\8VOdGXOM\\1`8ZOmGc0T8\\OnGb0R8_OoG?R8CkG=U8... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000544272 | 000000544272.jpg | {
"data_source": "COCONut",
"file_name": "000000544272.jpg",
"height": 512,
"id": "000000544272",
"width": 640
} | {
"caption": "A man wearing a black helmet, goggles, gloves, and a dark jacket is riding a black motorcycle on a smooth paved road. The front of the motorcycle features a circular headlamp with a yellow sticker displaying the number \\\"24.\\\" ",
"caption_ann": "A <2:man wearing a black helmet, goggles, gloves, and a dark jacket> is riding a <1:black motorcycle> on a <0:smooth paved road>. The front of the <1:motorcycle> features a circular headlamp with a yellow sticker displaying the number \\\"24.\\\" ",
"id": 1292,
"image_id": "000000544272",
"label_matched": [
{
"mask_ids": [
2
],
"txt_desc": "man wearing a black helmet, goggles, gloves, and a dark jacket"
},
{
"mask_ids": [
1
],
"txt_desc": "black motorcycle"
},
{
"mask_ids": [
0
],
"txt_desc": "smooth paved road"
},
{
"mask_ids": [
1
],
"txt_desc": "motorcycle"
}
],
"labels": [
"road",
"motorcycle",
"person"
]
} | [
{
"area": 204127,
"bbox": [
0,
0,
640,
512
],
"category_id": 149,
"id": 14886,
"image_id": "000000544272",
"iscrowd": 0,
"segmentation": {
"counts": "0g\\f1:mRZN5L5K4L2N3N2M2O1O1O1O001O001O0[OSOWBm0f=XOXBi0e=ZOZBf0e=\\OZBd0e=^OZBc0d=@ZB`0e=CYB=g=CYB>f=CX... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000544975 | 000000544975.jpg | {
"data_source": "COCONut",
"file_name": "000000544975.jpg",
"height": 640,
"id": "000000544975",
"width": 427
} | {
"caption": "The image shows a zebra and a giraffe in an enclosed area with sparse vegetation. The zebra stands on a dirt surface, closer to the camera, with its head lowered, possibly grazing. The giraffe is farther back, behind a rock wall, stretching its neck to reach some leaves. A small waterfall can also be seen trickling down the rocks to the left.",
"caption_ann": "The image shows a <3:zebra> and a <4:giraffe> in an enclosed area with <0:sparse vegetation>. The <3:zebra> stands on a <2:dirt surface>, closer to the camera, with its head lowered, possibly grazing. The <4:giraffe> is farther back, behind a <1:rock wall>, stretching its neck to reach some leaves. A small waterfall can also be seen trickling down the <1:rocks> to the left.",
"id": 1293,
"image_id": "000000544975",
"label_matched": [
{
"mask_ids": [
3
],
"txt_desc": "zebra"
},
{
"mask_ids": [
4
],
"txt_desc": "giraffe"
},
{
"mask_ids": [
0
],
"txt_desc": "sparse vegetation"
},
{
"mask_ids": [
3
],
"txt_desc": "zebra"
},
{
"mask_ids": [
2
],
"txt_desc": "dirt surface"
},
{
"mask_ids": [
4
],
"txt_desc": "giraffe"
},
{
"mask_ids": [
1
],
"txt_desc": "rock wall"
},
{
"mask_ids": [
1
],
"txt_desc": "rocks"
}
],
"labels": [
"tree-merged",
"rock-merged",
"dirt-merged",
"zebra",
"giraffe"
]
} | [
{
"area": 122370,
"bbox": [
0,
0,
427,
357
],
"category_id": 184,
"id": 14889,
"image_id": "000000544975",
"iscrowd": 0,
"segmentation": {
"counts": "0P;P900000000O10000000000000000O1000000000000O100000000000000000000000000000000O10000O10000000000001O000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000551023 | 000000551023.jpg | {
"data_source": "COCONut",
"file_name": "000000551023.jpg",
"height": 427,
"id": "000000551023",
"width": 640
} | {
"caption": "A woman is standing in ankle-deep water near the sand, facing the sea. She holds a colorful kite with a spiral design and ribbon tails hanging loosely by her side. Gentle waves roll in beneath a grey, overcast sky.\n",
"caption_ann": "A <4:woman> is standing in ankle-deep <2:water> near the <1:sand>, facing the <2:sea>. She holds a <3:colorful kite with a spiral design and ribbon tails> hanging loosely by her side. <2:Gentle waves> roll in beneath a <0:grey, overcast sky>.\n",
"id": 1294,
"image_id": "000000551023",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "woman"
},
{
"mask_ids": [
2
],
"txt_desc": "water"
},
{
"mask_ids": [
1
],
"txt_desc": "sand"
},
{
"mask_ids": [
2
],
"txt_desc": "sea"
},
{
"mask_ids": [
3
],
"txt_desc": "colorful kite with a spiral design and ribbon tails"
},
{
"mask_ids": [
2
],
"txt_desc": "Gentle waves"
},
{
"mask_ids": [
0
],
"txt_desc": "grey, overcast sky"
}
],
"labels": [
"sky-other-merged",
"sand",
"sea",
"kite",
"person"
]
} | [
{
"area": 63750,
"bbox": [
0,
0,
640,
110
],
"category_id": 187,
"id": 14894,
"image_id": "000000551023",
"iscrowd": 0,
"segmentation": {
"counts": "0^3m90000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000555577 | 000000555577.jpg | {
"data_source": "COCONut",
"file_name": "000000555577.jpg",
"height": 612,
"id": "000000555577",
"width": 612
} | {
"caption": "The image shows a cozy bedroom scene. A blond woman is sitting cross-legged on a large bed, holding a drink and looking to the side. On either side of the bed are bedside cabinets with glowing lights. The floor appears to be made of wood. The wall behind the bed is painted a warm tone and features a framed picture.\n",
"caption_ann": "The image shows a cozy bedroom scene. A <4:blond woman> is sitting cross-legged on a <5:large bed>, holding a drink and looking to the side. On either side of the bed are <2:bedside cabinets> with <3:glowing lights>. The <0:floor> appears to be made of wood. The <1:wall> behind the bed is painted a warm tone and features a framed picture.\n",
"id": 1295,
"image_id": "000000555577",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "blond woman"
},
{
"mask_ids": [
5
],
"txt_desc": "large bed"
},
{
"mask_ids": [
2
],
"txt_desc": "bedside cabinets"
},
{
"mask_ids": [
3
],
"txt_desc": "glowing lights"
},
{
"mask_ids": [
0
],
"txt_desc": "floor"
},
{
"mask_ids": [
1
],
"txt_desc": "wall"
}
],
"labels": [
"floor-other-merged",
"wall-other-merged",
"cabinet-merged",
"light",
"person",
"bed"
]
} | [
{
"area": 18262,
"bbox": [
0,
347,
612,
265
],
"category_id": 190,
"id": 14899,
"image_id": "000000555577",
"iscrowd": 0,
"segmentation": {
"counts": "Xb0l0Xb0000O100000000O1O100O1N2M3M300000000000000O10000000000001O00000000000000000000000000000000001O00... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000561613 | 000000561613.jpg | {
"data_source": "COCONut",
"file_name": "000000561613.jpg",
"height": 375,
"id": "000000561613",
"width": 500
} | {
"caption": "The image shows a fast-food meal on a red tray. There is a hot dog with chopped onions and mustard in the foreground, a hamburger sandwich behind it, and a portion of onion rings to the side. Everything is wrapped in paper and placed on a table.",
"caption_ann": "The image shows a fast-food meal on a red tray. There is a <4:hot dog with chopped onions and mustard> in the foreground, a <3:hamburger sandwich> behind it, and a <2:portion of onion rings> to the side. Everything is wrapped in <1:paper> and placed on a <0:table>.",
"id": 1296,
"image_id": "000000561613",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "hot dog with chopped onions and mustard"
},
{
"mask_ids": [
3
],
"txt_desc": "hamburger sandwich"
},
{
"mask_ids": [
2
],
"txt_desc": "portion of onion rings"
},
{
"mask_ids": [
1
],
"txt_desc": "paper"
},
{
"mask_ids": [
0
],
"txt_desc": "table"
}
],
"labels": [
"table-merged",
"paper-merged",
"food-other-merged",
"sandwich",
"hot dog"
]
} | [
{
"area": 6698,
"bbox": [
0,
0,
407,
53
],
"category_id": 189,
"id": 14905,
"image_id": "000000561613",
"iscrowd": 0,
"segmentation": {
"counts": "0e1R:000000O1000000000000O100000000O1000000000000O10000000000000000000000000000O1000000000000O100O100N2N2N2... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000574316 | 000000574316.jpg | {
"data_source": "COCONut",
"file_name": "000000574316.jpg",
"height": 479,
"id": "000000574316",
"width": 640
} | {
"caption": "In a bright green field under a blue sky with white fluffy clouds, a brown and white spotted cow with a yellow ear tag nuzzles two brown and white cows that look up at the camera, while a third white and brown cow stands behind them on the right, and a line of distant trees is visible in the background.",
"caption_ann": "In a <1:bright green field> under a <0:blue sky with white fluffy clouds>, a <3:brown and white spotted cow> with a yellow ear tag nuzzles <4,6:two brown and white cows> that look up at the camera, while a <5:third white and brown cow> stands behind them on the right, and a line of <2:distant trees> is visible in the background.",
"id": 1297,
"image_id": "000000574316",
"label_matched": [
{
"mask_ids": [
1
],
"txt_desc": "bright green field"
},
{
"mask_ids": [
0
],
"txt_desc": "blue sky with white fluffy clouds"
},
{
"mask_ids": [
3
],
"txt_desc": "brown and white spotted cow"
},
{
"mask_ids": [
4,
6
],
"txt_desc": "two brown and white cows"
},
{
"mask_ids": [
5
],
"txt_desc": "third white and brown cow"
},
{
"mask_ids": [
2
],
"txt_desc": "distant trees"
}
],
"labels": [
"sky-other-merged",
"grass-merged",
"tree-merged",
"cow",
"cow",
"cow",
"cow"
]
} | [
{
"area": 61871,
"bbox": [
0,
0,
640,
119
],
"category_id": 187,
"id": 14910,
"image_id": "000000574316",
"iscrowd": 0,
"segmentation": {
"counts": "0e2Z<001OO11O1O001OO1001O1O00N21O00001ON200O1000000001O1O0000O1001OO1000000O1001OO100O11O00000000001OO100... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000574665 | 000000574665.jpg | {
"data_source": "COCONut",
"file_name": "000000574665.jpg",
"height": 426,
"id": "000000574665",
"width": 640
} | {
"caption": "A light blue and grey cement mixer truck is parked on a paved road in front of a wall of cement blocks, with leafy green trees and a large brick building with multiple windows in the background.",
"caption_ann": "A <4:light blue and grey cement mixer truck> is parked on a <0:paved road> in front of a <2:wall of cement blocks>, with <3:leafy green trees> and a <1:large brick building with multiple windows> in the background.",
"id": 1298,
"image_id": "000000574665",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "light blue and grey cement mixer truck"
},
{
"mask_ids": [
0
],
"txt_desc": "paved road"
},
{
"mask_ids": [
2
],
"txt_desc": "wall of cement blocks"
},
{
"mask_ids": [
3
],
"txt_desc": "leafy green trees"
},
{
"mask_ids": [
1
],
"txt_desc": "large brick building with multiple windows"
}
],
"labels": [
"road",
"building-other-merged",
"wall-other-merged",
"tree-merged",
"truck"
]
} | [
{
"area": 58165,
"bbox": [
0,
326,
640,
100
],
"category_id": 149,
"id": 14917,
"image_id": "000000574665",
"iscrowd": 0,
"segmentation": {
"counts": "[:o2[:000000000000000000000000000000000001O000000000000000000000000000000000000000000000000000000000000... | [
{
"id": 1,
"name": "object"
}
] |
train | 000000575882 | 000000575882.jpg | {
"data_source": "COCONut",
"file_name": "000000575882.jpg",
"height": 428,
"id": "000000575882",
"width": 640
} | {
"caption": "On a beige couch, a man in an orange shirt uses a grey Toshiba laptop while sitting next to a person in grey pants who has a Siamese cat resting on their lap and a remote control placed on the right of the beige couch. A wooden table in the foreground holds an orange cup, a clear glass, a small white bowl, and a green spray bottle, while a tall white vase and bamboo blinds rests against the white wall.",
"caption_ann": "On a <4:beige couch>, a <13:man in an orange shirt> uses a <8:grey Toshiba laptop> while sitting next to a <5:person in grey pants> who has a <9:Siamese cat> resting on their lap and a <12:remote control> placed on the right of the <4:beige couch>. A <1:wooden table> in the foreground holds an <6:orange cup>, a <10:clear glass>, a <7:small white bowl>, and a <3:green spray bottle>, while a <11:tall white vase> and <2:bamboo blinds> rests against the <0:white wall>.",
"id": 1299,
"image_id": "000000575882",
"label_matched": [
{
"mask_ids": [
4
],
"txt_desc": "beige couch"
},
{
"mask_ids": [
13
],
"txt_desc": "man in an orange shirt"
},
{
"mask_ids": [
8
],
"txt_desc": "grey Toshiba laptop"
},
{
"mask_ids": [
5
],
"txt_desc": "person in grey pants"
},
{
"mask_ids": [
9
],
"txt_desc": "Siamese cat"
},
{
"mask_ids": [
12
],
"txt_desc": "remote control"
},
{
"mask_ids": [
4
],
"txt_desc": "beige couch"
},
{
"mask_ids": [
1
],
"txt_desc": "wooden table"
},
{
"mask_ids": [
6
],
"txt_desc": "orange cup"
},
{
"mask_ids": [
10
],
"txt_desc": "clear glass"
},
{
"mask_ids": [
7
],
"txt_desc": "small white bowl"
},
{
"mask_ids": [
3
],
"txt_desc": "green spray bottle"
},
{
"mask_ids": [
11
],
"txt_desc": "tall white vase"
},
{
"mask_ids": [
2
],
"txt_desc": "bamboo blinds"
},
{
"mask_ids": [
0
],
"txt_desc": "white wall"
}
],
"labels": [
"wall-other-merged",
"table-merged",
"curtain",
"bottle",
"couch",
"person",
"cup",
"bowl",
"laptop",
"cat",
"cup",
"vase",
"remote",
"person"
]
} | [
{
"area": 61673,
"bbox": [
191,
0,
449,
211
],
"category_id": 199,
"id": 14922,
"image_id": "000000575882",
"iscrowd": 0,
"segmentation": {
"counts": "Zm_21[=0hj00XUO0N20000O1_M1jELY13k84`EKM>2EP11`9W1]EjN2N0211`:T2]EmMd:S1_EhN0=NGd:k0]EPO=1C220`:n0QFRO\... | [
{
"id": 1,
"name": "object"
}
] |