Update script to use JSON with fixed licenses.

#3 · opened by cdleong (SIL International - AI org)

Need to: update the loading script to use the license-updated JSON.


OK, I've got a problem: there are more album IDs referenced in the annotations than there are albums.

$ jq .albums[].id data/bloom_vist_june15_deduped_june21_langfiltered_june22_with_storylets_licenseupdated.json |sort|uniq|wc -l
7520
$ jq .annotations[][].album_id data/bloom_vist_june15_deduped_june21_langfiltered_june22_with_storylets_licenseupdated.json |sort|uniq|wc -l
9232
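Roughly the same check in Python, as a sketch (it assumes the layout the jq paths above imply, i.e. albums is a list of dicts with an id, and annotations is a list of lists of dicts, each carrying an album_id):

```python
import json

# Rough sketch of the jq check above (assumed layout: "albums" is a list of
# album dicts; "annotations" is a list of lists of annotation dicts).
path = "data/bloom_vist_june15_deduped_june21_langfiltered_june22_with_storylets_licenseupdated.json"
with open(path) as f:
    data = json.load(f)

album_ids = {album["id"] for album in data["albums"]}
referenced_ids = {ann["album_id"] for group in data["annotations"] for ann in group}

print(len(album_ids))                   # 7520
print(len(referenced_ids))              # 9232
print(len(referenced_ids - album_ids))  # album_ids in annotations with no matching album
```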

Hmmm...

python get_json_top_level_keys.py data/bloom_vist_june15_deduped_june21_langfiltered_june22_with_storylets_licenseupdated.json
data/bloom_vist_june15_deduped_june21_langfiltered_june22_with_storylets_licenseupdated.json
utc_creation_date: 2022-06-21 18:51:02
duplicate_ids_list: <class 'dict'>: 7520 items, 7520 keys
albums: <class 'list'>: 7520 items
images: <class 'list'>: 91190 items
annotations: <class 'list'>: 114359 items
stories: <class 'dict'>: 11548 items, 11548 keys
last_filter_date: <class 'list'>: 1 items
licenses updated: <class 'list'>: 1 items
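(For context, get_json_top_level_keys.py just prints each top-level key with its type and size; something like this sketch, not necessarily the actual script:)

```python
import json
import sys

# Sketch of a top-level-key dump like the output above (not the actual script).
path = sys.argv[1]
print(path)
with open(path) as f:
    data = json.load(f)

for key, value in data.items():
    if isinstance(value, dict):
        print(f"{key}: {type(value)}: {len(value)} items, {len(value.keys())} keys")
    elif isinstance(value, list):
        print(f"{key}: {type(value)}: {len(value)} items")
    else:
        print(f"{key}: {value}")
```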

Presumably this happened in the dedupe process; we need to update the annotations so they point to albums that still exist?


Error message I'm getting:

Traceback (most recent call last):
  File "/home/cleong/projects/personal/SIL/bloom-captioning/test_load.py", line 521, in <module>
    datasetdict = load_dataset("sil-ai/bloom-captioning", random_lang, use_auth_token=True)
  File "/home/cleong/miniconda3/envs/bloom/lib/python3.9/site-packages/datasets/load.py", line 1691, in load_dataset
    builder_instance.download_and_prepare(
  File "/home/cleong/miniconda3/envs/bloom/lib/python3.9/site-packages/datasets/builder.py", line 605, in download_and_prepare
    self._download_and_prepare(
  File "/home/cleong/miniconda3/envs/bloom/lib/python3.9/site-packages/datasets/builder.py", line 1104, in _download_and_prepare
    super()._download_and_prepare(dl_manager, verify_infos, check_duplicate_keys=verify_infos)
  File "/home/cleong/miniconda3/envs/bloom/lib/python3.9/site-packages/datasets/builder.py", line 672, in _download_and_prepare
    split_generators = self._split_generators(dl_manager, **split_generators_kwargs)
  File "/home/cleong/.cache/huggingface/modules/datasets_modules/datasets/sil-ai--bloom-captioning/d137db11c10a94fe158f58c34fdcd3747799068149fd8143e5eff04ed4625db8/bloom-captioning.py", line 218, in _split_generators
    image_captioning_annotations = vist_annotations_to_image_captioning(
  File "/home/cleong/.cache/huggingface/modules/datasets_modules/datasets/sil-ai--bloom-captioning/d137db11c10a94fe158f58c34fdcd3747799068149fd8143e5eff04ed4625db8/bloom-captioning.py", line 128, in vist_annotations_to_image_captioning
    corresponding_album = album_dicts_indexed_by_id[annotation_album_id]
KeyError: 'a72af1a3-98df-4902-9fab-0ca1a3fe59b1'

OK, checking through the duplicate_ids_list, we find that a72af1a3-98df-4902-9fab-0ca1a3fe59b1 is a dupe of d71c13b9-6fb7-4815-afac-577c4c9f436a.
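That lookup can be scripted; a sketch, assuming duplicate_ids_list maps each keeper album ID to the list of IDs that were deduped into it:

```python
import json

# Find which "keeper" album a missing ID was deduped into.
# Assumes duplicate_ids_list maps keeper album ID -> list of duplicate IDs.
path = "data/bloom_vist_june15_deduped_june21_langfiltered_june22_with_storylets_licenseupdated.json"
with open(path) as f:
    data = json.load(f)

missing_id = "a72af1a3-98df-4902-9fab-0ca1a3fe59b1"
keeper_id = next(
    keeper for keeper, dupes in data["duplicate_ids_list"].items() if missing_id in dupes
)
print(keeper_id)  # d71c13b9-6fb7-4815-afac-577c4c9f436a
```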


So we've got a couple of solutions:

  • if there's a KeyError, we go hunting through the duplicate_ids_list.
  • before we do any of this, we just go through that list and point all the dupe IDs at the top-level "keeper" album. e.g. d71c13b9-6fb7-4815-afac-577c4c9f436a has the following as dupes, and we point all of these IDs to it (see the sketch after this ID list):
"a72af1a3-98df-4902-9fab-0ca1a3fe59b1",
    "5855aae8-571d-4cc6-8a43-83f7cdbfad46",
    "b7c7911b-4222-4de2-92f9-6b2264a58353",
    "49c4a620-7327-46a6-a4cb-63be26334723",
    "fb68f97c-ef2d-4fbc-87be-571d73f4177e",
    "cd37d67d-646a-4e12-a367-addf165783dc",
    "a68ace8b-5659-4201-9846-5ad93ad7a8ae",
    "b2b4feb8-c7fa-4208-b256-2b4d8a02cba2",
    "ec252467-559b-4e50-8ded-243cd48025d4",
    "7f734ce2-1ecc-4fe2-a682-44261886f2b9",
    "d61ab154-dd2b-407f-b5df-a18b3a52dc9e",

One potential issue: we are using the license from whichever album we picked as the "keeper", not the licenses of the various albums designated as its duplicates.
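If we want to know how often that actually matters, something like this could count license mismatches within each duplicate group (a sketch; it assumes each album dict has a "license" field, and it can only compare dupes whose album entries still exist in this JSON):

```python
# Count duplicate groups where a dupe's license differs from the keeper's.
# "data" is the parsed VIST JSON, loaded as in the sketches above.
# Dupes whose albums were already dropped from "albums" can't be compared and are skipped.
albums_by_id = {album["id"]: album for album in data["albums"]}

mismatched_groups = 0
for keeper_id, dupe_ids in data["duplicate_ids_list"].items():
    keeper_license = albums_by_id.get(keeper_id, {}).get("license")
    dupe_licenses = {
        albums_by_id[dupe_id].get("license")
        for dupe_id in dupe_ids
        if dupe_id in albums_by_id
    }
    if dupe_licenses and dupe_licenses != {keeper_license}:
        mismatched_groups += 1

print(mismatched_groups)
```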
