Reduced zipped file sizes so each is below 500 MB; this way Hugging Face can convert them to Parquet, support Croissant, and visualize them.
583c0cb
John Martins committed on
First full push of the dataset! -- Corrected a few small mistakes from the first uploaded batch regarding the indexing of the frames. All ~should~ be in order
d5c20ad
John Martins committed on
modified the style of frame labels and depth-image pathing so the dataset complies with HF's Parquet format and Croissant
9745379
John Martins committed on
deleted old frames folder --> replaced with 'data/'
95125cb
John Martins committed on
Uploading batch of December data (1/10)
59160bc
John Martins committed on
testing inner list organization for skeletons
1a562f0
John Martins committed on
experimenting with filenames
8b69ae5
John Martins committed on
Modifying metadata to reflect removal of inner data
7b175dd
John Martins committed on
Removing inner_depths images to see the effect on Parquet conversion
4e48ab7
John Martins committed on
metadata experimentation
e58fbb3
John Martins committed on
changed metadata again to test the dataset viewer and Parquet formatting
5d1bdb8
John Martins committed on
modified metadata to try including a 'filename' key, to see if it can be visualized by Hugging Face's dataset viewer and maybe get converted to Parquet and use Croissant
257edbf
John Martins committed on
Adding 1000 frames to test formatting and Hugging Face's dataset viewer