---
license: cc-by-sa-4.0
dataset_info:
features:
- name: video_id
dtype: string
- name: chunk_idx
dtype: int64
- name: chunk_text
dtype: string
- name: video_metadata
dtype: string
- name: video_language
dtype: string
- name: chunk_media
dtype: string
splits:
- name: shard_10339
num_bytes: 1997009
num_examples: 631
- name: shard_10400
num_bytes: 2638827
num_examples: 722
- name: shard_10324
num_bytes: 1700655
num_examples: 515
- name: shard_10418
num_bytes: 3034319
num_examples: 947
- name: shard_1045
num_bytes: 2042334
num_examples: 648
- name: shard_10428
num_bytes: 2314345
num_examples: 706
- name: shard_10435
num_bytes: 2300183
num_examples: 677
- name: shard_10424
num_bytes: 1839226
num_examples: 552
- name: shard_10442
num_bytes: 1543285
num_examples: 419
- name: shard_10411
num_bytes: 2005599
num_examples: 604
- name: shard_10344
num_bytes: 1796239
num_examples: 589
- name: shard_10439
num_bytes: 1780546
num_examples: 567
- name: shard_10351
num_bytes: 2156111
num_examples: 677
- name: shard_10446
num_bytes: 2117151
num_examples: 525
- name: shard_10457
num_bytes: 1851306
num_examples: 555
- name: shard_10464
num_bytes: 1316832
num_examples: 440
- name: shard_10405
num_bytes: 1820556
num_examples: 613
- name: shard_10471
num_bytes: 2397197
num_examples: 682
- name: shard_10311
num_bytes: 4072154
num_examples: 1148
- name: shard_10456
num_bytes: 1279577
num_examples: 430
- name: shard_1035
num_bytes: 2102014
num_examples: 687
- name: shard_10430
num_bytes: 2293697
num_examples: 686
- name: shard_10469
num_bytes: 2521584
num_examples: 743
- name: shard_10360
num_bytes: 2329044
num_examples: 680
- name: shard_10443
num_bytes: 2222280
num_examples: 641
- name: shard_10453
num_bytes: 3277011
num_examples: 931
- name: shard_10481
num_bytes: 2163505
num_examples: 709
- name: shard_10482
num_bytes: 1885620
num_examples: 503
- name: shard_10365
num_bytes: 1789825
num_examples: 453
- name: shard_10475
num_bytes: 2290432
num_examples: 635
- name: shard_10315
num_bytes: 2911312
num_examples: 743
- name: shard_10444
num_bytes: 1915386
num_examples: 550
- name: shard_10493
num_bytes: 2240928
num_examples: 752
- name: shard_10433
num_bytes: 1728758
num_examples: 554
- name: shard_10486
num_bytes: 1946726
num_examples: 564
- name: shard_1037
num_bytes: 1622214
num_examples: 464
- name: shard_1049
num_bytes: 2142677
num_examples: 691
- name: shard_10507
num_bytes: 1404701
num_examples: 444
- name: shard_10479
num_bytes: 2668644
num_examples: 706
- name: shard_10543
num_bytes: 1567113
num_examples: 498
- name: shard_10494
num_bytes: 2572169
num_examples: 834
- name: shard_10506
num_bytes: 2352799
num_examples: 689
- name: shard_10497
num_bytes: 2130672
num_examples: 640
- name: shard_10503
num_bytes: 2821589
num_examples: 657
- name: shard_10488
num_bytes: 2610372
num_examples: 824
- name: shard_1050
num_bytes: 2380295
num_examples: 610
- name: shard_10379
num_bytes: 2121338
num_examples: 596
- name: shard_10258
num_bytes: 2899614
num_examples: 881
- name: shard_10521
num_bytes: 1751228
num_examples: 578
- name: shard_10477
num_bytes: 1987455
num_examples: 610
- name: shard_10510
num_bytes: 1809438
num_examples: 536
- name: shard_10518
num_bytes: 1554268
num_examples: 534
- name: shard_10514
num_bytes: 2398872
num_examples: 659
- name: shard_10366
num_bytes: 2686341
num_examples: 715
- name: shard_10462
num_bytes: 3202984
num_examples: 912
- name: shard_10512
num_bytes: 2058849
num_examples: 697
- name: shard_10558
num_bytes: 2065125
num_examples: 572
- name: shard_10383
num_bytes: 2580580
num_examples: 859
- name: shard_10550
num_bytes: 2617491
num_examples: 643
- name: shard_10536
num_bytes: 2352902
num_examples: 649
- name: shard_10529
num_bytes: 1970611
num_examples: 633
- name: shard_10565
num_bytes: 1569669
num_examples: 522
- name: shard_10538
num_bytes: 2012923
num_examples: 564
- name: shard_10532
num_bytes: 1839647
num_examples: 594
- name: shard_10531
num_bytes: 2125990
num_examples: 618
- name: shard_10382
num_bytes: 1770026
num_examples: 493
- name: shard_10509
num_bytes: 1324378
num_examples: 402
- name: shard_10572
num_bytes: 1859423
num_examples: 489
- name: shard_1058
num_bytes: 1707150
num_examples: 491
- name: shard_10455
num_bytes: 3275368
num_examples: 750
- name: shard_10206
num_bytes: 3714862
num_examples: 891
- name: shard_10525
num_bytes: 3210740
num_examples: 892
- name: shard_10594
num_bytes: 1369358
num_examples: 458
- name: shard_10289
num_bytes: 3470407
num_examples: 963
- name: shard_10396
num_bytes: 3458836
num_examples: 956
- name: shard_10298
num_bytes: 2823620
num_examples: 791
download_size: 95273788
dataset_size: 169484311
configs:
- config_name: default
data_files:
- split: train
path: data/*.parquet
- split: shard_10339
path: data/shard_10339-*
- split: shard_10400
path: data/shard_10400-*
- split: shard_10424
path: data/shard_10424-*
- split: shard_10324
path: data/shard_10324-*
- split: shard_10428
path: data/shard_10428-*
- split: shard_10258
path: data/shard_10258-*
- split: shard_10396
path: data/shard_10396-*
- split: shard_10411
path: data/shard_10411-*
- split: shard_10418
path: data/shard_10418-*
- split: shard_10206
path: data/shard_10206-*
- split: shard_10442
path: data/shard_10442-*
- split: shard_1045
path: data/shard_1045-*
- split: shard_10289
path: data/shard_10289-*
- split: shard_10298
path: data/shard_10298-*
- split: shard_10344
path: data/shard_10344-*
- split: shard_10435
path: data/shard_10435-*
- split: shard_10311
path: data/shard_10311-*
- split: shard_10405
path: data/shard_10405-*
- split: shard_10464
path: data/shard_10464-*
- split: shard_10457
path: data/shard_10457-*
- split: shard_10439
path: data/shard_10439-*
- split: shard_10351
path: data/shard_10351-*
- split: shard_10446
path: data/shard_10446-*
- split: shard_10315
path: data/shard_10315-*
- split: shard_10471
path: data/shard_10471-*
- split: shard_1035
path: data/shard_1035-*
- split: shard_10456
path: data/shard_10456-*
- split: shard_10486
path: data/shard_10486-*
- split: shard_10430
path: data/shard_10430-*
- split: shard_10469
path: data/shard_10469-*
- split: shard_10360
path: data/shard_10360-*
- split: shard_10443
path: data/shard_10443-*
- split: shard_10453
path: data/shard_10453-*
- split: shard_10462
path: data/shard_10462-*
- split: shard_10481
path: data/shard_10481-*
- split: shard_10482
path: data/shard_10482-*
- split: shard_10365
path: data/shard_10365-*
- split: shard_10475
path: data/shard_10475-*
- split: shard_10444
path: data/shard_10444-*
- split: shard_10493
path: data/shard_10493-*
- split: shard_10433
path: data/shard_10433-*
- split: shard_1037
path: data/shard_1037-*
- split: shard_1049
path: data/shard_1049-*
- split: shard_10507
path: data/shard_10507-*
- split: shard_10521
path: data/shard_10521-*
- split: shard_10479
path: data/shard_10479-*
- split: shard_10543
path: data/shard_10543-*
- split: shard_10494
path: data/shard_10494-*
- split: shard_10565
path: data/shard_10565-*
- split: shard_10558
path: data/shard_10558-*
- split: shard_10506
path: data/shard_10506-*
- split: shard_10497
path: data/shard_10497-*
- split: shard_10503
path: data/shard_10503-*
- split: shard_10488
path: data/shard_10488-*
- split: shard_1050
path: data/shard_1050-*
- split: shard_10379
path: data/shard_10379-*
- split: shard_10366
path: data/shard_10366-*
- split: shard_10512
path: data/shard_10512-*
- split: shard_10529
path: data/shard_10529-*
- split: shard_10477
path: data/shard_10477-*
- split: shard_10510
path: data/shard_10510-*
- split: shard_10518
path: data/shard_10518-*
- split: shard_10514
path: data/shard_10514-*
- split: shard_10383
path: data/shard_10383-*
- split: shard_10550
path: data/shard_10550-*
- split: shard_10525
path: data/shard_10525-*
- split: shard_10536
path: data/shard_10536-*
- split: shard_10531
path: data/shard_10531-*
- split: shard_10538
path: data/shard_10538-*
- split: shard_10532
path: data/shard_10532-*
- split: shard_10382
path: data/shard_10382-*
- split: shard_10509
path: data/shard_10509-*
- split: shard_10572
path: data/shard_10572-*
- split: shard_1058
path: data/shard_1058-*
- split: shard_10455
path: data/shard_10455-*
- split: shard_10594
path: data/shard_10594-*
---
![VALID Dataset](https://huggingface.co/datasets/ontocord/VALID/resolve/main/banner1-1.webp)
# VALID (Video-Audio Large Interleaved Dataset)
## Overview
The **VALID (Video-Audio Large Interleaved Dataset)** is a multimodal dataset comprising approximately 720,000 [Creative Commons licensed](https://creativecommons.org/share-your-work/cclicenses/) videos crawled from YouTube and processed into audio-video-text data records for machine learning research. **We are still in the process of uploading, so please be patient.** The dataset is designed for training models to understand relationships between modalities such as video frames, audio clips, and multilingual text, making it suitable for applications like multimodal representation learning.
## Features
- Audio-Video-Text Format:
A combination of interleaved media followed by English text:
```
English text
```
- The non-text multimodal portion comes first in the data item and can include multiple media: some snippets have more than one audio clip and more than one video, while others contain only images/videos, or only audio, paired with English text. Each video contains multiple frames stored as images, with a text caption for each frame, and standalone images can also be interleaved.
  Although each audio-video snippet is no more than 10 seconds long, a data record may span more than 10 seconds (e.g., if a data item has two 10-second videos, the corresponding English text covers roughly 20 seconds of video).
  The intention of this format is to teach a model to associate multiple modalities with each other and to understand multiple audio-video elements in an interleaved fashion.
- Data Components:
  - **Images**: PNG format, perceptually hashed (phash) to ensure variability, with 0–10 images per audio snippet. Each image includes a caption created with Florence-2.
  - **Audio**: OGG format, multilingual, ~10 seconds per snippet, with shorter sound or music snippets (1–3 seconds) to minimize copyright issues. Each audio snippet is transcribed with Whisper for non-English speech, or with the original YouTube ASR for English.
  - **Text**: Excluding the captions and transcripts, the “text” portion is a concatenation of YouTube’s original English transcripts associated with the above media, totaling around 1–40 words per data record.
- Dataset Size:
- **About 7,000,000 records.**
  - **About 15,000,000 images, each captioned with Florence-2.**
  - **About 30,000,000 audio snippets, about half of which are transcribed with Whisper-large and half with YouTube ASR.**
  - **Divided into about 12K shards of about 600 records each, with one parquet file and a corresponding .tar.gz file for the media per shard (a minimal loading sketch follows this list).**
- **About 14TB in total.**
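
Below is a minimal loading sketch for the parquet side of a shard, using the 🤗 `datasets` library. It assumes the repo id `ontocord/VALID` (as in the banner URL) and the per-shard split names declared in the YAML header above; how `chunk_media` points into a shard's `.tar.gz` media archive is not documented here and is only noted as an assumption in the comments.

```python
# Minimal sketch (not an official loader): read one shard's parquet split
# and inspect its columns. Assumes the repo id "ontocord/VALID" and the
# per-shard split names declared in the YAML header of this card.
from datasets import load_dataset

shard = load_dataset("ontocord/VALID", split="shard_10339")

print(shard.column_names)
# Per the YAML header, the columns are:
# ['video_id', 'chunk_idx', 'chunk_text', 'video_metadata', 'video_language', 'chunk_media']

row = shard[0]
print(row["video_id"], row["chunk_idx"], row["video_language"])
print(row["chunk_text"][:200])   # interleaved English text for this record
# "chunk_media" is a string field; we assume it references media stored in the
# shard's companion .tar.gz archive (images/audio), which must be fetched separately.
print(row["chunk_media"][:200])
```

The full `train` split (all `data/*.parquet` files, per the `configs` section of the YAML header) can be loaded the same way by passing `split="train"`.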
## File Organization
- Each data entry follows the `