---
dataset_info:
  features:
  - name: input_ids
    sequence: int32
  - name: attention_mask
    sequence: int8
  - name: special_tokens_mask
    sequence: int8
  splits:
  - name: train
    num_bytes: 11317002300
    num_examples: 7310725
  - name: test
    num_bytes: 629128872
    num_examples: 406414
  - name: valid
    num_bytes: 628241868
    num_examples: 405841
  download_size: 2926527051
  dataset_size: 12574373040
---
# Dataset Card for "sinhala_30m_tokenized"

[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)