Dataset: yzhou992/tokenize_wikitext103
Format: parquet
Size: 100K - 1M rows
Libraries: Datasets, Dask, Croissant (+1 more)
Branch: main, path: tokenize_wikitext103/data
2 contributors, 8 commits
Latest commit: 43ea71a by yzhou992, "Upload data/validation-00000-of-00001.parquet with git-lfs", over 2 years ago
File                               Size    Storage  Last commit
test-00000-of-00001.parquet        559 kB  LFS      Upload data/test-00000-of-00001.parquet with git-lfs (over 2 years ago)
train-00000-of-00002.parquet       119 MB  LFS      Upload data/train-00000-of-00002.parquet with git-lfs (over 2 years ago)
train-00001-of-00002.parquet       119 MB  LFS      Upload data/train-00001-of-00002.parquet with git-lfs (over 2 years ago)
validation-00000-of-00001.parquet  506 kB  LFS      Upload data/validation-00000-of-00001.parquet with git-lfs (over 2 years ago)
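The filenames above follow the Hugging Face Hub parquet shard convention, `{split}-{index:05d}-of-{total:05d}.parquet`, which is how loaders map files to train/validation/test splits. As a minimal sketch (the function and variable names here are illustrative, not part of any library API), the grouping logic looks like this:

```python
import re
from collections import defaultdict

# Shard naming convention used by the files in this repo:
#   {split}-{index:05d}-of-{total:05d}.parquet
SHARD_RE = re.compile(r"^(?P<split>\w+)-(?P<index>\d{5})-of-(?P<total>\d{5})\.parquet$")

def group_shards(filenames):
    """Group shard filenames by split and check that no shard is missing."""
    splits = defaultdict(list)
    for name in filenames:
        m = SHARD_RE.match(name)
        if m is None:
            continue  # not a shard file (e.g. README.md)
        splits[m["split"]].append((int(m["index"]), int(m["total"])))
    # Each split's indices must be exactly 0..total-1
    for split, shards in splits.items():
        total = shards[0][1]
        indices = sorted(i for i, _ in shards)
        assert indices == list(range(total)), f"missing shards for {split!r}"
    return {s: sorted(i for i, _ in v) for s, v in splits.items()}

files = [
    "test-00000-of-00001.parquet",
    "train-00000-of-00002.parquet",
    "train-00001-of-00002.parquet",
    "validation-00000-of-00001.parquet",
]
print(group_shards(files))
# {'test': [0], 'train': [0, 1], 'validation': [0]}
```

In practice you would not parse these names yourself: the `datasets` library resolves the same convention automatically, e.g. `load_dataset("yzhou992/tokenize_wikitext103")` returns a `DatasetDict` with `train`, `validation`, and `test` splits built from these parquet shards.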