Dataset: u-10bei/enwiki-20240301-tokenizer
Modalities: Text
Formats: parquet
Languages: English
Size: 1M - 10M
Libraries: Datasets, Dask, Croissant, + 1
License: cc-by-sa-4.0
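Since the card lists parquet as the format and Datasets among the supported libraries, the data can typically be pulled straight from the Hub with `datasets.load_dataset`. A minimal sketch, assuming public access and a default `train` split:

```python
from datasets import load_dataset

# Stream the train split of the parquet-backed dataset from the Hub
# (repo id taken from the card above; the split name "train" is assumed).
ds = load_dataset("u-10bei/enwiki-20240301-tokenizer", split="train", streaming=True)

# Inspect the first record to see the column layout.
print(next(iter(ds)))
```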
enwiki-20240301-tokenizer / data
2 contributors · History: 1 commit
Latest commit: a060ab2 (verified) · "Upload dataset" by u-10bei · 7 months ago
File                            Size     Last commit
train-00000-of-00003.parquet    335 MB   Upload dataset · 7 months ago
train-00001-of-00003.parquet    270 MB   Upload dataset · 7 months ago
train-00002-of-00003.parquet    250 MB   Upload dataset · 7 months ago

All three shards are stored with Git LFS.
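Because the shards are plain parquet files, they can also be read directly with Dask (listed on the card) over the Hub's `hf://` filesystem. A sketch, assuming `huggingface_hub` and `dask[dataframe]` are installed:

```python
import dask.dataframe as dd

# Lazily read all three train-*.parquet shards via the Hub filesystem
# (the hf:// protocol is provided by huggingface_hub's HfFileSystem).
df = dd.read_parquet(
    "hf://datasets/u-10bei/enwiki-20240301-tokenizer/data/train-*.parquet"
)

print(df.head())
```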