Hugging Face
Dataset: Q-bert/tokenized-wikipedia
Modalities: Text
Formats: parquet
Size: 100K - 1M
Libraries: Datasets, Dask, Croissant, + 1
License: mit
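A minimal sketch of loading this dataset with the Hugging Face `datasets` library (one of the libraries listed above). The repository id comes from this page; the `streaming=True` flag and the `train` split name are assumptions based on the standard `load_dataset` API and the split shown in the file browser.

```python
REPO_ID = "Q-bert/tokenized-wikipedia"  # dataset id from this page

def stream_first_rows(n=3):
    """Stream the first n rows of the train split without downloading
    all ~1.2 GB of parquet shards. Assumes the default config and a
    'train' split, as shown in the file browser below."""
    from datasets import load_dataset  # pip install datasets
    import itertools
    ds = load_dataset(REPO_ID, split="train", streaming=True)
    return list(itertools.islice(ds, n))

if __name__ == "__main__":
    for row in stream_first_rows():
        print(row)
```

Streaming avoids materializing the full set of parquet shards on disk, which matters for a dataset of this size.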
Branch: refs/convert/parquet
Path: tokenized-wikipedia/default/train
1 contributor, History: 3 commits
Latest commit by parquet-converter: "Delete old duckdb index files" (3cf1b3f, verified, 10 months ago)
All 14 shards are LFS-tracked parquet files, flagged Safe by the scanner, and were last touched by the commit "Update parquet files" about 1 year ago.

0000.parquet   84.6 MB
0001.parquet   85.3 MB
0002.parquet   84.8 MB
0003.parquet   84.2 MB
0004.parquet   84.4 MB
0005.parquet   85.2 MB
0006.parquet   84.8 MB
0007.parquet   85.1 MB
0008.parquet   84.6 MB
0009.parquet   84.7 MB
0010.parquet   83.2 MB
0011.parquet   84.1 MB
0012.parquet   85.1 MB
0013.parquet   83.6 MB