Dataset: nectec/best2009
Tasks: Token Classification
Modalities: Text
Formats: parquet
Languages: Thai
Size: 100K < n < 1M rows
Tags: word-tokenization
Libraries: Datasets, pandas, Croissant
License: cc-by-nc-sa-3.0
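
Since the card lists the Datasets library, here is a minimal sketch of loading the corpus with the standard `datasets` API. The repository id and the train/test split names come from this page; the column names are not shown here, so the example only prints them rather than assuming a schema.

```python
from datasets import load_dataset

# Load both splits of the Thai word-tokenization corpus from the Hub.
best2009 = load_dataset("nectec/best2009")

print(best2009)                     # split names and row counts
print(best2009["train"][0].keys())  # column names (not listed on this page)
```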
Files and versions (branch: refs/convert/parquet, path: best2009/; 5 contributors; history: 31 commits)
Latest commit: Delete old duckdb index files (9d1af13, verified) by parquet-converter, 4 months ago

test/    Delete old duckdb index files    4 months ago
train/   Delete old duckdb index files    4 months ago
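
The test/ and train/ directories above hold the Hub's auto-converted parquet shards, so the splits can also be read directly with pandas over the hf:// filesystem. A sketch assuming the huggingface_hub fsspec integration; the shard file names are not shown on this page, so the example lists the directory first instead of hard-coding one.

```python
import pandas as pd
from huggingface_hub import HfFileSystem

fs = HfFileSystem()

# The shards live on the refs/convert/parquet branch; the "/" characters
# in the branch name must be percent-encoded inside an hf:// path.
shards = fs.ls(
    "datasets/nectec/best2009@refs%2Fconvert%2Fparquet/best2009/train",
    detail=False,
)

# Read the first train shard into a DataFrame.
df = pd.read_parquet(f"hf://{shards[0]}")
print(df.head())
```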