Wakaka / bert-finetuned-imdb
Tags: Text Classification · Transformers · PyTorch · imdb · bert · generated_from_trainer · Eval Results · Inference Endpoints
License: apache-2.0
bert-finetuned-imdb / special_tokens_map.json
Wakaka · add tokenizer · 000f467 · over 1 year ago
112 Bytes
{
  "unk_token": "[UNK]",
  "sep_token": "[SEP]",
  "pad_token": "[PAD]",
  "cls_token": "[CLS]",
  "mask_token": "[MASK]"
}
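The map above lists BERT's five standard special tokens (unknown, separator, padding, classification, and mask). In the Hugging Face `transformers` library this file is read automatically when a tokenizer is loaded with `from_pretrained`; a minimal sketch of how the mapping can be parsed and inspected with only the standard library (file contents reproduced inline, so no download is needed):

```python
import json

# Contents of special_tokens_map.json, reproduced inline from this repo.
SPECIAL_TOKENS_MAP = """
{
  "unk_token": "[UNK]",
  "sep_token": "[SEP]",
  "pad_token": "[PAD]",
  "cls_token": "[CLS]",
  "mask_token": "[MASK]"
}
"""

# Parse the JSON into a dict mapping token role -> token string.
tokens = json.loads(SPECIAL_TOKENS_MAP)

# Each key names a role the tokenizer assigns to one literal token:
# unk_token for out-of-vocabulary words, sep_token between segments,
# pad_token for batch padding, cls_token for sequence classification,
# and mask_token for masked-language-model training.
for role, symbol in tokens.items():
    print(f"{role}: {symbol}")
```

Loading the full tokenizer instead (e.g. `AutoTokenizer.from_pretrained("Wakaka/bert-finetuned-imdb")`) exposes the same values as attributes such as `tokenizer.cls_token`.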