Sebb/german-nli-base-thesis
Text Classification · Transformers · PyTorch · bert · Inference Endpoints
german-nli-base-thesis/special_tokens_map.json (112 Bytes)
Sebb · add tokenizer · 36f9bd3 · almost 3 years ago
{
  "unk_token": "[UNK]",
  "sep_token": "[SEP]",
  "pad_token": "[PAD]",
  "cls_token": "[CLS]",
  "mask_token": "[MASK]"
}
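
When the tokenizer is loaded with transformers, the strings in this map populate the corresponding special-token attributes of the BERT tokenizer. A minimal sketch, assuming the repository's tokenizer files are intact and the model is public; the German sentence pair is an illustrative example, not from the repo:

# special_tokens_map.json tells transformers which literal strings fill
# each special-token role for this tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Sebb/german-nli-base-thesis")

# These attributes are populated from the map shown above.
print(tokenizer.unk_token)   # "[UNK]"  stands in for out-of-vocabulary tokens
print(tokenizer.sep_token)   # "[SEP]"  separates the two segments of a pair
print(tokenizer.pad_token)   # "[PAD]"  pads batches to a common length
print(tokenizer.cls_token)   # "[CLS]"  prepended token whose embedding feeds the classifier
print(tokenizer.mask_token)  # "[MASK]" used during masked-language-model pretraining

# Encoding a premise/hypothesis pair (illustrative sentences) inserts
# [CLS] and [SEP] automatically.
enc = tokenizer("Der Himmel ist blau.", "Der Himmel hat eine Farbe.")
print(tokenizer.convert_ids_to_tokens(enc["input_ids"]))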