maximedb / latexical
Transformers · PyTorch · bert · Inference Endpoints
latexical / special_tokens_map.json — maximedb, "add tokenizer" (dbf8b35), almost 3 years ago
112 Bytes
{
  "unk_token": "[UNK]",
  "sep_token": "[SEP]",
  "pad_token": "[PAD]",
  "cls_token": "[CLS]",
  "mask_token": "[MASK]"
}
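The file above maps tokenizer roles to BERT's standard special-token strings. As a minimal sketch of how such a payload could be consumed, the snippet below parses the JSON and normalizes each entry to a plain string; `load_special_tokens` is a hypothetical helper written for illustration, not part of the `transformers` API, and the string literal simply reproduces the file contents shown above.

```python
import json

# Contents of special_tokens_map.json as shown above.
SPECIAL_TOKENS_MAP = """{
  "unk_token": "[UNK]",
  "sep_token": "[SEP]",
  "pad_token": "[PAD]",
  "cls_token": "[CLS]",
  "mask_token": "[MASK]"
}"""


def load_special_tokens(raw: str) -> dict:
    """Parse a special_tokens_map.json payload into a role -> token dict.

    Some tokenizer exports store each value as an object with a "content"
    key instead of a bare string; normalize both shapes to plain strings.
    """
    tokens = json.loads(raw)
    return {
        role: (tok["content"] if isinstance(tok, dict) else tok)
        for role, tok in tokens.items()
    }


special = load_special_tokens(SPECIAL_TOKENS_MAP)
print(special["cls_token"])  # [CLS]
```

In practice this file is read automatically when a tokenizer is loaded (e.g. via `AutoTokenizer.from_pretrained`), so direct parsing like this is mainly useful for inspecting or validating a repository's tokenizer files.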