calculator_model_test / tokenizer_config.json
Training in progress, step 50
a8bbd35 verified
{
"backend": "tokenizers",
"cls_token": "[CLS]",
"eos_token": "[EOS]",
"model_max_length": 1000000000000000019884624838656,
"pad_token": "[PAD]",
"tokenizer_class": "TokenizersBackend"
}
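For reference, the config above can be inspected programmatically. The sketch below (plain Python, standard library only) parses the JSON verbatim and pulls out the declared special tokens; the very large `model_max_length` is the integer value of `1e30`, the sentinel `transformers` writes when a tokenizer has no explicit maximum sequence length.

```python
import json

# tokenizer_config.json contents, copied verbatim from the file above
CONFIG_TEXT = """
{
  "backend": "tokenizers",
  "cls_token": "[CLS]",
  "eos_token": "[EOS]",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "TokenizersBackend"
}
"""

config = json.loads(CONFIG_TEXT)

# Special tokens declared for this tokenizer (keys ending in "_token")
special_tokens = {k: v for k, v in config.items() if k.endswith("_token")}

# model_max_length equals int(1e30): effectively "no length limit"
no_length_limit = config["model_max_length"] == int(1e30)
```

Loading the repo with `AutoTokenizer.from_pretrained(...)` would read these same fields; the snippet only shows what they contain.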