calculator_model_test / tokenizer_config.json
Training in progress, step 200
5e04d2f verified
{
  "backend": "tokenizers",
  "cls_token": "[CLS]",
  "eos_token": "[EOS]",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "TokenizersBackend"
}
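A note on the unusual `model_max_length` value: `1000000000000000019884624838656` is exactly `int(1e30)`, the `VERY_LARGE_INTEGER` sentinel that the `transformers` library writes when no explicit maximum sequence length is set for a tokenizer. A minimal sketch verifying this, with the config inlined so no download is needed (the repo id is not referenced here, only the file contents shown above):

```python
import json

# tokenizer_config.json from above, inlined for a self-contained check.
config = json.loads("""
{
  "backend": "tokenizers",
  "cls_token": "[CLS]",
  "eos_token": "[EOS]",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "[PAD]",
  "tokenizer_class": "TokenizersBackend"
}
""")

# model_max_length equals transformers' VERY_LARGE_INTEGER sentinel, int(1e30),
# meaning "no explicit length limit recorded for this tokenizer".
assert config["model_max_length"] == int(1e30)
print(config["pad_token"])  # → [PAD]
```

In practice, code that truncates inputs should treat a `model_max_length` this large as "unbounded" and fall back to the model's own position-embedding limit.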