bincorp_ep0.3_wo_ep/checkpoint-19000/tokenizer_config.json
Training in progress, step 19000
351d7a0
{
  "clean_up_tokenization_spaces": true,
  "model_max_length": 512,
  "tokenizer_class": "MarindaTokenizer"
}
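As a quick sanity check, the config above is plain JSON and can be parsed directly with the standard library. Note that actually instantiating the custom `MarindaTokenizer` class named here would require that class to be importable (e.g. via `trust_remote_code` in `transformers`), which is outside the scope of this sketch.

```python
import json

# The tokenizer_config.json contents as checked in at this checkpoint.
config_text = """
{
  "clean_up_tokenization_spaces": true,
  "model_max_length": 512,
  "tokenizer_class": "MarindaTokenizer"
}
"""

config = json.loads(config_text)

# model_max_length caps input length; inputs longer than this get truncated
# when truncation is enabled on the tokenizer.
print(config["tokenizer_class"])   # MarindaTokenizer
print(config["model_max_length"])  # 512
```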