progressive-3 / tokenizer_config.json
Training in progress, step 500
c196c36 verified
{
  "model_max_length": 1000000000000000019884624838656
}
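The odd-looking `model_max_length` is not arbitrary: it equals `int(1e30)`. The transformers library uses `int(1e30)` (`VERY_LARGE_INTEGER`) as a sentinel meaning "no practical length limit", and because 10^30 is not exactly representable as a 64-bit float, casting the nearest double to an integer produces the trailing digits seen in the file. A minimal sketch reproducing the value:

```python
# 1e30 is a 64-bit float; the nearest representable double is slightly
# above 10**30, so converting it to int yields the sentinel written
# into tokenizer_config.json.
sentinel = int(1e30)
print(sentinel)        # 1000000000000000019884624838656
print(sentinel == 10**30)  # False: the float rounding shows through
```

In practice this value simply tells the tokenizer that the checkpoint declares no maximum sequence length, so any truncation limit must be supplied by the caller.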