auto-datasets/tokenizer_config.json
{
  "tokenizer_config": {
    "vocab_size": 30000,
    "max_length": 51200,
    "token_type_ids": true,
    "pad_token_id": 0,
    "bos_token_id": 1,
    "eos_token_id": 2
  }
}
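
Note that the fields here sit under a top-level `tokenizer_config` key rather than at the root of the file, so a plain JSON read needs to index into that key. A minimal sketch of reading the values, assuming the file has been saved locally as `tokenizer_config.json`:

```python
import json

# Load the raw file; the path is an assumption (a local copy of this file).
with open("tokenizer_config.json", "r", encoding="utf-8") as f:
    config = json.load(f)["tokenizer_config"]  # fields are nested under this key

print(config["vocab_size"])    # 30000
print(config["max_length"])    # 51200
print(config["pad_token_id"])  # 0
print(config["bos_token_id"])  # 1
print(config["eos_token_id"])  # 2
```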