gpt2large-lhm-05 / tokenizer_config.json
{
"model_max_length": 1000000000000000019884624838656,
"name_or_path": "/home/ma/s/schroederl/XNEXT/xnext/data/tokenizer_fast",
"special_tokens_map_file": "/home/ma/s/schroederl/XNEXT/xnext/data/tokenizer_fast/special_tokens_map.json",
"tokenizer_class": "PreTrainedTokenizerFast"
}
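
This config tells the transformers library to instantiate a PreTrainedTokenizerFast, with model_max_length set to the library's "effectively unbounded" sentinel (roughly 1e30). Below is a minimal sketch of loading and using the tokenizer; the repo id "schreon/gpt2large-lhm-05" is inferred from the page context and may differ.

# Minimal sketch, assuming the tokenizer is hosted at schreon/gpt2large-lhm-05.
from transformers import AutoTokenizer

# AutoTokenizer reads tokenizer_config.json and instantiates the
# tokenizer_class it names, here PreTrainedTokenizerFast.
tokenizer = AutoTokenizer.from_pretrained("schreon/gpt2large-lhm-05")

# Because model_max_length is the huge sentinel value, no length limit
# is enforced unless one is passed explicitly at call time.
ids = tokenizer("Hello world", truncation=True, max_length=128)["input_ids"]
print(ids)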