gpt2chatbotenglish / tokenizer_config.json
Görkem Göknar, commit 2f7b134 ("initial no big file")
{
  "max_len": 1024,
  "bos_token": "<bos>",
  "eos_token": "<eos>",
  "unk_token": "<|endoftext|>",
  "pad_token": "<pad>",
  "additional_special_tokens": ["<speaker1>", "<speaker2>"]
}
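As a minimal sketch of how a config like this is consumed, the snippet below parses the JSON with the standard library only and gathers every special token the tokenizer must treat as an atomic unit (the usual loading path would be the `transformers` library; the inline string here just mirrors the file above for a self-contained example):

```python
import json

# The tokenizer_config.json content, inlined so the example is self-contained.
CONFIG_TEXT = (
    '{"max_len": 1024, "bos_token": "<bos>", "eos_token": "<eos>", '
    '"unk_token": "<|endoftext|>", "pad_token": "<pad>", '
    '"additional_special_tokens": ["<speaker1>", "<speaker2>"]}'
)

config = json.loads(CONFIG_TEXT)

# The four named special tokens plus the two extra speaker markers,
# which separate dialogue turns in a two-party chat format.
special_tokens = [
    config["bos_token"],
    config["eos_token"],
    config["unk_token"],
    config["pad_token"],
    *config["additional_special_tokens"],
]

print(config["max_len"])   # maximum sequence length in tokens
print(special_tokens)
```

Note that `unk_token` is `<|endoftext|>`, GPT-2's original end-of-text token, while separate `<bos>`/`<eos>`/`<pad>` tokens were added on top of the base vocabulary.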