polyglot-ko-1.3b-chat / special_tokens_map.json
heegyu · Upload tokenizer · 00e156b
185 Bytes
{
  "additional_special_tokens": [
    "<|endoftext|>",
    "<|sep|>",
    "<|acc|>",
    "<|tel|>",
    "<|rrn|>"
  ],
  "eos_token": "<|endoftext|>",
  "pad_token": "<|endoftext|>"
}
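
For context, here is a minimal sketch of how this special_tokens_map.json is picked up when the tokenizer is loaded through the transformers library. The repo id heegyu/polyglot-ko-1.3b-chat is assumed from the file path and uploader shown above; the printed values simply mirror the JSON fields.

from transformers import AutoTokenizer

# special_tokens_map.json is read automatically at load time
# (repo id assumed from the file path above)
tokenizer = AutoTokenizer.from_pretrained("heegyu/polyglot-ko-1.3b-chat")

print(tokenizer.eos_token)                  # <|endoftext|>
print(tokenizer.pad_token)                  # <|endoftext|>
print(tokenizer.additional_special_tokens)  # ['<|endoftext|>', '<|sep|>', '<|acc|>', '<|tel|>', '<|rrn|>']

# Tokens listed in the map are treated as atomic: each maps to a single id
# and is never split into sub-word pieces.
ids = tokenizer("안녕하세요<|sep|>", add_special_tokens=False)["input_ids"]
print(tokenizer.convert_ids_to_tokens(ids))

Because pad_token is mapped to <|endoftext|>, padded positions reuse the end-of-text id rather than a dedicated padding id, a common choice for GPT-NeoX-style checkpoints.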