llava-phi-2-3b / special_tokens_map.json
Upload tokenizer (commit f37b9fb, verified) · 587 Bytes
{
  "bos_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
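
For context, a minimal sketch of how this file is consumed when the tokenizer is loaded through transformers and how the special tokens above can be inspected. The repo id "marianna13/llava-phi-2-3b" and the use of trust_remote_code=True are assumptions inferred from the page context (uploader name and the custom_code tag), not something this file states.

# Minimal sketch: load the tokenizer and inspect the special tokens
# defined in special_tokens_map.json. Repo id and trust_remote_code
# are assumptions based on the surrounding page, not on this file.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "marianna13/llava-phi-2-3b",  # assumed repo id
    trust_remote_code=True,        # assumed, since the model is tagged custom_code
)

# All four entries in special_tokens_map.json point at "<|endoftext|>",
# so BOS, EOS, PAD and UNK resolve to the same vocabulary entry.
print(tokenizer.bos_token, tokenizer.eos_token, tokenizer.pad_token, tokenizer.unk_token)
print(tokenizer.convert_tokens_to_ids("<|endoftext|>"))

Because pad_token and eos_token are the same string, padded positions share the end-of-text id; anyone fine-tuning from this checkpoint typically masks padding through the attention mask rather than relying on a distinct pad id.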