Tokenizer configuration (tokenizer_config.json):

add_bos_token (bool): false
add_prefix_space (bool): false
added_tokens_decoder (dict): { "50256": { "content": "<|endoftext|>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false, "special": true } }
bos_token (string): "<|endoftext|>"
clean_up_tokenization_spaces (bool): true
eos_token (string): "<|endoftext|>"
errors (string): "replace"
model_max_length (int64): 1024
pad_token: null
tokenizer_class (string): "GPT2Tokenizer"
unk_token (string): "<|endoftext|>"
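As a minimal sketch of how these fields behave in practice (not part of the original page): loading a tokenizer whose directory contains this tokenizer_config.json with Hugging Face transformers exposes each field as an attribute. The path `"path/to/tokenizer"` is a hypothetical placeholder, and `AutoTokenizer` may return the fast variant of the class named in `tokenizer_class`.

```python
# Sketch: inspect a tokenizer loaded from a directory containing this
# tokenizer_config.json. "path/to/tokenizer" is a hypothetical path.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("path/to/tokenizer")

# Config fields map directly onto the loaded object:
print(type(tok).__name__)     # GPT2Tokenizer (or GPT2TokenizerFast)
print(tok.bos_token)          # <|endoftext|>  (bos_token)
print(tok.eos_token)          # <|endoftext|>  (eos_token)
print(tok.unk_token)          # <|endoftext|>  (unk_token)
print(tok.model_max_length)   # 1024           (model_max_length)
print(tok.pad_token)          # None, since pad_token is null

# Id 50256 from added_tokens_decoder is the <|endoftext|> special token:
print(tok.convert_tokens_to_ids("<|endoftext|>"))  # 50256
```

Note that because `pad_token` is null, batched padding fails out of the box for this configuration; a common workaround is to set `tok.pad_token = tok.eos_token` before padding.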
