mk-bart-small-v3 / tokenizer_config.json
Training in progress, step 500
3e53022
{
  "model_max_length": 1000000000000000019884624838656,
  "name_or_path": "digit82/kobart-summarization",
  "special_tokens_map_file": "/Users/digit82/.cache/huggingface/transformers/3e6abf40f4fadbea9e7b539c182868d979838d8f7e6cdcdf2ed52ddcf01420c0.15447ae63ad4a2eba8bc7a5146360711dc32b315b4f1488b4806debf35315e9a",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
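A minimal sketch of inspecting this config with Python's standard `json` module (the `special_tokens_map_file` cache path is omitted for brevity). The huge `model_max_length` value is `int(1e30)`, the sentinel `VERY_LARGE_INTEGER` that `transformers` writes when no real maximum input length was recorded for the tokenizer:

```python
import json

# tokenizer_config.json contents from above (cache path omitted for brevity).
raw = '''{
  "model_max_length": 1000000000000000019884624838656,
  "name_or_path": "digit82/kobart-summarization",
  "tokenizer_class": "PreTrainedTokenizerFast"
}'''

config = json.loads(raw)

# int(1e30) is transformers' VERY_LARGE_INTEGER sentinel, meaning
# "no maximum length stored"; callers should set a real limit themselves.
assert config["model_max_length"] == int(1e30)
print(config["tokenizer_class"])  # PreTrainedTokenizerFast
```

Because the sentinel effectively disables truncation, downstream code would typically pass an explicit `max_length` when encoding with this tokenizer.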