longformer-large-science / added_tokens.json
upload the model (resized tokenizer)
{"<|par|>": 50274, "</|title|>": 50272, "</|sec|>": 50266, "<|sec-title|>": 50267, "<|sent|>": 50273, "<|title|>": 50271, "<|abs|>": 50269, "<|sec|>": 50265, "</|sec-title|>": 50268, "</|abs|>": 50270}