TahaCakir committed on
Commit 18ce413
1 Parent(s): 502bef3

Upload tokenizer

Files changed (4)
  1. merges.txt +0 -0
  2. tokenizer.json +0 -0
  3. tokenizer_config.json +1 -1
  4. vocab.json +0 -0
merges.txt CHANGED
The diff for this file is too large to render. See raw diff
 
tokenizer.json CHANGED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json CHANGED
@@ -15,7 +15,7 @@
   "clean_up_tokenization_spaces": true,
   "eos_token": "<|endoftext|>",
   "errors": "replace",
-  "max_length": 100,
+  "max_length": 70,
   "model_max_length": 1024,
   "pad_token": null,
   "stride": 0,
vocab.json CHANGED
The diff for this file is too large to render. See raw diff
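
The only readable change in this commit is tokenizer_config.json, where max_length drops from 100 to 70 while model_max_length stays at 1024. The sketch below shows how those two limits would typically interact when the tokenizer is loaded with the transformers AutoTokenizer API. It is an assumption, based on the merges.txt/vocab.json pair and the "<|endoftext|>" EOS token, that this is a GPT-2-style BPE tokenizer, so the public "gpt2" checkpoint stands in for the actual repository.

from transformers import AutoTokenizer

# Stand-in checkpoint: the files in this commit (merges.txt, vocab.json,
# "<|endoftext|>", model_max_length 1024) look like a GPT-2-style BPE
# tokenizer, so "gpt2" is loaded here instead of the repo this commit targets.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

print(tokenizer.model_max_length)  # 1024, the hard cap from tokenizer_config.json

# Mirror the new max_length value (70) explicitly at call time, so truncation
# does not depend on whatever default happens to sit in tokenizer_config.json.
encoded = tokenizer(
    "Some example text to encode with the updated length limit.",
    truncation=True,
    max_length=70,
)
print(len(encoded["input_ids"]))  # at most 70 token ids

Passing max_length explicitly, as above, is the unambiguous way to reproduce the new limit; model_max_length (1024) remains the upper bound the tokenizer will warn about or truncate to regardless of the smaller per-call setting.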