8andage committed
Commit 3838020
1 parent: ae0ccb0

Create added_tokens.json


Copied from alpaca-13B and native; solves an error when converting to ggml files:

Exception: Vocab size mismatch (model has 32001, but models/chavinlo_gpt4-x-alpaca/tokenizer.model has 32000)
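The mismatch arises because the checkpoint has 32001 embedding rows, while tokenizer.model defines only 32000 pieces; added_tokens.json registers the extra `[PAD]` token at id 32000 so the counts line up. A minimal sketch of that accounting (the variable names and the inline JSON are illustrative, not taken from the conversion script):

```python
import json

# Vocab sizes from the error message above.
base_vocab_size = 32000   # sentencepiece tokens in tokenizer.model
model_vocab_size = 32001  # embedding rows in the model checkpoint

# Contents of the added_tokens.json file created by this commit.
added_tokens = json.loads('{"[PAD]": 32000}')

# Extending the tokenizer vocab with the added tokens resolves the mismatch.
effective_vocab = base_vocab_size + len(added_tokens)
assert effective_vocab == model_vocab_size

# Each added token's id must also continue the base vocab's numbering.
assert all(tok_id >= base_vocab_size for tok_id in added_tokens.values())
```

Note that the id 32000 is deliberate: added-token ids must start where the base vocabulary ends, or the conversion would map two tokens to the same row.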

Files changed (1)
  1. added_tokens.json +3 -0
added_tokens.json ADDED
@@ -0,0 +1,3 @@
+{
+  "[PAD]": 32000
+}