Upload tokenizer
#36
by
LucileSaulnier
- opened
No description provided.
When I load this model, I get the error "ValueError: Non-consecutive added token '<unk>' found. Should have index 32000 but has index 0 in saved vocabulary."
Should "added_tokens.json" be removed?
Yes, I'm getting the same error with the main branch, so I'll have a look.
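For context, a minimal sketch of why removing the `added_tokens.json` entry helps: added tokens are appended after the base vocabulary, so each should have an index >= the base vocab size. An entry like `'<unk>'` at index 0 points into the base vocab itself, which trips the consecutiveness check on load. The helper below is hypothetical (not part of `transformers`), and the vocab size of 32000 is taken from the error message:

```python
def find_conflicting_added_tokens(added_tokens: dict, vocab_size: int) -> dict:
    """Return added-token entries whose saved index falls inside the
    base vocabulary (index < vocab_size). Such tokens already exist in
    the base vocab, so their added_tokens.json entries are redundant
    and can be removed to fix the load error."""
    return {tok: idx for tok, idx in added_tokens.items() if idx < vocab_size}

# Hypothetical reproduction of the reported state: '<unk>' saved at
# index 0 while the loader expects added tokens to start at 32000.
added = {"<unk>": 0}
conflicts = find_conflicting_added_tokens(added, vocab_size=32000)
print(conflicts)  # -> {'<unk>': 0}
```

Deleting the flagged entries (here, the whole `added_tokens.json` if it only contains `'<unk>'`) should let the tokenizer load cleanly.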
LucileSaulnier
changed pull request status to
merged