Update the tokenizer to insert added_tokens into the vocab
#75
opened by tuong-nguyen-prd
Insert the added_tokens into the vocab to fix errors like this one:
2024-06-05T05:20:37.878921Z WARN tokenizers::tokenizer::serialization: /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/tokenizers-0.19.1/src/tokenizer/serialization.rs:159: Warning: Token '<|assistant|>' was expected to have ID '32001' but was given ID 'None'
...
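For context, a minimal sketch of this kind of fix, assuming the standard tokenizer.json layout where added_tokens is a top-level list of entries with "id" and "content" fields and model.vocab maps token strings to IDs. The path and script are illustrative, not part of this PR:

```python
import json

# Illustrative path; point this at the repo's actual tokenizer.json.
path = "tokenizer.json"

with open(path, "r", encoding="utf-8") as f:
    tok = json.load(f)

vocab = tok["model"]["vocab"]

# Copy each added token into the vocab so the tokenizer can resolve
# the token string back to its expected ID during serialization.
for entry in tok.get("added_tokens", []):
    vocab.setdefault(entry["content"], entry["id"])

with open(path, "w", encoding="utf-8") as f:
    json.dump(tok, f, ensure_ascii=False, indent=2)
```

With the added tokens present in the vocab, '<|assistant|>' resolves to ID 32001 instead of None, which is what the warning above complains about.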
nguyenbh changed pull request status to closed