Unexpected Random Outputs

#2
by langematan - opened

I'm encountering an unexpected issue with this version. Specifically, I've been following the implementation example provided in the model card directly, anticipating a smooth start. However, the model's outputs are random and disjointed, unlike the coherent outputs in the Hugging Face Space.
What am I doing wrong?


Cohere For AI org

Hi, are you using the latest transformers commit from source? It needs to be installed with pip install 'git+https://github.com/huggingface/transformers.git'

I'm using transformers==4.39.3

Cohere For AI org

This model requires installing transformers from the source repo, since it relies on modifications not included in 4.39.3.
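A quick local sanity check for this: compare the installed transformers version against 4.39.3. This is a sketch that assumes the `packaging` library is available (it ships with pip); the version cutoff comes from the thread, and a source install typically reports a `.dev` version newer than the last release.

```python
from packaging import version

def needs_source_install(installed: str) -> bool:
    """Return True if this transformers version predates the Cohere changes
    described above (i.e., it is 4.39.3 or older)."""
    return version.parse(installed) <= version.parse("4.39.3")

# The pinned release from this thread still lacks the changes:
print(needs_source_install("4.39.3"))       # True
# A build installed from source reports a newer .dev version:
print(needs_source_install("4.40.0.dev0"))  # False
```

You could run this check with `transformers.__version__` after installing to confirm the source build actually took effect in your environment.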

ahmetustun changed discussion status to closed

Are you sure this isn't related to the tokenizer.json differences between the 4-bit bnb and the original Cohere model? We're seeing this cause issues downstream in other conversions too.

see: https://huggingface.co/CohereForAI/c4ai-command-r-plus/discussions/15
and: https://github.com/huggingface/transformers/pull/30027

Cohere For AI org
edited Apr 8

Hi @fbjr, the difference between the tokenizer.json files is the Unicode encoding in command-r-plus. The <|END_OF_TURN_TOKEN|> token is also set as special in the 4-bit repo because it is used as the eos_token, which is likewise overwritten in the original tokenizer (command-r-plus). Therefore, the tokenizers should behave the same. I cannot reproduce the issue that @langematan mentioned.
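The equivalence claimed above can be verified locally by comparing the eos_token settings of the two tokenizer configs. A minimal sketch with illustrative dicts (the values below mirror the thread, not the actual repo files, which you would load from each repo's tokenizer_config.json):

```python
def same_eos_setup(cfg_a: dict, cfg_b: dict) -> bool:
    """True if both tokenizer configs resolve to the same eos_token."""
    return cfg_a.get("eos_token") == cfg_b.get("eos_token")

# Hypothetical stand-ins for the original and 4-bit tokenizer configs:
original = {"eos_token": "<|END_OF_TURN_TOKEN|>"}
quantized = {"eos_token": "<|END_OF_TURN_TOKEN|>"}

print(same_eos_setup(original, quantized))  # True
```

If the two configs agreed on eos_token like this, differences in generation quality would point away from the tokenizer and toward the transformers version, as suggested earlier in the thread.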
