Thanks for all the nice models. Unfortunately, all CodeLlamas throw "KeyError: 'pad_token_id'"

#2
by DQ83 - opened

When I try loading with ExLlama, the KeyError occurs. Any clue what's happening? I am on the latest oobabooga.

Add this: "pad_token_id": 0,
to generation_config.json

Thank you @TheYuriLover

I have now fixed this on all branches of all CodeLlama GPTQ repos.

@TheBloke I made a mistake: you have to put "pad_token_id": 0, in config.json instead

Very nice, thanks for the clarification, and thank you for the update @TheBloke

@TheYuriLover Yeah, I just learned that. I think it should be in both, actually.

I have just now added it to config.json in all repos as well.
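For anyone who already downloaded a copy of one of the repos before the fix, a minimal sketch of applying the same change locally (the model directory path is a placeholder, not from the thread):

```python
import json
from pathlib import Path

def add_pad_token_id(config_path, pad_token_id=0):
    """Add "pad_token_id" to a model config JSON file if it is missing."""
    path = Path(config_path)
    config = json.loads(path.read_text())
    if "pad_token_id" not in config:
        config["pad_token_id"] = pad_token_id
        path.write_text(json.dumps(config, indent=2))
    return config

# Per the discussion above, patch both files (paths are placeholders):
# add_pad_token_id("models/CodeLlama-GPTQ/config.json")
# add_pad_token_id("models/CodeLlama-GPTQ/generation_config.json")
```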

TheBloke changed discussion status to closed
