Context size?

#1 by dasChronos1 - opened

"max_position_embeddings": 8192,

Is this a typo/copy-paste error, or is this the real context size?

Looks like the config got updated after I made it. It shouldn't affect the quantization process, so it should be safe to just update the config.

I'll make the changes on my side, but if you've already downloaded it, just make your config.json match this one:

https://huggingface.co/Nitral-AI/HerculeanSea-7b-128k/blob/main/config.json
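Not from the thread, but here's a minimal sketch of one way to do that update: pull the corrected config.json from the linked repo with huggingface_hub and copy it over your local one. The local path is a placeholder, so adjust it to wherever your downloaded copy lives.

```python
# Sketch: sync a local config.json with the upstream repo's version.
import shutil
from huggingface_hub import hf_hub_download

# Download the corrected config.json from the original repo linked above.
upstream_config = hf_hub_download(
    repo_id="Nitral-AI/HerculeanSea-7b-128k",
    filename="config.json",
)

# Overwrite the config shipped with your local copy (hypothetical path).
local_config = "path/to/your/downloaded-model/config.json"
shutil.copyfile(upstream_config, local_config)
print(f"Replaced {local_config} with the upstream config")
```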

Updated all sizes. Thanks for pointing it out.
