Update config.json

#77
by Muennighoff - opened
BigScience Workshop org
edited Aug 13, 2022

I guess we don't need it, as the sequence length is unlimited in principle, even though 2048 was used during pretraining. The other BLOOM model configs have it; we could also remove it there instead.

BigScience Workshop org

We made sure not to add it as we're not bound to any sequence length.

TimeRobber changed pull request status to closed
BigScience Workshop org

Then we should remove it in the other ones though, e.g. https://huggingface.co/bigscience/bloom-7b1/blob/main/config.json
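A minimal sketch of the cleanup being discussed: dropping a key from a downloaded `config.json` before re-uploading it. The key name `seq_length` is an assumption for illustration; the thread does not name the exact field.

```python
import json

def drop_key(path: str, key: str = "seq_length") -> bool:
    """Remove `key` from a JSON config file if present.

    Returns True if the key was found and removed, False otherwise.
    Note: "seq_length" is a hypothetical key name used for illustration.
    """
    with open(path) as f:
        config = json.load(f)
    if key not in config:
        return False
    del config[key]
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
        f.write("\n")  # keep a trailing newline, as the Hub files have
    return True
```

The edited file would then be committed back to the model repo (e.g. via a pull request like this one).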

BigScience Workshop org

Still not sure why those configs aren't synchronized ... but yes.

BigScience Workshop org

Thanks @Muennighoff, indeed we should remove it from the other config files too.

BigScience Workshop org

Done already 👍
