Anyone know the trained context length of the 7B instruct and chat models?

#20
by Dihf - opened


sam-mosaic (Mosaic ML, Inc. org)

@0xDing is correct: we trained with a 2048-token context length. ALiBi should let one extrapolate to 4096.
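
If it helps, here is a minimal sketch of running the model past the trained length by overriding the context window in the config. The model ID `mosaicml/mpt-7b-instruct`, the `max_seq_len` attribute, and the tokenizer choice are assumptions based on the MPT configuration, not something confirmed in this thread.

```python
# Minimal sketch, assuming the Hugging Face model ID 'mosaicml/mpt-7b-instruct'
# and that the MPT config exposes 'max_seq_len'. ALiBi is what allows running
# beyond the 2048-token training length, e.g. at 4096.
import transformers

model_id = "mosaicml/mpt-7b-instruct"

config = transformers.AutoConfig.from_pretrained(model_id, trust_remote_code=True)
config.max_seq_len = 4096  # extrapolate past the 2048 tokens used in training

model = transformers.AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    trust_remote_code=True,
)

# MPT models ship without their own tokenizer; the GPT-NeoX-20B tokenizer is
# the usual pairing (assumed here).
tokenizer = transformers.AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
```

Extrapolation quality beyond the trained length isn't guaranteed, so it's worth evaluating on your own long-context inputs before relying on the full 4096.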

sam-mosaic changed discussion status to closed
