Quick Q: context size?

#2 opened by rboehme86

Hi,

thanks first off for the awesome work! I just wanted to quickly ask what context size you trained with, and whether you tested for stable recall. Does it work at 8k or 16k?

Best

Robert

As this model is a fine-tune of Mixtral, the maximum sequence length is 32k. In our DPO fine-tune we limited the max sequence length to 1024.
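For anyone who wants to double-check the 32k figure, it can be read straight off the base model's config with `transformers` (only the config JSON is fetched, not the weights). Note that the 1024 cap is a training-time truncation, so it won't show up in the config. The repo id below is the public base Mixtral model, used here only as an illustration:

```python
# Quick check of the architectural context limit inherited from Mixtral.
# Only the config file is downloaded, no weights.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("mistralai/Mixtral-8x7B-v0.1")
print(cfg.max_position_embeddings)           # 32768 -> the 32k limit mentioned above
print(getattr(cfg, "rope_theta", None))      # RoPE base; a plain fine-tune leaves this unchanged
print(getattr(cfg, "sliding_window", None))  # attention window setting, if any
```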

On the leaderboard it scored better than Mixtral 8x7B.

But it looks like recall is only okay-ish up to about 3k context length.
Will there be a newer version in the future to improve its context length?

https://old.reddit.com/r/LocalLLaMA/comments/190r59u/long_context_recall_pressure_test_batch_2/
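For context, the linked pressure test buries a "needle" fact at different depths inside filler text and then asks the model to retrieve it. A rough, minimal version of that kind of probe could look like the sketch below; the model id is a placeholder, the filler is deliberately crude, and it skips the chat template for brevity:

```python
# Rough needle-in-a-haystack probe, in the spirit of the linked pressure test.
# MODEL_ID is a placeholder -- substitute the fine-tune you want to test.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-org/your-mixtral-finetune"  # placeholder

tok = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.float16, device_map="auto"
)

needle = "The secret passphrase is 'violet sparrow 42'."
# Repeat a bland sentence to build a few thousand tokens of filler.
filler = "The sky was clear and the market was quiet that day. " * 400

for depth in (0.1, 0.5, 0.9):  # how far into the context the needle is buried
    cut = int(len(filler) * depth)
    context = filler[:cut] + needle + " " + filler[cut:]
    prompt = context + "\n\nQuestion: What is the secret passphrase?\nAnswer:"
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=20, do_sample=False)
    answer = tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
    print(f"depth={depth:.0%}  ctx_tokens={inputs['input_ids'].shape[1]}  ->  {answer.strip()}")
```

Varying the filler length lets you see roughly where retrieval starts to fail (e.g. around the ~3k mark reported above).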

I seriously don't know what's causing it to degrade that much. Could it be the training parameters?
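One training parameter that plausibly matters is the 1024-token cap mentioned above: if preference pairs are truncated to 1024 tokens during DPO, the model never sees long sequences in that stage. The actual training script isn't shown in this thread, so the snippet below is only a hypothetical sketch of where such a cap typically enters a TRL `DPOTrainer` run, with a tiny stand-in model and placeholder data:

```python
# Hypothetical sketch (not the authors' actual script): where a 1024-token
# cap typically enters a TRL DPO run. Model and dataset are placeholders.
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

MODEL_ID = "gpt2"  # small stand-in; a real run would start from a Mixtral SFT checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# Minimal preference dataset with the columns DPOTrainer expects.
train_dataset = Dataset.from_dict({
    "prompt":   ["What is the capital of France?"],
    "chosen":   [" The capital of France is Paris."],
    "rejected": [" France has no capital."],
})

trainer = DPOTrainer(
    model,
    ref_model=None,  # TRL builds a frozen reference copy when None
    beta=0.1,
    args=TrainingArguments(output_dir="dpo-out", per_device_train_batch_size=1,
                           num_train_epochs=1, report_to=[]),
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    max_length=1024,        # <- pairs longer than this are truncated: the 1024 cap
    max_prompt_length=512,  # placeholder split between prompt and response
    # (in newer TRL versions these length settings live in DPOConfig instead)
)
trainer.train()
```

A cap like this only limits what the DPO stage reinforces; whether it alone explains recall dropping off around 3k is an open question in this thread.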
