8k Context Base Mode?

#3
by basiliskinstitute - opened

I'd like to train this model on long-context data, but I'm not sure how to modify the model to take advantage of this. I see there is an 8k chat model, so surely it must be possible?

InternLM org

You can just modify `max_position_embeddings` and start training.
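A minimal sketch of that step, assuming a Llama-style checkpoint whose `config.json` carries `max_position_embeddings` (the file path and initial value below are illustrative stand-ins, not taken from the actual InternLM checkpoint):

```python
import json
import os
import tempfile

# Stand-in for a downloaded checkpoint directory containing config.json.
cfg_dir = tempfile.mkdtemp()
cfg_path = os.path.join(cfg_dir, "config.json")
with open(cfg_path, "w") as f:
    json.dump({"max_position_embeddings": 2048}, f)

# Raise the context window to 8k before launching long-context fine-tuning.
with open(cfg_path) as f:
    cfg = json.load(f)
cfg["max_position_embeddings"] = 8192
with open(cfg_path, "w") as f:
    json.dump(cfg, f, indent=2)

print(cfg["max_position_embeddings"])
```

Note that enlarging the position-embedding limit only lets the model accept longer inputs; you still need to fine-tune on long sequences for it to use the extra context well.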
