
Context-length

#1
by ceoofcapybaras - opened

Hey, thanks for releasing this model! I am curious why the title says 8K, and the description also says it has been trained to extend the context length from 2k -> 8k, but at the top of the model card it says "Context Length 2048", and config.json also says "seq_len": 2048.
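For reference, here is a minimal sketch of how the value in config.json can be checked directly; the repo id is a placeholder, and the "seq_len" key is assumed to follow the openlm-style config format mentioned above.

```python
import json
from huggingface_hub import hf_hub_download

# Placeholder repo id; substitute the actual model repository.
config_path = hf_hub_download(repo_id="apple/<model-repo>", filename="config.json")

with open(config_path) as f:
    config = json.load(f)

# openlm-style configs store the training context window under "seq_len".
print(config.get("seq_len"))
```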

Apple org

Thanks for catching this, I'll fix it.

ceoofcapybaras changed discussion status to closed
