RE: sequence length

#12
by jasonwang178 - opened

Meta states that all CodeLlama model variants can handle sequence lengths of up to 100,000 tokens. I noticed that you fine-tuned the model with a sequence length of 4,096. Does the fine-tuned model still support sequences of up to 100,000 tokens? Thank you!
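For context, here is a minimal sketch (not from this thread) of how one could inspect a checkpoint's configured context-window settings with Hugging Face Transformers; the model id below is only an illustrative example, not necessarily the fine-tuned checkpoint being discussed:

```python
from transformers import AutoConfig

# Illustrative model id; substitute the fine-tuned checkpoint in question.
config = AutoConfig.from_pretrained("codellama/CodeLlama-7b-hf")

# max_position_embeddings is the positional range the model was configured for;
# rope_theta is the RoPE base frequency that CodeLlama raises to extrapolate
# to long contexts.
print("max_position_embeddings:", config.max_position_embeddings)
print("rope_theta:", getattr(config, "rope_theta", None))
```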
