What is the context window size for Guanaco?

by altryne

It would be great to know what the context window size is here!

University of Washington NLP org

It is the same as the LLaMA model: 2048 tokens. I think it is possible to fine-tune it to work with a larger context window, but I am not sure whether anybody has done this with LoRA. I think QLoRA would make this easy to study.
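
If you want to verify this yourself, here is a minimal sketch using the `transformers` library. It reads the context length from the base model's config and truncates inputs to fit; the `huggyllama/llama-7b` model ID is illustrative, since Guanaco is an adapter on top of a LLaMA base checkpoint.

```python
# Minimal sketch: check the context window of a LLaMA-based model and
# truncate inputs to fit. The model ID below is an assumption, not a
# confirmed path for this particular Guanaco release.
from transformers import AutoConfig, AutoTokenizer

model_id = "huggyllama/llama-7b"  # hypothetical base checkpoint

# The context window is stored as max_position_embeddings in the config.
config = AutoConfig.from_pretrained(model_id)
print(config.max_position_embeddings)  # 2048 for the original LLaMA

# Truncate prompts to the 2048-token window so generation does not overflow.
tokenizer = AutoTokenizer.from_pretrained(model_id)
inputs = tokenizer(
    "Some long prompt...",
    truncation=True,
    max_length=config.max_position_embeddings,
    return_tensors="pt",
)
print(inputs["input_ids"].shape)
```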
