What is the context window size for Guanaco?
#2 opened by altryne
Would be great to know what the context window size is here!
It is the same as the base LLaMA model: 2048 tokens. It should be possible to fine-tune it to work with a larger context window, but I am not sure whether anybody has done that with LoRA yet. QLoRA would make this easy to study.
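For reference, you can check the context window programmatically: the `transformers` `LlamaConfig` exposes it as `max_position_embeddings`, and the library's defaults mirror the original LLaMA architecture (2048). A minimal sketch:

```python
from transformers import LlamaConfig

# LlamaConfig defaults follow the original LLaMA-1 architecture;
# max_position_embeddings is the model's context window in tokens.
config = LlamaConfig()
print(config.max_position_embeddings)  # 2048
```

For an actual checkpoint, loading its config with `AutoConfig.from_pretrained(...)` and reading the same attribute shows the window that model was trained with.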