What are the context window (input) and output token limits?

#24
by sniffski - opened

Hey guys, I'm interested in the context window limits of this model... I can't find a clear explanation with numbers online... 🤔

32K for input, as seen in `config.json`:

`"max_position_embeddings": 32768`

Thanks, and what about the output?

input + output = 32768. In a decoder-only model, the prompt and the generated tokens share a single context window, so the output budget is whatever is left of the 32768 after the prompt.
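As a rough sketch of that shared-window arithmetic (the 32768 limit is the `max_position_embeddings` value from `config.json`; the helper name below is made up for illustration, not a transformers API):

```python
# Prompt tokens and generated tokens count against the same window,
# so the room left for generation shrinks as the prompt grows.
# max_new_tokens() is a hypothetical helper, not a library function.

MAX_POSITION_EMBEDDINGS = 32768  # from the model's config.json

def max_new_tokens(prompt_tokens: int, limit: int = MAX_POSITION_EMBEDDINGS) -> int:
    """Tokens available for generation after the prompt fills part of the window."""
    if prompt_tokens >= limit:
        return 0
    return limit - prompt_tokens

# A 30k-token prompt leaves under 3k tokens for the answer.
print(max_new_tokens(30000))  # 2768
print(max_new_tokens(40000))  # 0: the prompt alone overflows the window
```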

In fact, according to the paper, the context length is 8192:

| Parameter | Value |
| --- | --- |
| dim | 4096 |
| n_layers | 32 |
| head_dim | 128 |
| hidden_dim | 14336 |
| n_heads | 32 |
| n_kv_heads | 8 |
| window_size | 4096 |
| context_len | 8192 |
| vocab_size | 32000 |
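Note that `window_size` and `context_len` are not the same thing: with sliding-window attention (as described in the Mistral 7B paper), each layer attends to at most `window_size` previous tokens, but stacking layers lets information propagate further. A quick back-of-the-envelope sketch using the table's values:

```python
# Sliding-window attention: each layer sees at most WINDOW_SIZE previous
# tokens, but information can hop one window per layer, so the theoretical
# attention span across the stack is roughly WINDOW_SIZE * N_LAYERS.

WINDOW_SIZE = 4096  # window_size from the table above
N_LAYERS = 32       # n_layers from the table above

theoretical_span = WINDOW_SIZE * N_LAYERS
print(theoretical_span)  # 131072
```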
