Where can I find the max_sequence_length that BLOOM supports?

#45
by ShaneSue - opened

I can't find the max_sequence_length that BLOOM supports. Where is it documented?

BigScience Workshop org

The maximum sequence length at training time was 2048 tokens, but since the model uses ALiBi positional encodings, it can handle sequences longer than that at inference time.
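The reason length is not hard-capped is that ALiBi adds a distance-based penalty to the attention scores instead of relying on learned position embeddings. Below is a minimal sketch of that bias computation (following the ALiBi paper's formulation, not BLOOM's exact implementation in transformers), assuming a power-of-two head count; note the sequence length can be larger than the 2048 tokens used in training.

```python
import torch

def alibi_slopes(n_heads: int) -> torch.Tensor:
    # Geometric sequence of per-head slopes from the ALiBi paper;
    # this simplified version assumes n_heads is a power of two.
    start = 2 ** (-8 / n_heads)
    return torch.tensor([start ** (i + 1) for i in range(n_heads)])

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    # Bias added to attention scores: a linear penalty on key position,
    # scaled per head. No learned position embedding table is involved,
    # so seq_len is not tied to the training context length.
    slopes = alibi_slopes(n_heads)                    # (n_heads,)
    positions = torch.arange(seq_len).view(1, 1, -1)  # (1, 1, seq_len)
    return -slopes.view(-1, 1, 1) * positions         # (n_heads, 1, seq_len)

# Example: biases for 4 heads and a 3000-token sequence (longer than 2048).
bias = alibi_bias(4, 3000)
print(bias.shape)  # torch.Size([4, 1, 3000])
```

Because softmax is invariant to a constant added per query row, penalizing each key by its absolute position is equivalent to penalizing the query-key distance, which is how the bias extrapolates to lengths never seen in training. In practice, quality may still degrade on sequences much longer than 2048 tokens, and memory usage grows with sequence length.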

ShaneSue changed discussion status to closed
