I can't find the max_sequence_length that BLOOM supports?
#45
by
ShaneSue
- opened
The maximum sequence length at training time was 2048 tokens, but since the model uses ALiBi position encodings, it supports sequences longer than that at inference time.
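To illustrate why ALiBi is length-agnostic: instead of learned position embeddings with a fixed table size, ALiBi adds a per-head linear bias to the attention logits based on query-key distance, which can be computed for any sequence length. A minimal sketch (not BLOOM's actual implementation, just the ALiBi bias from the paper):

```python
import math

def alibi_slopes(n_heads: int) -> list[float]:
    # Per-head slopes form a geometric sequence: 2^(-8/n), 2^(-16/n), ...
    return [2 ** (-8 * (i + 1) / n_heads) for i in range(n_heads)]

def alibi_bias(seq_len: int, n_heads: int) -> list[list[list[float]]]:
    # bias[h][q][k] = -slope_h * (q - k): added to attention logits so
    # distant keys are penalized linearly. No lookup table is involved,
    # so seq_len can exceed the 2048 tokens seen during training.
    slopes = alibi_slopes(n_heads)
    return [
        [[-m * (q - k) for k in range(q + 1)] for q in range(seq_len)]
        for m in slopes
    ]

# The bias is defined for any length, e.g. longer than training:
bias = alibi_bias(seq_len=3000, n_heads=4)
```

In practice you would pass longer inputs to the model directly; generation quality may still degrade well beyond the training length.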
ShaneSue
changed discussion status to
closed