Tags: Text2Text Generation · Transformers · PyTorch · English · t5 · text-generation-inference · Inference Endpoints

Maximum sequence length

#2
by SaraAmd

What is the maximum sequence length the model can handle?
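For a T5-style checkpoint like this one, the practical limit is usually the length the tokenizer was configured with (commonly 512 tokens), while the architecture itself uses relative position biases rather than a hard absolute position limit. Below is a minimal sketch of how one could inspect those values; the model id is a placeholder, not the actual repo name.

```python
from transformers import AutoConfig, AutoTokenizer

# Placeholder model id; substitute the checkpoint from this repository.
model_id = "t5-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
config = AutoConfig.from_pretrained(model_id)

# Length the tokenizer was configured for (often 512 for T5 checkpoints).
print("tokenizer.model_max_length:", tokenizer.model_max_length)

# T5 configs expose relative-attention settings instead of a fixed absolute
# position limit; longer inputs are possible but cost memory and may degrade quality.
print("relative_attention_max_distance:",
      getattr(config, "relative_attention_max_distance", None))
```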
