What is the maximum token length of the model when using the Inference API?

#5
by AIBoy1993 - opened

When the input string is long, the API call fails. What is the maximum token length the model accepts when using the Inference API?
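While waiting for the exact limit, one common workaround is to truncate the input client-side before calling the Inference API. The sketch below is only an approximation: it splits on whitespace, while real models use subword tokenizers, so the `max_tokens` value here (512 is a typical but assumed default) should be kept safely below the model's actual limit, which can be checked as `max_position_embeddings` in the model's `config.json`. The helper name is illustrative, not part of any library:

```python
def truncate_input(text: str, max_tokens: int = 512) -> str:
    """Roughly truncate text before sending it to the Inference API.

    Whitespace splitting only approximates a subword tokenizer,
    so stay below the model's real token limit by a safe margin.
    """
    tokens = text.split()
    return " ".join(tokens[:max_tokens])


# Example: a very long input is cut down before the API call.
long_text = "word " * 10_000
safe_text = truncate_input(long_text, max_tokens=512)
print(len(safe_text.split()))  # no more than 512 whitespace tokens
```

For an exact count, tokenizing locally with the model's own tokenizer (e.g. `transformers.AutoTokenizer` with `truncation=True`) gives a precise result, at the cost of an extra dependency.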
