Token size limit?

#17 by Kishan2k2

What is the token limit for the input?

512 tokens is the limit for RoBERTa-based models.
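
You can verify this yourself with the transformers library. A minimal sketch, using roberta-base as a stand-in for whatever checkpoint this model actually uses:

```python
from transformers import AutoTokenizer, AutoConfig

# "roberta-base" is a placeholder; substitute the actual checkpoint name.
checkpoint = "roberta-base"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
config = AutoConfig.from_pretrained(checkpoint)

# Maximum sequence length the tokenizer will accept.
print(tokenizer.model_max_length)      # 512

# RoBERTa's positional embedding table has 514 entries: 512 usable
# positions plus 2 offset positions reserved for padding handling.
print(config.max_position_embeddings)  # 514
```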

Can we increase it?

"can we increase it" is such a bizarre thing to say. you clearly have no idea how this works

Can you explain, please?

@Nishant474 Different models have different "context window" sizes (the maximum number of tokens you can pass as input); it depends on the specific architecture. BERT and RoBERTa have a fixed limit of 512 tokens. To understand why this number is fixed you can have a look here, but you need some awareness of how transformers work and are trained: in short, the positional embeddings are learned for a fixed number of positions, so the model simply has no representation for token 513 onward. See the 3Blue1Brown videos or Medium articles for background.
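
In practice, the usual workaround for longer inputs is to truncate or chunk the text rather than raise the limit. A sketch of both options, again assuming roberta-base as the checkpoint:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")  # assumed checkpoint

long_text = "some very long document " * 500

# Option 1: truncate anything past the 512-token window.
inputs = tokenizer(long_text, truncation=True, max_length=512,
                   return_tensors="pt")
print(inputs["input_ids"].shape)  # torch.Size([1, 512])

# Option 2: split into overlapping 512-token chunks (64-token overlap)
# and run the model on each chunk separately.
chunks = tokenizer(long_text, truncation=True, max_length=512,
                   stride=64, return_overflowing_tokens=True)
print(len(chunks["input_ids"]))  # number of chunks produced
```

Chunking with an overlap (the stride) helps avoid cutting sentences in half at chunk boundaries, at the cost of running the model more than once per document.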
