context size

#2
by to-be - opened

I see that if I try this on a longer text, it stops in the middle. What is the window or context size that it accepts?

to-be changed discussion status to closed

The limit is 512 subword tokens, which is roughly 384 words; anything beyond that is cut off.
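
The thread doesn't say how the model is being loaded; assuming it is used through Sentence Transformers (and with a placeholder model name, since the actual model isn't named here), the limit can be inspected directly. A minimal sketch:

```python
# Sketch assuming a Sentence Transformers model; the model name is a placeholder.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Maximum number of subword tokens the model will encode per input.
print(model.max_seq_length)

# Tokens past that limit are dropped before encoding, which is why a long
# text appears to "stop in the middle": only the first part contributes.
embedding = model.encode("some very long text ...")
```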

@tomaarsen do you know of a good transformer encoder with a larger context size, or any technique to extend it?
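
While waiting for a suggestion, one common workaround (a sketch of a generic chunking technique, not advice from this thread; the model name and chunk size are assumptions) is to split the long text into chunks that fit the window, encode each chunk, and average the embeddings:

```python
# Rough chunk-and-mean-pool sketch; model name and chunk size are placeholders.
from sentence_transformers import SentenceTransformer
import numpy as np

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

def encode_long_text(text: str, words_per_chunk: int = 300) -> np.ndarray:
    """Split a long text into word chunks, encode each, and average the embeddings."""
    words = text.split()
    chunks = [
        " ".join(words[i:i + words_per_chunk])          # keep each chunk under the word limit
        for i in range(0, len(words), words_per_chunk)
    ] or [""]
    chunk_embeddings = model.encode(chunks)             # shape: (num_chunks, dim)
    return chunk_embeddings.mean(axis=0)                # one vector for the whole document
```

Mean pooling over chunks loses some ordering information, so it is only a stopgap compared to an encoder with a genuinely larger context window.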

My bad! Thanks for correcting me
