Limitation - context length of the BERT classifier?
#17 · by liyucheng · opened
The classifier takes only the first 1500 characters of each document as input, due to the context-length limit of BERT-family models.
This does not make full use of Llama's capabilities and can fail to identify lengthy educational content.
Do we have any long-context BERT-like models on Hugging Face?
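For what it's worth, encoders such as allenai/longformer-base-4096 or google/bigbird-roberta-base accept up to 4096 tokens. Below is a minimal sketch of my own (relying only on standard transformers tokenizer behavior, not on the classifier's actual code; the model names are candidate suggestions, not anything fineweb-edu currently uses) showing how much more of a long document survives default truncation:

```python
# Rough sketch: compare how many tokens survive default truncation
# under a standard BERT tokenizer vs. a long-context encoder.
# Model names are candidate suggestions, not the classifier's backbone.
from transformers import AutoTokenizer

# Stand-in for a long educational document (~20k characters)
text = "Photosynthesis converts light into chemical energy. " * 400

bert_tok = AutoTokenizer.from_pretrained("bert-base-uncased")             # 512-token limit
long_tok = AutoTokenizer.from_pretrained("allenai/longformer-base-4096")  # 4096-token limit

# truncation=True caps input_ids at each tokenizer's model_max_length
bert_ids = bert_tok(text, truncation=True)["input_ids"]
long_ids = long_tok(text, truncation=True)["input_ids"]

print(f"BERT keeps {len(bert_ids)} tokens; Longformer keeps {len(long_ids)} tokens")
```

Swapping in a longer-context backbone would of course mean re-training the classifier on the Llama annotations with the longer inputs, and the attention cost grows with sequence length.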