Is there a way to support > 512 token length?
#5
by kk3dmax · opened
Will modifying max_position_embeddings from 514 to 2048 help?
The model doesn't support more than 512 tokens. Since the max position embeddings were 512 during training, simply increasing max_position_embeddings afterwards won't perform well.
The next version of the reranker will support longer text.
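Until that longer-context version lands, the usual workaround is simply to truncate inputs to the 512-token window before reranking. A minimal sketch (the function name is ours, not from the thread; with the Hugging Face tokenizer the equivalent is passing `truncation=True, max_length=512`):

```python
def truncate_to_window(token_ids, max_len=512):
    """Keep only the first max_len tokens, mirroring tokenizer truncation."""
    return token_ids[:max_len]

# Equivalent in practice with a Hugging Face tokenizer:
#   tokenizer(query, passage, truncation=True, max_length=512)
ids = list(range(600))       # stand-in for 600 token ids
clipped = truncate_to_window(ids)
print(len(clipped))          # -> 512
```

Note that head truncation discards the tail of long passages, so scores for documents whose relevant span falls past the 512th token will degrade.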
Do we have an ETA for the new version? Say, 1 month or 3 months from now?
We plan to release the next version of BGE within 2 months.
Thank you for your reply!
Hi there, I am wondering where we could find the currently supported max token length?
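One place to look is the checkpoint's config.json. For RoBERTa-family encoders (which also explains the 514 vs. 512 mismatch earlier in this thread), max_position_embeddings reserves two position slots for the padding offset, so the usable window is two less than the config value. A hedged sketch, using an inline dict to stand in for the real config file:

```python
# Stand-in for the model's config.json (values illustrative,
# matching the 514 mentioned above for XLM-RoBERTa-style models).
config = {"model_type": "xlm-roberta", "max_position_embeddings": 514}

# RoBERTa-style models reserve two position slots for the padding
# offset, so the effective token limit is the config value minus 2.
PADDING_OFFSET = 2
usable_tokens = config["max_position_embeddings"] - PADDING_OFFSET
print(usable_tokens)  # -> 512
```

Alternatively, `AutoTokenizer.from_pretrained(...).model_max_length` reports the same limit directly once the tokenizer is loaded.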