Can the model be loaded with CrossEncoder?
Hi,
I want to use this model with CrossEncoder from sentence-transformers, but I am getting errors. mxbai-rerank-large-v1 worked fine with CrossEncoder, but v2 doesn't seem to work.
First, when I load the model, I get this warning:
```python
from sentence_transformers import CrossEncoder

if cfg.RERANKING_MODEL == "cross_encoder":
    reranking_model = CrossEncoder(
        "mixedbread-ai/mxbai-rerank-large-v2",
        max_length=512,
        device="mps",
        trust_remote_code=True,
    )
```

which prints:

```
Some weights of Qwen2ForSequenceClassification were not initialized from the model checkpoint at mixedbread-ai/mxbai-rerank-large-v2 and are newly initialized: ['score.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
```
And when I call the `rank` function, I get this error:
```
ValueError: Cannot handle batch sizes > 1 if no padding token is defined.
```
So can this model be loaded with CrossEncoder from Sentence Transformers, or what would be an alternative way to load the model on a Mac with the "mps" device?
For context: the sentence-transformers version is 3.4.1.
+1. Great model, but I would prefer to use sentence-transformers to unify inference and run quick experiments comparing it with other models.
BTW: the same issue occurs with the base version.