Any example for cross-encoding?

#1
by dcalsky - opened

I used the CrossEncoder class from the sentence_transformers package with the code below, but the result does not match the result from the Inference API (shown on the Hugging Face model card).

from sentence_transformers import CrossEncoder
model = CrossEncoder('./gte-large-zh', max_length=512)
scores = model.predict([
    ('That is a happy person', 'That is a happy dog'),
    ('That is a happy person', 'That is a very happy person'),
    ('That is a happy person', 'Today is a sunny day'),
], show_progress_bar=True)
print(scores)

It also produces this warning:

Some weights of BertForSequenceClassification were not initialized from the model checkpoint at ./gte-large-zh and are newly initialized: ['classifier.weight', 'classifier.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

My result:

[0.3533868  0.41155243 0.3988041 ]

[image: screenshot of the Inference API scores]

GTE is a text representation (bi-encoder) model, not a cross-encoder architecture, so it cannot be loaded with CrossEncoder. The warning confirms this: loading it as BertForSequenceClassification attaches a randomly initialized classification head, so the scores you got are meaningless.
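With a bi-encoder the supported workflow is to embed each sentence separately, e.g. with SentenceTransformer('./gte-large-zh').encode(...), and then rank pairs by cosine similarity of the embeddings. A minimal, self-contained sketch of that scoring step (the toy 3-d vectors below stand in for real model embeddings, so this runs without downloading the checkpoint):

```python
import numpy as np

def cosine_scores(query_emb, cand_embs):
    """Cosine similarity between one query embedding and each candidate embedding."""
    q = query_emb / np.linalg.norm(query_emb)
    c = cand_embs / np.linalg.norm(cand_embs, axis=1, keepdims=True)
    return c @ q

# Toy vectors standing in for model.encode(...) output.
query = np.array([1.0, 0.0, 0.0])
cands = np.array([
    [1.0, 0.0, 0.0],  # same direction  -> score 1.0
    [0.0, 1.0, 0.0],  # orthogonal     -> score 0.0
])
print(cosine_scores(query, cands))  # -> [1. 0.]
```

With real embeddings you would replace the toy arrays with model.encode(query) and model.encode(candidates); passing normalize_embeddings=True to encode makes the normalization step above redundant.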
