How to use this model directly from the `transformers` library:
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Capreolus/electra-base-msmarco")
model = AutoModelForSequenceClassification.from_pretrained("Capreolus/electra-base-msmarco")
```
This is an ELECTRA-Base model (`google/electra-base-discriminator`) fine-tuned on the MS MARCO passage classification task. It is intended to be used as a `ForSequenceClassification` model, but requires some modification since it contains a BERT classification head rather than the standard ELECTRA classification head. See the `TFElectraRelevanceHead` in the Capreolus BERT-MaxP implementation for a usage example.
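To illustrate the difference the modification has to bridge, here is a minimal sketch (hypothetical class name, not the Capreolus implementation): a BERT-style head is a single linear layer applied to the `[CLS]` token representation, whereas the standard `ElectraClassificationHead` in `transformers` inserts an intermediate dense + activation layer before the output projection.

```python
import torch
import torch.nn as nn

class BertStyleRelevanceHead(nn.Module):
    """Hypothetical sketch of a BERT-style classification head:
    one linear layer over the [CLS] representation. ELECTRA's default
    head instead applies dense -> GELU -> dropout -> projection."""

    def __init__(self, hidden_size: int = 768, num_labels: int = 2):
        super().__init__()
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, sequence_output: torch.Tensor) -> torch.Tensor:
        cls_rep = sequence_output[:, 0]  # take the [CLS] token
        return self.classifier(cls_rep)

head = BertStyleRelevanceHead()
dummy = torch.zeros(2, 16, 768)  # (batch, seq_len, hidden)
logits = head(dummy)
print(logits.shape)  # torch.Size([2, 2])
```

Because the checkpoint stores weights for a head of this single-layer shape, loading it into the two-layer ELECTRA head requires remapping rather than a plain `from_pretrained` call.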
This corresponds to the ELECTRA-Base model used to initialize PARADE (ELECTRA) in *PARADE: Passage Representation Aggregation for Document Reranking* by Li et al. It was converted from the released TFv1 checkpoint. Please cite the PARADE paper if you use these weights.