
Capreolus/electra-base-msmarco

Tags: text-classification · pytorch · tf

Monthly model downloads: 247 (last 30 days)

Contributed by Capreolus (1 team member · 5 models)

How to use this model directly from the 🤗/transformers library:

from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Capreolus/electra-base-msmarco")
model = AutoModelForSequenceClassification.from_pretrained("Capreolus/electra-base-msmarco")
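
Continuing from the snippet above, a minimal sketch of scoring one query-passage pair, assuming the usual MS MARCO cross-encoder convention. The query and passage strings are illustrative, and because the checkpoint carries a BERT-style classification head (see the model description below), logits obtained through this vanilla loading path should be sanity-checked against the Capreolus implementation:

import torch

# Tokenize the query and passage as a sentence pair, as is standard
# for cross-encoder rerankers, and read off the classification logits.
inputs = tokenizer(
    "how do solar panels work",
    "Solar panels convert sunlight into electricity using photovoltaic cells.",
    return_tensors="pt",
    truncation=True,
    max_length=512,
)
with torch.no_grad():
    logits = model(**inputs).logits  # used as the relevance score for ranking
print(logits)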

Model description

ELECTRA-Base model (google/electra-base-discriminator) fine-tuned on the MS MARCO passage classification task. It is intended to be used with the *ForSequenceClassification classes, but requires some modification, since the checkpoint contains a BERT-style classification head rather than the standard ELECTRA classification head. See TFElectraRelevanceHead in the Capreolus BERT-MaxP implementation for a usage example.
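
For illustration, a minimal PyTorch sketch of the kind of modification described above: replacing ELECTRA's standard classification head with a BERT-style tanh-pooler-plus-classifier head. The class name and layer names are hypothetical, and the assumption that the head can simply be assigned to model.classifier (and that the checkpoint's head weights map onto these layers) must be checked against the authoritative TFElectraRelevanceHead in the Capreolus repository:

import torch
import torch.nn as nn
from transformers import AutoModelForSequenceClassification

class BertStyleRelevanceHead(nn.Module):
    """Illustrative BERT-style head: tanh pooler over [CLS], then a
    linear classifier, instead of ELECTRA's gelu-activated head."""

    def __init__(self, hidden_size, num_labels, dropout=0.1):
        super().__init__()
        self.dense = nn.Linear(hidden_size, hidden_size)
        self.dropout = nn.Dropout(dropout)
        self.out_proj = nn.Linear(hidden_size, num_labels)

    def forward(self, features, **kwargs):
        x = features[:, 0, :]          # [CLS] token representation
        x = torch.tanh(self.dense(x))  # BERT pooler-style activation
        x = self.dropout(x)
        return self.out_proj(x)

model = AutoModelForSequenceClassification.from_pretrained("Capreolus/electra-base-msmarco")
# Hypothetical head swap: these layers are freshly initialized, so the
# checkpoint's fine-tuned head weights would still need to be loaded
# into them, following the Capreolus code.
model.classifier = BertStyleRelevanceHead(model.config.hidden_size, model.config.num_labels)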

This corresponds to the ELECTRA-Base model used to initialize PARADE (ELECTRA) in PARADE: Passage Representation Aggregation for Document Reranking by Li et al. It was converted from the released TFv1 checkpoint. Please cite the PARADE paper if you use these weights.