
This model reproduces a variant of the TCT-ColBERT-V2 dense retrieval models described in the following paper:

Sheng-Chieh Lin, Jheng-Hong Yang, and Jimmy Lin. In-Batch Negatives for Knowledge Distillation with Tightly-Coupled Teachers for Dense Retrieval. RepL4NLP 2021.

Our reproduction report is available in Pyserini.
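As a rough illustration, the sketch below loads the checkpoint with Hugging Face transformers and scores a query–passage pair by the dot product of mean-pooled token embeddings. This is a simplified assumption for pooling and preprocessing; the exact query/passage encoding used by TCT-ColBERT (e.g., its special marker tokens and fixed-length query padding) is implemented by the reference encoder classes in Pyserini.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Minimal sketch, not the reference Pyserini encoder.
model_name = "castorini/tct_colbert-v2-hnp-msmarco"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

def encode(texts):
    # Tokenize, run the BERT encoder, and mean-pool token embeddings
    # over non-padding positions (a simplified pooling choice).
    inputs = tokenizer(texts, padding=True, truncation=True,
                       max_length=256, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    hidden = outputs.last_hidden_state              # (batch, seq_len, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)   # (batch, seq_len, 1)
    return (hidden * mask).sum(1) / mask.sum(1)

query_emb = encode(["what is dense retrieval?"])
passage_emb = encode(["Dense retrieval encodes queries and passages into "
                      "vectors and ranks passages by vector similarity."])
score = (query_emb @ passage_emb.T).item()          # dot-product relevance score
print(score)
```

For end-to-end retrieval over MS MARCO, the prebuilt indexes and encoders distributed with Pyserini are the intended way to use this checkpoint.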
