---
language: "en"
tags:
- dense-retrieval
- knowledge-distillation
datasets:
- ms_marco
---
# Margin-MSE Trained ColBERT encoder model only
We provide the encoder of our fully retrieval-trained, DistilBERT-based ColBERT model. It was trained with Margin-MSE using an ensemble of three BERT_Cat (concatenated BERT scoring) teachers on MSMARCO-Passage; for more details, see the [full model card](https://huggingface.co/sebastian-hofstaetter/colbert-distilbert-margin_mse-T2-msmarco).
This encoder-only model is used as the oracle for distilling term topic embeddings in our ECIR'23 [paper](https://link.springer.com/chapter/10.1007/978-3-031-28238-6_25).
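For context, ColBERT scores a query–passage pair by late interaction: each query token embedding is matched against its most similar passage token embedding (MaxSim), and the maxima are summed. A minimal pure-Python sketch of that scoring step, using toy 3-dimensional vectors in place of the per-token embeddings this encoder produces:

```python
# Sketch of ColBERT's late-interaction (MaxSim) scoring.
# The toy 3-d vectors below stand in for the per-token embeddings
# the encoder produces; real embeddings are much higher-dimensional.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def maxsim_score(query_embs, doc_embs):
    # For each query token embedding, take its maximum similarity
    # over all passage token embeddings, then sum over query tokens.
    return sum(max(dot(q, d) for d in doc_embs) for q in query_embs)

query = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
doc = [[0.9, 0.1, 0.0], [0.0, 0.8, 0.2], [0.1, 0.1, 0.9]]

print(maxsim_score(query, doc))  # 0.9 + 0.8
```

With dot products as the similarity (ColBERT normalizes embeddings, making this cosine similarity), the first query token best matches the first passage token (0.9) and the second matches the second (0.8), giving a score of 1.7.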