---
language: en
tags:
  - dense-retrieval
  - knowledge-distillation
datasets:
  - ms_marco
---

# Margin-MSE Trained ColBERT (encoder model only)

We provide the encoder component of the fully retrieval-trained DistilBert-based ColBERT model. The model was trained with Margin-MSE using an ensemble of 3 BERT_Cat (concatenated BERT scoring) teachers on MSMARCO-Passage; for more details, see the full model card.
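
As a minimal sketch (not part of the original model card), the encoder could be loaded with the Hugging Face `transformers` library to obtain ColBERT-style per-token embeddings; the model id below is a placeholder, not the actual repository name, and any compression layer ColBERT normally places on top of the DistilBERT output is not shown.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder model id -- replace with the actual repository name of this encoder.
MODEL_ID = "LilaBoualili/colbert-encoder"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
encoder = AutoModel.from_pretrained(MODEL_ID)

passage = "ColBERT scores queries and passages via late interaction over token embeddings."
inputs = tokenizer(passage, return_tensors="pt")

with torch.no_grad():
    # One contextualized embedding per input token; shape (1, seq_len, hidden_dim).
    token_embeddings = encoder(**inputs).last_hidden_state
```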

This encoder-only model is used as the oracle for distilling term topic embeddings in our ECIR'23 paper.