This is a cross-encoder model with a dot-product-based scoring mechanism, trained on the MS MARCO dataset.
The parameters of the cross-encoder are initialized from [bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased).
This model serves as the teacher for training a [MiniLM-based cross-encoder model](https://huggingface.co/nishantyadav/emb_crossenc_msmarco_miniLM),
which is used in the experiments of our [EMNLP 2023](https://aclanthology.org/2023.findings-emnlp.544/) and [ICLR 2024](https://openreview.net/forum?id=1CPta0bfN2) papers.
See our EMNLP 2022 paper, "Efficient Nearest Neighbor Search for Cross-Encoder Models using Matrix Factorization", for details on the dot-product-based scoring mechanism.
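In a dot-product-based cross-encoder, the relevance score is computed as an inner product between query-side and item-side embeddings extracted from the jointly encoded (query, item) input, rather than by a learned classification head. A minimal sketch of this scoring step, with random vectors standing in for the model's contextualized embeddings (all names, shapes, and values here are illustrative assumptions, not this model's API):

```python
import numpy as np

# Random stand-ins for the contextualized query/item embeddings that the
# real model would extract from the jointly encoded (query, item) pair.
rng = np.random.default_rng(0)
dim = 768        # BERT-base hidden size
num_items = 5    # number of candidate items for one query

query_emb = rng.standard_normal(dim)               # query-side embedding
item_embs = rng.standard_normal((num_items, dim))  # one embedding per candidate

# Dot-product-based scoring: each candidate's relevance score is its
# inner product with the query embedding; rank by descending score.
scores = item_embs @ query_emb
ranking = np.argsort(-scores)
print(ranking)
```

Because the score is a plain inner product, candidate embeddings can be precomputed or approximated (e.g. via matrix factorization, as in the EMNLP 2022 paper) to speed up nearest-neighbor search over large item collections.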