tformal committed
Commit 0f718e0
1 Parent(s): ab161cd

Update README.md

Files changed (1)
  1. README.md +28 -0
README.md CHANGED
@@ -2,6 +2,7 @@
 license: cc-by-nc-sa-4.0
 language: "en"
 tags:
+- splade
 - query-expansion
 - document-expansion
 - bag-of-words
@@ -10,3 +11,30 @@ tags:
 datasets:
 - ms_marco
 ---
+
+## SPLADE CoCondenser SelfDistil
+
+SPLADE model for passage retrieval. For additional details, please visit:
+* paper: https://arxiv.org/abs/2205.04733
+* code: https://github.com/naver/splade
+
+| Model | MRR@10 (MS MARCO dev) | R@1000 (MS MARCO dev) |
+| --- | --- | --- |
+| `splade-cocondenser-selfdistil` | 37.6 | 98.4 |
+
+## Citation
+
+If you use our checkpoint, please cite our work:
+
+```
+@misc{https://doi.org/10.48550/arxiv.2205.04733,
+  doi = {10.48550/ARXIV.2205.04733},
+  url = {https://arxiv.org/abs/2205.04733},
+  author = {Formal, Thibault and Lassance, Carlos and Piwowarski, Benjamin and Clinchant, Stéphane},
+  keywords = {Information Retrieval (cs.IR), Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
+  title = {From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective},
+  publisher = {arXiv},
+  year = {2022},
+  copyright = {Creative Commons Attribution Non Commercial Share Alike 4.0 International}
+}
+```
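
A minimal usage sketch of how this checkpoint might be loaded with the Hugging Face `transformers` library to produce SPLADE's sparse bag-of-words representations. It assumes the model is hosted on the Hub as `naver/splade-cocondenser-selfdistil` (adjust the ID if you load it from elsewhere) and applies the log-saturated ReLU with max pooling over token positions described in the paper; the example query string is made up.

```python
# Sketch: encode a query into a SPLADE-style sparse vocabulary-sized vector.
# Assumption: the checkpoint is available as "naver/splade-cocondenser-selfdistil".
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "naver/splade-cocondenser-selfdistil"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
model.eval()

query = "how to load a splade model"  # hypothetical example query
inputs = tokenizer(query, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# SPLADE aggregation: log(1 + ReLU(logits)), max-pooled over token positions,
# with padding positions masked out via the attention mask.
weights = torch.log1p(torch.relu(logits))
mask = inputs["attention_mask"].unsqueeze(-1)
sparse_rep = torch.max(weights * mask, dim=1).values.squeeze(0)  # (vocab_size,)

# Inspect the highest-weighted vocabulary terms (the query "expansion").
topk = torch.topk(sparse_rep, k=10)
for idx, w in zip(topk.indices.tolist(), topk.values.tolist()):
    print(tokenizer.convert_ids_to_tokens(idx), round(w, 2))
```

Scoring a passage against a query then reduces to a dot product between their sparse representations, which is what makes these vectors compatible with an inverted index.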