## Introduction
This repository describes how to reproduce the `Dense`, `Sparse`, and `Dense+Sparse` evaluation results of the [BGE-M3](https://arxiv.org/pdf/2402.03216.pdf) paper on the [MIRACL](https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00595/117438/MIRACL-A-Multilingual-Retrieval-Dataset-Covering) dev split.
## Requirements
```bash
# Install Java (Linux)
apt update
apt install openjdk-21-jdk
# Install Pyserini
pip install pyserini
# Install Faiss
## CPU version
conda install -c conda-forge faiss-cpu
## GPU version
conda install -c conda-forge faiss-gpu
```
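A quick sanity check that the dependencies are usable (a minimal sketch; the exact version strings depend on your environment, and it assumes both packages expose a `__version__` attribute, which recent releases do):
```bash
# Verify the Java, Pyserini, and Faiss installations
java -version
python -c "import pyserini; print('pyserini', pyserini.__version__)"
python -c "import faiss; print('faiss', faiss.__version__)"
```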
**Note:** the Pyserini code needs to be modified so that `pyserini/fusion` supports multiple alpha settings. I have submitted a [pull request](https://github.com/castorini/pyserini/pull/1858) to the official repository adding this feature; you can apply its changes to your local installation, e.g. as sketched below.
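Until that PR is merged, one way to pick it up is to install Pyserini from source and check out the PR branch locally. This is only a sketch of the standard GitHub pull-request workflow, not an official installation path; the local branch name is arbitrary:
```bash
# Install Pyserini from source with the multi-alpha fusion changes from the PR above
git clone https://github.com/castorini/pyserini.git
cd pyserini
git fetch origin pull/1858/head:multi-alpha-fusion  # fetch the PR into a local branch
git checkout multi-alpha-fusion
pip install -e .                                    # editable install overrides the PyPI package
cd ..
```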
## 2CR
### Download and Unzip
```bash
# Download
## MIRACL topics and qrels
git clone https://huggingface.co/datasets/miracl/miracl
mkdir -p topics-and-qrels
mv miracl/*/*/* topics-and-qrels
## Dense and Sparse Index
git lfs install
git clone https://huggingface.co/datasets/hanhainebula/bge-m3_miracl_2cr
cat bge-m3_miracl_2cr/dense/en.tar.gz.part_* > bge-m3_miracl_2cr/dense/en.tar.gz
cat bge-m3_miracl_2cr/dense/de.tar.gz.part_* > bge-m3_miracl_2cr/dense/de.tar.gz
# Unzip
languages=(ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo)
## Dense
for lang in ${languages[@]}; do
tar -zxvf bge-m3_miracl_2cr/dense/${lang}.tar.gz -C bge-m3_miracl_2cr/dense/
done
## Sparse
for lang in ${languages[@]}; do
tar -zxvf bge-m3_miracl_2cr/sparse/${lang}.tar.gz -C bge-m3_miracl_2cr/sparse/
done
```
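A quick check that every language unpacked into the layout expected by the commands below (a minimal sketch based on the paths used in this README):
```bash
# Each language should provide a dense index directory and a sparse index directory
languages=(ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo)
for lang in ${languages[@]}; do
    [ -d bge-m3_miracl_2cr/dense/${lang} ]        || echo "missing dense index: ${lang}"
    [ -d bge-m3_miracl_2cr/sparse/${lang}/index ] || echo "missing sparse index: ${lang}"
done
```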
### Reproduction
#### Dense
```bash
# Available languages: ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo
lang=zh
# Generate run
mkdir -p bge-m3_miracl_2cr/dense/runs
python -m pyserini.search.faiss \
--threads 16 --batch-size 512 \
--encoder-class auto \
--encoder BAAI/bge-m3 \
--pooling cls --l2-norm \
--topics topics-and-qrels/topics.miracl-v1.0-${lang}-dev.tsv \
--index bge-m3_miracl_2cr/dense/${lang} \
--output bge-m3_miracl_2cr/dense/runs/${lang}.txt \
--hits 1000
# Evaluate
## nDCG@10
python -m pyserini.eval.trec_eval \
-c -M 100 -m ndcg_cut.10 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/dense/runs/${lang}.txt
## Recall@100
python -m pyserini.eval.trec_eval \
-c -m recall.100 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/dense/runs/${lang}.txt
```
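To reproduce every language in one go, the same commands can be wrapped in a loop (a sketch reusing the commands above; adjust `--threads` and `--batch-size` to your hardware):
```bash
# Dense retrieval + evaluation for all MIRACL dev languages
mkdir -p bge-m3_miracl_2cr/dense/runs
languages=(ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo)
for lang in ${languages[@]}; do
    python -m pyserini.search.faiss \
        --threads 16 --batch-size 512 \
        --encoder-class auto --encoder BAAI/bge-m3 \
        --pooling cls --l2-norm \
        --topics topics-and-qrels/topics.miracl-v1.0-${lang}-dev.tsv \
        --index bge-m3_miracl_2cr/dense/${lang} \
        --output bge-m3_miracl_2cr/dense/runs/${lang}.txt \
        --hits 1000
    python -m pyserini.eval.trec_eval -c -M 100 -m ndcg_cut.10 \
        topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
        bge-m3_miracl_2cr/dense/runs/${lang}.txt
    python -m pyserini.eval.trec_eval -c -m recall.100 \
        topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
        bge-m3_miracl_2cr/dense/runs/${lang}.txt
done
```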
#### Sparse
```bash
# Available languages: ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo
lang=zh
# Generate run
mkdir -p bge-m3_miracl_2cr/sparse/runs
python -m pyserini.search.lucene \
--threads 16 --batch-size 128 \
--topics bge-m3_miracl_2cr/sparse/${lang}/query_embd.tsv \
--index bge-m3_miracl_2cr/sparse/${lang}/index \
--output bge-m3_miracl_2cr/sparse/runs/${lang}.txt \
--output-format trec \
--impact --hits 1000
# Evaluate
## nDCG@10
python -m pyserini.eval.trec_eval \
-c -M 100 -m ndcg_cut.10 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/sparse/runs/${lang}.txt
## Recall@100
python -m pyserini.eval.trec_eval \
-c -m recall.100 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/sparse/runs/${lang}.txt
```
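The run files should be in plain TREC format (`qid Q0 docid rank score tag`), so a quick spot check before evaluation can catch path or format problems early:
```bash
# Inspect the first few result lines and count the number of distinct queries in the run
lang=zh
head -n 3 bge-m3_miracl_2cr/sparse/runs/${lang}.txt
awk '{print $1}' bge-m3_miracl_2cr/sparse/runs/${lang}.txt | sort -u | wc -l
```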
#### Dense+Sparse
**Note**: You need the multiple-alpha support in `pyserini/fusion` from this [PR](https://github.com/castorini/pyserini/pull/1858); apply it to your Pyserini installation first (see the Requirements section).
```bash
# Available languages: ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo
lang=zh
# Generate dense run and sparse run
mkdir -p bge-m3_miracl_2cr/dense/runs bge-m3_miracl_2cr/sparse/runs
python -m pyserini.search.faiss \
--threads 16 --batch-size 512 \
--encoder-class auto \
--encoder BAAI/bge-m3 \
--pooling cls --l2-norm \
--topics topics-and-qrels/topics.miracl-v1.0-${lang}-dev.tsv \
--index bge-m3_miracl_2cr/dense/${lang} \
--output bge-m3_miracl_2cr/dense/runs/${lang}.txt \
--hits 1000
python -m pyserini.search.lucene \
--threads 16 --batch-size 128 \
--topics bge-m3_miracl_2cr/sparse/${lang}/query_embd.tsv \
--index bge-m3_miracl_2cr/sparse/${lang}/index \
--output bge-m3_miracl_2cr/sparse/runs/${lang}.txt \
--output-format trec \
--impact --hits 1000
# Generate dense+sparse run
mkdir -p bge-m3_miracl_2cr/fusion/runs
python -m pyserini.fusion \
--method interpolation \
--runs bge-m3_miracl_2cr/dense/runs/${lang}.txt bge-m3_miracl_2cr/sparse/runs/${lang}.txt \
--alpha 1 3e-5 \
--output bge-m3_miracl_2cr/fusion/runs/${lang}.txt \
--depth 1000 --k 1000
# Evaluation
## nDCG@10
python -m pyserini.eval.trec_eval \
-c -M 100 -m ndcg_cut.10 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/fusion/runs/${lang}.txt
## Recall@100
python -m pyserini.eval.trec_eval \
-c -m recall.100 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/fusion/runs/${lang}.txt
```
Note:
- The hybrid method we used for MIRACL in the BGE-M3 paper is `s_dense + 0.3 * s_sparse`. However, by the time the sparse score is computed it has already been multiplied by 100^2, so the alpha for the sparse run here is 0.3 / 100^2 = 3e-5 instead of 0.3 (see the quick check below).
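
As a quick check of that arithmetic:
```bash
# 0.3 / 100^2 = 3e-05, the alpha passed for the sparse run above
python -c "print(0.3 / 100**2)"   # prints 3e-05
```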