fix: markdown table
README.md
````diff
@@ -45,16 +45,16 @@ curl https://api.jina.ai/v1/rerank \
     "model": "jina-reranker-v2-base-multilingual",
     "query": "Organic skincare products for sensitive skin",
     "documents": [
+        "Organic skincare for sensitive skin with aloe vera and chamomile.",
+        "New makeup trends focus on bold colors and innovative techniques",
+        "Bio-Hautpflege für empfindliche Haut mit Aloe Vera und Kamille",
+        "Neue Make-up-Trends setzen auf kräftige Farben und innovative Techniken",
+        "Cuidado de la piel orgánico para piel sensible con aloe vera y manzanilla",
+        "Las nuevas tendencias de maquillaje se centran en colores vivos y técnicas innovadoras",
+        "针对敏感肌专门设计的天然有机护肤产品",
+        "新的化妆趋势注重鲜艳的颜色和创新的技巧",
+        "敏感肌のために特別に設計された天然有機スキンケア製品",
+        "新しいメイクのトレンドは鮮やかな色と革新的な技術に焦点を当てています"
     ],
     "top_n": 3
 }'
@@ -108,14 +108,14 @@ For instance the returned scores in this case will be:
 [0.8311430811882019, 0.09401018172502518,
  0.6334102749824524, 0.08269733935594559,
  0.7620701193809509, 0.09947021305561066,
+ 0.9263036847114563, 0.05834583938121796,
  0.8418256044387817, 0.11124119907617569]
 ```
 
 The model gives high relevance scores to the documents that are most relevant to the query regardless of the language of the document.
 
+Note that by default, the `jina-reranker-v2-base-multilingual` model uses [flash attention](https://github.com/Dao-AILab/flash-attention), which requires certain types of GPU hardware to run.
+If you encounter any issues, you can try calling `AutoModelForSequenceClassification.from_pretrained()` with `use_flash_attn=False`.
 This will use the standard attention mechanism instead of flash attention.
 
 If you want to use flash attention for fast inference, you need to install the following packages:
@@ -150,8 +150,8 @@ Specifically, the `rerank()` function will split the documents into chunks of si
 
 We evaluated Jina Reranker v2 on multiple benchmarks to ensure top-tier performance and search relevance.
 
+| Model Name                    | MIRACL (nDCG@10, 18 langs) | MKQA (nDCG@10, 26 langs) | BEIR (nDCG@10, 17 datasets) | MLDR (recall@10, 13 langs) | CodeSearchNet (MRR@10, 3 tasks) | AirBench (nDCG@10, zh/en) | ToolBench (recall@3, 3 tasks) | TableSearch (recall@3) |
+| ----------------------------: | -------------------------- | ------------------------ | --------------------------- | -------------------------- | ------------------------------- | ------------------------- | ----------------------------- | ---------------------- |
 | jina-reranker-v2-multilingual | 62.14 | 54.83 | 53.17 | 68.95 | 71.36 | 61.33 | 77.75 | 93.31 |
 | bge-reranker-v2-m3            | 63.43 | 54.17 | 53.65 | 59.73 | 62.86 | 61.28 | 78.46 | 74.86 |
 | mmarco-mMiniLMv2-L12-H384-v1  | 59.71 | 53.37 | 45.40 | 28.91 | 51.78 | 56.46 | 58.39 | 53.60 |
````
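As a quick sanity check on the score list in the second hunk (not part of the commit itself): a minimal Python sketch, assuming the scores are returned in document order, shows which three entries `"top_n": 3` would surface.

```python
# Scores copied from the README excerpt above, in document order.
scores = [
    0.8311430811882019, 0.09401018172502518,
    0.6334102749824524, 0.08269733935594559,
    0.7620701193809509, 0.09947021305561066,
    0.9263036847114563, 0.05834583938121796,
    0.8418256044387817, 0.11124119907617569,
]

# Indices of the three highest-scoring documents, i.e. what top_n = 3 keeps.
top3 = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)[:3]
print(top3)  # → [6, 8, 0]
```

Indices 6, 8, and 0 are the Chinese, Japanese, and English skincare documents, which is exactly the claim in the prose: the relevant documents score highest regardless of language.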