Update README.md
README.md CHANGED
@@ -7,6 +7,7 @@ tags:
---

+For more details please refer to our github repo: https://github.com/FlagOpen/FlagEmbedding

# BGE-M3
In this project, we introduce BGE-M3, which is distinguished for its versatility in Multi-Functionality, Multi-Linguality, and Multi-Granularity.
@@ -183,7 +184,7 @@ The small-batch strategy is simple but effective, which also can used to fine-tu
- MCLS: A simple method to improve the performance on long text without fine-tuning.
If you have no enough resource to fine-tuning model with long text, the method is useful.

-Refer to our [report]() for more details.
+Refer to our [report](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/BGE_M3/BGE_M3.pdf) for more details.

**The fine-tuning codes and datasets will be open-sourced in the near future.**
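For context on the "Multi-Functionality" mentioned in the first hunk, here is a brief, hedged sketch of how BGE-M3's three retrieval modes (dense, sparse/lexical, multi-vector) are typically exercised through the FlagEmbedding repo linked above. The `BGEM3FlagModel` class, its `encode` arguments, and the output keys reflect my reading of that repo and may change between releases, so treat this as an illustration rather than the definitive API:

```python
# Hedged sketch: exercising BGE-M3's multi-functionality via the FlagEmbedding
# repo linked above. Names and arguments may differ in newer releases; check
# the repo's README before relying on them.
from FlagEmbedding import BGEM3FlagModel

model = BGEM3FlagModel("BAAI/bge-m3", use_fp16=True)  # fp16 speeds up GPU inference

sentences = [
    "BGE-M3 supports dense, sparse and multi-vector retrieval.",
    "What is BGE-M3?",
]

# One encode call can return all three representations at once.
output = model.encode(
    sentences,
    return_dense=True,          # dense sentence embeddings
    return_sparse=True,         # lexical (sparse) token weights
    return_colbert_vecs=True,   # ColBERT-style multi-vector embeddings
)

print(output["dense_vecs"].shape)    # dense vectors, one row per sentence
print(output["lexical_weights"][0])  # token -> weight mapping for the first sentence
```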
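The second hunk mentions MCLS as a way to improve long-text performance without fine-tuning. As a rough, hypothetical illustration of that idea (my reading of the linked report: insert an extra CLS token in front of every fixed-size block of tokens and average the hidden states at all CLS positions), a sketch with plain `transformers` could look like the following. The function name, the 256-token interval, the headroom choice, and the final normalization are assumptions, not the repo's actual implementation:

```python
# Hypothetical sketch of the MCLS idea mentioned above -- NOT the official
# FlagEmbedding implementation. Assumption: insert an extra [CLS] token before
# every fixed-size block of tokens and average the hidden states at all [CLS]
# positions to obtain a single embedding for a long text.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-m3")
model = AutoModel.from_pretrained("BAAI/bge-m3")
model.eval()

def mcls_embedding(text: str, chunk: int = 256, max_length: int = 8000) -> torch.Tensor:
    # max_length leaves headroom for the inserted [CLS] tokens within the
    # model's 8192-token window (an assumption about the position limit).
    ids = tokenizer(text, add_special_tokens=False, truncation=True,
                    max_length=max_length)["input_ids"]
    cls_id, sep_id = tokenizer.cls_token_id, tokenizer.sep_token_id

    # Rebuild the sequence with a [CLS] token before every `chunk`-sized block.
    new_ids = []
    for start in range(0, len(ids), chunk):
        new_ids.append(cls_id)
        new_ids.extend(ids[start:start + chunk])
    new_ids.append(sep_id)

    input_ids = torch.tensor([new_ids])
    attention_mask = torch.ones_like(input_ids)
    with torch.no_grad():
        hidden = model(input_ids=input_ids,
                       attention_mask=attention_mask).last_hidden_state[0]

    # Average the representations of all inserted [CLS] tokens.
    cls_positions = (input_ids[0] == cls_id).nonzero(as_tuple=True)[0]
    embedding = hidden[cls_positions].mean(dim=0)
    return torch.nn.functional.normalize(embedding, dim=-1)
```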