Shitao committed on
Commit aac0c78
1 Parent(s): 5a40171

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -7,6 +7,8 @@ tags:
 
 ---
 
+For more details, please refer to our GitHub repo: https://github.com/FlagOpen/FlagEmbedding
+
 # BGE-M3
 In this project, we introduce BGE-M3, which is distinguished for its versatility in Multi-Functionality, Multi-Linguality, and Multi-Granularity.
 - Multi-Functionality: It can simultaneously perform the three common retrieval functionalities of embedding models: dense retrieval, multi-vector retrieval, and sparse retrieval.
@@ -182,7 +184,7 @@ The small-batch strategy is simple but effective, and can also be used to fine-tune
 - MCLS: A simple method to improve performance on long text without fine-tuning.
 If you do not have enough resources to fine-tune the model with long text, this method is useful.
 
-Refer to our [report]() for more details.
+Refer to our [report](https://github.com/FlagOpen/FlagEmbedding/blob/master/FlagEmbedding/BGE_M3/BGE_M3.pdf) for more details.
 
 **The fine-tuning codes and datasets will be open-sourced in the near future.**
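The diff above mentions the three retrieval modes (dense, sparse, and multi-vector) and points readers to the FlagEmbedding repo for usage. As a quick illustration, here is a minimal sketch assuming the `BGEM3FlagModel` interface from that repo; the class name, `encode` arguments, and output keys are assumptions taken from the linked repository, not from this commit.

```python
# A minimal sketch, assuming the BGEM3FlagModel API from the FlagEmbedding repo
# linked above (not part of this commit).
from FlagEmbedding import BGEM3FlagModel

# Load the model; use_fp16 trades a little precision for faster encoding on GPU.
model = BGEM3FlagModel("BAAI/bge-m3", use_fp16=True)

sentences = [
    "What is BGE-M3?",
    "BGE-M3 is an embedding model supporting dense, sparse, and multi-vector retrieval.",
]

# Request all three representations in one call.
output = model.encode(
    sentences,
    return_dense=True,         # one dense vector per sentence (dense retrieval)
    return_sparse=True,        # per-token lexical weights (sparse retrieval)
    return_colbert_vecs=True,  # per-token vectors (multi-vector retrieval)
)

print(output["dense_vecs"].shape)       # (2, hidden_dim), e.g. (2, 1024)
print(output["lexical_weights"][0])     # token -> weight mapping for the first sentence
print(output["colbert_vecs"][0].shape)  # (num_tokens, hidden_dim)
```

Per the linked repo, the dense vectors are compared by inner product, the lexical weights support BM25-style term matching, and the per-token (ColBERT-style) vectors support late-interaction scoring.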