bi-encoder_msmarco_bert-base_german (new) | 0.5300 🏆 | 0.7196 |
[svalabs/bi-electra-ms-marco-german-uncased](svalabs/bi-electra-ms-marco-german-uncased) | 0.3818 | 0.5663 | 0.5986 | "most similar to OUR model"
[BM25](https://www.elastic.co/guide/en/elasticsearch/reference/current/index-modules-similarity.html#bm25) | 0.3196 | 0.5377 | 0.5740 | "lexical approach"

**❗It is crucial to understand that the comparisons are also made with models based on other transformer approaches❗**

A direct comparison based on the same approach can be made with [svalabs/bi-electra-ms-marco-german-uncased](svalabs/bi-electra-ms-marco-german-uncased).
In this case, the model presented here outperforms its predecessor by up to 14 percentage points.
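For context on what the "bi-encoder approach" above means in practice: a bi-encoder embeds the query and each document independently and ranks documents by vector similarity, typically cosine similarity. The sketch below shows only that scoring step, using toy vectors in place of real model embeddings (the vectors and dimensions are illustrative assumptions, not output of the models benchmarked here):

```python
import numpy as np

def cosine_scores(query: np.ndarray, docs: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query vector and a matrix of document vectors."""
    q = query / np.linalg.norm(query)
    d = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    return d @ q

# Toy embeddings standing in for bi-encoder outputs (3 documents, 4-dim vectors).
query = np.array([1.0, 0.0, 1.0, 0.0])
docs = np.array([
    [1.0, 0.0, 1.0, 0.0],  # same direction as the query -> similarity 1.0
    [0.0, 1.0, 0.0, 1.0],  # orthogonal to the query     -> similarity 0.0
    [1.0, 1.0, 1.0, 1.0],  # partial overlap
])

scores = cosine_scores(query, docs)
ranking = np.argsort(-scores)  # indices of documents, best match first
```

Because documents are encoded independently of the query, their embeddings can be computed once and indexed, which is what makes this approach practical for retrieval at scale, in contrast to the purely lexical BM25 baseline in the table.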