Update README.md
README.md
CHANGED
@@ -4,8 +4,10 @@ tags:
 pipeline_tag: feature-extraction
 ---
 DRAGON+ is a BERT-base sized dense retriever initialized from [RetroMAE](https://huggingface.co/Shitao/RetroMAE) and further trained on the data augmented from MS MARCO corpus, following the approach described in [How to Train Your DRAGON:
-Diverse Augmentation Towards Generalizable Dense Retrieval](
-
+Diverse Augmentation Towards Generalizable Dense Retrieval](https://arxiv.org/abs/2302.07452).
+
+The associated GitHub repository is available at https://github.com/facebookresearch/dpr-scale/tree/main/dragon. DRAGON+ uses an asymmetric dual encoder, with two distinctly parameterized encoders.
+
 Model | Initialization | Query Encoder Path | Context Encoder Path
 |---|---|---|---
 DRAGON+ | Shitao/RetroMAE| facebook/dragon-plus-query-encoder | facebook/dragon-plus-context-encoder
@@ -35,4 +37,4 @@ ctx_emb = context_encoder(**ctx_input).last_hidden_state[:, 0, :]
 # Compute similarity scores using dot product
 score1 = query_emb @ ctx_emb[0] # 396.5625
 score2 = query_emb @ ctx_emb[1] # 393.8340
-```
+```
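For context, the sketch below shows one way to use the two encoder paths from the table together with the dot-product scoring visible in the second hunk. It is a minimal sketch, not the README's exact example: it assumes the Hugging Face `transformers` library, reuses the query encoder's tokenizer for both sides (both encoders are initialized from the same RetroMAE/BERT-base checkpoint), and uses made-up query and passage strings, so the scores will differ from the 396.5625 / 393.8340 values shown in the diff.

```python
# Minimal sketch (assumes `transformers` and `torch` are installed).
# The query and passage strings are illustrative placeholders.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/dragon-plus-query-encoder")
query_encoder = AutoModel.from_pretrained("facebook/dragon-plus-query-encoder")
context_encoder = AutoModel.from_pretrained("facebook/dragon-plus-context-encoder")

query = "what is a dense retriever?"  # placeholder query
contexts = [
    "A dense retriever encodes queries and passages into vectors and ranks passages by similarity.",
    "Sparse retrievers such as BM25 rank passages by term overlap with the query.",
]  # placeholder passages

# Encode and take the [CLS] token embedding, matching the
# `last_hidden_state[:, 0, :]` line shown in the hunk header.
query_input = tokenizer(query, return_tensors="pt")
ctx_input = tokenizer(contexts, padding=True, truncation=True, return_tensors="pt")
query_emb = query_encoder(**query_input).last_hidden_state[:, 0, :]
ctx_emb = context_encoder(**ctx_input).last_hidden_state[:, 0, :]

# Score by dot product; the more relevant passage should score higher.
score1 = query_emb @ ctx_emb[0]
score2 = query_emb @ ctx_emb[1]
```

Because the dual encoder is asymmetric, queries should always go through the query encoder and passages through the context encoder; the two sets of weights are not interchangeable.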