orionweller committed • f1c127d (parent: 876d63e)
Update README.md

README.md CHANGED
@@ -20,7 +20,18 @@ Promptriever is a bi-encoder retrieval model that can take in natural language i
 - **Instruction-Training Dataset:** [samaya-ai/msmarco-w-instructions](https://huggingface.co/datasets/samaya-ai/msmarco-w-instructions)
 
 
-#
+# Other Links
+| Binary | Description |
+|:-------|:------------|
+| [samaya-ai/promptriever-llama2-7b-v1](https://huggingface.co/samaya-ai/promptriever-llama2-7b-v1) | A Promptriever bi-encoder model based on LLaMA 2 (7B parameters). |
+| [samaya-ai/promptriever-llama3.1-8b-instruct-v1](https://huggingface.co/samaya-ai/promptriever-llama3.1-8b-instruct-v1) | A Promptriever bi-encoder model based on LLaMA 3.1 Instruct (8B parameters). |
+| [samaya-ai/promptriever-llama3.1-8b-v1](https://huggingface.co/samaya-ai/promptriever-llama3.1-8b-v1) | A Promptriever bi-encoder model based on LLaMA 3.1 (8B parameters). |
+| [samaya-ai/promptriever-mistral-v0.1-7b-v1](https://huggingface.co/samaya-ai/promptriever-mistral-v0.1-7b-v1) | A Promptriever bi-encoder model based on Mistral v0.1 (7B parameters). |
+| [samaya-ai/RepLLaMA-reproduced](https://huggingface.co/samaya-ai/RepLLaMA-reproduced) | A reproduction of the RepLLaMA model (no instructions). A bi-encoder based on LLaMA 2, trained on the [tevatron/msmarco-passage-aug](https://huggingface.co/datasets/Tevatron/msmarco-passage-aug) dataset. |
+| [samaya-ai/msmarco-w-instructions](https://huggingface.co/samaya-ai/msmarco-w-instructions) | A dataset of MS MARCO with added instructions and instruction-negatives, used for training the above models. |
+
+
+# Usage
 
 You can use MTEB to load this model ([source code](https://github.com/embeddings-benchmark/mteb/blob/main/mteb/models/promptriever_models.py)):
 
 ```python