Update README.md
README.md (CHANGED)
````diff
@@ -6,9 +6,9 @@ datasets:
 - allenai/scirepeval
 ---
 
-# Adapter `allenai/
+# Adapter `allenai/specter2_adhoc_query` for allenai/specter2
 
-An [adapter](https://adapterhub.ml) for the `allenai/
+An [adapter](https://adapterhub.ml) for the `allenai/specter2` model that was trained on the [allenai/scirepeval](https://huggingface.co/datasets/allenai/scirepeval/) dataset.
 
 This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
 
@@ -26,15 +26,15 @@ Now, the adapter can be loaded and activated like this:
 ```python
 from transformers import AutoAdapterModel
 
-model = AutoAdapterModel.from_pretrained("allenai/
-adapter_name = model.load_adapter("allenai/
+model = AutoAdapterModel.from_pretrained("allenai/specter2")
+adapter_name = model.load_adapter("allenai/specter2_adhoc_query", source="hf", set_active=True)
 ```
 
 ## SPECTER 2.0
 
 <!-- Provide a quick summary of what the model is/does. -->
 
-SPECTER 2.0 is the successor to [SPECTER](allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/
+SPECTER 2.0 is the successor to [SPECTER](https://huggingface.co/allenai/specter) and is capable of generating task-specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter-2).
 Given the combination of title and abstract of a scientific paper or a short textual query, the model can be used to generate effective embeddings to be used in downstream applications.
 
 # Model Details
@@ -50,7 +50,7 @@ Task Formats trained on:
 - Proximity
 - Adhoc Search
 
-This is the adhoc search query specific adapter. For tasks where papers have to retrieved for a short textual query, use this adapter to encode the query and [allenai/
+This is the adhoc search query specific adapter. For tasks where papers have to be retrieved for a short textual query, use this adapter to encode the query and [allenai/specter2_proximity](https://huggingface.co/allenai/specter2_proximity) to encode the papers.
 
 
 It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientific Document Representations](https://api.semanticscholar.org/CorpusID:254018137) and we evaluate the trained model on this benchmark as well.
@@ -79,23 +79,23 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif
 
 |Model|Type|Name and HF link|
 |--|--|--|
-|Base|Transformer|[allenai/
-|Classification|Adapter|[allenai/
-|Regression|Adapter|[allenai/
-|Retrieval|Adapter|[allenai/
-|Adhoc Query|Adapter|[allenai/
+|Base|Transformer|[allenai/specter2](https://huggingface.co/allenai/specter2)|
+|Classification|Adapter|[allenai/specter2_classification](https://huggingface.co/allenai/specter2_classification)|
+|Regression|Adapter|[allenai/specter2_regression](https://huggingface.co/allenai/specter2_regression)|
+|Retrieval|Adapter|[allenai/specter2_proximity](https://huggingface.co/allenai/specter2_proximity)|
+|Adhoc Query|Adapter|[allenai/specter2_adhoc_query](https://huggingface.co/allenai/specter2_adhoc_query)|
 
 ```python
 from transformers import AutoTokenizer, AutoModel
 
 # load model and tokenizer
-tokenizer = AutoTokenizer.from_pretrained('allenai/
+tokenizer = AutoTokenizer.from_pretrained('allenai/specter2')
 
 # load base model
-model = AutoModel.from_pretrained('allenai/
+model = AutoModel.from_pretrained('allenai/specter2')
 
 # load the adapter(s) as per the required task, provide an identifier for the adapter in the load_as argument and activate it
-model.load_adapter("allenai/
+model.load_adapter("allenai/specter2_adhoc_query", source="hf", load_as="specter2_adhoc_query", set_active=True)
 
 papers = [{'title': 'BERT', 'abstract': 'We introduce a new language representation model called BERT'},
           {'title': 'Attention is all you need', 'abstract': ' The dominant sequence transduction models are based on complex recurrent or convolutional neural networks'}]
@@ -159,8 +159,8 @@ We also evaluate and establish a new SoTA on [MDCR](https://github.com/zoranmedi
 |[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
 |[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
 |[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
-|[SPECTER 2.0-base](https://huggingface.co/allenai/
-|[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allen/
+|[SPECTER 2.0-base](https://huggingface.co/allenai/specter2)|56.3|58.0|69.2|(38.0, 32.4)|
+|[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allenai/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
 
 Please cite the following works if you end up using SPECTER 2.0:
 
````
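The README's usage snippet stops at the `papers` list. Before encoding, SPECTER-style models are typically fed the title and abstract joined by the tokenizer's separator token. A minimal sketch of that preprocessing step, runnable without downloading the model (the helper name `build_inputs` and the hard-coded `"[SEP]"` default are illustrative assumptions; in the snippet above you would pass `tokenizer.sep_token` instead):

```python
def build_inputs(papers, sep_token="[SEP]"):
    # papers: list of dicts with 'title' and 'abstract' keys.
    # Join each title and abstract into one "title [SEP] abstract" string,
    # falling back to an empty abstract if the key is missing or None.
    return [p["title"] + sep_token + (p.get("abstract") or "") for p in papers]

papers = [{'title': 'BERT', 'abstract': 'We introduce a new language representation model called BERT'},
          {'title': 'Attention is all you need', 'abstract': ' The dominant sequence transduction models are based on complex recurrent or convolutional neural networks'}]

text_batch = build_inputs(papers)
# Each entry is now a single string, ready for
# tokenizer(text_batch, padding=True, truncation=True, return_tensors="pt")
```

The resulting batch would then be tokenized and passed through the adapter-activated model, with the embedding taken from the first token's last hidden state.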