Update README.md
README.md (CHANGED)
@@ -13,7 +13,7 @@ An [adapter](https://adapterhub.ml) for the [`allenai/specter2_base`](https://hu
This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.

**Aug 2023 Update:**
-1. **The
+1. **The SPECTER2 Base and proximity adapter models have been renamed in Hugging Face based upon usage patterns as follows:**

|Old Name|New Name|
|--|--|
@@ -24,11 +24,11 @@ This adapter was created for usage with the **[adapter-transformers](https://git
However, for benchmarking purposes, please continue using the current version.**


-##
+## SPECTER2

<!-- Provide a quick summary of what the model is/does. -->

-
+SPECTER2 is the successor to [SPECTER](https://huggingface.co/allenai/specter) and is capable of generating task-specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter-2_).
This is the base model to be used along with the adapters.
Given the combination of the title and abstract of a scientific paper, or a short textual query, the model can be used to generate effective embeddings for downstream applications.

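A minimal usage sketch for the summary above, not the card's verbatim example: it assumes the adapter-transformers `AutoAdapterModel` API, and the `load_as`/`set_active` arguments and [CLS] pooling are assumptions (the card's own snippet, visible in the next hunk's context, is truncated in this diff); the checkpoint and adapter names come from this card.

```python
# A minimal sketch (not the card's verbatim example): embed a short query with the
# base model plus the ad-hoc query adapter named in this card. Assumes the
# adapter-transformers library; the load_as/set_active arguments and [CLS] pooling
# are assumptions, since the card's own snippet is truncated in this diff.
from transformers import AutoTokenizer, AutoAdapterModel

tokenizer = AutoTokenizer.from_pretrained("allenai/specter2_base")
model = AutoAdapterModel.from_pretrained("allenai/specter2_base")

# Attach and activate a task-format adapter (here the ad-hoc query adapter).
model.load_adapter("allenai/specter2_adhoc_query", source="hf",
                   load_as="adhoc_query", set_active=True)

# Papers would instead be encoded as "title [SEP] abstract" with a proximity-style adapter.
queries = ["retrieval augmented generation for scientific question answering"]
inputs = tokenizer(queries, padding=True, truncation=True,
                   return_tensors="pt", max_length=512)

output = model(**inputs)
# Take the first ([CLS]) token representation as the embedding.
embeddings = output.last_hidden_state[:, 0, :]
print(embeddings.shape)  # (num_queries, hidden_size)
```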
@@ -59,7 +59,7 @@ adapter_name = model.load_adapter("allenai/specter2_adhoc_query", source="hf", s

## Model Description

-
+SPECTER2 has been trained on over 6M triplets of scientific paper citations, which are available [here](https://huggingface.co/datasets/allenai/scirepeval/viewer/cite_prediction_new/evaluation).
It is then trained with additionally attached task-format-specific adapter modules on all the [SciRepEval](https://huggingface.co/datasets/allenai/scirepeval) training tasks.

Task Formats trained on:
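To make the citation-triplet training described in the Model Description concrete, here is a generic sketch of triplet-margin training. This is not the authors' training code: the checkpoint is used only as a stand-in encoder, and the loss, margin, pooling, and input formatting are assumptions made for illustration.

```python
# A generic illustration of triplet-margin training on citation triplets. This is
# NOT the authors' training code: the loss, margin, pooling, and input formatting
# are assumptions made only to illustrate the idea described above.
import torch
from torch.nn import TripletMarginLoss
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("allenai/specter2_base")
encoder = AutoModel.from_pretrained("allenai/specter2_base")  # stand-in encoder
loss_fn = TripletMarginLoss(margin=1.0)

def embed(texts):
    # Encode "title [SEP] abstract" strings and take the [CLS] vector.
    batch = tokenizer(texts, padding=True, truncation=True, max_length=512,
                      return_tensors="pt")
    return encoder(**batch).last_hidden_state[:, 0, :]

# One toy triplet: a query paper, a paper it cites (positive), and an unrelated paper (negative).
anchor   = embed(["Query paper title" + tokenizer.sep_token + "Query paper abstract"])
positive = embed(["Cited paper title" + tokenizer.sep_token + "Cited paper abstract"])
negative = embed(["Unrelated title" + tokenizer.sep_token + "Unrelated abstract"])

loss = loss_fn(anchor, positive, negative)
loss.backward()  # a real training loop would add an optimizer step over many triplets
```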
@@ -86,9 +86,9 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif

<!-- Provide the basic links for the model. -->

-- **Repository:** [https://github.com/allenai/
+- **Repository:** [https://github.com/allenai/SPECTER2](https://github.com/allenai/SPECTER2)
- **Paper:** [https://api.semanticscholar.org/CorpusID:254018137](https://api.semanticscholar.org/CorpusID:254018137)
-- **Demo:** [Usage](https://github.com/allenai/
+- **Demo:** [Usage](https://github.com/allenai/SPECTER2/blob/main/README.md)

# Uses

@@ -179,20 +179,11 @@ We also evaluate and establish a new SoTA on [MDCR](https://github.com/zoranmedi
|[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
|[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
|[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
-|[
+|[SPECTER2 Base](https://huggingface.co/allenai/specter2_base)|56.3|73.6|69.1|(38.0, 32.4)|
+|[SPECTER2-Adapters](https://huggingface.co/models?search=allenai/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|

-Please cite the following works if you end up using
+Please cite the following works if you end up using SPECTER2:

-[SPECTER paper](https://api.semanticscholar.org/CorpusID:215768677):
-
-```bibtex
-@inproceedings{specter2020cohan,
-  title={{SPECTER: Document-level Representation Learning using Citation-informed Transformers}},
-  author={Arman Cohan and Sergey Feldman and Iz Beltagy and Doug Downey and Daniel S. Weld},
-  booktitle={ACL},
-  year={2020}
-}
-```
[SciRepEval paper](https://api.semanticscholar.org/CorpusID:254018137)
```bibtex
@article{Singh2022SciRepEvalAM,