aps6992 committed on
Commit c82cd4f
Parent: 680630c

Update README.md

Files changed (1):
  1. README.md +9 -17
README.md CHANGED
@@ -13,7 +13,7 @@ An [adapter](https://adapterhub.ml) for the [`allenai/specter2_base`](https://hu
 This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.

 **Aug 2023 Update:**
-1. **The SPECTER 2.0 Base and proximity adapter models have been renamed in Hugging Face based upon usage patterns as follows:**
+1. **The SPECTER2 Base and proximity adapter models have been renamed in Hugging Face based upon usage patterns as follows:**

 |Old Name|New Name|
 |--|--|
@@ -23,11 +23,11 @@ This adapter was created for usage with the **[adapter-transformers](https://git
 2. **We have a parallel version (termed [aug2023refresh](https://huggingface.co/allenai/specter2_aug2023refresh)) where the base transformer encoder version is pre-trained on a collection of newer papers (published after 2018).
 However, for benchmarking purposes, please continue using the current version.**

-## SPECTER 2.0
+## SPECTER2

 <!-- Provide a quick summary of what the model is/does. -->

-SPECTER 2.0 is the successor to [SPECTER](https://huggingface.co/allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter-2_).
+SPECTER2 is the successor to [SPECTER](https://huggingface.co/allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter-2_).
 This is the base model to be used along with the adapters.
 Given the combination of title and abstract of a scientific paper or a short texual query, the model can be used to generate effective embeddings to be used in downstream applications.

@@ -58,7 +58,7 @@ adapter_name = model.load_adapter("allenai/specter2_regression", source="hf", se

 ## Model Description

-SPECTER 2.0 has been trained on over 6M triplets of scientific paper citations, which are available [here](https://huggingface.co/datasets/allenai/scirepeval/viewer/cite_prediction_new/evaluation).
+SPECTER2 has been trained on over 6M triplets of scientific paper citations, which are available [here](https://huggingface.co/datasets/allenai/scirepeval/viewer/cite_prediction_new/evaluation).
 Post that it is trained with additionally attached task format specific adapter modules on all the [SciRepEval](https://huggingface.co/datasets/allenai/scirepeval) training tasks.

 Task Formats trained on:
@@ -84,9 +84,9 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif

 <!-- Provide the basic links for the model. -->

-- **Repository:** [https://github.com/allenai/SPECTER2_0](https://github.com/allenai/SPECTER2_0)
+- **Repository:** [https://github.com/allenai/SPECTER2](https://github.com/allenai/SPECTER2)
 - **Paper:** [https://api.semanticscholar.org/CorpusID:254018137](https://api.semanticscholar.org/CorpusID:254018137)
-- **Demo:** [Usage](https://github.com/allenai/SPECTER2_0/blob/main/README.md)
+- **Demo:** [Usage](https://github.com/allenai/SPECTER2/blob/main/README.md)

 # Uses

@@ -177,19 +177,11 @@ We also evaluate and establish a new SoTA on [MDCR](https://github.com/zoranmedi
 |[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
 |[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
 |[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
-|[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allenai/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
+|[SPECTER2 Base](allenai/specter2_base)|56.3|73.6|69.1|(38.0, 32.4)|
+|[SPECTER2-Adapters](https://huggingface.co/models?search=allenai/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|

-Please cite the following works if you end up using SPECTER 2.0:
+Please cite the following works if you end up using SPECTER2:

-[SPECTER paper](https://api.semanticscholar.org/CorpusID:215768677):
-
-```bibtex
-@inproceedings{specter2020cohan,
-title={{SPECTER: Document-level Representation Learning using Citation-informed Transformers}},
-author={Arman Cohan and Sergey Feldman and Iz Beltagy and Doug Downey and Daniel S. Weld},
-booktitle={ACL},
-year={2020}
-}
 ```
 [SciRepEval paper](https://api.semanticscholar.org/CorpusID:254018137)
 ```bibtex
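The commit renames the models but leaves the documented usage pattern unchanged: the base encoder plus a task adapter attached with `load_adapter(..., source="hf", ...)`, as the truncated context line in the third hunk shows. A minimal sketch of that flow, assuming the adapter-transformers fork of `transformers` is installed (the adapter id `"allenai/specter2"`, `load_as="proximity"`, and `set_active=True` are assumptions based on the renaming table this commit references, not verbatim from the diff):

```python
def format_input(title: str, abstract: str = "", sep: str = "[SEP]") -> str:
    # SPECTER2 embeds "title" + sep_token + "abstract"; "[SEP]" is the
    # default separator of the model's BERT-style tokenizer.
    return title + sep + abstract


def embed(papers):
    """Return one document embedding per {'title': ..., 'abstract': ...} dict."""
    # Heavy imports are kept local so format_input() stays importable
    # without the model stack installed.
    import torch
    from transformers import AutoTokenizer, AutoAdapterModel  # AutoAdapterModel ships with the adapter-transformers fork

    tokenizer = AutoTokenizer.from_pretrained("allenai/specter2_base")
    model = AutoAdapterModel.from_pretrained("allenai/specter2_base")
    # Attach the proximity adapter (repo id per the post-rename scheme; hypothetical here).
    model.load_adapter("allenai/specter2", source="hf",
                       load_as="proximity", set_active=True)

    batch = [format_input(p["title"], p.get("abstract", ""),
                          sep=tokenizer.sep_token) for p in papers]
    inputs = tokenizer(batch, padding=True, truncation=True,
                       return_tensors="pt", max_length=512)
    with torch.no_grad():
        output = model(**inputs)
    # The final hidden state of the first ([CLS]) token serves as the embedding.
    return output.last_hidden_state[:, 0, :]


# Example:
#   papers = [{"title": "BERT", "abstract": "We introduce a new language representation model."}]
#   embeddings = embed(papers)   # tensor of shape (len(papers), hidden_size)
```

For ranking or retrieval, the resulting vectors are typically compared with cosine similarity or L2 distance; swapping `load_adapter`'s target for the regression or classification adapters follows the same pattern.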