aps6992 committed on
Commit
d2269f8
1 Parent(s): 8f9126f

Update README.md

Files changed (1)
  1. README.md +9 -9
README.md CHANGED
@@ -32,11 +32,11 @@ adapter_name = model.load_adapter("allenai/specter2_aug2023refresh", source="hf"
 
 **\*\*\*\*\*\*Update\*\*\*\*\*\***
 
-This update introduces a new set of SPECTER 2.0 models with the base transformer encoder pre-trained on an extended citation dataset containing more recent papers.
-For benchmarking purposes please use the existing SPECTER 2.0 [models](https://huggingface.co/allenai/specter2) w/o the **aug2023refresh** suffix.
+This update introduces a new set of SPECTER2 models with the base transformer encoder pre-trained on an extended citation dataset containing more recent papers.
+For benchmarking purposes please use the existing SPECTER2 [models](https://huggingface.co/allenai/specter2) w/o the **aug2023refresh** suffix.
 
-# SPECTER 2.0 (Base)
-SPECTER 2.0 is the successor to [SPECTER](https://huggingface.co/allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter-2_).
+# SPECTER2 (Base)
+SPECTER2 is the successor to [SPECTER](https://huggingface.co/allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter-2_).
 This is the base model to be used along with the adapters.
 Given the combination of title and abstract of a scientific paper or a short texual query, the model can be used to generate effective embeddings to be used in downstream applications.
 
@@ -48,7 +48,7 @@ Given the combination of title and abstract of a scientific paper or a short tex
 
 ## Model Description
 
-SPECTER 2.0 has been trained on over 6M triplets of scientific paper citations, which are available [here](https://huggingface.co/datasets/allenai/scirepeval/viewer/cite_prediction_new/evaluation).
+SPECTER2 has been trained on over 6M triplets of scientific paper citations, which are available [here](https://huggingface.co/datasets/allenai/scirepeval/viewer/cite_prediction_new/evaluation).
 Post that it is trained with additionally attached task format specific adapter modules on all the [SciRepEval](https://huggingface.co/datasets/allenai/scirepeval) training tasks.
 
 Task Formats trained on:
@@ -72,9 +72,9 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif
 
 <!-- Provide the basic links for the model. -->
 
-- **Repository:** [https://github.com/allenai/SPECTER2_0](https://github.com/allenai/SPECTER2_0)
+- **Repository:** [https://github.com/allenai/SPECTER2](https://github.com/allenai/SPECTER2)
 - **Paper:** [https://api.semanticscholar.org/CorpusID:254018137](https://api.semanticscholar.org/CorpusID:254018137)
-- **Demo:** [Usage](https://github.com/allenai/SPECTER2_0/blob/main/README.md)
+- **Demo:** [Usage](https://github.com/allenai/SPECTER2/blob/main/README.md)
 
 # Uses
 
@@ -165,9 +165,9 @@ We also evaluate and establish a new SoTA on [MDCR](https://github.com/zoranmedi
 |[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
 |[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
 |[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
-|[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allenai/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
+|[SPECTER2-Adapters](https://huggingface.co/models?search=allenai/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
 
-Please cite the following works if you end up using SPECTER 2.0:
+Please cite the following works if you end up using SPECTER2:
 
 [SPECTER paper](https://api.semanticscholar.org/CorpusID:215768677):
 
 
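The diff above only renames "SPECTER 2.0" to "SPECTER2". For context on the usage the first hunk header hints at (`model.load_adapter("allenai/specter2_aug2023refresh", source="hf", ...)`): the README describes embedding a paper from the combination of its title and abstract, which SPECTER-style models receive as a single string joined by the tokenizer's separator token. A minimal, dependency-free sketch of that input formatting — the helper name and example papers are hypothetical and not part of this commit:

```python
# Hypothetical helper (not from this commit): SPECTER-style models embed a
# paper from the string "title[SEP]abstract", where [SEP] is the tokenizer's
# separator token. This sketch only builds those input strings.
def build_input(title: str, abstract: str, sep_token: str = "[SEP]") -> str:
    """Join title and abstract with the separator token; tolerate missing fields."""
    return (title or "") + sep_token + (abstract or "")

# Made-up example papers for illustration.
papers = [
    {"title": "Paper A", "abstract": "Abstract of paper A."},
    {"title": "Paper B", "abstract": None},  # abstract may be missing
]
texts = [build_input(p["title"], p["abstract"]) for p in papers]
print(texts[0])  # Paper A[SEP]Abstract of paper A.
print(texts[1])  # Paper B[SEP]
```

In practice these strings would be passed through the model's tokenizer and the embedding taken from the encoder output (with the relevant adapter loaded and active); that step requires downloading the actual checkpoint and is omitted here.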