aps6992 committed on
Commit 018f886
1 Parent(s): f6695a8

Update README.md

Files changed (1)
  1. README.md +15 -15
README.md CHANGED
@@ -6,9 +6,9 @@ datasets:
 - allenai/scirepeval
 ---

-# Adapter `allenai/spp_regression` for allenai/specter_plus_plus
+# Adapter `allenai/specter2_regression` for allenai/specter2

-An [adapter](https://adapterhub.ml) for the `allenai/specter_plus_plus` model that was trained on the [allenai/scirepeval](https://huggingface.co/datasets/allenai/scirepeval/) dataset.
+An [adapter](https://adapterhub.ml) for the `allenai/specter2` model that was trained on the [allenai/scirepeval](https://huggingface.co/datasets/allenai/scirepeval/) dataset.

 This adapter was created for usage with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.

@@ -26,15 +26,15 @@ Now, the adapter can be loaded and activated like this:
 ```python
 from transformers import AutoAdapterModel

-model = AutoAdapterModel.from_pretrained("allenai/specter_plus_plus")
-adapter_name = model.load_adapter("allenai/spp_regression", source="hf", set_active=True)
+model = AutoAdapterModel.from_pretrained("allenai/specter2")
+adapter_name = model.load_adapter("allenai/specter2_regression", source="hf", set_active=True)
 ```

 ## SPECTER 2.0

 <!-- Provide a quick summary of what the model is/does. -->

-SPECTER 2.0 is the successor to [SPECTER](allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/spp).
+SPECTER 2.0 is the successor to [SPECTER](allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter-2).
 Given the combination of title and abstract of a scientific paper or a short texual query, the model can be used to generate effective embeddings to be used in downstream applications.

 # Model Details
@@ -79,23 +79,23 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif
 |Model|Type|Name and HF link|
 |--|--|--|
-|Base|Transformer|[allenai/specter_plus_plus](https://huggingface.co/allenai/specter_plus_plus)|
-|Classification|Adapter|[allenai/spp_classification](https://huggingface.co/allenai/spp_classification)|
-|Regression|Adapter|[allenai/spp_regression](https://huggingface.co/allenai/spp_regression)|
-|Retrieval|Adapter|[allenai/spp_proximity](https://huggingface.co/allenai/spp_proximity)|
-|Adhoc Query|Adapter|[allenai/spp_adhoc_query](https://huggingface.co/allenai/spp_adhoc_query)|
+|Base|Transformer|[allenai/specter2](https://huggingface.co/allenai/specter2)|
+|Classification|Adapter|[allenai/specter2_classification](https://huggingface.co/allenai/specter2_classification)|
+|Regression|Adapter|[allenai/specter2_regression](https://huggingface.co/allenai/specter2_regression)|
+|Retrieval|Adapter|[allenai/specter2_proximity](https://huggingface.co/allenai/specter2_proximity)|
+|Adhoc Query|Adapter|[allenai/specter2_adhoc_query](https://huggingface.co/allenai/specter2_adhoc_query)|

 ```python
 from transformers import AutoTokenizer, AutoModel

 # load model and tokenizer
-tokenizer = AutoTokenizer.from_pretrained('allenai/specter_plus_plus')
+tokenizer = AutoTokenizer.from_pretrained('allenai/specter2')

 #load base model
-model = AutoModel.from_pretrained('allenai/specter_plus_plus')
+model = AutoModel.from_pretrained('allenai/specter2')

 #load the adapter(s) as per the required task, provide an identifier for the adapter in load_as argument and activate it
-model.load_adapter("allenai/spp_regression", source="hf", load_as="spp_regression", set_active=True)
+model.load_adapter("allenai/specter2_regression", source="hf", load_as="regression", set_active=True)

 papers = [{'title': 'BERT', 'abstract': 'We introduce a new language representation model called BERT'},
           {'title': 'Attention is all you need', 'abstract': ' The dominant sequence transduction models are based on complex recurrent or convolutional neural networks'}]
@@ -159,8 +159,8 @@ We also evaluate and establish a new SoTA on [MDCR](https://github.com/zoranmedi
 |[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
 |[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
 |[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
-|[SPECTER 2.0-base](https://huggingface.co/allenai/specter_plus_plus)|56.3|58.0|69.2|(38.0, 32.4)|
-|[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allen/spp)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
+|[SPECTER 2.0-base](https://huggingface.co/allenai/specter2)|56.3|58.0|69.2|(38.0, 32.4)|
+|[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allen/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|

 Please cite the following works if you end up using SPECTER 2.0:
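The README snippet quoted in the diff stops after building the `papers` list. Its next step — combining each paper's title and abstract into one string before tokenization, as the prose above describes — can be sketched without downloading the model. The `build_inputs` helper and the literal `[SEP]` separator are illustrative assumptions for this sketch, not code from the repo; in practice the real `tokenizer.sep_token` would be used.

```python
# Sketch of the input construction the README describes: each paper's
# title and abstract are joined into a single string before tokenization.
# "[SEP]" stands in for tokenizer.sep_token in this offline illustration.
def build_inputs(papers, sep_token="[SEP]"):
    """Join title and abstract with the separator; tolerate a missing abstract."""
    return [
        (p.get("title") or "") + sep_token + (p.get("abstract") or "")
        for p in papers
    ]

papers = [
    {"title": "BERT",
     "abstract": "We introduce a new language representation model called BERT"},
    {"title": "Attention is all you need",
     "abstract": "The dominant sequence transduction models are based on complex recurrent or convolutional neural networks"},
]

text_batch = build_inputs(papers)
print(text_batch[0])
```

The resulting batch would then be passed to the tokenizer (with padding and truncation) and through the adapter-equipped model to obtain the document embeddings.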