aps6992 committed on
Commit 0d3c023
1 Parent(s): 461b37e

Update README.md

Files changed (1)
  1. README.md +15 -15
README.md CHANGED
@@ -9,7 +9,7 @@ language:
 
  <!-- Provide a quick summary of what the model is/does. -->
 
- SPECTER 2.0 is the successor to [SPECTER](allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/spp).
+ SPECTER 2.0 is the successor to [SPECTER](allenai/specter) and is capable of generating task specific embeddings for scientific tasks when paired with [adapters](https://huggingface.co/models?search=allenai/specter-2_).
  Given the combination of title and abstract of a scientific paper or a short textual query, the model can be used to generate effective embeddings for downstream applications.
 
  # Model Details
@@ -34,15 +34,15 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif
  - **Shared by:** Allen AI
  - **Model type:** bert-base-uncased + adapters
  - **License:** Apache 2.0
- - **Finetuned from model [optional]:** [allenai/scibert](https://huggingface.co/allenai/scibert_scivocab_uncased).
+ - **Finetuned from model:** [allenai/scibert](https://huggingface.co/allenai/scibert_scivocab_uncased).
 
- ## Model Sources [optional]
+ ## Model Sources
 
  <!-- Provide the basic links for the model. -->
 
  - **Repository:** [https://github.com/allenai/SPECTER2_0](https://github.com/allenai/SPECTER2_0)
- - **Paper [optional]:** [https://api.semanticscholar.org/CorpusID:254018137](https://api.semanticscholar.org/CorpusID:254018137)
- - **Demo [optional]:** [Usage](https://github.com/allenai/SPECTER2_0/blob/main/README.md)
+ - **Paper:** [https://api.semanticscholar.org/CorpusID:254018137](https://api.semanticscholar.org/CorpusID:254018137)
+ - **Demo:** [Usage](https://github.com/allenai/SPECTER2_0/blob/main/README.md)
 
  # Uses
 
@@ -52,23 +52,23 @@ It builds on the work done in [SciRepEval: A Multi-Format Benchmark for Scientif
 
  |Model|Type|Name and HF link|
  |--|--|--|
- |Base|Transformer|[allenai/specter_plus_plus](https://huggingface.co/allenai/specter_plus_plus)|
- |Classification|Adapter|[allenai/spp_classification](https://huggingface.co/allenai/spp_classification)|
- |Regression|Adapter|[allenai/spp_regression](https://huggingface.co/allenai/spp_regression)|
- |Retrieval|Adapter|[allenai/spp_proximity](https://huggingface.co/allenai/spp_proximity)|
- |Adhoc Query|Adapter|[allenai/spp_adhoc_query](https://huggingface.co/allenai/spp_adhoc_query)|
+ |Base|Transformer|[allenai/specter2](https://huggingface.co/allenai/specter2)|
+ |Classification|Adapter|[allenai/specter2_classification](https://huggingface.co/allenai/specter2_classification)|
+ |Regression|Adapter|[allenai/specter2_regression](https://huggingface.co/allenai/specter2_regression)|
+ |Retrieval|Adapter|[allenai/specter2_proximity](https://huggingface.co/allenai/specter2_proximity)|
+ |Adhoc Query|Adapter|[allenai/specter2_adhoc_query](https://huggingface.co/allenai/specter2_adhoc_query)|
 
  ```python
  from transformers import AutoTokenizer, AutoModel
 
  # load model and tokenizer
- tokenizer = AutoTokenizer.from_pretrained('allenai/specter_plus_plus')
+ tokenizer = AutoTokenizer.from_pretrained('allenai/specter2')
 
  # load base model
- model = AutoModel.from_pretrained('allenai/specter_plus_plus')
+ model = AutoModel.from_pretrained('allenai/specter2')
 
  # load the adapter(s) for the required task, provide an identifier for the adapter in the load_as argument, and activate it
- model.load_adapter("allenai/spp_adhoc_query", source="hf", load_as="adhoc_query", set_active=True)
+ model.load_adapter("allenai/specter2_adhoc_query", source="hf", load_as="adhoc_query", set_active=True)
 
  papers = [{'title': 'BERT', 'abstract': 'We introduce a new language representation model called BERT'},
            {'title': 'Attention is all you need', 'abstract': 'The dominant sequence transduction models are based on complex recurrent or convolutional neural networks'}]
@@ -132,8 +132,8 @@ We also evaluate and establish a new SoTA on [MDCR](https://github.com/zoranmedi
  |[SPECTER](https://huggingface.co/allenai/specter)|54.7|57.4|68.0|(30.6, 25.5)|
  |[SciNCL](https://huggingface.co/malteos/scincl)|55.6|57.8|69.0|(32.6, 27.3)|
  |[SciRepEval-Adapters](https://huggingface.co/models?search=scirepeval)|61.9|59.0|70.9|(35.3, 29.6)|
- |[SPECTER 2.0-base](https://huggingface.co/allenai/specter_plus_plus)|56.3|58.0|69.2|(38.0, 32.4)|
- |[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allen/spp)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
+ |[SPECTER 2.0-base](https://huggingface.co/allenai/specter2)|56.3|58.0|69.2|(38.0, 32.4)|
+ |[SPECTER 2.0-Adapters](https://huggingface.co/models?search=allen/specter-2)|**62.3**|**59.2**|**71.2**|**(38.4, 33.0)**|
 
  Please cite the following works if you end up using SPECTER 2.0:
 
 
139