Commit d515295 · Parent: 796655d

Update article_classification_modelcard.md

article_classification_modelcard.md CHANGED
```diff
@@ -20,7 +20,7 @@ This model has been fine-tuned to classify scientific articles (title and abstra
 - **Model type:** RoBERTa (BERT; Transformer)
 - **Language(s) (NLP):** Python
 - **License:** MIT
-- **Finetuned from model:**
+- **Finetuned from model:** https://huggingface.co/allenai/dsp_roberta_base_dapt_biomed_tapt_rct_500
 
 ## Model Sources
 
@@ -112,15 +112,16 @@ The model works satisfactorily for identifying articles describing biodata resou
 
 ## Model Architecture and Objective
 
-[
+The base model architecture is as described in [Gururangan S., *et al.,* 2020](http://arxiv.org/abs/2004.10964). Classification is performed using
+a linear sequence classification layer initialized using [transformers.AutoModelForSequenceClassification()](https://huggingface.co/docs/transformers/model_doc/auto).
 
 ## Compute Infrastructure
 
-
+Model was fine-tuned on Google Colaboratory.
 
 ### Hardware
 
-Model was fine-tuned
+Model was fine-tuned using GPU acceleration provided by Google Colaboratory.
 
 ### Software
 
```
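The added "Model Architecture and Objective" text describes a RoBERTa encoder topped with a linear sequence-classification head built via `AutoModelForSequenceClassification`. A minimal sketch of that head initialization is below; the tiny `RobertaConfig` sizes and `num_labels=2` are assumptions chosen only so the example runs without downloading the real checkpoint (the card's actual base checkpoint is `allenai/dsp_roberta_base_dapt_biomed_tapt_rct_500`).

```python
# Sketch only: tiny throwaway config so the example runs offline.
# num_labels=2 is an assumption (e.g. "biodata resource" vs. "other").
import torch
from transformers import RobertaConfig, AutoModelForSequenceClassification

config = RobertaConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=1,
    num_attention_heads=2,
    intermediate_size=64,
    num_labels=2,
)
# Builds a RoBERTa encoder with a randomly initialized
# sequence-classification head mapping hidden states to num_labels logits.
model = AutoModelForSequenceClassification.from_config(config)

ids = torch.tensor([[5, 6, 7]])       # toy token ids
logits = model(input_ids=ids).logits  # shape: (1, num_labels)
```

In the actual fine-tuning setup described by the card, the head would instead be attached to the pretrained checkpoint, i.e. `AutoModelForSequenceClassification.from_pretrained("allenai/dsp_roberta_base_dapt_biomed_tapt_rct_500", num_labels=...)`, which initializes a fresh classification layer on top of the downloaded encoder weights.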