youngking0727 committed
Commit 8950dd2 · Parent(s): 3e11e9b
Update README.md

README.md CHANGED
@@ -24,7 +24,7 @@ The model was trained with the following hyperparameters:
 * Cutoff length: 2048
 * Learning rate: 2e-5
 
-Overview
+Overview BioMedGPT-LM-7B was finetuned on over 26 billion tokens highly pertinent to the field of biomedicine. The fine-tuning data are extracted from 5.5 million biomedical papers in S2ORC data using PubMed Central
 (PMC)-ID and PubMed ID as criteria.
 
 
@@ -43,7 +43,7 @@ ical data modalities via a single GPT model. BioMedGPT aligns different biologic
 **Out-of-scope Uses**
 
 
-###
+### Technical Report
 "BioMedGPT: Open Multimodal Generative Pre-trained Transformer for BioMedicine"
 
 
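The hyperparameters left unchanged by this commit (cutoff length 2048, learning rate 2e-5) can be pictured as a small fine-tuning config — a hypothetical sketch for illustration only; the config names and the truncation helper are assumptions, not code from the BioMedGPT repository:

```python
# Hypothetical representation of the README's fine-tuning hyperparameters.
FINETUNE_HPARAMS = {
    "cutoff_length": 2048,   # maximum tokens per training example
    "learning_rate": 2e-5,   # optimizer learning rate
}

def truncate_to_cutoff(token_ids, hparams=FINETUNE_HPARAMS):
    """Drop tokens beyond the cutoff length before an example is batched."""
    return token_ids[: hparams["cutoff_length"]]
```

The cutoff length bounds memory per example, so any tokenized sequence longer than 2048 tokens would be truncated before training.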