---
license: apache-2.0
tags:
- medical
datasets:
- biomed
---
# BioMedGPT-LM-7B

In this repo, we present BioMedGPT-LM, a medical language model that is the first commercial-friendly GPT model in the biomedical domain and has demonstrated superior performance over existing LLMs of the same parameter size. We are releasing a 7B model, **BioMedGPT-LM-7B**, which is LLaMA2-7b-chat fine-tuned on PMC abstracts and papers from S2ORC.

### Training Details

The model was trained with the following hyperparameters:

* Epochs: 5
* Batch size: 192
* Cutoff length: 2048
* Learning rate: 2e-5

For reference, the base model Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources; its fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.
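The effective batch size of 192 can be realized in several ways on multi-GPU hardware. As a minimal sketch (the 8-GPU setup with a per-device batch size of 3 is an assumption for illustration; only the hyperparameter values themselves come from this card):

```python
# Reported fine-tuning hyperparameters, collected in one place.
hyperparams = {
    "epochs": 5,
    "batch_size": 192,       # effective (global) batch size
    "cutoff_length": 2048,   # maximum sequence length in tokens
    "learning_rate": 2e-5,
}

# Derive the gradient-accumulation steps needed to reach the effective
# batch size on assumed hardware: 8 GPUs at a per-device batch size of 3.
per_device_batch, num_gpus = 3, 8
grad_accum_steps = hyperparams["batch_size"] // (per_device_batch * num_gpus)
print(grad_accum_steps)  # → 8
```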

### Model Developers
PharMolix

### How to Use
BioMedGPT-LM-7B is a part of **[BioMedGPT-10B](https://github.com/BioFM/OpenBioMed)**, an open-source version of BioMedGPT. BioMedGPT is a multimodal generative pre-trained transformer (GPT) for biomedicine, which bridges the natural language modality and diverse biomedical data modalities via a single GPT model. BioMedGPT aligns different biological modalities with the text modality via BioMedGPT-LM. The details of BioMedGPT-10B and BioMedGPT-LM-7B can be found in the [technical report]().
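
As a hedged sketch (the repo id `PharMolix/BioMedGPT-LM-7B` and the plain Q/A prompt format are assumptions, not confirmed by this card; check the model page), the model can be loaded and queried with the Hugging Face `transformers` library:

```python
def build_prompt(question: str) -> str:
    """Wrap a biomedical question in a minimal Q/A prompt (assumed format)."""
    return f"Q: {question}\nA:"


def generate_answer(question: str, max_new_tokens: int = 128) -> str:
    """Load the model from the Hub and generate an answer."""
    # transformers is imported lazily so the prompt helper stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "PharMolix/BioMedGPT-LM-7B"  # assumed repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

For example, `generate_answer("What is the function of hemoglobin?")`. Note that `device_map="auto"` requires the `accelerate` package.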

**Intended Use Cases**

**Out-of-scope Uses**

### Research Paper
"BioMedGPT: Open Multimodal Generative Pre-trained Transformer for BioMedicine"

### GitHub
[https://github.com/BioFM/OpenBioMed](https://github.com/BioFM/OpenBioMed)

### Limitations

[Highlight any limitations or potential issues of your model.]