Commit f223b43 by sultan (parent: dbbe0ff)

Update README.md

Files changed: README.md (+7 -4)
@@ -1,7 +1,6 @@
-BioM-Transformers: Building Large Biomedical Language Models with
-BERT, ALBERT and ELECTRA
+# BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA
 
-Abstract
+# Abstract
 
 
 The impact of design choices on the performance
@@ -21,14 +20,18 @@ the significant effect of design choices on
 improving the performance of biomedical language
 models.
 
+# Model Description
+
 This model was pre-trained on PubMed abstracts only, with a biomedical-domain vocabulary, for 434K steps with a batch size of 4096 on a TPUv3-512 unit.
 
 Check our GitHub repo at https://github.com/salrowili/BioM-Transformers for TensorFlow and GluonNLP checkpoints.
 
-Acknowledgment
+# Acknowledgment
 
 We would like to acknowledge the support of the TensorFlow Research Cloud (TFRC) team in granting us access to TPUv3 units.
 
+# Citation
+
 ```bibtex
 @inproceedings{alrowili-shanker-2021-biom,
   title = "{B}io{M}-Transformers: Building Large Biomedical Language Models with {BERT}, {ALBERT} and {ELECTRA}",