sultan committed
Commit 3137108
1 Parent(s): a5dc61c

Update README.md

Files changed (1)
  1. README.md +8 -3
README.md CHANGED
 
# BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA

# Abstract
 
The impact of design choices on the performance
[...]
the significant effect of design choices on improving the performance of biomedical language models.
 
# Model Description

This model was pre-trained on PMC full-text articles for a further 64k steps with a batch size of 8192, initializing its weights from our BioM-ALBERT-xxlarge model.

Check our GitHub repo at https://github.com/salrowili/BioM-Transformers for TensorFlow and GluonNLP checkpoints.
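
Below is a minimal usage sketch with the Hugging Face Transformers library. It assumes the checkpoint is published on the Hugging Face Hub; the model ID `sultan/BioM-ALBERT-xxlarge-PMC` is a placeholder and should be replaced with this repository's actual ID.

```python
# Minimal sketch: load the checkpoint with Hugging Face Transformers and run a
# quick fill-mask sanity check on biomedical text.
# NOTE: the model ID below is an assumption; replace it with this repository's actual ID.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "sultan/BioM-ALBERT-xxlarge-PMC"  # placeholder model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
masked = f"Aspirin inhibits platelet {tokenizer.mask_token}."
for prediction in fill_mask(masked):
    print(prediction["token_str"], round(prediction["score"], 3))
```

For downstream fine-tuning, the same checkpoint can be loaded with task heads such as `AutoModelForSequenceClassification` or `AutoModelForQuestionAnswering`.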
# Acknowledgment

We would like to acknowledge the support of the TensorFlow Research Cloud (TFRC) team in granting us access to TPUv3 units.

# Citation
```bibtex
@inproceedings{alrowili-shanker-2021-biom,