sultan committed
Commit 6b8e77b
1 Parent(s): 5936dd6

Update README.md

Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -1,7 +1,6 @@
-BioM-Transformers: Building Large Biomedical Language Models with
-BERT, ALBERT and ELECTRA
+# BioM-Transformers: Building Large Biomedical Language Models with BERT, ALBERT and ELECTRA
 
-Abstract
+# Abstract
 
 
 The impact of design choices on the performance
@@ -21,15 +20,16 @@ the significant effect of design choices on
 improving the performance of biomedical language
 models.
 
+# Model Description
 This model was pre-trained on PubMed Abstracts only, with a biomedical domain vocabulary, for 264K steps with a batch size of 8192 on a TPUv3-512 unit.
 
 Check our GitHub repo at https://github.com/salrowili/BioM-Transformers for TensorFlow and GluonNLP checkpoints.
 
-Acknowledgment
+# Acknowledgment
 
 We would like to acknowledge the support of the TensorFlow Research Cloud (TFRC) team, who granted us access to TPUv3 units.
 
-Citation
+# Citation
 
 ```bibtex
 @inproceedings{alrowili-shanker-2021-biom,
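Beyond the TensorFlow and GluonNLP checkpoints linked in the README, the checkpoint on the Hub repository this commit belongs to can be loaded with the Hugging Face transformers library. The sketch below is illustrative only and is not part of the commit; the model ID is an assumption (the diff does not name the repository), so substitute the actual repo ID.

```python
# Illustrative sketch, not part of the commit: loading a BioM checkpoint
# with the Hugging Face transformers library.
import torch
from transformers import AutoModel, AutoTokenizer

# Placeholder model ID -- the diff does not name the Hub repository,
# so replace this with the ID of the repo this README describes.
model_id = "sultan/BioM-ELECTRA-Large-Discriminator"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a PubMed-style sentence and inspect the contextual embeddings.
inputs = tokenizer("Aspirin inhibits platelet aggregation.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```

AutoModel is used rather than a task-specific head because the BioM family spans BERT, ALBERT, and ELECTRA backbones; for downstream tasks, pick the matching head class (for example, AutoModelForSequenceClassification) instead.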