Li committed on
Commit
1855f2c
1 Parent(s): 75365d4

Update README.md

Files changed (1)
  1. README.md +5 -0
README.md CHANGED
@@ -17,6 +17,11 @@ Bioformer was pre-trained from scratch on the same corpus as the vocabulary (33
 
 Pre-training of Bioformer was performed on a single Cloud TPU device (TPUv2, 8 cores, 8GB memory per core). The maximum input sequence length was fixed to 512, and the batch size was set to 256. We pre-trained Bioformer for 2 million steps, which took about 8.3 days.
 
+
+## Awards
+
+Bioformer achieved top performance (highest micro-F1 score) in the BioCreative VII COVID-19 multi-label topic classification challenge (https://biocreative.bioinformatics.udel.edu/media/store/files/2021/TRACK5_pos_1_BC7_submission_221.pdf).
+
 ## Acknowledgment
 
 Bioformer is partly supported by the Google TPU Research Cloud (TRC) program.
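The pre-training budget described in the diff can be sanity-checked with a quick back-of-the-envelope calculation. This sketch uses only the numbers stated in the README text above (steps, batch size, sequence length, and reported wall-clock time); it is not part of the Bioformer codebase.

```python
# Pre-training figures stated in the README diff above.
steps = 2_000_000   # total pre-training steps
batch_size = 256    # sequences per step
seq_len = 512       # maximum input sequence length

# Total tokens processed, as an upper bound: shorter sequences
# are padded or packed up to the 512-token maximum.
tokens = steps * batch_size * seq_len
print(f"tokens processed (upper bound): {tokens:.3e}")  # 2.621e+11

# Implied throughput on the single TPUv2, given ~8.3 days of training.
days = 8.3
steps_per_sec = steps / (days * 86_400)
print(f"implied throughput: {steps_per_sec:.2f} steps/s")
```

At roughly 2.8 steps per second, the reported 8.3 days is consistent with the 2 million steps claimed in the text.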