nielsr HF staff committed on
Commit 1e052ba
1 Parent(s): 1c397d8
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -55,7 +55,7 @@ then of the form:
 [CLS] Sentence [SEP] Flattened table [SEP]
 ```
 
-### Fine-tuning
+### Pre-training
 
 The model was pre-trained on 32 Cloud TPU v3 cores for 1,000,000 steps with maximum sequence length 512 and batch size of 512.
 In this setup, pre-training on MLM only takes around 3 days. Aditionally, the model has been further pre-trained on a second task (table entailment). See the original TAPAS [paper](https://www.aclweb.org/anthology/2020.acl-main.398/) and the [follow-up paper](https://www.aclweb.org/anthology/2020.findings-emnlp.27/) for more details.
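The context above shows the `[CLS] Sentence [SEP] Flattened table [SEP]` input format. As a minimal sketch (not part of this commit), the snippet below illustrates how such an input can be produced with `TapasTokenizer` from the transformers library; the checkpoint name and the example table are illustrative assumptions, not taken from this model card.

```python
# Sketch only: the checkpoint and table below are illustrative assumptions.
import pandas as pd
from transformers import TapasTokenizer

tokenizer = TapasTokenizer.from_pretrained("google/tapas-base")  # hypothetical checkpoint choice

# TAPAS expects table cells as strings; the tokenizer flattens the table row by row.
table = pd.DataFrame(
    {"Actor": ["Brad Pitt", "George Clooney"], "Movies": ["87", "69"]}
)
inputs = tokenizer(
    table=table,
    queries=["How many movies does Brad Pitt have?"],
    return_tensors="pt",
)

# Decoding shows the "[CLS] Sentence [SEP] Flattened table [SEP]" layout.
print(tokenizer.decode(inputs["input_ids"][0]))
```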