fbaigt committed
Commit 7830d0d
1 Parent(s): a52d94d

Update model card

Files changed (1)
  1. README.md +11 -6
README.md CHANGED
@@ -11,12 +11,17 @@ datasets:
 ProcBERT is a pre-trained language model specifically for procedural text. It was pre-trained on a large-scale procedural corpus (PubMed articles/chemical patents/cooking recipes) containing over 12B tokens and shows great performance on downstream tasks. More details can be found in the following [paper](https://arxiv.org/abs/2109.04711):
 
 ```
-@article{Bai2021PretrainOA,
-  title={Pre-train or Annotate? Domain Adaptation with a Constrained Budget},
-  author={Fan Bai and Alan Ritter and Wei Xu},
-  journal={ArXiv},
-  year={2021},
-  volume={abs/2109.04711}
+@inproceedings{bai-etal-2021-pre,
+  title = "Pre-train or Annotate? Domain Adaptation with a Constrained Budget",
+  author = "Bai, Fan and
+    Ritter, Alan and
+    Xu, Wei",
+  booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
+  month = nov,
+  year = "2021",
+  address = "Online and Punta Cana, Dominican Republic",
+  publisher = "Association for Computational Linguistics",
+  url = "https://aclanthology.org/2021.emnlp-main.409",
 }
 ```
 
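For readers landing on this commit, a minimal sketch of how the model described in the card would typically be loaded with Hugging Face Transformers. The hub ID `fbaigt/procbert` is an assumption inferred from the committer's namespace, not something stated in the diff; confirm the exact checkpoint ID on the model page.

```
# Minimal sketch of loading ProcBERT via the standard Transformers API.
# NOTE: the hub ID "fbaigt/procbert" is an assumption; verify it on the
# model page before use.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("fbaigt/procbert")
model = AutoModel.from_pretrained("fbaigt/procbert")

# Encode one procedural sentence and inspect the contextual embeddings.
inputs = tokenizer(
    "Centrifuge the lysate at 4000 rpm for 10 minutes.",
    return_tensors="pt",
)
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```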