fbaigt committed
Commit
d3fe3be
1 Parent(s): 8bc0597

Update model card

Files changed (1):
  1. README.md +10 -6
README.md CHANGED
@@ -11,12 +11,16 @@ datasets:
 Proc-RoBERTa is a pre-trained language model for procedural text. It was built by fine-tuning the RoBERTa-based model on a procedural corpus (PubMed articles/chemical patents/cooking recipes), which contains 1.05B tokens. More details can be found in the following [paper](https://arxiv.org/abs/2109.04711):
 
 ```
-@article{Bai2021PretrainOA,
-  title={Pre-train or Annotate? Domain Adaptation with a Constrained Budget},
-  author={Fan Bai and Alan Ritter and Wei Xu},
-  journal={ArXiv},
-  year={2021},
-  volume={abs/2109.04711}
+@inproceedings{bai-etal-2021-pre,
+  title = "Pre-train or Annotate? Domain Adaptation with a Constrained Budget",
+  author = "Bai, Fan and
+    Ritter, Alan and
+    Xu, Wei",
+  booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
+  month = nov,
+  year = "2021",
+  address = "Online and Punta Cana, Dominican Republic",
+  publisher = "Association for Computational Linguistics",
 }
 ```