xww033 committed on
Commit 511cef5 · 1 Parent(s): 761da54

Update README.md

Files changed (1):
  1. README.md +7 -14
README.md CHANGED
@@ -7,7 +7,7 @@ It was introduced in the paper From Clozing to Comprehending: Retrofitting Pre-trained Language Model to Pre-trained Machine Reader
 Weiwen Xu, Xin Li, Wenxuan Zhang, Meng Zhou, Wai Lam, Luo Si, Lidong Bing
 and first released in [this repository](https://github.com/DAMO-NLP-SG/PMR).
 
-The model is initialized with roberta-base and further continued pre-trained with an MRC objective.
+This model is initialized with roberta-base and further continually pre-trained with an MRC objective.
 
 ## Model description
 The model is pre-trained with distantly labeled data using a learning objective called Wiki Anchor Extraction (WAE).
@@ -43,23 +43,16 @@ There are three versions of models released. The details are:
 The models need to be fine-tuned on the data of downstream tasks. During fine-tuning, no task-specific layer is required.
 
 ### How to use
-You can try the scripts from [this repo](https://github.com/DAMO-NLP-SG/PMR).
+You can try the code from [this repo](https://github.com/DAMO-NLP-SG/PMR).
 
 
 
 ### BibTeX entry and citation info
 ```bibtex
-@inproceedings{acl23/SSTuning,
-  author = {Chaoqun Liu and
-            Wenxuan Zhang and
-            Guizhen Chen and
-            Xiaobao Wu and
-            Anh Tuan Luu and
-            Chip Hong Chang and
-            Lidong Bing},
-  title = {Zero-Shot Text Classification via Self-Supervised Tuning},
-  booktitle = {Findings of the 2023 ACL},
-  year = {2023},
-  url = {},
+@article{xu2022clozing,
+  title={From Clozing to Comprehending: Retrofitting Pre-trained Language Model to Pre-trained Machine Reader},
+  author={Xu, Weiwen and Li, Xin and Zhang, Wenxuan and Zhou, Meng and Bing, Lidong and Lam, Wai and Si, Luo},
+  journal={arXiv preprint arXiv:2212.04755},
+  year={2022}
 }
 ```
 
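The "How to use" section of the updated README defers to the PMR repo for scripts. As rough orientation, here is a minimal sketch of running the checkpoint as an extractive (query, context) reader with Hugging Face transformers. The model ID `DAMO-NLP-SG/PMR-base` is an assumption (check the Hub for the actual name), and `AutoModelForQuestionAnswering` attaches transformers' generic span-extraction head rather than PMR's own extractor, so treat this as a sketch of the input/output shape, not the official pipeline:

```python
# Minimal sketch, NOT the official PMR pipeline.
# Assumptions: the checkpoint ID below is hypothetical, and the generic QA head
# used here may be randomly initialized; use the scripts in
# https://github.com/DAMO-NLP-SG/PMR for faithful results.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "DAMO-NLP-SG/PMR-base"  # hypothetical ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

# PMR frames tasks as machine reading: a query plus a context, with the answer
# extracted as a span from the context.
query = "Who proposed Wiki Anchor Extraction?"
context = "PMR is pre-trained with Wiki Anchor Extraction (WAE) on Wikipedia anchors."

inputs = tokenizer(query, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Pick the highest-scoring start and end positions and decode the span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```

The (query, context) framing is the point of the MRC objective: downstream tasks are cast as span extraction over the context, which is why the README notes that no task-specific layer is required during fine-tuning.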