saikatc committed
Commit: b0fa115
Parent: 2eea032

Update README

Files changed (1)
  1. README.md +21 -0
README.md CHANGED
@@ -26,3 +26,24 @@ tokenizer = AutoTokenizer.from_pretrained("saikatc/NatGen")
 model = AutoModelForSeq2SeqLM.from_pretrained("saikatc/NatGen")
 ```
+ NatGen: Generative Pre-training by “Naturalizing” Source Code [[`Paper Link`]](https://dl.acm.org/doi/abs/10.1145/3540250.3549162), [[`Code Repo`]](), [[`Slides`]](https://docs.google.com/presentation/d/1T6kjiohAAR1YvcNvTASR94HptA3xHGCl/edit?usp=sharing&ouid=111755026725574085503&rtpof=true&sd=true).
+ For citation, use:
+ ```
+ @inproceedings{chakraborty2022natgen,
+ author = {Chakraborty, Saikat and Ahmed, Toufique and Ding, Yangruibo and Devanbu, Premkumar T. and Ray, Baishakhi},
+ title = {NatGen: Generative Pre-Training by “Naturalizing” Source Code},
+ year = {2022},
+ isbn = {9781450394130},
+ publisher = {Association for Computing Machinery},
+ address = {New York, NY, USA},
+ url = {https://doi.org/10.1145/3540250.3549162},
+ doi = {10.1145/3540250.3549162},
+ booktitle = {Proceedings of the 30th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering},
+ pages = {18–30},
+ numpages = {13},
+ keywords = {Neural Network, Semantic Preserving Transformation, Source Code Transformer, Source Code Pre-training},
+ location = {Singapore, Singapore},
+ series = {ESEC/FSE 2022}
+ }
+ ```
+
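For context, a minimal end-to-end sketch of how the loading snippet shown in the diff above might be used, assuming the checkpoint follows the standard Hugging Face seq2seq generation API; the input program and the generation settings (beam size, max length) are illustrative choices, not values from the README:

```
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Same loading calls as in the README snippet shown in the diff above.
tokenizer = AutoTokenizer.from_pretrained("saikatc/NatGen")
model = AutoModelForSeq2SeqLM.from_pretrained("saikatc/NatGen")

# Illustrative input only: a small code fragment fed to the seq2seq model.
source = "if (x != null) { return x.size(); } else { return 0; }"
inputs = tokenizer(source, return_tensors="pt")

# Standard Hugging Face generation; beam size and max_length are arbitrary.
outputs = model.generate(**inputs, max_length=128, num_beams=5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```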