AhmedSSoliman committed · Commit c498e18 · 1 Parent(s): 506824e · Create README.md
# MarianCG: A TRANSFORMER MODEL FOR AUTOMATIC CODE GENERATION

In this work we address the code generation problem by implementing a transformer model that achieves highly accurate results. MarianCG is a code generation model that generates code from natural language descriptions. This work demonstrates the impact of using the Marian machine translation model to solve the code generation problem: our implementation shows that a machine translation model can operate as a code generation model. Finally, we set a new state of the art on the CoNaLa dataset, reaching a BLEU score of 30.92 on the code generation task.

The model is available on the Hugging Face Hub:
https://huggingface.co/AhmedSSoliman/MarianCG_NL-to-Code

```python
# Load the MarianCG model and tokenizer from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "AhmedSSoliman/MarianCG_NL-to-Code"
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Input (natural language) and output (Python code)
NL_input = "create array containing the maximum value of respective elements of array `[2, 3, 4]` and array `[1, 5, 2]`"
inputs = tokenizer(NL_input, padding="max_length", truncation=True, max_length=512, return_tensors="pt")
output = model.generate(**inputs)
output_code = tokenizer.decode(output[0], skip_special_tokens=True)
print(output_code)
```
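The BLEU score of 30.92 reported above is the standard machine-translation metric applied to generated code. As a rough illustration of what it measures, here is a minimal pure-Python sketch of sentence-level BLEU (clipped n-gram precision with a brevity penalty). This is a simplification for intuition only: the function name `bleu` and its tokenization by whitespace are our own assumptions, and the reported score would have been computed with standard corpus-level BLEU tooling, not this sketch.

```python
import math
from collections import Counter


def ngrams(tokens, n):
    # Multiset of n-grams in a token list
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))


def bleu(candidate, reference, max_n=4):
    # Simplified sentence-level BLEU: geometric mean of clipped
    # n-gram precisions (n = 1..max_n) times a brevity penalty.
    c_tok, r_tok = candidate.split(), reference.split()
    precisions = []
    for n in range(1, max_n + 1):
        c_ng, r_ng = ngrams(c_tok, n), ngrams(r_tok, n)
        overlap = sum((c_ng & r_ng).values())   # clipped matches
        total = max(sum(c_ng.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: penalize candidates shorter than the reference
    bp = 1.0 if len(c_tok) > len(r_tok) else math.exp(1 - len(r_tok) / max(len(c_tok), 1))
    return bp * math.exp(log_avg)
```

A perfect match scores 1.0, no n-gram overlap scores 0.0, and shorter-than-reference candidates are discounted by the brevity penalty.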