Paul-B98 committed on
Commit 713f663
1 Parent(s): 00cb936

add readme

Files changed (1): README.md (+59 −3)
# CodeT5+ 220m Py Sum

This model is based on [CodeT5+ (220m)](https://huggingface.co/Salesforce/codet5p-220m) from Salesforce and was fine-tuned for the code summarization task on the [CodeXGLUE](https://github.com/microsoft/CodeXGLUE) dataset. The code is available on [GitHub](https://github.com/Paul-B98/mdl-ii).

## Results

| Model | BLEU |
| ------ | ---- |
| CodeT5-base-sum-python[^1] | 23.564 |
| CodeT5-base-multi-sum[^2] | 23.985 |
| Code-Trans-S-ST[^3] | 5.495 |
| Code-Trans-S-TF[^4] | 21.093 |
| Code-Trans-S-MT[^5] | 5.450 |
| Code-Trans-S-MT-TF[^6] | 16.378 |
| Code-Trans-B-ST[^7] | 4.638 |
| Code-Trans-B-TF[^8] | 21.671 |
| Code-Trans-B-MT[^9] | 2.957 |
| Code-Trans-B-MT-TF[^10] | 13.766 |
| Code-Trans-L-TF[^11] | 23.306 |
| Code-Trans-L-MT[^12] | 13.487 |
| Code-Trans-L-MT-TF[^13] | 16.362 |
| **CodeT5+ 220m Py Sum** | 25.245 |

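The comparison above is based on BLEU. As a rough illustration of how such a score is computed, here is a minimal pure-Python sentence-level BLEU sketch with add-one smoothing; the actual CodeXGLUE evaluation script applies its own tokenization and smoothing, so its values will not match this sketch exactly.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count all n-grams of length n in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(reference, candidate, max_n=4):
    """Simplified smoothed BLEU-4 (add-one smoothing on each precision).

    Illustrative only: the CodeXGLUE benchmark uses its own evaluation
    script, so scores computed here will differ from the table above.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand = ngrams(candidate, n)
        ref = ngrams(reference, n)
        overlap = sum(min(count, ref[gram]) for gram, count in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append((overlap + 1) / (total + 1))  # add-one smoothing
    # Brevity penalty discourages overly short candidates.
    bp = min(1.0, math.exp(1 - len(reference) / max(len(candidate), 1)))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

# Hypothetical reference and generated summary, tokenized by whitespace.
ref = "prints a greeting for the given name".split()
cand = "prints a greeting for the name".split()
print(f"BLEU: {bleu(ref, cand) * 100:.2f}")
```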
## How to use

The model can be downloaded from Hugging Face and used in a summarization pipeline:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, SummarizationPipeline

# Load the fine-tuned model together with the base model's tokenizer.
pipeline = SummarizationPipeline(
    model=AutoModelForSeq2SeqLM.from_pretrained("Paul-B98/codet5p_220m_py_sum"),
    tokenizer=AutoTokenizer.from_pretrained("Salesforce/codet5p-220m"),
    device=0,  # first GPU; use device=-1 to run on CPU
)

example_method = """
def greet(name):
    print(f"Hello, {name}!")
"""

pipeline([example_method])[0]["summary_text"]
```

## References

[^1]: https://huggingface.co/Salesforce/codet5-base-codexglue-sum-python
[^2]: https://huggingface.co/Salesforce/codet5-base-multi-sum
[^3]: https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python
[^4]: https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_transfer_learning_finetune
[^5]: https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask
[^6]: https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask_finetune
[^7]: https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python
[^8]: https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_transfer_learning_finetune
[^9]: https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask
[^10]: https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask_finetune
[^11]: https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_transfer_learning_finetune
[^12]: https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask
[^13]: https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask_finetune