---
language:
- en
license: mit
datasets:
- code_x_glue_ct_code_to_text
metrics:
- bleu
- sacrebleu
---
# CodeT5+ 220m Py Sum

This model is based on [CodeT5+ (220m)](https://huggingface.co/Salesforce/codet5p-220m) from Salesforce and was fine-tuned for the code summarization task on the [CodeXGLUE](https://github.com/microsoft/CodeXGLUE) dataset. The training code is available on [GitHub](https://github.com/Paul-B98/mdl-ii).
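
The fine-tuning data is the Python subset of CodeXGLUE's code-to-text task, which is published on the Hugging Face Hub as `code_x_glue_ct_code_to_text` (the dataset listed in this card's header). As a minimal sketch of what the training pairs look like, the snippet below loads the Python configuration; the exact preprocessing lives in the linked GitHub repository.

```python
from datasets import load_dataset

# Python subset of the CodeXGLUE code-to-text benchmark
# (dataset name taken from the model card header).
dataset = load_dataset("code_x_glue_ct_code_to_text", "python")

sample = dataset["train"][0]
print(sample["code"][:200])  # the input: a Python function
print(sample["docstring"])   # the target: its natural-language summary
```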

## Results

| Model | BLEU |
| ------ | ---- | 
| [CodeT5-base-sum-python](https://huggingface.co/Salesforce/codet5-base-codexglue-sum-python)                                          | 23.564 | 
| [CodeT5-base-multi-sum](https://huggingface.co/Salesforce/codet5-base-multi-sum)                                                      | 23.985 | 
| [Code-Trans-S-ST](https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python)                              |  5.495 | 
| [Code-Trans-S-TF](https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_transfer_learning_finetune)   | 21.093 | 
| [Code-Trans-S-MT](https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask)                    |  5.450 |  
| [Code-Trans-S-MT-TF](https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask_finetune)        | 16.378 |  
| [Code-Trans-B-ST](https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python)                               |  4.638 | 
| [Code-Trans-B-TF](https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_transfer_learning_finetune)    | 21.671 |
| [Code-Trans-B-MT](https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask)                     |  2.957 | 
| [Code-Trans-B-MT-TF](https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask_finetune)         | 13.766 | 
| [Code-Trans-L-TF](https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_transfer_learning_finetune)   | 23.306 | 
| [Code-Trans-L-MT](https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask)                    | 13.487 | 
| [Code-Trans-L-MT-TF](https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask_finetune)        | 16.362 | 
| **CodeT5+ 220m Py Sum** | 25.245 |
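
The scores above are corpus-level BLEU values (this card's header lists `bleu` and `sacrebleu` as metrics). Below is a minimal sketch of how such a score can be computed with sacrebleu; the actual evaluation script is in the linked GitHub repository, so the inputs here are illustrative placeholders.

```python
import sacrebleu

# Illustrative placeholders: model outputs and gold reference summaries.
hypotheses = ["Print a greeting for the given name."]
references = [["Prints a greeting message for the given name."]]

# corpus_bleu expects one list of hypotheses and a list of
# reference streams (one inner list per reference set).
score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {score.score:.3f}")
```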

## Example usage

The model can easily be downloaded from Hugging Face and used in a summarization pipeline.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, SummarizationPipeline

# Load the fine-tuned weights and the tokenizer of the base checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelForSeq2SeqLM.from_pretrained("Paul-B98/codet5p_220m_py_sum"),
    tokenizer=AutoTokenizer.from_pretrained("Salesforce/codet5p-220m"),
    device=0,  # first GPU; use device=-1 to run on CPU
)

example_method = """
def greet(name):
    print(f"Hello, {name}!")
"""

print(pipeline([example_method])[0]["summary_text"])
```
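
Note that the tokenizer is loaded from the base `Salesforce/codet5p-220m` checkpoint, presumably because fine-tuning left the vocabulary unchanged, while the model weights come from the fine-tuned repository. For the toy method above, the pipeline should return a short natural-language description of what the function does.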