---
language:
- en
license: mit
datasets:
- code_x_glue_ct_code_to_text
metrics:
- bleu
- sacrebleu
---

# CodeT5+ 220m Py Sum

This model is based on [CodeT5+ (220m)](https://huggingface.co/Salesforce/codet5p-220m) from Salesforce and was fine-tuned for code summarization on the [CodeXGLUE](https://github.com/microsoft/CodeXGLUE) dataset. The code is available on [GitHub](https://github.com/Paul-B98/mdl-ii).
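
The training data is the Python subset of the CodeXGLUE code-to-text task. As a minimal sketch, it can be loaded with the Hugging Face `datasets` library; the `code` and `docstring` field names below follow the public `code_x_glue_ct_code_to_text` dataset:

```python
from datasets import load_dataset

# Python subset of the CodeXGLUE code-to-text task
dataset = load_dataset("code_x_glue_ct_code_to_text", "python")

sample = dataset["train"][0]
print(sample["code"])       # a Python function
print(sample["docstring"])  # its reference summary
```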

## Results

| Model | BLEU |
| ----- | ---- |
| [CodeT5-base-sum-python](https://huggingface.co/Salesforce/codet5-base-codexglue-sum-python) | 23.564 |
| [CodeT5-base-multi-sum](https://huggingface.co/Salesforce/codet5-base-multi-sum) | 23.985 |
| [Code-Trans-S-ST](https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python) | 5.495 |
| [Code-Trans-S-TF](https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_transfer_learning_finetune) | 21.093 |
| [Code-Trans-S-MT](https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask) | 5.450 |
| [Code-Trans-S-MT-TF](https://huggingface.co/SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask_finetune) | 16.378 |
| [Code-Trans-B-ST](https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python) | 4.638 |
| [Code-Trans-B-TF](https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_transfer_learning_finetune) | 21.671 |
| [Code-Trans-B-MT](https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask) | 2.957 |
| [Code-Trans-B-MT-TF](https://huggingface.co/SEBIS/code_trans_t5_base_code_documentation_generation_python_multitask_finetune) | 13.766 |
| [Code-Trans-L-TF](https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_transfer_learning_finetune) | 23.306 |
| [Code-Trans-L-MT](https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask) | 13.487 |
| [Code-Trans-L-MT-TF](https://huggingface.co/SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask_finetune) | 16.362 |
| **CodeT5+ 220m Py Sum** | 25.245 |
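
The exact evaluation setup is in the linked repository; as a rough sketch, BLEU scores like the ones above can be computed with sacreBLEU (one of the metrics listed in the header), assuming one reference summary per sample:

```python
import sacrebleu

# Hypothetical model outputs and reference summaries, for illustration only
hypotheses = ["Prints a greeting for the given name."]
references = [["Print a greeting message for the given name."]]

# corpus_bleu takes a list of hypotheses and a list of reference streams
score = sacrebleu.corpus_bleu(hypotheses, references)
print(score.score)  # corpus-level BLEU
```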

## How to use

The model can be downloaded from Hugging Face and used in a summarization pipeline:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, SummarizationPipeline

# Load the fine-tuned model together with the original CodeT5+ tokenizer
pipeline = SummarizationPipeline(
    model=AutoModelForSeq2SeqLM.from_pretrained("Paul-B98/codet5p_220m_py_sum"),
    tokenizer=AutoTokenizer.from_pretrained("Salesforce/codet5p-220m"),
    device=0,  # first GPU
)

example_method = """
def greet(name):
    print(f"Hello, {name}!")
"""

pipeline([example_method])[0]["summary_text"]
```
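
On a machine without a GPU, set `device=-1` (or omit the argument) so the pipeline runs on the CPU.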