---
arxiv: 2102.02017
language:
- code
---
|
# Studying the Usage of Text-To-Text Transfer Transformer to Support Code-Related Tasks

## Using Transfer Learning for Code-Related Tasks
This is an *unofficial* reupload of `t5-learning-no-pretraining-bf-task`, based on the [author's repo](https://github.com/antonio-mastropaolo/TransferLearning4Code), in the `safetensors` format using `transformers` `4.40.1`. I manually converted the checkpoints with the `tf_2_pytorch_T5.py` script and converted the tokenizers with my own script. The goal of this reupload is to keep older models that are still relevant baselines from going stale as HuggingFace evolves. Additionally, I may include minor corrections, such as the model's maximum sequence length configuration.
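For convenience, here is a minimal usage sketch with `transformers`. The repo id below is a placeholder for wherever this reupload is hosted, and the abstracted input is an assumption based on the paper's bug-fixing (`bf`) task, so adapt both to your setup.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Placeholder repo id -- substitute the actual Hub path of this reupload.
model_id = "t5-learning-no-pretraining-bf-task"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# The bug-fixing (bf) task maps an abstracted buggy method to its fixed version;
# this input string is only an illustrative placeholder.
buggy_method = "public void METHOD_1 ( ) { if ( VAR_1 ) { return ; } }"

inputs = tokenizer(buggy_method, return_tensors="pt")
outputs = model.generate(**inputs, max_length=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The generation settings used in the paper may differ; the defaults above are only meant as a quick smoke test of the converted checkpoint.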
## Citation

```bibtex
@article{Mastropaolo2021StudyingTU,
  title={Studying the Usage of Text-To-Text Transfer Transformer to Support Code-Related Tasks},
  author={Antonio Mastropaolo and Simone Scalabrino and Nathan Cooper and David Nader-Palacio and Denys Poshyvanyk and Rocco Oliveto and Gabriele Bavota},
  journal={2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE)},
  year={2021},
  pages={336-347}
}
```