
Studying the Usage of Text-To-Text Transfer Transformer to Support Code-Related Tasks

Using Transfer Learning for Code-Related Tasks

This is an unofficial reupload of t5-learning-with-pretraining-mg-task, based on the authors' repository, in the SafeTensors format using transformers 4.40.1. I manually converted the checkpoints using the tf_2_pytorch_T5.py script and converted the tokenizers with my own script. The goal of this reupload is to keep older models that are still relevant baselines from going stale as HuggingFace evolves. It may also include minor corrections, such as the model max length configuration.
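
A minimal loading sketch with the transformers library is shown below. The repository id "your-username/t5-learning-with-pretraining-mg-task" is a placeholder for this reupload's actual id, and the input string is an illustrative example only, not taken from the paper's datasets.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Placeholder repository id -- replace with the actual id of this reupload.
model_id = "your-username/t5-learning-with-pretraining-mg-task"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Illustrative input only; see the paper for the exact input format
# expected by each code-related task.
inputs = tokenizer("if ( x == null ) { return ; }", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```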

Citation

@article{Mastropaolo2021StudyingTU,
  title={Studying the Usage of Text-To-Text Transfer Transformer to Support Code-Related Tasks},
  author={Antonio Mastropaolo and Simone Scalabrino and Nathan Cooper and David Nader-Palacio and Denys Poshyvanyk and Rocco Oliveto and Gabriele Bavota},
  journal={2021 IEEE/ACM 43rd International Conference on Software Engineering (ICSE)},
  year={2021},
  pages={336-347}
}
Model size: 60.5M parameters · Tensor type: F32 · Format: SafeTensors