code_to_comment_conala

This model is a fine-tuned version of t5-base; the training dataset is not documented in this card, though the model name suggests a code-to-comment task on the CoNaLa corpus. It achieves the following results on the evaluation set:

  • Loss: 1.4816
  • BLEU: 13.7335
  • Gen Len: 19.8993
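
A minimal inference sketch. The model path `code_to_comment_conala` is a placeholder for the actual hub repo id or local checkpoint directory, and the input string is illustrative; `max_length=20` mirrors the ~20-token generation length reported above:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder path; substitute the real hub repo id or checkpoint directory.
tokenizer = AutoTokenizer.from_pretrained("code_to_comment_conala")
model = AutoModelForSeq2SeqLM.from_pretrained("code_to_comment_conala")

code = "df.groupby('user_id')['amount'].sum()"
inputs = tokenizer(code, return_tensors="pt")

# Cap generation near the eval-time "Gen Len" of ~20 tokens.
output_ids = model.generate(**inputs, max_length=20, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```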

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto the Trainer API follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
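
As a hedged sketch, these settings correspond roughly to the following `Seq2SeqTrainingArguments`; dataset loading, tokenization, and the `Seq2SeqTrainer` itself are omitted, and `output_dir` is illustrative:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="code_to_comment_conala",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    # Adam betas (0.9, 0.999) and epsilon 1e-08 are the Trainer defaults.
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,  # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",
    predict_with_generate=True,  # generate during eval so BLEU can be computed
)
```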

Training results

| Training Loss | Epoch | Step | Validation Loss | BLEU    | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| 2.4821        | 1.0   | 625  | 1.8299          | 10.1751 | 19.7986 |
| 2.0145        | 2.0   | 1250 | 1.6934          | 12.468  | 19.9568 |
| 1.8543        | 3.0   | 1875 | 1.6314          | 12.2525 | 19.8129 |
| 1.6883        | 4.0   | 2500 | 1.5832          | 12.4645 | 19.9137 |
| 1.6251        | 5.0   | 3125 | 1.5516          | 13.1134 | 19.7842 |
| 1.581         | 6.0   | 3750 | 1.5304          | 13.0768 | 19.9856 |
| 1.5261        | 7.0   | 4375 | 1.5081          | 13.6443 | 19.8993 |
| 1.4693        | 8.0   | 5000 | 1.5049          | 13.4999 | 19.8705 |
| 1.4356        | 9.0   | 5625 | 1.5003          | 13.7787 | 19.8993 |
| 1.4267        | 10.0  | 6250 | 1.4952          | 14.6068 | 19.9137 |
| 1.4008        | 11.0  | 6875 | 1.4899          | 13.9106 | 19.9712 |
| 1.3817        | 12.0  | 7500 | 1.4895          | 13.6681 | 19.9424 |
| 1.3494        | 13.0  | 8125 | 1.4823          | 13.6544 | 19.9424 |
| 1.3512        | 14.0  | 8750 | 1.4813          | 13.5975 | 19.9424 |
| 1.3572        | 15.0  | 9375 | 1.4816          | 13.7335 | 19.8993 |
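
BLEU figures like those above are typically computed during evaluation with the `evaluate` library's sacrebleu metric. A minimal sketch with illustrative strings (not the actual validation data):

```python
import evaluate

bleu = evaluate.load("sacrebleu")

# Illustrative prediction/reference pair; the real eval loop decodes
# model generations against the validation set.
predictions = ["return the sum of two numbers"]
references = [["return the sum of the two numbers"]]

result = bleu.compute(predictions=predictions, references=references)
print(round(result["score"], 4))  # corpus-level BLEU, as reported above

# Rough analogue of "Gen Len" (the reported value counts tokens, not words).
gen_len = sum(len(p.split()) for p in predictions) / len(predictions)
print(gen_len)
```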

Framework versions

  • Transformers 4.28.0
  • Pytorch 2.0.0+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3