
codet5-small-ft-v8-cpatd-ft-v8-cpat_dv5

This model is a fine-tuned version of ayeshgk/codet5-small-ft-v8-cpatd on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1715
  • Rouge1: 86.4622
  • Rouge2: 76.541
  • Rougel: 85.4053
  • Rougelsum: 85.4218
  • Gen Len: 13.7143
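
A minimal inference sketch follows, assuming the checkpoint is published on the Hugging Face Hub under the same ayeshgk namespace as its parent model (the Hub ID and the example input are assumptions; the card does not document the task):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub ID, following the parent model's namespace.
model_id = "ayeshgk/codet5-small-ft-v8-cpatd-ft-v8-cpat_dv5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical input; the training data and task are not documented.
inputs = tokenizer("def add(a, b): return a + b", return_tensors="pt")

# Gen Len above averages ~14 tokens, so a short generation budget suffices.
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```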

Model description

This is a further fine-tuned CodeT5-small checkpoint (about 60.5M parameters, stored as F32 safetensors), derived from ayeshgk/codet5-small-ft-v8-cpatd. No further details are documented.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 4
  • mixed_precision_training: Native AMP
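
These settings map onto transformers' Seq2SeqTrainingArguments roughly as shown below. This is a plausible reconstruction, not the author's actual training script; output_dir and the evaluation strategy are assumptions (though per-epoch evaluation matches the results table below):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="codet5-small-ft-v8-cpatd-ft-v8-cpat_dv5",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, per the list above:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=4,
    fp16=True,                    # Native AMP mixed precision
    evaluation_strategy="epoch",  # assumed; the results table shows one eval per epoch
    predict_with_generate=True,   # required for ROUGE and Gen Len metrics
)
```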

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 20   | 0.3355          | 82.9142 | 68.2812 | 81.8458 | 81.8291   | 13.3857 |
| No log        | 2.0   | 40   | 0.2333          | 84.3779 | 72.1358 | 83.5562 | 83.5825   | 13.3571 |
| No log        | 3.0   | 60   | 0.1830          | 86.0988 | 75.5643 | 85.1853 | 85.2291   | 13.6143 |
| No log        | 4.0   | 80   | 0.1715          | 86.4622 | 76.541  | 85.4053 | 85.4218   | 13.7143 |
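
The ROUGE scores appear to be reported on a 0-100 scale. Metrics like these are commonly computed with the evaluate library; the snippet below is a sketch with hypothetical predictions and references, not the card's actual evaluation code:

```python
import evaluate

rouge = evaluate.load("rouge")

# Hypothetical predictions/references; the actual evaluation data is not documented.
predictions = ["def add(a, b):\n    return a + b"]
references = ["def add(x, y):\n    return x + y"]

scores = rouge.compute(predictions=predictions, references=references)
# evaluate returns values in [0, 1]; scale by 100 to match the table above.
print({k: round(v * 100, 4) for k, v in scores.items()})
```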

Framework versions

  • Transformers 4.38.0.dev0
  • Pytorch 2.1.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.0