t5-large-finetuned-NL2BASH-customv3

This model is a fine-tuned version of alexsha/t5-large-finetuned-English-to-BASH on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7159
  • NL2BASH-M: 0.8822
  • Gen Len: 13.4256

Model description

This checkpoint targets the NL2BASH task: translating English natural-language instructions into Bash commands. It builds on t5-large, starting from the intermediate English-to-BASH fine-tune alexsha/t5-large-finetuned-English-to-BASH and further fine-tuning it on a custom dataset (v3). Details of that dataset are not documented in this card.
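
The card does not include usage instructions, so here is a minimal inference sketch. The repo id (inferred from the base model's namespace) and the example prompt are assumptions, not taken from this card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub repo id; adjust if the checkpoint lives elsewhere.
model_id = "alexsha/t5-large-finetuned-NL2BASH-customv3"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical prompt: a natural-language description of a shell task.
prompt = "find all files larger than 10MB in the current directory"
inputs = tokenizer(prompt, return_tensors="pt")

# The ~13-token average Gen Len on the eval set suggests short outputs,
# so a cap of 64 new tokens leaves plenty of headroom.
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```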

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 5
  • eval_batch_size: 5
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
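
The training script itself is not part of this card; the sketch below is only a hedged reconstruction of how these values map onto the standard Seq2SeqTrainingArguments. The output_dir, per-epoch evaluation schedule, and predict_with_generate flag are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction from the hyperparameter list above.
# The listed Adam betas/epsilon match the library defaults, so they
# need no explicit arguments here.
training_args = Seq2SeqTrainingArguments(
    output_dir="t5-large-finetuned-NL2BASH-customv3",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=5,
    per_device_eval_batch_size=5,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="epoch",  # assumed: the results table reports metrics once per epoch
    predict_with_generate=True,   # assumed: needed to compute Gen Len / NL2BASH-M
)
```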

Training results

| Training Loss | Epoch | Step | Validation Loss | NL2BASH-M | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:-------:|
| No log        | 1.0   | 387  | 0.7666          | 0.8897    | 13.2314 |
| 0.0177        | 2.0   | 774  | 0.7878          | 0.8870    | 13.3512 |
| 0.0188        | 3.0   | 1161 | 0.7580          | 0.8872    | 13.5661 |
| 0.0461        | 4.0   | 1548 | 0.7121          | 0.8846    | 13.3926 |
| 0.0461        | 5.0   | 1935 | 0.7159          | 0.8822    | 13.4256 |

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1
  • Datasets 2.6.1
  • Tokenizers 0.11.0