KB13-t5-base-finetuned-en-to-regex

This model is a fine-tuned version of t5-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4785
  • Semantic accuracy: 0.3902
  • Syntactic accuracy: 0.3171
  • Gen Len: 15.2927

Model description

More information needed

Intended uses & limitations

More information needed
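
The card does not specify intended uses, but the model name suggests it translates English descriptions into regular expressions. The snippet below is a minimal inference sketch rather than an official usage example: the repo id, prompt wording, and generation settings are all assumptions.

```python
# Minimal inference sketch. The repo id below is hypothetical -- substitute the
# actual Hub path (or a local checkpoint directory) for this fine-tuned model.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "KB13-t5-base-finetuned-en-to-regex"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example English description; the expected prompt format is an assumption.
prompt = "lines that start with a digit followed by the word 'items'"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```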

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the Seq2SeqTrainingArguments sketch after this list):

  • learning_rate: 0.001
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 1000
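
These settings map directly onto fields of Seq2SeqTrainingArguments, as sketched below. The output_dir, evaluation interval, and logging interval are assumptions inferred from the results table, not taken from the original training script; the listed Adam betas and epsilon match the Trainer's optimizer defaults.

```python
# Sketch of the reported hyperparameters as Seq2SeqTrainingArguments
# (Transformers 4.25.1 API). output_dir and eval/logging settings are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="KB13-t5-base-finetuned-en-to-regex",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    max_steps=1000,
    lr_scheduler_type="linear",
    # Adam betas (0.9, 0.999) and epsilon 1e-08 are the optimizer defaults.
    evaluation_strategy="steps",
    eval_steps=100,      # matches the 100-step evaluation interval in the results table
    logging_steps=500,   # consistent with training loss being logged every 500 steps
    predict_with_generate=True,
)
```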

Training results

| Training Loss | Epoch | Step | Validation Loss | Semantic accuracy | Syntactic accuracy | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-----------------:|:------------------:|:-------:|
| No log        | 2.13  | 100  | 0.7159          | 0.122             | 0.0976             | 15.2439 |
| No log        | 4.26  | 200  | 0.4649          | 0.2683            | 0.2195             | 15.0488 |
| No log        | 6.38  | 300  | 0.3749          | 0.4146            | 0.3415             | 15.3659 |
| No log        | 8.51  | 400  | 0.4155          | 0.3902            | 0.2927             | 15.0976 |
| 0.5191        | 10.64 | 500  | 0.4148          | 0.3902            | 0.2927             | 15.7561 |
| 0.5191        | 12.77 | 600  | 0.4010          | 0.439             | 0.3415             | 15.3902 |
| 0.5191        | 14.89 | 700  | 0.4429          | 0.3902            | 0.3171             | 15.3659 |
| 0.5191        | 17.02 | 800  | 0.4607          | 0.3902            | 0.3415             | 15.561  |
| 0.5191        | 19.15 | 900  | 0.4629          | 0.3902            | 0.3171             | 15.122  |
| 0.0518        | 21.28 | 1000 | 0.4785          | 0.3902            | 0.3171             | 15.2927 |
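
The card does not define the two accuracy metrics. A common convention for regex generation is to score syntactic accuracy as exact string match against the reference regex and semantic accuracy as behavioural equivalence; the helpers below are a sketch under that assumption (using a finite probe set as an approximation of equivalence), not the evaluation code used for this model.

```python
import re

def syntactic_match(pred: str, gold: str) -> bool:
    """Exact string equality with the reference regex (assumed definition)."""
    return pred.strip() == gold.strip()

def semantic_match(pred: str, gold: str, probes: list[str]) -> bool:
    """Approximate behavioural equivalence (assumed definition): both regexes
    make the same accept/reject decision on every probe string."""
    try:
        p, g = re.compile(pred), re.compile(gold)
    except re.error:
        return False  # an unparseable prediction cannot be semantically correct
    return all(bool(p.fullmatch(s)) == bool(g.fullmatch(s)) for s in probes)
```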

Framework versions

  • Transformers 4.25.1
  • Pytorch 1.13.0+cu116
  • Datasets 2.7.1
  • Tokenizers 0.13.2