Gen_Z_Model

This model is a fine-tuned version of t5-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2083
  • Bleu: 38.8455
  • Gen Len: 15.0467

Model description

More information needed

Intended uses & limitations

More information needed
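
Although the intended task is not documented, the checkpoint can in principle be loaded like any other fine-tuned t5-small sequence-to-sequence model. The snippet below is a minimal, illustrative sketch; the input text is a placeholder, since the expected prompt format is not described in this card.

```python
# Minimal sketch: loading this checkpoint as a standard T5 seq2seq model.
# The expected input format and task are not documented, so the example
# input below is purely a placeholder.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "GCruz19/Gen_Z_Model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("Example input text", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```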

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
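
These settings correspond to a standard transformers training setup; the sketch below shows one way they could be expressed with Seq2SeqTrainingArguments. The actual training script is not included with this card, so names such as output_dir and the evaluation/generation flags are assumptions.

```python
# Illustrative mapping of the listed hyperparameters onto
# Seq2SeqTrainingArguments; the real training script is not published,
# so treat this as a sketch rather than the exact configuration used.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="gen_z_model",        # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",     # assumption: matches the per-epoch results below
    predict_with_generate=True,      # assumption: needed to report Bleu / Gen Len
)
```

The Adam settings listed above (betas=(0.9,0.999), epsilon=1e-08) match the Trainer's defaults, so they would not need to be passed explicitly.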

Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| No log        | 1.0   | 107  | 1.9909          | 28.2199 | 15.1893 |
| No log        | 2.0   | 214  | 1.7933          | 32.7292 | 15.2734 |
| No log        | 3.0   | 321  | 1.7042          | 33.0586 | 15.3575 |
| No log        | 4.0   | 428  | 1.6409          | 33.5589 | 15.3294 |
| 1.9663        | 5.0   | 535  | 1.5944          | 34.0231 | 15.3084 |
| 1.9663        | 6.0   | 642  | 1.5542          | 34.5356 | 15.2453 |
| 1.9663        | 7.0   | 749  | 1.5204          | 34.5257 | 15.3178 |
| 1.9663        | 8.0   | 856  | 1.4949          | 35.0464 | 15.2664 |
| 1.9663        | 9.0   | 963  | 1.4656          | 34.8031 | 15.3692 |
| 1.563         | 10.0  | 1070 | 1.4452          | 34.8213 | 15.3248 |
| 1.563         | 11.0  | 1177 | 1.4273          | 34.8319 | 15.3715 |
| 1.563         | 12.0  | 1284 | 1.4041          | 34.6139 | 15.528  |
| 1.563         | 13.0  | 1391 | 1.3904          | 34.8305 | 15.4439 |
| 1.563         | 14.0  | 1498 | 1.3747          | 35.4972 | 15.5327 |
| 1.4209        | 15.0  | 1605 | 1.3619          | 35.7394 | 15.4322 |
| 1.4209        | 16.0  | 1712 | 1.3493          | 35.6452 | 15.4206 |
| 1.4209        | 17.0  | 1819 | 1.3369          | 35.8997 | 15.4276 |
| 1.4209        | 18.0  | 1926 | 1.3255          | 35.8844 | 15.4416 |
| 1.3222        | 19.0  | 2033 | 1.3168          | 35.8468 | 15.465  |
| 1.3222        | 20.0  | 2140 | 1.3074          | 36.3525 | 15.3621 |
| 1.3222        | 21.0  | 2247 | 1.2993          | 37.2694 | 15.2453 |
| 1.3222        | 22.0  | 2354 | 1.2925          | 37.3457 | 15.2593 |
| 1.3222        | 23.0  | 2461 | 1.2842          | 37.3279 | 15.236  |
| 1.2566        | 24.0  | 2568 | 1.2805          | 37.4183 | 15.2056 |
| 1.2566        | 25.0  | 2675 | 1.2750          | 37.7844 | 15.1939 |
| 1.2566        | 26.0  | 2782 | 1.2684          | 37.8613 | 15.1799 |
| 1.2566        | 27.0  | 2889 | 1.2626          | 37.8746 | 15.1519 |
| 1.2566        | 28.0  | 2996 | 1.2562          | 38.017  | 15.1495 |
| 1.1991        | 29.0  | 3103 | 1.2536          | 38.1961 | 15.1145 |
| 1.1991        | 30.0  | 3210 | 1.2473          | 38.2285 | 15.0981 |
| 1.1991        | 31.0  | 3317 | 1.2429          | 38.214  | 15.1028 |
| 1.1991        | 32.0  | 3424 | 1.2397          | 38.5427 | 15.0467 |
| 1.1655        | 33.0  | 3531 | 1.2353          | 38.2303 | 15.1121 |
| 1.1655        | 34.0  | 3638 | 1.2344          | 38.5399 | 15.1285 |
| 1.1655        | 35.0  | 3745 | 1.2288          | 38.4536 | 15.1005 |
| 1.1655        | 36.0  | 3852 | 1.2263          | 38.7325 | 15.0794 |
| 1.1655        | 37.0  | 3959 | 1.2237          | 38.7098 | 15.1051 |
| 1.1306        | 38.0  | 4066 | 1.2202          | 38.6696 | 15.1215 |
| 1.1306        | 39.0  | 4173 | 1.2182          | 38.8038 | 15.0771 |
| 1.1306        | 40.0  | 4280 | 1.2171          | 38.846  | 15.0561 |
| 1.1306        | 41.0  | 4387 | 1.2162          | 38.7233 | 15.0257 |
| 1.1306        | 42.0  | 4494 | 1.2144          | 38.7516 | 15.0327 |
| 1.1103        | 43.0  | 4601 | 1.2136          | 39.1562 | 15.0304 |
| 1.1103        | 44.0  | 4708 | 1.2115          | 38.9924 | 15.021  |
| 1.1103        | 45.0  | 4815 | 1.2104          | 39.0094 | 15.035  |
| 1.1103        | 46.0  | 4922 | 1.2097          | 38.9355 | 15.0421 |
| 1.0979        | 47.0  | 5029 | 1.2087          | 38.8939 | 15.0561 |
| 1.0979        | 48.0  | 5136 | 1.2087          | 38.8412 | 15.0491 |
| 1.0979        | 49.0  | 5243 | 1.2084          | 38.8575 | 15.0561 |
| 1.0979        | 50.0  | 5350 | 1.2083          | 38.8455 | 15.0467 |
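
The Bleu and Gen Len columns are generation-based metrics. The sketch below shows one common way to compute a comparable BLEU score with the evaluate library (sacrebleu backend); it is not necessarily the exact metric implementation used for this card.

```python
# Sketch: computing a sacrebleu-style BLEU score with the `evaluate` library.
# Predictions and references here are placeholders; the evaluation data for
# this model is not documented.
import evaluate

bleu = evaluate.load("sacrebleu")

predictions = ["the model's generated sentence"]
references = [["the reference sentence"]]

result = bleu.compute(predictions=predictions, references=references)
print(round(result["score"], 4))  # sacrebleu reports BLEU on a 0-100 scale
```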

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.3
  • Tokenizers 0.13.3

Model tree for GCruz19/Gen_Z_Model

  • Base model: google-t5/t5-small