flan-t5-base-clara-med

This model is a fine-tuned version of google/flan-t5-base; the fine-tuning dataset is not documented in this card. It achieves the following results on the evaluation set (a short note on how such scores are computed follows the list):

  • Loss: 1.2699
  • Rouge1: 30.1376
  • Rouge2: 16.8424
  • RougeL: 27.9649
  • RougeLsum: 27.9946
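
These scores follow the convention, common in Hugging Face summarization examples, of reporting ROUGE scaled to 0-100. As a minimal sketch, figures of this kind are typically computed with the evaluate library (the prediction and reference strings below are placeholders, not data from this model's evaluation set):

```python
# Hedged sketch: how ROUGE figures like those above are commonly computed
# for summarization models. Requires the `evaluate` and `rouge_score` packages.
import evaluate

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["model output to score"],   # placeholder
    references=["gold reference text"],      # placeholder
)
# `scores` maps rouge1/rouge2/rougeL/rougeLsum to values in [0, 1];
# the card appears to report them scaled by 100.
print({k: round(v * 100, 4) for k, v in scores.items()})
```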

Model description

More information needed

Intended uses & limitations

More information needed
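
Although no usage guidance is documented, the checkpoint can be loaded like any other sequence-to-sequence Transformers model. A minimal inference sketch; the model id below is an assumption based on this card's title, so substitute the full hub id (with namespace) or a local path:

```python
# Hedged inference sketch for a fine-tuned flan-t5-base checkpoint.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumption: replace with the actual hub id
# (e.g. "<namespace>/flan-t5-base-clara-med") or a local checkpoint directory.
model_id = "flan-t5-base-clara-med"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Your input text here."  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```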

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 30
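
The list above maps directly onto Seq2SeqTrainingArguments in the Transformers version noted under "Framework versions". A minimal sketch; values not listed above (e.g. the evaluation strategy) are assumptions flagged in comments:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-clara-med",
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer
    # defaults (adam_beta1, adam_beta2, adam_epsilon), so no explicit setting.
    evaluation_strategy="epoch",  # assumption: inferred from the per-epoch rows below
    predict_with_generate=True,   # assumption: needed to compute ROUGE at eval time
)
```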

Training results

Training Loss  Epoch  Step   Validation Loss  Rouge1   Rouge2   RougeL   RougeLsum
No log         1.0    380    1.4710           27.6278  15.5057  25.9917  26.0601
No log         2.0    760    1.3863           28.4324  15.8032  26.8023  26.8387
1.6476         3.0    1140   1.3494           28.6807  16.0854  26.9253  26.9743
1.6476         4.0    1520   1.3170           28.3434  15.6852  26.5800  26.5937
1.3695         5.0    1900   1.3009           28.8006  15.8190  26.8122  26.8756
1.3695         6.0    2280   1.2797           29.0521  16.4032  27.1802  27.1988
1.3695         7.0    2660   1.2744           29.2339  16.4583  27.3799  27.4091
1.2162         8.0    3040   1.2557           28.8177  16.2513  26.9967  27.0280
1.2162         9.0    3420   1.2553           29.0411  16.4606  27.2912  27.3004
1.1232         10.0   3800   1.2540           29.0367  16.3896  27.2911  27.3240
1.1232         11.0   4180   1.2500           29.3928  16.6718  27.4638  27.4877
1.1232         12.0   4560   1.2487           29.6046  16.7906  27.6814  27.6977
1.0389         13.0   4940   1.2542           29.4922  16.5255  27.5363  27.5904
1.0389         14.0   5320   1.2384           29.6472  16.7070  27.6808  27.6988
0.9794         15.0   5700   1.2476           29.3771  16.2381  27.3751  27.3876
0.9794         16.0   6080   1.2437           29.4158  16.4003  27.3116  27.3409
0.9794         17.0   6460   1.2466           29.2787  16.4136  27.3256  27.3622
0.9276         18.0   6840   1.2530           29.4183  16.4244  27.3250  27.3583
0.9276         19.0   7220   1.2582           29.7430  16.7631  27.6997  27.7752
0.8851         20.0   7600   1.2560           29.5645  16.5834  27.5395  27.5622
0.8851         21.0   7980   1.2544           29.4893  16.4478  27.3961  27.4465
0.8851         22.0   8360   1.2593           29.7850  16.6023  27.6214  27.6394
0.8578         23.0   8740   1.2588           30.0080  16.8796  27.8820  27.8989
0.8578         24.0   9120   1.2672           30.0112  16.6782  27.8556  27.8934
0.8347         25.0   9500   1.2668           29.6945  16.4310  27.4398  27.4956
0.8347         26.0   9880   1.2642           29.9327  16.6105  27.7980  27.8497
0.8347         27.0   10260  1.2674           30.0747  16.7768  27.9137  27.9609
0.8156         28.0   10640  1.2712           29.9504  16.6466  27.8371  27.8742
0.8156         29.0   11020  1.2692           30.2209  16.9038  28.0454  28.0982
0.8055         30.0   11400  1.2699           30.1376  16.8424  27.9649  27.9946

Framework versions

  • Transformers 4.25.1
  • PyTorch 1.13.0
  • Datasets 2.8.0
  • Tokenizers 0.12.1