byt5-base-es_maz

This model is a fine-tuned version of google/byt5-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0917
  • Bleu: 14.8412
  • Gen Len: 98.6675
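
For context, a fine-tuned ByT5 checkpoint like this one can be loaded through the standard transformers seq2seq API. The sketch below is illustrative only: the repo id is a placeholder, the Spanish example input is an assumption (the card does not document the dataset), and the generation settings are not taken from the training script.

```python
# Minimal inference sketch for a fine-tuned ByT5 checkpoint.
# The repo id is a placeholder; substitute the actual namespace/name.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "byt5-base-es_maz"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("Hola, ¿cómo estás?", return_tensors="pt")  # assumed Spanish source text
# Gen Len on the evaluation set averages ~99; ByT5 operates on raw bytes,
# so lengths are counted in byte tokens, not words. Leave generous headroom.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```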

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 65
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
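
These values map onto the standard Seq2SeqTrainingArguments roughly as follows. This is a reconstruction for illustration, not the original training script; output_dir is a placeholder, and any argument not listed above is presumed to have been left at its default.

```python
# Approximate reconstruction of the listed hyperparameters as Trainer arguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="byt5-base-es_maz",  # placeholder; not documented in the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=65,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    # Adam settings below match the optimizer line above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```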

Training results

| Training Loss | Epoch | Step  | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:-------:|
| No log        | 1.0   | 393   | 1.0346          | 0.0473 | 19.0    |
| 1.5209        | 2.0   | 786   | 0.8939          | 0.1413 | 19.0    |
| 1.0258        | 3.0   | 1179  | 0.8334          | 0.1641 | 19.0    |
| 0.9177        | 4.0   | 1572  | 0.7867          | 0.1729 | 19.0    |
| 0.9177        | 5.0   | 1965  | 0.7543          | 0.1742 | 19.0    |
| 0.8482        | 6.0   | 2358  | 0.7317          | 0.1692 | 19.0    |
| 0.7957        | 7.0   | 2751  | 0.7106          | 0.1742 | 19.0    |
| 0.7557        | 8.0   | 3144  | 0.6849          | 0.216  | 19.0    |
| 0.7204        | 9.0   | 3537  | 0.6731          | 0.189  | 19.0    |
| 0.7204        | 10.0  | 3930  | 0.6562          | 0.2063 | 19.0    |
| 0.6901        | 11.0  | 4323  | 0.6510          | 0.2025 | 19.0    |
| 0.6635        | 12.0  | 4716  | 0.6423          | 0.2266 | 19.0    |
| 0.6346        | 13.0  | 5109  | 0.6330          | 0.2229 | 19.0    |
| 0.6132        | 14.0  | 5502  | 0.6257          | 0.2195 | 19.0    |
| 0.6132        | 15.0  | 5895  | 0.6192          | 0.2344 | 19.0    |
| 0.5885        | 16.0  | 6288  | 0.6104          | 0.2424 | 19.0    |
| 0.5682        | 17.0  | 6681  | 0.6048          | 0.2536 | 19.0    |
| 0.5452        | 18.0  | 7074  | 0.6057          | 0.2541 | 19.0    |
| 0.5452        | 19.0  | 7467  | 0.6047          | 0.2526 | 19.0    |
| 0.5294        | 20.0  | 7860  | 0.6066          | 0.2644 | 19.0    |
| 0.5072        | 21.0  | 8253  | 0.6080          | 0.2666 | 19.0    |
| 0.4921        | 22.0  | 8646  | 0.6092          | 0.2499 | 19.0    |
| 0.4753        | 23.0  | 9039  | 0.6132          | 0.2719 | 19.0    |
| 0.4753        | 24.0  | 9432  | 0.6088          | 0.2724 | 19.0    |
| 0.4597        | 25.0  | 9825  | 0.6128          | 0.2683 | 19.0    |
| 0.4443        | 26.0  | 10218 | 0.6183          | 0.2856 | 19.0    |
| 0.4301        | 27.0  | 10611 | 0.6246          | 0.3006 | 19.0    |
| 0.418         | 28.0  | 11004 | 0.6312          | 0.2788 | 19.0    |
| 0.418         | 29.0  | 11397 | 0.6295          | 0.2843 | 19.0    |
| 0.4002        | 30.0  | 11790 | 0.6350          | 0.2982 | 19.0    |
| 0.3913        | 31.0  | 12183 | 0.6441          | 0.2822 | 19.0    |
| 0.3755        | 32.0  | 12576 | 0.6430          | 0.3215 | 19.0    |
| 0.3755        | 33.0  | 12969 | 0.6486          | 0.3024 | 19.0    |
| 0.3673        | 34.0  | 13362 | 0.6527          | 0.2985 | 19.0    |
| 0.352         | 35.0  | 13755 | 0.6660          | 0.31   | 19.0    |
| 0.3408        | 36.0  | 14148 | 0.6737          | 0.288  | 19.0    |
| 0.3307        | 37.0  | 14541 | 0.6773          | 0.2995 | 19.0    |
| 0.3307        | 38.0  | 14934 | 0.6903          | 0.29   | 19.0    |
| 0.3182        | 39.0  | 15327 | 0.7059          | 0.2848 | 19.0    |
| 0.3077        | 40.0  | 15720 | 0.6986          | 0.2878 | 19.0    |
| 0.298         | 41.0  | 16113 | 0.7053          | 0.2859 | 19.0    |
| 0.29          | 42.0  | 16506 | 0.7198          | 0.2871 | 19.0    |
| 0.29          | 43.0  | 16899 | 0.7275          | 0.2813 | 19.0    |
| 0.2787        | 44.0  | 17292 | 0.7370          | 0.2972 | 19.0    |
| 0.268         | 45.0  | 17685 | 0.7426          | 0.26   | 19.0    |
| 0.2638        | 46.0  | 18078 | 0.7529          | 0.2846 | 19.0    |
| 0.2638        | 47.0  | 18471 | 0.7603          | 0.2898 | 19.0    |
| 0.253         | 48.0  | 18864 | 0.7711          | 0.277  | 19.0    |
| 0.244         | 49.0  | 19257 | 0.7779          | 0.3005 | 19.0    |
| 0.2368        | 50.0  | 19650 | 0.7815          | 0.2931 | 19.0    |
| 0.2301        | 51.0  | 20043 | 0.8020          | 0.2998 | 19.0    |
| 0.2301        | 52.0  | 20436 | 0.8051          | 0.2806 | 19.0    |
| 0.2217        | 53.0  | 20829 | 0.8119          | 0.294  | 19.0    |
| 0.2158        | 54.0  | 21222 | 0.8288          | 0.2921 | 19.0    |
| 0.2079        | 55.0  | 21615 | 0.8341          | 0.2954 | 19.0    |
| 0.2027        | 56.0  | 22008 | 0.8365          | 0.2884 | 19.0    |
| 0.2027        | 57.0  | 22401 | 0.8441          | 0.2995 | 19.0    |
| 0.1954        | 58.0  | 22794 | 0.8488          | 0.3115 | 19.0    |
| 0.1918        | 59.0  | 23187 | 0.8710          | 0.3085 | 19.0    |
| 0.1857        | 60.0  | 23580 | 0.8718          | 0.2932 | 19.0    |
| 0.1857        | 61.0  | 23973 | 0.8777          | 0.2923 | 19.0    |
| 0.1796        | 62.0  | 24366 | 0.8832          | 0.3038 | 19.0    |
| 0.1753        | 63.0  | 24759 | 0.8997          | 0.3063 | 19.0    |
| 0.1703        | 64.0  | 25152 | 0.9198          | 0.3047 | 19.0    |
| 0.1661        | 65.0  | 25545 | 0.9194          | 0.3159 | 19.0    |
| 0.1661        | 66.0  | 25938 | 0.9243          | 0.2962 | 19.0    |
| 0.1606        | 67.0  | 26331 | 0.9376          | 0.3065 | 19.0    |
| 0.1582        | 68.0  | 26724 | 0.9339          | 0.3002 | 19.0    |
| 0.1533        | 69.0  | 27117 | 0.9420          | 0.3096 | 19.0    |
| 0.1503        | 70.0  | 27510 | 0.9522          | 0.2919 | 19.0    |
| 0.1503        | 71.0  | 27903 | 0.9620          | 0.3085 | 19.0    |
| 0.1469        | 72.0  | 28296 | 0.9673          | 0.2946 | 19.0    |
| 0.1416        | 73.0  | 28689 | 0.9706          | 0.3019 | 19.0    |
| 0.1401        | 74.0  | 29082 | 0.9877          | 0.3103 | 19.0    |
| 0.1401        | 75.0  | 29475 | 0.9860          | 0.2903 | 19.0    |
| 0.1376        | 76.0  | 29868 | 1.0073          | 0.2855 | 19.0    |
| 0.1341        | 77.0  | 30261 | 1.0067          | 0.2927 | 19.0    |
| 0.1307        | 78.0  | 30654 | 1.0064          | 0.3    | 19.0    |
| 0.1296        | 79.0  | 31047 | 1.0221          | 0.2886 | 19.0    |
| 0.1296        | 80.0  | 31440 | 1.0217          | 0.297  | 19.0    |
| 0.126         | 81.0  | 31833 | 1.0278          | 0.2919 | 19.0    |
| 0.1238        | 82.0  | 32226 | 1.0329          | 0.2951 | 19.0    |
| 0.1214        | 83.0  | 32619 | 1.0351          | 0.3043 | 19.0    |
| 0.1206        | 84.0  | 33012 | 1.0498          | 0.2964 | 19.0    |
| 0.1206        | 85.0  | 33405 | 1.0433          | 0.2971 | 19.0    |
| 0.1186        | 86.0  | 33798 | 1.0525          | 0.2964 | 19.0    |
| 0.116         | 87.0  | 34191 | 1.0547          | 0.2943 | 19.0    |
| 0.116         | 88.0  | 34584 | 1.0585          | 0.2876 | 19.0    |
| 0.116         | 89.0  | 34977 | 1.0631          | 0.2904 | 19.0    |
| 0.1131        | 90.0  | 35370 | 1.0678          | 0.2859 | 19.0    |
| 0.1124        | 91.0  | 35763 | 1.0764          | 0.3027 | 19.0    |
| 0.1109        | 92.0  | 36156 | 1.0759          | 0.3037 | 19.0    |
| 0.1097        | 93.0  | 36549 | 1.0738          | 0.2962 | 19.0    |
| 0.1097        | 94.0  | 36942 | 1.0855          | 0.2966 | 19.0    |
| 0.1093        | 95.0  | 37335 | 1.0902          | 0.2968 | 19.0    |
| 0.1082        | 96.0  | 37728 | 1.0859          | 0.2958 | 19.0    |
| 0.1073        | 97.0  | 38121 | 1.0867          | 0.3023 | 19.0    |
| 0.1063        | 98.0  | 38514 | 1.0902          | 0.3004 | 19.0    |
| 0.1063        | 99.0  | 38907 | 1.0910          | 0.3018 | 19.0    |
| 0.1065        | 100.0 | 39300 | 1.0917          | 0.3021 | 19.0    |
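
The Bleu column above is computed by the evaluation loop on decoded predictions. A minimal sketch of reproducing such a score with the evaluate library follows; whether the training run used sacrebleu with exactly this post-processing is an assumption, since the card does not include the evaluation code.

```python
# Sketch: recomputing a corpus BLEU score on decoded predictions.
import evaluate

bleu = evaluate.load("sacrebleu")
predictions = ["hypothetical model output"]            # decoded generations
references = [["hypothetical reference translation"]]  # one list of references per prediction
result = bleu.compute(predictions=predictions, references=references)
print(round(result["score"], 4))  # corpus BLEU on a 0-100 scale, as reported above
```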

Framework versions

  • Transformers 4.29.2
  • Pytorch 2.0.1+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3