
nllb-200-distilled-600M-finetuned_srimadbhagavatam_sns

This model is a fine-tuned version of facebook/nllb-200-distilled-600M. The training dataset is not documented in the card (the model name suggests a Srimad Bhagavatam translation corpus). It achieves the following results on the evaluation set (an inference sketch follows the list):

  • Loss: 0.9632
  • ROUGE-1: 39.9844
  • ROUGE-2: 15.8187
  • ROUGE-L: 24.7601
  • ROUGE-Lsum: 37.8611
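
The card ships no usage code. Below is a minimal inference sketch, assuming the checkpoint resolves under the repo id shown and that the task is Sanskrit-to-English translation (san_Deva → eng_Latn), as the model name hints; both the model id and the language codes are assumptions to adjust for your setup.

```python
# Minimal inference sketch. Assumptions (not stated in the card): the model id
# below resolves to this checkpoint, and the language pair is
# Sanskrit -> English (san_Deva -> eng_Latn). Adjust both as needed.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "nllb-200-distilled-600M-finetuned_srimadbhagavatam_sns"  # assumed repo id / local path
tokenizer = AutoTokenizer.from_pretrained(model_id, src_lang="san_Deva")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("<source-language text here>", return_tensors="pt")
generated = model.generate(
    **inputs,
    # NLLB selects the target language via the forced BOS token.
    forced_bos_token_id=tokenizer.convert_tokens_to_ids("eng_Latn"),
    max_length=128,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```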

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the Seq2SeqTrainingArguments sketch after the list):

  • learning_rate: 5.6e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 25
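
These settings map directly onto transformers' Seq2SeqTrainingArguments. The sketch below reconstructs the configuration; output_dir, evaluation_strategy, and predict_with_generate are assumptions (the per-epoch rows in the results table suggest epoch-level evaluation), and the Adam betas/epsilon listed above are the optimizer defaults, so they need no explicit arguments.

```python
# Reconstruction of the hyperparameters listed above; values marked "assumed"
# are not stated in the card. Adam betas=(0.9, 0.999) and epsilon=1e-08 are
# the transformers defaults, so they are left implicit.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="nllb-200-distilled-600M-finetuned_srimadbhagavatam_sns",  # assumed
    learning_rate=5.6e-05,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=25,
    evaluation_strategy="epoch",  # assumed: metrics are reported once per epoch
    predict_with_generate=True,   # assumed: ROUGE needs generated text, not logits
)
```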

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|
| 4.2029        | 1.0   | 193  | 3.5530          | 17.4525 | 1.8199  | 14.417  | 15.7939    |
| 3.6789        | 2.0   | 386  | 3.2385          | 18.4399 | 2.3063  | 14.4777 | 16.8663    |
| 3.4121        | 3.0   | 579  | 2.9913          | 18.6292 | 2.1671  | 14.0775 | 17.4039    |
| 3.1958        | 4.0   | 772  | 2.7935          | 20.9044 | 3.0869  | 15.7866 | 19.4597    |
| 3.0238        | 5.0   | 965  | 2.6154          | 22.9863 | 3.1733  | 15.4087 | 21.6705    |
| 2.8546        | 6.0   | 1158 | 2.4343          | 24.7063 | 4.0564  | 16.1424 | 23.2821    |
| 2.7           | 7.0   | 1351 | 2.2810          | 26.2011 | 4.6714  | 16.7887 | 24.6723    |
| 2.5532        | 8.0   | 1544 | 2.1071          | 30.7319 | 6.3718  | 17.4858 | 28.8254    |
| 2.42          | 9.0   | 1737 | 1.9742          | 28.5217 | 5.2919  | 16.9577 | 26.5686    |
| 2.2991        | 10.0  | 1930 | 1.8234          | 29.8937 | 6.3088  | 17.2141 | 28.0302    |
| 2.1851        | 11.0  | 2123 | 1.7177          | 29.8642 | 6.9874  | 18.2935 | 28.0493    |
| 2.0829        | 12.0  | 2316 | 1.5891          | 30.7551 | 6.7111  | 18.1772 | 28.8555    |
| 1.9954        | 13.0  | 2509 | 1.4965          | 32.6313 | 8.0662  | 18.4981 | 30.8014    |
| 1.9055        | 14.0  | 2702 | 1.3996          | 33.0299 | 9.6554  | 19.2763 | 31.2127    |
| 1.8372        | 15.0  | 2895 | 1.3271          | 35.4767 | 10.7234 | 20.2759 | 33.1856    |
| 1.7635        | 16.0  | 3088 | 1.2533          | 35.5164 | 11.5198 | 21.3301 | 33.2617    |
| 1.7052        | 17.0  | 3281 | 1.1865          | 37.5692 | 13.6047 | 22.9496 | 35.2626    |
| 1.6495        | 18.0  | 3474 | 1.1414          | 37.7493 | 13.6471 | 22.6947 | 35.6368    |
| 1.6009        | 19.0  | 3667 | 1.0859          | 40.251  | 15.2568 | 24.4602 | 37.955     |
| 1.5589        | 20.0  | 3860 | 1.0536          | 37.8875 | 14.5794 | 23.4696 | 35.8989    |
| 1.5209        | 21.0  | 4053 | 1.0268          | 38.4126 | 14.9535 | 24.3597 | 36.435     |
| 1.4963        | 22.0  | 4246 | 0.9982          | 40.9518 | 16.6418 | 25.284  | 38.5787    |
| 1.4651        | 23.0  | 4439 | 0.9771          | 39.4774 | 16.4189 | 24.7979 | 37.3614    |
| 1.451         | 24.0  | 4632 | 0.9662          | 40.4131 | 16.5895 | 25.0073 | 38.3018    |
| 1.4351        | 25.0  | 4825 | 0.9632          | 39.9844 | 15.8187 | 24.7601 | 37.8611    |
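
The ROUGE columns are reported as percentages (fraction × 100). The metric code is not published with the card, so the following is only a typical sketch of a compute_metrics hook for Seq2SeqTrainer using the evaluate library, assuming labels are padded with -100 for loss masking.

```python
# Hedged sketch of a typical ROUGE compute_metrics hook for Seq2SeqTrainer.
# The card does not publish its metric code; the padding handling and the
# percentage scaling below are assumptions consistent with the reported numbers.
import numpy as np
import evaluate
from transformers import AutoTokenizer

rouge = evaluate.load("rouge")
tokenizer = AutoTokenizer.from_pretrained("facebook/nllb-200-distilled-600M")

def compute_metrics(eval_preds):
    preds, labels = eval_preds
    # Labels are commonly padded with -100 to mask the loss; restore pad ids
    # before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    scores = rouge.compute(predictions=decoded_preds, references=decoded_labels)
    # evaluate's rouge returns fractions in [0, 1]; the card reports percentages.
    return {k: round(v * 100, 4) for k, v in scores.items()}
```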

Framework versions

  • Transformers 4.28.0
  • PyTorch 1.12.1
  • Datasets 2.14.4
  • Tokenizers 0.13.3