lapp0 committed
Commit 5c95089
1 Parent(s): 7b70bef

End of training

README.md CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
  The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
  It achieves the following results on the evaluation set:
- - eval_enwikippl: 210.8708
- - eval_frwikippl: 1211.4100
- - eval_zhwikippl: 584.2182
- - eval_loss: 1.2651
- - eval_runtime: 34.5216
- - eval_samples_per_second: 57.935
- - eval_steps_per_second: 7.242
+ - eval_enwikippl: 211.0345
+ - eval_frwikippl: 1207.8281
+ - eval_zhwikippl: 585.1553
+ - eval_loss: 1.2644
+ - eval_runtime: 34.8133
+ - eval_samples_per_second: 57.449
+ - eval_steps_per_second: 7.181
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment.
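The metric names above follow the pattern en/fr/zh + wikippl, presumably perplexity of the student on English, French, and Chinese Wikipedia text. As a rough sketch of how such a perplexity is typically computed (exponentiated mean token cross-entropy); the model name and sample text below are placeholders, not the card's actual evaluation setup:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model; evaluating the distilled student would use its repo id instead.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

# In an enwikippl-style metric this would be a sample of English Wikipedia.
text = "The quick brown fox jumps over the lazy dog."
enc = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # For causal LMs, passing labels=input_ids yields the mean cross-entropy
    # over next-token predictions (label shifting is handled internally).
    loss = model(**enc, labels=enc["input_ids"]).loss

print(f"perplexity: {torch.exp(loss).item():.4f}")  # ppl = exp(mean NLL)
```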
@@ -45,7 +45,7 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:kl_divergence_loss()), activations_weight=0.2, activations_loss_fn=(fn:soft_mse_loss()), attentions_weight=0, attentions_loss_fn=(fn:soft_mse_loss()))
+ - distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:kl_divergence_loss()), activations_weight=0.5, activations_loss_fn=(fn:soft_mse_loss()), attentions_weight=0, attentions_loss_fn=(fn:soft_mse_loss()))
  - train_embeddings: True
  - learning_rate: 4e-05
  - train_batch_size: 8
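The only functional change in this commit is `activations_weight`, raised from 0.2 to 0.5. A minimal sketch of what an objective of this shape computes, KL divergence on logits plus a weighted MSE on hidden activations, assuming PyTorch and illustrative names (this is not Distily's actual implementation):

```python
import torch
import torch.nn.functional as F

def kl_divergence_loss(student_logits, teacher_logits, temperature=1.0):
    # KL(teacher || student) over the vocabulary at each position.
    log_s = F.log_softmax(student_logits / temperature, dim=-1)
    t = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_s, t, reduction="batchmean") * temperature**2

def soft_mse_loss(student_acts, teacher_acts):
    # Plain MSE between corresponding hidden states.
    return F.mse_loss(student_acts, teacher_acts)

def multi_objective(student_out, teacher_out,
                    logits_weight=1.0, activations_weight=0.5):
    # Both forward passes are assumed to use output_hidden_states=True.
    loss = logits_weight * kl_divergence_loss(student_out.logits,
                                              teacher_out.logits)
    layer_losses = [soft_mse_loss(s, t) for s, t in
                    zip(student_out.hidden_states, teacher_out.hidden_states)]
    loss = loss + activations_weight * torch.stack(layer_losses).mean()
    # attentions_weight is 0 in this run, so no attention-matching term here.
    return loss
```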
@@ -62,32 +62,32 @@ Peak GPU Memory: 8.0873 GB
  | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
  | --- | --- | --- | --- | --- | --- | --- | --- | --- |
  | **teacher eval** | | 30.2086 | 57.2728 | | | | | 18.1784 |
- | 0 | 0 | 54069.2930 | 57285.3438 | 5.9282 | 34.2808 | 58.342 | 7.293 | 54227.1016 |
- | 1000 | 0.0404 | 716.9899 | 4690.9888 | 1.9692 | 34.4481 | 58.058 | 7.257 | 17055.6035 |
- | 2000 | 0.0808 | 511.8575 | 3223.3132 | 1.7813 | 34.3222 | 58.271 | 7.284 | 1834.8010 |
- | 3000 | 0.1212 | 424.5287 | 2764.4653 | 1.6683 | 34.365 | 58.199 | 7.275 | 1140.2816 |
- | 4000 | 0.1616 | 370.2410 | 2382.3306 | 1.5810 | 34.4288 | 58.091 | 7.261 | 850.9186 |
- | 5000 | 0.2020 | 320.3730 | 1866.8818 | 1.5005 | 34.5186 | 57.94 | 7.242 | 899.5260 |
- | 6000 | 0.2424 | 280.1215 | 1630.9778 | 1.4260 | 34.6307 | 57.752 | 7.219 | 1253.6843 |
- | 7000 | 0.2828 | 255.6939 | 1464.6079 | 1.3684 | 34.5592 | 57.872 | 7.234 | 1018.1992 |
- | 8000 | 0.3232 | 230.7115 | 1283.3308 | 1.3139 | 34.7117 | 57.617 | 7.202 | 884.0448 |
- | 9000 | 0.3636 | 210.8708 | 1211.4100 | 1.2651 | 34.5216 | 57.935 | 7.242 | 584.2182 |
- | 10000 | 0.4040 | 196.8811 | 1173.7437 | 1.2181 | 34.7428 | 57.566 | 7.196 | 518.6829 |
- | 11000 | 0.4444 | 177.3280 | 1025.2821 | 1.1661 | 34.6351 | 57.745 | 7.218 | 677.5621 |
- | 12000 | 0.4848 | 165.1271 | 918.4274 | 1.1265 | 34.3903 | 58.156 | 7.27 | 582.0380 |
- | 13000 | 0.5253 | 154.4717 | 852.5903 | 1.0962 | 34.4891 | 57.989 | 7.249 | 569.4298 |
- | 14000 | 0.5657 | 146.8455 | 790.3571 | 1.0641 | 34.4762 | 58.011 | 7.251 | 567.0016 |
- | 15000 | 0.6061 | 141.1984 | 757.0854 | 1.0448 | 34.3706 | 58.189 | 7.274 | 540.8218 |
- | 16000 | 0.6465 | 141.1545 | 720.0176 | 1.0306 | 34.5392 | 57.905 | 7.238 | 554.8672 |
- | 17000 | 0.6869 | 134.6346 | 724.4989 | 1.0165 | 34.5118 | 57.951 | 7.244 | 736.5389 |
- | 18000 | 0.7273 | 133.7696 | 713.7504 | 1.0063 | 34.5763 | 57.843 | 7.23 | 682.1922 |
- | 19000 | 0.7677 | 129.2763 | 671.6632 | 0.9931 | 34.7219 | 57.601 | 7.2 | 336.0171 |
- | 20000 | 0.8081 | 128.4457 | 661.3251 | 0.9853 | 34.6065 | 57.793 | 7.224 | 400.0389 |
- | 21000 | 0.8485 | 126.0350 | 631.0336 | 0.9754 | 34.5442 | 57.897 | 7.237 | 614.8754 |
- | 22000 | 0.8889 | 125.4004 | 610.4221 | 0.9684 | 34.7039 | 57.63 | 7.204 | 402.3428 |
- | 23000 | 0.9293 | 123.5638 | 645.7514 | 0.9629 | 34.6277 | 57.757 | 7.22 | 362.1565 |
- | 24000 | 0.9697 | 121.6218 | 638.1926 | 0.9553 | 34.5558 | 57.877 | 7.235 | 396.0526 |
- | 24750 | 1.0 | 120.3442 | 648.9921 | 0.9520 | 34.6113 | 57.785 | 7.223 | 346.9606 |
+ | 0 | 0 | 54069.2930 | 57285.3438 | 5.9282 | 34.854 | 57.382 | 7.173 | 54227.1016 |
+ | 1000 | 0.0404 | 715.9882 | 4680.4160 | 1.9682 | 34.7166 | 57.609 | 7.201 | 17073.8301 |
+ | 2000 | 0.0808 | 511.1823 | 3222.4033 | 1.7803 | 35.3145 | 56.634 | 7.079 | 1840.1992 |
+ | 3000 | 0.1212 | 423.7381 | 2758.2358 | 1.6673 | 34.6413 | 57.735 | 7.217 | 1143.0259 |
+ | 4000 | 0.1616 | 369.2649 | 2376.2915 | 1.5791 | 34.703 | 57.632 | 7.204 | 849.1024 |
+ | 5000 | 0.2020 | 318.7353 | 1859.0007 | 1.4983 | 34.7285 | 57.59 | 7.199 | 896.6478 |
+ | 6000 | 0.2424 | 278.9278 | 1626.8433 | 1.4235 | 34.7261 | 57.594 | 7.199 | 1273.9360 |
+ | 7000 | 0.2828 | 254.5844 | 1463.7820 | 1.3663 | 34.6733 | 57.681 | 7.21 | 1229.8070 |
+ | 8000 | 0.3232 | 230.2103 | 1278.4543 | 1.3125 | 34.4883 | 57.991 | 7.249 | 966.0128 |
+ | 9000 | 0.3636 | 211.0345 | 1207.8281 | 1.2644 | 34.8133 | 57.449 | 7.181 | 585.1553 |
+ | 10000 | 0.4040 | 196.6672 | 1174.5717 | 1.2176 | 34.9053 | 57.298 | 7.162 | 530.0967 |
+ | 11000 | 0.4444 | 177.6311 | 1018.0782 | 1.1662 | 34.8097 | 57.455 | 7.182 | 773.5347 |
+ | 12000 | 0.4848 | 164.7556 | 925.2521 | 1.1256 | 34.6156 | 57.777 | 7.222 | 547.3607 |
+ | 13000 | 0.5253 | 154.6037 | 854.5158 | 1.0956 | 34.4798 | 58.005 | 7.251 | 626.4785 |
+ | 14000 | 0.5657 | 146.6518 | 793.8755 | 1.0654 | 34.5931 | 57.815 | 7.227 | 670.8995 |
+ | 15000 | 0.6061 | 142.1114 | 773.8150 | 1.0480 | 34.5927 | 57.816 | 7.227 | 556.6484 |
+ | 16000 | 0.6465 | 140.9245 | 710.7372 | 1.0307 | 34.5966 | 57.809 | 7.226 | 648.1790 |
+ | 17000 | 0.6869 | 135.3474 | 722.6113 | 1.0181 | 34.5449 | 57.896 | 7.237 | 510.9829 |
+ | 18000 | 0.7273 | 133.1789 | 697.9252 | 1.0046 | 34.4453 | 58.063 | 7.258 | 526.8502 |
+ | 19000 | 0.7677 | 130.4562 | 684.9579 | 0.9941 | 34.4973 | 57.976 | 7.247 | 361.4319 |
+ | 20000 | 0.8081 | 128.2863 | 676.7495 | 0.9875 | 34.5351 | 57.912 | 7.239 | 392.1580 |
+ | 21000 | 0.8485 | 126.9386 | 645.8423 | 0.9779 | 34.5188 | 57.939 | 7.242 | 482.4038 |
+ | 22000 | 0.8889 | 125.7417 | 615.3482 | 0.9695 | 34.6552 | 57.711 | 7.214 | 353.2721 |
+ | 23000 | 0.9293 | 124.2566 | 641.0788 | 0.9649 | 34.5465 | 57.893 | 7.237 | 434.2212 |
+ | 24000 | 0.9697 | 121.7920 | 623.3393 | 0.9582 | 34.5651 | 57.862 | 7.233 | 437.1886 |
+ | 24750 | 1.0 | 120.7186 | 650.8248 | 0.9533 | 34.6549 | 57.712 | 7.214 | 447.5255 |
 
  ### Framework versions
  - Distily 0.2.0
logs/distillation_objective=MultiObjective(logits_weight_1__logits_loss_fn_(fn_kl_divergence_loss())__activations_weight_0.5__activations_loss_fn_(fn_soft_mse_loss())__attentions_weight_0__attentions_loss_/events.out.tfevents.1723501468.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:daf796a87489800e1e699f76abf8042183787b6b69e8fadca60231d875568b4b
+ size 253
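The three added lines are a Git LFS pointer, not the TensorBoard log itself: the 253-byte pointer references the actual events file by its SHA-256 OID. A sketch, assuming the real file has been fetched with `git lfs pull`, of reading its scalars with TensorBoard's event reader:

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Use the full path of the events file added in this commit (shortened here).
acc = EventAccumulator("logs/distillation_objective=.../events.out.tfevents.1723501468.93d6cbb3ad53")
acc.Reload()

# Print every logged scalar series: tag, training step, and value.
for tag in acc.Tags()["scalars"]:
    for event in acc.Scalars(tag):
        print(tag, event.step, event.value)
```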