End of training
README.md CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingface.co/gpt2)
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_loss: 1.
-- eval_runtime: 34.
-- eval_samples_per_second: 57.
-- eval_steps_per_second: 7.
+- eval_enwikippl: 211.0345
+- eval_frwikippl: 1207.8281
+- eval_zhwikippl: 585.1553
+- eval_loss: 1.2644
+- eval_runtime: 34.8133
+- eval_samples_per_second: 57.449
+- eval_steps_per_second: 7.181
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -45,7 +45,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:kl_divergence_loss()), activations_weight=0.
+- distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:kl_divergence_loss()), activations_weight=0.5, activations_loss_fn=(fn:soft_mse_loss()), attentions_weight=0, attentions_loss_fn=(fn:soft_mse_loss()))
 - train_embeddings: True
 - learning_rate: 4e-05
 - train_batch_size: 8
@@ -62,32 +62,32 @@ Peak GPU Memory: 8.0873 GB
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 30.2086 | 57.2728 | | | | | 18.1784 |
-| 0 | 0 | 54069.2930 | 57285.3438 | 5.9282 | 34.
-| 1000 | 0.0404 |
-| 2000 | 0.0808 | 511.
-| 3000 | 0.1212 |
-| 4000 | 0.1616 |
-| 5000 | 0.2020 |
-| 6000 | 0.2424 |
-| 7000 | 0.2828 |
-| 8000 | 0.3232 | 230.
-| 9000 | 0.3636 |
-| 10000 | 0.4040 | 196.
-| 11000 | 0.4444 | 177.
-| 12000 | 0.4848 |
-| 13000 | 0.5253 | 154.
-| 14000 | 0.5657 | 146.
-| 15000 | 0.6061 |
-| 16000 | 0.6465 |
-| 17000 | 0.6869 |
-| 18000 | 0.7273 | 133.
-| 19000 | 0.7677 |
-| 20000 | 0.8081 | 128.
-| 21000 | 0.8485 | 126.
-| 22000 | 0.8889 | 125.
-| 23000 | 0.9293 |
-| 24000 | 0.9697 | 121.
-| 24750 | 1.0 | 120.
+| 0 | 0 | 54069.2930 | 57285.3438 | 5.9282 | 34.854 | 57.382 | 7.173 | 54227.1016 |
+| 1000 | 0.0404 | 715.9882 | 4680.4160 | 1.9682 | 34.7166 | 57.609 | 7.201 | 17073.8301 |
+| 2000 | 0.0808 | 511.1823 | 3222.4033 | 1.7803 | 35.3145 | 56.634 | 7.079 | 1840.1992 |
+| 3000 | 0.1212 | 423.7381 | 2758.2358 | 1.6673 | 34.6413 | 57.735 | 7.217 | 1143.0259 |
+| 4000 | 0.1616 | 369.2649 | 2376.2915 | 1.5791 | 34.703 | 57.632 | 7.204 | 849.1024 |
+| 5000 | 0.2020 | 318.7353 | 1859.0007 | 1.4983 | 34.7285 | 57.59 | 7.199 | 896.6478 |
+| 6000 | 0.2424 | 278.9278 | 1626.8433 | 1.4235 | 34.7261 | 57.594 | 7.199 | 1273.9360 |
+| 7000 | 0.2828 | 254.5844 | 1463.7820 | 1.3663 | 34.6733 | 57.681 | 7.21 | 1229.8070 |
+| 8000 | 0.3232 | 230.2103 | 1278.4543 | 1.3125 | 34.4883 | 57.991 | 7.249 | 966.0128 |
+| 9000 | 0.3636 | 211.0345 | 1207.8281 | 1.2644 | 34.8133 | 57.449 | 7.181 | 585.1553 |
+| 10000 | 0.4040 | 196.6672 | 1174.5717 | 1.2176 | 34.9053 | 57.298 | 7.162 | 530.0967 |
+| 11000 | 0.4444 | 177.6311 | 1018.0782 | 1.1662 | 34.8097 | 57.455 | 7.182 | 773.5347 |
+| 12000 | 0.4848 | 164.7556 | 925.2521 | 1.1256 | 34.6156 | 57.777 | 7.222 | 547.3607 |
+| 13000 | 0.5253 | 154.6037 | 854.5158 | 1.0956 | 34.4798 | 58.005 | 7.251 | 626.4785 |
+| 14000 | 0.5657 | 146.6518 | 793.8755 | 1.0654 | 34.5931 | 57.815 | 7.227 | 670.8995 |
+| 15000 | 0.6061 | 142.1114 | 773.8150 | 1.0480 | 34.5927 | 57.816 | 7.227 | 556.6484 |
+| 16000 | 0.6465 | 140.9245 | 710.7372 | 1.0307 | 34.5966 | 57.809 | 7.226 | 648.1790 |
+| 17000 | 0.6869 | 135.3474 | 722.6113 | 1.0181 | 34.5449 | 57.896 | 7.237 | 510.9829 |
+| 18000 | 0.7273 | 133.1789 | 697.9252 | 1.0046 | 34.4453 | 58.063 | 7.258 | 526.8502 |
+| 19000 | 0.7677 | 130.4562 | 684.9579 | 0.9941 | 34.4973 | 57.976 | 7.247 | 361.4319 |
+| 20000 | 0.8081 | 128.2863 | 676.7495 | 0.9875 | 34.5351 | 57.912 | 7.239 | 392.1580 |
+| 21000 | 0.8485 | 126.9386 | 645.8423 | 0.9779 | 34.5188 | 57.939 | 7.242 | 482.4038 |
+| 22000 | 0.8889 | 125.7417 | 615.3482 | 0.9695 | 34.6552 | 57.711 | 7.214 | 353.2721 |
+| 23000 | 0.9293 | 124.2566 | 641.0788 | 0.9649 | 34.5465 | 57.893 | 7.237 | 434.2212 |
+| 24000 | 0.9697 | 121.7920 | 623.3393 | 0.9582 | 34.5651 | 57.862 | 7.233 | 437.1886 |
+| 24750 | 1.0 | 120.7186 | 650.8248 | 0.9533 | 34.6549 | 57.712 | 7.214 | 447.5255 |
 
 ### Framework versions
 - Distily 0.2.0
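For context on the `distillation_objective` string above: it describes a weighted sum of a KL-divergence term on the logits (weight 1) and a "soft" MSE term on hidden-state activations (weight 0.5), with the attention term disabled (weight 0). The sketch below illustrates such a combined loss; it is not Distily's actual implementation, and in particular the plain `mse_loss` stand-in for `soft_mse_loss` and the per-layer averaging are assumptions.

```python
# Minimal sketch of a MultiObjective-style distillation loss; Distily's real
# implementation (https://github.com/lapp0/distily) may differ in detail.
import torch
import torch.nn.functional as F

def kl_divergence_loss(student_logits: torch.Tensor,
                       teacher_logits: torch.Tensor) -> torch.Tensor:
    # KL(teacher || student) over the vocabulary, averaged over the batch.
    log_p_student = F.log_softmax(student_logits, dim=-1)
    p_teacher = F.softmax(teacher_logits, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean")

def soft_mse_loss(student_h: torch.Tensor, teacher_h: torch.Tensor) -> torch.Tensor:
    # Assumed stand-in for Distily's soft_mse_loss: MSE on activations.
    return F.mse_loss(student_h, teacher_h)

def multi_objective_loss(student_out, teacher_out,
                         logits_weight=1.0, activations_weight=0.5):
    # Weighted sum mirroring the distillation_objective hyperparameter:
    # logits term (KL) + activations term (MSE); attentions_weight is 0
    # in this run, so no attention term appears. Both model outputs must
    # be produced with output_hidden_states=True.
    loss = logits_weight * kl_divergence_loss(student_out.logits, teacher_out.logits)
    if activations_weight:
        pairs = list(zip(student_out.hidden_states, teacher_out.hidden_states))
        loss = loss + activations_weight * sum(
            soft_mse_loss(s, t) for s, t in pairs) / len(pairs)
    return loss
```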
logs/distillation_objective=MultiObjective(logits_weight_1__logits_loss_fn_(fn_kl_divergence_loss())__activations_weight_0.5__activations_loss_fn_(fn_soft_mse_loss())__attentions_weight_0__attentions_loss_/events.out.tfevents.1723501468.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:daf796a87489800e1e699f76abf8042183787b6b69e8fadca60231d875568b4b
+size 253
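The three lines above are a Git LFS pointer, not the log itself; the actual 253-byte TensorBoard event file is materialized by `git lfs pull` in a clone of this repo. Once fetched, its scalar series can be inspected with TensorBoard's event reader, as in this sketch (the shortened directory name is a placeholder for the full path in the commit above):

```python
# Sketch: read training scalars out of the tfevents file after `git lfs pull`.
# Substitute the full directory name from the commit for the "(...)" placeholder.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("logs/distillation_objective=MultiObjective(...)/"
                       "events.out.tfevents.1723501468.93d6cbb3ad53")
acc.Reload()  # parse the event file from disk
for tag in acc.Tags()["scalars"]:      # e.g. loss or perplexity series
    for event in acc.Scalars(tag):
        print(tag, event.step, event.value)
```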