End of training
README.md CHANGED
@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3
The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_tinystoriesppl:
-- eval_loss:
-- eval_runtime: 6.
-- eval_samples_per_second: 76.
-- eval_steps_per_second: 9.
+- eval_enwikippl: 3887.9170
+- eval_frwikippl: 50974.8398
+- eval_zhwikippl: 83822.7812
+- eval_tinystoriesppl: 1011.5359
+- eval_loss: 4.8822
+- eval_runtime: 6.5175
+- eval_samples_per_second: 76.716
+- eval_steps_per_second: 9.666

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment.
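For readers unfamiliar with these metrics: the `eval_*ppl` values are perplexities (the exponential of the mean token-level cross-entropy) on the respective evaluation corpora, and `eval_loss` is presumably the value of the training objective on the evaluation set. The snippet below is a minimal, hypothetical sketch of how such a perplexity number can be computed with `transformers`; the checkpoint path and evaluation texts are placeholders, and this is not the evaluation code Distily itself uses.

```python
# Hypothetical sketch of an eval_*ppl-style metric: perplexity as the exponential
# of the mean token-level cross-entropy. Checkpoint path and texts are placeholders;
# this is not the evaluation code Distily uses.
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "path/to/student-checkpoint"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path).eval()

def perplexity(texts: list[str]) -> float:
    """exp(average negative log-likelihood per predicted token) over `texts`."""
    total_nll, total_tokens = 0.0, 0
    for text in texts:
        enc = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
        with torch.no_grad():
            out = model(**enc, labels=enc["input_ids"])
        n_predicted = enc["input_ids"].shape[1] - 1  # labels are shifted by one position
        total_nll += out.loss.item() * n_predicted
        total_tokens += n_predicted
    return math.exp(total_nll / total_tokens)

print(perplexity(["Once upon a time, there was a tiny robot."]))
```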
@@ -47,7 +47,7 @@ More information needed
The following hyperparameters were used during training:
- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=0, loss_fn=None, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
- train_embeddings: True
-- learning_rate: 0.
+- learning_rate: 0.0004
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
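The `distillation_objective` above applies a KL-divergence loss to the student and teacher logits with weight 1 and disables the hidden-state and attention components (weight 0). As an illustration only (this is not Distily's implementation, and the temperature argument is an assumption), such a logits KL term can be written as:

```python
# Illustrative only: a KL logits-distillation term matching a configuration with
# logits weight 1 and hidden-state/attention weights 0. This is NOT Distily's
# implementation; the temperature argument is an assumption.
import torch
import torch.nn.functional as F

def logits_kl_loss(student_logits: torch.Tensor,
                   teacher_logits: torch.Tensor,
                   temperature: float = 1.0) -> torch.Tensor:
    """Mean per-token KL(teacher || student), computed from raw logits of shape (..., vocab)."""
    vocab = student_logits.size(-1)
    log_p_student = F.log_softmax(student_logits.reshape(-1, vocab) / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits.reshape(-1, vocab) / temperature, dim=-1)
    # `batchmean` averages over the flattened token dimension, matching KL's definition
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature**2
```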
@@ -62,106 +62,106 @@ Peak GPU Memory: 6.6058 GB
| step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
-| 0 | 0 | 23232.2363 | 111004.0469 | 6.4068 | 6.
-| 500 | 0.0101 |
-| 1000 | 0.0202 |
-| 1500 | 0.0303 |
-| 2000 | 0.0404 |
-| 2500 | 0.0505 |
-| 3000 | 0.0606 |
-| 3500 | 0.0707 |
-| 4000 | 0.0808 |
-| 4500 | 0.0909 |
-| 5000 | 0.1010 |
-| 5500 | 0.1111 |
-| 6000 | 0.1212 |
-| 6500 | 0.1313 |
-| 7000 | 0.1414 |
-| 7500 | 0.1515 |
-| 8000 | 0.1616 |
-| 8500 | 0.1717 |
-| 9000 | 0.1818 |
-| 9500 | 0.1919 |
-| 10000 | 0.2020 |
-| 10500 | 0.2121 |
-| 11000 | 0.2222 |
-| 11500 | 0.2323 |
-| 12000 | 0.2424 |
-| 12500 | 0.2525 |
-| 13000 | 0.2626 |
-| 13500 | 0.2727 |
-| 14000 | 0.2828 |
-| 14500 | 0.2929 |
-| 15000 | 0.3030 |
-| 15500 | 0.3131 |
-| 16000 | 0.3232 |
-| 16500 | 0.3333 |
-| 17000 | 0.3434 |
-| 17500 | 0.3535 |
-| 18000 | 0.3636 |
-| 18500 | 0.3737 |
-| 19000 | 0.3838 |
-| 19500 | 0.3939 |
-| 20000 | 0.4040 |
-| 20500 | 0.4141 |
-| 21000 | 0.4242 |
-| 21500 | 0.4343 |
-| 22000 | 0.4444 |
-| 22500 | 0.4545 |
-| 23000 | 0.4646 |
-| 23500 | 0.4747 |
-| 24000 | 0.4848 |
-| 24500 | 0.4949 |
-| 25000 | 0.5051 |
-| 25500 | 0.5152 |
-| 26000 | 0.5253 |
-| 26500 | 0.5354 |
-| 27000 | 0.5455 |
-| 27500 | 0.5556 |
-| 28000 | 0.5657 |
-| 28500 | 0.5758 |
-| 29000 | 0.5859 |
-| 29500 | 0.5960 |
-| 30000 | 0.6061 |
-| 30500 | 0.6162 |
-| 31000 | 0.6263 |
-| 31500 | 0.6364 |
-| 32000 | 0.6465 |
-| 32500 | 0.6566 |
-| 33000 | 0.6667 |
-| 33500 | 0.6768 |
-| 34000 | 0.6869 |
-| 34500 | 0.6970 |
-| 35000 | 0.7071 |
-| 35500 | 0.7172 |
-| 36000 | 0.7273 |
-| 36500 | 0.7374 |
-| 37000 | 0.7475 |
-| 37500 | 0.7576 |
-| 38000 | 0.7677 |
-| 38500 | 0.7778 |
-| 39000 | 0.7879 |
-| 39500 | 0.7980 |
-| 40000 | 0.8081 |
-| 40500 | 0.8182 |
-| 41000 | 0.8283 |
-| 41500 | 0.8384 |
-| 42000 | 0.8485 |
-| 42500 | 0.8586 |
-| 43000 | 0.8687 |
-| 43500 | 0.8788 |
-| 44000 | 0.8889 |
-| 44500 | 0.8990 |
-| 45000 | 0.9091 |
-| 45500 | 0.9192 |
-| 46000 | 0.9293 |
-| 46500 | 0.9394 |
-| 47000 | 0.9495 |
-| 47500 | 0.9596 |
-| 48000 | 0.9697 |
-| 48500 | 0.9798 |
-| 49000 | 0.9899 |
-| 49500 | 1.0 |
+| 0 | 0 | 23232.2363 | 111004.0469 | 6.4068 | 6.5373 | 76.484 | 9.637 | 9550.5166 | 102446.0156 |
+| 500 | 0.0101 | 3972.8616 | 51053.8984 | 4.8833 | 6.5063 | 76.848 | 9.683 | 1038.9088 | 84361.2188 |
+| 1000 | 0.0202 | 3908.7549 | 50974.8398 | 4.8822 | 6.5275 | 76.599 | 9.651 | 1016.2294 | 83867.4844 |
+| 1500 | 0.0303 | 3887.6167 | 50974.8398 | 4.8822 | 6.5158 | 76.737 | 9.669 | 1010.1989 | 83867.4844 |
+| 2000 | 0.0404 | 3873.1880 | 50989.1836 | 4.8822 | 6.5149 | 76.747 | 9.67 | 1005.8657 | 83822.7812 |
+| 2500 | 0.0505 | 3897.8667 | 50974.8398 | 4.8822 | 6.5124 | 76.777 | 9.674 | 1014.4666 | 83822.7812 |
+| 3000 | 0.0606 | 3906.9363 | 50974.8398 | 4.8822 | 6.5204 | 76.682 | 9.662 | 1016.0612 | 83822.7812 |
+| 3500 | 0.0707 | 3893.6423 | 50974.8398 | 4.8822 | 6.5097 | 76.809 | 9.678 | 1012.0377 | 83822.7812 |
+| 4000 | 0.0808 | 3894.2476 | 50974.8398 | 4.8822 | 6.511 | 76.793 | 9.676 | 1013.2932 | 83822.7812 |
+| 4500 | 0.0909 | 3917.8462 | 51010.7305 | 4.8822 | 6.5167 | 76.726 | 9.668 | 1019.4268 | 83867.4844 |
+| 5000 | 0.1010 | 3912.3875 | 51003.5820 | 4.8822 | 6.5075 | 76.834 | 9.681 | 1017.7424 | 83867.4844 |
+| 5500 | 0.1111 | 3903.9119 | 50974.8398 | 4.8822 | 6.5226 | 76.656 | 9.659 | 1015.6412 | 83822.7812 |
+| 6000 | 0.1212 | 3912.3875 | 51003.5820 | 4.8822 | 6.5212 | 76.673 | 9.661 | 1017.4061 | 83867.4844 |
+| 6500 | 0.1313 | 3936.0994 | 51068.2656 | 4.8822 | 6.4984 | 76.942 | 9.695 | 1025.1731 | 83957.0312 |
+| 7000 | 0.1414 | 3922.1003 | 51053.8984 | 4.8822 | 6.5149 | 76.747 | 9.67 | 1020.9445 | 83912.2812 |
+| 7500 | 0.1515 | 3914.2085 | 51010.7305 | 4.8822 | 6.5141 | 76.757 | 9.671 | 1018.9213 | 83867.4844 |
+| 8000 | 0.1616 | 3907.5435 | 50974.8398 | 4.8822 | 6.5148 | 76.748 | 9.67 | 1016.2294 | 83867.4844 |
+| 8500 | 0.1717 | 3913.6001 | 51003.5820 | 4.8822 | 6.5271 | 76.603 | 9.652 | 1018.0792 | 83867.4844 |
+| 9000 | 0.1818 | 3897.8667 | 50974.8398 | 4.8822 | 6.5537 | 76.293 | 9.613 | 1014.1313 | 83822.7812 |
+| 9500 | 0.1919 | 3887.9170 | 50974.8398 | 4.8822 | 6.5175 | 76.716 | 9.666 | 1011.5359 | 83822.7812 |
+| 10000 | 0.2020 | 3882.1963 | 50960.4531 | 4.8822 | 6.5134 | 76.765 | 9.672 | 1008.7803 | 83822.7812 |
+| 10500 | 0.2121 | 3912.3875 | 51003.5820 | 4.8822 | 6.5102 | 76.802 | 9.677 | 1017.5745 | 83867.4844 |
+| 11000 | 0.2222 | 3938.5400 | 51068.2656 | 4.8822 | 6.4952 | 76.98 | 9.699 | 1025.5123 | 83957.0312 |
+| 11500 | 0.2323 | 3939.7610 | 51068.2656 | 4.8822 | 6.4945 | 76.988 | 9.7 | 1028.0588 | 84136.4141 |
+| 12000 | 0.2424 | 3951.9873 | 51097.0547 | 4.8822 | 6.5097 | 76.809 | 9.678 | 1029.9302 | 84136.4141 |
+| 12500 | 0.2525 | 3922.1003 | 51053.8984 | 4.8822 | 6.5036 | 76.881 | 9.687 | 1021.7891 | 83912.2812 |
+| 13000 | 0.2626 | 3897.8667 | 50974.8398 | 4.8822 | 6.5248 | 76.631 | 9.655 | 1014.6344 | 83822.7812 |
+| 13500 | 0.2727 | 3887.6167 | 50974.8398 | 4.8822 | 6.5036 | 76.881 | 9.687 | 1010.3655 | 83822.7812 |
+| 14000 | 0.2828 | 3867.7913 | 50989.1836 | 4.8822 | 6.5118 | 76.783 | 9.675 | 1001.6337 | 83778.0312 |
+| 14500 | 0.2929 | 3860.6052 | 50989.1836 | 4.8822 | 6.5064 | 76.847 | 9.683 | 999.4833 | 83778.0312 |
+| 15000 | 0.3030 | 3873.1880 | 50960.4531 | 4.8822 | 6.5104 | 76.8 | 9.677 | 1005.8657 | 83822.7812 |
+| 15500 | 0.3131 | 3884.0034 | 50960.4531 | 4.8822 | 6.5186 | 76.704 | 9.665 | 1009.1969 | 83822.7812 |
+| 16000 | 0.3232 | 3876.7874 | 50989.1836 | 4.8822 | 6.5138 | 76.76 | 9.672 | 1006.6141 | 83822.7812 |
+| 16500 | 0.3333 | 3902.7021 | 50974.8398 | 4.8822 | 6.5538 | 76.291 | 9.613 | 1015.1377 | 83822.7812 |
+| 17000 | 0.3434 | 3894.2476 | 50974.8398 | 4.8822 | 6.5508 | 76.326 | 9.617 | 1013.2932 | 83822.7812 |
+| 17500 | 0.3535 | 3895.4548 | 50974.8398 | 4.8822 | 6.5145 | 76.752 | 9.671 | 1013.4609 | 83822.7812 |
+| 18000 | 0.3636 | 3892.4358 | 50974.8398 | 4.8822 | 6.5208 | 76.678 | 9.661 | 1011.7029 | 83822.7812 |
+| 18500 | 0.3737 | 3895.4548 | 50974.8398 | 4.8822 | 6.506 | 76.852 | 9.683 | 1013.4609 | 83822.7812 |
+| 19000 | 0.3838 | 3922.1003 | 51053.8984 | 4.8822 | 6.4973 | 76.955 | 9.696 | 1021.7891 | 83912.2812 |
+| 19500 | 0.3939 | 3931.2227 | 51068.2656 | 4.8822 | 6.4889 | 77.055 | 9.709 | 1023.6490 | 83957.0312 |
+| 20000 | 0.4040 | 3926.9622 | 51068.2656 | 4.8822 | 6.4907 | 77.034 | 9.706 | 1023.3103 | 83912.2812 |
+| 20500 | 0.4141 | 3922.1003 | 51053.8984 | 4.8822 | 6.4861 | 77.088 | 9.713 | 1021.7891 | 83912.2812 |
+| 21000 | 0.4242 | 3922.1003 | 51053.8984 | 4.8822 | 6.4983 | 76.943 | 9.695 | 1021.7891 | 83912.2812 |
+| 21500 | 0.4343 | 3931.2227 | 51068.2656 | 4.8822 | 6.509 | 76.817 | 9.679 | 1024.8344 | 83957.0312 |
+| 22000 | 0.4444 | 3938.5400 | 51068.2656 | 4.8822 | 6.5034 | 76.883 | 9.687 | 1026.6998 | 84091.5703 |
+| 22500 | 0.4545 | 3931.2227 | 51068.2656 | 4.8822 | 6.5235 | 76.646 | 9.657 | 1024.8344 | 83957.0312 |
+| 23000 | 0.4646 | 3922.1003 | 51053.8984 | 4.8822 | 6.5476 | 76.363 | 9.622 | 1021.4509 | 83867.4844 |
+| 23500 | 0.4747 | 3920.8850 | 51053.8984 | 4.8822 | 6.4853 | 77.098 | 9.714 | 1020.6072 | 83867.4844 |
+| 24000 | 0.4848 | 3919.6699 | 51053.8984 | 4.8822 | 6.4956 | 76.975 | 9.699 | 1020.4384 | 83867.4844 |
+| 24500 | 0.4949 | 3921.4907 | 51053.8984 | 4.8822 | 6.5036 | 76.88 | 9.687 | 1020.7761 | 83867.4844 |
+| 25000 | 0.5051 | 3919.6699 | 51053.8984 | 4.8822 | 6.4891 | 77.052 | 9.709 | 1020.6072 | 83867.4844 |
+| 25500 | 0.5152 | 3922.1003 | 51053.8984 | 4.8822 | 6.4927 | 77.01 | 9.703 | 1020.9445 | 83867.4844 |
+| 26000 | 0.5253 | 3926.9622 | 51068.2656 | 4.8822 | 6.4956 | 76.975 | 9.699 | 1022.4650 | 83912.2812 |
+| 26500 | 0.5354 | 3931.2227 | 51068.2656 | 4.8822 | 6.5008 | 76.914 | 9.691 | 1024.1567 | 83957.0312 |
+| 27000 | 0.5455 | 3926.9622 | 51068.2656 | 4.8822 | 6.5048 | 76.867 | 9.685 | 1022.2959 | 83912.2812 |
+| 27500 | 0.5556 | 3922.1003 | 51053.8984 | 4.8822 | 6.526 | 76.617 | 9.654 | 1021.7891 | 83912.2812 |
+| 28000 | 0.5657 | 3921.4907 | 51053.8984 | 4.8822 | 6.5337 | 76.527 | 9.642 | 1020.7761 | 83867.4844 |
+| 28500 | 0.5758 | 3917.8462 | 51053.8984 | 4.8822 | 6.4962 | 76.969 | 9.698 | 1019.5950 | 83867.4844 |
+| 29000 | 0.5859 | 3917.8462 | 51039.4883 | 4.8822 | 6.4943 | 76.99 | 9.701 | 1019.4268 | 83867.4844 |
+| 29500 | 0.5960 | 3919.6699 | 51053.8984 | 4.8822 | 6.5069 | 76.841 | 9.682 | 1020.4384 | 83867.4844 |
+| 30000 | 0.6061 | 3919.6699 | 51053.8984 | 4.8822 | 6.5033 | 76.884 | 9.687 | 1020.4384 | 83867.4844 |
+| 30500 | 0.6162 | 3917.8462 | 51039.4883 | 4.8822 | 6.4958 | 76.972 | 9.699 | 1019.4268 | 83867.4844 |
+| 31000 | 0.6263 | 3917.8462 | 51039.4883 | 4.8822 | 6.4827 | 77.129 | 9.718 | 1019.4268 | 83867.4844 |
+| 31500 | 0.6364 | 3917.8462 | 51039.4883 | 4.8822 | 6.495 | 76.982 | 9.7 | 1019.4268 | 83867.4844 |
+| 32000 | 0.6465 | 3918.4551 | 51053.8984 | 4.8822 | 6.5193 | 76.696 | 9.664 | 1019.5950 | 83867.4844 |
+| 32500 | 0.6566 | 3921.4907 | 51053.8984 | 4.8822 | 6.5213 | 76.671 | 9.661 | 1020.7761 | 83867.4844 |
+| 33000 | 0.6667 | 3922.1003 | 51053.8984 | 4.8822 | 6.5 | 76.923 | 9.692 | 1021.4509 | 83867.4844 |
+| 33500 | 0.6768 | 3922.1003 | 51053.8984 | 4.8822 | 6.5038 | 76.878 | 9.687 | 1021.7891 | 83912.2812 |
+| 34000 | 0.6869 | 3922.1003 | 51053.8984 | 4.8822 | 6.5965 | 75.798 | 9.551 | 1021.6200 | 83867.4844 |
+| 34500 | 0.6970 | 3922.1003 | 51053.8984 | 4.8822 | 6.4926 | 77.01 | 9.703 | 1020.7761 | 83867.4844 |
+| 35000 | 0.7071 | 3921.4907 | 51053.8984 | 4.8822 | 6.5061 | 76.851 | 9.683 | 1020.7761 | 83867.4844 |
+| 35500 | 0.7172 | 3921.4907 | 51053.8984 | 4.8822 | 6.5289 | 76.582 | 9.649 | 1020.7761 | 83867.4844 |
+| 36000 | 0.7273 | 3922.1003 | 51053.8984 | 4.8822 | 6.5582 | 76.24 | 9.606 | 1021.4509 | 83867.4844 |
+| 36500 | 0.7374 | 3922.1003 | 51053.8984 | 4.8822 | 6.5354 | 76.506 | 9.64 | 1021.7891 | 83912.2812 |
+| 37000 | 0.7475 | 3924.5286 | 51053.8984 | 4.8822 | 6.5215 | 76.669 | 9.66 | 1021.7891 | 83912.2812 |
+| 37500 | 0.7576 | 3926.9622 | 51053.8984 | 4.8822 | 6.5007 | 76.915 | 9.691 | 1021.7891 | 83912.2812 |
+| 38000 | 0.7677 | 3924.5286 | 51053.8984 | 4.8822 | 6.5068 | 76.842 | 9.682 | 1021.7891 | 83912.2812 |
+| 38500 | 0.7778 | 3922.1003 | 51053.8984 | 4.8822 | 6.5229 | 76.653 | 9.658 | 1021.6200 | 83867.4844 |
+| 39000 | 0.7879 | 3922.1003 | 51053.8984 | 4.8822 | 6.5165 | 76.728 | 9.668 | 1020.9445 | 83867.4844 |
+| 39500 | 0.7980 | 3920.8850 | 51053.8984 | 4.8822 | 6.5119 | 76.782 | 9.675 | 1020.6072 | 83867.4844 |
+| 40000 | 0.8081 | 3920.8850 | 51053.8984 | 4.8822 | 6.5092 | 76.814 | 9.679 | 1020.6072 | 83867.4844 |
+| 40500 | 0.8182 | 3920.8850 | 51053.8984 | 4.8822 | 6.5191 | 76.697 | 9.664 | 1020.7761 | 83867.4844 |
+| 41000 | 0.8283 | 3920.8850 | 51053.8984 | 4.8822 | 6.5316 | 76.551 | 9.645 | 1020.7761 | 83867.4844 |
+| 41500 | 0.8384 | 3920.8850 | 51053.8984 | 4.8822 | 6.5046 | 76.869 | 9.685 | 1020.7761 | 83867.4844 |
+| 42000 | 0.8485 | 3920.8850 | 51053.8984 | 4.8822 | 6.5038 | 76.878 | 9.687 | 1020.6072 | 83867.4844 |
+| 42500 | 0.8586 | 3920.8850 | 51053.8984 | 4.8822 | 6.5215 | 76.669 | 9.66 | 1020.6072 | 83867.4844 |
+| 43000 | 0.8687 | 3920.8850 | 51053.8984 | 4.8822 | 6.5049 | 76.865 | 9.685 | 1020.6072 | 83867.4844 |
+| 43500 | 0.8788 | 3920.8850 | 51053.8984 | 4.8822 | 6.5074 | 76.836 | 9.681 | 1020.6072 | 83867.4844 |
+| 44000 | 0.8889 | 3920.8850 | 51053.8984 | 4.8822 | 6.4973 | 76.956 | 9.696 | 1020.6072 | 83867.4844 |
+| 44500 | 0.8990 | 3920.8850 | 51053.8984 | 4.8822 | 6.529 | 76.581 | 9.649 | 1020.6072 | 83867.4844 |
+| 45000 | 0.9091 | 3920.8850 | 51053.8984 | 4.8822 | 6.5231 | 76.651 | 9.658 | 1020.6072 | 83867.4844 |
+| 45500 | 0.9192 | 3920.8850 | 51053.8984 | 4.8822 | 6.5386 | 76.469 | 9.635 | 1020.6072 | 83867.4844 |
+| 46000 | 0.9293 | 3920.8850 | 51053.8984 | 4.8822 | 6.5266 | 76.61 | 9.653 | 1020.6072 | 83867.4844 |
+| 46500 | 0.9394 | 3920.8850 | 51053.8984 | 4.8822 | 6.4999 | 76.924 | 9.692 | 1020.6072 | 83867.4844 |
+| 47000 | 0.9495 | 3920.8850 | 51053.8984 | 4.8822 | 6.5248 | 76.63 | 9.655 | 1020.6072 | 83867.4844 |
+| 47500 | 0.9596 | 3920.8850 | 51053.8984 | 4.8822 | 6.5217 | 76.668 | 9.66 | 1020.6072 | 83867.4844 |
+| 48000 | 0.9697 | 3920.8850 | 51053.8984 | 4.8822 | 6.5043 | 76.872 | 9.686 | 1020.6072 | 83867.4844 |
+| 48500 | 0.9798 | 3920.8850 | 51053.8984 | 4.8822 | 6.4865 | 77.084 | 9.713 | 1020.6072 | 83867.4844 |
+| 49000 | 0.9899 | 3920.8850 | 51053.8984 | 4.8822 | 6.5203 | 76.684 | 9.662 | 1020.6072 | 83867.4844 |
+| 49500 | 1.0 | 3920.8850 | 51053.8984 | 4.8822 | 6.6387 | 75.315 | 9.49 | 1020.6072 | 83867.4844 |

### Framework versions
- Distily 0.2.0

logs/dropout=0, learning_rate=0.0004, per_device_train_batch_size=1, weight_decay=0/events.out.tfevents.1723938921.5f530b1cf724 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:10986c8798cc7b549ad27168a0c106ee94a699450433741614d802e28c8b20b6
+size 312
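The added file is a Git LFS pointer (the `version`, `oid`, and `size` lines) to a TensorBoard event log for this run. As a hypothetical sketch, once the actual `events.out.tfevents.*` file has been fetched from LFS, its logged scalars could be inspected with TensorBoard's `EventAccumulator`; which tags are present depends on what the trainer logged.

```python
# Hypothetical sketch: inspect the logged scalars once the tfevents file has been
# fetched from Git LFS (e.g. via `git lfs pull`). Requires the `tensorboard` package.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

log_dir = "logs/dropout=0, learning_rate=0.0004, per_device_train_batch_size=1, weight_decay=0"
acc = EventAccumulator(log_dir)
acc.Reload()

tags = acc.Tags()["scalars"]   # names of the scalar series recorded during training
print(tags)
if tags:
    for event in acc.Scalars(tags[0]):
        print(event.step, event.value)
```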