2021-05-31 18:22:29,298	INFO	__main__	Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/wnli/kd/bert_base_uncased_from_bert_large_uncased.yaml', log='log/glue/wnli/kd/bert_base_uncased_from_bert_large_uncased.txt', private_output='leaderboard/glue/kd/bert_base_uncased_from_bert_large_uncased/', seed=None, student_only=False, task_name='wnli', test_only=False, world_size=1)
2021-05-31 18:22:29,328	INFO	__main__	Distributed environment: NO
Num processes: 1
Process index: 0
Local process index: 0
Device: cuda
Use FP16 precision: True

2021-05-31 18:22:39,133	WARNING	datasets.builder	Reusing dataset glue (/root/.cache/huggingface/datasets/glue/wnli/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
2021-05-31 18:22:39,753	INFO	__main__	Start training
2021-05-31 18:22:39,753	INFO	torchdistill.models.util	[teacher model]
2021-05-31 18:22:39,754	INFO	torchdistill.models.util	Using the original teacher model
2021-05-31 18:22:39,754	INFO	torchdistill.models.util	[student model]
2021-05-31 18:22:39,754	INFO	torchdistill.models.util	Using the original student model
2021-05-31 18:22:39,754	INFO	torchdistill.core.distillation	Loss = 1.0 * OrgLoss
2021-05-31 18:22:39,754	INFO	torchdistill.core.distillation	Freezing the whole teacher model
2021-05-31 18:22:43,520	INFO	torchdistill.misc.log	Epoch: [0]  [ 0/20]  eta: 0:00:11  lr: 2.97e-05  sample/s: 7.107315395186558  loss: 0.0587 (0.0587)  time: 0.5682  data: 0.0054  max mem: 2861
2021-05-31 18:22:53,674	INFO	torchdistill.misc.log	Epoch: [0] Total time: 0:00:10
2021-05-31 18:22:53,905	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
2021-05-31 18:22:53,905	INFO	__main__	Validation: accuracy = 0.5492957746478874
2021-05-31 18:22:53,905	INFO	__main__	Updating ckpt at ./resource/ckpt/glue/wnli/kd/wnli-bert-base-uncased_from_bert-large-uncased
2021-05-31 18:22:55,500	INFO	torchdistill.misc.log	Epoch: [1]  [ 0/20]  eta: 0:00:10  lr: 2.37e-05  sample/s: 7.659786933102254  loss: 0.0016 (0.0016)  time: 0.5269  data: 0.0047  max mem: 4680
2021-05-31 18:23:05,780	INFO	torchdistill.misc.log	Epoch: [1] Total time: 0:00:10
2021-05-31 18:23:06,009	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
2021-05-31 18:23:06,010	INFO	__main__	Validation: accuracy = 0.5774647887323944
2021-05-31 18:23:06,010	INFO	__main__	Updating ckpt at ./resource/ckpt/glue/wnli/kd/wnli-bert-base-uncased_from_bert-large-uncased
2021-05-31 18:23:07,623	INFO	torchdistill.misc.log	Epoch: [2]  [ 0/20]  eta: 0:00:09  lr: 1.77e-05  sample/s: 8.39442113663712  loss: 0.0000 (0.0000)  time: 0.4807  data: 0.0042  max mem: 4680
2021-05-31 18:23:18,217	INFO	torchdistill.misc.log	Epoch: [2] Total time: 0:00:11
2021-05-31 18:23:18,447	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
2021-05-31 18:23:18,448	INFO	__main__	Validation: accuracy = 0.5633802816901409
2021-05-31 18:23:19,063	INFO	torchdistill.misc.log	Epoch: [3]  [ 0/20]  eta: 0:00:12  lr: 1.1700000000000001e-05  sample/s: 6.543501026345667  loss: -0.0000 (-0.0000)  time: 0.6152  data: 0.0039  max mem: 4680
2021-05-31 18:23:29,640	INFO	torchdistill.misc.log	Epoch: [3] Total time: 0:00:11
2021-05-31 18:23:29,870	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
2021-05-31 18:23:29,870	INFO	__main__	Validation: accuracy = 0.5633802816901409
2021-05-31 18:23:30,396	INFO	torchdistill.misc.log	Epoch: [4]  [ 0/20]  eta: 0:00:10  lr: 5.7000000000000005e-06  sample/s: 7.676525791092471  loss: 0.0000 (0.0000)  time: 0.5252  data: 0.0041  max mem: 4680
2021-05-31 18:23:40,820	INFO	torchdistill.misc.log	Epoch: [4] Total time: 0:00:10
2021-05-31 18:23:41,051	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
2021-05-31 18:23:41,051	INFO	__main__	Validation: accuracy = 0.5633802816901409
2021-05-31 18:23:41,088	INFO	__main__	[Teacher: bert-large-uncased]
2021-05-31 18:23:41,729	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
2021-05-31 18:23:41,729	INFO	__main__	Test: accuracy = 0.5633802816901409
2021-05-31 18:23:44,813	INFO	__main__	[Student: bert-base-uncased]
2021-05-31 18:23:45,056	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/wnli/default_experiment-1-0.arrow
2021-05-31 18:23:45,056	INFO	__main__	Test: accuracy = 0.5774647887323944
2021-05-31 18:23:45,056	INFO	__main__	Start prediction for private dataset(s)
2021-05-31 18:23:45,057	INFO	__main__	wnli/test: 146 samples
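
The run recorded above can likely be reconstructed from the `Namespace(...)` arguments logged at startup. Below is a hedged sketch of that invocation: the script name is an assumption (torchdistill's GLUE example script), and `world_size` may be derived rather than passed; the `--config`, `--log`, `--private_output`, and `--task_name` values are taken verbatim from the log.

```shell
# Hypothetical invocation reconstructed from the logged Namespace arguments.
# Script path is an assumption; config/log/output paths are from the log above.
python text_classification.py \
    --config torchdistill/configs/sample/glue/wnli/kd/bert_base_uncased_from_bert_large_uncased.yaml \
    --log log/glue/wnli/kd/bert_base_uncased_from_bert_large_uncased.txt \
    --private_output leaderboard/glue/kd/bert_base_uncased_from_bert_large_uncased/ \
    --task_name wnli
```

Per the log, this trains a `bert-base-uncased` student from a frozen `bert-large-uncased` teacher for 5 epochs on WNLI, checkpointing on validation accuracy and finishing with test-set evaluation and private-leaderboard prediction.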