yoshitomo-matsubara committed on
Commit 6b609d3
1 Parent(s): f57423d

initial commit

README.md ADDED
@@ -0,0 +1,19 @@
+ ---
+ language: en
+ tags:
+ - bert
+ - mrpc
+ - glue
+ - kd
+ - torchdistill
+ license: apache-2.0
+ datasets:
+ - mrpc
+ metrics:
+ - f1
+ - accuracy
+ ---
+
+ `bert-base-uncased` fine-tuned on the MRPC dataset, using a fine-tuned `bert-large-uncased` as the teacher model, with [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_kd_and_submission.ipynb) for knowledge distillation.
+ The training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/mrpc/kd/bert_base_uncased_from_bert_large_uncased.yaml).
+ I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **78.9**.
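
As a usage sketch, the distilled student loads with the standard `transformers` API. The repo id below is a placeholder, since the actual Hub path of this model is not stated in this commit; the example sentences are mine.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder repo id -- substitute the actual Hub path of this model.
model_id = "yoshitomo-matsubara/bert-base-uncased-mrpc"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# MRPC is a sentence-pair paraphrase task, so both sentences are encoded together.
inputs = tokenizer(
    "The company posted strong quarterly earnings.",
    "Quarterly earnings at the company were strong.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # 1 = paraphrase, 0 = not a paraphrase
```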
config.json ADDED
@@ -0,0 +1,26 @@
+ {
+ "_name_or_path": "bert-base-uncased",
+ "architectures": [
+ "BertForSequenceClassification"
+ ],
+ "attention_probs_dropout_prob": 0.1,
+ "finetuning_task": "mrpc",
+ "gradient_checkpointing": false,
+ "hidden_act": "gelu",
+ "hidden_dropout_prob": 0.1,
+ "hidden_size": 768,
+ "initializer_range": 0.02,
+ "intermediate_size": 3072,
+ "layer_norm_eps": 1e-12,
+ "max_position_embeddings": 512,
+ "model_type": "bert",
+ "num_attention_heads": 12,
+ "num_hidden_layers": 12,
+ "pad_token_id": 0,
+ "position_embedding_type": "absolute",
+ "problem_type": "single_label_classification",
+ "transformers_version": "4.6.1",
+ "type_vocab_size": 2,
+ "use_cache": true,
+ "vocab_size": 30522
+ }
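
To sanity-check the uploaded configuration without pulling the weights, something like the following works (again with the placeholder repo id, not the confirmed Hub path):

```python
from transformers import AutoConfig

# Placeholder repo id -- substitute the actual Hub path of this model.
config = AutoConfig.from_pretrained("yoshitomo-matsubara/bert-base-uncased-mrpc")

# These should match the bert-base values recorded in config.json above.
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)  # 12 768 12
print(config.finetuning_task)  # mrpc
```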
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d3c2d38a7041148d1c55f959584ff8e6062ce675f116bcfef782fa1c1419fcb6
+ size 438024457
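
The three lines above are a Git LFS pointer, not the weights themselves; the `oid` is the SHA-256 of the real 438 MB binary. A minimal integrity check after downloading the resolved file:

```python
import hashlib

# sha256 recorded in the LFS pointer above.
expected = "d3c2d38a7041148d1c55f959584ff8e6062ce675f116bcfef782fa1c1419fcb6"

h = hashlib.sha256()
with open("pytorch_model.bin", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        h.update(chunk)
assert h.hexdigest() == expected, "checksum mismatch"
```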
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "do_lower": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-uncased"}
training.log ADDED
@@ -0,0 +1,68 @@
+ 2021-05-31 17:53:34,523 INFO __main__ Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/mrpc/kd/bert_base_uncased_from_bert_large_uncased.yaml', log='log/glue/mrpc/kd/bert_base_uncased_from_bert_large_uncased.txt', private_output='leaderboard/glue/kd/bert_base_uncased_from_bert_large_uncased/', seed=None, student_only=False, task_name='mrpc', test_only=False, world_size=1)
+ 2021-05-31 17:53:34,568 INFO __main__ Distributed environment: NO
+ Num processes: 1
+ Process index: 0
+ Local process index: 0
+ Device: cuda
+ Use FP16 precision: True
+
+ 2021-05-31 17:53:46,056 WARNING datasets.builder Reusing dataset glue (/root/.cache/huggingface/datasets/glue/mrpc/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
+ 2021-05-31 17:53:47,912 INFO __main__ Start training
+ 2021-05-31 17:53:47,912 INFO torchdistill.models.util [teacher model]
+ 2021-05-31 17:53:47,912 INFO torchdistill.models.util Using the original teacher model
+ 2021-05-31 17:53:47,912 INFO torchdistill.models.util [student model]
+ 2021-05-31 17:53:47,913 INFO torchdistill.models.util Using the original student model
+ 2021-05-31 17:53:47,913 INFO torchdistill.core.distillation Loss = 1.0 * OrgLoss
+ 2021-05-31 17:53:47,913 INFO torchdistill.core.distillation Freezing the whole teacher model
+ 2021-05-31 17:53:52,838 INFO torchdistill.misc.log Epoch: [0] [  0/230] eta: 0:00:18 lr: 9.991304347826087e-05 sample/s: 53.83697333376119 loss: 0.1500 (0.1500) time: 0.0792 data: 0.0049 max mem: 1984
+ 2021-05-31 17:53:57,162 INFO torchdistill.misc.log Epoch: [0] [ 50/230] eta: 0:00:15 lr: 9.556521739130435e-05 sample/s: 47.326553812563645 loss: 0.1309 (0.1424) time: 0.0860 data: 0.0028 max mem: 3426
+ 2021-05-31 17:54:01,372 INFO torchdistill.misc.log Epoch: [0] [100/230] eta: 0:00:11 lr: 9.121739130434783e-05 sample/s: 50.72200454699367 loss: 0.1143 (0.1301) time: 0.0844 data: 0.0029 max mem: 3426
+ 2021-05-31 17:54:05,633 INFO torchdistill.misc.log Epoch: [0] [150/230] eta: 0:00:06 lr: 8.686956521739131e-05 sample/s: 47.706321425626356 loss: 0.0961 (0.1229) time: 0.0840 data: 0.0028 max mem: 3514
+ 2021-05-31 17:54:09,841 INFO torchdistill.misc.log Epoch: [0] [200/230] eta: 0:00:02 lr: 8.252173913043479e-05 sample/s: 51.89701774628108 loss: 0.0798 (0.1154) time: 0.0837 data: 0.0028 max mem: 3514
+ 2021-05-31 17:54:12,262 INFO torchdistill.misc.log Epoch: [0] Total time: 0:00:19
+ 2021-05-31 17:54:12,694 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+ 2021-05-31 17:54:12,694 INFO __main__ Validation: accuracy = 0.8137254901960784, f1 = 0.8758169934640523
+ 2021-05-31 17:54:12,695 INFO __main__ Updating ckpt at ./resource/ckpt/glue/mrpc/kd/mrpc-bert-base-uncased_from_bert-large-uncased
+ 2021-05-31 17:54:13,740 INFO torchdistill.misc.log Epoch: [1] [  0/230] eta: 0:00:23 lr: 7.991304347826087e-05 sample/s: 41.638144685540134 loss: 0.0671 (0.0671) time: 0.1008 data: 0.0047 max mem: 3514
+ 2021-05-31 17:54:18,206 INFO torchdistill.misc.log Epoch: [1] [ 50/230] eta: 0:00:16 lr: 7.556521739130435e-05 sample/s: 47.095790431062554 loss: 0.0538 (0.0574) time: 0.0852 data: 0.0028 max mem: 3514
+ 2021-05-31 17:54:22,392 INFO torchdistill.misc.log Epoch: [1] [100/230] eta: 0:00:11 lr: 7.121739130434783e-05 sample/s: 51.10875664478394 loss: 0.0411 (0.0576) time: 0.0829 data: 0.0028 max mem: 3514
+ 2021-05-31 17:54:26,596 INFO torchdistill.misc.log Epoch: [1] [150/230] eta: 0:00:06 lr: 6.686956521739131e-05 sample/s: 51.13508241490295 loss: 0.0486 (0.0582) time: 0.0831 data: 0.0028 max mem: 3514
+ 2021-05-31 17:54:30,823 INFO torchdistill.misc.log Epoch: [1] [200/230] eta: 0:00:02 lr: 6.252173913043479e-05 sample/s: 50.08109229524688 loss: 0.0462 (0.0563) time: 0.0837 data: 0.0027 max mem: 3514
+ 2021-05-31 17:54:33,328 INFO torchdistill.misc.log Epoch: [1] Total time: 0:00:19
+ 2021-05-31 17:54:33,746 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+ 2021-05-31 17:54:33,746 INFO __main__ Validation: accuracy = 0.8480392156862745, f1 = 0.8973509933774835
+ 2021-05-31 17:54:33,747 INFO __main__ Updating ckpt at ./resource/ckpt/glue/mrpc/kd/mrpc-bert-base-uncased_from_bert-large-uncased
+ 2021-05-31 17:54:34,895 INFO torchdistill.misc.log Epoch: [2] [  0/230] eta: 0:00:21 lr: 5.9913043478260875e-05 sample/s: 44.05931925197026 loss: 0.0465 (0.0465) time: 0.0945 data: 0.0038 max mem: 3514
+ 2021-05-31 17:54:39,140 INFO torchdistill.misc.log Epoch: [2] [ 50/230] eta: 0:00:15 lr: 5.556521739130435e-05 sample/s: 50.21795455089677 loss: 0.0095 (0.0346) time: 0.0823 data: 0.0027 max mem: 3514
+ 2021-05-31 17:54:43,295 INFO torchdistill.misc.log Epoch: [2] [100/230] eta: 0:00:10 lr: 5.1217391304347826e-05 sample/s: 50.8051007337377 loss: 0.0142 (0.0291) time: 0.0837 data: 0.0028 max mem: 3514
+ 2021-05-31 17:54:47,483 INFO torchdistill.misc.log Epoch: [2] [150/230] eta: 0:00:06 lr: 4.686956521739131e-05 sample/s: 51.06348993778838 loss: 0.0007 (0.0272) time: 0.0842 data: 0.0029 max mem: 3514
+ 2021-05-31 17:54:51,714 INFO torchdistill.misc.log Epoch: [2] [200/230] eta: 0:00:02 lr: 4.252173913043478e-05 sample/s: 50.24276854244841 loss: 0.0026 (0.0253) time: 0.0851 data: 0.0030 max mem: 3514
+ 2021-05-31 17:54:54,205 INFO torchdistill.misc.log Epoch: [2] Total time: 0:00:19
+ 2021-05-31 17:54:54,639 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+ 2021-05-31 17:54:54,639 INFO __main__ Validation: accuracy = 0.8529411764705882, f1 = 0.8932384341637011
+ 2021-05-31 17:54:54,639 INFO __main__ Updating ckpt at ./resource/ckpt/glue/mrpc/kd/mrpc-bert-base-uncased_from_bert-large-uncased
+ 2021-05-31 17:54:55,920 INFO torchdistill.misc.log Epoch: [3] [  0/230] eta: 0:00:22 lr: 3.991304347826087e-05 sample/s: 42.869448788315495 loss: 0.0016 (0.0016) time: 0.0967 data: 0.0034 max mem: 3514
+ 2021-05-31 17:55:00,189 INFO torchdistill.misc.log Epoch: [3] [ 50/230] eta: 0:00:15 lr: 3.556521739130435e-05 sample/s: 52.13343152700482 loss: 0.0001 (0.0136) time: 0.0830 data: 0.0027 max mem: 3514
+ 2021-05-31 17:55:04,403 INFO torchdistill.misc.log Epoch: [3] [100/230] eta: 0:00:11 lr: 3.121739130434783e-05 sample/s: 45.72906348890518 loss: 0.0002 (0.0123) time: 0.0849 data: 0.0028 max mem: 3514
+ 2021-05-31 17:55:08,613 INFO torchdistill.misc.log Epoch: [3] [150/230] eta: 0:00:06 lr: 2.6869565217391306e-05 sample/s: 46.72499255005389 loss: 0.0002 (0.0128) time: 0.0836 data: 0.0028 max mem: 3514
+ 2021-05-31 17:55:12,805 INFO torchdistill.misc.log Epoch: [3] [200/230] eta: 0:00:02 lr: 2.252173913043478e-05 sample/s: 50.039716295134184 loss: 0.0001 (0.0117) time: 0.0837 data: 0.0028 max mem: 3514
+ 2021-05-31 17:55:15,259 INFO torchdistill.misc.log Epoch: [3] Total time: 0:00:19
+ 2021-05-31 17:55:15,712 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+ 2021-05-31 17:55:15,713 INFO __main__ Validation: accuracy = 0.875, f1 = 0.9090909090909091
+ 2021-05-31 17:55:15,713 INFO __main__ Updating ckpt at ./resource/ckpt/glue/mrpc/kd/mrpc-bert-base-uncased_from_bert-large-uncased
+ 2021-05-31 17:55:17,017 INFO torchdistill.misc.log Epoch: [4] [  0/230] eta: 0:00:22 lr: 1.9913043478260872e-05 sample/s: 43.684523530545185 loss: 0.0001 (0.0001) time: 0.0963 data: 0.0048 max mem: 3514
+ 2021-05-31 17:55:21,223 INFO torchdistill.misc.log Epoch: [4] [ 50/230] eta: 0:00:15 lr: 1.5565217391304347e-05 sample/s: 46.87211121541281 loss: 0.0001 (0.0051) time: 0.0825 data: 0.0027 max mem: 3514
+ 2021-05-31 17:55:25,451 INFO torchdistill.misc.log Epoch: [4] [100/230] eta: 0:00:10 lr: 1.1217391304347827e-05 sample/s: 50.193014892865904 loss: 0.0000 (0.0061) time: 0.0839 data: 0.0028 max mem: 3514
+ 2021-05-31 17:55:29,638 INFO torchdistill.misc.log Epoch: [4] [150/230] eta: 0:00:06 lr: 6.869565217391305e-06 sample/s: 50.56851090507939 loss: 0.0001 (0.0062) time: 0.0832 data: 0.0028 max mem: 3514
+ 2021-05-31 17:55:33,890 INFO torchdistill.misc.log Epoch: [4] [200/230] eta: 0:00:02 lr: 2.5217391304347826e-06 sample/s: 46.181077480373915 loss: 0.0000 (0.0062) time: 0.0860 data: 0.0028 max mem: 3514
+ 2021-05-31 17:55:36,313 INFO torchdistill.misc.log Epoch: [4] Total time: 0:00:19
+ 2021-05-31 17:55:36,734 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+ 2021-05-31 17:55:36,735 INFO __main__ Validation: accuracy = 0.8676470588235294, f1 = 0.9042553191489361
+ 2021-05-31 17:55:36,774 INFO __main__ [Teacher: bert-large-uncased]
+ 2021-05-31 17:55:37,374 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+ 2021-05-31 17:55:37,375 INFO __main__ Test: accuracy = 0.8799019607843137, f1 = 0.9162393162393162
+ 2021-05-31 17:55:41,047 INFO __main__ [Student: bert-base-uncased]
+ 2021-05-31 17:55:41,477 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/mrpc/default_experiment-1-0.arrow
+ 2021-05-31 17:55:41,477 INFO __main__ Test: accuracy = 0.875, f1 = 0.9090909090909091
+ 2021-05-31 17:55:41,478 INFO __main__ Start prediction for private dataset(s)
+ 2021-05-31 17:55:41,478 INFO __main__ mrpc/test: 1725 samples
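
If you want the per-epoch validation scores out of this log programmatically, a small regex sketch works (assuming the log is saved locally as `training.log`):

```python
import re

# Matches lines like: "Validation: accuracy = 0.8137..., f1 = 0.8758..."
pattern = re.compile(r"Validation: accuracy = ([\d.]+), f1 = ([\d.]+)")

with open("training.log") as f:
    scores = [(float(acc), float(f1)) for acc, f1 in pattern.findall(f.read())]

for epoch, (acc, f1) in enumerate(scores):
    print(f"epoch {epoch}: accuracy={acc:.4f} f1={f1:.4f}")
# Per the log, the best checkpoint (epoch 3) reached accuracy 0.8750, f1 0.9091 on the dev set.
```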
vocab.txt ADDED
The diff for this file is too large to render. See raw diff