yoshitomo-matsubara committed
Commit
18e5eba
1 Parent(s): ef95bed

initial commit

README.md ADDED
@@ -0,0 +1,17 @@
+ ---
+ language: en
+ tags:
+ - bert
+ - rte
+ - glue
+ - torchdistill
+ license: apache-2.0
+ datasets:
+ - rte
+ metrics:
+ - accuracy
+ ---
+
+ `bert-base-uncased` fine-tuned on the RTE dataset, using [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_finetuning_and_submission.ipynb).
+ The hyperparameters are the same as those in Hugging Face's example and/or the BERT paper, and the training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/rte/ce/bert_base_uncased.yaml).
+ I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **77.9**.
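For reference, a minimal inference sketch with 🤗 Transformers follows. The Hub model ID below is an assumption (this commit does not name the repository); in the 🤗 Datasets GLUE/RTE convention, label 0 means entailment and 1 means not_entailment.

```python
# Minimal inference sketch, assuming the checkpoint is published as
# "yoshitomo-matsubara/bert-base-uncased-rte" (hypothetical ID; substitute
# the actual repository path). RTE pairs a premise with a hypothesis.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "yoshitomo-matsubara/bert-base-uncased-rte"  # assumed ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

premise = "A man is playing a guitar on stage."
hypothesis = "A man is performing music."
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits
# GLUE/RTE convention: 0 = entailment, 1 = not_entailment
print(logits.argmax(dim=-1).item())
```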
config.json ADDED
@@ -0,0 +1,26 @@
+ {
+   "_name_or_path": "bert-base-uncased",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "finetuning_task": "rte",
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "single_label_classification",
+   "transformers_version": "4.6.1",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 30522
+ }
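The config declares a standard 12-layer `bert-base` encoder with a `BertForSequenceClassification` head for single-label classification. As a sanity check, a sketch that rebuilds the architecture from this file alone (random weights, no checkpoint; the local path is an assumption):

```python
# Sketch: instantiate the architecture described by config.json without
# loading the checkpoint; weights are randomly initialized here.
from transformers import BertConfig, BertForSequenceClassification

config = BertConfig.from_json_file("config.json")  # assumed local path
model = BertForSequenceClassification(config)
# bert-base: ~110M parameters (12 layers, hidden size 768, 12 heads)
print(f"{sum(p.numel() for p in model.parameters()):,} parameters")
```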
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b6f7de7e563064d7239c48e250e0cdfe4ecd69d808558e620f56e2806f84e383
+ size 438024457
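The weights are stored via Git LFS; the pointer records the object's sha256 and byte size. A sketch that verifies a downloaded copy against the pointer (the local path is an assumption):

```python
# Verify a downloaded pytorch_model.bin against the LFS pointer above.
import hashlib
import os

path = "pytorch_model.bin"  # assumed local path
expected_oid = "b6f7de7e563064d7239c48e250e0cdfe4ecd69d808558e620f56e2806f84e383"
expected_size = 438024457

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        digest.update(chunk)

assert os.path.getsize(path) == expected_size, "size mismatch"
assert digest.hexdigest() == expected_oid, "sha256 mismatch"
print("pytorch_model.bin matches the LFS pointer")
```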
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "do_lower": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-uncased"}
training.log ADDED
@@ -0,0 +1,49 @@
+ 2021-05-29 21:08:05,621 INFO __main__ Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/rte/ce/bert_base_uncased.yaml', log='log/glue/rte/ce/bert_base_uncased.txt', private_output='leaderboard/glue/standard/bert_base_uncased/', seed=None, student_only=False, task_name='rte', test_only=False, world_size=1)
+ 2021-05-29 21:08:05,650 INFO __main__ Distributed environment: NO
+ Num processes: 1
+ Process index: 0
+ Local process index: 0
+ Device: cuda
+ Use FP16 precision: True
+
+ 2021-05-29 21:08:10,724 WARNING datasets.builder Reusing dataset glue (/root/.cache/huggingface/datasets/glue/rte/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
+ 2021-05-29 21:08:12,480 INFO __main__ Start training
+ 2021-05-29 21:08:12,480 INFO torchdistill.models.util [student model]
+ 2021-05-29 21:08:12,481 INFO torchdistill.models.util Using the original student model
+ 2021-05-29 21:08:12,481 INFO torchdistill.core.training Loss = 1.0 * OrgLoss
+ 2021-05-29 21:08:15,195 INFO torchdistill.misc.log Epoch: [0] [ 0/156] eta: 0:00:38 lr: 4.9919871794871795e-05 sample/s: 16.491046863772947 loss: 0.7014 (0.7014) time: 0.2471 data: 0.0046 max mem: 1900
+ 2021-05-29 21:08:26,519 INFO torchdistill.misc.log Epoch: [0] [ 50/156] eta: 0:00:24 lr: 4.591346153846154e-05 sample/s: 17.636260618464803 loss: 0.6778 (0.6873) time: 0.2286 data: 0.0028 max mem: 3189
+ 2021-05-29 21:08:37,850 INFO torchdistill.misc.log Epoch: [0] [100/156] eta: 0:00:12 lr: 4.190705128205128e-05 sample/s: 17.63246088265216 loss: 0.6690 (0.6821) time: 0.2286 data: 0.0028 max mem: 3189
+ 2021-05-29 21:08:49,200 INFO torchdistill.misc.log Epoch: [0] [150/156] eta: 0:00:01 lr: 3.790064102564103e-05 sample/s: 17.644718589450324 loss: 0.6526 (0.6783) time: 0.2262 data: 0.0027 max mem: 3189
+ 2021-05-29 21:08:50,282 INFO torchdistill.misc.log Epoch: [0] Total time: 0:00:35
+ 2021-05-29 21:08:51,602 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/rte/default_experiment-1-0.arrow
+ 2021-05-29 21:08:51,603 INFO __main__ Validation: accuracy = 0.6101083032490975
+ 2021-05-29 21:08:51,603 INFO __main__ Updating ckpt at ./resource/ckpt/glue/rte/ce/rte-bert-base-uncased
+ 2021-05-29 21:08:52,924 INFO torchdistill.misc.log Epoch: [1] [ 0/156] eta: 0:00:35 lr: 3.7419871794871796e-05 sample/s: 17.58968347993835 loss: 0.6698 (0.6698) time: 0.2306 data: 0.0032 max mem: 3189
+ 2021-05-29 21:09:04,238 INFO torchdistill.misc.log Epoch: [1] [ 50/156] eta: 0:00:23 lr: 3.3413461538461536e-05 sample/s: 17.62684768527809 loss: 0.5938 (0.6194) time: 0.2288 data: 0.0027 max mem: 3189
+ 2021-05-29 21:09:15,629 INFO torchdistill.misc.log Epoch: [1] [100/156] eta: 0:00:12 lr: 2.9407051282051283e-05 sample/s: 17.62131209182248 loss: 0.6079 (0.6134) time: 0.2296 data: 0.0027 max mem: 3189
+ 2021-05-29 21:09:27,017 INFO torchdistill.misc.log Epoch: [1] [150/156] eta: 0:00:01 lr: 2.5400641025641026e-05 sample/s: 17.635037110148975 loss: 0.5751 (0.6014) time: 0.2262 data: 0.0026 max mem: 3189
+ 2021-05-29 21:09:28,078 INFO torchdistill.misc.log Epoch: [1] Total time: 0:00:35
+ 2021-05-29 21:09:29,398 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/rte/default_experiment-1-0.arrow
+ 2021-05-29 21:09:29,398 INFO __main__ Validation: accuracy = 0.6678700361010831
+ 2021-05-29 21:09:29,399 INFO __main__ Updating ckpt at ./resource/ckpt/glue/rte/ce/rte-bert-base-uncased
+ 2021-05-29 21:09:30,853 INFO torchdistill.misc.log Epoch: [2] [ 0/156] eta: 0:00:36 lr: 2.4919871794871797e-05 sample/s: 17.44987373264604 loss: 0.5368 (0.5368) time: 0.2326 data: 0.0033 max mem: 3189
+ 2021-05-29 21:09:42,217 INFO torchdistill.misc.log Epoch: [2] [ 50/156] eta: 0:00:24 lr: 2.091346153846154e-05 sample/s: 17.61959102743042 loss: 0.4061 (0.4451) time: 0.2270 data: 0.0026 max mem: 3189
+ 2021-05-29 21:09:53,636 INFO torchdistill.misc.log Epoch: [2] [100/156] eta: 0:00:12 lr: 1.6907051282051284e-05 sample/s: 17.60167862688321 loss: 0.3714 (0.4380) time: 0.2293 data: 0.0028 max mem: 3189
+ 2021-05-29 21:10:05,060 INFO torchdistill.misc.log Epoch: [2] [150/156] eta: 0:00:01 lr: 1.2900641025641028e-05 sample/s: 17.62479225977246 loss: 0.4438 (0.4566) time: 0.2286 data: 0.0028 max mem: 3189
+ 2021-05-29 21:10:06,144 INFO torchdistill.misc.log Epoch: [2] Total time: 0:00:35
+ 2021-05-29 21:10:07,463 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/rte/default_experiment-1-0.arrow
+ 2021-05-29 21:10:07,463 INFO __main__ Validation: accuracy = 0.6642599277978339
+ 2021-05-29 21:10:07,694 INFO torchdistill.misc.log Epoch: [3] [ 0/156] eta: 0:00:35 lr: 1.2419871794871796e-05 sample/s: 17.64061842835753 loss: 0.6059 (0.6059) time: 0.2297 data: 0.0029 max mem: 3189
+ 2021-05-29 21:10:19,117 INFO torchdistill.misc.log Epoch: [3] [ 50/156] eta: 0:00:24 lr: 8.41346153846154e-06 sample/s: 17.56041029935106 loss: 0.2700 (0.2802) time: 0.2287 data: 0.0027 max mem: 3189
+ 2021-05-29 21:10:30,490 INFO torchdistill.misc.log Epoch: [3] [100/156] eta: 0:00:12 lr: 4.4070512820512826e-06 sample/s: 18.516447386573457 loss: 0.2417 (0.2638) time: 0.2275 data: 0.0028 max mem: 3189
+ 2021-05-29 21:10:41,907 INFO torchdistill.misc.log Epoch: [3] [150/156] eta: 0:00:01 lr: 4.006410256410256e-07 sample/s: 18.52875802484679 loss: 0.1854 (0.2611) time: 0.2272 data: 0.0026 max mem: 3189
+ 2021-05-29 21:10:42,988 INFO torchdistill.misc.log Epoch: [3] Total time: 0:00:35
+ 2021-05-29 21:10:44,306 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/rte/default_experiment-1-0.arrow
+ 2021-05-29 21:10:44,307 INFO __main__ Validation: accuracy = 0.6750902527075813
+ 2021-05-29 21:10:44,307 INFO __main__ Updating ckpt at ./resource/ckpt/glue/rte/ce/rte-bert-base-uncased
+ 2021-05-29 21:10:49,239 INFO __main__ [Student: bert-base-uncased]
+ 2021-05-29 21:10:50,573 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/rte/default_experiment-1-0.arrow
+ 2021-05-29 21:10:50,574 INFO __main__ Test: accuracy = 0.6750902527075813
+ 2021-05-29 21:10:50,574 INFO __main__ Start prediction for private dataset(s)
+ 2021-05-29 21:10:50,575 INFO __main__ rte/test: 3000 samples
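The logged learning rates are consistent with a linear decay from 5e-5 to 0 over 4 epochs × 156 steps = 624 optimizer steps (which matches RTE's 2,490 training examples at batch size 16). A sketch that reproduces the logged values, assuming the scheduler has stepped once by each logged iteration, so the value at Epoch [e] [s/156] corresponds to global step 156·e + s + 1:

```python
# Reproduce the learning rates in the log above under the assumption of a
# linear schedule: lr(step) = 5e-5 * (1 - step / 624).
TOTAL_STEPS = 4 * 156  # 4 epochs x 156 steps per epoch
BASE_LR = 5e-5

def lr_at(step: int) -> float:
    return BASE_LR * (1 - step / TOTAL_STEPS)

print(lr_at(1))    # 4.9919871794871795e-05 -> Epoch [0] [  0/156]
print(lr_at(51))   # 4.591346153846154e-05  -> Epoch [0] [ 50/156]
print(lr_at(619))  # 4.0064102564102564e-07 -> Epoch [3] [150/156]
```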
vocab.txt ADDED
The diff for this file is too large to render. See raw diff