yoshitomo-matsubara committed
Commit 8fc0be2
1 Parent(s): 8b7b239

initial commit

README.md ADDED
@@ -0,0 +1,18 @@
+ ---
+ language: en
+ tags:
+ - bert
+ - stsb
+ - glue
+ - torchdistill
+ license: apache-2.0
+ datasets:
+ - stsb
+ metrics:
+ - pearson correlation
+ - spearman correlation
+ ---
+
+ `bert-base-uncased` fine-tuned on the STS-B dataset, using [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_finetuning_and_submission.ipynb).
+ The hyperparameters are the same as those in Hugging Face's example and/or the BERT paper, and the training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/stsb/mse/bert_base_uncased.yaml).
+ I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **77.9**.
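For quick use, here is a minimal inference sketch (not part of this commit). The Hub repo id below is an assumption, not stated anywhere in this commit; substitute the actual id. Because `config.json` declares a single-label regression head, the raw logit is the predicted similarity score (STS-B scores range from 0 to 5):

```python
# Minimal sketch, assuming the model is published on the Hub under the
# repo id below (an assumption, not stated in this commit).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "yoshitomo-matsubara/bert-base-uncased-stsb"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# STS-B is sentence-pair regression: predict a similarity score in [0, 5].
inputs = tokenizer("A man is playing a guitar.",
                   "A person plays a guitar.",
                   return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted similarity: {score:.2f}")
```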
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "_name_or_path": "bert-base-uncased",
+   "architectures": [
+     "BertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "finetuning_task": "stsb",
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "problem_type": "regression",
+   "transformers_version": "4.6.1",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 30522
+ }
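Note that `id2label` has a single entry, so `num_labels` resolves to 1, and with `problem_type: "regression"` `BertForSequenceClassification` applies an MSE loss when labels are provided — consistent with the `mse` directory in the torchdistill config path. A sketch of rebuilding the architecture (randomly initialized, no checkpoint) from this file alone:

```python
# Sketch: instantiate the architecture from config.json only; weights are
# randomly initialized here (loading the trained checkpoint is shown above).
from transformers import BertConfig, BertForSequenceClassification

config = BertConfig.from_json_file("config.json")
model = BertForSequenceClassification(config)
print(config.num_labels, config.problem_type)  # -> 1 regression
```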
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:93c920a7fb105287e463475f7ab7785a5943d4eaf0bf42b4100e1f6a52419195
+ size 438021385
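The checkpoint itself is stored with Git LFS, so the commit only records this pointer file: the LFS spec version, the SHA-256 object id, and the size in bytes (~438 MB). A small sketch for verifying a downloaded copy against the pointer:

```python
# Sketch: verify a downloaded pytorch_model.bin against the LFS pointer above.
import hashlib
import os

expected_oid = "93c920a7fb105287e463475f7ab7785a5943d4eaf0bf42b4100e1f6a52419195"
expected_size = 438021385

path = "pytorch_model.bin"
h = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)
assert os.path.getsize(path) == expected_size, "size mismatch"
assert h.hexdigest() == expected_oid, "sha256 mismatch"
print("checkpoint matches the LFS pointer")
```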
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "do_lower": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-uncased"}
training.log ADDED
@@ -0,0 +1,42 @@
+ 2021-05-27 23:26:01,880 INFO __main__ Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/stsb/mse/bert_base_uncased.yaml', log='log/glue/stsb/mse/bert_base_uncased.txt', private_output='leaderboard/glue/standard/bert_base_uncased/', seed=None, student_only=False, task_name='stsb', test_only=False, world_size=1)
+ 2021-05-27 23:26:01,909 INFO __main__ Distributed environment: NO
+ Num processes: 1
+ Process index: 0
+ Local process index: 0
+ Device: cuda
+ Use FP16 precision: True
+
+ 2021-05-27 23:26:07,015 WARNING datasets.builder Reusing dataset glue (/root/.cache/huggingface/datasets/glue/stsb/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
+ 2021-05-27 23:26:08,583 INFO __main__ Start training
+ 2021-05-27 23:26:08,583 INFO torchdistill.models.util [student model]
+ 2021-05-27 23:26:08,583 INFO torchdistill.models.util Using the original student model
+ 2021-05-27 23:26:08,583 INFO torchdistill.core.training Loss = 1.0 * OrgLoss
+ 2021-05-27 23:26:11,276 INFO torchdistill.misc.log Epoch: [0] [  0/180] eta: 0:00:49 lr: 4.9907407407407406e-05 sample/s: 14.701703069204987 loss: 15.5303 (15.5303) time: 0.2778 data: 0.0057 max mem: 2057
+ 2021-05-27 23:26:23,734 INFO torchdistill.misc.log Epoch: [0] [ 50/180] eta: 0:00:32 lr: 4.527777777777778e-05 sample/s: 14.820613434933513 loss: 1.4726 (5.8169) time: 0.2571 data: 0.0034 max mem: 3684
+ 2021-05-27 23:26:36,370 INFO torchdistill.misc.log Epoch: [0] [100/180] eta: 0:00:20 lr: 4.064814814814815e-05 sample/s: 14.914340322071642 loss: 0.6186 (3.3036) time: 0.2533 data: 0.0034 max mem: 3887
+ 2021-05-27 23:26:48,629 INFO torchdistill.misc.log Epoch: [0] [150/180] eta: 0:00:07 lr: 3.601851851851852e-05 sample/s: 18.040405170003655 loss: 0.5369 (2.4056) time: 0.2418 data: 0.0034 max mem: 4447
+ 2021-05-27 23:26:55,955 INFO torchdistill.misc.log Epoch: [0] Total time: 0:00:44
+ 2021-05-27 23:26:58,859 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/stsb/default_experiment-1-0.arrow
+ 2021-05-27 23:26:58,859 INFO __main__ Validation: pearson = 0.881096812440957, spearmanr = 0.877532796405759
+ 2021-05-27 23:26:58,860 INFO __main__ Updating ckpt at ./resource/ckpt/glue/stsb/mse/stsb-bert-base-uncased
+ 2021-05-27 23:27:00,150 INFO torchdistill.misc.log Epoch: [1] [  0/180] eta: 0:00:52 lr: 3.3240740740740746e-05 sample/s: 13.832772125389575 loss: 0.4722 (0.4722) time: 0.2935 data: 0.0043 max mem: 4447
+ 2021-05-27 23:27:12,625 INFO torchdistill.misc.log Epoch: [1] [ 50/180] eta: 0:00:32 lr: 2.861111111111111e-05 sample/s: 16.265235191865646 loss: 0.3779 (0.3652) time: 0.2371 data: 0.0035 max mem: 4447
+ 2021-05-27 23:27:24,974 INFO torchdistill.misc.log Epoch: [1] [100/180] eta: 0:00:19 lr: 2.398148148148148e-05 sample/s: 20.118664656864684 loss: 0.3085 (0.3523) time: 0.2422 data: 0.0035 max mem: 4447
+ 2021-05-27 23:27:37,643 INFO torchdistill.misc.log Epoch: [1] [150/180] eta: 0:00:07 lr: 1.9351851851851853e-05 sample/s: 13.83483675399673 loss: 0.3442 (0.3544) time: 0.2645 data: 0.0035 max mem: 4448
+ 2021-05-27 23:27:44,984 INFO torchdistill.misc.log Epoch: [1] Total time: 0:00:45
+ 2021-05-27 23:27:47,885 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/stsb/default_experiment-1-0.arrow
+ 2021-05-27 23:27:47,885 INFO __main__ Validation: pearson = 0.8871473366574754, spearmanr = 0.8842887289343818
+ 2021-05-27 23:27:47,885 INFO __main__ Updating ckpt at ./resource/ckpt/glue/stsb/mse/stsb-bert-base-uncased
+ 2021-05-27 23:27:49,860 INFO torchdistill.misc.log Epoch: [2] [  0/180] eta: 0:00:45 lr: 1.6574074074074075e-05 sample/s: 16.07903226115608 loss: 0.1846 (0.1846) time: 0.2531 data: 0.0043 max mem: 4448
+ 2021-05-27 23:28:01,803 INFO torchdistill.misc.log Epoch: [2] [ 50/180] eta: 0:00:31 lr: 1.1944444444444446e-05 sample/s: 16.289887718998443 loss: 0.1593 (0.1702) time: 0.2374 data: 0.0033 max mem: 4448
+ 2021-05-27 23:28:14,559 INFO torchdistill.misc.log Epoch: [2] [100/180] eta: 0:00:19 lr: 7.314814814814815e-06 sample/s: 11.970558100375301 loss: 0.1566 (0.1677) time: 0.2591 data: 0.0035 max mem: 4448
+ 2021-05-27 23:28:27,138 INFO torchdistill.misc.log Epoch: [2] [150/180] eta: 0:00:07 lr: 2.685185185185185e-06 sample/s: 16.26870508515336 loss: 0.1590 (0.1644) time: 0.2500 data: 0.0036 max mem: 4448
+ 2021-05-27 23:28:34,498 INFO torchdistill.misc.log Epoch: [2] Total time: 0:00:44
+ 2021-05-27 23:28:37,405 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/stsb/default_experiment-1-0.arrow
+ 2021-05-27 23:28:37,406 INFO __main__ Validation: pearson = 0.8888187382563609, spearmanr = 0.885554195504554
+ 2021-05-27 23:28:37,406 INFO __main__ Updating ckpt at ./resource/ckpt/glue/stsb/mse/stsb-bert-base-uncased
+ 2021-05-27 23:28:42,235 INFO __main__ [Student: bert-base-uncased]
+ 2021-05-27 23:28:45,145 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/stsb/default_experiment-1-0.arrow
+ 2021-05-27 23:28:45,145 INFO __main__ Test: pearson = 0.8888187382563609, spearmanr = 0.885554195504554
+ 2021-05-27 23:28:45,145 INFO __main__ Start prediction for private dataset(s)
+ 2021-05-27 23:28:45,146 INFO __main__ stsb/test: 1379 samples
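Two details worth noting in the log. First, the validation metrics (`pearson`, `spearmanr`) come from the `glue` metric in the `datasets` library, which for STS-B computes `scipy.stats.pearsonr` and `spearmanr` between predictions and gold scores. Second, the logged learning rates are consistent with a linear decay from a 5e-5 peak over 3 epochs × 180 steps = 540 optimizer steps, with the scheduler stepping before each log line — this is a reading of the numbers, not something stated in the log:

```python
# Sketch: reproduce the logged learning rates under the assumption of
# linear decay from 5e-5 over 3 epochs x 180 steps = 540 total steps.
base_lr, total_steps = 5e-5, 3 * 180

def lr_at(global_step: int) -> float:
    return base_lr * (total_steps - global_step) / total_steps

print(lr_at(1))    # ~4.99074e-05, matches Epoch [0] [  0/180]
print(lr_at(51))   # ~4.52778e-05, matches Epoch [0] [ 50/180]
print(lr_at(511))  # ~2.68519e-06, matches Epoch [2] [150/180]
```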
vocab.txt ADDED
The diff for this file is too large to render. See raw diff