lapp0 committed
Commit: d1b306f
Parent: bf5fbb3

End of training
README.md ADDED
@@ -0,0 +1,137 @@
+ ---
+ base_model: gpt2
+ library_name: Distily
+ license: mit
+ tags:
+ - bitnet
+ - 1.58b
+ - generated_from_trainer
+ model-index:
+ - name: distily_bitnet_gpt2
+   results: []
+ ---
+
+ # distily_bitnet_gpt2
+
+ This student model was distilled from the teacher model [gpt2](https://huggingface.co/gpt2) on an unspecified dataset.
+
+ The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
+
+ It achieves the following results on the evaluation set:
+ - eval_enwikippl: 87.5
+ - eval_frwikippl: 358.0
+ - eval_zhwikippl: 139.0
+ - eval_tinystoriesppl: 72.5
+ - eval_loss: 0.6931
+ - eval_runtime: 29.8206
+ - eval_samples_per_second: 83.835
+ - eval_steps_per_second: 10.496
+
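+ The snippet below is a minimal usage sketch. It assumes this repository is published as `lapp0/distily_bitnet_gpt2` (inferred from the model name above, not stated in this card):
+
+ ```python
+ # Minimal sketch: load the distilled student and generate text.
+ # The repo id "lapp0/distily_bitnet_gpt2" is an assumption.
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained("lapp0/distily_bitnet_gpt2")
+ model = AutoModelForCausalLM.from_pretrained("lapp0/distily_bitnet_gpt2")
+
+ inputs = tokenizer("Distillation compresses a teacher model", return_tensors="pt")
+ outputs = model.generate(**inputs, max_new_tokens=20)
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```
+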
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment.
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+ -->
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (the logits-only KL objective is sketched after this list):
+ - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=0, loss_fn=None, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
+ - train_embeddings: True
+ - learning_rate: 0.0001
+ - train_batch_size: 4
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: constant
+ - lr_scheduler_warmup_ratio: 0.2
+ - num_epochs: 1.0
+
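+ Per the `distillation_objective` above, only the logits component carries weight (the hidden-state and attention components have weight 0), and its `loss_fn` is KL. A minimal PyTorch sketch of such a logits-only KL loss, as an illustration rather than Distily's actual implementation (function name and signature are assumptions):
+
+ ```python
+ # Illustrative sketch of a logits-only KL distillation loss.
+ # Not Distily's code; names here are placeholders.
+ import torch.nn.functional as F
+
+ def logits_kl_loss(student_logits, teacher_logits):
+     # KL(teacher || student) over the vocabulary, averaged per batch.
+     student_log_probs = F.log_softmax(student_logits, dim=-1)
+     teacher_log_probs = F.log_softmax(teacher_logits, dim=-1)
+     return F.kl_div(
+         student_log_probs, teacher_log_probs,
+         log_target=True, reduction="batchmean",
+     )
+ ```
+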
+ ### Resource Usage
+ Peak GPU Memory: 7.5008 GB
+
+ ### Eval-Phase Metrics
+ | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
+ | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
+ | **teacher eval** | | 43.25 | 61.25 | | | | | 11.6875 | 19.125 |
+ | 0 | 0 | 820338753536.0 | 43705587204096.0 | 18.6434 | 30.0294 | 83.252 | 10.423 | 4731174912.0 | 17729624997888.0 |
+ | 1000 | 0.0162 | 324.0 | 1576.0 | 1.4569 | 29.8145 | 83.852 | 10.498 | 258.0 | 386.0 |
+ | 2000 | 0.0323 | 220.0 | 844.0 | 1.2562 | 29.8871 | 83.648 | 10.473 | 184.0 | 203.0 |
+ | 3000 | 0.0485 | 182.0 | 628.0 | 1.1014 | 29.8663 | 83.706 | 10.48 | 141.0 | 178.0 |
+ | 4000 | 0.0646 | 148.0 | 520.0 | 0.9878 | 29.8318 | 83.803 | 10.492 | 121.0 | 162.0 |
+ | 5000 | 0.0808 | 130.0 | 456.0 | 0.9061 | 29.8914 | 83.636 | 10.471 | 103.5 | 150.0 |
+ | 6000 | 0.0970 | 117.0 | 426.0 | 0.8448 | 29.8301 | 83.808 | 10.493 | 95.5 | 165.0 |
+ | 7000 | 0.1131 | 105.0 | 460.0 | 0.7878 | 29.8233 | 83.827 | 10.495 | 86.0 | 150.0 |
+ | 8000 | 0.1293 | 98.5 | 396.0 | 0.7433 | 29.8713 | 83.692 | 10.478 | 78.0 | 143.0 |
+ | 9000 | 0.1455 | 87.5 | 358.0 | 0.6931 | 29.8206 | 83.835 | 10.496 | 72.5 | 139.0 |
+ | 10000 | 0.1616 | 82.0 | 340.0 | 0.6355 | 29.8348 | 83.795 | 10.491 | 67.5 | 132.0 |
+ | 11000 | 0.1778 | 77.5 | 330.0 | 0.5981 | 29.8369 | 83.789 | 10.49 | 60.75 | 113.0 |
+ | 12000 | 0.1939 | 75.0 | 286.0 | 0.5715 | 29.8463 | 83.762 | 10.487 | 62.0 | 152.0 |
+ | 13000 | 0.2101 | 73.0 | 249.0 | 0.5484 | 29.8498 | 83.753 | 10.486 | 55.5 | 141.0 |
+ | 14000 | 0.2263 | 72.5 | 245.0 | 0.5344 | 29.8153 | 83.85 | 10.498 | 54.75 | 85.5 |
+ | 15000 | 0.2424 | 73.0 | 246.0 | 0.5171 | 29.8338 | 83.798 | 10.491 | 55.5 | 87.0 |
+ | 16000 | 0.2586 | 70.5 | 237.0 | 0.5125 | 29.8543 | 83.74 | 10.484 | 52.5 | 92.0 |
+ | 17000 | 0.2747 | 70.0 | 219.0 | 0.4954 | 29.8236 | 83.826 | 10.495 | 56.25 | 160.0 |
+ | 18000 | 0.2909 | 67.5 | 250.0 | 0.5031 | 29.8194 | 83.838 | 10.497 | 52.5 | 173.0 |
+ | 19000 | 0.3071 | 72.0 | 223.0 | 0.4795 | 29.8542 | 83.74 | 10.484 | 51.5 | 151.0 |
+ | 20000 | 0.3232 | 68.0 | 218.0 | 0.4735 | 29.8718 | 83.691 | 10.478 | 52.0 | 151.0 |
+ | 21000 | 0.3394 | 67.5 | 221.0 | 0.4795 | 29.8655 | 83.709 | 10.48 | 52.5 | 190.0 |
+ | 22000 | 0.3556 | 68.5 | 223.0 | 0.4733 | 29.8778 | 83.674 | 10.476 | 52.0 | 96.0 |
+ | 23000 | 0.3717 | 69.0 | 204.0 | 0.4633 | 29.8215 | 83.832 | 10.496 | 48.75 | 104.0 |
+ | 24000 | 0.3879 | 66.0 | 222.0 | 0.4587 | 29.843 | 83.772 | 10.488 | 50.0 | 122.0 |
+ | 25000 | 0.4040 | 67.0 | 216.0 | 0.4568 | 29.8561 | 83.735 | 10.484 | 48.75 | 92.0 |
+ | 26000 | 0.4202 | 70.0 | 214.0 | 0.4556 | 29.8665 | 83.706 | 10.48 | 49.0 | 103.5 |
+ | 27000 | 0.4364 | 66.0 | 220.0 | 0.4601 | 29.8646 | 83.711 | 10.481 | 48.5 | 95.5 |
+ | 28000 | 0.4525 | 65.0 | 205.0 | 0.4516 | 29.8541 | 83.741 | 10.484 | 46.5 | 150.0 |
+ | 29000 | 0.4687 | 66.5 | 223.0 | 0.4496 | 29.8307 | 83.806 | 10.493 | 46.5 | 102.5 |
+ | 30000 | 0.4848 | 66.5 | 237.0 | 0.4509 | 29.8678 | 83.702 | 10.48 | 46.25 | 137.0 |
+ | 31000 | 0.5010 | 64.5 | 219.0 | 0.4445 | 29.851 | 83.749 | 10.485 | 46.0 | 97.5 |
+ | 32000 | 0.5172 | 64.0 | 200.0 | 0.4380 | 29.8955 | 83.625 | 10.47 | 49.25 | 101.0 |
+ | 33000 | 0.5333 | 64.5 | 204.0 | 0.4379 | 29.838 | 83.786 | 10.49 | 49.0 | 85.5 |
+ | 34000 | 0.5495 | 64.0 | 217.0 | 0.4419 | 29.8427 | 83.773 | 10.488 | 46.25 | 76.0 |
+ | 35000 | 0.5657 | 72.5 | 229.0 | 0.4345 | 29.8803 | 83.667 | 10.475 | 50.0 | 128.0 |
+ | 36000 | 0.5818 | 67.5 | 203.0 | 0.4349 | 30.0752 | 83.125 | 10.407 | 45.0 | 147.0 |
+ | 37000 | 0.5980 | 65.5 | 205.0 | 0.4354 | 29.8558 | 83.736 | 10.484 | 47.75 | 129.0 |
+ | 38000 | 0.6141 | 63.75 | 208.0 | 0.4375 | 29.868 | 83.702 | 10.479 | 46.0 | 108.5 |
+ | 39000 | 0.6303 | 64.0 | 215.0 | 0.4395 | 30.2231 | 82.718 | 10.356 | 45.5 | 125.0 |
+ | 40000 | 0.6465 | 64.5 | 197.0 | 0.4278 | 29.9055 | 83.597 | 10.466 | 46.0 | 84.5 |
+ | 41000 | 0.6626 | 62.25 | 186.0 | 0.4285 | 29.951 | 83.47 | 10.45 | 44.75 | 80.0 |
+ | 42000 | 0.6788 | 62.75 | 225.0 | 0.4301 | 29.835 | 83.794 | 10.491 | 46.25 | 168.0 |
+ | 43000 | 0.6949 | 65.5 | 224.0 | 0.4222 | 29.874 | 83.685 | 10.477 | 46.5 | 139.0 |
+ | 44000 | 0.7111 | 63.5 | 197.0 | 0.4294 | 29.9084 | 83.589 | 10.465 | 45.75 | 125.5 |
+ | 45000 | 0.7273 | 63.0 | 192.0 | 0.4263 | 29.8797 | 83.669 | 10.475 | 46.25 | 95.0 |
+ | 46000 | 0.7434 | 63.25 | 198.0 | 0.4266 | 29.8479 | 83.758 | 10.487 | 44.75 | 120.5 |
+ | 47000 | 0.7596 | 64.5 | 213.0 | 0.4247 | 29.8769 | 83.677 | 10.476 | 44.5 | 120.5 |
+ | 48000 | 0.7758 | 62.25 | 202.0 | 0.4214 | 29.8514 | 83.748 | 10.485 | 42.75 | 83.5 |
+ | 49000 | 0.7919 | 63.75 | 204.0 | 0.4230 | 29.8895 | 83.641 | 10.472 | 46.25 | 94.5 |
+ | 50000 | 0.8081 | 63.75 | 209.0 | 0.4218 | 29.9008 | 83.61 | 10.468 | 45.25 | 131.0 |
+ | 51000 | 0.8242 | 65.5 | 223.0 | 0.4213 | 29.8534 | 83.743 | 10.485 | 45.0 | 233.0 |
+ | 52000 | 0.8404 | 64.5 | 195.0 | 0.4132 | 29.8416 | 83.776 | 10.489 | 44.0 | 99.0 |
+ | 53000 | 0.8566 | 64.0 | 216.0 | 0.4259 | 29.8576 | 83.731 | 10.483 | 45.5 | 95.0 |
+ | 54000 | 0.8727 | 65.0 | 207.0 | 0.4207 | 29.8695 | 83.698 | 10.479 | 45.5 | 126.0 |
+ | 55000 | 0.8889 | 66.5 | 198.0 | 0.4141 | 29.8307 | 83.806 | 10.493 | 42.75 | 118.0 |
+ | 56000 | 0.9051 | 60.0 | 186.0 | 0.4209 | 29.866 | 83.707 | 10.48 | 43.75 | 142.0 |
+ | 57000 | 0.9212 | 62.25 | 192.0 | 0.4143 | 29.9063 | 83.594 | 10.466 | 45.0 | 78.0 |
+ | 58000 | 0.9374 | 63.5 | 205.0 | 0.4192 | 29.859 | 83.727 | 10.483 | 44.75 | 117.5 |
+ | 59000 | 0.9535 | 62.75 | 191.0 | 0.4202 | 29.8691 | 83.699 | 10.479 | 44.0 | 100.0 |
+ | 60000 | 0.9697 | 66.0 | 219.0 | 0.4149 | 29.9387 | 83.504 | 10.455 | 43.75 | 130.0 |
+ | 61000 | 0.9859 | 64.5 | 207.0 | 0.4162 | 29.8366 | 83.79 | 10.49 | 43.5 | 161.0 |
+ | 61875 | 1.0 | 61.5 | 204.0 | 0.4125 | 29.9423 | 83.494 | 10.453 | 44.25 | 223.0 |
+
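+ The `*ppl` columns report perplexity on the respective evaluation corpora (enwiki, frwiki, zhwiki, TinyStories). For reference, perplexity on a text sample follows from the causal-LM loss; a minimal sketch, illustrative only since the exact corpora and windowing used for the table are not specified here:
+
+ ```python
+ # Sketch: perplexity of a causal LM on one text sample.
+ # Not the exact evaluation protocol behind the table above.
+ import torch
+
+ def perplexity(model, tokenizer, text):
+     inputs = tokenizer(text, return_tensors="pt")
+     with torch.no_grad():
+         # HF causal LMs shift labels internally and return mean token loss.
+         loss = model(**inputs, labels=inputs["input_ids"]).loss
+     return float(torch.exp(loss))
+ ```
+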
+ ### Framework versions
+ - Distily 0.2.0
+ - Transformers 4.44.0
+ - PyTorch 2.3.0
+ - Datasets 2.21.0
config.json ADDED
@@ -0,0 +1,39 @@
+ {
+   "_name_or_path": "gpt2",
+   "activation_function": "gelu_new",
+   "architectures": [
+     "GPT2LMHeadModel"
+   ],
+   "attn_pdrop": 0.1,
+   "bos_token_id": 50256,
+   "embd_pdrop": 0.1,
+   "eos_token_id": 50256,
+   "initializer_range": 0.02,
+   "layer_norm_epsilon": 1e-05,
+   "model_type": "gpt2",
+   "n_ctx": 1024,
+   "n_embd": 768,
+   "n_head": 12,
+   "n_inner": null,
+   "n_layer": 12,
+   "n_positions": 1024,
+   "reorder_and_upcast_attn": false,
+   "resid_pdrop": 0.1,
+   "scale_attn_by_inverse_layer_idx": false,
+   "scale_attn_weights": true,
+   "summary_activation": null,
+   "summary_first_dropout": 0.1,
+   "summary_proj_to_labels": true,
+   "summary_type": "cls_index",
+   "summary_use_proj": true,
+   "task_specific_params": {
+     "text-generation": {
+       "do_sample": true,
+       "max_length": 50
+     }
+   },
+   "torch_dtype": "bfloat16",
+   "transformers_version": "4.44.0",
+   "use_cache": true,
+   "vocab_size": 50257
+ }
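
The config above is a stock 12-layer GPT-2 architecture stored in bfloat16. A minimal sketch for inspecting it, again assuming the repo id `lapp0/distily_bitnet_gpt2` (inferred, not stated):

```python
# Sketch: inspect the student's architecture from this config.
# The repo id is an assumption inferred from the model name.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("lapp0/distily_bitnet_gpt2")
print(config.n_layer, config.n_head, config.n_embd)  # 12 12 768
print(config.torch_dtype)  # torch.bfloat16
```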
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 50256,
+   "eos_token_id": 50256,
+   "transformers_version": "4.44.0"
+ }
logs/events.out.tfevents.1724138424.5f530b1cf724 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1855c3511813142f0463d3688cf0e221a42001a3722e0ec8b0e4120f394ad4bc
+ size 29652608
logs/events.out.tfevents.1724152434.5f530b1cf724 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:41758d3e9d8c913e3a79b2fdb844a3b37d63f3bf5b4b114c8d2228502f0bf7d3
+ size 312
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b862fc88690431ec82268c8b975c893c4a590c6af4e350b571dcadbbdcde279c
+ size 248894656
special_tokens_map.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "bos_token": "<|endoftext|>",
+   "eos_token": "<|endoftext|>",
+   "pad_token": "<|endoftext|>",
+   "unk_token": "<|endoftext|>"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,20 @@
+ {
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "50256": {
+       "content": "<|endoftext|>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<|endoftext|>",
+   "clean_up_tokenization_spaces": true,
+   "eos_token": "<|endoftext|>",
+   "model_max_length": 1024,
+   "pad_token": "<|endoftext|>",
+   "tokenizer_class": "GPT2Tokenizer",
+   "unk_token": "<|endoftext|>"
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d5b2a42aaef175c0ecb3bc4cd25b82bc04dad5ac2bd8d6c2524f11ca0bc0d16
+ size 1017899016
vocab.json ADDED
The diff for this file is too large to render. See raw diff