jon-t committed on
Commit
3112400
verified
1 Parent(s): 1e41966

Model save

README.md CHANGED
@@ -1,14 +1,53 @@
  ---
- language: en
  license: mit
  tags:
- - question-answering
- - pytorch
- - bert
  ---

  # tiny-clinicalbert-qa

- A tiny clinical QA model that is trained using the merged [squad_v2](https://huggingface.co/datasets/rajpurkar/squad_v2) and [emrqa-msquad](https://huggingface.co/datasets/Eladio/emrqa-msquad) datasets.

- Source code is available [on GitHub](https://github.com/jon-edward/tiny-clinicalbert-qa).
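The original card above describes training on the merged squad_v2 and emrqa-msquad datasets. The merge step itself is not part of this commit, so the following is only a minimal sketch of how such a merge could be set up with the `datasets` library; the split names and the assumption that emrqa-msquad follows the SQuAD-style column layout are not taken from the source.

```python
from datasets import load_dataset, concatenate_datasets

# Assumption: both corpora expose SQuAD v2-style columns
# (id, title, context, question, answers); adjust if emrqa-msquad differs.
squad = load_dataset("rajpurkar/squad_v2", split="train")
emrqa = load_dataset("Eladio/emrqa-msquad", split="train")

# Keep only the columns the two datasets share; feature types may still
# need casting before concatenation if they do not match exactly.
common_cols = [c for c in squad.column_names if c in emrqa.column_names]
merged = concatenate_datasets(
    [squad.select_columns(common_cols), emrqa.select_columns(common_cols)]
).shuffle(seed=42)

print(len(merged), merged.column_names)
```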
  ---
+ library_name: transformers
  license: mit
+ base_model: nlpie/tiny-clinicalbert
  tags:
+ - generated_from_trainer
+ model-index:
+ - name: tiny-clinicalbert-qa
+   results: []
  ---

+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
  # tiny-clinicalbert-qa

+ This model is a fine-tuned version of [nlpie/tiny-clinicalbert](https://huggingface.co/nlpie/tiny-clinicalbert) on the None dataset.
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0001
+ - train_batch_size: 12
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
+ - lr_scheduler_type: linear
+ - num_epochs: 2.0
+
+ ### Training results
+
+ ### Framework versions

+ - Transformers 4.53.0
+ - Pytorch 2.7.1+cu118
+ - Datasets 3.6.0
+ - Tokenizers 0.21.2
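The hyperparameters listed in the generated card map roughly onto a `TrainingArguments` configuration like the one below. This is a sketch of an equivalent setup, not the exact invocation used for this commit; values not stated in the card (such as `output_dir`) are illustrative assumptions.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters reported in the generated model card above.
training_args = TrainingArguments(
    output_dir="tiny-clinicalbert-qa",   # assumed, not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2.0,
)
```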
all_results.json CHANGED
@@ -1,9 +1,26 @@
  {
  "epoch": 2.0,
- "total_flos": 1.1159654864928768e+16,
- "train_loss": 1.1755467142439282,
- "train_runtime": 8474.7112,
- "train_samples": 397452,
- "train_samples_per_second": 93.797,
- "train_steps_per_second": 7.816
  }
  {
  "epoch": 2.0,
+ "eval_HasAns_exact": 0.0,
+ "eval_HasAns_f1": 0.0,
+ "eval_HasAns_total": 87,
+ "eval_NoAns_exact": 100.0,
+ "eval_NoAns_f1": 100.0,
+ "eval_NoAns_total": 13,
+ "eval_best_exact": 13.0,
+ "eval_best_exact_thresh": 0.0,
+ "eval_best_f1": 13.0,
+ "eval_best_f1_thresh": 0.0,
+ "eval_exact": 13.0,
+ "eval_f1": 13.0,
+ "eval_runtime": 0.446,
+ "eval_samples": 100,
+ "eval_samples_per_second": 224.2,
+ "eval_steps_per_second": 29.146,
+ "eval_total": 100,
+ "total_flos": 2807799398400.0,
+ "train_loss": 5.859864128960504,
+ "train_runtime": 5.9882,
+ "train_samples": 100,
+ "train_samples_per_second": 33.399,
+ "train_steps_per_second": 3.006
  }
eval_nbest_predictions.json CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:43df2980f28317698aa7d359fba8cfe5d0fa0bd2cf6fbe326ea72712d5de9945
- size 58166149
  version https://git-lfs.github.com/spec/v1
+ oid sha256:3df163c73e234bd0030e9826602bc6e19a7c957dd7c295e4630b8b3ba8d504aa
+ size 274549
eval_null_odds.json ADDED
@@ -0,0 +1,102 @@
1
+ {
2
+ "fcd48163-83fc-48e3-89ff-3258d95b783f": 1.7732281684875488,
3
+ "f4cfe265-080e-4f60-a7b4-4684e6ee5a92": 1.9041776657104492,
4
+ "ff425e22-c432-4caf-a698-63faf5d7713a": 1.9762433767318726,
5
+ "0641f098-7b20-49d2-bd0f-34bbaff13580": 2.011300563812256,
6
+ "aa8f15f9-0977-43d7-878c-160dde1e19db": 1.765954852104187,
7
+ "1ece7aef-ca3a-4b34-9d1a-70f39a95be5f": 1.7935830354690552,
8
+ "8cbc2a33-2ea0-4268-89d6-6b86b1dbf6d4": 1.8020281791687012,
9
+ "39ae80bd-ab64-4252-a446-ba944028f5c7": 1.9813148975372314,
10
+ "d0e4ebf6-08ff-4a97-8116-546ca98194a9": 1.686867594718933,
11
+ "c10fce84-5643-4235-acd3-7e2d9aa9331f": 1.6595547199249268,
12
+ "38c75646-957a-4170-a961-affadf233d40": 1.7701886892318726,
13
+ "7ab2bf54-eda1-49df-b569-c073fdbdf0dd": 2.0477426052093506,
14
+ "f643439c-0b97-4891-b799-635884bb0760": 1.868635892868042,
15
+ "f95fae98-be1e-4622-ad45-a44805dc4ca4": 1.8836207389831543,
16
+ "bdc7dd81-ecd1-4c88-86cb-0c24cfa9678f": 1.9977221488952637,
17
+ "55bb4198-bf14-4214-a3e6-8d666f8ad7ce": 1.8843570947647095,
18
+ "7bc5a56b-8940-4873-88e0-485de94121c7": 1.6930863857269287,
19
+ "2a5bf0ee-4072-4e84-bb77-65fa7b6c2174": 1.7073335647583008,
20
+ "2e17006f-8c12-420f-8034-77a205258cdd": 1.9988418817520142,
21
+ "43e4d8f8-92ad-488c-9e76-f188a22ad840": 2.1723241806030273,
22
+ "02e4a792-3932-49b0-8f8d-bf4c03f19b13": 1.6765046119689941,
23
+ "55286f23-5d9e-4c34-b657-b465bfd83be1": 1.6347368955612183,
24
+ "e06186f8-bd45-4ff0-95bb-5b83e6f0c1eb": 1.9272425174713135,
25
+ "3cbe688e-89c3-4609-879a-5bff6174e981": 1.8221582174301147,
26
+ "25b7d5b4-ad23-4bce-bc0f-73b5f0151c5b": 1.8779842853546143,
27
+ "5439a61f-17dc-48c0-83eb-d6a84b8dcd99": 1.773216962814331,
28
+ "d29ad6ef-c5eb-4184-8be0-cd0d479e0686": 1.8767989873886108,
29
+ "4b1b36e8-9340-4746-b071-1b8c0d1ee2aa": 2.070279836654663,
30
+ "eb24add0-924e-48e8-9620-3ebf24811365": 1.9247840642929077,
31
+ "f46bd01f-7d00-40ac-a21c-70f103b63731": 1.6565182209014893,
32
+ "5fef754b-5194-4955-ae3b-759ae90f6c35": 1.7619998455047607,
33
+ "ccb40aaa-cdcd-41e7-b0ca-c2cb3a8b3a50": 1.7788583040237427,
34
+ "ea2c41b8-8c14-4b29-ae81-8bdcabc52a32": 1.8925293684005737,
35
+ "3214405e-582e-4721-8fce-d96c4d6193c8": 1.9059315919876099,
36
+ "da5ef9e4-afdb-400d-ad85-a6ba6bd05dd0": 1.711046814918518,
37
+ "29f0648f-6139-42d9-b60b-2946798c05eb": 1.7472726106643677,
38
+ "c7076914-6610-4465-9e6c-779217450af5": 1.888414978981018,
39
+ "dbfb8954-ac92-4ec7-96d6-1eccd1805333": 1.9473762512207031,
40
+ "7c61225d-44c9-4b5b-9c52-ee9112ef77e9": 1.982516884803772,
41
+ "ffc956ad-01d2-4040-a024-bec1302b688d": 1.6833488941192627,
42
+ "f038f339-f762-47cc-b623-0c068a2de09f": 1.9112119674682617,
43
+ "bf4605a3-e8ab-417e-a3af-b325e1e3acfd": 1.8640227317810059,
44
+ "6fee7783-bc41-46a1-a746-b1ed56de9c69": 1.9132907390594482,
45
+ "46ea31d4-32da-4432-9112-defaebfcdba9": 2.061676025390625,
46
+ "db2f4457-664d-466b-b404-503e79518741": 1.8728480339050293,
47
+ "d89da8c5-2bd3-413f-8afc-cc3a4427ae4e": 1.6107432842254639,
48
+ "6d00bdd6-a031-4d56-9933-e70df183703b": 1.7630739212036133,
49
+ "e175d441-bab9-4674-98ba-8b575ca5b9a2": 1.6398649215698242,
50
+ "7a27b921-8f75-4a61-9c93-dbe8284cfd5c": 1.9089109897613525,
51
+ "53072674-c057-496b-863c-d83a7f63ae66": 1.891668677330017,
52
+ "5def711a-6b51-4485-9924-3f114e9579a9": 2.018949508666992,
53
+ "b41b4654-e3e7-4e51-b659-d469ac4b7b1b": 1.644025206565857,
54
+ "2283f919-3c09-4ba0-9a94-2459db973331": 1.9814622402191162,
55
+ "7baa0a7a-8a92-41e3-be2b-6fad28a01217": 1.9016613960266113,
56
+ "9e6bd54a-d34d-4a6b-bb78-6c55d9c3d5b9": 1.8171358108520508,
57
+ "07284484-725a-402a-9049-f4c3337e6d64": 1.0723514556884766,
58
+ "9b7f5680-7399-4508-acf9-5c05be8cebf6": 1.92265784740448,
59
+ "3169569c-0624-4419-870c-6133b42a13d7": 1.8748815059661865,
60
+ "194c0c73-5b86-43a0-8a6c-aa03b33acd23": 1.919926404953003,
61
+ "3629efd3-0d17-4a7d-9563-20897ae4e080": 1.953999400138855,
62
+ "f2fce5ac-3b2c-4af0-b695-f7c9eac54f9e": 2.708578586578369,
63
+ "95b6b9fa-68a9-440e-8b47-c2ad385c7196": 2.708578586578369,
64
+ "3bfa6b3a-6a5a-4a20-ae45-e5c08ec3bc6b": 2.708578586578369,
65
+ "bcd90c1e-1a49-449e-9791-0995a5cfc7d9": 2.708578586578369,
66
+ "13da11d4-2740-45d4-8423-cb2840ebd86b": 2.708578586578369,
67
+ "fe5ab8a4-9345-45fd-8d2f-6b64f34ecd28": 2.708578586578369,
68
+ "b3e731e4-448e-4a7b-b1aa-1fc120c7af5b": 2.708578586578369,
69
+ "c66f8cf1-0f43-4390-b51d-c09426bb3901": 2.708578586578369,
70
+ "03ab1625-77de-4e22-897f-bfe8e9e6a5f5": 2.708578586578369,
71
+ "770d481b-109a-4408-94cf-25fbb62ea60d": 2.708578586578369,
72
+ "9df5e5a2-bc70-4209-a422-e98c81ed2b23": 2.708578586578369,
73
+ "b2aba516-0854-425a-9f8c-69a1ad6a553c": 2.708578586578369,
74
+ "e9e0b0ba-f544-46d6-bf1d-5b390a879d0a": 2.708578586578369,
75
+ "9f5bea61-fe80-49eb-a260-2bb2dc82ce38": 2.708578586578369,
76
+ "c63cccc2-3167-4ed5-8f5b-a4a848419f02": 2.708578586578369,
77
+ "081e91ca-a389-46b2-8ccc-85c2d192e764": 2.708578586578369,
78
+ "634d2c6f-f9f0-466c-98c2-b4246633ac97": 2.708578586578369,
79
+ "8578b1fd-9a7b-4db6-9fce-af3d6be166bf": 2.708578586578369,
80
+ "0d376144-4990-4e70-a27c-4017141bee85": 2.708578586578369,
81
+ "548ef534-3710-4ae4-86f2-9c99c4c51e5a": 2.708578586578369,
82
+ "cd897cc1-c575-48fb-957e-7945ae9e8657": 2.708578586578369,
83
+ "7fd2bd05-62ed-4908-9bde-d37b4cc0e8e7": 2.708578586578369,
84
+ "9f24ca80-a066-47fa-98c0-1206c8d66121": 2.708578586578369,
85
+ "63f8898e-0049-4f2e-8d30-98522c0654ef": 2.708578586578369,
86
+ "8e5f6696-a155-4647-83d4-8a005604e0c2": 2.708578586578369,
87
+ "66897d90-7406-40a4-a1b8-a72de9e647e8": 2.708578586578369,
88
+ "5f9edccf-2c17-4624-921c-c52e96311797": 2.708578586578369,
89
+ "0e6823a2-4cdb-46ac-8706-8b6e767ce78c": 2.708578586578369,
90
+ "cc0f967a-77d4-4479-b4b7-962d9aec50f7": 2.708578586578369,
91
+ "f3efa7a1-1ba3-40a0-a0e6-5a3edf7b41c9": 2.708578586578369,
92
+ "c806234f-8a18-487f-b4a8-bc26154a78dd": 2.708578586578369,
93
+ "c64896db-b495-4a0d-a290-5eb153463923": 2.708578586578369,
94
+ "9cc981b7-e0d5-4058-916b-34d9a89de226": 2.708578586578369,
95
+ "f3ff4a75-f7cd-412f-9894-5c954f651848": 2.708578586578369,
96
+ "7a0ff79b-8c19-420f-9c69-757bbc29cad8": 2.708578586578369,
97
+ "914b2bfe-e54c-420b-831c-a8f6d970d936": 2.708578586578369,
98
+ "26b7f97d-4ce7-4bb2-a7fd-ff5f2c767a58": 2.708578586578369,
99
+ "e86f0101-68e3-482a-a493-0974eb7d9ae1": 2.708578586578369,
100
+ "643fc7e7-9cbf-46b1-af9e-53b6cd6034e3": 2.708578586578369,
101
+ "0a837c28-7605-4544-884f-2611c02e1dce": 2.708578586578369
102
+ }
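eval_null_odds.json stores a per-example null score used in SQuAD v2-style postprocessing: when the score exceeds the chosen threshold (the `eval_best_*_thresh` values reported in all_results.json above are 0.0), the prediction is replaced by the empty string, i.e. "no answer". A minimal sketch of that selection step follows; it assumes eval_predictions.json maps example ids to predicted answer strings, as in the standard `run_qa` postprocessing, which is not shown in this diff.

```python
import json

# Null-odds thresholding as in standard SQuAD v2 postprocessing:
# a null score above the threshold means "predict no answer".
with open("eval_null_odds.json") as f:
    null_odds = json.load(f)
with open("eval_predictions.json") as f:          # assumed format: {id: answer text}
    predictions = json.load(f)

threshold = 0.0  # eval_best_exact_thresh / eval_best_f1_thresh reported above

final = {
    qid: ("" if null_odds.get(qid, 0.0) > threshold else answer)
    for qid, answer in predictions.items()
}
print(sum(1 for v in final.values() if v == ""), "questions predicted as unanswerable")
```

With every null score in this file above 0.0, all 100 evaluation questions end up predicted as unanswerable, which is consistent with the eval_NoAns_exact = 100.0 and eval_HasAns_exact = 0.0 split reported above.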
eval_predictions.json CHANGED
The diff for this file is too large to render. See raw diff
 
eval_results.json ADDED
@@ -0,0 +1,20 @@
+ {
+ "epoch": 2.0,
+ "eval_HasAns_exact": 0.0,
+ "eval_HasAns_f1": 0.0,
+ "eval_HasAns_total": 87,
+ "eval_NoAns_exact": 100.0,
+ "eval_NoAns_f1": 100.0,
+ "eval_NoAns_total": 13,
+ "eval_best_exact": 13.0,
+ "eval_best_exact_thresh": 0.0,
+ "eval_best_f1": 13.0,
+ "eval_best_f1_thresh": 0.0,
+ "eval_exact": 13.0,
+ "eval_f1": 13.0,
+ "eval_runtime": 0.446,
+ "eval_samples": 100,
+ "eval_samples_per_second": 224.2,
+ "eval_steps_per_second": 29.146,
+ "eval_total": 100
+ }
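The `eval_*` fields above follow the SQuAD v2 evaluation scheme: exact match and F1, split by answerable (HasAns) and unanswerable (NoAns) questions, plus best scores over null thresholds. A hedged sketch of reproducing such numbers with the `evaluate` library; the inputs below are illustrative and not taken from this commit.

```python
import evaluate

squad_v2_metric = evaluate.load("squad_v2")

# Illustrative inputs; real ids and answers come from the evaluation set.
predictions = [
    {"id": "q1", "prediction_text": "", "no_answer_probability": 1.0},
]
references = [
    {"id": "q1", "answers": {"text": ["metformin"], "answer_start": [42]}},
]

results = squad_v2_metric.compute(predictions=predictions, references=references)
# Produces keys such as exact, f1, HasAns_*, best_exact, best_f1
# (NoAns_* keys appear when unanswerable references are present).
print(results["exact"], results["f1"])
```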
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:5bc0080772daa24c2c0813317ba1bbeb1ce5ead68d5fa69b4fce7dbb72b63124
  size 55116544
  version https://git-lfs.github.com/spec/v1
+ oid sha256:083aedb6fa01b112c33107d32a260c6d7993229e200b3948321050ddb181c19e
  size 55116544
train_results.json CHANGED
@@ -1,9 +1,9 @@
  {
  "epoch": 2.0,
- "total_flos": 1.1159654864928768e+16,
- "train_loss": 1.1755467142439282,
- "train_runtime": 8474.7112,
- "train_samples": 397452,
- "train_samples_per_second": 93.797,
- "train_steps_per_second": 7.816
  }
  {
  "epoch": 2.0,
+ "total_flos": 2807799398400.0,
+ "train_loss": 5.859864128960504,
+ "train_runtime": 5.9882,
+ "train_samples": 100,
+ "train_samples_per_second": 33.399,
+ "train_steps_per_second": 3.006
  }
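The new train_results.json values are internally consistent with the hyperparameters in the card: 100 training samples at a batch size of 12 give ceil(100/12) = 9 optimizer steps per epoch, so 18 steps over 2 epochs (matching global_step and max_steps in trainer_state.json below), and 100 samples x 2 epochs / 5.9882 s is roughly 33.4 samples per second. A small sketch of that check:

```python
import math

train_samples = 100
per_device_train_batch_size = 12
num_train_epochs = 2
train_runtime = 5.9882  # seconds, from train_results.json

steps_per_epoch = math.ceil(train_samples / per_device_train_batch_size)  # 9
total_steps = steps_per_epoch * num_train_epochs                          # 18
samples_per_second = train_samples * num_train_epochs / train_runtime     # ~33.399
steps_per_second = total_steps / train_runtime                            # ~3.006

print(total_steps, round(samples_per_second, 3), round(steps_per_second, 3))
```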
trainer_state.json CHANGED
@@ -4,947 +4,23 @@
  "best_model_checkpoint": null,
  "epoch": 2.0,
  "eval_steps": 500,
- "global_step": 66242,
  "is_hyper_param_search": false,
  "is_local_process_zero": true,
  "is_world_process_zero": true,
  "log_history": [
12
- {
13
- "epoch": 0.015096162555478397,
14
- "grad_norm": 7.316836833953857,
15
- "learning_rate": 9.924670148848163e-05,
16
- "loss": 3.4381,
17
- "step": 500
18
- },
19
- {
20
- "epoch": 0.030192325110956795,
21
- "grad_norm": 7.617539882659912,
22
- "learning_rate": 9.84918933607077e-05,
23
- "loss": 3.0248,
24
- "step": 1000
25
- },
26
- {
27
- "epoch": 0.045288487666435194,
28
- "grad_norm": 14.377625465393066,
29
- "learning_rate": 9.773708523293379e-05,
30
- "loss": 2.7697,
31
- "step": 1500
32
- },
33
- {
34
- "epoch": 0.06038465022191359,
35
- "grad_norm": 8.488965034484863,
36
- "learning_rate": 9.698227710515986e-05,
37
- "loss": 2.5475,
38
- "step": 2000
39
- },
40
- {
41
- "epoch": 0.07548081277739199,
42
- "grad_norm": 10.39892292022705,
43
- "learning_rate": 9.622746897738595e-05,
44
- "loss": 2.4385,
45
- "step": 2500
46
- },
47
- {
48
- "epoch": 0.09057697533287039,
49
- "grad_norm": 11.845101356506348,
50
- "learning_rate": 9.547266084961202e-05,
51
- "loss": 2.2998,
52
- "step": 3000
53
- },
54
- {
55
- "epoch": 0.10567313788834878,
56
- "grad_norm": 10.765714645385742,
57
- "learning_rate": 9.471785272183812e-05,
58
- "loss": 2.2303,
59
- "step": 3500
60
- },
61
- {
62
- "epoch": 0.12076930044382718,
63
- "grad_norm": 13.092281341552734,
64
- "learning_rate": 9.39630445940642e-05,
65
- "loss": 2.1798,
66
- "step": 4000
67
- },
68
- {
69
- "epoch": 0.13586546299930558,
70
- "grad_norm": 14.106061935424805,
71
- "learning_rate": 9.320823646629028e-05,
72
- "loss": 2.1111,
73
- "step": 4500
74
- },
75
- {
76
- "epoch": 0.15096162555478398,
77
- "grad_norm": 10.392507553100586,
78
- "learning_rate": 9.245342833851635e-05,
79
- "loss": 2.1069,
80
- "step": 5000
81
- },
82
- {
83
- "epoch": 0.16605778811026237,
84
- "grad_norm": 9.839272499084473,
85
- "learning_rate": 9.169862021074244e-05,
86
- "loss": 2.0419,
87
- "step": 5500
88
- },
89
- {
90
- "epoch": 0.18115395066574078,
91
- "grad_norm": 8.988909721374512,
92
- "learning_rate": 9.094381208296851e-05,
93
- "loss": 1.998,
94
- "step": 6000
95
- },
96
- {
97
- "epoch": 0.19625011322121916,
98
- "grad_norm": 10.165138244628906,
99
- "learning_rate": 9.01890039551946e-05,
100
- "loss": 2.0117,
101
- "step": 6500
102
- },
103
- {
104
- "epoch": 0.21134627577669757,
105
- "grad_norm": 6.843221187591553,
106
- "learning_rate": 8.943419582742067e-05,
107
- "loss": 1.9208,
108
- "step": 7000
109
- },
110
- {
111
- "epoch": 0.22644243833217595,
112
- "grad_norm": 9.529773712158203,
113
- "learning_rate": 8.867938769964676e-05,
114
- "loss": 1.8786,
115
- "step": 7500
116
- },
117
- {
118
- "epoch": 0.24153860088765436,
119
- "grad_norm": 10.685002326965332,
120
- "learning_rate": 8.792457957187283e-05,
121
- "loss": 1.8579,
122
- "step": 8000
123
- },
124
- {
125
- "epoch": 0.25663476344313274,
126
- "grad_norm": 13.080573081970215,
127
- "learning_rate": 8.716977144409891e-05,
128
- "loss": 1.813,
129
- "step": 8500
130
- },
131
- {
132
- "epoch": 0.27173092599861115,
133
- "grad_norm": 8.630938529968262,
134
- "learning_rate": 8.641496331632499e-05,
135
- "loss": 1.786,
136
- "step": 9000
137
- },
138
- {
139
- "epoch": 0.28682708855408956,
140
- "grad_norm": 14.508255958557129,
141
- "learning_rate": 8.566015518855107e-05,
142
- "loss": 1.7593,
143
- "step": 9500
144
- },
145
- {
146
- "epoch": 0.30192325110956797,
147
- "grad_norm": 16.67688751220703,
148
- "learning_rate": 8.490534706077715e-05,
149
- "loss": 1.7348,
150
- "step": 10000
151
- },
152
- {
153
- "epoch": 0.3170194136650463,
154
- "grad_norm": 9.229181289672852,
155
- "learning_rate": 8.415053893300323e-05,
156
- "loss": 1.7211,
157
- "step": 10500
158
- },
159
- {
160
- "epoch": 0.33211557622052473,
161
- "grad_norm": 10.458921432495117,
162
- "learning_rate": 8.339573080522932e-05,
163
- "loss": 1.6878,
164
- "step": 11000
165
- },
166
- {
167
- "epoch": 0.34721173877600314,
168
- "grad_norm": 15.979294776916504,
169
- "learning_rate": 8.264092267745539e-05,
170
- "loss": 1.6594,
171
- "step": 11500
172
- },
173
- {
174
- "epoch": 0.36230790133148155,
175
- "grad_norm": 10.298636436462402,
176
- "learning_rate": 8.188611454968148e-05,
177
- "loss": 1.6486,
178
- "step": 12000
179
- },
180
- {
181
- "epoch": 0.37740406388695996,
182
- "grad_norm": 9.108405113220215,
183
- "learning_rate": 8.113130642190756e-05,
184
- "loss": 1.5902,
185
- "step": 12500
186
- },
187
- {
188
- "epoch": 0.3925002264424383,
189
- "grad_norm": 10.374357223510742,
190
- "learning_rate": 8.037649829413363e-05,
191
- "loss": 1.5834,
192
- "step": 13000
193
- },
194
- {
195
- "epoch": 0.4075963889979167,
196
- "grad_norm": 13.339021682739258,
197
- "learning_rate": 7.962169016635972e-05,
198
- "loss": 1.5589,
199
- "step": 13500
200
- },
201
- {
202
- "epoch": 0.42269255155339513,
203
- "grad_norm": 10.74330997467041,
204
- "learning_rate": 7.886688203858579e-05,
205
- "loss": 1.5469,
206
- "step": 14000
207
- },
208
- {
209
- "epoch": 0.43778871410887354,
210
- "grad_norm": 10.266663551330566,
211
- "learning_rate": 7.811207391081188e-05,
212
- "loss": 1.5098,
213
- "step": 14500
214
- },
215
- {
216
- "epoch": 0.4528848766643519,
217
- "grad_norm": 12.610342025756836,
218
- "learning_rate": 7.735726578303795e-05,
219
- "loss": 1.4979,
220
- "step": 15000
221
- },
222
- {
223
- "epoch": 0.4679810392198303,
224
- "grad_norm": 11.99213695526123,
225
- "learning_rate": 7.660245765526404e-05,
226
- "loss": 1.4389,
227
- "step": 15500
228
- },
229
- {
230
- "epoch": 0.4830772017753087,
231
- "grad_norm": 12.106096267700195,
232
- "learning_rate": 7.584764952749012e-05,
233
- "loss": 1.4559,
234
- "step": 16000
235
- },
236
- {
237
- "epoch": 0.4981733643307871,
238
- "grad_norm": 16.475229263305664,
239
- "learning_rate": 7.50928413997162e-05,
240
- "loss": 1.3956,
241
- "step": 16500
242
- },
243
- {
244
- "epoch": 0.5132695268862655,
245
- "grad_norm": 12.459781646728516,
246
- "learning_rate": 7.433803327194228e-05,
247
- "loss": 1.3987,
248
- "step": 17000
249
- },
250
- {
251
- "epoch": 0.5283656894417439,
252
- "grad_norm": 16.59553337097168,
253
- "learning_rate": 7.358322514416835e-05,
254
- "loss": 1.3747,
255
- "step": 17500
256
- },
257
- {
258
- "epoch": 0.5434618519972223,
259
- "grad_norm": 12.136076927185059,
260
- "learning_rate": 7.282841701639444e-05,
261
- "loss": 1.3639,
262
- "step": 18000
263
- },
264
- {
265
- "epoch": 0.5585580145527007,
266
- "grad_norm": 16.129152297973633,
267
- "learning_rate": 7.207360888862051e-05,
268
- "loss": 1.3646,
269
- "step": 18500
270
- },
271
- {
272
- "epoch": 0.5736541771081791,
273
- "grad_norm": 11.556707382202148,
274
- "learning_rate": 7.13188007608466e-05,
275
- "loss": 1.3189,
276
- "step": 19000
277
- },
278
- {
279
- "epoch": 0.5887503396636575,
280
- "grad_norm": 15.598160743713379,
281
- "learning_rate": 7.056399263307267e-05,
282
- "loss": 1.3361,
283
- "step": 19500
284
- },
285
- {
286
- "epoch": 0.6038465022191359,
287
- "grad_norm": 15.071793556213379,
288
- "learning_rate": 6.980918450529876e-05,
289
- "loss": 1.3031,
290
- "step": 20000
291
- },
292
- {
293
- "epoch": 0.6189426647746143,
294
- "grad_norm": 16.57260513305664,
295
- "learning_rate": 6.905437637752483e-05,
296
- "loss": 1.3158,
297
- "step": 20500
298
- },
299
- {
300
- "epoch": 0.6340388273300926,
301
- "grad_norm": 12.095833778381348,
302
- "learning_rate": 6.829956824975092e-05,
303
- "loss": 1.2785,
304
- "step": 21000
305
- },
306
- {
307
- "epoch": 0.6491349898855711,
308
- "grad_norm": 15.220791816711426,
309
- "learning_rate": 6.7544760121977e-05,
310
- "loss": 1.2843,
311
- "step": 21500
312
- },
313
- {
314
- "epoch": 0.6642311524410495,
315
- "grad_norm": 14.36874008178711,
316
- "learning_rate": 6.678995199420309e-05,
317
- "loss": 1.2265,
318
- "step": 22000
319
- },
320
- {
321
- "epoch": 0.6793273149965279,
322
- "grad_norm": 7.681280612945557,
323
- "learning_rate": 6.603514386642916e-05,
324
- "loss": 1.2018,
325
- "step": 22500
326
- },
327
- {
328
- "epoch": 0.6944234775520063,
329
- "grad_norm": 22.970266342163086,
330
- "learning_rate": 6.528033573865525e-05,
331
- "loss": 1.209,
332
- "step": 23000
333
- },
334
- {
335
- "epoch": 0.7095196401074847,
336
- "grad_norm": 16.40315055847168,
337
- "learning_rate": 6.452552761088132e-05,
338
- "loss": 1.196,
339
- "step": 23500
340
- },
341
- {
342
- "epoch": 0.7246158026629631,
343
- "grad_norm": 9.259981155395508,
344
- "learning_rate": 6.37707194831074e-05,
345
- "loss": 1.2396,
346
- "step": 24000
347
- },
348
- {
349
- "epoch": 0.7397119652184415,
350
- "grad_norm": 13.87867546081543,
351
- "learning_rate": 6.301591135533348e-05,
352
- "loss": 1.1755,
353
- "step": 24500
354
- },
355
- {
356
- "epoch": 0.7548081277739199,
357
- "grad_norm": 19.097341537475586,
358
- "learning_rate": 6.226110322755956e-05,
359
- "loss": 1.1719,
360
- "step": 25000
361
- },
362
- {
363
- "epoch": 0.7699042903293982,
364
- "grad_norm": 15.854406356811523,
365
- "learning_rate": 6.150629509978564e-05,
366
- "loss": 1.1554,
367
- "step": 25500
368
- },
369
- {
370
- "epoch": 0.7850004528848766,
371
- "grad_norm": 11.38527774810791,
372
- "learning_rate": 6.0751486972011714e-05,
373
- "loss": 1.1181,
374
- "step": 26000
375
- },
376
- {
377
- "epoch": 0.800096615440355,
378
- "grad_norm": 9.802115440368652,
379
- "learning_rate": 5.9996678844237794e-05,
380
- "loss": 1.1336,
381
- "step": 26500
382
- },
383
- {
384
- "epoch": 0.8151927779958335,
385
- "grad_norm": 11.467633247375488,
386
- "learning_rate": 5.924187071646388e-05,
387
- "loss": 1.0586,
388
- "step": 27000
389
- },
390
- {
391
- "epoch": 0.8302889405513119,
392
- "grad_norm": 7.793050289154053,
393
- "learning_rate": 5.848706258868996e-05,
394
- "loss": 1.1079,
395
- "step": 27500
396
- },
397
- {
398
- "epoch": 0.8453851031067903,
399
- "grad_norm": 16.90721893310547,
400
- "learning_rate": 5.773225446091604e-05,
401
- "loss": 1.0793,
402
- "step": 28000
403
- },
404
- {
405
- "epoch": 0.8604812656622687,
406
- "grad_norm": 8.10334587097168,
407
- "learning_rate": 5.697744633314212e-05,
408
- "loss": 1.0968,
409
- "step": 28500
410
- },
411
- {
412
- "epoch": 0.8755774282177471,
413
- "grad_norm": 12.179634094238281,
414
- "learning_rate": 5.6222638205368197e-05,
415
- "loss": 1.0717,
416
- "step": 29000
417
- },
418
- {
419
- "epoch": 0.8906735907732255,
420
- "grad_norm": 10.544957160949707,
421
- "learning_rate": 5.5467830077594276e-05,
422
- "loss": 1.0724,
423
- "step": 29500
424
- },
425
- {
426
- "epoch": 0.9057697533287038,
427
- "grad_norm": 16.116649627685547,
428
- "learning_rate": 5.4713021949820355e-05,
429
- "loss": 1.0402,
430
- "step": 30000
431
- },
432
- {
433
- "epoch": 0.9208659158841822,
434
- "grad_norm": 14.845929145812988,
435
- "learning_rate": 5.395821382204644e-05,
436
- "loss": 1.039,
437
- "step": 30500
438
- },
439
- {
440
- "epoch": 0.9359620784396606,
441
- "grad_norm": 15.941644668579102,
442
- "learning_rate": 5.320340569427252e-05,
443
- "loss": 1.0279,
444
- "step": 31000
445
- },
446
- {
447
- "epoch": 0.951058240995139,
448
- "grad_norm": 11.153121948242188,
449
- "learning_rate": 5.2448597566498606e-05,
450
- "loss": 1.0515,
451
- "step": 31500
452
- },
453
- {
454
- "epoch": 0.9661544035506174,
455
- "grad_norm": 9.693687438964844,
456
- "learning_rate": 5.1693789438724685e-05,
457
- "loss": 1.0202,
458
- "step": 32000
459
- },
460
- {
461
- "epoch": 0.9812505661060958,
462
- "grad_norm": 11.340658187866211,
463
- "learning_rate": 5.0938981310950765e-05,
464
- "loss": 1.0226,
465
- "step": 32500
466
- },
467
- {
468
- "epoch": 0.9963467286615743,
469
- "grad_norm": 15.900833129882812,
470
- "learning_rate": 5.0184173183176844e-05,
471
- "loss": 1.0472,
472
- "step": 33000
473
- },
474
- {
475
- "epoch": 1.0114428912170526,
476
- "grad_norm": 12.715502738952637,
477
- "learning_rate": 4.942936505540292e-05,
478
- "loss": 0.9413,
479
- "step": 33500
480
- },
481
- {
482
- "epoch": 1.026539053772531,
483
- "grad_norm": 19.717144012451172,
484
- "learning_rate": 4.8674556927629e-05,
485
- "loss": 0.941,
486
- "step": 34000
487
- },
488
- {
489
- "epoch": 1.0416352163280094,
490
- "grad_norm": 16.49878692626953,
491
- "learning_rate": 4.791974879985508e-05,
492
- "loss": 0.9209,
493
- "step": 34500
494
- },
495
- {
496
- "epoch": 1.0567313788834878,
497
- "grad_norm": 9.172051429748535,
498
- "learning_rate": 4.716494067208116e-05,
499
- "loss": 0.8879,
500
- "step": 35000
501
- },
502
- {
503
- "epoch": 1.0718275414389662,
504
- "grad_norm": 14.834227561950684,
505
- "learning_rate": 4.641013254430724e-05,
506
- "loss": 0.9343,
507
- "step": 35500
508
- },
509
- {
510
- "epoch": 1.0869237039944446,
511
- "grad_norm": 15.549851417541504,
512
- "learning_rate": 4.565532441653332e-05,
513
- "loss": 0.9249,
514
- "step": 36000
515
- },
516
- {
517
- "epoch": 1.102019866549923,
518
- "grad_norm": 8.137205123901367,
519
- "learning_rate": 4.49005162887594e-05,
520
- "loss": 0.8802,
521
- "step": 36500
522
- },
523
- {
524
- "epoch": 1.1171160291054014,
525
- "grad_norm": 12.385357856750488,
526
- "learning_rate": 4.414570816098548e-05,
527
- "loss": 0.9158,
528
- "step": 37000
529
- },
530
- {
531
- "epoch": 1.1322121916608798,
532
- "grad_norm": 14.191947937011719,
533
- "learning_rate": 4.3390900033211564e-05,
534
- "loss": 0.8878,
535
- "step": 37500
536
- },
537
- {
538
- "epoch": 1.1473083542163582,
539
- "grad_norm": 12.700613975524902,
540
- "learning_rate": 4.263609190543764e-05,
541
- "loss": 0.8945,
542
- "step": 38000
543
- },
544
- {
545
- "epoch": 1.1624045167718366,
546
- "grad_norm": 9.552591323852539,
547
- "learning_rate": 4.188128377766372e-05,
548
- "loss": 0.8681,
549
- "step": 38500
550
- },
551
- {
552
- "epoch": 1.177500679327315,
553
- "grad_norm": 8.992091178894043,
554
- "learning_rate": 4.11264756498898e-05,
555
- "loss": 0.8636,
556
- "step": 39000
557
- },
558
- {
559
- "epoch": 1.1925968418827935,
560
- "grad_norm": 9.592864990234375,
561
- "learning_rate": 4.037166752211588e-05,
562
- "loss": 0.8813,
563
- "step": 39500
564
- },
565
- {
566
- "epoch": 1.2076930044382719,
567
- "grad_norm": 10.786945343017578,
568
- "learning_rate": 3.961685939434196e-05,
569
- "loss": 0.8586,
570
- "step": 40000
571
- },
572
- {
573
- "epoch": 1.22278916699375,
574
- "grad_norm": 14.606813430786133,
575
- "learning_rate": 3.886205126656804e-05,
576
- "loss": 0.8472,
577
- "step": 40500
578
- },
579
- {
580
- "epoch": 1.2378853295492287,
581
- "grad_norm": 16.00104522705078,
582
- "learning_rate": 3.810724313879412e-05,
583
- "loss": 0.8492,
584
- "step": 41000
585
- },
586
- {
587
- "epoch": 1.2529814921047069,
588
- "grad_norm": 20.34152603149414,
589
- "learning_rate": 3.73524350110202e-05,
590
- "loss": 0.8528,
591
- "step": 41500
592
- },
593
- {
594
- "epoch": 1.2680776546601855,
595
- "grad_norm": 13.854582786560059,
596
- "learning_rate": 3.6597626883246284e-05,
597
- "loss": 0.8315,
598
- "step": 42000
599
- },
600
- {
601
- "epoch": 1.2831738172156637,
602
- "grad_norm": 8.319161415100098,
603
- "learning_rate": 3.584281875547236e-05,
604
- "loss": 0.8254,
605
- "step": 42500
606
- },
607
- {
608
- "epoch": 1.2982699797711421,
609
- "grad_norm": 16.603830337524414,
610
- "learning_rate": 3.508801062769844e-05,
611
- "loss": 0.8311,
612
- "step": 43000
613
- },
614
- {
615
- "epoch": 1.3133661423266205,
616
- "grad_norm": 32.95830535888672,
617
- "learning_rate": 3.433320249992452e-05,
618
- "loss": 0.816,
619
- "step": 43500
620
- },
621
- {
622
- "epoch": 1.328462304882099,
623
- "grad_norm": 13.81757640838623,
624
- "learning_rate": 3.35783943721506e-05,
625
- "loss": 0.8198,
626
- "step": 44000
627
- },
628
- {
629
- "epoch": 1.3435584674375773,
630
- "grad_norm": 6.295320987701416,
631
- "learning_rate": 3.282358624437668e-05,
632
- "loss": 0.8179,
633
- "step": 44500
634
- },
635
- {
636
- "epoch": 1.3586546299930558,
637
- "grad_norm": 14.061367988586426,
638
- "learning_rate": 3.206877811660276e-05,
639
- "loss": 0.7924,
640
- "step": 45000
641
- },
642
- {
643
- "epoch": 1.3737507925485342,
644
- "grad_norm": 19.793121337890625,
645
- "learning_rate": 3.131396998882884e-05,
646
- "loss": 0.8093,
647
- "step": 45500
648
- },
649
- {
650
- "epoch": 1.3888469551040126,
651
- "grad_norm": 9.438281059265137,
652
- "learning_rate": 3.055916186105492e-05,
653
- "loss": 0.8255,
654
- "step": 46000
655
- },
656
- {
657
- "epoch": 1.403943117659491,
658
- "grad_norm": 17.061111450195312,
659
- "learning_rate": 2.9804353733281003e-05,
660
- "loss": 0.7948,
661
- "step": 46500
662
- },
663
- {
664
- "epoch": 1.4190392802149694,
665
- "grad_norm": 11.102389335632324,
666
- "learning_rate": 2.9049545605507083e-05,
667
- "loss": 0.785,
668
- "step": 47000
669
- },
670
- {
671
- "epoch": 1.4341354427704478,
672
- "grad_norm": 7.18344783782959,
673
- "learning_rate": 2.8294737477733162e-05,
674
- "loss": 0.7998,
675
- "step": 47500
676
- },
677
- {
678
- "epoch": 1.4492316053259262,
679
- "grad_norm": 16.159982681274414,
680
- "learning_rate": 2.753992934995924e-05,
681
- "loss": 0.7842,
682
- "step": 48000
683
- },
684
- {
685
- "epoch": 1.4643277678814046,
686
- "grad_norm": 21.598909378051758,
687
- "learning_rate": 2.6785121222185324e-05,
688
- "loss": 0.8171,
689
- "step": 48500
690
- },
691
- {
692
- "epoch": 1.4794239304368828,
693
- "grad_norm": 13.887185096740723,
694
- "learning_rate": 2.6030313094411403e-05,
695
- "loss": 0.8031,
696
- "step": 49000
697
- },
698
- {
699
- "epoch": 1.4945200929923614,
700
- "grad_norm": 5.625245094299316,
701
- "learning_rate": 2.5275504966637482e-05,
702
- "loss": 0.7824,
703
- "step": 49500
704
- },
705
- {
706
- "epoch": 1.5096162555478396,
707
- "grad_norm": 12.34050178527832,
708
- "learning_rate": 2.4520696838863565e-05,
709
- "loss": 0.7807,
710
- "step": 50000
711
- },
712
- {
713
- "epoch": 1.5247124181033183,
714
- "grad_norm": 16.76266860961914,
715
- "learning_rate": 2.3765888711089644e-05,
716
- "loss": 0.7968,
717
- "step": 50500
718
- },
719
- {
720
- "epoch": 1.5398085806587964,
721
- "grad_norm": 8.601690292358398,
722
- "learning_rate": 2.3011080583315723e-05,
723
- "loss": 0.7053,
724
- "step": 51000
725
- },
726
- {
727
- "epoch": 1.554904743214275,
728
- "grad_norm": 10.708945274353027,
729
- "learning_rate": 2.2256272455541802e-05,
730
- "loss": 0.7548,
731
- "step": 51500
732
- },
733
- {
734
- "epoch": 1.5700009057697533,
735
- "grad_norm": 12.558730125427246,
736
- "learning_rate": 2.1501464327767882e-05,
737
- "loss": 0.7547,
738
- "step": 52000
739
- },
740
- {
741
- "epoch": 1.5850970683252317,
742
- "grad_norm": 13.777571678161621,
743
- "learning_rate": 2.0746656199993964e-05,
744
- "loss": 0.7463,
745
- "step": 52500
746
- },
747
- {
748
- "epoch": 1.60019323088071,
749
- "grad_norm": 6.568544387817383,
750
- "learning_rate": 1.9991848072220044e-05,
751
- "loss": 0.7713,
752
- "step": 53000
753
- },
754
- {
755
- "epoch": 1.6152893934361885,
756
- "grad_norm": 8.380231857299805,
757
- "learning_rate": 1.9237039944446123e-05,
758
- "loss": 0.7218,
759
- "step": 53500
760
- },
761
- {
762
- "epoch": 1.630385555991667,
763
- "grad_norm": 15.468815803527832,
764
- "learning_rate": 1.8482231816672202e-05,
765
- "loss": 0.7452,
766
- "step": 54000
767
- },
768
- {
769
- "epoch": 1.6454817185471453,
770
- "grad_norm": 9.33584976196289,
771
- "learning_rate": 1.7727423688898285e-05,
772
- "loss": 0.7433,
773
- "step": 54500
774
- },
775
- {
776
- "epoch": 1.6605778811026237,
777
- "grad_norm": 25.546720504760742,
778
- "learning_rate": 1.6972615561124364e-05,
779
- "loss": 0.7679,
780
- "step": 55000
781
- },
782
- {
783
- "epoch": 1.6756740436581021,
784
- "grad_norm": 25.5761661529541,
785
- "learning_rate": 1.6217807433350443e-05,
786
- "loss": 0.7717,
787
- "step": 55500
788
- },
789
- {
790
- "epoch": 1.6907702062135805,
791
- "grad_norm": 13.344952583312988,
792
- "learning_rate": 1.5462999305576522e-05,
793
- "loss": 0.7313,
794
- "step": 56000
795
- },
796
- {
797
- "epoch": 1.705866368769059,
798
- "grad_norm": 7.900672912597656,
799
- "learning_rate": 1.4708191177802602e-05,
800
- "loss": 0.7339,
801
- "step": 56500
802
- },
803
- {
804
- "epoch": 1.7209625313245374,
805
- "grad_norm": 7.074102401733398,
806
- "learning_rate": 1.3953383050028684e-05,
807
- "loss": 0.7217,
808
- "step": 57000
809
- },
810
- {
811
- "epoch": 1.7360586938800155,
812
- "grad_norm": 16.54216766357422,
813
- "learning_rate": 1.3198574922254763e-05,
814
- "loss": 0.7513,
815
- "step": 57500
816
- },
817
- {
818
- "epoch": 1.7511548564354942,
819
- "grad_norm": 9.716275215148926,
820
- "learning_rate": 1.2443766794480843e-05,
821
- "loss": 0.7189,
822
- "step": 58000
823
- },
824
- {
825
- "epoch": 1.7662510189909724,
826
- "grad_norm": 10.64577579498291,
827
- "learning_rate": 1.1688958666706924e-05,
828
- "loss": 0.7293,
829
- "step": 58500
830
- },
831
- {
832
- "epoch": 1.781347181546451,
833
- "grad_norm": 18.039608001708984,
834
- "learning_rate": 1.0934150538933004e-05,
835
- "loss": 0.712,
836
- "step": 59000
837
- },
838
- {
839
- "epoch": 1.7964433441019292,
840
- "grad_norm": 19.148468017578125,
841
- "learning_rate": 1.0179342411159084e-05,
842
- "loss": 0.7101,
843
- "step": 59500
844
- },
845
- {
846
- "epoch": 1.8115395066574078,
847
- "grad_norm": 9.460704803466797,
848
- "learning_rate": 9.424534283385165e-06,
849
- "loss": 0.7298,
850
- "step": 60000
851
- },
852
- {
853
- "epoch": 1.826635669212886,
854
- "grad_norm": 19.228601455688477,
855
- "learning_rate": 8.669726155611244e-06,
856
- "loss": 0.7265,
857
- "step": 60500
858
- },
859
- {
860
- "epoch": 1.8417318317683646,
861
- "grad_norm": 32.19565200805664,
862
- "learning_rate": 7.914918027837325e-06,
863
- "loss": 0.7139,
864
- "step": 61000
865
- },
866
- {
867
- "epoch": 1.8568279943238428,
868
- "grad_norm": 11.675516128540039,
869
- "learning_rate": 7.160109900063404e-06,
870
- "loss": 0.7163,
871
- "step": 61500
872
- },
873
- {
874
- "epoch": 1.8719241568793212,
875
- "grad_norm": 16.321319580078125,
876
- "learning_rate": 6.405301772289485e-06,
877
- "loss": 0.7546,
878
- "step": 62000
879
- },
880
- {
881
- "epoch": 1.8870203194347996,
882
- "grad_norm": 8.392330169677734,
883
- "learning_rate": 5.650493644515564e-06,
884
- "loss": 0.7005,
885
- "step": 62500
886
- },
887
- {
888
- "epoch": 1.902116481990278,
889
- "grad_norm": 18.768943786621094,
890
- "learning_rate": 4.895685516741644e-06,
891
- "loss": 0.7305,
892
- "step": 63000
893
- },
894
- {
895
- "epoch": 1.9172126445457565,
896
- "grad_norm": 26.690473556518555,
897
- "learning_rate": 4.140877388967724e-06,
898
- "loss": 0.7373,
899
- "step": 63500
900
- },
901
- {
902
- "epoch": 1.9323088071012349,
903
- "grad_norm": 9.388639450073242,
904
- "learning_rate": 3.386069261193805e-06,
905
- "loss": 0.6859,
906
- "step": 64000
907
- },
908
- {
909
- "epoch": 1.9474049696567133,
910
- "grad_norm": 10.29443645477295,
911
- "learning_rate": 2.6312611334198844e-06,
912
- "loss": 0.7192,
913
- "step": 64500
914
- },
915
- {
916
- "epoch": 1.9625011322121917,
917
- "grad_norm": 20.20147705078125,
918
- "learning_rate": 1.8764530056459647e-06,
919
- "loss": 0.689,
920
- "step": 65000
921
- },
922
- {
923
- "epoch": 1.97759729476767,
924
- "grad_norm": 7.911527633666992,
925
- "learning_rate": 1.121644877872045e-06,
926
- "loss": 0.6781,
927
- "step": 65500
928
- },
929
- {
930
- "epoch": 1.9926934573231485,
931
- "grad_norm": 14.393226623535156,
932
- "learning_rate": 3.6683675009812505e-07,
933
- "loss": 0.6922,
934
- "step": 66000
935
- },
  {
  "epoch": 2.0,
- "step": 66242,
- "total_flos": 1.1159654864928768e+16,
- "train_loss": 1.1755467142439282,
- "train_runtime": 8474.7112,
- "train_samples_per_second": 93.797,
- "train_steps_per_second": 7.816
  }
  ],
  "logging_steps": 500,
- "max_steps": 66242,
  "num_input_tokens_seen": 0,
  "num_train_epochs": 2,
  "save_steps": 500,
@@ -960,7 +36,7 @@
  "attributes": {}
  }
  },
- "total_flos": 1.1159654864928768e+16,
  "train_batch_size": 12,
  "trial_name": null,
  "trial_params": null
  "best_model_checkpoint": null,
  "epoch": 2.0,
  "eval_steps": 500,
+ "global_step": 18,
  "is_hyper_param_search": false,
  "is_local_process_zero": true,
  "is_world_process_zero": true,
  "log_history": [
  {
  "epoch": 2.0,
+ "step": 18,
+ "total_flos": 2807799398400.0,
+ "train_loss": 5.859864128960504,
+ "train_runtime": 5.9882,
+ "train_samples_per_second": 33.399,
+ "train_steps_per_second": 3.006
  }
  ],
  "logging_steps": 500,
+ "max_steps": 18,
  "num_input_tokens_seen": 0,
  "num_train_epochs": 2,
  "save_steps": 500,

  "attributes": {}
  }
  },
+ "total_flos": 2807799398400.0,
  "train_batch_size": 12,
  "trial_name": null,
  "trial_params": null
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c998a92d17d4555f0cd3fa049884bd2d9715d590ce7f0c64b38a8b01aa61dc53
  size 5777
  version https://git-lfs.github.com/spec/v1
+ oid sha256:fe9b3aaec2cbe16f62d12e6ae4c48c8dfa131fc333449a1b61956836034b5f3b
  size 5777
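For completeness, a question-answering checkpoint like the one updated in this commit can be loaded with the standard `pipeline` API. The repository id below is assumed from the commit author and model name, and the question/context pair is purely illustrative.

```python
from transformers import pipeline

# Repo id assumed to be <author>/<model name>; adjust if the model lives elsewhere.
qa = pipeline("question-answering", model="jon-t/tiny-clinicalbert-qa")

result = qa(
    question="What medication was the patient prescribed?",
    context="The patient was prescribed metformin 500 mg twice daily for type 2 diabetes.",
)
print(result["answer"], result["score"])
```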