Mrohit01 committed on
Commit 96847b9 · verified · 1 Parent(s): 95191c7

Training in progress, epoch 0
all_results.json CHANGED
@@ -1,7 +1,13 @@
 {
+    "epoch": 9.993390614672835,
     "eval_accuracy": 0.11342436602234439,
     "eval_loss": 2.3934237957000732,
     "eval_runtime": 514.1383,
     "eval_samples_per_second": 54.839,
-    "eval_steps_per_second": 0.858
+    "eval_steps_per_second": 0.858,
+    "total_flos": 7.511248143138256e+19,
+    "train_loss": 1.6334815527396227,
+    "train_runtime": 44633.0248,
+    "train_samples_per_second": 21.683,
+    "train_steps_per_second": 0.085
 }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8da3d1c41ee608c4db30186b4662da10a78cf55e23c026472d8f1b69be8dda18
+oid sha256:97016281036b19c1e77a456c81f3be4700ff5dede23cca6cf00773d5a3e3c5b4
 size 343248584
runs/May16_00-21-59_e2e-66-39/events.out.tfevents.1715799126.e2e-66-39.276148.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9b05809ba1e54448989a1b06fb1af5af0a32b1ee87e3a06ebf31a14cc7eb3575
+size 11804
runs/May16_10-48-15_e2e-66-39/events.out.tfevents.1715836698.e2e-66-39.330728.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:83fe818f0397f5e7a86f8a91dee8b8baa1ab77fce182fe738a447375694da3eb
+size 13072
train_results.json ADDED
@@ -0,0 +1,8 @@
+{
+    "epoch": 9.993390614672835,
+    "total_flos": 7.511248143138256e+19,
+    "train_loss": 1.6334815527396227,
+    "train_runtime": 44633.0248,
+    "train_samples_per_second": 21.683,
+    "train_steps_per_second": 0.085
+}
trainer_state.json ADDED
@@ -0,0 +1,2766 @@
+{
+  "best_metric": 0.32094342968611456,
+  "best_model_checkpoint": "Mrohit01/cards-vit-base-patch16-224-finetuned-v1/checkpoint-3404",
+  "epoch": 9.993390614672835,
+  "eval_steps": 500,
+  "global_step": 3780,
+  "is_hyper_param_search": false,
+  "is_local_process_zero": true,
+  "is_world_process_zero": true,
+  "log_history": [
+    {
+      "epoch": 0.026437541308658295,
+      "grad_norm": 1.8425437211990356,
+      "learning_rate": 1.3227513227513228e-06,
+      "loss": 1.9993,
+      "step": 10
+    },
+    {
+      "epoch": 0.05287508261731659,
+      "grad_norm": 1.9051584005355835,
+      "learning_rate": 2.6455026455026455e-06,
+      "loss": 2.0327,
+      "step": 20
+    },
+    {
+      "epoch": 0.07931262392597488,
+      "grad_norm": 1.936784267425537,
+      "learning_rate": 3.968253968253968e-06,
+      "loss": 2.0055,
+      "step": 30
+    },
+    {
+      "epoch": 0.10575016523463318,
+      "grad_norm": 1.9875764846801758,
+      "learning_rate": 5.291005291005291e-06,
+      "loss": 1.9807,
+      "step": 40
+    },
+    {
+      "epoch": 0.13218770654329148,
+      "grad_norm": 2.0385992527008057,
+      "learning_rate": 6.613756613756614e-06,
+      "loss": 1.9866,
+      "step": 50
+    },
+    {
+      "epoch": 0.15862524785194976,
+      "grad_norm": 1.9623510837554932,
+      "learning_rate": 7.936507936507936e-06,
+      "loss": 1.9612,
+      "step": 60
+    },
+    {
+      "epoch": 0.18506278916060806,
+      "grad_norm": 1.8609628677368164,
+      "learning_rate": 9.259259259259259e-06,
+      "loss": 1.9499,
+      "step": 70
+    },
+    {
+      "epoch": 0.21150033046926636,
+      "grad_norm": 2.279550552368164,
+      "learning_rate": 1.0582010582010582e-05,
+      "loss": 1.9277,
+      "step": 80
+    },
+    {
+      "epoch": 0.23793787177792466,
+      "grad_norm": 2.0092594623565674,
+      "learning_rate": 1.1904761904761905e-05,
+      "loss": 1.9246,
+      "step": 90
+    },
+    {
+      "epoch": 0.26437541308658297,
+      "grad_norm": 1.8894586563110352,
+      "learning_rate": 1.3227513227513228e-05,
+      "loss": 1.9187,
+      "step": 100
+    },
+    {
+      "epoch": 0.29081295439524124,
+      "grad_norm": 1.841611623764038,
+      "learning_rate": 1.455026455026455e-05,
+      "loss": 1.9279,
+      "step": 110
+    },
+    {
+      "epoch": 0.3172504957038995,
+      "grad_norm": 1.9487497806549072,
+      "learning_rate": 1.5873015873015872e-05,
+      "loss": 1.8989,
+      "step": 120
+    },
+    {
+      "epoch": 0.34368803701255785,
+      "grad_norm": 1.8522062301635742,
+      "learning_rate": 1.7195767195767195e-05,
+      "loss": 1.9041,
+      "step": 130
+    },
+    {
+      "epoch": 0.3701255783212161,
+      "grad_norm": 1.6527143716812134,
+      "learning_rate": 1.8518518518518518e-05,
+      "loss": 1.9107,
+      "step": 140
+    },
+    {
+      "epoch": 0.3965631196298744,
+      "grad_norm": 1.6777476072311401,
+      "learning_rate": 1.984126984126984e-05,
+      "loss": 1.8915,
+      "step": 150
+    },
+    {
+      "epoch": 0.4230006609385327,
+      "grad_norm": 2.1537675857543945,
+      "learning_rate": 2.1164021164021164e-05,
+      "loss": 1.8814,
+      "step": 160
+    },
+    {
+      "epoch": 0.449438202247191,
+      "grad_norm": 2.119074583053589,
+      "learning_rate": 2.2486772486772487e-05,
+      "loss": 1.8767,
+      "step": 170
+    },
+    {
+      "epoch": 0.47587574355584933,
+      "grad_norm": 2.027825355529785,
+      "learning_rate": 2.380952380952381e-05,
+      "loss": 1.8542,
+      "step": 180
+    },
+    {
+      "epoch": 0.5023132848645075,
+      "grad_norm": 1.9427117109298706,
+      "learning_rate": 2.5132275132275137e-05,
+      "loss": 1.8614,
+      "step": 190
+    },
+    {
+      "epoch": 0.5287508261731659,
+      "grad_norm": 2.057250499725342,
+      "learning_rate": 2.6455026455026456e-05,
+      "loss": 1.8578,
+      "step": 200
+    },
+    {
+      "epoch": 0.5551883674818242,
+      "grad_norm": 1.760541558265686,
+      "learning_rate": 2.777777777777778e-05,
+      "loss": 1.8507,
+      "step": 210
+    },
+    {
+      "epoch": 0.5816259087904825,
+      "grad_norm": 2.5953147411346436,
+      "learning_rate": 2.91005291005291e-05,
+      "loss": 1.8325,
+      "step": 220
+    },
+    {
+      "epoch": 0.6080634500991408,
+      "grad_norm": 2.2065205574035645,
+      "learning_rate": 3.0423280423280425e-05,
+      "loss": 1.8418,
+      "step": 230
+    },
+    {
+      "epoch": 0.634500991407799,
+      "grad_norm": 1.930132269859314,
+      "learning_rate": 3.1746031746031745e-05,
+      "loss": 1.8295,
+      "step": 240
+    },
+    {
+      "epoch": 0.6609385327164574,
+      "grad_norm": 2.2985873222351074,
+      "learning_rate": 3.306878306878307e-05,
+      "loss": 1.8367,
+      "step": 250
+    },
+    {
+      "epoch": 0.6873760740251157,
+      "grad_norm": 2.229893207550049,
+      "learning_rate": 3.439153439153439e-05,
+      "loss": 1.8298,
+      "step": 260
+    },
+    {
+      "epoch": 0.713813615333774,
+      "grad_norm": 2.627453088760376,
+      "learning_rate": 3.571428571428572e-05,
+      "loss": 1.8321,
+      "step": 270
+    },
+    {
+      "epoch": 0.7402511566424322,
+      "grad_norm": 2.1309328079223633,
+      "learning_rate": 3.7037037037037037e-05,
+      "loss": 1.8377,
+      "step": 280
+    },
+    {
+      "epoch": 0.7666886979510905,
+      "grad_norm": 1.6566647291183472,
+      "learning_rate": 3.835978835978836e-05,
+      "loss": 1.8391,
+      "step": 290
+    },
+    {
+      "epoch": 0.7931262392597488,
+      "grad_norm": 3.22412109375,
+      "learning_rate": 3.968253968253968e-05,
+      "loss": 1.825,
+      "step": 300
+    },
+    {
+      "epoch": 0.8195637805684072,
+      "grad_norm": 2.1069350242614746,
+      "learning_rate": 4.100529100529101e-05,
+      "loss": 1.8074,
+      "step": 310
+    },
+    {
+      "epoch": 0.8460013218770654,
+      "grad_norm": 2.3734729290008545,
+      "learning_rate": 4.232804232804233e-05,
+      "loss": 1.8281,
+      "step": 320
+    },
+    {
+      "epoch": 0.8724388631857237,
+      "grad_norm": 1.7937417030334473,
+      "learning_rate": 4.3650793650793655e-05,
+      "loss": 1.8065,
+      "step": 330
+    },
+    {
+      "epoch": 0.898876404494382,
+      "grad_norm": 2.0130672454833984,
+      "learning_rate": 4.4973544973544974e-05,
+      "loss": 1.8133,
+      "step": 340
+    },
+    {
+      "epoch": 0.9253139458030403,
+      "grad_norm": 1.8942523002624512,
+      "learning_rate": 4.62962962962963e-05,
+      "loss": 1.8097,
+      "step": 350
+    },
+    {
+      "epoch": 0.9517514871116987,
+      "grad_norm": 2.3066070079803467,
+      "learning_rate": 4.761904761904762e-05,
+      "loss": 1.8325,
+      "step": 360
+    },
+    {
+      "epoch": 0.9781890284203569,
+      "grad_norm": 1.563308596611023,
+      "learning_rate": 4.894179894179895e-05,
+      "loss": 1.7866,
+      "step": 370
+    },
+    {
+      "epoch": 0.9993390614672836,
+      "eval_accuracy": 0.2745522255719099,
+      "eval_loss": 1.7676516771316528,
+      "eval_runtime": 664.7291,
+      "eval_samples_per_second": 42.416,
+      "eval_steps_per_second": 0.663,
+      "step": 378
+    },
+    {
+      "epoch": 1.004626569729015,
+      "grad_norm": 1.7081835269927979,
+      "learning_rate": 4.9970605526161084e-05,
+      "loss": 1.7884,
+      "step": 380
+    },
+    {
+      "epoch": 1.0310641110376735,
+      "grad_norm": 2.7630808353424072,
+      "learning_rate": 4.982363315696649e-05,
+      "loss": 1.7739,
+      "step": 390
+    },
+    {
+      "epoch": 1.0575016523463319,
+      "grad_norm": 3.5567917823791504,
+      "learning_rate": 4.96766607877719e-05,
+      "loss": 1.7764,
+      "step": 400
+    },
+    {
+      "epoch": 1.08393919365499,
+      "grad_norm": 2.181006669998169,
+      "learning_rate": 4.952968841857731e-05,
+      "loss": 1.8154,
+      "step": 410
+    },
+    {
+      "epoch": 1.1103767349636484,
+      "grad_norm": 2.285545587539673,
+      "learning_rate": 4.938271604938271e-05,
+      "loss": 1.7998,
+      "step": 420
+    },
+    {
+      "epoch": 1.1368142762723066,
+      "grad_norm": 2.4093916416168213,
+      "learning_rate": 4.923574368018813e-05,
+      "loss": 1.7643,
+      "step": 430
+    },
+    {
+      "epoch": 1.163251817580965,
+      "grad_norm": 1.6129083633422852,
+      "learning_rate": 4.908877131099354e-05,
+      "loss": 1.7932,
+      "step": 440
+    },
+    {
+      "epoch": 1.1896893588896233,
+      "grad_norm": 1.6316031217575073,
+      "learning_rate": 4.894179894179895e-05,
+      "loss": 1.7851,
+      "step": 450
+    },
+    {
+      "epoch": 1.2161269001982815,
+      "grad_norm": 1.6007944345474243,
+      "learning_rate": 4.879482657260435e-05,
+      "loss": 1.795,
+      "step": 460
+    },
+    {
+      "epoch": 1.24256444150694,
+      "grad_norm": 1.69334077835083,
+      "learning_rate": 4.864785420340976e-05,
+      "loss": 1.7523,
+      "step": 470
+    },
+    {
+      "epoch": 1.269001982815598,
+      "grad_norm": 1.9214173555374146,
+      "learning_rate": 4.850088183421517e-05,
+      "loss": 1.7595,
+      "step": 480
+    },
+    {
+      "epoch": 1.2954395241242564,
+      "grad_norm": 1.5973252058029175,
+      "learning_rate": 4.835390946502058e-05,
+      "loss": 1.7542,
+      "step": 490
+    },
+    {
+      "epoch": 1.3218770654329148,
+      "grad_norm": 1.8921444416046143,
+      "learning_rate": 4.820693709582599e-05,
+      "loss": 1.7466,
+      "step": 500
+    },
+    {
+      "epoch": 1.348314606741573,
+      "grad_norm": 1.84861159324646,
+      "learning_rate": 4.8059964726631394e-05,
+      "loss": 1.7763,
+      "step": 510
+    },
+    {
+      "epoch": 1.3747521480502314,
+      "grad_norm": 2.476146936416626,
+      "learning_rate": 4.79129923574368e-05,
+      "loss": 1.7577,
+      "step": 520
+    },
+    {
+      "epoch": 1.4011896893588895,
+      "grad_norm": 1.5669139623641968,
+      "learning_rate": 4.776601998824221e-05,
+      "loss": 1.7697,
+      "step": 530
+    },
+    {
+      "epoch": 1.427627230667548,
+      "grad_norm": 2.1005804538726807,
+      "learning_rate": 4.761904761904762e-05,
+      "loss": 1.7737,
+      "step": 540
+    },
+    {
+      "epoch": 1.4540647719762063,
+      "grad_norm": 1.8329302072525024,
+      "learning_rate": 4.747207524985303e-05,
+      "loss": 1.7476,
+      "step": 550
+    },
+    {
+      "epoch": 1.4805023132848645,
+      "grad_norm": 1.7490884065628052,
+      "learning_rate": 4.732510288065844e-05,
+      "loss": 1.7695,
+      "step": 560
+    },
+    {
+      "epoch": 1.5069398545935226,
+      "grad_norm": 1.7790248394012451,
+      "learning_rate": 4.717813051146385e-05,
+      "loss": 1.7696,
+      "step": 570
+    },
+    {
+      "epoch": 1.533377395902181,
+      "grad_norm": 1.4908485412597656,
+      "learning_rate": 4.7031158142269256e-05,
+      "loss": 1.7809,
+      "step": 580
+    },
+    {
+      "epoch": 1.5598149372108394,
+      "grad_norm": 1.4122120141983032,
+      "learning_rate": 4.6884185773074665e-05,
+      "loss": 1.7281,
+      "step": 590
+    },
+    {
+      "epoch": 1.5862524785194978,
+      "grad_norm": 1.5507616996765137,
+      "learning_rate": 4.673721340388007e-05,
+      "loss": 1.7906,
+      "step": 600
+    },
+    {
+      "epoch": 1.612690019828156,
+      "grad_norm": 1.7312538623809814,
+      "learning_rate": 4.659024103468548e-05,
+      "loss": 1.7863,
+      "step": 610
+    },
+    {
+      "epoch": 1.6391275611368141,
+      "grad_norm": 1.9535393714904785,
+      "learning_rate": 4.644326866549089e-05,
+      "loss": 1.7963,
+      "step": 620
+    },
+    {
+      "epoch": 1.6655651024454725,
+      "grad_norm": 2.0952374935150146,
+      "learning_rate": 4.62962962962963e-05,
+      "loss": 1.784,
+      "step": 630
+    },
+    {
+      "epoch": 1.692002643754131,
+      "grad_norm": 1.8771756887435913,
+      "learning_rate": 4.614932392710171e-05,
+      "loss": 1.7717,
+      "step": 640
+    },
+    {
+      "epoch": 1.7184401850627893,
+      "grad_norm": 1.9628280401229858,
+      "learning_rate": 4.600235155790711e-05,
+      "loss": 1.7156,
+      "step": 650
+    },
+    {
+      "epoch": 1.7448777263714474,
+      "grad_norm": 1.7401697635650635,
+      "learning_rate": 4.585537918871252e-05,
+      "loss": 1.7748,
+      "step": 660
+    },
+    {
+      "epoch": 1.7713152676801056,
+      "grad_norm": 1.9188202619552612,
+      "learning_rate": 4.5708406819517937e-05,
+      "loss": 1.7759,
+      "step": 670
+    },
+    {
+      "epoch": 1.797752808988764,
+      "grad_norm": 1.7092291116714478,
+      "learning_rate": 4.5561434450323345e-05,
+      "loss": 1.7368,
+      "step": 680
+    },
+    {
+      "epoch": 1.8241903502974224,
+      "grad_norm": 1.9781934022903442,
+      "learning_rate": 4.541446208112875e-05,
+      "loss": 1.7743,
+      "step": 690
+    },
+    {
+      "epoch": 1.8506278916060808,
+      "grad_norm": 2.159959077835083,
+      "learning_rate": 4.5267489711934157e-05,
+      "loss": 1.7723,
+      "step": 700
+    },
+    {
+      "epoch": 1.877065432914739,
+      "grad_norm": 1.9522385597229004,
+      "learning_rate": 4.5120517342739565e-05,
+      "loss": 1.7669,
+      "step": 710
+    },
+    {
+      "epoch": 1.903502974223397,
+      "grad_norm": 2.2479653358459473,
+      "learning_rate": 4.4973544973544974e-05,
+      "loss": 1.7479,
+      "step": 720
+    },
+    {
+      "epoch": 1.9299405155320555,
+      "grad_norm": 2.117433547973633,
+      "learning_rate": 4.482657260435038e-05,
+      "loss": 1.7363,
+      "step": 730
+    },
+    {
+      "epoch": 1.9563780568407139,
+      "grad_norm": 1.7201601266860962,
+      "learning_rate": 4.467960023515579e-05,
+      "loss": 1.7551,
+      "step": 740
+    },
+    {
+      "epoch": 1.9828155981493722,
+      "grad_norm": 1.7983427047729492,
+      "learning_rate": 4.45326278659612e-05,
+      "loss": 1.7457,
+      "step": 750
+    },
+    {
+      "epoch": 1.998678122934567,
+      "eval_accuracy": 0.29902464976059584,
+      "eval_loss": 1.7163423299789429,
+      "eval_runtime": 618.2518,
+      "eval_samples_per_second": 45.604,
+      "eval_steps_per_second": 0.713,
+      "step": 756
+    },
+    {
+      "epoch": 2.00925313945803,
+      "grad_norm": 1.3950998783111572,
+      "learning_rate": 4.438565549676661e-05,
+      "loss": 1.7106,
+      "step": 760
+    },
+    {
+      "epoch": 2.0356906807666886,
+      "grad_norm": 1.764172077178955,
+      "learning_rate": 4.423868312757202e-05,
+      "loss": 1.7251,
+      "step": 770
+    },
+    {
+      "epoch": 2.062128222075347,
+      "grad_norm": 1.5473984479904175,
+      "learning_rate": 4.409171075837743e-05,
+      "loss": 1.7107,
+      "step": 780
+    },
+    {
+      "epoch": 2.0885657633840053,
+      "grad_norm": 1.851131558418274,
+      "learning_rate": 4.394473838918284e-05,
+      "loss": 1.7104,
+      "step": 790
+    },
+    {
+      "epoch": 2.1150033046926637,
+      "grad_norm": 1.5341060161590576,
+      "learning_rate": 4.3797766019988246e-05,
+      "loss": 1.7051,
+      "step": 800
+    },
+    {
+      "epoch": 2.1414408460013217,
+      "grad_norm": 1.7939796447753906,
+      "learning_rate": 4.3650793650793655e-05,
+      "loss": 1.7393,
+      "step": 810
+    },
+    {
+      "epoch": 2.16787838730998,
+      "grad_norm": 1.713605284690857,
+      "learning_rate": 4.3503821281599064e-05,
+      "loss": 1.6997,
+      "step": 820
+    },
+    {
+      "epoch": 2.1943159286186384,
+      "grad_norm": 1.7437527179718018,
+      "learning_rate": 4.3356848912404466e-05,
+      "loss": 1.7004,
+      "step": 830
+    },
+    {
+      "epoch": 2.220753469927297,
+      "grad_norm": 1.8171709775924683,
+      "learning_rate": 4.3209876543209875e-05,
+      "loss": 1.7275,
+      "step": 840
+    },
+    {
+      "epoch": 2.247191011235955,
+      "grad_norm": 1.628079891204834,
+      "learning_rate": 4.306290417401529e-05,
+      "loss": 1.7378,
+      "step": 850
+    },
+    {
+      "epoch": 2.273628552544613,
+      "grad_norm": 1.7622318267822266,
+      "learning_rate": 4.29159318048207e-05,
+      "loss": 1.7247,
+      "step": 860
+    },
+    {
+      "epoch": 2.3000660938532715,
+      "grad_norm": 1.4866081476211548,
+      "learning_rate": 4.27689594356261e-05,
+      "loss": 1.7166,
+      "step": 870
+    },
+    {
+      "epoch": 2.32650363516193,
+      "grad_norm": 1.7897216081619263,
+      "learning_rate": 4.262198706643151e-05,
+      "loss": 1.7096,
+      "step": 880
+    },
+    {
+      "epoch": 2.3529411764705883,
+      "grad_norm": 1.7703666687011719,
+      "learning_rate": 4.247501469723692e-05,
+      "loss": 1.7096,
+      "step": 890
+    },
+    {
+      "epoch": 2.3793787177792467,
+      "grad_norm": 1.9003839492797852,
+      "learning_rate": 4.232804232804233e-05,
+      "loss": 1.7172,
+      "step": 900
+    },
+    {
+      "epoch": 2.4058162590879046,
+      "grad_norm": 2.338798999786377,
+      "learning_rate": 4.2181069958847744e-05,
+      "loss": 1.7209,
+      "step": 910
+    },
+    {
+      "epoch": 2.432253800396563,
+      "grad_norm": 2.1922528743743896,
+      "learning_rate": 4.2034097589653146e-05,
+      "loss": 1.7195,
+      "step": 920
+    },
+    {
+      "epoch": 2.4586913417052214,
+      "grad_norm": 1.7596076726913452,
+      "learning_rate": 4.1887125220458555e-05,
+      "loss": 1.7316,
+      "step": 930
+    },
+    {
+      "epoch": 2.48512888301388,
+      "grad_norm": 1.8625160455703735,
+      "learning_rate": 4.1740152851263964e-05,
+      "loss": 1.7224,
+      "step": 940
+    },
+    {
+      "epoch": 2.511566424322538,
+      "grad_norm": 2.2473673820495605,
+      "learning_rate": 4.159318048206937e-05,
+      "loss": 1.7198,
+      "step": 950
+    },
+    {
+      "epoch": 2.538003965631196,
+      "grad_norm": 1.8248813152313232,
+      "learning_rate": 4.144620811287478e-05,
+      "loss": 1.7246,
+      "step": 960
+    },
+    {
+      "epoch": 2.5644415069398545,
+      "grad_norm": 1.8088594675064087,
+      "learning_rate": 4.129923574368019e-05,
+      "loss": 1.707,
+      "step": 970
+    },
+    {
+      "epoch": 2.590879048248513,
+      "grad_norm": 1.818827748298645,
+      "learning_rate": 4.11522633744856e-05,
+      "loss": 1.7038,
+      "step": 980
+    },
+    {
+      "epoch": 2.6173165895571713,
+      "grad_norm": 1.716060757637024,
+      "learning_rate": 4.100529100529101e-05,
+      "loss": 1.7066,
+      "step": 990
+    },
+    {
+      "epoch": 2.6437541308658297,
+      "grad_norm": 1.7306545972824097,
+      "learning_rate": 4.085831863609642e-05,
+      "loss": 1.6767,
+      "step": 1000
+    },
+    {
+      "epoch": 2.6701916721744876,
+      "grad_norm": 1.7304011583328247,
+      "learning_rate": 4.071134626690182e-05,
+      "loss": 1.6969,
+      "step": 1010
+    },
+    {
+      "epoch": 2.696629213483146,
+      "grad_norm": 2.024599552154541,
+      "learning_rate": 4.056437389770723e-05,
+      "loss": 1.7181,
+      "step": 1020
+    },
+    {
+      "epoch": 2.7230667547918044,
+      "grad_norm": 2.146998643875122,
+      "learning_rate": 4.0417401528512645e-05,
+      "loss": 1.7311,
+      "step": 1030
+    },
+    {
+      "epoch": 2.7495042961004628,
+      "grad_norm": 1.935433030128479,
+      "learning_rate": 4.0270429159318054e-05,
+      "loss": 1.7194,
+      "step": 1040
+    },
+    {
+      "epoch": 2.775941837409121,
+      "grad_norm": 1.6351971626281738,
+      "learning_rate": 4.012345679012346e-05,
+      "loss": 1.7519,
+      "step": 1050
+    },
+    {
+      "epoch": 2.802379378717779,
+      "grad_norm": 1.971442461013794,
+      "learning_rate": 3.9976484420928865e-05,
+      "loss": 1.7377,
+      "step": 1060
+    },
+    {
+      "epoch": 2.8288169200264375,
+      "grad_norm": 1.8716827630996704,
+      "learning_rate": 3.9829512051734274e-05,
+      "loss": 1.7066,
+      "step": 1070
+    },
+    {
+      "epoch": 2.855254461335096,
+      "grad_norm": 1.6349095106124878,
+      "learning_rate": 3.968253968253968e-05,
+      "loss": 1.7,
+      "step": 1080
+    },
+    {
+      "epoch": 2.8816920026437542,
+      "grad_norm": 1.7663626670837402,
+      "learning_rate": 3.95355673133451e-05,
+      "loss": 1.6906,
+      "step": 1090
+    },
+    {
+      "epoch": 2.9081295439524126,
+      "grad_norm": 2.1080522537231445,
+      "learning_rate": 3.93885949441505e-05,
+      "loss": 1.7245,
+      "step": 1100
+    },
+    {
+      "epoch": 2.9345670852610706,
+      "grad_norm": 1.7439042329788208,
+      "learning_rate": 3.924162257495591e-05,
+      "loss": 1.6977,
+      "step": 1110
+    },
+    {
+      "epoch": 2.961004626569729,
+      "grad_norm": 1.6347275972366333,
+      "learning_rate": 3.909465020576132e-05,
+      "loss": 1.7225,
+      "step": 1120
+    },
+    {
+      "epoch": 2.9874421678783873,
+      "grad_norm": 2.7075507640838623,
+      "learning_rate": 3.894767783656673e-05,
+      "loss": 1.7123,
+      "step": 1130
+    },
+    {
+      "epoch": 2.9980171844018506,
+      "eval_accuracy": 0.30069161198794114,
+      "eval_loss": 1.6861532926559448,
+      "eval_runtime": 594.266,
+      "eval_samples_per_second": 47.445,
+      "eval_steps_per_second": 0.742,
+      "step": 1134
+    },
+    {
+      "epoch": 3.0138797091870457,
+      "grad_norm": 2.001418113708496,
+      "learning_rate": 3.8800705467372136e-05,
+      "loss": 1.7138,
+      "step": 1140
+    },
+    {
+      "epoch": 3.040317250495704,
+      "grad_norm": 1.5465753078460693,
+      "learning_rate": 3.8653733098177545e-05,
+      "loss": 1.6694,
+      "step": 1150
+    },
+    {
+      "epoch": 3.066754791804362,
+      "grad_norm": 2.0662121772766113,
+      "learning_rate": 3.8506760728982954e-05,
+      "loss": 1.6635,
+      "step": 1160
+    },
+    {
+      "epoch": 3.0931923331130204,
+      "grad_norm": 1.6004307270050049,
+      "learning_rate": 3.835978835978836e-05,
+      "loss": 1.6928,
+      "step": 1170
+    },
+    {
+      "epoch": 3.119629874421679,
+      "grad_norm": 2.330245018005371,
+      "learning_rate": 3.821281599059377e-05,
+      "loss": 1.6512,
+      "step": 1180
+    },
+    {
+      "epoch": 3.146067415730337,
+      "grad_norm": 2.421938896179199,
+      "learning_rate": 3.806584362139918e-05,
+      "loss": 1.6904,
+      "step": 1190
+    },
+    {
+      "epoch": 3.1725049570389956,
+      "grad_norm": 1.943880319595337,
+      "learning_rate": 3.791887125220458e-05,
+      "loss": 1.6807,
+      "step": 1200
+    },
+    {
+      "epoch": 3.1989424983476535,
+      "grad_norm": 1.9737601280212402,
+      "learning_rate": 3.777189888301e-05,
+      "loss": 1.6763,
+      "step": 1210
+    },
+    {
+      "epoch": 3.225380039656312,
+      "grad_norm": 1.5722044706344604,
+      "learning_rate": 3.762492651381541e-05,
+      "loss": 1.6449,
+      "step": 1220
+    },
+    {
+      "epoch": 3.2518175809649703,
+      "grad_norm": 1.8357861042022705,
+      "learning_rate": 3.7477954144620817e-05,
+      "loss": 1.6637,
+      "step": 1230
+    },
+    {
+      "epoch": 3.2782551222736287,
+      "grad_norm": 2.0803420543670654,
+      "learning_rate": 3.733098177542622e-05,
+      "loss": 1.6715,
+      "step": 1240
+    },
+    {
+      "epoch": 3.304692663582287,
+      "grad_norm": 1.9865127801895142,
+      "learning_rate": 3.718400940623163e-05,
+      "loss": 1.6841,
+      "step": 1250
+    },
+    {
+      "epoch": 3.331130204890945,
+      "grad_norm": 1.953385353088379,
+      "learning_rate": 3.7037037037037037e-05,
+      "loss": 1.6699,
+      "step": 1260
+    },
+    {
+      "epoch": 3.3575677461996034,
+      "grad_norm": 4.0031867027282715,
+      "learning_rate": 3.689006466784245e-05,
+      "loss": 1.6795,
+      "step": 1270
+    },
+    {
+      "epoch": 3.384005287508262,
+      "grad_norm": 2.2489538192749023,
+      "learning_rate": 3.6743092298647854e-05,
+      "loss": 1.6705,
+      "step": 1280
+    },
+    {
+      "epoch": 3.41044282881692,
+      "grad_norm": 1.7907918691635132,
+      "learning_rate": 3.659611992945326e-05,
+      "loss": 1.6754,
+      "step": 1290
+    },
+    {
+      "epoch": 3.4368803701255786,
+      "grad_norm": 1.7390344142913818,
+      "learning_rate": 3.644914756025867e-05,
+      "loss": 1.6621,
+      "step": 1300
+    },
+    {
+      "epoch": 3.4633179114342365,
+      "grad_norm": 1.6002901792526245,
+      "learning_rate": 3.630217519106408e-05,
+      "loss": 1.679,
+      "step": 1310
+    },
+    {
+      "epoch": 3.489755452742895,
+      "grad_norm": 1.9788098335266113,
+      "learning_rate": 3.615520282186949e-05,
+      "loss": 1.6633,
+      "step": 1320
+    },
+    {
+      "epoch": 3.5161929940515533,
+      "grad_norm": 2.1224050521850586,
+      "learning_rate": 3.60082304526749e-05,
+      "loss": 1.679,
+      "step": 1330
+    },
+    {
+      "epoch": 3.5426305353602117,
+      "grad_norm": 2.0570385456085205,
+      "learning_rate": 3.586125808348031e-05,
+      "loss": 1.6752,
+      "step": 1340
+    },
+    {
+      "epoch": 3.56906807666887,
+      "grad_norm": 2.158527374267578,
+      "learning_rate": 3.571428571428572e-05,
+      "loss": 1.6624,
+      "step": 1350
+    },
+    {
+      "epoch": 3.595505617977528,
+      "grad_norm": 2.440808057785034,
+      "learning_rate": 3.5567313345091126e-05,
+      "loss": 1.66,
+      "step": 1360
+    },
+    {
+      "epoch": 3.6219431592861864,
+      "grad_norm": 2.058363914489746,
+      "learning_rate": 3.5420340975896535e-05,
+      "loss": 1.6985,
+      "step": 1370
+    },
+    {
+      "epoch": 3.6483807005948448,
+      "grad_norm": 1.795248031616211,
+      "learning_rate": 3.527336860670194e-05,
+      "loss": 1.7037,
+      "step": 1380
+    },
+    {
+      "epoch": 3.6748182419035027,
+      "grad_norm": 1.7252378463745117,
+      "learning_rate": 3.512639623750735e-05,
+      "loss": 1.6633,
+      "step": 1390
+    },
+    {
+      "epoch": 3.7012557832121615,
+      "grad_norm": 1.8798094987869263,
+      "learning_rate": 3.497942386831276e-05,
+      "loss": 1.6881,
+      "step": 1400
+    },
+    {
+      "epoch": 3.7276933245208195,
+      "grad_norm": 2.317443609237671,
+      "learning_rate": 3.483245149911817e-05,
+      "loss": 1.6916,
+      "step": 1410
+    },
+    {
+      "epoch": 3.754130865829478,
+      "grad_norm": 2.2357330322265625,
+      "learning_rate": 3.468547912992357e-05,
+      "loss": 1.665,
+      "step": 1420
+    },
+    {
+      "epoch": 3.7805684071381362,
+      "grad_norm": 1.8255904912948608,
+      "learning_rate": 3.453850676072898e-05,
+      "loss": 1.6839,
+      "step": 1430
+    },
+    {
+      "epoch": 3.807005948446794,
+      "grad_norm": 1.95132315158844,
+      "learning_rate": 3.439153439153439e-05,
+      "loss": 1.6977,
1044
+ "step": 1440
1045
+ },
1046
+ {
1047
+ "epoch": 3.833443489755453,
1048
+ "grad_norm": 1.942299723625183,
1049
+ "learning_rate": 3.4244562022339806e-05,
1050
+ "loss": 1.6648,
1051
+ "step": 1450
1052
+ },
1053
+ {
1054
+ "epoch": 3.859881031064111,
1055
+ "grad_norm": 2.280670642852783,
1056
+ "learning_rate": 3.4097589653145215e-05,
1057
+ "loss": 1.6621,
1058
+ "step": 1460
1059
+ },
1060
+ {
1061
+ "epoch": 3.8863185723727693,
1062
+ "grad_norm": 2.0500681400299072,
1063
+ "learning_rate": 3.395061728395062e-05,
1064
+ "loss": 1.6395,
1065
+ "step": 1470
1066
+ },
1067
+ {
1068
+ "epoch": 3.9127561136814277,
1069
+ "grad_norm": 1.9235775470733643,
1070
+ "learning_rate": 3.3803644914756026e-05,
1071
+ "loss": 1.6907,
1072
+ "step": 1480
1073
+ },
1074
+ {
1075
+ "epoch": 3.9391936549900857,
1076
+ "grad_norm": 1.8194037675857544,
1077
+ "learning_rate": 3.3656672545561435e-05,
1078
+ "loss": 1.6793,
1079
+ "step": 1490
1080
+ },
1081
+ {
1082
+ "epoch": 3.965631196298744,
1083
+ "grad_norm": 2.047736167907715,
1084
+ "learning_rate": 3.3509700176366844e-05,
1085
+ "loss": 1.682,
1086
+ "step": 1500
1087
+ },
1088
+ {
1089
+ "epoch": 3.9920687376074024,
1090
+ "grad_norm": 2.2198452949523926,
1091
+ "learning_rate": 3.336272780717225e-05,
1092
+ "loss": 1.6607,
1093
+ "step": 1510
1094
+ },
1095
+ {
1096
+ "epoch": 4.0,
1097
+ "eval_accuracy": 0.3080688065259798,
1098
+ "eval_loss": 1.6822971105575562,
1099
+ "eval_runtime": 603.9647,
1100
+ "eval_samples_per_second": 46.683,
1101
+ "eval_steps_per_second": 0.73,
1102
+ "step": 1513
1103
+ },
1104
+ {
+ "epoch": 4.01850627891606,
+ "grad_norm": 1.416684627532959,
+ "learning_rate": 3.321575543797766e-05,
+ "loss": 1.6447,
+ "step": 1520
+ },
+ {
+ "epoch": 4.044943820224719,
+ "grad_norm": 2.129450798034668,
+ "learning_rate": 3.306878306878307e-05,
+ "loss": 1.6303,
+ "step": 1530
+ },
+ {
+ "epoch": 4.071381361533377,
+ "grad_norm": 1.7213213443756104,
+ "learning_rate": 3.292181069958848e-05,
+ "loss": 1.6126,
+ "step": 1540
+ },
+ {
+ "epoch": 4.097818902842036,
+ "grad_norm": 2.0465450286865234,
+ "learning_rate": 3.277483833039389e-05,
+ "loss": 1.6351,
+ "step": 1550
+ },
+ {
+ "epoch": 4.124256444150694,
+ "grad_norm": 2.259282350540161,
+ "learning_rate": 3.262786596119929e-05,
+ "loss": 1.6437,
+ "step": 1560
+ },
+ {
+ "epoch": 4.150693985459352,
+ "grad_norm": 2.2626054286956787,
+ "learning_rate": 3.24808935920047e-05,
+ "loss": 1.6523,
+ "step": 1570
+ },
+ {
+ "epoch": 4.177131526768011,
+ "grad_norm": 2.269216537475586,
+ "learning_rate": 3.2333921222810116e-05,
+ "loss": 1.6369,
+ "step": 1580
+ },
+ {
+ "epoch": 4.203569068076669,
+ "grad_norm": 2.1457483768463135,
+ "learning_rate": 3.2186948853615525e-05,
+ "loss": 1.6211,
+ "step": 1590
+ },
+ {
+ "epoch": 4.2300066093853275,
+ "grad_norm": 1.9659942388534546,
+ "learning_rate": 3.2039976484420934e-05,
+ "loss": 1.639,
+ "step": 1600
+ },
+ {
+ "epoch": 4.256444150693985,
+ "grad_norm": 2.447136402130127,
+ "learning_rate": 3.1893004115226336e-05,
+ "loss": 1.6548,
+ "step": 1610
+ },
+ {
+ "epoch": 4.282881692002643,
+ "grad_norm": 2.2011499404907227,
+ "learning_rate": 3.1746031746031745e-05,
+ "loss": 1.6042,
+ "step": 1620
+ },
+ {
+ "epoch": 4.309319233311302,
+ "grad_norm": 1.9171946048736572,
+ "learning_rate": 3.1599059376837154e-05,
+ "loss": 1.6273,
+ "step": 1630
+ },
+ {
+ "epoch": 4.33575677461996,
+ "grad_norm": 1.9522180557250977,
+ "learning_rate": 3.145208700764257e-05,
+ "loss": 1.6479,
+ "step": 1640
+ },
+ {
+ "epoch": 4.362194315928619,
+ "grad_norm": 1.8249017000198364,
+ "learning_rate": 3.130511463844797e-05,
+ "loss": 1.633,
+ "step": 1650
+ },
+ {
+ "epoch": 4.388631857237277,
+ "grad_norm": 2.093296766281128,
+ "learning_rate": 3.115814226925338e-05,
+ "loss": 1.6519,
+ "step": 1660
+ },
+ {
+ "epoch": 4.415069398545935,
+ "grad_norm": 2.218247652053833,
+ "learning_rate": 3.101116990005879e-05,
+ "loss": 1.6478,
+ "step": 1670
+ },
+ {
+ "epoch": 4.441506939854594,
+ "grad_norm": 1.9361261129379272,
+ "learning_rate": 3.08641975308642e-05,
+ "loss": 1.6473,
+ "step": 1680
+ },
+ {
+ "epoch": 4.467944481163252,
+ "grad_norm": 1.9687649011611938,
+ "learning_rate": 3.071722516166961e-05,
+ "loss": 1.6263,
+ "step": 1690
+ },
+ {
+ "epoch": 4.49438202247191,
+ "grad_norm": 1.8882747888565063,
+ "learning_rate": 3.0570252792475016e-05,
+ "loss": 1.6531,
+ "step": 1700
+ },
+ {
+ "epoch": 4.520819563780568,
+ "grad_norm": 2.0464086532592773,
+ "learning_rate": 3.0423280423280425e-05,
+ "loss": 1.6481,
+ "step": 1710
+ },
+ {
+ "epoch": 4.547257105089226,
+ "grad_norm": 2.1452088356018066,
+ "learning_rate": 3.0276308054085834e-05,
+ "loss": 1.6126,
+ "step": 1720
+ },
+ {
+ "epoch": 4.573694646397885,
+ "grad_norm": 1.7108988761901855,
+ "learning_rate": 3.012933568489124e-05,
+ "loss": 1.6365,
+ "step": 1730
+ },
+ {
+ "epoch": 4.600132187706543,
+ "grad_norm": 1.7739413976669312,
+ "learning_rate": 2.998236331569665e-05,
+ "loss": 1.6373,
+ "step": 1740
+ },
+ {
+ "epoch": 4.626569729015202,
+ "grad_norm": 1.9384492635726929,
+ "learning_rate": 2.9835390946502057e-05,
+ "loss": 1.6212,
+ "step": 1750
+ },
+ {
+ "epoch": 4.65300727032386,
+ "grad_norm": 2.0029444694519043,
+ "learning_rate": 2.968841857730747e-05,
+ "loss": 1.6185,
+ "step": 1760
+ },
+ {
+ "epoch": 4.679444811632518,
+ "grad_norm": 2.2173917293548584,
+ "learning_rate": 2.954144620811288e-05,
+ "loss": 1.6441,
+ "step": 1770
+ },
+ {
+ "epoch": 4.705882352941177,
+ "grad_norm": 2.4079232215881348,
+ "learning_rate": 2.9394473838918284e-05,
+ "loss": 1.6441,
+ "step": 1780
+ },
+ {
+ "epoch": 4.732319894249835,
+ "grad_norm": 2.1316912174224854,
+ "learning_rate": 2.9247501469723693e-05,
+ "loss": 1.6464,
+ "step": 1790
+ },
+ {
+ "epoch": 4.758757435558493,
+ "grad_norm": 2.2552201747894287,
+ "learning_rate": 2.91005291005291e-05,
+ "loss": 1.6321,
+ "step": 1800
+ },
+ {
+ "epoch": 4.785194976867151,
+ "grad_norm": 2.0181891918182373,
+ "learning_rate": 2.8953556731334508e-05,
+ "loss": 1.6584,
+ "step": 1810
+ },
+ {
+ "epoch": 4.811632518175809,
+ "grad_norm": 1.912116527557373,
+ "learning_rate": 2.880658436213992e-05,
+ "loss": 1.6618,
+ "step": 1820
+ },
+ {
+ "epoch": 4.838070059484468,
+ "grad_norm": 1.916869878768921,
+ "learning_rate": 2.865961199294533e-05,
+ "loss": 1.6719,
+ "step": 1830
+ },
+ {
+ "epoch": 4.864507600793126,
+ "grad_norm": 2.0317203998565674,
+ "learning_rate": 2.8512639623750738e-05,
+ "loss": 1.6258,
+ "step": 1840
+ },
+ {
+ "epoch": 4.890945142101785,
+ "grad_norm": 2.080411195755005,
+ "learning_rate": 2.8365667254556143e-05,
+ "loss": 1.6518,
+ "step": 1850
+ },
+ {
+ "epoch": 4.917382683410443,
+ "grad_norm": 1.9478580951690674,
+ "learning_rate": 2.8218694885361552e-05,
+ "loss": 1.614,
+ "step": 1860
+ },
+ {
+ "epoch": 4.943820224719101,
+ "grad_norm": 1.889710545539856,
+ "learning_rate": 2.8071722516166958e-05,
+ "loss": 1.6221,
+ "step": 1870
+ },
+ {
+ "epoch": 4.97025776602776,
+ "grad_norm": 2.255192756652832,
+ "learning_rate": 2.7924750146972374e-05,
+ "loss": 1.6271,
+ "step": 1880
+ },
+ {
+ "epoch": 4.9966953073364175,
+ "grad_norm": 2.026352643966675,
+ "learning_rate": 2.777777777777778e-05,
+ "loss": 1.6188,
+ "step": 1890
+ },
+ {
+ "epoch": 4.999339061467284,
+ "eval_accuracy": 0.3107997871963114,
+ "eval_loss": 1.6907494068145752,
+ "eval_runtime": 587.6675,
+ "eval_samples_per_second": 47.978,
+ "eval_steps_per_second": 0.75,
+ "step": 1891
+ },
1379
+ {
+ "epoch": 5.023132848645076,
+ "grad_norm": 1.7152390480041504,
+ "learning_rate": 2.7630805408583188e-05,
+ "loss": 1.6156,
+ "step": 1900
+ },
+ {
+ "epoch": 5.049570389953734,
+ "grad_norm": 2.2232937812805176,
+ "learning_rate": 2.7483833039388597e-05,
+ "loss": 1.618,
+ "step": 1910
+ },
+ {
+ "epoch": 5.076007931262392,
+ "grad_norm": 1.9228154420852661,
+ "learning_rate": 2.7336860670194003e-05,
+ "loss": 1.6137,
+ "step": 1920
+ },
+ {
+ "epoch": 5.102445472571051,
+ "grad_norm": 1.9937798976898193,
+ "learning_rate": 2.718988830099941e-05,
+ "loss": 1.6139,
+ "step": 1930
+ },
+ {
+ "epoch": 5.128883013879709,
+ "grad_norm": 2.4573240280151367,
+ "learning_rate": 2.7042915931804824e-05,
+ "loss": 1.5976,
+ "step": 1940
+ },
+ {
+ "epoch": 5.155320555188368,
+ "grad_norm": 2.107956647872925,
+ "learning_rate": 2.6895943562610233e-05,
+ "loss": 1.6046,
+ "step": 1950
+ },
+ {
+ "epoch": 5.181758096497026,
+ "grad_norm": 1.9604800939559937,
+ "learning_rate": 2.6748971193415638e-05,
+ "loss": 1.6001,
+ "step": 1960
+ },
+ {
+ "epoch": 5.208195637805684,
+ "grad_norm": 2.3065364360809326,
+ "learning_rate": 2.6601998824221047e-05,
+ "loss": 1.5818,
+ "step": 1970
+ },
+ {
+ "epoch": 5.234633179114343,
+ "grad_norm": 2.3068606853485107,
+ "learning_rate": 2.6455026455026456e-05,
+ "loss": 1.629,
+ "step": 1980
+ },
+ {
+ "epoch": 5.2610707204230005,
+ "grad_norm": 1.9791010618209839,
+ "learning_rate": 2.630805408583186e-05,
+ "loss": 1.5831,
+ "step": 1990
+ },
+ {
+ "epoch": 5.287508261731659,
+ "grad_norm": 2.128002643585205,
+ "learning_rate": 2.6161081716637274e-05,
+ "loss": 1.6208,
+ "step": 2000
+ },
+ {
+ "epoch": 5.313945803040317,
+ "grad_norm": 1.999395728111267,
+ "learning_rate": 2.6014109347442683e-05,
+ "loss": 1.5791,
+ "step": 2010
+ },
+ {
+ "epoch": 5.340383344348975,
+ "grad_norm": 2.4271738529205322,
+ "learning_rate": 2.5867136978248092e-05,
+ "loss": 1.6145,
+ "step": 2020
+ },
+ {
+ "epoch": 5.366820885657634,
+ "grad_norm": 2.0685036182403564,
+ "learning_rate": 2.5720164609053497e-05,
+ "loss": 1.6013,
+ "step": 2030
+ },
+ {
+ "epoch": 5.393258426966292,
+ "grad_norm": 2.3582699298858643,
+ "learning_rate": 2.5573192239858906e-05,
+ "loss": 1.5878,
+ "step": 2040
+ },
+ {
+ "epoch": 5.419695968274951,
+ "grad_norm": 2.0049946308135986,
+ "learning_rate": 2.5426219870664315e-05,
+ "loss": 1.6114,
+ "step": 2050
+ },
+ {
+ "epoch": 5.446133509583609,
+ "grad_norm": 1.902218222618103,
+ "learning_rate": 2.5279247501469728e-05,
+ "loss": 1.5934,
+ "step": 2060
+ },
+ {
+ "epoch": 5.472571050892267,
+ "grad_norm": 2.1240365505218506,
+ "learning_rate": 2.5132275132275137e-05,
+ "loss": 1.561,
+ "step": 2070
+ },
+ {
+ "epoch": 5.4990085922009255,
+ "grad_norm": 2.0185890197753906,
+ "learning_rate": 2.4985302763080542e-05,
+ "loss": 1.5912,
+ "step": 2080
+ },
+ {
+ "epoch": 5.5254461335095835,
+ "grad_norm": 2.5916292667388916,
+ "learning_rate": 2.483833039388595e-05,
+ "loss": 1.6176,
+ "step": 2090
+ },
+ {
+ "epoch": 5.551883674818242,
+ "grad_norm": 2.4469070434570312,
+ "learning_rate": 2.4691358024691357e-05,
+ "loss": 1.6373,
+ "step": 2100
+ },
+ {
+ "epoch": 5.5783212161269,
+ "grad_norm": 1.9966007471084595,
+ "learning_rate": 2.454438565549677e-05,
+ "loss": 1.6049,
+ "step": 2110
+ },
+ {
+ "epoch": 5.604758757435558,
+ "grad_norm": 2.5291779041290283,
+ "learning_rate": 2.4397413286302174e-05,
+ "loss": 1.595,
+ "step": 2120
+ },
+ {
+ "epoch": 5.631196298744217,
+ "grad_norm": 2.031622886657715,
+ "learning_rate": 2.4250440917107583e-05,
+ "loss": 1.6069,
+ "step": 2130
+ },
+ {
+ "epoch": 5.657633840052875,
+ "grad_norm": 1.9951566457748413,
+ "learning_rate": 2.4103468547912996e-05,
+ "loss": 1.583,
+ "step": 2140
+ },
+ {
+ "epoch": 5.684071381361534,
+ "grad_norm": 2.609140634536743,
+ "learning_rate": 2.39564961787184e-05,
+ "loss": 1.5821,
+ "step": 2150
+ },
+ {
+ "epoch": 5.710508922670192,
+ "grad_norm": 2.389698028564453,
+ "learning_rate": 2.380952380952381e-05,
+ "loss": 1.6114,
+ "step": 2160
+ },
+ {
+ "epoch": 5.73694646397885,
+ "grad_norm": 2.4471077919006348,
+ "learning_rate": 2.366255144032922e-05,
+ "loss": 1.5919,
+ "step": 2170
+ },
+ {
+ "epoch": 5.7633840052875085,
+ "grad_norm": 2.0070338249206543,
+ "learning_rate": 2.3515579071134628e-05,
+ "loss": 1.5963,
+ "step": 2180
+ },
+ {
+ "epoch": 5.789821546596166,
+ "grad_norm": 2.1658718585968018,
+ "learning_rate": 2.3368606701940034e-05,
+ "loss": 1.6038,
+ "step": 2190
+ },
+ {
+ "epoch": 5.816259087904825,
+ "grad_norm": 2.08571457862854,
+ "learning_rate": 2.3221634332745446e-05,
+ "loss": 1.5962,
+ "step": 2200
+ },
+ {
+ "epoch": 5.842696629213483,
+ "grad_norm": 2.118481397628784,
+ "learning_rate": 2.3074661963550855e-05,
+ "loss": 1.5945,
+ "step": 2210
+ },
+ {
+ "epoch": 5.869134170522141,
+ "grad_norm": 2.135956048965454,
+ "learning_rate": 2.292768959435626e-05,
+ "loss": 1.5785,
+ "step": 2220
+ },
+ {
+ "epoch": 5.8955717118308,
+ "grad_norm": 2.1876909732818604,
+ "learning_rate": 2.2780717225161673e-05,
+ "loss": 1.5769,
+ "step": 2230
+ },
+ {
+ "epoch": 5.922009253139458,
+ "grad_norm": 2.121159315109253,
+ "learning_rate": 2.2633744855967078e-05,
+ "loss": 1.5869,
+ "step": 2240
+ },
+ {
+ "epoch": 5.948446794448117,
+ "grad_norm": 2.0962424278259277,
+ "learning_rate": 2.2486772486772487e-05,
+ "loss": 1.6203,
+ "step": 2250
+ },
+ {
+ "epoch": 5.974884335756775,
+ "grad_norm": 1.9238280057907104,
+ "learning_rate": 2.2339800117577896e-05,
+ "loss": 1.6009,
+ "step": 2260
+ },
+ {
+ "epoch": 5.998678122934567,
+ "eval_accuracy": 0.3150203936868239,
+ "eval_loss": 1.677284598350525,
+ "eval_runtime": 825.1974,
+ "eval_samples_per_second": 34.168,
+ "eval_steps_per_second": 0.534,
+ "step": 2269
+ },
1647
+ {
+ "epoch": 6.001321877065433,
+ "grad_norm": 3.6250228881835938,
+ "learning_rate": 2.2192827748383305e-05,
+ "loss": 1.6085,
+ "step": 2270
+ },
+ {
+ "epoch": 6.0277594183740915,
+ "grad_norm": 2.6421561241149902,
+ "learning_rate": 2.2045855379188714e-05,
+ "loss": 1.5612,
+ "step": 2280
+ },
+ {
+ "epoch": 6.054196959682749,
+ "grad_norm": 1.9773502349853516,
+ "learning_rate": 2.1898883009994123e-05,
+ "loss": 1.558,
+ "step": 2290
+ },
+ {
+ "epoch": 6.080634500991408,
+ "grad_norm": 2.200873374938965,
+ "learning_rate": 2.1751910640799532e-05,
+ "loss": 1.5701,
+ "step": 2300
+ },
+ {
+ "epoch": 6.107072042300066,
+ "grad_norm": 2.346799373626709,
+ "learning_rate": 2.1604938271604937e-05,
+ "loss": 1.5605,
+ "step": 2310
+ },
+ {
+ "epoch": 6.133509583608724,
+ "grad_norm": 2.3931312561035156,
+ "learning_rate": 2.145796590241035e-05,
+ "loss": 1.5498,
+ "step": 2320
+ },
+ {
+ "epoch": 6.159947124917383,
+ "grad_norm": 2.662827968597412,
+ "learning_rate": 2.1310993533215755e-05,
+ "loss": 1.5782,
+ "step": 2330
+ },
+ {
+ "epoch": 6.186384666226041,
+ "grad_norm": 2.2345123291015625,
+ "learning_rate": 2.1164021164021164e-05,
+ "loss": 1.5672,
+ "step": 2340
+ },
+ {
+ "epoch": 6.2128222075347,
+ "grad_norm": 2.2579140663146973,
+ "learning_rate": 2.1017048794826573e-05,
+ "loss": 1.5323,
+ "step": 2350
+ },
+ {
+ "epoch": 6.239259748843358,
+ "grad_norm": 2.2183802127838135,
+ "learning_rate": 2.0870076425631982e-05,
+ "loss": 1.5499,
+ "step": 2360
+ },
+ {
+ "epoch": 6.265697290152016,
+ "grad_norm": 2.227612257003784,
+ "learning_rate": 2.072310405643739e-05,
+ "loss": 1.5522,
+ "step": 2370
+ },
+ {
+ "epoch": 6.292134831460674,
+ "grad_norm": 2.7729430198669434,
+ "learning_rate": 2.05761316872428e-05,
+ "loss": 1.5614,
+ "step": 2380
+ },
+ {
+ "epoch": 6.318572372769332,
+ "grad_norm": 2.166227340698242,
+ "learning_rate": 2.042915931804821e-05,
+ "loss": 1.5358,
+ "step": 2390
+ },
+ {
+ "epoch": 6.345009914077991,
+ "grad_norm": 2.1141440868377686,
+ "learning_rate": 2.0282186948853614e-05,
+ "loss": 1.554,
+ "step": 2400
+ },
+ {
+ "epoch": 6.371447455386649,
+ "grad_norm": 2.1744093894958496,
+ "learning_rate": 2.0135214579659027e-05,
+ "loss": 1.5583,
+ "step": 2410
+ },
+ {
+ "epoch": 6.397884996695307,
+ "grad_norm": 2.0957577228546143,
+ "learning_rate": 1.9988242210464432e-05,
+ "loss": 1.5573,
+ "step": 2420
+ },
+ {
+ "epoch": 6.424322538003966,
+ "grad_norm": 2.6140244007110596,
+ "learning_rate": 1.984126984126984e-05,
+ "loss": 1.5597,
+ "step": 2430
+ },
+ {
+ "epoch": 6.450760079312624,
+ "grad_norm": 2.0459976196289062,
+ "learning_rate": 1.969429747207525e-05,
+ "loss": 1.5439,
+ "step": 2440
+ },
+ {
+ "epoch": 6.477197620621283,
+ "grad_norm": 2.2509853839874268,
+ "learning_rate": 1.954732510288066e-05,
+ "loss": 1.5478,
+ "step": 2450
+ },
+ {
+ "epoch": 6.503635161929941,
+ "grad_norm": 2.0103518962860107,
+ "learning_rate": 1.9400352733686068e-05,
+ "loss": 1.555,
+ "step": 2460
+ },
+ {
+ "epoch": 6.530072703238599,
+ "grad_norm": 2.5956430435180664,
+ "learning_rate": 1.9253380364491477e-05,
+ "loss": 1.5604,
+ "step": 2470
+ },
+ {
+ "epoch": 6.556510244547257,
+ "grad_norm": 2.4196279048919678,
+ "learning_rate": 1.9106407995296886e-05,
+ "loss": 1.554,
+ "step": 2480
+ },
+ {
+ "epoch": 6.582947785855915,
+ "grad_norm": 2.2963523864746094,
+ "learning_rate": 1.895943562610229e-05,
+ "loss": 1.5345,
+ "step": 2490
+ },
+ {
+ "epoch": 6.609385327164574,
+ "grad_norm": 2.4753236770629883,
+ "learning_rate": 1.8812463256907704e-05,
+ "loss": 1.5698,
+ "step": 2500
+ },
+ {
+ "epoch": 6.635822868473232,
+ "grad_norm": 2.251317024230957,
+ "learning_rate": 1.866549088771311e-05,
+ "loss": 1.5625,
+ "step": 2510
+ },
+ {
+ "epoch": 6.66226040978189,
+ "grad_norm": 2.4385430812835693,
+ "learning_rate": 1.8518518518518518e-05,
+ "loss": 1.5649,
+ "step": 2520
+ },
+ {
+ "epoch": 6.688697951090549,
+ "grad_norm": 2.1294140815734863,
+ "learning_rate": 1.8371546149323927e-05,
+ "loss": 1.5214,
+ "step": 2530
+ },
+ {
+ "epoch": 6.715135492399207,
+ "grad_norm": 2.234025478363037,
+ "learning_rate": 1.8224573780129336e-05,
+ "loss": 1.5928,
+ "step": 2540
+ },
+ {
+ "epoch": 6.741573033707866,
+ "grad_norm": 2.6647934913635254,
+ "learning_rate": 1.8077601410934745e-05,
+ "loss": 1.5519,
+ "step": 2550
+ },
+ {
+ "epoch": 6.768010575016524,
+ "grad_norm": 2.2540123462677,
+ "learning_rate": 1.7930629041740154e-05,
+ "loss": 1.5642,
+ "step": 2560
+ },
+ {
+ "epoch": 6.7944481163251815,
+ "grad_norm": 2.701176404953003,
+ "learning_rate": 1.7783656672545563e-05,
+ "loss": 1.5733,
+ "step": 2570
+ },
+ {
+ "epoch": 6.82088565763384,
+ "grad_norm": 2.6911590099334717,
+ "learning_rate": 1.763668430335097e-05,
+ "loss": 1.5651,
+ "step": 2580
+ },
+ {
+ "epoch": 6.847323198942498,
+ "grad_norm": 2.3102941513061523,
+ "learning_rate": 1.748971193415638e-05,
+ "loss": 1.5716,
+ "step": 2590
+ },
+ {
+ "epoch": 6.873760740251157,
+ "grad_norm": 2.4161031246185303,
+ "learning_rate": 1.7342739564961786e-05,
+ "loss": 1.5558,
+ "step": 2600
+ },
+ {
+ "epoch": 6.900198281559815,
+ "grad_norm": 2.4778518676757812,
+ "learning_rate": 1.7195767195767195e-05,
+ "loss": 1.5766,
+ "step": 2610
+ },
+ {
+ "epoch": 6.926635822868473,
+ "grad_norm": 2.188422918319702,
+ "learning_rate": 1.7048794826572608e-05,
+ "loss": 1.5593,
+ "step": 2620
+ },
+ {
+ "epoch": 6.953073364177132,
+ "grad_norm": 2.146476984024048,
+ "learning_rate": 1.6901822457378013e-05,
+ "loss": 1.6021,
+ "step": 2630
+ },
+ {
+ "epoch": 6.97951090548579,
+ "grad_norm": 2.322460174560547,
+ "learning_rate": 1.6754850088183422e-05,
+ "loss": 1.5485,
+ "step": 2640
+ },
+ {
+ "epoch": 6.998017184401851,
+ "eval_accuracy": 0.31984394396169535,
+ "eval_loss": 1.6719752550125122,
+ "eval_runtime": 687.5385,
+ "eval_samples_per_second": 41.009,
+ "eval_steps_per_second": 0.641,
+ "step": 2647
+ },
1922
+ {
+ "epoch": 7.005948446794448,
+ "grad_norm": 2.46777606010437,
+ "learning_rate": 1.660787771898883e-05,
+ "loss": 1.5634,
+ "step": 2650
+ },
+ {
+ "epoch": 7.0323859881031066,
+ "grad_norm": 2.2213032245635986,
+ "learning_rate": 1.646090534979424e-05,
+ "loss": 1.5148,
+ "step": 2660
+ },
+ {
+ "epoch": 7.0588235294117645,
+ "grad_norm": 2.3514413833618164,
+ "learning_rate": 1.6313932980599646e-05,
+ "loss": 1.4979,
+ "step": 2670
+ },
+ {
+ "epoch": 7.085261070720423,
+ "grad_norm": 2.6042823791503906,
+ "learning_rate": 1.6166960611405058e-05,
+ "loss": 1.5066,
+ "step": 2680
+ },
+ {
+ "epoch": 7.111698612029081,
+ "grad_norm": 2.6137919425964355,
+ "learning_rate": 1.6019988242210467e-05,
+ "loss": 1.5394,
+ "step": 2690
+ },
+ {
+ "epoch": 7.138136153337739,
+ "grad_norm": 2.8564794063568115,
+ "learning_rate": 1.5873015873015872e-05,
+ "loss": 1.5452,
+ "step": 2700
+ },
+ {
+ "epoch": 7.164573694646398,
+ "grad_norm": 2.685819149017334,
+ "learning_rate": 1.5726043503821285e-05,
+ "loss": 1.5514,
+ "step": 2710
+ },
+ {
+ "epoch": 7.191011235955056,
+ "grad_norm": 2.6131200790405273,
+ "learning_rate": 1.557907113462669e-05,
+ "loss": 1.5208,
+ "step": 2720
+ },
+ {
+ "epoch": 7.217448777263715,
+ "grad_norm": 2.341998338699341,
+ "learning_rate": 1.54320987654321e-05,
+ "loss": 1.525,
+ "step": 2730
+ },
+ {
+ "epoch": 7.243886318572373,
+ "grad_norm": 2.3170387744903564,
+ "learning_rate": 1.5285126396237508e-05,
+ "loss": 1.5363,
+ "step": 2740
+ },
+ {
+ "epoch": 7.270323859881031,
+ "grad_norm": 2.702817440032959,
+ "learning_rate": 1.5138154027042917e-05,
+ "loss": 1.5114,
+ "step": 2750
+ },
+ {
+ "epoch": 7.2967614011896895,
+ "grad_norm": 2.3331921100616455,
+ "learning_rate": 1.4991181657848324e-05,
+ "loss": 1.5078,
+ "step": 2760
+ },
+ {
+ "epoch": 7.3231989424983475,
+ "grad_norm": 3.1344053745269775,
+ "learning_rate": 1.4844209288653735e-05,
+ "loss": 1.5251,
+ "step": 2770
+ },
+ {
+ "epoch": 7.349636483807006,
+ "grad_norm": 2.3613967895507812,
+ "learning_rate": 1.4697236919459142e-05,
+ "loss": 1.5444,
+ "step": 2780
+ },
+ {
+ "epoch": 7.376074025115664,
+ "grad_norm": 2.8878893852233887,
+ "learning_rate": 1.455026455026455e-05,
+ "loss": 1.5078,
+ "step": 2790
+ },
+ {
+ "epoch": 7.402511566424322,
+ "grad_norm": 2.3623602390289307,
+ "learning_rate": 1.440329218106996e-05,
+ "loss": 1.5497,
+ "step": 2800
+ },
+ {
+ "epoch": 7.428949107732981,
+ "grad_norm": 2.708502769470215,
+ "learning_rate": 1.4256319811875369e-05,
+ "loss": 1.5249,
+ "step": 2810
+ },
+ {
+ "epoch": 7.455386649041639,
+ "grad_norm": 2.521937847137451,
+ "learning_rate": 1.4109347442680776e-05,
+ "loss": 1.5556,
+ "step": 2820
+ },
+ {
+ "epoch": 7.481824190350298,
+ "grad_norm": 2.3380672931671143,
+ "learning_rate": 1.3962375073486187e-05,
+ "loss": 1.5295,
+ "step": 2830
+ },
+ {
+ "epoch": 7.508261731658956,
+ "grad_norm": 2.609221935272217,
+ "learning_rate": 1.3815402704291594e-05,
+ "loss": 1.5173,
+ "step": 2840
+ },
+ {
+ "epoch": 7.534699272967614,
+ "grad_norm": 2.692725658416748,
+ "learning_rate": 1.3668430335097001e-05,
+ "loss": 1.5256,
+ "step": 2850
+ },
+ {
+ "epoch": 7.5611368142762725,
+ "grad_norm": 2.8412363529205322,
+ "learning_rate": 1.3521457965902412e-05,
+ "loss": 1.5213,
+ "step": 2860
+ },
+ {
+ "epoch": 7.58757435558493,
+ "grad_norm": 2.349609375,
+ "learning_rate": 1.3374485596707819e-05,
+ "loss": 1.5245,
+ "step": 2870
+ },
+ {
+ "epoch": 7.614011896893589,
+ "grad_norm": 2.3491156101226807,
+ "learning_rate": 1.3227513227513228e-05,
+ "loss": 1.5397,
+ "step": 2880
+ },
+ {
+ "epoch": 7.640449438202247,
+ "grad_norm": 2.2094242572784424,
+ "learning_rate": 1.3080540858318637e-05,
+ "loss": 1.5225,
+ "step": 2890
+ },
+ {
+ "epoch": 7.666886979510905,
+ "grad_norm": 2.7071151733398438,
+ "learning_rate": 1.2933568489124046e-05,
+ "loss": 1.5397,
+ "step": 2900
+ },
+ {
+ "epoch": 7.693324520819564,
+ "grad_norm": 2.4890148639678955,
+ "learning_rate": 1.2786596119929453e-05,
+ "loss": 1.5188,
+ "step": 2910
+ },
+ {
+ "epoch": 7.719762062128222,
+ "grad_norm": 2.607391595840454,
+ "learning_rate": 1.2639623750734864e-05,
+ "loss": 1.5243,
+ "step": 2920
+ },
+ {
+ "epoch": 7.746199603436881,
+ "grad_norm": 2.255218505859375,
+ "learning_rate": 1.2492651381540271e-05,
+ "loss": 1.4923,
+ "step": 2930
+ },
+ {
+ "epoch": 7.772637144745539,
+ "grad_norm": 2.095276355743408,
+ "learning_rate": 1.2345679012345678e-05,
+ "loss": 1.5154,
+ "step": 2940
+ },
+ {
+ "epoch": 7.799074686054197,
+ "grad_norm": 2.3163740634918213,
+ "learning_rate": 1.2198706643151087e-05,
+ "loss": 1.5273,
+ "step": 2950
+ },
+ {
+ "epoch": 7.8255122273628555,
+ "grad_norm": 2.356632709503174,
+ "learning_rate": 1.2051734273956498e-05,
+ "loss": 1.5434,
+ "step": 2960
+ },
+ {
+ "epoch": 7.851949768671513,
+ "grad_norm": 2.6374380588531494,
+ "learning_rate": 1.1904761904761905e-05,
+ "loss": 1.5443,
+ "step": 2970
+ },
+ {
+ "epoch": 7.878387309980172,
+ "grad_norm": 2.324978828430176,
+ "learning_rate": 1.1757789535567314e-05,
+ "loss": 1.5277,
+ "step": 2980
+ },
+ {
+ "epoch": 7.90482485128883,
+ "grad_norm": 2.326624870300293,
+ "learning_rate": 1.1610817166372723e-05,
+ "loss": 1.5392,
+ "step": 2990
+ },
+ {
+ "epoch": 7.931262392597488,
+ "grad_norm": 2.3055641651153564,
+ "learning_rate": 1.146384479717813e-05,
+ "loss": 1.5304,
+ "step": 3000
+ },
+ {
+ "epoch": 7.957699933906147,
+ "grad_norm": 2.652165174484253,
+ "learning_rate": 1.1316872427983539e-05,
+ "loss": 1.4925,
+ "step": 3010
+ },
+ {
+ "epoch": 7.984137475214805,
+ "grad_norm": 2.6190438270568848,
+ "learning_rate": 1.1169900058788948e-05,
+ "loss": 1.5133,
+ "step": 3020
+ },
+ {
+ "epoch": 8.0,
+ "eval_accuracy": 0.3199148785245611,
+ "eval_loss": 1.6810516119003296,
+ "eval_runtime": 655.8832,
+ "eval_samples_per_second": 42.988,
+ "eval_steps_per_second": 0.672,
+ "step": 3026
+ },
+ {
+ "epoch": 8.010575016523463,
+ "grad_norm": 2.4034554958343506,
+ "learning_rate": 1.1022927689594357e-05,
+ "loss": 1.4946,
+ "step": 3030
+ },
+ {
+ "epoch": 8.03701255783212,
+ "grad_norm": 2.3963074684143066,
+ "learning_rate": 1.0875955320399766e-05,
+ "loss": 1.4916,
+ "step": 3040
+ },
+ {
+ "epoch": 8.06345009914078,
+ "grad_norm": 2.228837728500366,
+ "learning_rate": 1.0728982951205175e-05,
+ "loss": 1.4966,
+ "step": 3050
+ },
+ {
+ "epoch": 8.089887640449438,
+ "grad_norm": 2.434828042984009,
+ "learning_rate": 1.0582010582010582e-05,
+ "loss": 1.476,
+ "step": 3060
+ },
+ {
+ "epoch": 8.116325181758096,
+ "grad_norm": 3.0598857402801514,
+ "learning_rate": 1.0435038212815991e-05,
+ "loss": 1.509,
+ "step": 3070
+ },
+ {
+ "epoch": 8.142762723066754,
+ "grad_norm": 2.3611867427825928,
+ "learning_rate": 1.02880658436214e-05,
+ "loss": 1.5034,
+ "step": 3080
+ },
+ {
+ "epoch": 8.169200264375412,
+ "grad_norm": 2.5760085582733154,
+ "learning_rate": 1.0141093474426807e-05,
+ "loss": 1.4722,
+ "step": 3090
+ },
+ {
+ "epoch": 8.195637805684072,
+ "grad_norm": 2.364185333251953,
+ "learning_rate": 9.994121105232216e-06,
+ "loss": 1.503,
+ "step": 3100
+ },
+ {
+ "epoch": 8.22207534699273,
+ "grad_norm": 2.9226784706115723,
+ "learning_rate": 9.847148736037625e-06,
+ "loss": 1.5095,
+ "step": 3110
+ },
+ {
+ "epoch": 8.248512888301388,
+ "grad_norm": 2.599740505218506,
+ "learning_rate": 9.700176366843034e-06,
+ "loss": 1.5113,
+ "step": 3120
+ },
+ {
+ "epoch": 8.274950429610046,
+ "grad_norm": 2.4345662593841553,
+ "learning_rate": 9.553203997648443e-06,
+ "loss": 1.4932,
+ "step": 3130
+ },
+ {
+ "epoch": 8.301387970918704,
+ "grad_norm": 2.821202516555786,
+ "learning_rate": 9.406231628453852e-06,
+ "loss": 1.4873,
+ "step": 3140
+ },
+ {
+ "epoch": 8.327825512227363,
+ "grad_norm": 2.3733325004577637,
+ "learning_rate": 9.259259259259259e-06,
+ "loss": 1.4714,
+ "step": 3150
+ },
+ {
+ "epoch": 8.354263053536021,
+ "grad_norm": 2.5024616718292236,
+ "learning_rate": 9.112286890064668e-06,
+ "loss": 1.4943,
+ "step": 3160
+ },
+ {
+ "epoch": 8.38070059484468,
+ "grad_norm": 2.6110079288482666,
+ "learning_rate": 8.965314520870077e-06,
+ "loss": 1.4889,
+ "step": 3170
+ },
+ {
+ "epoch": 8.407138136153337,
+ "grad_norm": 2.432016611099243,
+ "learning_rate": 8.818342151675484e-06,
+ "loss": 1.498,
+ "step": 3180
+ },
+ {
+ "epoch": 8.433575677461995,
+ "grad_norm": 2.565856695175171,
+ "learning_rate": 8.671369782480893e-06,
+ "loss": 1.5063,
+ "step": 3190
+ },
+ {
+ "epoch": 8.460013218770655,
+ "grad_norm": 2.3665668964385986,
+ "learning_rate": 8.524397413286304e-06,
+ "loss": 1.4984,
+ "step": 3200
+ },
+ {
+ "epoch": 8.486450760079313,
+ "grad_norm": 2.3754770755767822,
+ "learning_rate": 8.377425044091711e-06,
+ "loss": 1.4884,
+ "step": 3210
+ },
+ {
+ "epoch": 8.51288830138797,
+ "grad_norm": 2.3668394088745117,
+ "learning_rate": 8.23045267489712e-06,
+ "loss": 1.5134,
+ "step": 3220
+ },
+ {
+ "epoch": 8.539325842696629,
+ "grad_norm": 2.5414023399353027,
+ "learning_rate": 8.083480305702529e-06,
+ "loss": 1.4838,
+ "step": 3230
+ },
+ {
+ "epoch": 8.565763384005287,
+ "grad_norm": 3.5699350833892822,
+ "learning_rate": 7.936507936507936e-06,
+ "loss": 1.5143,
+ "step": 3240
+ },
+ {
+ "epoch": 8.592200925313946,
+ "grad_norm": 2.8063623905181885,
+ "learning_rate": 7.789535567313345e-06,
+ "loss": 1.496,
+ "step": 3250
+ },
+ {
+ "epoch": 8.618638466622604,
+ "grad_norm": 2.588224172592163,
+ "learning_rate": 7.642563198118754e-06,
+ "loss": 1.4791,
+ "step": 3260
+ },
+ {
+ "epoch": 8.645076007931262,
+ "grad_norm": 2.4756646156311035,
+ "learning_rate": 7.495590828924162e-06,
+ "loss": 1.5166,
+ "step": 3270
+ },
+ {
+ "epoch": 8.67151354923992,
+ "grad_norm": 2.575335741043091,
+ "learning_rate": 7.348618459729571e-06,
+ "loss": 1.4806,
+ "step": 3280
+ },
+ {
+ "epoch": 8.697951090548578,
+ "grad_norm": 2.5106089115142822,
+ "learning_rate": 7.20164609053498e-06,
+ "loss": 1.517,
+ "step": 3290
+ },
+ {
+ "epoch": 8.724388631857238,
+ "grad_norm": 2.419461727142334,
+ "learning_rate": 7.054673721340388e-06,
+ "loss": 1.5046,
+ "step": 3300
+ },
+ {
+ "epoch": 8.750826173165896,
+ "grad_norm": 2.6011338233947754,
+ "learning_rate": 6.907701352145797e-06,
+ "loss": 1.51,
+ "step": 3310
+ },
+ {
+ "epoch": 8.777263714474554,
+ "grad_norm": 3.790093183517456,
+ "learning_rate": 6.760728982951206e-06,
+ "loss": 1.4779,
+ "step": 3320
+ },
+ {
+ "epoch": 8.803701255783212,
+ "grad_norm": 2.2048144340515137,
+ "learning_rate": 6.613756613756614e-06,
+ "loss": 1.5019,
+ "step": 3330
+ },
+ {
+ "epoch": 8.83013879709187,
+ "grad_norm": 2.383060932159424,
+ "learning_rate": 6.466784244562023e-06,
+ "loss": 1.4841,
+ "step": 3340
+ },
+ {
+ "epoch": 8.85657633840053,
+ "grad_norm": 2.647857666015625,
+ "learning_rate": 6.319811875367432e-06,
+ "loss": 1.486,
+ "step": 3350
+ },
+ {
+ "epoch": 8.883013879709187,
+ "grad_norm": 2.314814805984497,
+ "learning_rate": 6.172839506172839e-06,
+ "loss": 1.4847,
+ "step": 3360
+ },
+ {
+ "epoch": 8.909451421017845,
+ "grad_norm": 2.521620988845825,
+ "learning_rate": 6.025867136978249e-06,
+ "loss": 1.4983,
+ "step": 3370
+ },
+ {
+ "epoch": 8.935888962326503,
+ "grad_norm": 2.53841495513916,
+ "learning_rate": 5.878894767783657e-06,
+ "loss": 1.5085,
+ "step": 3380
+ },
+ {
+ "epoch": 8.962326503635161,
+ "grad_norm": 3.0474183559417725,
+ "learning_rate": 5.731922398589065e-06,
+ "loss": 1.5159,
+ "step": 3390
+ },
+ {
+ "epoch": 8.98876404494382,
+ "grad_norm": 2.3993120193481445,
+ "learning_rate": 5.584950029394474e-06,
+ "loss": 1.5001,
+ "step": 3400
+ },
+ {
+ "epoch": 8.999339061467284,
+ "eval_accuracy": 0.32094342968611456,
+ "eval_loss": 1.6820588111877441,
+ "eval_runtime": 644.8259,
+ "eval_samples_per_second": 43.725,
+ "eval_steps_per_second": 0.684,
+ "step": 3404
+ },
+ {
+ "epoch": 9.015201586252479,
+ "grad_norm": 2.3832859992980957,
+ "learning_rate": 5.437977660199883e-06,
+ "loss": 1.4755,
+ "step": 3410
+ },
+ {
+ "epoch": 9.041639127561137,
+ "grad_norm": 2.3615214824676514,
+ "learning_rate": 5.291005291005291e-06,
+ "loss": 1.4527,
+ "step": 3420
+ },
+ {
+ "epoch": 9.068076668869795,
+ "grad_norm": 2.408276319503784,
+ "learning_rate": 5.1440329218107e-06,
+ "loss": 1.5007,
+ "step": 3430
+ },
+ {
+ "epoch": 9.094514210178453,
+ "grad_norm": 3.026998281478882,
+ "learning_rate": 4.997060552616108e-06,
+ "loss": 1.4759,
+ "step": 3440
+ },
+ {
+ "epoch": 9.120951751487112,
+ "grad_norm": 2.7928555011749268,
+ "learning_rate": 4.850088183421517e-06,
+ "loss": 1.5078,
+ "step": 3450
+ },
+ {
+ "epoch": 9.14738929279577,
+ "grad_norm": 2.3969271183013916,
+ "learning_rate": 4.703115814226926e-06,
+ "loss": 1.4786,
+ "step": 3460
+ },
+ {
+ "epoch": 9.173826834104428,
+ "grad_norm": 2.5589447021484375,
+ "learning_rate": 4.556143445032334e-06,
+ "loss": 1.4857,
+ "step": 3470
+ },
+ {
+ "epoch": 9.200264375413086,
+ "grad_norm": 2.3872599601745605,
+ "learning_rate": 4.409171075837742e-06,
+ "loss": 1.5131,
+ "step": 3480
+ },
+ {
+ "epoch": 9.226701916721744,
+ "grad_norm": 3.470109462738037,
+ "learning_rate": 4.262198706643152e-06,
+ "loss": 1.4878,
+ "step": 3490
+ },
+ {
+ "epoch": 9.253139458030404,
+ "grad_norm": 2.545947790145874,
+ "learning_rate": 4.11522633744856e-06,
+ "loss": 1.4973,
+ "step": 3500
+ },
+ {
+ "epoch": 9.279576999339062,
+ "grad_norm": 2.713286876678467,
+ "learning_rate": 3.968253968253968e-06,
+ "loss": 1.4771,
+ "step": 3510
+ },
+ {
+ "epoch": 9.30601454064772,
+ "grad_norm": 2.5866544246673584,
+ "learning_rate": 3.821281599059377e-06,
+ "loss": 1.4987,
+ "step": 3520
+ },
+ {
+ "epoch": 9.332452081956378,
+ "grad_norm": 2.608093023300171,
+ "learning_rate": 3.6743092298647855e-06,
+ "loss": 1.4855,
+ "step": 3530
+ },
+ {
+ "epoch": 9.358889623265036,
+ "grad_norm": 2.363469362258911,
+ "learning_rate": 3.527336860670194e-06,
+ "loss": 1.4903,
+ "step": 3540
+ },
+ {
+ "epoch": 9.385327164573695,
+ "grad_norm": 2.4162986278533936,
+ "learning_rate": 3.380364491475603e-06,
+ "loss": 1.4944,
+ "step": 3550
+ },
+ {
+ "epoch": 9.411764705882353,
+ "grad_norm": 2.531153440475464,
+ "learning_rate": 3.2333921222810115e-06,
+ "loss": 1.4974,
+ "step": 3560
+ },
+ {
+ "epoch": 9.438202247191011,
+ "grad_norm": 2.674409866333008,
+ "learning_rate": 3.0864197530864196e-06,
+ "loss": 1.4544,
+ "step": 3570
+ },
+ {
+ "epoch": 9.46463978849967,
+ "grad_norm": 2.8534953594207764,
+ "learning_rate": 2.9394473838918285e-06,
+ "loss": 1.5179,
+ "step": 3580
+ },
+ {
+ "epoch": 9.491077329808327,
+ "grad_norm": 2.373920202255249,
+ "learning_rate": 2.792475014697237e-06,
+ "loss": 1.487,
+ "step": 3590
+ },
+ {
+ "epoch": 9.517514871116987,
+ "grad_norm": 2.327130079269409,
+ "learning_rate": 2.6455026455026455e-06,
+ "loss": 1.464,
+ "step": 3600
+ },
+ {
+ "epoch": 9.543952412425645,
+ "grad_norm": 2.580538749694824,
+ "learning_rate": 2.498530276308054e-06,
+ "loss": 1.474,
+ "step": 3610
+ },
+ {
+ "epoch": 9.570389953734303,
+ "grad_norm": 2.5735127925872803,
+ "learning_rate": 2.351557907113463e-06,
+ "loss": 1.4443,
+ "step": 3620
+ },
+ {
+ "epoch": 9.59682749504296,
+ "grad_norm": 2.6845741271972656,
+ "learning_rate": 2.204585537918871e-06,
+ "loss": 1.4728,
+ "step": 3630
+ },
+ {
+ "epoch": 9.623265036351619,
+ "grad_norm": 2.7224180698394775,
+ "learning_rate": 2.05761316872428e-06,
+ "loss": 1.4737,
+ "step": 3640
+ },
+ {
+ "epoch": 9.649702577660278,
+ "grad_norm": 2.6456120014190674,
+ "learning_rate": 1.9106407995296885e-06,
+ "loss": 1.4572,
+ "step": 3650
+ },
+ {
+ "epoch": 9.676140118968936,
+ "grad_norm": 2.3827290534973145,
+ "learning_rate": 1.763668430335097e-06,
+ "loss": 1.4906,
+ "step": 3660
+ },
+ {
+ "epoch": 9.702577660277594,
+ "grad_norm": 2.7378439903259277,
+ "learning_rate": 1.6166960611405057e-06,
+ "loss": 1.4844,
+ "step": 3670
+ },
+ {
+ "epoch": 9.729015201586252,
+ "grad_norm": 2.330042839050293,
+ "learning_rate": 1.4697236919459143e-06,
+ "loss": 1.4552,
+ "step": 3680
+ },
+ {
+ "epoch": 9.75545274289491,
+ "grad_norm": 2.9327447414398193,
+ "learning_rate": 1.3227513227513228e-06,
+ "loss": 1.4699,
+ "step": 3690
+ },
+ {
+ "epoch": 9.78189028420357,
+ "grad_norm": 3.050356149673462,
+ "learning_rate": 1.1757789535567315e-06,
+ "loss": 1.4831,
+ "step": 3700
+ },
+ {
+ "epoch": 9.808327825512228,
+ "grad_norm": 2.769808053970337,
+ "learning_rate": 1.02880658436214e-06,
+ "loss": 1.4886,
+ "step": 3710
+ },
+ {
+ "epoch": 9.834765366820886,
+ "grad_norm": 2.5177111625671387,
+ "learning_rate": 8.818342151675485e-07,
+ "loss": 1.5074,
+ "step": 3720
+ },
+ {
+ "epoch": 9.861202908129544,
+ "grad_norm": 2.4207041263580322,
+ "learning_rate": 7.348618459729571e-07,
+ "loss": 1.4792,
+ "step": 3730
+ },
+ {
+ "epoch": 9.887640449438202,
+ "grad_norm": 2.842231512069702,
+ "learning_rate": 5.878894767783657e-07,
+ "loss": 1.4696,
+ "step": 3740
+ },
+ {
+ "epoch": 9.914077990746861,
+ "grad_norm": 2.7768394947052,
+ "learning_rate": 4.4091710758377425e-07,
+ "loss": 1.4642,
+ "step": 3750
+ },
+ {
+ "epoch": 9.94051553205552,
+ "grad_norm": 2.7544069290161133,
+ "learning_rate": 2.9394473838918287e-07,
+ "loss": 1.4824,
+ "step": 3760
+ },
+ {
+ "epoch": 9.966953073364177,
+ "grad_norm": 2.5182807445526123,
+ "learning_rate": 1.4697236919459144e-07,
+ "loss": 1.4768,
+ "step": 3770
+ },
+ {
+ "epoch": 9.993390614672835,
+ "grad_norm": 2.31466007232666,
+ "learning_rate": 0.0,
+ "loss": 1.4303,
+ "step": 3780
+ },
+ {
+ "epoch": 9.993390614672835,
+ "eval_accuracy": 0.3192410001773364,
+ "eval_loss": 1.6835154294967651,
+ "eval_runtime": 626.4861,
+ "eval_samples_per_second": 45.005,
+ "eval_steps_per_second": 0.704,
+ "step": 3780
+ },
2747
+ {
2748
+ "epoch": 9.993390614672835,
2749
+ "step": 3780,
2750
+ "total_flos": 7.511248143138256e+19,
2751
+ "train_loss": 1.6334815527396227,
2752
+ "train_runtime": 44633.0248,
2753
+ "train_samples_per_second": 21.683,
2754
+ "train_steps_per_second": 0.085
2755
+ }
2756
+ ],
2757
+ "logging_steps": 10,
2758
+ "max_steps": 3780,
2759
+ "num_input_tokens_seen": 0,
2760
+ "num_train_epochs": 10,
2761
+ "save_steps": 500,
2762
+ "total_flos": 7.511248143138256e+19,
2763
+ "train_batch_size": 64,
2764
+ "trial_name": null,
2765
+ "trial_params": null
2766
+ }
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e43f953c4995f0fce2826474ef73df1b439a2f78b1fb21ce14cd3a03dd7ab196
- size 5048
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:20a69f58503433efa0516ad7a50800dcc65855d77ad90ea2cf82f5696584f14e
+ size 4603