radic2682 committed on
Commit
e58c828
1 Parent(s): 410cf8f

End of training

README.md CHANGED
@@ -3,6 +3,8 @@ license: apache-2.0
 base_model: google/bigbird-roberta-large
 tags:
 - generated_from_trainer
+datasets:
+- squad
 model-index:
 - name: bigBird-large-fine-tuning-squad-B16R3_bias
   results: []
@@ -14,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/radic2682-unist/bigBird-large-fine-tuning-squad-b16r3_bias/runs/y5gikiz9)
 # bigBird-large-fine-tuning-squad-B16R3_bias
 
-This model is a fine-tuned version of [google/bigbird-roberta-large](https://huggingface.co/google/bigbird-roberta-large) on an unknown dataset.
+This model is a fine-tuned version of [google/bigbird-roberta-large](https://huggingface.co/google/bigbird-roberta-large) on the squad dataset.
 
 ## Model description
all_results.json ADDED
@@ -0,0 +1,15 @@
+{
+    "epoch": 3.0,
+    "eval_exact_match": 85.65752128666036,
+    "eval_f1": 92.27662721260018,
+    "eval_runtime": 416.0977,
+    "eval_samples": 10570,
+    "eval_samples_per_second": 25.403,
+    "eval_steps_per_second": 1.589,
+    "total_flos": 2.006738209660207e+18,
+    "train_loss": 0.6596583454358523,
+    "train_runtime": 64667.9553,
+    "train_samples": 87599,
+    "train_samples_per_second": 4.064,
+    "train_steps_per_second": 0.339
+}
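The throughput numbers in `all_results.json` are derived quantities. As a quick sanity check (a minimal sketch, not part of this commit), the samples-per-second figures follow from the sample counts, the epoch count, and the runtimes logged above:

```python
# Sanity-check the derived throughput figures in all_results.json.
# The values below are copied from the commit; round(..., 3) mirrors
# the 3-decimal formatting the metrics are reported with.

results = {
    "epoch": 3.0,
    "eval_runtime": 416.0977,       # seconds
    "eval_samples": 10570,
    "train_runtime": 64667.9553,    # seconds
    "train_samples": 87599,
}

# Evaluation passes over the eval set once.
eval_sps = round(results["eval_samples"] / results["eval_runtime"], 3)

# Training passes over the train set once per epoch.
train_sps = round(
    results["train_samples"] * results["epoch"] / results["train_runtime"], 3
)

print(eval_sps)   # 25.403, matching eval_samples_per_second
print(train_sps)  # 4.064, matching train_samples_per_second
```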
eval_nbest_predictions.json CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:f4c55a449bfe7926912f4108019a52274551a299f9356c2341fea1bc92a3198e
3
- size 45993360
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:35732dc2a21aca542606884dd5795c93efb41e2d80ab33a828d873d7104620c5
3
+ size 45976193
eval_predictions.json CHANGED
The diff for this file is too large to render. See raw diff
 
eval_results.json ADDED
@@ -0,0 +1,9 @@
+{
+    "epoch": 3.0,
+    "eval_exact_match": 85.65752128666036,
+    "eval_f1": 92.27662721260018,
+    "eval_runtime": 416.0977,
+    "eval_samples": 10570,
+    "eval_samples_per_second": 25.403,
+    "eval_steps_per_second": 1.589
+}
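For context, `eval_exact_match` and `eval_f1` are the standard SQuAD metrics, computed per question and averaged over the eval set. A minimal sketch of the per-example scoring (simplified: no article/punctuation stripping, and a single gold answer rather than the max over all gold answers the official script takes):

```python
from collections import Counter

def exact_match(prediction: str, gold: str) -> float:
    # 1.0 only if the (lightly normalized) strings are identical.
    return float(prediction.strip().lower() == gold.strip().lower())

def f1_score(prediction: str, gold: str) -> float:
    # Token-level F1 over the bag-of-words overlap between the
    # predicted span and the gold answer span.
    pred_tokens = prediction.lower().split()
    gold_tokens = gold.lower().split()
    common = Counter(pred_tokens) & Counter(gold_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)

print(exact_match("Denver Broncos", "Denver Broncos"))          # 1.0
print(f1_score("the Denver Broncos", "Denver Broncos"))         # 0.8
```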
log/events.out.tfevents.1723515089.isl-gpu4.29132.1 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ca97da55559ede401695a37add84b369d550f1c0f591bdac3a586dda887e6843
+size 418
train_results.json ADDED
@@ -0,0 +1,9 @@
+{
+    "epoch": 3.0,
+    "total_flos": 2.006738209660207e+18,
+    "train_loss": 0.6596583454358523,
+    "train_runtime": 64667.9553,
+    "train_samples": 87599,
+    "train_samples_per_second": 4.064,
+    "train_steps_per_second": 0.339
+}
trainer_state.json ADDED
@@ -0,0 +1,3297 @@
+{
+    "best_metric": null,
+    "best_model_checkpoint": null,
+    "epoch": 3.0,
+    "eval_steps": 1000,
+    "global_step": 21900,
+    "is_hyper_param_search": false,
+    "is_local_process_zero": true,
+    "is_world_process_zero": true,
+    "log_history": [
+        {
+            "epoch": 0.00684931506849315,
+            "grad_norm": 323.0462646484375,
+            "learning_rate": 2.9943835616438356e-05,
+            "loss": 5.0367,
+            "step": 50
+        },
+        {
+            "epoch": 0.0136986301369863,
+            "grad_norm": 77.02570343017578,
+            "learning_rate": 2.987808219178082e-05,
+            "loss": 3.1287,
+            "step": 100
+        },
+        {
+            "epoch": 0.02054794520547945,
+            "grad_norm": 33.8292236328125,
+            "learning_rate": 2.980958904109589e-05,
+            "loss": 2.247,
+            "step": 150
+        },
+        {
+            "epoch": 0.0273972602739726,
+            "grad_norm": 45.22611618041992,
+            "learning_rate": 2.974109589041096e-05,
+            "loss": 1.8313,
+            "step": 200
+        },
+        {
+            "epoch": 0.03424657534246575,
+            "grad_norm": 40.95551681518555,
+            "learning_rate": 2.9672602739726026e-05,
+            "loss": 1.7307,
+            "step": 250
+        },
+        {
+            "epoch": 0.0410958904109589,
+            "grad_norm": 132.5420684814453,
+            "learning_rate": 2.9604109589041095e-05,
+            "loss": 1.71,
+            "step": 300
+        },
+        {
+            "epoch": 0.04794520547945205,
+            "grad_norm": 46.58742141723633,
+            "learning_rate": 2.9535616438356165e-05,
+            "loss": 1.5229,
+            "step": 350
+        },
+        {
+            "epoch": 0.0547945205479452,
+            "grad_norm": 60.270259857177734,
+            "learning_rate": 2.9467123287671234e-05,
+            "loss": 1.4451,
+            "step": 400
+        },
+        {
+            "epoch": 0.06164383561643835,
+            "grad_norm": 36.05481719970703,
+            "learning_rate": 2.93986301369863e-05,
+            "loss": 1.4609,
+            "step": 450
+        },
+        {
+            "epoch": 0.0684931506849315,
+            "grad_norm": 20.183435440063477,
+            "learning_rate": 2.933013698630137e-05,
+            "loss": 1.5096,
+            "step": 500
+        },
+        {
+            "epoch": 0.07534246575342465,
+            "grad_norm": 31.55844497680664,
+            "learning_rate": 2.926164383561644e-05,
+            "loss": 1.4921,
+            "step": 550
+        },
+        {
+            "epoch": 0.0821917808219178,
+            "grad_norm": 26.991056442260742,
+            "learning_rate": 2.919315068493151e-05,
+            "loss": 1.3397,
+            "step": 600
+        },
+        {
+            "epoch": 0.08904109589041095,
+            "grad_norm": 22.860877990722656,
+            "learning_rate": 2.9124657534246575e-05,
+            "loss": 1.2389,
+            "step": 650
+        },
+        {
+            "epoch": 0.0958904109589041,
+            "grad_norm": 34.633026123046875,
+            "learning_rate": 2.9056164383561644e-05,
+            "loss": 1.182,
+            "step": 700
+        },
+        {
+            "epoch": 0.10273972602739725,
+            "grad_norm": 29.118816375732422,
+            "learning_rate": 2.8987671232876714e-05,
+            "loss": 1.3518,
+            "step": 750
+        },
+        {
+            "epoch": 0.1095890410958904,
+            "grad_norm": 72.57406616210938,
+            "learning_rate": 2.8919178082191783e-05,
+            "loss": 1.3301,
+            "step": 800
+        },
+        {
+            "epoch": 0.11643835616438356,
+            "grad_norm": 19.362930297851562,
+            "learning_rate": 2.885068493150685e-05,
+            "loss": 1.2366,
+            "step": 850
+        },
+        {
+            "epoch": 0.1232876712328767,
+            "grad_norm": 34.462242126464844,
+            "learning_rate": 2.878219178082192e-05,
+            "loss": 1.3056,
+            "step": 900
+        },
+        {
+            "epoch": 0.13013698630136986,
+            "grad_norm": 20.275766372680664,
+            "learning_rate": 2.871369863013699e-05,
+            "loss": 1.2415,
+            "step": 950
+        },
+        {
+            "epoch": 0.136986301369863,
+            "grad_norm": 34.678558349609375,
+            "learning_rate": 2.8645205479452058e-05,
+            "loss": 1.2008,
+            "step": 1000
+        },
+        {
+            "epoch": 0.136986301369863,
+            "eval_exact_match": 74.98580889309366,
+            "eval_f1": 84.12743563305797,
+            "eval_runtime": 416.5856,
+            "eval_samples_per_second": 25.373,
+            "eval_steps_per_second": 1.587,
+            "step": 1000
+        },
+        {
+            "epoch": 0.14383561643835616,
+            "grad_norm": 27.335071563720703,
+            "learning_rate": 2.8576712328767124e-05,
+            "loss": 1.1301,
+            "step": 1050
+        },
+        {
+            "epoch": 0.1506849315068493,
+            "grad_norm": 36.93644714355469,
+            "learning_rate": 2.8508219178082194e-05,
+            "loss": 1.3069,
+            "step": 1100
+        },
+        {
+            "epoch": 0.15753424657534246,
+            "grad_norm": 53.08256149291992,
+            "learning_rate": 2.8439726027397263e-05,
+            "loss": 1.203,
+            "step": 1150
+        },
+        {
+            "epoch": 0.1643835616438356,
+            "grad_norm": 70.76780700683594,
+            "learning_rate": 2.8371232876712332e-05,
+            "loss": 1.2392,
+            "step": 1200
+        },
+        {
+            "epoch": 0.17123287671232876,
+            "grad_norm": 26.65420150756836,
+            "learning_rate": 2.8302739726027395e-05,
+            "loss": 1.1829,
+            "step": 1250
+        },
+        {
+            "epoch": 0.1780821917808219,
+            "grad_norm": 23.404756546020508,
+            "learning_rate": 2.8234246575342465e-05,
+            "loss": 1.2175,
+            "step": 1300
+        },
+        {
+            "epoch": 0.18493150684931506,
+            "grad_norm": 23.29783058166504,
+            "learning_rate": 2.8165753424657534e-05,
+            "loss": 1.1673,
+            "step": 1350
+        },
+        {
+            "epoch": 0.1917808219178082,
+            "grad_norm": 45.53365707397461,
+            "learning_rate": 2.8097260273972604e-05,
+            "loss": 1.1966,
+            "step": 1400
+        },
+        {
+            "epoch": 0.19863013698630136,
+            "grad_norm": 13.774870872497559,
+            "learning_rate": 2.802876712328767e-05,
+            "loss": 1.2635,
+            "step": 1450
+        },
+        {
+            "epoch": 0.2054794520547945,
+            "grad_norm": 14.898510932922363,
+            "learning_rate": 2.796027397260274e-05,
+            "loss": 1.1165,
+            "step": 1500
+        },
+        {
+            "epoch": 0.21232876712328766,
+            "grad_norm": 35.47676467895508,
+            "learning_rate": 2.789178082191781e-05,
+            "loss": 1.1877,
+            "step": 1550
+        },
+        {
+            "epoch": 0.2191780821917808,
+            "grad_norm": 28.7779598236084,
+            "learning_rate": 2.7823287671232878e-05,
+            "loss": 1.1162,
+            "step": 1600
+        },
+        {
+            "epoch": 0.22602739726027396,
+            "grad_norm": 21.763216018676758,
+            "learning_rate": 2.7754794520547944e-05,
+            "loss": 1.2048,
+            "step": 1650
+        },
+        {
+            "epoch": 0.2328767123287671,
+            "grad_norm": 32.003700256347656,
+            "learning_rate": 2.7686301369863014e-05,
+            "loss": 1.1233,
+            "step": 1700
+        },
+        {
+            "epoch": 0.23972602739726026,
+            "grad_norm": 31.449691772460938,
+            "learning_rate": 2.7617808219178083e-05,
+            "loss": 1.0509,
+            "step": 1750
+        },
+        {
+            "epoch": 0.2465753424657534,
+            "grad_norm": 18.793638229370117,
+            "learning_rate": 2.7549315068493153e-05,
+            "loss": 1.0995,
+            "step": 1800
+        },
+        {
+            "epoch": 0.2534246575342466,
+            "grad_norm": 12.931886672973633,
+            "learning_rate": 2.748082191780822e-05,
+            "loss": 1.037,
+            "step": 1850
+        },
+        {
+            "epoch": 0.2602739726027397,
+            "grad_norm": 14.888827323913574,
+            "learning_rate": 2.7412328767123288e-05,
+            "loss": 1.0856,
+            "step": 1900
+        },
+        {
+            "epoch": 0.2671232876712329,
+            "grad_norm": 42.177330017089844,
+            "learning_rate": 2.7343835616438358e-05,
+            "loss": 1.0143,
+            "step": 1950
+        },
+        {
+            "epoch": 0.273972602739726,
+            "grad_norm": 52.39680480957031,
+            "learning_rate": 2.7275342465753427e-05,
+            "loss": 0.9292,
+            "step": 2000
+        },
+        {
+            "epoch": 0.273972602739726,
+            "eval_exact_match": 78.05108798486282,
+            "eval_f1": 86.8708882051522,
+            "eval_runtime": 417.914,
+            "eval_samples_per_second": 25.292,
+            "eval_steps_per_second": 1.582,
+            "step": 2000
+        },
+        {
+            "epoch": 0.2808219178082192,
+            "grad_norm": 79.39705657958984,
+            "learning_rate": 2.7206849315068493e-05,
+            "loss": 1.1627,
+            "step": 2050
+        },
+        {
+            "epoch": 0.2876712328767123,
+            "grad_norm": 13.503669738769531,
+            "learning_rate": 2.7138356164383563e-05,
+            "loss": 0.9592,
+            "step": 2100
+        },
+        {
+            "epoch": 0.2945205479452055,
+            "grad_norm": 34.6471061706543,
+            "learning_rate": 2.7069863013698632e-05,
+            "loss": 0.9724,
+            "step": 2150
+        },
+        {
+            "epoch": 0.3013698630136986,
+            "grad_norm": 28.719350814819336,
+            "learning_rate": 2.70013698630137e-05,
+            "loss": 0.9852,
+            "step": 2200
+        },
+        {
+            "epoch": 0.3082191780821918,
+            "grad_norm": 55.4783935546875,
+            "learning_rate": 2.6932876712328768e-05,
+            "loss": 0.9469,
+            "step": 2250
+        },
+        {
+            "epoch": 0.3150684931506849,
+            "grad_norm": 24.332387924194336,
+            "learning_rate": 2.6864383561643837e-05,
+            "loss": 1.0859,
+            "step": 2300
+        },
+        {
+            "epoch": 0.3219178082191781,
+            "grad_norm": 16.438060760498047,
+            "learning_rate": 2.6795890410958907e-05,
+            "loss": 0.9121,
+            "step": 2350
+        },
+        {
+            "epoch": 0.3287671232876712,
+            "grad_norm": 19.229644775390625,
+            "learning_rate": 2.6727397260273976e-05,
+            "loss": 1.0109,
+            "step": 2400
+        },
+        {
+            "epoch": 0.3356164383561644,
+            "grad_norm": 22.6262149810791,
+            "learning_rate": 2.6658904109589042e-05,
+            "loss": 1.077,
+            "step": 2450
+        },
+        {
+            "epoch": 0.3424657534246575,
+            "grad_norm": 10.421500205993652,
+            "learning_rate": 2.659041095890411e-05,
+            "loss": 1.1242,
+            "step": 2500
+        },
+        {
+            "epoch": 0.3493150684931507,
+            "grad_norm": 44.75662612915039,
+            "learning_rate": 2.6521917808219178e-05,
+            "loss": 1.0445,
+            "step": 2550
+        },
+        {
+            "epoch": 0.3561643835616438,
+            "grad_norm": 10.374776840209961,
+            "learning_rate": 2.6453424657534247e-05,
+            "loss": 0.9927,
+            "step": 2600
+        },
+        {
+            "epoch": 0.363013698630137,
+            "grad_norm": 21.781341552734375,
+            "learning_rate": 2.6384931506849313e-05,
+            "loss": 0.9601,
+            "step": 2650
+        },
+        {
+            "epoch": 0.3698630136986301,
+            "grad_norm": 7.096819877624512,
+            "learning_rate": 2.6316438356164383e-05,
+            "loss": 1.0105,
+            "step": 2700
+        },
+        {
+            "epoch": 0.3767123287671233,
+            "grad_norm": 128.46046447753906,
+            "learning_rate": 2.6247945205479452e-05,
+            "loss": 0.9627,
+            "step": 2750
+        },
+        {
+            "epoch": 0.3835616438356164,
+            "grad_norm": 43.575111389160156,
+            "learning_rate": 2.6179452054794522e-05,
+            "loss": 1.0495,
+            "step": 2800
+        },
+        {
+            "epoch": 0.3904109589041096,
+            "grad_norm": 8.99010944366455,
+            "learning_rate": 2.6110958904109588e-05,
+            "loss": 0.9465,
+            "step": 2850
+        },
+        {
+            "epoch": 0.3972602739726027,
+            "grad_norm": 21.84307289123535,
+            "learning_rate": 2.6042465753424657e-05,
+            "loss": 0.9971,
+            "step": 2900
+        },
+        {
+            "epoch": 0.4041095890410959,
+            "grad_norm": 42.8900032043457,
+            "learning_rate": 2.5973972602739727e-05,
+            "loss": 0.907,
+            "step": 2950
+        },
+        {
+            "epoch": 0.410958904109589,
+            "grad_norm": 50.012393951416016,
+            "learning_rate": 2.5905479452054796e-05,
+            "loss": 0.9572,
+            "step": 3000
+        },
+        {
+            "epoch": 0.410958904109589,
+            "eval_exact_match": 80.07568590350047,
+            "eval_f1": 88.2866062038965,
+            "eval_runtime": 417.1017,
+            "eval_samples_per_second": 25.342,
+            "eval_steps_per_second": 1.585,
+            "step": 3000
+        },
+        {
+            "epoch": 0.4178082191780822,
+            "grad_norm": 33.76340103149414,
+            "learning_rate": 2.5836986301369862e-05,
+            "loss": 0.9699,
+            "step": 3050
+        },
+        {
+            "epoch": 0.4246575342465753,
+            "grad_norm": 14.410927772521973,
+            "learning_rate": 2.5768493150684932e-05,
+            "loss": 0.8769,
+            "step": 3100
+        },
+        {
+            "epoch": 0.4315068493150685,
+            "grad_norm": 13.709430694580078,
+            "learning_rate": 2.57e-05,
+            "loss": 0.9086,
+            "step": 3150
+        },
+        {
+            "epoch": 0.4383561643835616,
+            "grad_norm": 30.455875396728516,
+            "learning_rate": 2.563150684931507e-05,
+            "loss": 0.9569,
+            "step": 3200
+        },
+        {
+            "epoch": 0.4452054794520548,
+            "grad_norm": 10.021869659423828,
+            "learning_rate": 2.5563013698630137e-05,
+            "loss": 0.9403,
+            "step": 3250
+        },
+        {
+            "epoch": 0.4520547945205479,
+            "grad_norm": 16.0334415435791,
+            "learning_rate": 2.5494520547945206e-05,
+            "loss": 0.9413,
+            "step": 3300
+        },
+        {
+            "epoch": 0.4589041095890411,
+            "grad_norm": 17.812877655029297,
+            "learning_rate": 2.5426027397260276e-05,
+            "loss": 0.9739,
+            "step": 3350
+        },
+        {
+            "epoch": 0.4657534246575342,
+            "grad_norm": 13.129268646240234,
+            "learning_rate": 2.5357534246575345e-05,
+            "loss": 0.9456,
+            "step": 3400
+        },
+        {
+            "epoch": 0.4726027397260274,
+            "grad_norm": 37.476993560791016,
+            "learning_rate": 2.528904109589041e-05,
+            "loss": 0.9359,
+            "step": 3450
+        },
+        {
+            "epoch": 0.4794520547945205,
+            "grad_norm": 16.441556930541992,
+            "learning_rate": 2.522054794520548e-05,
+            "loss": 0.9096,
+            "step": 3500
+        },
+        {
+            "epoch": 0.4863013698630137,
+            "grad_norm": 11.985368728637695,
+            "learning_rate": 2.515342465753425e-05,
+            "loss": 0.9038,
+            "step": 3550
+        },
+        {
+            "epoch": 0.4931506849315068,
+            "grad_norm": 28.11038589477539,
+            "learning_rate": 2.5084931506849316e-05,
+            "loss": 0.8699,
+            "step": 3600
+        },
+        {
+            "epoch": 0.5,
+            "grad_norm": 26.2198543548584,
+            "learning_rate": 2.5016438356164385e-05,
+            "loss": 0.9071,
+            "step": 3650
+        },
+        {
+            "epoch": 0.5068493150684932,
+            "grad_norm": 21.6419620513916,
+            "learning_rate": 2.4947945205479455e-05,
+            "loss": 1.1274,
+            "step": 3700
+        },
+        {
+            "epoch": 0.5136986301369864,
+            "grad_norm": 27.075368881225586,
+            "learning_rate": 2.4879452054794524e-05,
+            "loss": 0.9475,
+            "step": 3750
+        },
+        {
+            "epoch": 0.5205479452054794,
+            "grad_norm": 25.394577026367188,
+            "learning_rate": 2.481095890410959e-05,
+            "loss": 0.9166,
+            "step": 3800
+        },
+        {
+            "epoch": 0.5273972602739726,
+            "grad_norm": 32.769256591796875,
+            "learning_rate": 2.474246575342466e-05,
+            "loss": 0.9656,
+            "step": 3850
+        },
+        {
+            "epoch": 0.5342465753424658,
+            "grad_norm": 9.152816772460938,
+            "learning_rate": 2.4673972602739726e-05,
+            "loss": 0.9538,
+            "step": 3900
+        },
+        {
+            "epoch": 0.541095890410959,
+            "grad_norm": 24.151294708251953,
+            "learning_rate": 2.4605479452054795e-05,
+            "loss": 0.9396,
+            "step": 3950
+        },
+        {
+            "epoch": 0.547945205479452,
+            "grad_norm": 10.290411949157715,
+            "learning_rate": 2.453698630136986e-05,
+            "loss": 0.971,
+            "step": 4000
+        },
+        {
+            "epoch": 0.547945205479452,
+            "eval_exact_match": 82.82876064333018,
+            "eval_f1": 90.23705695343135,
+            "eval_runtime": 418.5094,
+            "eval_samples_per_second": 25.256,
+            "eval_steps_per_second": 1.579,
+            "step": 4000
+        },
+        {
+            "epoch": 0.5547945205479452,
+            "grad_norm": 30.38792610168457,
+            "learning_rate": 2.446849315068493e-05,
+            "loss": 0.9951,
+            "step": 4050
+        },
+        {
+            "epoch": 0.5616438356164384,
+            "grad_norm": 12.29451847076416,
+            "learning_rate": 2.44e-05,
+            "loss": 0.9758,
+            "step": 4100
+        },
+        {
+            "epoch": 0.5684931506849316,
+            "grad_norm": 56.75043869018555,
+            "learning_rate": 2.433150684931507e-05,
+            "loss": 0.8756,
+            "step": 4150
+        },
+        {
+            "epoch": 0.5753424657534246,
+            "grad_norm": 26.024311065673828,
+            "learning_rate": 2.4263013698630136e-05,
+            "loss": 0.9083,
+            "step": 4200
+        },
+        {
+            "epoch": 0.5821917808219178,
+            "grad_norm": 14.479564666748047,
+            "learning_rate": 2.4194520547945205e-05,
+            "loss": 0.9357,
+            "step": 4250
+        },
+        {
+            "epoch": 0.589041095890411,
+            "grad_norm": 9.043543815612793,
+            "learning_rate": 2.4126027397260275e-05,
+            "loss": 0.9954,
+            "step": 4300
+        },
+        {
+            "epoch": 0.5958904109589042,
+            "grad_norm": 17.975173950195312,
+            "learning_rate": 2.4057534246575344e-05,
+            "loss": 0.8758,
+            "step": 4350
+        },
+        {
+            "epoch": 0.6027397260273972,
+            "grad_norm": 25.138750076293945,
+            "learning_rate": 2.398904109589041e-05,
+            "loss": 0.8641,
+            "step": 4400
+        },
+        {
+            "epoch": 0.6095890410958904,
+            "grad_norm": 10.003458976745605,
+            "learning_rate": 2.392054794520548e-05,
+            "loss": 1.0095,
+            "step": 4450
+        },
+        {
+            "epoch": 0.6164383561643836,
+            "grad_norm": 48.206233978271484,
+            "learning_rate": 2.385205479452055e-05,
+            "loss": 0.924,
+            "step": 4500
+        },
+        {
+            "epoch": 0.6232876712328768,
+            "grad_norm": 10.344615936279297,
+            "learning_rate": 2.378356164383562e-05,
+            "loss": 0.9114,
+            "step": 4550
+        },
+        {
+            "epoch": 0.6301369863013698,
+            "grad_norm": 15.59772777557373,
+            "learning_rate": 2.3715068493150685e-05,
+            "loss": 0.9313,
+            "step": 4600
+        },
+        {
+            "epoch": 0.636986301369863,
+            "grad_norm": 4.01880407333374,
+            "learning_rate": 2.3646575342465754e-05,
+            "loss": 0.9007,
+            "step": 4650
+        },
+        {
+            "epoch": 0.6438356164383562,
+            "grad_norm": 6.648110866546631,
+            "learning_rate": 2.3578082191780824e-05,
+            "loss": 0.9106,
+            "step": 4700
+        },
+        {
+            "epoch": 0.6506849315068494,
+            "grad_norm": 36.89479446411133,
+            "learning_rate": 2.3509589041095893e-05,
+            "loss": 0.8683,
+            "step": 4750
+        },
+        {
+            "epoch": 0.6575342465753424,
+            "grad_norm": 46.464141845703125,
+            "learning_rate": 2.344109589041096e-05,
+            "loss": 0.7399,
+            "step": 4800
+        },
+        {
+            "epoch": 0.6643835616438356,
+            "grad_norm": 24.97540283203125,
+            "learning_rate": 2.337260273972603e-05,
+            "loss": 0.9609,
+            "step": 4850
+        },
+        {
+            "epoch": 0.6712328767123288,
+            "grad_norm": 50.79806137084961,
+            "learning_rate": 2.3304109589041098e-05,
+            "loss": 0.8724,
+            "step": 4900
+        },
+        {
+            "epoch": 0.678082191780822,
+            "grad_norm": 24.82395362854004,
+            "learning_rate": 2.3235616438356168e-05,
+            "loss": 0.8565,
+            "step": 4950
+        },
+        {
+            "epoch": 0.684931506849315,
+            "grad_norm": 28.389963150024414,
+            "learning_rate": 2.3167123287671234e-05,
+            "loss": 0.8048,
+            "step": 5000
+        },
+        {
+            "epoch": 0.684931506849315,
+            "eval_exact_match": 84.12488174077578,
+            "eval_f1": 91.07595753183573,
+            "eval_runtime": 417.955,
+            "eval_samples_per_second": 25.29,
+            "eval_steps_per_second": 1.582,
+            "step": 5000
+        },
+        {
+            "epoch": 0.6917808219178082,
+            "grad_norm": 14.127419471740723,
+            "learning_rate": 2.3098630136986303e-05,
+            "loss": 0.7959,
+            "step": 5050
+        },
+        {
+            "epoch": 0.6986301369863014,
+            "grad_norm": 12.644275665283203,
+            "learning_rate": 2.3030136986301373e-05,
+            "loss": 0.9732,
+            "step": 5100
+        },
+        {
+            "epoch": 0.7054794520547946,
+            "grad_norm": 26.438093185424805,
+            "learning_rate": 2.296164383561644e-05,
+            "loss": 0.8267,
+            "step": 5150
+        },
+        {
+            "epoch": 0.7123287671232876,
+            "grad_norm": 14.37074089050293,
+            "learning_rate": 2.2893150684931505e-05,
+            "loss": 0.8899,
+            "step": 5200
+        },
+        {
+            "epoch": 0.7191780821917808,
+            "grad_norm": 23.79348373413086,
+            "learning_rate": 2.2824657534246574e-05,
+            "loss": 0.906,
+            "step": 5250
+        },
+        {
+            "epoch": 0.726027397260274,
+            "grad_norm": 12.331747055053711,
+            "learning_rate": 2.2756164383561644e-05,
+            "loss": 0.823,
+            "step": 5300
+        },
+        {
+            "epoch": 0.7328767123287672,
+            "grad_norm": 17.69827651977539,
+            "learning_rate": 2.2687671232876713e-05,
+            "loss": 0.8854,
+            "step": 5350
+        },
+        {
+            "epoch": 0.7397260273972602,
+            "grad_norm": 30.975631713867188,
+            "learning_rate": 2.261917808219178e-05,
+            "loss": 0.8866,
+            "step": 5400
+        },
+        {
+            "epoch": 0.7465753424657534,
+            "grad_norm": 8.740948677062988,
+            "learning_rate": 2.255068493150685e-05,
+            "loss": 0.843,
+            "step": 5450
+        },
+        {
+            "epoch": 0.7534246575342466,
+            "grad_norm": 25.42966651916504,
+            "learning_rate": 2.248219178082192e-05,
+            "loss": 0.937,
+            "step": 5500
+        },
+        {
+            "epoch": 0.7602739726027398,
+            "grad_norm": 15.364262580871582,
+            "learning_rate": 2.2413698630136988e-05,
+            "loss": 0.8485,
+            "step": 5550
+        },
+        {
+            "epoch": 0.7671232876712328,
+            "grad_norm": 13.131271362304688,
+            "learning_rate": 2.2345205479452054e-05,
+            "loss": 0.8018,
+            "step": 5600
+        },
+        {
+            "epoch": 0.773972602739726,
+            "grad_norm": 9.219653129577637,
+            "learning_rate": 2.2276712328767123e-05,
+            "loss": 0.8492,
+            "step": 5650
+        },
+        {
+            "epoch": 0.7808219178082192,
+            "grad_norm": 20.117149353027344,
+            "learning_rate": 2.2208219178082193e-05,
+            "loss": 0.9339,
+            "step": 5700
+        },
+        {
+            "epoch": 0.7876712328767124,
+            "grad_norm": 10.591473579406738,
+            "learning_rate": 2.2139726027397262e-05,
+            "loss": 0.8925,
+            "step": 5750
+        },
+        {
+            "epoch": 0.7945205479452054,
+            "grad_norm": 25.7907772064209,
+            "learning_rate": 2.207123287671233e-05,
+            "loss": 0.9092,
+            "step": 5800
+        },
+        {
+            "epoch": 0.8013698630136986,
+            "grad_norm": 12.926765441894531,
+            "learning_rate": 2.2002739726027398e-05,
+            "loss": 0.8824,
+            "step": 5850
+        },
+        {
+            "epoch": 0.8082191780821918,
+            "grad_norm": 15.865346908569336,
+            "learning_rate": 2.1934246575342467e-05,
+            "loss": 0.846,
+            "step": 5900
+        },
+        {
+            "epoch": 0.815068493150685,
+            "grad_norm": 20.416015625,
+            "learning_rate": 2.1865753424657537e-05,
+            "loss": 0.8126,
+            "step": 5950
+        },
+        {
+            "epoch": 0.821917808219178,
+            "grad_norm": 11.727408409118652,
+            "learning_rate": 2.1797260273972603e-05,
+            "loss": 0.8308,
+            "step": 6000
+        },
+        {
+            "epoch": 0.821917808219178,
+            "eval_exact_match": 84.57899716177862,
+            "eval_f1": 91.43611698741225,
+            "eval_runtime": 417.3274,
+            "eval_samples_per_second": 25.328,
+            "eval_steps_per_second": 1.584,
+            "step": 6000
+        },
905
+ {
906
+ "epoch": 0.8287671232876712,
907
+ "grad_norm": 25.75244903564453,
908
+ "learning_rate": 2.1728767123287672e-05,
909
+ "loss": 0.8027,
910
+ "step": 6050
911
+ },
912
+ {
913
+ "epoch": 0.8356164383561644,
914
+ "grad_norm": 10.4827880859375,
915
+ "learning_rate": 2.1660273972602742e-05,
916
+ "loss": 0.7853,
917
+ "step": 6100
918
+ },
919
+ {
920
+ "epoch": 0.8424657534246576,
921
+ "grad_norm": 20.461090087890625,
922
+ "learning_rate": 2.159178082191781e-05,
923
+ "loss": 0.9597,
924
+ "step": 6150
925
+ },
926
+ {
927
+ "epoch": 0.8493150684931506,
928
+ "grad_norm": 5.970315456390381,
929
+ "learning_rate": 2.1523287671232877e-05,
930
+ "loss": 0.8921,
931
+ "step": 6200
932
+ },
933
+ {
934
+ "epoch": 0.8561643835616438,
935
+ "grad_norm": 31.718955993652344,
936
+ "learning_rate": 2.1454794520547947e-05,
937
+ "loss": 0.7981,
938
+ "step": 6250
939
+ },
940
+ {
941
+ "epoch": 0.863013698630137,
942
+ "grad_norm": 7.239502906799316,
943
+ "learning_rate": 2.1386301369863016e-05,
944
+ "loss": 0.8386,
945
+ "step": 6300
946
+ },
947
+ {
948
+ "epoch": 0.8698630136986302,
949
+ "grad_norm": 8.74398422241211,
950
+ "learning_rate": 2.1317808219178086e-05,
951
+ "loss": 0.8466,
952
+ "step": 6350
953
+ },
954
+ {
955
+ "epoch": 0.8767123287671232,
956
+ "grad_norm": 26.66971206665039,
957
+ "learning_rate": 2.1249315068493152e-05,
958
+ "loss": 0.8054,
959
+ "step": 6400
960
+ },
961
+ {
962
+ "epoch": 0.8835616438356164,
963
+ "grad_norm": 30.80714988708496,
964
+ "learning_rate": 2.1180821917808218e-05,
965
+ "loss": 0.8719,
966
+ "step": 6450
967
+ },
968
+ {
969
+ "epoch": 0.8904109589041096,
970
+ "grad_norm": 10.245841026306152,
971
+ "learning_rate": 2.1112328767123287e-05,
972
+ "loss": 0.8324,
973
+ "step": 6500
974
+ },
975
+ {
976
+ "epoch": 0.8972602739726028,
977
+ "grad_norm": 11.401703834533691,
978
+ "learning_rate": 2.1043835616438357e-05,
979
+ "loss": 0.7636,
980
+ "step": 6550
981
+ },
982
+ {
983
+ "epoch": 0.9041095890410958,
984
+ "grad_norm": 10.791388511657715,
985
+ "learning_rate": 2.0975342465753423e-05,
986
+ "loss": 0.8163,
987
+ "step": 6600
988
+ },
989
+ {
990
+ "epoch": 0.910958904109589,
991
+ "grad_norm": 73.97101593017578,
992
+ "learning_rate": 2.0906849315068493e-05,
993
+ "loss": 0.8267,
994
+ "step": 6650
995
+ },
996
+ {
997
+ "epoch": 0.9178082191780822,
998
+ "grad_norm": 5.0418925285339355,
999
+ "learning_rate": 2.0838356164383562e-05,
1000
+ "loss": 0.7787,
1001
+ "step": 6700
1002
+ },
1003
+ {
1004
+ "epoch": 0.9246575342465754,
1005
+ "grad_norm": 16.25297737121582,
1006
+ "learning_rate": 2.076986301369863e-05,
1007
+ "loss": 0.8305,
1008
+ "step": 6750
1009
+ },
1010
+ {
1011
+ "epoch": 0.9315068493150684,
1012
+ "grad_norm": 15.411580085754395,
1013
+ "learning_rate": 2.0701369863013698e-05,
1014
+ "loss": 0.7606,
1015
+ "step": 6800
1016
+ },
1017
+ {
1018
+ "epoch": 0.9383561643835616,
1019
+ "grad_norm": 14.359014511108398,
1020
+ "learning_rate": 2.0632876712328767e-05,
1021
+ "loss": 0.8578,
1022
+ "step": 6850
1023
+ },
1024
+ {
1025
+ "epoch": 0.9452054794520548,
1026
+ "grad_norm": 21.724008560180664,
1027
+ "learning_rate": 2.0564383561643836e-05,
1028
+ "loss": 0.7836,
1029
+ "step": 6900
1030
+ },
1031
+ {
1032
+ "epoch": 0.952054794520548,
1033
+ "grad_norm": 16.269027709960938,
1034
+ "learning_rate": 2.0495890410958906e-05,
1035
+ "loss": 0.8681,
1036
+ "step": 6950
1037
+ },
1038
+ {
1039
+ "epoch": 0.958904109589041,
1040
+ "grad_norm": 33.833797454833984,
1041
+ "learning_rate": 2.0427397260273972e-05,
1042
+ "loss": 0.8034,
1043
+ "step": 7000
1044
+ },
1045
+ {
1046
+ "epoch": 0.958904109589041,
1047
+ "eval_exact_match": 84.64522232734153,
1048
+ "eval_f1": 91.36486838251115,
1049
+ "eval_runtime": 419.048,
1050
+ "eval_samples_per_second": 25.224,
1051
+ "eval_steps_per_second": 1.577,
1052
+ "step": 7000
1053
+ },
1054
+ {
1055
+ "epoch": 0.9657534246575342,
1056
+ "grad_norm": 11.25650691986084,
1057
+ "learning_rate": 2.035890410958904e-05,
1058
+ "loss": 0.8451,
1059
+ "step": 7050
1060
+ },
1061
+ {
1062
+ "epoch": 0.9726027397260274,
1063
+ "grad_norm": 42.16932678222656,
1064
+ "learning_rate": 2.029041095890411e-05,
1065
+ "loss": 0.8701,
1066
+ "step": 7100
1067
+ },
1068
+ {
1069
+ "epoch": 0.9794520547945206,
1070
+ "grad_norm": 23.863306045532227,
1071
+ "learning_rate": 2.022191780821918e-05,
1072
+ "loss": 0.8651,
1073
+ "step": 7150
1074
+ },
1075
+ {
1076
+ "epoch": 0.9863013698630136,
1077
+ "grad_norm": 40.55854034423828,
1078
+ "learning_rate": 2.0153424657534247e-05,
1079
+ "loss": 0.8481,
1080
+ "step": 7200
1081
+ },
1082
+ {
1083
+ "epoch": 0.9931506849315068,
1084
+ "grad_norm": 14.687817573547363,
1085
+ "learning_rate": 2.0084931506849316e-05,
1086
+ "loss": 0.8462,
1087
+ "step": 7250
1088
+ },
1089
+ {
1090
+ "epoch": 1.0,
1091
+ "grad_norm": 27.034555435180664,
1092
+ "learning_rate": 2.0016438356164386e-05,
1093
+ "loss": 0.7084,
1094
+ "step": 7300
1095
+ },
1096
+ {
1097
+ "epoch": 1.0068493150684932,
1098
+ "grad_norm": 20.166122436523438,
1099
+ "learning_rate": 1.9947945205479455e-05,
1100
+ "loss": 0.6103,
1101
+ "step": 7350
1102
+ },
1103
+ {
1104
+ "epoch": 1.0136986301369864,
1105
+ "grad_norm": 7.819751262664795,
1106
+ "learning_rate": 1.987945205479452e-05,
1107
+ "loss": 0.5915,
1108
+ "step": 7400
1109
+ },
1110
+ {
1111
+ "epoch": 1.0205479452054795,
1112
+ "grad_norm": 7.091070175170898,
1113
+ "learning_rate": 1.981095890410959e-05,
1114
+ "loss": 0.6133,
1115
+ "step": 7450
1116
+ },
1117
+ {
1118
+ "epoch": 1.0273972602739727,
1119
+ "grad_norm": 8.464035034179688,
1120
+ "learning_rate": 1.974246575342466e-05,
1121
+ "loss": 0.626,
1122
+ "step": 7500
1123
+ },
1124
+ {
1125
+ "epoch": 1.0342465753424657,
1126
+ "grad_norm": 9.27121353149414,
1127
+ "learning_rate": 1.9673972602739726e-05,
1128
+ "loss": 0.6166,
1129
+ "step": 7550
1130
+ },
1131
+ {
1132
+ "epoch": 1.0410958904109588,
1133
+ "grad_norm": 10.510052680969238,
1134
+ "learning_rate": 1.9605479452054796e-05,
1135
+ "loss": 0.5873,
1136
+ "step": 7600
1137
+ },
1138
+ {
1139
+ "epoch": 1.047945205479452,
1140
+ "grad_norm": 12.773880958557129,
1141
+ "learning_rate": 1.9536986301369865e-05,
1142
+ "loss": 0.6074,
1143
+ "step": 7650
1144
+ },
1145
+ {
1146
+ "epoch": 1.0547945205479452,
1147
+ "grad_norm": 5.2743425369262695,
1148
+ "learning_rate": 1.9468493150684935e-05,
1149
+ "loss": 0.5606,
1150
+ "step": 7700
1151
+ },
1152
+ {
1153
+ "epoch": 1.0616438356164384,
1154
+ "grad_norm": 45.79352569580078,
1155
+ "learning_rate": 1.9399999999999997e-05,
1156
+ "loss": 0.6101,
1157
+ "step": 7750
1158
+ },
1159
+ {
1160
+ "epoch": 1.0684931506849316,
1161
+ "grad_norm": 9.683128356933594,
1162
+ "learning_rate": 1.9331506849315067e-05,
1163
+ "loss": 0.521,
1164
+ "step": 7800
1165
+ },
1166
+ {
1167
+ "epoch": 1.0753424657534247,
1168
+ "grad_norm": 15.017319679260254,
1169
+ "learning_rate": 1.9263013698630136e-05,
1170
+ "loss": 0.6256,
1171
+ "step": 7850
1172
+ },
1173
+ {
1174
+ "epoch": 1.0821917808219177,
1175
+ "grad_norm": 17.09823226928711,
1176
+ "learning_rate": 1.9194520547945206e-05,
1177
+ "loss": 0.6204,
1178
+ "step": 7900
1179
+ },
1180
+ {
1181
+ "epoch": 1.0890410958904109,
1182
+ "grad_norm": 16.04283905029297,
1183
+ "learning_rate": 1.9126027397260272e-05,
1184
+ "loss": 0.6075,
1185
+ "step": 7950
1186
+ },
1187
+ {
1188
+ "epoch": 1.095890410958904,
1189
+ "grad_norm": 10.665413856506348,
1190
+ "learning_rate": 1.905753424657534e-05,
1191
+ "loss": 0.5418,
1192
+ "step": 8000
1193
+ },
1194
+ {
1195
+ "epoch": 1.095890410958904,
1196
+ "eval_exact_match": 85.19394512771996,
1197
+ "eval_f1": 92.0232735390316,
1198
+ "eval_runtime": 417.4898,
1199
+ "eval_samples_per_second": 25.318,
1200
+ "eval_steps_per_second": 1.583,
1201
+ "step": 8000
1202
+ },
1203
+ {
1204
+ "epoch": 1.1027397260273972,
1205
+ "grad_norm": 28.214811325073242,
1206
+ "learning_rate": 1.898904109589041e-05,
1207
+ "loss": 0.6014,
1208
+ "step": 8050
1209
+ },
1210
+ {
1211
+ "epoch": 1.1095890410958904,
1212
+ "grad_norm": 34.26095199584961,
1213
+ "learning_rate": 1.892054794520548e-05,
1214
+ "loss": 0.6195,
1215
+ "step": 8100
1216
+ },
1217
+ {
1218
+ "epoch": 1.1164383561643836,
1219
+ "grad_norm": 7.891120910644531,
1220
+ "learning_rate": 1.8852054794520546e-05,
1221
+ "loss": 0.723,
1222
+ "step": 8150
1223
+ },
1224
+ {
1225
+ "epoch": 1.1232876712328768,
1226
+ "grad_norm": 21.476566314697266,
1227
+ "learning_rate": 1.8783561643835616e-05,
1228
+ "loss": 0.6577,
1229
+ "step": 8200
1230
+ },
1231
+ {
1232
+ "epoch": 1.13013698630137,
1233
+ "grad_norm": 15.302453994750977,
1234
+ "learning_rate": 1.8715068493150685e-05,
1235
+ "loss": 0.5872,
1236
+ "step": 8250
1237
+ },
1238
+ {
1239
+ "epoch": 1.1369863013698631,
1240
+ "grad_norm": 22.77974510192871,
1241
+ "learning_rate": 1.8646575342465755e-05,
1242
+ "loss": 0.6372,
1243
+ "step": 8300
1244
+ },
1245
+ {
1246
+ "epoch": 1.143835616438356,
1247
+ "grad_norm": 7.8337297439575195,
1248
+ "learning_rate": 1.857808219178082e-05,
1249
+ "loss": 0.5487,
1250
+ "step": 8350
1251
+ },
1252
+ {
1253
+ "epoch": 1.1506849315068493,
1254
+ "grad_norm": 26.61360740661621,
1255
+ "learning_rate": 1.850958904109589e-05,
1256
+ "loss": 0.5555,
1257
+ "step": 8400
1258
+ },
1259
+ {
1260
+ "epoch": 1.1575342465753424,
1261
+ "grad_norm": 9.870406150817871,
1262
+ "learning_rate": 1.844109589041096e-05,
1263
+ "loss": 0.5728,
1264
+ "step": 8450
1265
+ },
1266
+ {
1267
+ "epoch": 1.1643835616438356,
1268
+ "grad_norm": 8.367820739746094,
1269
+ "learning_rate": 1.837260273972603e-05,
1270
+ "loss": 0.6081,
1271
+ "step": 8500
1272
+ },
1273
+ {
1274
+ "epoch": 1.1712328767123288,
1275
+ "grad_norm": 22.267404556274414,
1276
+ "learning_rate": 1.8304109589041095e-05,
1277
+ "loss": 0.6432,
1278
+ "step": 8550
1279
+ },
1280
+ {
1281
+ "epoch": 1.178082191780822,
1282
+ "grad_norm": 20.5167236328125,
1283
+ "learning_rate": 1.8235616438356165e-05,
1284
+ "loss": 0.5952,
1285
+ "step": 8600
1286
+ },
1287
+ {
1288
+ "epoch": 1.1849315068493151,
1289
+ "grad_norm": 13.30615520477295,
1290
+ "learning_rate": 1.8167123287671234e-05,
1291
+ "loss": 0.6364,
1292
+ "step": 8650
1293
+ },
1294
+ {
1295
+ "epoch": 1.191780821917808,
1296
+ "grad_norm": 16.95148277282715,
1297
+ "learning_rate": 1.8098630136986304e-05,
1298
+ "loss": 0.676,
1299
+ "step": 8700
1300
+ },
1301
+ {
1302
+ "epoch": 1.1986301369863013,
1303
+ "grad_norm": 19.168123245239258,
1304
+ "learning_rate": 1.803013698630137e-05,
1305
+ "loss": 0.5939,
1306
+ "step": 8750
1307
+ },
1308
+ {
1309
+ "epoch": 1.2054794520547945,
1310
+ "grad_norm": 13.934683799743652,
1311
+ "learning_rate": 1.796164383561644e-05,
1312
+ "loss": 0.6304,
1313
+ "step": 8800
1314
+ },
1315
+ {
1316
+ "epoch": 1.2123287671232876,
1317
+ "grad_norm": 16.010356903076172,
1318
+ "learning_rate": 1.789315068493151e-05,
1319
+ "loss": 0.7142,
1320
+ "step": 8850
1321
+ },
1322
+ {
1323
+ "epoch": 1.2191780821917808,
1324
+ "grad_norm": 20.430280685424805,
1325
+ "learning_rate": 1.7824657534246578e-05,
1326
+ "loss": 0.6254,
1327
+ "step": 8900
1328
+ },
1329
+ {
1330
+ "epoch": 1.226027397260274,
1331
+ "grad_norm": 44.09880828857422,
1332
+ "learning_rate": 1.7756164383561644e-05,
1333
+ "loss": 0.6566,
1334
+ "step": 8950
1335
+ },
1336
+ {
1337
+ "epoch": 1.2328767123287672,
1338
+ "grad_norm": 41.73299026489258,
1339
+ "learning_rate": 1.7687671232876714e-05,
1340
+ "loss": 0.5479,
1341
+ "step": 9000
1342
+ },
1343
+ {
1344
+ "epoch": 1.2328767123287672,
1345
+ "eval_exact_match": 85.38315988647115,
1346
+ "eval_f1": 92.15019137693149,
1347
+ "eval_runtime": 418.4718,
1348
+ "eval_samples_per_second": 25.259,
1349
+ "eval_steps_per_second": 1.58,
1350
+ "step": 9000
1351
+ },
1352
+ {
1353
+ "epoch": 1.2397260273972603,
1354
+ "grad_norm": 20.86771011352539,
1355
+ "learning_rate": 1.761917808219178e-05,
1356
+ "loss": 0.5645,
1357
+ "step": 9050
1358
+ },
1359
+ {
1360
+ "epoch": 1.2465753424657535,
1361
+ "grad_norm": 6.557628631591797,
1362
+ "learning_rate": 1.755068493150685e-05,
1363
+ "loss": 0.6265,
1364
+ "step": 9100
1365
+ },
1366
+ {
1367
+ "epoch": 1.2534246575342465,
1368
+ "grad_norm": 6.776758193969727,
1369
+ "learning_rate": 1.7482191780821915e-05,
1370
+ "loss": 0.5482,
1371
+ "step": 9150
1372
+ },
1373
+ {
1374
+ "epoch": 1.2602739726027397,
1375
+ "grad_norm": 20.538101196289062,
1376
+ "learning_rate": 1.7413698630136985e-05,
1377
+ "loss": 0.5793,
1378
+ "step": 9200
1379
+ },
1380
+ {
1381
+ "epoch": 1.2671232876712328,
1382
+ "grad_norm": 19.532896041870117,
1383
+ "learning_rate": 1.7345205479452054e-05,
1384
+ "loss": 0.6236,
1385
+ "step": 9250
1386
+ },
1387
+ {
1388
+ "epoch": 1.273972602739726,
1389
+ "grad_norm": 16.863801956176758,
1390
+ "learning_rate": 1.7276712328767124e-05,
1391
+ "loss": 0.5566,
1392
+ "step": 9300
1393
+ },
1394
+ {
1395
+ "epoch": 1.2808219178082192,
1396
+ "grad_norm": 46.87477493286133,
1397
+ "learning_rate": 1.720821917808219e-05,
1398
+ "loss": 0.5785,
1399
+ "step": 9350
1400
+ },
1401
+ {
1402
+ "epoch": 1.2876712328767124,
1403
+ "grad_norm": 10.397550582885742,
1404
+ "learning_rate": 1.713972602739726e-05,
1405
+ "loss": 0.7018,
1406
+ "step": 9400
1407
+ },
1408
+ {
1409
+ "epoch": 1.2945205479452055,
1410
+ "grad_norm": 6.6365275382995605,
1411
+ "learning_rate": 1.7072602739726028e-05,
1412
+ "loss": 0.5387,
1413
+ "step": 9450
1414
+ },
1415
+ {
1416
+ "epoch": 1.3013698630136985,
1417
+ "grad_norm": 24.18856430053711,
1418
+ "learning_rate": 1.7004109589041094e-05,
1419
+ "loss": 0.5884,
1420
+ "step": 9500
1421
+ },
1422
+ {
1423
+ "epoch": 1.308219178082192,
1424
+ "grad_norm": 33.152278900146484,
1425
+ "learning_rate": 1.6935616438356164e-05,
1426
+ "loss": 0.6385,
1427
+ "step": 9550
1428
+ },
1429
+ {
1430
+ "epoch": 1.3150684931506849,
1431
+ "grad_norm": 4.547526836395264,
1432
+ "learning_rate": 1.6867123287671233e-05,
1433
+ "loss": 0.5276,
1434
+ "step": 9600
1435
+ },
1436
+ {
1437
+ "epoch": 1.321917808219178,
1438
+ "grad_norm": 34.75554656982422,
1439
+ "learning_rate": 1.6798630136986303e-05,
1440
+ "loss": 0.6225,
1441
+ "step": 9650
1442
+ },
1443
+ {
1444
+ "epoch": 1.3287671232876712,
1445
+ "grad_norm": 15.248549461364746,
1446
+ "learning_rate": 1.673013698630137e-05,
1447
+ "loss": 0.5654,
1448
+ "step": 9700
1449
+ },
1450
+ {
1451
+ "epoch": 1.3356164383561644,
1452
+ "grad_norm": 19.826614379882812,
1453
+ "learning_rate": 1.6661643835616438e-05,
1454
+ "loss": 0.6356,
1455
+ "step": 9750
1456
+ },
1457
+ {
1458
+ "epoch": 1.3424657534246576,
1459
+ "grad_norm": 29.037817001342773,
1460
+ "learning_rate": 1.6593150684931508e-05,
1461
+ "loss": 0.5997,
1462
+ "step": 9800
1463
+ },
1464
+ {
1465
+ "epoch": 1.3493150684931507,
1466
+ "grad_norm": 5.311439037322998,
1467
+ "learning_rate": 1.6524657534246577e-05,
1468
+ "loss": 0.5873,
1469
+ "step": 9850
1470
+ },
1471
+ {
1472
+ "epoch": 1.356164383561644,
1473
+ "grad_norm": 44.800636291503906,
1474
+ "learning_rate": 1.6456164383561643e-05,
1475
+ "loss": 0.5232,
1476
+ "step": 9900
1477
+ },
1478
+ {
1479
+ "epoch": 1.3630136986301369,
1480
+ "grad_norm": 19.861589431762695,
1481
+ "learning_rate": 1.6387671232876713e-05,
1482
+ "loss": 0.5877,
1483
+ "step": 9950
1484
+ },
1485
+ {
1486
+ "epoch": 1.36986301369863,
1487
+ "grad_norm": 8.236838340759277,
1488
+ "learning_rate": 1.6319178082191782e-05,
1489
+ "loss": 0.5612,
1490
+ "step": 10000
1491
+ },
1492
+ {
1493
+ "epoch": 1.36986301369863,
1494
+ "eval_exact_match": 84.95742667928099,
1495
+ "eval_f1": 91.94903795865467,
1496
+ "eval_runtime": 417.7906,
1497
+ "eval_samples_per_second": 25.3,
1498
+ "eval_steps_per_second": 1.582,
1499
+ "step": 10000
1500
+ },
1501
+ {
1502
+ "epoch": 1.3767123287671232,
1503
+ "grad_norm": 5.004638671875,
1504
+ "learning_rate": 1.625068493150685e-05,
1505
+ "loss": 0.6488,
1506
+ "step": 10050
1507
+ },
1508
+ {
1509
+ "epoch": 1.3835616438356164,
1510
+ "grad_norm": 6.322940349578857,
1511
+ "learning_rate": 1.6182191780821918e-05,
1512
+ "loss": 0.53,
1513
+ "step": 10100
1514
+ },
1515
+ {
1516
+ "epoch": 1.3904109589041096,
1517
+ "grad_norm": 18.412700653076172,
1518
+ "learning_rate": 1.6113698630136987e-05,
1519
+ "loss": 0.6148,
1520
+ "step": 10150
1521
+ },
1522
+ {
1523
+ "epoch": 1.3972602739726028,
1524
+ "grad_norm": 10.37777042388916,
1525
+ "learning_rate": 1.6045205479452057e-05,
1526
+ "loss": 0.5429,
1527
+ "step": 10200
1528
+ },
1529
+ {
1530
+ "epoch": 1.404109589041096,
1531
+ "grad_norm": 14.783160209655762,
1532
+ "learning_rate": 1.5976712328767126e-05,
1533
+ "loss": 0.5469,
1534
+ "step": 10250
1535
+ },
1536
+ {
1537
+ "epoch": 1.410958904109589,
1538
+ "grad_norm": 9.634529113769531,
1539
+ "learning_rate": 1.5908219178082192e-05,
1540
+ "loss": 0.5273,
1541
+ "step": 10300
1542
+ },
1543
+ {
1544
+ "epoch": 1.4178082191780823,
1545
+ "grad_norm": 8.868270874023438,
1546
+ "learning_rate": 1.5839726027397258e-05,
1547
+ "loss": 0.6968,
1548
+ "step": 10350
1549
+ },
1550
+ {
1551
+ "epoch": 1.4246575342465753,
1552
+ "grad_norm": 2.925807476043701,
1553
+ "learning_rate": 1.5771232876712328e-05,
1554
+ "loss": 0.5857,
1555
+ "step": 10400
1556
+ },
1557
+ {
1558
+ "epoch": 1.4315068493150684,
1559
+ "grad_norm": 14.901018142700195,
1560
+ "learning_rate": 1.5702739726027397e-05,
1561
+ "loss": 0.5813,
1562
+ "step": 10450
1563
+ },
1564
+ {
1565
+ "epoch": 1.4383561643835616,
1566
+ "grad_norm": 18.872957229614258,
1567
+ "learning_rate": 1.5634246575342463e-05,
1568
+ "loss": 0.6373,
1569
+ "step": 10500
1570
+ },
1571
+ {
1572
+ "epoch": 1.4452054794520548,
1573
+ "grad_norm": 35.207847595214844,
1574
+ "learning_rate": 1.5565753424657533e-05,
1575
+ "loss": 0.5539,
1576
+ "step": 10550
1577
+ },
1578
+ {
1579
+ "epoch": 1.452054794520548,
1580
+ "grad_norm": 11.02206802368164,
1581
+ "learning_rate": 1.5497260273972602e-05,
1582
+ "loss": 0.5957,
1583
+ "step": 10600
1584
+ },
1585
+ {
1586
+ "epoch": 1.4589041095890412,
1587
+ "grad_norm": 25.235326766967773,
1588
+ "learning_rate": 1.5428767123287672e-05,
1589
+ "loss": 0.6574,
1590
+ "step": 10650
1591
+ },
1592
+ {
1593
+ "epoch": 1.4657534246575343,
1594
+ "grad_norm": 32.09264373779297,
1595
+ "learning_rate": 1.5360273972602738e-05,
1596
+ "loss": 0.5925,
1597
+ "step": 10700
1598
+ },
1599
+ {
1600
+ "epoch": 1.4726027397260273,
1601
+ "grad_norm": 19.04875373840332,
1602
+ "learning_rate": 1.5291780821917807e-05,
1603
+ "loss": 0.5259,
1604
+ "step": 10750
1605
+ },
1606
+ {
1607
+ "epoch": 1.4794520547945205,
1608
+ "grad_norm": 16.185894012451172,
1609
+ "learning_rate": 1.5223287671232877e-05,
1610
+ "loss": 0.6109,
1611
+ "step": 10800
1612
+ },
1613
+ {
1614
+ "epoch": 1.4863013698630136,
1615
+ "grad_norm": 17.464332580566406,
1616
+ "learning_rate": 1.5154794520547946e-05,
1617
+ "loss": 0.5961,
1618
+ "step": 10850
1619
+ },
1620
+ {
1621
+ "epoch": 1.4931506849315068,
1622
+ "grad_norm": 8.539608001708984,
1623
+ "learning_rate": 1.5086301369863012e-05,
1624
+ "loss": 0.5867,
1625
+ "step": 10900
1626
+ },
1627
+ {
1628
+ "epoch": 1.5,
1629
+ "grad_norm": 28.3868465423584,
1630
+ "learning_rate": 1.5017808219178082e-05,
1631
+ "loss": 0.6098,
1632
+ "step": 10950
1633
+ },
1634
+ {
1635
+ "epoch": 1.5068493150684932,
1636
+ "grad_norm": 6.247623443603516,
1637
+ "learning_rate": 1.4949315068493151e-05,
1638
+ "loss": 0.6059,
1639
+ "step": 11000
1640
+ },
1641
+ {
1642
+ "epoch": 1.5068493150684932,
1643
+ "eval_exact_match": 85.22232734153263,
1644
+ "eval_f1": 92.06076201446909,
1645
+ "eval_runtime": 416.847,
1646
+ "eval_samples_per_second": 25.357,
1647
+ "eval_steps_per_second": 1.586,
1648
+ "step": 11000
1649
+ },
1650
+ {
1651
+ "epoch": 1.5136986301369864,
1652
+ "grad_norm": 22.93709373474121,
1653
+ "learning_rate": 1.4880821917808219e-05,
1654
+ "loss": 0.5952,
1655
+ "step": 11050
1656
+ },
1657
+ {
1658
+ "epoch": 1.5205479452054793,
1659
+ "grad_norm": 40.23944091796875,
1660
+ "learning_rate": 1.4812328767123289e-05,
1661
+ "loss": 0.5395,
1662
+ "step": 11100
1663
+ },
1664
+ {
1665
+ "epoch": 1.5273972602739727,
1666
+ "grad_norm": 18.1081485748291,
1667
+ "learning_rate": 1.4743835616438356e-05,
1668
+ "loss": 0.5989,
1669
+ "step": 11150
1670
+ },
1671
+ {
1672
+ "epoch": 1.5342465753424657,
1673
+ "grad_norm": 34.16764831542969,
1674
+ "learning_rate": 1.4675342465753426e-05,
1675
+ "loss": 0.5738,
1676
+ "step": 11200
1677
+ },
1678
+ {
1679
+ "epoch": 1.541095890410959,
1680
+ "grad_norm": 8.558218002319336,
1681
+ "learning_rate": 1.4606849315068494e-05,
1682
+ "loss": 0.5767,
1683
+ "step": 11250
1684
+ },
1685
+ {
1686
+ "epoch": 1.547945205479452,
1687
+ "grad_norm": 50.398895263671875,
1688
+ "learning_rate": 1.4538356164383563e-05,
1689
+ "loss": 0.581,
1690
+ "step": 11300
1691
+ },
1692
+ {
1693
+ "epoch": 1.5547945205479452,
1694
+ "grad_norm": 8.642769813537598,
1695
+ "learning_rate": 1.4469863013698629e-05,
1696
+ "loss": 0.6654,
1697
+ "step": 11350
1698
+ },
1699
+ {
1700
+ "epoch": 1.5616438356164384,
1701
+ "grad_norm": 22.660131454467773,
1702
+ "learning_rate": 1.4401369863013699e-05,
1703
+ "loss": 0.6377,
1704
+ "step": 11400
1705
+ },
1706
+ {
1707
+ "epoch": 1.5684931506849316,
1708
+ "grad_norm": 17.38098907470703,
1709
+ "learning_rate": 1.4332876712328766e-05,
1710
+ "loss": 0.5625,
1711
+ "step": 11450
1712
+ },
1713
+ {
1714
+ "epoch": 1.5753424657534247,
1715
+ "grad_norm": 9.987977981567383,
1716
+ "learning_rate": 1.4264383561643836e-05,
1717
+ "loss": 0.5191,
1718
+ "step": 11500
1719
+ },
1720
+ {
1721
+ "epoch": 1.5821917808219177,
1722
+ "grad_norm": 42.332763671875,
1723
+ "learning_rate": 1.4195890410958904e-05,
1724
+ "loss": 0.6211,
1725
+ "step": 11550
1726
+ },
1727
+ {
1728
+ "epoch": 1.589041095890411,
1729
+ "grad_norm": 8.304152488708496,
1730
+ "learning_rate": 1.4127397260273973e-05,
1731
+ "loss": 0.6175,
1732
+ "step": 11600
1733
+ },
1734
+ {
1735
+ "epoch": 1.595890410958904,
1736
+ "grad_norm": 142.1208038330078,
1737
+ "learning_rate": 1.4058904109589041e-05,
1738
+ "loss": 0.5902,
1739
+ "step": 11650
1740
+ },
1741
+ {
1742
+ "epoch": 1.6027397260273972,
1743
+ "grad_norm": 13.38881778717041,
1744
+ "learning_rate": 1.399041095890411e-05,
1745
+ "loss": 0.5528,
1746
+ "step": 11700
1747
+ },
1748
+ {
1749
+ "epoch": 1.6095890410958904,
1750
+ "grad_norm": 8.86255168914795,
1751
+ "learning_rate": 1.3921917808219178e-05,
1752
+ "loss": 0.6205,
1753
+ "step": 11750
1754
+ },
1755
+ {
1756
+ "epoch": 1.6164383561643836,
1757
+ "grad_norm": 33.42393493652344,
1758
+ "learning_rate": 1.3853424657534248e-05,
1759
+ "loss": 0.6477,
1760
+ "step": 11800
1761
+ },
1762
+ {
1763
+ "epoch": 1.6232876712328768,
1764
+ "grad_norm": 15.220393180847168,
1765
+ "learning_rate": 1.3784931506849315e-05,
1766
+ "loss": 0.5493,
1767
+ "step": 11850
1768
+ },
1769
+ {
1770
+ "epoch": 1.6301369863013697,
1771
+ "grad_norm": 9.7684907913208,
1772
+ "learning_rate": 1.3716438356164385e-05,
1773
+ "loss": 0.5256,
1774
+ "step": 11900
1775
+ },
1776
+ {
1777
+ "epoch": 1.6369863013698631,
1778
+ "grad_norm": 19.64045524597168,
1779
+ "learning_rate": 1.3647945205479453e-05,
1780
+ "loss": 0.5653,
1781
+ "step": 11950
1782
+ },
1783
+ {
1784
+ "epoch": 1.643835616438356,
1785
+ "grad_norm": 28.879430770874023,
1786
+ "learning_rate": 1.357945205479452e-05,
1787
+ "loss": 0.5989,
1788
+ "step": 12000
1789
+ },
1790
+ {
1791
+ "epoch": 1.643835616438356,
1792
+ "eval_exact_match": 85.12771996215704,
1793
+ "eval_f1": 91.80306824412318,
1794
+ "eval_runtime": 417.2891,
1795
+ "eval_samples_per_second": 25.33,
1796
+ "eval_steps_per_second": 1.584,
1797
+ "step": 12000
1798
+ },
1799
+ {
1800
+ "epoch": 1.6506849315068495,
1801
+ "grad_norm": 7.393039226531982,
1802
+ "learning_rate": 1.3510958904109588e-05,
1803
+ "loss": 0.5271,
1804
+ "step": 12050
1805
+ },
1806
+ {
1807
+ "epoch": 1.6575342465753424,
1808
+ "grad_norm": 10.611188888549805,
1809
+ "learning_rate": 1.3442465753424658e-05,
1810
+ "loss": 0.5939,
1811
+ "step": 12100
1812
+ },
1813
+ {
1814
+ "epoch": 1.6643835616438356,
1815
+ "grad_norm": 9.510908126831055,
1816
+ "learning_rate": 1.3373972602739725e-05,
1817
+ "loss": 0.5574,
1818
+ "step": 12150
1819
+ },
1820
+ {
1821
+ "epoch": 1.6712328767123288,
1822
+ "grad_norm": 15.351234436035156,
1823
+ "learning_rate": 1.3305479452054795e-05,
1824
+ "loss": 0.5861,
1825
+ "step": 12200
1826
+ },
1827
+ {
1828
+ "epoch": 1.678082191780822,
1829
+ "grad_norm": 31.311676025390625,
1830
+ "learning_rate": 1.3236986301369863e-05,
1831
+ "loss": 0.5568,
1832
+ "step": 12250
1833
+ },
1834
+ {
1835
+ "epoch": 1.6849315068493151,
1836
+ "grad_norm": 4.728596210479736,
1837
+ "learning_rate": 1.3168493150684932e-05,
1838
+ "loss": 0.5431,
1839
+ "step": 12300
1840
+ },
1841
+ {
1842
+ "epoch": 1.691780821917808,
1843
+ "grad_norm": 21.880786895751953,
1844
+ "learning_rate": 1.31e-05,
1845
+ "loss": 0.5993,
1846
+ "step": 12350
1847
+ },
1848
+ {
1849
+ "epoch": 1.6986301369863015,
1850
+ "grad_norm": 5.625259876251221,
1851
+ "learning_rate": 1.303150684931507e-05,
1852
+ "loss": 0.6345,
1853
+ "step": 12400
1854
+ },
1855
+ {
1856
+ "epoch": 1.7054794520547945,
1857
+ "grad_norm": 12.03124713897705,
1858
+ "learning_rate": 1.2963013698630137e-05,
1859
+ "loss": 0.6195,
1860
+ "step": 12450
1861
+ },
1862
+ {
1863
+ "epoch": 1.7123287671232876,
1864
+ "grad_norm": 11.920297622680664,
1865
+ "learning_rate": 1.2894520547945207e-05,
1866
+ "loss": 0.5523,
1867
+ "step": 12500
1868
+ },
1869
+ {
1870
+ "epoch": 1.7191780821917808,
1871
+ "grad_norm": 12.449995994567871,
1872
+ "learning_rate": 1.2826027397260274e-05,
1873
+ "loss": 0.5442,
1874
+ "step": 12550
1875
+ },
1876
+ {
1877
+ "epoch": 1.726027397260274,
1878
+ "grad_norm": 15.602882385253906,
1879
+ "learning_rate": 1.2757534246575342e-05,
1880
+ "loss": 0.4794,
1881
+ "step": 12600
1882
+ },
1883
+ {
1884
+ "epoch": 1.7328767123287672,
1885
+ "grad_norm": 27.904523849487305,
1886
+ "learning_rate": 1.268904109589041e-05,
1887
+ "loss": 0.5184,
1888
+ "step": 12650
1889
+ },
1890
+ {
1891
+ "epoch": 1.7397260273972601,
1892
+ "grad_norm": 6.819875717163086,
1893
+ "learning_rate": 1.262054794520548e-05,
1894
+ "loss": 0.4637,
1895
+ "step": 12700
1896
+ },
1897
+ {
1898
+ "epoch": 1.7465753424657535,
1899
+ "grad_norm": 17.69037437438965,
1900
+ "learning_rate": 1.2552054794520547e-05,
1901
+ "loss": 0.5248,
1902
+ "step": 12750
1903
+ },
1904
+ {
1905
+ "epoch": 1.7534246575342465,
1906
+ "grad_norm": 25.76197052001953,
1907
+ "learning_rate": 1.2483561643835617e-05,
1908
+ "loss": 0.6165,
1909
+ "step": 12800
1910
+ },
1911
+ {
1912
+ "epoch": 1.7602739726027399,
1913
+ "grad_norm": 5.317371845245361,
1914
+ "learning_rate": 1.2415068493150685e-05,
1915
+ "loss": 0.5206,
1916
+ "step": 12850
1917
+ },
1918
+ {
1919
+ "epoch": 1.7671232876712328,
1920
+ "grad_norm": 8.703845977783203,
1921
+ "learning_rate": 1.2346575342465754e-05,
1922
+ "loss": 0.5994,
1923
+ "step": 12900
1924
+ },
1925
+ {
1926
+ "epoch": 1.773972602739726,
1927
+ "grad_norm": 16.243268966674805,
1928
+ "learning_rate": 1.2278082191780822e-05,
1929
+ "loss": 0.5416,
1930
+ "step": 12950
1931
+ },
1932
+ {
1933
+ "epoch": 1.7808219178082192,
1934
+ "grad_norm": 15.478755950927734,
1935
+ "learning_rate": 1.2209589041095891e-05,
1936
+ "loss": 0.6375,
1937
+ "step": 13000
1938
+ },
1939
+ {
1940
+ "epoch": 1.7808219178082192,
1941
+ "eval_exact_match": 85.65752128666036,
1942
+ "eval_f1": 92.15631910787795,
1943
+ "eval_runtime": 416.7008,
1944
+ "eval_samples_per_second": 25.366,
1945
+ "eval_steps_per_second": 1.586,
1946
+ "step": 13000
1947
+ },
1948
+ {
1949
+ "epoch": 1.7876712328767124,
1950
+ "grad_norm": 14.403656005859375,
1951
+ "learning_rate": 1.2141095890410959e-05,
1952
+ "loss": 0.6117,
1953
+ "step": 13050
1954
+ },
1955
+ {
1956
+ "epoch": 1.7945205479452055,
1957
+ "grad_norm": 22.657033920288086,
1958
+ "learning_rate": 1.2072602739726028e-05,
1959
+ "loss": 0.5757,
1960
+ "step": 13100
1961
+ },
1962
+ {
1963
+ "epoch": 1.8013698630136985,
1964
+ "grad_norm": 13.059876441955566,
1965
+ "learning_rate": 1.2004109589041096e-05,
1966
+ "loss": 0.5765,
1967
+ "step": 13150
1968
+ },
1969
+ {
1970
+ "epoch": 1.808219178082192,
1971
+ "grad_norm": 2.8631415367126465,
1972
+ "learning_rate": 1.1935616438356166e-05,
1973
+ "loss": 0.5688,
1974
+ "step": 13200
1975
+ },
1976
+ {
1977
+ "epoch": 1.8150684931506849,
1978
+ "grad_norm": 8.850284576416016,
1979
+ "learning_rate": 1.1867123287671232e-05,
1980
+ "loss": 0.5713,
1981
+ "step": 13250
1982
+ },
1983
+ {
1984
+ "epoch": 1.821917808219178,
1985
+ "grad_norm": 9.604997634887695,
1986
+ "learning_rate": 1.1798630136986301e-05,
1987
+ "loss": 0.5602,
1988
+ "step": 13300
1989
+ },
1990
+ {
1991
+ "epoch": 1.8287671232876712,
1992
+ "grad_norm": 27.603858947753906,
1993
+ "learning_rate": 1.1730136986301369e-05,
+ "loss": 0.6105,
+ "step": 13350
+ },
+ {
+ "epoch": 1.8356164383561644,
+ "grad_norm": 10.399881362915039,
+ "learning_rate": 1.1661643835616439e-05,
+ "loss": 0.5744,
+ "step": 13400
+ },
+ {
+ "epoch": 1.8424657534246576,
+ "grad_norm": 16.775104522705078,
+ "learning_rate": 1.1593150684931506e-05,
+ "loss": 0.5776,
+ "step": 13450
+ },
+ {
+ "epoch": 1.8493150684931505,
+ "grad_norm": 8.317610740661621,
+ "learning_rate": 1.1524657534246576e-05,
+ "loss": 0.6075,
+ "step": 13500
+ },
+ {
+ "epoch": 1.856164383561644,
+ "grad_norm": 18.899354934692383,
+ "learning_rate": 1.1456164383561644e-05,
+ "loss": 0.5967,
+ "step": 13550
+ },
+ {
+ "epoch": 1.8630136986301369,
+ "grad_norm": 10.251896858215332,
+ "learning_rate": 1.1387671232876713e-05,
+ "loss": 0.6121,
+ "step": 13600
+ },
+ {
+ "epoch": 1.8698630136986303,
+ "grad_norm": 24.907438278198242,
+ "learning_rate": 1.131917808219178e-05,
+ "loss": 0.6348,
+ "step": 13650
+ },
+ {
+ "epoch": 1.8767123287671232,
+ "grad_norm": 17.239213943481445,
+ "learning_rate": 1.125068493150685e-05,
+ "loss": 0.5835,
+ "step": 13700
+ },
+ {
+ "epoch": 1.8835616438356164,
+ "grad_norm": 14.36588191986084,
+ "learning_rate": 1.1182191780821918e-05,
+ "loss": 0.5681,
+ "step": 13750
+ },
+ {
+ "epoch": 1.8904109589041096,
+ "grad_norm": 10.424467086791992,
+ "learning_rate": 1.1113698630136988e-05,
+ "loss": 0.5334,
+ "step": 13800
+ },
+ {
+ "epoch": 1.8972602739726028,
+ "grad_norm": 11.122437477111816,
+ "learning_rate": 1.1045205479452055e-05,
+ "loss": 0.5594,
+ "step": 13850
+ },
+ {
+ "epoch": 1.904109589041096,
+ "grad_norm": 10.735795021057129,
+ "learning_rate": 1.0976712328767123e-05,
+ "loss": 0.5565,
+ "step": 13900
+ },
+ {
+ "epoch": 1.910958904109589,
+ "grad_norm": 10.91677474975586,
+ "learning_rate": 1.0908219178082191e-05,
+ "loss": 0.5945,
+ "step": 13950
+ },
+ {
+ "epoch": 1.9178082191780823,
+ "grad_norm": 7.375208377838135,
+ "learning_rate": 1.083972602739726e-05,
+ "loss": 0.536,
+ "step": 14000
+ },
+ {
+ "epoch": 1.9178082191780823,
+ "eval_exact_match": 86.10217596972564,
+ "eval_f1": 92.4099426929563,
+ "eval_runtime": 417.908,
+ "eval_samples_per_second": 25.293,
+ "eval_steps_per_second": 1.582,
+ "step": 14000
+ },
+ {
+ "epoch": 1.9246575342465753,
+ "grad_norm": 32.034141540527344,
+ "learning_rate": 1.0771232876712328e-05,
+ "loss": 0.5696,
+ "step": 14050
+ },
+ {
+ "epoch": 1.9315068493150684,
+ "grad_norm": 21.64228057861328,
+ "learning_rate": 1.0702739726027398e-05,
+ "loss": 0.6152,
+ "step": 14100
+ },
+ {
+ "epoch": 1.9383561643835616,
+ "grad_norm": 21.606149673461914,
+ "learning_rate": 1.0634246575342465e-05,
+ "loss": 0.5286,
+ "step": 14150
+ },
+ {
+ "epoch": 1.9452054794520548,
+ "grad_norm": 29.915918350219727,
+ "learning_rate": 1.0565753424657535e-05,
+ "loss": 0.5781,
+ "step": 14200
+ },
+ {
+ "epoch": 1.952054794520548,
+ "grad_norm": 17.88494873046875,
+ "learning_rate": 1.0497260273972603e-05,
+ "loss": 0.6066,
+ "step": 14250
+ },
+ {
+ "epoch": 1.958904109589041,
+ "grad_norm": 24.821956634521484,
+ "learning_rate": 1.0428767123287672e-05,
+ "loss": 0.5819,
+ "step": 14300
+ },
+ {
+ "epoch": 1.9657534246575343,
+ "grad_norm": 34.31120681762695,
+ "learning_rate": 1.036027397260274e-05,
+ "loss": 0.6531,
+ "step": 14350
+ },
+ {
+ "epoch": 1.9726027397260273,
+ "grad_norm": 13.344314575195312,
+ "learning_rate": 1.029178082191781e-05,
+ "loss": 0.5585,
+ "step": 14400
+ },
+ {
+ "epoch": 1.9794520547945207,
+ "grad_norm": 40.556358337402344,
+ "learning_rate": 1.0223287671232877e-05,
+ "loss": 0.5257,
+ "step": 14450
+ },
+ {
+ "epoch": 1.9863013698630136,
+ "grad_norm": 11.713874816894531,
+ "learning_rate": 1.0154794520547947e-05,
+ "loss": 0.6307,
+ "step": 14500
+ },
+ {
+ "epoch": 1.9931506849315068,
+ "grad_norm": 9.927220344543457,
+ "learning_rate": 1.0086301369863013e-05,
+ "loss": 0.5814,
+ "step": 14550
+ },
+ {
+ "epoch": 2.0,
+ "grad_norm": 14.175002098083496,
+ "learning_rate": 1.0017808219178082e-05,
+ "loss": 0.6428,
+ "step": 14600
+ },
+ {
+ "epoch": 2.006849315068493,
+ "grad_norm": 28.156503677368164,
+ "learning_rate": 9.94931506849315e-06,
+ "loss": 0.3428,
+ "step": 14650
+ },
+ {
+ "epoch": 2.0136986301369864,
+ "grad_norm": 12.531486511230469,
+ "learning_rate": 9.88082191780822e-06,
+ "loss": 0.3625,
+ "step": 14700
+ },
+ {
+ "epoch": 2.0205479452054793,
+ "grad_norm": 6.839471340179443,
+ "learning_rate": 9.812328767123287e-06,
+ "loss": 0.3675,
+ "step": 14750
+ },
+ {
+ "epoch": 2.0273972602739727,
+ "grad_norm": 9.022225379943848,
+ "learning_rate": 9.743835616438357e-06,
+ "loss": 0.3613,
+ "step": 14800
+ },
+ {
+ "epoch": 2.0342465753424657,
+ "grad_norm": 9.764174461364746,
+ "learning_rate": 9.675342465753424e-06,
+ "loss": 0.3353,
+ "step": 14850
+ },
+ {
+ "epoch": 2.041095890410959,
+ "grad_norm": 8.139081001281738,
+ "learning_rate": 9.606849315068494e-06,
+ "loss": 0.3749,
+ "step": 14900
+ },
+ {
+ "epoch": 2.047945205479452,
+ "grad_norm": 19.99053955078125,
+ "learning_rate": 9.538356164383562e-06,
+ "loss": 0.3434,
+ "step": 14950
+ },
+ {
+ "epoch": 2.0547945205479454,
+ "grad_norm": 4.129643440246582,
+ "learning_rate": 9.469863013698631e-06,
+ "loss": 0.3753,
+ "step": 15000
+ },
+ {
+ "epoch": 2.0547945205479454,
+ "eval_exact_match": 85.34531693472091,
+ "eval_f1": 92.12097175946987,
+ "eval_runtime": 417.8979,
+ "eval_samples_per_second": 25.293,
+ "eval_steps_per_second": 1.582,
+ "step": 15000
+ },
+ {
+ "epoch": 2.0616438356164384,
+ "grad_norm": 33.54549026489258,
+ "learning_rate": 9.401369863013699e-06,
+ "loss": 0.3592,
+ "step": 15050
+ },
+ {
+ "epoch": 2.0684931506849313,
+ "grad_norm": 24.34673500061035,
+ "learning_rate": 9.332876712328768e-06,
+ "loss": 0.3847,
+ "step": 15100
+ },
+ {
+ "epoch": 2.0753424657534247,
+ "grad_norm": 5.306540489196777,
+ "learning_rate": 9.264383561643836e-06,
+ "loss": 0.3157,
+ "step": 15150
+ },
+ {
+ "epoch": 2.0821917808219177,
+ "grad_norm": 5.992457866668701,
+ "learning_rate": 9.195890410958904e-06,
+ "loss": 0.3308,
+ "step": 15200
+ },
+ {
+ "epoch": 2.089041095890411,
+ "grad_norm": 15.749208450317383,
+ "learning_rate": 9.127397260273972e-06,
+ "loss": 0.3631,
+ "step": 15250
+ },
+ {
+ "epoch": 2.095890410958904,
+ "grad_norm": 10.97288703918457,
+ "learning_rate": 9.058904109589041e-06,
+ "loss": 0.3107,
+ "step": 15300
+ },
+ {
+ "epoch": 2.1027397260273974,
+ "grad_norm": 5.519676208496094,
+ "learning_rate": 8.990410958904109e-06,
+ "loss": 0.3225,
+ "step": 15350
+ },
+ {
+ "epoch": 2.1095890410958904,
+ "grad_norm": 24.270828247070312,
+ "learning_rate": 8.921917808219179e-06,
+ "loss": 0.3833,
+ "step": 15400
+ },
+ {
+ "epoch": 2.1164383561643834,
+ "grad_norm": 17.408357620239258,
+ "learning_rate": 8.854794520547946e-06,
+ "loss": 0.3484,
+ "step": 15450
+ },
+ {
+ "epoch": 2.1232876712328768,
+ "grad_norm": 19.13545036315918,
+ "learning_rate": 8.786301369863013e-06,
+ "loss": 0.3417,
+ "step": 15500
+ },
+ {
+ "epoch": 2.1301369863013697,
+ "grad_norm": 12.084640502929688,
+ "learning_rate": 8.717808219178083e-06,
+ "loss": 0.3475,
+ "step": 15550
+ },
+ {
+ "epoch": 2.136986301369863,
+ "grad_norm": 32.05975341796875,
+ "learning_rate": 8.64931506849315e-06,
+ "loss": 0.3406,
+ "step": 15600
+ },
+ {
+ "epoch": 2.143835616438356,
+ "grad_norm": 19.190528869628906,
+ "learning_rate": 8.58082191780822e-06,
+ "loss": 0.4047,
+ "step": 15650
+ },
+ {
+ "epoch": 2.1506849315068495,
+ "grad_norm": 2.928743600845337,
+ "learning_rate": 8.512328767123288e-06,
+ "loss": 0.3339,
+ "step": 15700
+ },
+ {
+ "epoch": 2.1575342465753424,
+ "grad_norm": 11.721294403076172,
+ "learning_rate": 8.443835616438357e-06,
+ "loss": 0.3314,
+ "step": 15750
+ },
+ {
+ "epoch": 2.1643835616438354,
+ "grad_norm": 8.114335060119629,
+ "learning_rate": 8.375342465753425e-06,
+ "loss": 0.3736,
+ "step": 15800
+ },
+ {
+ "epoch": 2.171232876712329,
+ "grad_norm": 14.555135726928711,
+ "learning_rate": 8.306849315068495e-06,
+ "loss": 0.3332,
+ "step": 15850
+ },
+ {
+ "epoch": 2.1780821917808217,
+ "grad_norm": 81.77395629882812,
+ "learning_rate": 8.23835616438356e-06,
+ "loss": 0.3049,
+ "step": 15900
+ },
+ {
+ "epoch": 2.184931506849315,
+ "grad_norm": 8.095141410827637,
+ "learning_rate": 8.16986301369863e-06,
+ "loss": 0.2637,
+ "step": 15950
+ },
+ {
+ "epoch": 2.191780821917808,
+ "grad_norm": 17.374752044677734,
+ "learning_rate": 8.101369863013698e-06,
+ "loss": 0.4439,
+ "step": 16000
+ },
+ {
+ "epoch": 2.191780821917808,
+ "eval_exact_match": 85.61021759697256,
+ "eval_f1": 92.16749429035703,
+ "eval_runtime": 416.4255,
+ "eval_samples_per_second": 25.383,
+ "eval_steps_per_second": 1.587,
+ "step": 16000
+ },
+ {
+ "epoch": 2.1986301369863015,
+ "grad_norm": 8.427570343017578,
+ "learning_rate": 8.032876712328767e-06,
+ "loss": 0.2902,
+ "step": 16050
+ },
+ {
+ "epoch": 2.2054794520547945,
+ "grad_norm": 6.139337539672852,
+ "learning_rate": 7.964383561643835e-06,
+ "loss": 0.3242,
+ "step": 16100
+ },
+ {
+ "epoch": 2.212328767123288,
+ "grad_norm": 26.255056381225586,
+ "learning_rate": 7.895890410958905e-06,
+ "loss": 0.3496,
+ "step": 16150
+ },
+ {
+ "epoch": 2.219178082191781,
+ "grad_norm": 16.986501693725586,
+ "learning_rate": 7.827397260273972e-06,
+ "loss": 0.3297,
+ "step": 16200
+ },
+ {
+ "epoch": 2.2260273972602738,
+ "grad_norm": 17.626201629638672,
+ "learning_rate": 7.758904109589042e-06,
+ "loss": 0.3733,
+ "step": 16250
+ },
+ {
+ "epoch": 2.232876712328767,
+ "grad_norm": 20.750221252441406,
+ "learning_rate": 7.69041095890411e-06,
+ "loss": 0.3316,
+ "step": 16300
+ },
+ {
+ "epoch": 2.23972602739726,
+ "grad_norm": 42.70435333251953,
+ "learning_rate": 7.621917808219179e-06,
+ "loss": 0.3721,
+ "step": 16350
+ },
+ {
+ "epoch": 2.2465753424657535,
+ "grad_norm": 11.971570014953613,
+ "learning_rate": 7.553424657534246e-06,
+ "loss": 0.3056,
+ "step": 16400
+ },
+ {
+ "epoch": 2.2534246575342465,
+ "grad_norm": 7.298882484436035,
+ "learning_rate": 7.484931506849315e-06,
+ "loss": 0.4077,
+ "step": 16450
+ },
+ {
+ "epoch": 2.26027397260274,
+ "grad_norm": 8.013861656188965,
+ "learning_rate": 7.416438356164383e-06,
+ "loss": 0.3488,
+ "step": 16500
+ },
+ {
+ "epoch": 2.267123287671233,
+ "grad_norm": 10.631738662719727,
+ "learning_rate": 7.347945205479452e-06,
+ "loss": 0.3567,
+ "step": 16550
+ },
+ {
+ "epoch": 2.2739726027397262,
+ "grad_norm": 47.24718475341797,
+ "learning_rate": 7.2794520547945206e-06,
+ "loss": 0.3319,
+ "step": 16600
+ },
+ {
+ "epoch": 2.280821917808219,
+ "grad_norm": 31.66643714904785,
+ "learning_rate": 7.210958904109589e-06,
+ "loss": 0.3768,
+ "step": 16650
+ },
+ {
+ "epoch": 2.287671232876712,
+ "grad_norm": 34.2202033996582,
+ "learning_rate": 7.142465753424657e-06,
+ "loss": 0.4115,
+ "step": 16700
+ },
+ {
+ "epoch": 2.2945205479452055,
+ "grad_norm": 35.246585845947266,
+ "learning_rate": 7.073972602739726e-06,
+ "loss": 0.3538,
+ "step": 16750
+ },
+ {
+ "epoch": 2.3013698630136985,
+ "grad_norm": 17.598068237304688,
+ "learning_rate": 7.005479452054794e-06,
+ "loss": 0.3189,
+ "step": 16800
+ },
+ {
+ "epoch": 2.308219178082192,
+ "grad_norm": 31.153356552124023,
+ "learning_rate": 6.936986301369863e-06,
+ "loss": 0.4082,
+ "step": 16850
+ },
+ {
+ "epoch": 2.315068493150685,
+ "grad_norm": 3.9330878257751465,
+ "learning_rate": 6.8684931506849315e-06,
+ "loss": 0.366,
+ "step": 16900
+ },
+ {
+ "epoch": 2.3219178082191783,
+ "grad_norm": 13.598389625549316,
+ "learning_rate": 6.8e-06,
+ "loss": 0.3723,
+ "step": 16950
+ },
+ {
+ "epoch": 2.328767123287671,
+ "grad_norm": 12.808009147644043,
+ "learning_rate": 6.731506849315069e-06,
+ "loss": 0.323,
+ "step": 17000
+ },
+ {
+ "epoch": 2.328767123287671,
+ "eval_exact_match": 85.30747398297068,
+ "eval_f1": 92.19853158333738,
+ "eval_runtime": 418.1668,
+ "eval_samples_per_second": 25.277,
+ "eval_steps_per_second": 1.581,
+ "step": 17000
+ },
+ {
+ "epoch": 2.3356164383561646,
+ "grad_norm": 7.125260353088379,
+ "learning_rate": 6.6630136986301365e-06,
+ "loss": 0.3353,
+ "step": 17050
+ },
+ {
+ "epoch": 2.3424657534246576,
+ "grad_norm": 11.679519653320312,
+ "learning_rate": 6.594520547945205e-06,
+ "loss": 0.3704,
+ "step": 17100
+ },
+ {
+ "epoch": 2.3493150684931505,
+ "grad_norm": 12.352055549621582,
+ "learning_rate": 6.526027397260274e-06,
+ "loss": 0.3419,
+ "step": 17150
+ },
+ {
+ "epoch": 2.356164383561644,
+ "grad_norm": 5.7046709060668945,
+ "learning_rate": 6.457534246575342e-06,
+ "loss": 0.3601,
+ "step": 17200
+ },
+ {
+ "epoch": 2.363013698630137,
+ "grad_norm": 6.679656505584717,
+ "learning_rate": 6.389041095890411e-06,
+ "loss": 0.3537,
+ "step": 17250
+ },
+ {
+ "epoch": 2.3698630136986303,
+ "grad_norm": 1.9317164421081543,
+ "learning_rate": 6.32054794520548e-06,
+ "loss": 0.3496,
+ "step": 17300
+ },
+ {
+ "epoch": 2.3767123287671232,
+ "grad_norm": 7.0557026863098145,
+ "learning_rate": 6.2520547945205474e-06,
+ "loss": 0.3469,
+ "step": 17350
+ },
+ {
+ "epoch": 2.383561643835616,
+ "grad_norm": 10.497695922851562,
+ "learning_rate": 6.183561643835616e-06,
+ "loss": 0.3477,
+ "step": 17400
+ },
+ {
+ "epoch": 2.3904109589041096,
+ "grad_norm": 7.504631996154785,
+ "learning_rate": 6.115068493150685e-06,
+ "loss": 0.3188,
+ "step": 17450
+ },
+ {
+ "epoch": 2.3972602739726026,
+ "grad_norm": 2.7814149856567383,
+ "learning_rate": 6.046575342465753e-06,
+ "loss": 0.3582,
+ "step": 17500
+ },
+ {
+ "epoch": 2.404109589041096,
+ "grad_norm": 4.986628532409668,
+ "learning_rate": 5.978082191780822e-06,
+ "loss": 0.3234,
+ "step": 17550
+ },
+ {
+ "epoch": 2.410958904109589,
+ "grad_norm": 10.343502044677734,
+ "learning_rate": 5.9095890410958906e-06,
+ "loss": 0.3475,
+ "step": 17600
+ },
+ {
+ "epoch": 2.4178082191780823,
+ "grad_norm": 19.3160400390625,
+ "learning_rate": 5.841095890410958e-06,
+ "loss": 0.3722,
+ "step": 17650
+ },
+ {
+ "epoch": 2.4246575342465753,
+ "grad_norm": 11.461263656616211,
+ "learning_rate": 5.772602739726027e-06,
+ "loss": 0.3842,
+ "step": 17700
+ },
+ {
+ "epoch": 2.4315068493150687,
+ "grad_norm": 9.144445419311523,
+ "learning_rate": 5.704109589041096e-06,
+ "loss": 0.3361,
+ "step": 17750
+ },
+ {
+ "epoch": 2.4383561643835616,
+ "grad_norm": 6.6573686599731445,
+ "learning_rate": 5.635616438356164e-06,
+ "loss": 0.3334,
+ "step": 17800
+ },
+ {
+ "epoch": 2.4452054794520546,
+ "grad_norm": 8.072036743164062,
+ "learning_rate": 5.567123287671233e-06,
+ "loss": 0.2792,
+ "step": 17850
+ },
+ {
+ "epoch": 2.452054794520548,
+ "grad_norm": 28.824522018432617,
+ "learning_rate": 5.4986301369863015e-06,
+ "loss": 0.3438,
+ "step": 17900
+ },
+ {
+ "epoch": 2.458904109589041,
+ "grad_norm": 13.252080917358398,
+ "learning_rate": 5.43013698630137e-06,
+ "loss": 0.3496,
+ "step": 17950
+ },
+ {
+ "epoch": 2.4657534246575343,
+ "grad_norm": 46.41205596923828,
+ "learning_rate": 5.361643835616438e-06,
+ "loss": 0.381,
+ "step": 18000
+ },
+ {
+ "epoch": 2.4657534246575343,
+ "eval_exact_match": 85.30747398297068,
+ "eval_f1": 92.19996078048693,
+ "eval_runtime": 416.6834,
+ "eval_samples_per_second": 25.367,
+ "eval_steps_per_second": 1.586,
+ "step": 18000
+ },
+ {
+ "epoch": 2.4726027397260273,
+ "grad_norm": 10.28970718383789,
+ "learning_rate": 5.2931506849315065e-06,
+ "loss": 0.396,
+ "step": 18050
+ },
+ {
+ "epoch": 2.4794520547945207,
+ "grad_norm": 15.0628080368042,
+ "learning_rate": 5.224657534246575e-06,
+ "loss": 0.3326,
+ "step": 18100
+ },
+ {
+ "epoch": 2.4863013698630136,
+ "grad_norm": 11.597463607788086,
+ "learning_rate": 5.156164383561644e-06,
+ "loss": 0.3538,
+ "step": 18150
+ },
+ {
+ "epoch": 2.493150684931507,
+ "grad_norm": 7.035673141479492,
+ "learning_rate": 5.087671232876712e-06,
+ "loss": 0.3049,
+ "step": 18200
+ },
+ {
+ "epoch": 2.5,
+ "grad_norm": 24.056955337524414,
+ "learning_rate": 5.019178082191781e-06,
+ "loss": 0.4231,
+ "step": 18250
+ },
+ {
+ "epoch": 2.506849315068493,
+ "grad_norm": 19.858354568481445,
+ "learning_rate": 4.950684931506849e-06,
+ "loss": 0.3143,
+ "step": 18300
+ },
+ {
+ "epoch": 2.5136986301369864,
+ "grad_norm": 10.258318901062012,
+ "learning_rate": 4.8821917808219174e-06,
+ "loss": 0.3242,
+ "step": 18350
+ },
+ {
+ "epoch": 2.5205479452054793,
+ "grad_norm": 14.595016479492188,
+ "learning_rate": 4.813698630136986e-06,
+ "loss": 0.3642,
+ "step": 18400
+ },
+ {
+ "epoch": 2.5273972602739727,
+ "grad_norm": 6.389834403991699,
+ "learning_rate": 4.745205479452055e-06,
+ "loss": 0.3086,
+ "step": 18450
+ },
+ {
+ "epoch": 2.5342465753424657,
+ "grad_norm": 5.80775260925293,
+ "learning_rate": 4.676712328767123e-06,
+ "loss": 0.3142,
+ "step": 18500
+ },
+ {
+ "epoch": 2.541095890410959,
+ "grad_norm": 4.96169376373291,
+ "learning_rate": 4.608219178082192e-06,
+ "loss": 0.3292,
+ "step": 18550
+ },
+ {
+ "epoch": 2.547945205479452,
+ "grad_norm": 22.449113845825195,
+ "learning_rate": 4.5397260273972606e-06,
+ "loss": 0.4852,
+ "step": 18600
+ },
+ {
+ "epoch": 2.5547945205479454,
+ "grad_norm": 23.349910736083984,
+ "learning_rate": 4.471232876712328e-06,
+ "loss": 0.3374,
+ "step": 18650
+ },
+ {
+ "epoch": 2.5616438356164384,
+ "grad_norm": 9.687122344970703,
+ "learning_rate": 4.402739726027397e-06,
+ "loss": 0.3257,
+ "step": 18700
+ },
+ {
+ "epoch": 2.5684931506849313,
+ "grad_norm": 8.347723007202148,
+ "learning_rate": 4.334246575342466e-06,
+ "loss": 0.2819,
+ "step": 18750
+ },
+ {
+ "epoch": 2.5753424657534247,
+ "grad_norm": 11.261799812316895,
+ "learning_rate": 4.265753424657534e-06,
+ "loss": 0.2929,
+ "step": 18800
+ },
+ {
+ "epoch": 2.5821917808219177,
+ "grad_norm": 7.757129192352295,
+ "learning_rate": 4.197260273972603e-06,
+ "loss": 0.2746,
+ "step": 18850
+ },
+ {
+ "epoch": 2.589041095890411,
+ "grad_norm": 13.809581756591797,
+ "learning_rate": 4.1287671232876715e-06,
+ "loss": 0.3915,
+ "step": 18900
+ },
+ {
+ "epoch": 2.595890410958904,
+ "grad_norm": 15.997642517089844,
+ "learning_rate": 4.060273972602739e-06,
+ "loss": 0.3181,
+ "step": 18950
+ },
+ {
+ "epoch": 2.602739726027397,
+ "grad_norm": 27.046567916870117,
+ "learning_rate": 3.991780821917808e-06,
+ "loss": 0.309,
+ "step": 19000
+ },
+ {
+ "epoch": 2.602739726027397,
+ "eval_exact_match": 85.57237464522233,
+ "eval_f1": 92.37303362559862,
+ "eval_runtime": 417.6236,
+ "eval_samples_per_second": 25.31,
+ "eval_steps_per_second": 1.583,
+ "step": 19000
+ },
+ {
+ "epoch": 2.6095890410958904,
+ "grad_norm": 12.608302116394043,
+ "learning_rate": 3.9232876712328765e-06,
+ "loss": 0.3295,
+ "step": 19050
+ },
+ {
+ "epoch": 2.616438356164384,
+ "grad_norm": 8.650862693786621,
+ "learning_rate": 3.854794520547945e-06,
+ "loss": 0.3119,
+ "step": 19100
+ },
+ {
+ "epoch": 2.6232876712328768,
+ "grad_norm": 8.674386978149414,
+ "learning_rate": 3.7863013698630138e-06,
+ "loss": 0.2546,
+ "step": 19150
+ },
+ {
+ "epoch": 2.6301369863013697,
+ "grad_norm": 5.919064998626709,
+ "learning_rate": 3.717808219178082e-06,
+ "loss": 0.3697,
+ "step": 19200
+ },
+ {
+ "epoch": 2.636986301369863,
+ "grad_norm": 7.016098499298096,
+ "learning_rate": 3.6493150684931506e-06,
+ "loss": 0.3302,
+ "step": 19250
+ },
+ {
+ "epoch": 2.643835616438356,
+ "grad_norm": 8.310173034667969,
+ "learning_rate": 3.5808219178082192e-06,
+ "loss": 0.349,
+ "step": 19300
+ },
+ {
+ "epoch": 2.6506849315068495,
+ "grad_norm": 8.628369331359863,
+ "learning_rate": 3.5123287671232874e-06,
+ "loss": 0.3424,
+ "step": 19350
+ },
+ {
+ "epoch": 2.6575342465753424,
+ "grad_norm": 13.100319862365723,
+ "learning_rate": 3.443835616438356e-06,
+ "loss": 0.3611,
+ "step": 19400
+ },
+ {
+ "epoch": 2.6643835616438354,
+ "grad_norm": 5.1128692626953125,
+ "learning_rate": 3.3753424657534247e-06,
+ "loss": 0.3575,
+ "step": 19450
+ },
+ {
+ "epoch": 2.671232876712329,
+ "grad_norm": 15.877631187438965,
+ "learning_rate": 3.306849315068493e-06,
+ "loss": 0.3791,
+ "step": 19500
+ },
+ {
+ "epoch": 2.678082191780822,
+ "grad_norm": 14.39353084564209,
+ "learning_rate": 3.2383561643835615e-06,
+ "loss": 0.2931,
+ "step": 19550
+ },
+ {
+ "epoch": 2.684931506849315,
+ "grad_norm": 6.899120807647705,
+ "learning_rate": 3.16986301369863e-06,
+ "loss": 0.3782,
+ "step": 19600
+ },
+ {
+ "epoch": 2.691780821917808,
+ "grad_norm": 10.824689865112305,
+ "learning_rate": 3.1013698630136988e-06,
+ "loss": 0.3049,
+ "step": 19650
+ },
+ {
+ "epoch": 2.6986301369863015,
+ "grad_norm": 18.822484970092773,
+ "learning_rate": 3.032876712328767e-06,
+ "loss": 0.4038,
+ "step": 19700
+ },
+ {
+ "epoch": 2.7054794520547945,
+ "grad_norm": 15.976180076599121,
+ "learning_rate": 2.9643835616438356e-06,
+ "loss": 0.2828,
+ "step": 19750
+ },
+ {
+ "epoch": 2.712328767123288,
+ "grad_norm": 27.864824295043945,
+ "learning_rate": 2.8958904109589042e-06,
+ "loss": 0.2974,
+ "step": 19800
+ },
+ {
+ "epoch": 2.719178082191781,
+ "grad_norm": 16.251916885375977,
+ "learning_rate": 2.8273972602739724e-06,
+ "loss": 0.3124,
+ "step": 19850
+ },
+ {
+ "epoch": 2.7260273972602738,
+ "grad_norm": 55.30453872680664,
+ "learning_rate": 2.758904109589041e-06,
+ "loss": 0.3211,
+ "step": 19900
+ },
+ {
+ "epoch": 2.732876712328767,
+ "grad_norm": 15.253798484802246,
+ "learning_rate": 2.6904109589041097e-06,
+ "loss": 0.3369,
+ "step": 19950
+ },
+ {
+ "epoch": 2.73972602739726,
+ "grad_norm": 67.18714904785156,
+ "learning_rate": 2.623287671232877e-06,
+ "loss": 0.376,
+ "step": 20000
+ },
+ {
+ "epoch": 2.73972602739726,
+ "eval_exact_match": 85.26963103122044,
+ "eval_f1": 92.09863360357713,
+ "eval_runtime": 417.6598,
+ "eval_samples_per_second": 25.308,
+ "eval_steps_per_second": 1.583,
+ "step": 20000
+ },
+ {
+ "epoch": 2.7465753424657535,
+ "grad_norm": 11.747102737426758,
+ "learning_rate": 2.5547945205479453e-06,
+ "loss": 0.3359,
+ "step": 20050
+ },
+ {
+ "epoch": 2.7534246575342465,
+ "grad_norm": 16.32077980041504,
+ "learning_rate": 2.486301369863014e-06,
+ "loss": 0.3529,
+ "step": 20100
+ },
+ {
+ "epoch": 2.76027397260274,
+ "grad_norm": 40.523311614990234,
+ "learning_rate": 2.4178082191780826e-06,
+ "loss": 0.3346,
+ "step": 20150
+ },
+ {
+ "epoch": 2.767123287671233,
+ "grad_norm": 16.648462295532227,
+ "learning_rate": 2.3493150684931508e-06,
+ "loss": 0.3167,
+ "step": 20200
+ },
+ {
+ "epoch": 2.7739726027397262,
+ "grad_norm": 13.732590675354004,
+ "learning_rate": 2.2808219178082194e-06,
+ "loss": 0.3431,
+ "step": 20250
+ },
+ {
+ "epoch": 2.780821917808219,
+ "grad_norm": 10.560003280639648,
+ "learning_rate": 2.2123287671232876e-06,
+ "loss": 0.3688,
+ "step": 20300
+ },
+ {
+ "epoch": 2.787671232876712,
+ "grad_norm": 6.171405792236328,
+ "learning_rate": 2.1438356164383562e-06,
+ "loss": 0.3314,
+ "step": 20350
+ },
+ {
+ "epoch": 2.7945205479452055,
+ "grad_norm": 11.62563419342041,
+ "learning_rate": 2.0753424657534244e-06,
+ "loss": 0.3371,
+ "step": 20400
+ },
+ {
+ "epoch": 2.8013698630136985,
+ "grad_norm": 12.698870658874512,
+ "learning_rate": 2.006849315068493e-06,
+ "loss": 0.3429,
+ "step": 20450
+ },
+ {
+ "epoch": 2.808219178082192,
+ "grad_norm": 12.162155151367188,
+ "learning_rate": 1.9383561643835617e-06,
+ "loss": 0.3383,
+ "step": 20500
+ },
+ {
+ "epoch": 2.815068493150685,
+ "grad_norm": 11.249317169189453,
+ "learning_rate": 1.8698630136986303e-06,
+ "loss": 0.3228,
+ "step": 20550
+ },
+ {
+ "epoch": 2.821917808219178,
+ "grad_norm": 15.184226989746094,
+ "learning_rate": 1.8013698630136987e-06,
+ "loss": 0.3299,
+ "step": 20600
+ },
+ {
+ "epoch": 2.828767123287671,
+ "grad_norm": 21.128114700317383,
+ "learning_rate": 1.7328767123287671e-06,
+ "loss": 0.2959,
+ "step": 20650
+ },
+ {
+ "epoch": 2.8356164383561646,
+ "grad_norm": 18.958660125732422,
+ "learning_rate": 1.6643835616438358e-06,
+ "loss": 0.2943,
+ "step": 20700
+ },
+ {
+ "epoch": 2.8424657534246576,
+ "grad_norm": 11.99002456665039,
+ "learning_rate": 1.5958904109589042e-06,
+ "loss": 0.3151,
+ "step": 20750
+ },
+ {
+ "epoch": 2.8493150684931505,
+ "grad_norm": 41.37422561645508,
+ "learning_rate": 1.5273972602739726e-06,
+ "loss": 0.3659,
+ "step": 20800
+ },
+ {
+ "epoch": 2.856164383561644,
+ "grad_norm": 63.95098876953125,
+ "learning_rate": 1.4589041095890412e-06,
+ "loss": 0.3366,
+ "step": 20850
+ },
+ {
+ "epoch": 2.863013698630137,
+ "grad_norm": 7.210549354553223,
+ "learning_rate": 1.3904109589041096e-06,
+ "loss": 0.3207,
+ "step": 20900
+ },
+ {
+ "epoch": 2.8698630136986303,
+ "grad_norm": 21.405946731567383,
+ "learning_rate": 1.3219178082191783e-06,
+ "loss": 0.314,
+ "step": 20950
+ },
+ {
+ "epoch": 2.8767123287671232,
+ "grad_norm": 40.97032165527344,
+ "learning_rate": 1.2534246575342467e-06,
+ "loss": 0.3598,
+ "step": 21000
+ },
+ {
+ "epoch": 2.8767123287671232,
+ "eval_exact_match": 85.58183538315988,
+ "eval_f1": 92.20238705674495,
+ "eval_runtime": 416.9361,
+ "eval_samples_per_second": 25.352,
+ "eval_steps_per_second": 1.585,
+ "step": 21000
+ },
+ {
+ "epoch": 2.883561643835616,
+ "grad_norm": 28.014732360839844,
+ "learning_rate": 1.184931506849315e-06,
+ "loss": 0.2415,
+ "step": 21050
+ },
+ {
+ "epoch": 2.8904109589041096,
+ "grad_norm": 28.053062438964844,
+ "learning_rate": 1.1164383561643837e-06,
+ "loss": 0.3468,
+ "step": 21100
+ },
+ {
+ "epoch": 2.897260273972603,
+ "grad_norm": 7.82937479019165,
+ "learning_rate": 1.0479452054794521e-06,
+ "loss": 0.3102,
+ "step": 21150
+ },
+ {
+ "epoch": 2.904109589041096,
+ "grad_norm": 9.58170223236084,
+ "learning_rate": 9.794520547945205e-07,
+ "loss": 0.3115,
+ "step": 21200
+ },
+ {
+ "epoch": 2.910958904109589,
+ "grad_norm": 15.694490432739258,
+ "learning_rate": 9.109589041095891e-07,
+ "loss": 0.2848,
+ "step": 21250
+ },
+ {
+ "epoch": 2.9178082191780823,
+ "grad_norm": 11.358589172363281,
+ "learning_rate": 8.424657534246575e-07,
+ "loss": 0.3353,
+ "step": 21300
+ },
+ {
+ "epoch": 2.9246575342465753,
+ "grad_norm": 42.85386276245117,
+ "learning_rate": 7.73972602739726e-07,
+ "loss": 0.271,
+ "step": 21350
+ },
+ {
+ "epoch": 2.9315068493150687,
+ "grad_norm": 4.216646194458008,
+ "learning_rate": 7.054794520547945e-07,
+ "loss": 0.352,
+ "step": 21400
+ },
+ {
+ "epoch": 2.9383561643835616,
+ "grad_norm": 18.55265235900879,
+ "learning_rate": 6.36986301369863e-07,
+ "loss": 0.3376,
+ "step": 21450
+ },
+ {
+ "epoch": 2.9452054794520546,
+ "grad_norm": 10.340827941894531,
+ "learning_rate": 5.684931506849316e-07,
+ "loss": 0.2902,
+ "step": 21500
+ },
+ {
+ "epoch": 2.952054794520548,
+ "grad_norm": 13.433025360107422,
+ "learning_rate": 5e-07,
+ "loss": 0.3152,
+ "step": 21550
+ },
+ {
+ "epoch": 2.958904109589041,
+ "grad_norm": 26.042661666870117,
+ "learning_rate": 4.315068493150685e-07,
+ "loss": 0.3431,
+ "step": 21600
+ },
+ {
+ "epoch": 2.9657534246575343,
+ "grad_norm": 19.630699157714844,
+ "learning_rate": 3.63013698630137e-07,
+ "loss": 0.296,
+ "step": 21650
+ },
+ {
+ "epoch": 2.9726027397260273,
+ "grad_norm": 25.595293045043945,
+ "learning_rate": 2.945205479452055e-07,
+ "loss": 0.352,
+ "step": 21700
+ },
+ {
+ "epoch": 2.9794520547945207,
+ "grad_norm": 11.616922378540039,
+ "learning_rate": 2.2602739726027398e-07,
+ "loss": 0.3358,
+ "step": 21750
+ },
+ {
+ "epoch": 2.9863013698630136,
+ "grad_norm": 5.852312088012695,
+ "learning_rate": 1.5753424657534248e-07,
+ "loss": 0.3017,
+ "step": 21800
+ },
+ {
+ "epoch": 2.993150684931507,
+ "grad_norm": 30.67576789855957,
+ "learning_rate": 8.904109589041097e-08,
+ "loss": 0.3171,
+ "step": 21850
+ },
+ {
+ "epoch": 3.0,
+ "grad_norm": 5.107680320739746,
+ "learning_rate": 2.0547945205479452e-08,
+ "loss": 0.3315,
+ "step": 21900
+ },
+ {
+ "epoch": 3.0,
+ "step": 21900,
+ "total_flos": 2.006738209660207e+18,
+ "train_loss": 0.6596583454358523,
+ "train_runtime": 64667.9553,
+ "train_samples_per_second": 4.064,
+ "train_steps_per_second": 0.339
+ }
+ ],
+ "logging_steps": 50,
+ "max_steps": 21900,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 3,
+ "save_steps": 5000,
+ "stateful_callbacks": {
+ "TrainerControl": {
+ "args": {
+ "should_epoch_stop": false,
+ "should_evaluate": false,
+ "should_log": false,
+ "should_save": true,
+ "should_training_stop": true
+ },
+ "attributes": {}
+ }
+ },
+ "total_flos": 2.006738209660207e+18,
+ "train_batch_size": 3,
+ "trial_name": null,
+ "trial_params": null
+ }