kaizerBox committed on
Commit 508732a
1 Parent(s): 937019f

retnet-summarization

README.md CHANGED
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [kaizerBox/retnet-summarization](https://huggingface.co/kaizerBox/retnet-summarization) on the xsum dataset.
 It achieves the following results on the evaluation set:
-- Loss: 3.3419
+- Loss: 3.2278
 
 ## Model description
 
@@ -35,7 +35,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 0.0006
+- learning_rate: 0.001
 - train_batch_size: 4
 - eval_batch_size: 4
 - seed: 42
@@ -43,20 +43,21 @@ The following hyperparameters were used during training:
 - total_train_batch_size: 16
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
-- lr_scheduler_warmup_steps: 10
-- num_epochs: 1
+- lr_scheduler_warmup_steps: 100
+- num_epochs: 2
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step  | Validation Loss |
 |:-------------:|:-----:|:-----:|:---------------:|
-| 3.526         | 1.0   | 11525 | 3.3419          |
+| 3.5745        | 1.0   | 11525 | 3.3519          |
+| 3.2943        | 2.0   | 23050 | 3.2278          |
 
 
 ### Framework versions
 
-- Transformers 4.35.1
+- Transformers 4.35.2
 - Pytorch 2.1.0+cu118
-- Datasets 2.14.7
-- Tokenizers 0.14.1
+- Datasets 2.15.0
+- Tokenizers 0.15.0
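The updated hyperparameters imply a gradient-accumulation factor of 4 (16 total / 4 per-device), which the diff does not state explicitly; treating that as an assumption, a minimal sketch checking that the new settings are self-consistent with the results table:

```python
# Hyperparameters from the updated README, as a plain dict.
# gradient_accumulation_steps is inferred, not stated in the diff:
# 16 total batch / 4 per-device batch = 4 accumulation steps.
hparams = {
    "learning_rate": 0.001,
    "train_batch_size": 4,
    "gradient_accumulation_steps": 4,  # inferred, see note above
    "total_train_batch_size": 16,
    "lr_scheduler_type": "cosine",
    "lr_scheduler_warmup_steps": 100,
    "num_epochs": 2,
}

# The results table logs 11525 optimizer steps per epoch,
# so training over 2 epochs should end at step 23050.
steps_per_epoch = 11525
total_steps = steps_per_epoch * hparams["num_epochs"]

assert (hparams["train_batch_size"] * hparams["gradient_accumulation_steps"]
        == hparams["total_train_batch_size"])
print(total_steps)  # 23050
```

The table's final step (23050) matches, so the epoch count and per-epoch step count agree with each other.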
config.json CHANGED
@@ -28,7 +28,7 @@
   "subln": true,
   "tie_word_embeddings": false,
   "torch_dtype": "float32",
-  "transformers_version": "4.35.1",
+  "transformers_version": "4.35.2",
   "use_cache": true,
   "use_ffn_rms_norm": false,
   "use_glu": true,
generation_config.json CHANGED
@@ -2,5 +2,5 @@
   "_from_model_config": true,
   "eos_token_id": 50256,
   "pad_token_id": 50257,
-  "transformers_version": "4.35.1"
+  "transformers_version": "4.35.2"
 }
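Only the `transformers_version` field changes here; the token ids are untouched. As a small sketch (the JSON is inlined from this commit for illustration), note that the pad token id sits one past GPT-2's eos id 50256, i.e. it appears to be an added token rather than a reused one:

```python
import json

# generation_config.json as committed, inlined for illustration.
raw = """{
  "_from_model_config": true,
  "eos_token_id": 50256,
  "pad_token_id": 50257,
  "transformers_version": "4.35.2"
}"""

cfg = json.loads(raw)
# pad_token_id (50257) is one id past GPT-2's eos (50256),
# consistent with a pad token appended to the base vocabulary.
assert cfg["pad_token_id"] == cfg["eos_token_id"] + 1
print(cfg["transformers_version"])  # 4.35.2
```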
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:135c553a8555e8441356fc8dd27dbec164342d7490649da58f7f3d439bce1d6d
+oid sha256:fcf0d709b16e07b15db027202b870e08f7d3ae077557bf74cc06b0d7ad3bf14f
 size 282181632
runs/Nov18_20-31-38_8c72afc6b1ac/events.out.tfevents.1700339498.8c72afc6b1ac.1388.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:39dac12db60d2bfcf075c0adfc1fb7d7a55d9dbd713d7fa53e714904fe18db14
+size 5797
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:adaeb137f022ec8cb43ec6a21e1f916423a3301acbc6c3559f4f65cf756e7e63
+oid sha256:7a96b8d61b7662077d1506e6bf1b4972a9c3cc5082f1a698e147e19a9a879099
 size 4600
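The binary files above are Git LFS pointer files (spec v1): three `key value` lines giving the spec version, a `sha256:` content hash, and the byte size. Note that `model.safetensors` keeps the same size (282181632 bytes) while its oid changes, i.e. the weights were retrained but the tensor layout is identical. A minimal sketch of parsing one such pointer (the helper name is illustrative, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file (spec v1) into a key/value dict.

    Each non-empty line is "key value", e.g. "size 4600".
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The training_args.bin pointer from this commit.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:7a96b8d61b7662077d1506e6bf1b4972a9c3cc5082f1a698e147e19a9a879099
size 4600
"""

p = parse_lfs_pointer(pointer)
assert p["version"].endswith("spec/v1")
assert p["oid"].startswith("sha256:")
print(p["size"])  # 4600
```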