Capstone-lpx committed on
Commit
39429bf
1 Parent(s): e3e2d41

Model save

README.md CHANGED
@@ -1,6 +1,6 @@
  ---
  license: apache-2.0
- base_model: distilroberta-base
+ base_model: bert-base-cased
  tags:
  - generated_from_trainer
  model-index:
@@ -13,9 +13,9 @@ should probably proofread and complete it, then remove this comment. -->
 
  # my_awesome_eli5_mlm_model
 
- This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the None dataset.
+ This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 2.9773
+ - Loss: 2.5349
 
  ## Model description
 
@@ -40,15 +40,37 @@ The following hyperparameters were used during training:
  - seed: 42
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 3
+ - num_epochs: 25
 
  ### Training results
 
  | Training Loss | Epoch | Step | Validation Loss |
  |:-------------:|:-----:|:----:|:---------------:|
- | No log | 1.0 | 123 | 3.0550 |
- | No log | 2.0 | 246 | 2.9230 |
- | No log | 3.0 | 369 | 2.9773 |
+ | No log | 1.0 | 38 | 3.3747 |
+ | No log | 2.0 | 76 | 3.1852 |
+ | No log | 3.0 | 114 | 3.0839 |
+ | No log | 4.0 | 152 | 3.1410 |
+ | No log | 5.0 | 190 | 3.0394 |
+ | No log | 6.0 | 228 | 3.0631 |
+ | No log | 7.0 | 266 | 3.1484 |
+ | No log | 8.0 | 304 | 2.7834 |
+ | No log | 9.0 | 342 | 2.9527 |
+ | No log | 10.0 | 380 | 3.2091 |
+ | No log | 11.0 | 418 | 3.0497 |
+ | No log | 12.0 | 456 | 2.7234 |
+ | No log | 13.0 | 494 | 2.7865 |
+ | 2.8163 | 14.0 | 532 | 2.5425 |
+ | 2.8163 | 15.0 | 570 | 2.9351 |
+ | 2.8163 | 16.0 | 608 | 2.9367 |
+ | 2.8163 | 17.0 | 646 | 3.0570 |
+ | 2.8163 | 18.0 | 684 | 2.8902 |
+ | 2.8163 | 19.0 | 722 | 2.8285 |
+ | 2.8163 | 20.0 | 760 | 3.1220 |
+ | 2.8163 | 21.0 | 798 | 2.5891 |
+ | 2.8163 | 22.0 | 836 | 2.7217 |
+ | 2.8163 | 23.0 | 874 | 2.8660 |
+ | 2.8163 | 24.0 | 912 | 2.9048 |
+ | 2.8163 | 25.0 | 950 | 2.5349 |
 
 
  ### Framework versions
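The new training table is internally consistent and worth a quick sanity check: 950 steps over 25 epochs means 38 optimizer steps per epoch, and since the masked-LM evaluation loss is a mean cross-entropy, exponentiating the final loss gives an eval pseudo-perplexity. A minimal sketch (the numbers are copied from the table above; the perplexity interpretation is a standard assumption, not stated in the README):

```python
import math

# Values reported in the README's training table for the new run.
num_epochs = 25
final_step = 950          # step count at epoch 25.0
final_eval_loss = 2.5349  # validation loss at epoch 25.0

# 950 / 25 = 38, which matches the per-epoch step increments (38, 76, 114, ...).
steps_per_epoch = final_step // num_epochs
print(steps_per_epoch)  # 38

# Mean cross-entropy -> pseudo-perplexity on the evaluation set.
perplexity = math.exp(final_eval_loss)
print(round(perplexity, 2))  # 12.62
```

This also explains the "No log" entries in the first column: with 38 steps per epoch, the trainer's default logging interval (500 steps) is first reached mid-way through epoch 14.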
generation_config.json ADDED
@@ -0,0 +1,5 @@
+ {
+ "_from_model_config": true,
+ "pad_token_id": 0,
+ "transformers_version": "4.35.2"
+ }
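The added file is plain JSON, so it can be inspected with the standard library; `pad_token_id: 0` corresponds to BERT's `[PAD]` token id, consistent with the switch to bert-base-cased. A small sketch over the file contents shown above:

```python
import json

# generation_config.json as added in this commit, reproduced inline.
config_text = """{
"_from_model_config": true,
"pad_token_id": 0,
"transformers_version": "4.35.2"
}"""

config = json.loads(config_text)
print(config["pad_token_id"])          # 0 (BERT's [PAD] token id)
print(config["transformers_version"])  # 4.35.2
```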
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:71467fba6c9538de8ed450f2e6b7f71f98274a40b32ad958304b15a43861c205
+ oid sha256:6a6f1959eddf6348ecf992f9fbac8016ad224488a0f601e6f523f00dc4fad294
  size 433386912
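What the diff shows for model.safetensors is a git-lfs pointer file, not the weights themselves: a three-line stub recording the spec version, the SHA-256 of the real blob, and its byte size (unchanged at 433386912 here, as expected when only weight values change, not the architecture). A minimal parser sketch, assuming the well-formed pointer shown in this diff:

```python
# A git-lfs pointer stub as stored in the repo; the actual weights live in
# LFS storage and are fetched by their oid. Contents copied from the diff.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:6a6f1959eddf6348ecf992f9fbac8016ad224488a0f601e6f523f00dc4fad294
size 433386912
"""

# Each line is "key value"; split only once so the URL and digest stay intact.
fields = dict(line.split(" ", 1) for line in pointer.strip().splitlines())
algo, digest = fields["oid"].split(":", 1)

print(algo)                 # sha256
print(len(digest))          # 64 (hex characters in a SHA-256 digest)
print(int(fields["size"]))  # 433386912
```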
runs/Feb05_07-55-20_a00f031c4cdb/events.out.tfevents.1707119721.a00f031c4cdb.148.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:fb1a142d924c6403280b97c1528a9749219a2c923bdcf91ea430c8111610b8da
- size 4184
+ oid sha256:e8970e4a44cdd5d2f2a4f9ef18102bed2582c5212817484b18ea1f2acaab0fd3
+ size 4227
runs/Feb05_07-56-03_a00f031c4cdb/events.out.tfevents.1707119764.a00f031c4cdb.148.1 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:a8be506b0d091f9cf3e1380e972db6c09e2d1ef0a4e10c45bc156f7de0394ef7
- size 7892
+ oid sha256:7800e2558e885675ff982623eefb2708ef9e4e58dda4b9348aef23537c82bebb
+ size 11498