jgammack committed on
Commit 04d1e3f
1 Parent(s): 77c5fc8

update model card README.md

Files changed (1)
  1. README.md +28 -11
README.md CHANGED
@@ -1,4 +1,5 @@
  ---
+ license: apache-2.0
  tags:
  - generated_from_trainer
  model-index:
@@ -11,14 +12,9 @@ should probably proofread and complete it, then remove this comment. -->
 
  # MTL-bert-base-uncased
 
- This model is a fine-tuned version of [](https://huggingface.co/) on the None dataset.
+ This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
  It achieves the following results on the evaluation set:
- - eval_loss: 6.4337
- - eval_runtime: 15.4115
- - eval_samples_per_second: 45.096
- - eval_steps_per_second: 5.645
- - epoch: 9.11
- - step: 7152
+ - Loss: 2.2486
 
  ## Model description
 
@@ -45,9 +41,30 @@ The following hyperparameters were used during training:
  - lr_scheduler_type: linear
  - num_epochs: 15
 
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | No log | 1.0 | 94 | 2.6014 |
+ | No log | 2.0 | 188 | 2.3898 |
+ | No log | 3.0 | 282 | 2.3706 |
+ | No log | 4.0 | 376 | 2.2708 |
+ | No log | 5.0 | 470 | 2.2666 |
+ | 2.5403 | 6.0 | 564 | 2.2639 |
+ | 2.5403 | 7.0 | 658 | 2.3061 |
+ | 2.5403 | 8.0 | 752 | 2.2659 |
+ | 2.5403 | 9.0 | 846 | 2.1882 |
+ | 2.5403 | 10.0 | 940 | 2.2198 |
+ | 2.2739 | 11.0 | 1034 | 2.1124 |
+ | 2.2739 | 12.0 | 1128 | 2.1887 |
+ | 2.2739 | 13.0 | 1222 | 2.1740 |
+ | 2.2739 | 14.0 | 1316 | 2.1517 |
+ | 2.2739 | 15.0 | 1410 | 2.1510 |
+
+
  ### Framework versions
 
- - Transformers 4.11.3
- - Pytorch 1.9.0+cu111
- - Datasets 1.13.3
- - Tokenizers 0.10.3
+ - Transformers 4.16.2
+ - Pytorch 1.10.0+cu111
+ - Datasets 1.18.3
+ - Tokenizers 0.11.0
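
A minimal usage sketch for the card above. The repo id `jgammack/MTL-bert-base-uncased` is inferred from the committer name and the card title, and the fill-mask task is an assumption based on the bert-base-uncased starting point; the card itself states neither.

```python
# Hedged usage sketch, not taken from the card.
# Assumptions: repo id "jgammack/MTL-bert-base-uncased" (committer + card title)
# and a masked-language-modeling head (fill-mask), since the base is bert-base-uncased.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="jgammack/MTL-bert-base-uncased")

# Print the top predictions for the masked token.
for pred in fill_mask("The model was fine-tuned for fifteen [MASK]."):
    print(f"{pred['token_str']:>12}  score={pred['score']:.4f}")
```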
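The card reports only a single validation loss (2.2486). One quick way to interpret it is as perplexity, assuming the value is a mean token-level cross-entropy as Trainer reports for masked-language-modeling fine-tuning (an assumption; the card does not name the objective):

```python
# Sketch: turn the reported validation loss into perplexity,
# assuming it is a mean cross-entropy loss (not stated in the card).
import math

eval_loss = 2.2486  # final "Loss" value from the evaluation section
print(f"perplexity ≈ {math.exp(eval_loss):.2f}")  # ≈ 9.47
```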