---
license: mit
tags:
- generated_from_trainer
model-index:
- name: predict-perception-bert-focus-victim
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# predict-perception-bert-focus-victim

This model is a fine-tuned version of [dbmdz/bert-base-italian-xxl-cased](https://huggingface.co/dbmdz/bert-base-italian-xxl-cased) on an unknown dataset.
It achieves the following results on the evaluation set (a usage sketch follows the list):
- Loss: 0.2466
- Rmse: 0.6201
- Rmse Focus::a Sulla vittima: 0.6201
- Mae: 0.4936
- Mae Focus::a Sulla vittima: 0.4936
- R2: 0.7293
- R2 Focus::a Sulla vittima: 0.7293
- Cos: 0.8261
- Pair: 0.0
- Rank: 0.5
- Neighbors: 0.8155
- Rsa: nan
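
The RMSE, MAE, and R2 figures above are regression metrics, so the checkpoint presumably exposes a single-score regression head. A minimal inference sketch, assuming the model is hosted under the `gossminn/predict-perception-bert-focus-victim` hub id and was trained with `num_labels=1` (neither is confirmed by this card):

```python
# Minimal inference sketch. Assumptions: the hub id below and a
# one-dimensional regression head (num_labels=1); neither is confirmed here.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "gossminn/predict-perception-bert-focus-victim"  # assumed hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# The base model is Italian BERT, so inputs should be Italian text.
inputs = tokenizer("Una frase di esempio.", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # single perception score
print(score)
```
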
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (reproduced in the sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 20
- eval_batch_size: 8
- seed: 1996
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
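
A sketch of how the list above maps onto `transformers.TrainingArguments`; model setup, data loading, and the custom metrics are omitted, and `output_dir` is a placeholder rather than the authors' actual path:

```python
# Sketch: the listed hyperparameters expressed as TrainingArguments.
# Model, data, and the custom metrics (Rmse, Cos, ...) are omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="predict-perception-bert-focus-victim",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=8,
    seed=1996,
    adam_beta1=0.9,       # Adam betas and epsilon as listed above
    adam_beta2=0.999,     # (these are also the Transformers defaults)
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumption: the table reports per-epoch eval
)
```
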
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rmse | Rmse Focus::a Sulla vittima | Mae | Mae Focus::a Sulla vittima | R2 | R2 Focus::a Sulla vittima | Cos | Pair | Rank | Neighbors | Rsa |
|:-------------:|:-----:|:----:|:---------------:|:------:|:---------------------------:|:------:|:--------------------------:|:-------:|:-------------------------:|:------:|:----:|:----:|:---------:|:---:|
| 1.0247 | 1.0 | 15 | 1.0286 | 1.2665 | 1.2665 | 1.0280 | 1.0280 | -0.1292 | -0.1292 | 0.1304 | 0.0 | 0.5 | 0.3685 | nan |
| 0.9912 | 2.0 | 30 | 1.0039 | 1.2512 | 1.2512 | 1.0347 | 1.0347 | -0.1020 | -0.1020 | 0.0435 | 0.0 | 0.5 | 0.3333 | nan |
| 0.9147 | 3.0 | 45 | 0.9338 | 1.2067 | 1.2067 | 0.9770 | 0.9770 | -0.0251 | -0.0251 | 0.1304 | 0.0 | 0.5 | 0.3685 | nan |
| 0.8194 | 4.0 | 60 | 0.7641 | 1.0916 | 1.0916 | 0.8476 | 0.8476 | 0.1612 | 0.1612 | 0.4783 | 0.0 | 0.5 | 0.5284 | nan |
| 0.6636 | 5.0 | 75 | 0.6618 | 1.0159 | 1.0159 | 0.8012 | 0.8012 | 0.2735 | 0.2735 | 0.6522 | 0.0 | 0.5 | 0.4741 | nan |
| 0.523 | 6.0 | 90 | 0.5176 | 0.8984 | 0.8984 | 0.7044 | 0.7044 | 0.4318 | 0.4318 | 0.6522 | 0.0 | 0.5 | 0.4741 | nan |
| 0.402 | 7.0 | 105 | 0.3804 | 0.7702 | 0.7702 | 0.6042 | 0.6042 | 0.5824 | 0.5824 | 0.6522 | 0.0 | 0.5 | 0.5395 | nan |
| 0.3401 | 8.0 | 120 | 0.3594 | 0.7487 | 0.7487 | 0.5703 | 0.5703 | 0.6054 | 0.6054 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.2615 | 9.0 | 135 | 0.3429 | 0.7312 | 0.7312 | 0.6049 | 0.6049 | 0.6236 | 0.6236 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.1928 | 10.0 | 150 | 0.2889 | 0.6712 | 0.6712 | 0.5487 | 0.5487 | 0.6828 | 0.6828 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.1703 | 11.0 | 165 | 0.2675 | 0.6458 | 0.6458 | 0.5188 | 0.5188 | 0.7064 | 0.7064 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.1209 | 12.0 | 180 | 0.2826 | 0.6639 | 0.6639 | 0.5475 | 0.5475 | 0.6897 | 0.6897 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.1428 | 13.0 | 195 | 0.2978 | 0.6815 | 0.6815 | 0.5777 | 0.5777 | 0.6731 | 0.6731 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.1038 | 14.0 | 210 | 0.2924 | 0.6753 | 0.6753 | 0.5865 | 0.5865 | 0.6790 | 0.6790 | 0.6522 | 0.0 | 0.5 | 0.2760 | nan |
| 0.0951 | 15.0 | 225 | 0.2905 | 0.6731 | 0.6731 | 0.5750 | 0.5750 | 0.6811 | 0.6811 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.0809 | 16.0 | 240 | 0.2676 | 0.6460 | 0.6460 | 0.5552 | 0.5552 | 0.7062 | 0.7062 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.0811 | 17.0 | 255 | 0.2770 | 0.6572 | 0.6572 | 0.5543 | 0.5543 | 0.6959 | 0.6959 | 0.7391 | 0.0 | 0.5 | 0.6920 | nan |
| 0.0703 | 18.0 | 270 | 0.2634 | 0.6409 | 0.6409 | 0.5251 | 0.5251 | 0.7108 | 0.7108 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0595 | 19.0 | 285 | 0.2638 | 0.6413 | 0.6413 | 0.5196 | 0.5196 | 0.7104 | 0.7104 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0651 | 20.0 | 300 | 0.2520 | 0.6268 | 0.6268 | 0.4970 | 0.4970 | 0.7234 | 0.7234 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0637 | 21.0 | 315 | 0.2668 | 0.6451 | 0.6451 | 0.4965 | 0.4965 | 0.7071 | 0.7071 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0582 | 22.0 | 330 | 0.2455 | 0.6188 | 0.6188 | 0.4759 | 0.4759 | 0.7305 | 0.7305 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0616 | 23.0 | 345 | 0.2509 | 0.6255 | 0.6255 | 0.5084 | 0.5084 | 0.7246 | 0.7246 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0492 | 24.0 | 360 | 0.2510 | 0.6256 | 0.6256 | 0.4985 | 0.4985 | 0.7244 | 0.7244 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0504 | 25.0 | 375 | 0.2512 | 0.6259 | 0.6259 | 0.4849 | 0.4849 | 0.7242 | 0.7242 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0501 | 26.0 | 390 | 0.2585 | 0.6350 | 0.6350 | 0.5140 | 0.5140 | 0.7162 | 0.7162 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0411 | 27.0 | 405 | 0.2544 | 0.6299 | 0.6299 | 0.5148 | 0.5148 | 0.7207 | 0.7207 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.044 | 28.0 | 420 | 0.2466 | 0.6201 | 0.6201 | 0.4964 | 0.4964 | 0.7293 | 0.7293 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.042 | 29.0 | 435 | 0.2466 | 0.6201 | 0.6201 | 0.4836 | 0.4836 | 0.7293 | 0.7293 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |
| 0.0446 | 30.0 | 450 | 0.2466 | 0.6201 | 0.6201 | 0.4936 | 0.4936 | 0.7293 | 0.7293 | 0.8261 | 0.0 | 0.5 | 0.8155 | nan |

### Framework versions

- Transformers 4.16.2
- PyTorch 1.10.2+cu113
- Datasets 1.18.3
- Tokenizers 0.11.0
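
A quick way to check a local environment against these pins (plain Python, no extra tooling assumed):

```python
# Print installed versions to compare against the pins listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expect 4.16.2
print("PyTorch:", torch.__version__)              # expect 1.10.2+cu113
print("Datasets:", datasets.__version__)          # expect 1.18.3
print("Tokenizers:", tokenizers.__version__)      # expect 0.11.0
```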