---
license: mit
tags:
- generated_from_trainer
model-index:
- name: predict-perception-bert-cause-human
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# predict-perception-bert-cause-human

This model is a fine-tuned version of [dbmdz/bert-base-italian-xxl-cased](https://huggingface.co/dbmdz/bert-base-italian-xxl-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7139
- Rmse: 1.2259
- Rmse Cause::a Causata da un essere umano: 1.2259
- Mae: 1.0480
- Mae Cause::a Causata da un essere umano: 1.0480
- R2: 0.4563
- R2 Cause::a Causata da un essere umano: 0.4563
- Cos: 0.4783
- Pair: 0.0
- Rank: 0.5
- Neighbors: 0.3953
- Rsa: nan
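
The card does not include a usage snippet. Below is a minimal inference sketch, assuming the checkpoint is available on the Hub under the repo id `predict-perception-bert-cause-human` (hypothetical path; substitute the actual namespace) and that the single reported regression target corresponds to a one-dimensional head (`num_labels=1`).

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical repository id -- replace with the actual Hub path of this checkpoint.
model_id = "predict-perception-bert-cause-human"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# The RMSE/MAE/R2 metrics above indicate a single regression target,
# so the head is assumed to be one-dimensional.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=1)

# Illustrative Italian sentence only; the actual evaluation data is not documented here.
text = "Il terremoto ha distrutto il villaggio."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()

print(f"Predicted 'caused by a human' score: {score:.3f}")
```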

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 20
- eval_batch_size: 8
- seed: 1996
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
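
Given only the hyperparameters above, a hedged reproduction sketch with the Hugging Face `Trainer` might look as follows. The training data, preprocessing, and custom metrics are not documented in this card, so the dataset objects below are placeholders that must be replaced before calling `train()`.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "dbmdz/bert-base-italian-xxl-cased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)

# Placeholders: the training and evaluation data are not documented in this card.
train_dataset = None
eval_dataset = None

args = TrainingArguments(
    output_dir="predict-perception-bert-cause-human",
    learning_rate=1e-5,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=8,
    seed=1996,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```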

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rmse   | Rmse Cause::a Causata da un essere umano | Mae    | Mae Cause::a Causata da un essere umano | R2     | R2 Cause::a Causata da un essere umano | Cos    | Pair | Rank | Neighbors | Rsa |
|:-------------:|:-----:|:----:|:---------------:|:------:|:----------------------------------------:|:------:|:---------------------------------------:|:------:|:--------------------------------------:|:------:|:----:|:----:|:---------:|:---:|
| 1.0874        | 1.0   | 15   | 1.2615          | 1.6296 | 1.6296                                   | 1.3836 | 1.3836                                  | 0.0393 | 0.0393                                 | 0.0435 | 0.0  | 0.5  | 0.2935    | nan |
| 0.9577        | 2.0   | 30   | 1.1988          | 1.5886 | 1.5886                                   | 1.3017 | 1.3017                                  | 0.0870 | 0.0870                                 | 0.4783 | 0.0  | 0.5  | 0.3944    | nan |
| 0.8414        | 3.0   | 45   | 0.9870          | 1.4414 | 1.4414                                   | 1.1963 | 1.1963                                  | 0.2483 | 0.2483                                 | 0.3913 | 0.0  | 0.5  | 0.3048    | nan |
| 0.7291        | 4.0   | 60   | 0.9098          | 1.3839 | 1.3839                                   | 1.1297 | 1.1297                                  | 0.3071 | 0.3071                                 | 0.4783 | 0.0  | 0.5  | 0.3084    | nan |
| 0.5949        | 5.0   | 75   | 0.9207          | 1.3921 | 1.3921                                   | 1.2079 | 1.2079                                  | 0.2988 | 0.2988                                 | 0.4783 | 0.0  | 0.5  | 0.3084    | nan |
| 0.4938        | 6.0   | 90   | 0.8591          | 1.3448 | 1.3448                                   | 1.1842 | 1.1842                                  | 0.3458 | 0.3458                                 | 0.4783 | 0.0  | 0.5  | 0.3084    | nan |
| 0.3611        | 7.0   | 105  | 0.8176          | 1.3119 | 1.3119                                   | 1.1454 | 1.1454                                  | 0.3774 | 0.3774                                 | 0.5652 | 0.0  | 0.5  | 0.4091    | nan |
| 0.2663        | 8.0   | 120  | 0.6879          | 1.2034 | 1.2034                                   | 1.0300 | 1.0300                                  | 0.4761 | 0.4761                                 | 0.5652 | 0.0  | 0.5  | 0.4091    | nan |
| 0.1833        | 9.0   | 135  | 0.7704          | 1.2735 | 1.2735                                   | 1.1031 | 1.1031                                  | 0.4133 | 0.4133                                 | 0.5652 | 0.0  | 0.5  | 0.3152    | nan |
| 0.1704        | 10.0  | 150  | 0.7097          | 1.2222 | 1.2222                                   | 1.0382 | 1.0382                                  | 0.4596 | 0.4596                                 | 0.4783 | 0.0  | 0.5  | 0.3084    | nan |
| 0.1219        | 11.0  | 165  | 0.6872          | 1.2027 | 1.2027                                   | 1.0198 | 1.0198                                  | 0.4767 | 0.4767                                 | 0.4783 | 0.0  | 0.5  | 0.3084    | nan |
| 0.1011        | 12.0  | 180  | 0.7201          | 1.2312 | 1.2312                                   | 1.0466 | 1.0466                                  | 0.4516 | 0.4516                                 | 0.5652 | 0.0  | 0.5  | 0.3152    | nan |
| 0.0849        | 13.0  | 195  | 0.7267          | 1.2368 | 1.2368                                   | 1.0454 | 1.0454                                  | 0.4466 | 0.4466                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
| 0.0818        | 14.0  | 210  | 0.7361          | 1.2448 | 1.2448                                   | 1.0565 | 1.0565                                  | 0.4394 | 0.4394                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
| 0.0634        | 15.0  | 225  | 0.7158          | 1.2275 | 1.2275                                   | 1.0384 | 1.0384                                  | 0.4549 | 0.4549                                 | 0.3913 | 0.0  | 0.5  | 0.3306    | nan |
| 0.065         | 16.0  | 240  | 0.7394          | 1.2475 | 1.2475                                   | 1.0659 | 1.0659                                  | 0.4369 | 0.4369                                 | 0.3913 | 0.0  | 0.5  | 0.3306    | nan |
| 0.0541        | 17.0  | 255  | 0.7642          | 1.2683 | 1.2683                                   | 1.0496 | 1.0496                                  | 0.4181 | 0.4181                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
| 0.0577        | 18.0  | 270  | 0.7137          | 1.2257 | 1.2257                                   | 1.0303 | 1.0303                                  | 0.4565 | 0.4565                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
| 0.0474        | 19.0  | 285  | 0.7393          | 1.2475 | 1.2475                                   | 1.0447 | 1.0447                                  | 0.4370 | 0.4370                                 | 0.4783 | 0.0  | 0.5  | 0.3084    | nan |
| 0.0494        | 20.0  | 300  | 0.7157          | 1.2274 | 1.2274                                   | 1.0453 | 1.0453                                  | 0.4550 | 0.4550                                 | 0.4783 | 0.0  | 0.5  | 0.3084    | nan |
| 0.0434        | 21.0  | 315  | 0.7248          | 1.2352 | 1.2352                                   | 1.0462 | 1.0462                                  | 0.4480 | 0.4480                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
| 0.049         | 22.0  | 330  | 0.7384          | 1.2467 | 1.2467                                   | 1.0613 | 1.0613                                  | 0.4377 | 0.4377                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
| 0.0405        | 23.0  | 345  | 0.7420          | 1.2498 | 1.2498                                   | 1.0653 | 1.0653                                  | 0.4349 | 0.4349                                 | 0.3913 | 0.0  | 0.5  | 0.3306    | nan |
| 0.0398        | 24.0  | 360  | 0.7355          | 1.2442 | 1.2442                                   | 1.0620 | 1.0620                                  | 0.4399 | 0.4399                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
| 0.0398        | 25.0  | 375  | 0.7570          | 1.2623 | 1.2623                                   | 1.0698 | 1.0698                                  | 0.4235 | 0.4235                                 | 0.3913 | 0.0  | 0.5  | 0.3306    | nan |
| 0.0345        | 26.0  | 390  | 0.7359          | 1.2446 | 1.2446                                   | 1.0610 | 1.0610                                  | 0.4396 | 0.4396                                 | 0.5652 | 0.0  | 0.5  | 0.3152    | nan |
| 0.0345        | 27.0  | 405  | 0.7417          | 1.2495 | 1.2495                                   | 1.0660 | 1.0660                                  | 0.4352 | 0.4352                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
| 0.0386        | 28.0  | 420  | 0.7215          | 1.2323 | 1.2323                                   | 1.0514 | 1.0514                                  | 0.4506 | 0.4506                                 | 0.4783 | 0.0  | 0.5  | 0.3084    | nan |
| 0.0372        | 29.0  | 435  | 0.7140          | 1.2260 | 1.2260                                   | 1.0477 | 1.0477                                  | 0.4562 | 0.4562                                 | 0.5652 | 0.0  | 0.5  | 0.4091    | nan |
| 0.0407        | 30.0  | 450  | 0.7139          | 1.2259 | 1.2259                                   | 1.0480 | 1.0480                                  | 0.4563 | 0.4563                                 | 0.4783 | 0.0  | 0.5  | 0.3953    | nan |
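
The RMSE, MAE, and R2 columns can be recomputed from raw predictions; a sketch of a `compute_metrics` function (assuming scikit-learn is available) is shown below. The Cos, Pair, Rank, Neighbors, and Rsa columns are custom metrics whose definitions are not given in this card and are not reproduced here.

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def compute_metrics(eval_pred):
    """Hypothetical metric function matching the RMSE/MAE/R2 columns reported above."""
    predictions, labels = eval_pred
    predictions = np.squeeze(predictions, axis=-1)
    return {
        "rmse": mean_squared_error(labels, predictions, squared=False),
        "mae": mean_absolute_error(labels, predictions),
        "r2": r2_score(labels, predictions),
    }
```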


### Framework versions

- Transformers 4.16.2
- Pytorch 1.10.2+cu113
- Datasets 1.18.3
- Tokenizers 0.11.0