adriansanz committed
Commit 61e41b1 · verified · 1 Parent(s): 531e06e

End of training
README.md CHANGED
@@ -9,23 +9,23 @@ metrics:
 - recall
 - f1
 model-index:
-- name: VICH_300524_epoch_2
+- name: VICH_300524_epoch_3
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# VICH_300524_epoch_2
+# VICH_300524_epoch_3
 
 This model is a fine-tuned version of [projecte-aina/roberta-base-ca-v2-cased-te](https://huggingface.co/projecte-aina/roberta-base-ca-v2-cased-te) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.4088
-- Accuracy: 0.945
-- Precision: 0.9459
-- Recall: 0.9450
-- F1: 0.9450
-- Ratio: 0.477
+- Loss: 0.3866
+- Accuracy: 0.954
+- Precision: 0.9552
+- Recall: 0.954
+- F1: 0.9540
+- Ratio: 0.474
 
 ## Model description
 
@@ -61,69 +61,69 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Ratio |
 |:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:-----:|
-| 0.32 | 0.0157 | 10 | 0.4469 | 0.939 | 0.9403 | 0.9390 | 0.9390 | 0.473 |
-| 0.3466 | 0.0314 | 20 | 0.4538 | 0.935 | 0.9354 | 0.935 | 0.9350 | 0.515 |
-| 0.3494 | 0.0472 | 30 | 0.4344 | 0.943 | 0.9435 | 0.9430 | 0.9430 | 0.483 |
-| 0.302 | 0.0629 | 40 | 0.4569 | 0.937 | 0.9375 | 0.937 | 0.9370 | 0.483 |
-| 0.3517 | 0.0786 | 50 | 0.5094 | 0.914 | 0.9169 | 0.9140 | 0.9138 | 0.542 |
-| 0.3754 | 0.0943 | 60 | 0.4491 | 0.934 | 0.9363 | 0.9340 | 0.9339 | 0.464 |
-| 0.3253 | 0.1101 | 70 | 0.4673 | 0.931 | 0.9334 | 0.931 | 0.9309 | 0.463 |
-| 0.368 | 0.1258 | 80 | 0.4525 | 0.93 | 0.9301 | 0.93 | 0.9300 | 0.506 |
-| 0.3782 | 0.1415 | 90 | 0.4497 | 0.93 | 0.9308 | 0.9300 | 0.9300 | 0.478 |
-| 0.378 | 0.1572 | 100 | 0.4438 | 0.929 | 0.9292 | 0.929 | 0.9290 | 0.511 |
-| 0.3678 | 0.1730 | 110 | 0.4277 | 0.939 | 0.9393 | 0.9390 | 0.9390 | 0.487 |
-| 0.3363 | 0.1887 | 120 | 0.4635 | 0.936 | 0.9388 | 0.9360 | 0.9359 | 0.46 |
-| 0.364 | 0.2044 | 130 | 0.4544 | 0.932 | 0.9321 | 0.9320 | 0.9320 | 0.508 |
-| 0.3535 | 0.2201 | 140 | 0.4624 | 0.925 | 0.9259 | 0.925 | 0.9250 | 0.523 |
-| 0.3675 | 0.2358 | 150 | 0.4482 | 0.924 | 0.9242 | 0.924 | 0.9240 | 0.51 |
-| 0.345 | 0.2516 | 160 | 0.4362 | 0.938 | 0.9394 | 0.938 | 0.9380 | 0.472 |
-| 0.3305 | 0.2673 | 170 | 0.4420 | 0.935 | 0.9365 | 0.935 | 0.9349 | 0.471 |
-| 0.3705 | 0.2830 | 180 | 0.4282 | 0.938 | 0.9388 | 0.938 | 0.9380 | 0.478 |
-| 0.3425 | 0.2987 | 190 | 0.4253 | 0.937 | 0.9375 | 0.937 | 0.9370 | 0.483 |
-| 0.3122 | 0.3145 | 200 | 0.4427 | 0.941 | 0.9416 | 0.9410 | 0.9410 | 0.481 |
-| 0.3354 | 0.3302 | 210 | 0.4460 | 0.94 | 0.9401 | 0.94 | 0.9400 | 0.494 |
-| 0.3439 | 0.3459 | 220 | 0.4355 | 0.94 | 0.9400 | 0.94 | 0.9400 | 0.498 |
-| 0.3384 | 0.3616 | 230 | 0.4414 | 0.935 | 0.9351 | 0.935 | 0.9350 | 0.509 |
-| 0.3466 | 0.3774 | 240 | 0.4438 | 0.939 | 0.9403 | 0.9390 | 0.9390 | 0.473 |
-| 0.3461 | 0.3931 | 250 | 0.4327 | 0.939 | 0.9403 | 0.9390 | 0.9390 | 0.473 |
-| 0.3279 | 0.4088 | 260 | 0.4386 | 0.937 | 0.9376 | 0.937 | 0.9370 | 0.481 |
-| 0.3674 | 0.4245 | 270 | 0.4366 | 0.936 | 0.9360 | 0.9360 | 0.9360 | 0.498 |
-| 0.3193 | 0.4403 | 280 | 0.4265 | 0.941 | 0.9413 | 0.9410 | 0.9410 | 0.487 |
-| 0.3444 | 0.4560 | 290 | 0.4313 | 0.942 | 0.9434 | 0.942 | 0.9420 | 0.472 |
-| 0.351 | 0.4717 | 300 | 0.4224 | 0.946 | 0.9469 | 0.946 | 0.9460 | 0.478 |
-| 0.3459 | 0.4874 | 310 | 0.4182 | 0.944 | 0.9443 | 0.944 | 0.9440 | 0.486 |
-| 0.3566 | 0.5031 | 320 | 0.4228 | 0.947 | 0.9473 | 0.9470 | 0.9470 | 0.487 |
-| 0.3379 | 0.5189 | 330 | 0.4268 | 0.945 | 0.9451 | 0.9450 | 0.9450 | 0.491 |
-| 0.3358 | 0.5346 | 340 | 0.4244 | 0.948 | 0.9489 | 0.948 | 0.9480 | 0.478 |
-| 0.4231 | 0.5503 | 350 | 0.4203 | 0.948 | 0.9496 | 0.948 | 0.9480 | 0.47 |
-| 0.3911 | 0.5660 | 360 | 0.4090 | 0.947 | 0.9476 | 0.9470 | 0.9470 | 0.481 |
-| 0.3662 | 0.5818 | 370 | 0.4065 | 0.944 | 0.9449 | 0.944 | 0.9440 | 0.478 |
-| 0.3847 | 0.5975 | 380 | 0.4053 | 0.943 | 0.9439 | 0.9430 | 0.9430 | 0.477 |
-| 0.3477 | 0.6132 | 390 | 0.4140 | 0.946 | 0.9462 | 0.946 | 0.9460 | 0.49 |
-| 0.3368 | 0.6289 | 400 | 0.4168 | 0.948 | 0.9485 | 0.948 | 0.9480 | 0.484 |
-| 0.3599 | 0.6447 | 410 | 0.4250 | 0.944 | 0.9454 | 0.944 | 0.9440 | 0.472 |
-| 0.373 | 0.6604 | 420 | 0.4207 | 0.945 | 0.9461 | 0.9450 | 0.9450 | 0.475 |
-| 0.3644 | 0.6761 | 430 | 0.4159 | 0.95 | 0.9504 | 0.95 | 0.9500 | 0.486 |
-| 0.3865 | 0.6918 | 440 | 0.4169 | 0.948 | 0.9482 | 0.948 | 0.9480 | 0.49 |
-| 0.4083 | 0.7075 | 450 | 0.4157 | 0.948 | 0.9487 | 0.948 | 0.9480 | 0.48 |
-| 0.3864 | 0.7233 | 460 | 0.4206 | 0.945 | 0.9458 | 0.9450 | 0.9450 | 0.479 |
-| 0.4147 | 0.7390 | 470 | 0.4251 | 0.942 | 0.9425 | 0.942 | 0.9420 | 0.484 |
-| 0.4165 | 0.7547 | 480 | 0.4301 | 0.941 | 0.9415 | 0.9410 | 0.9410 | 0.483 |
-| 0.4301 | 0.7704 | 490 | 0.4333 | 0.939 | 0.9398 | 0.9390 | 0.9390 | 0.479 |
-| 0.3616 | 0.7862 | 500 | 0.4341 | 0.938 | 0.9388 | 0.938 | 0.9380 | 0.478 |
-| 0.4109 | 0.8019 | 510 | 0.4343 | 0.938 | 0.9387 | 0.938 | 0.9380 | 0.48 |
-| 0.447 | 0.8176 | 520 | 0.4296 | 0.939 | 0.9398 | 0.9390 | 0.9390 | 0.479 |
-| 0.4226 | 0.8333 | 530 | 0.4239 | 0.94 | 0.9409 | 0.94 | 0.9400 | 0.478 |
-| 0.4284 | 0.8491 | 540 | 0.4193 | 0.941 | 0.9418 | 0.9410 | 0.9410 | 0.479 |
-| 0.4233 | 0.8648 | 550 | 0.4155 | 0.943 | 0.9438 | 0.9430 | 0.9430 | 0.479 |
-| 0.4788 | 0.8805 | 560 | 0.4132 | 0.943 | 0.9439 | 0.9430 | 0.9430 | 0.477 |
-| 0.3963 | 0.8962 | 570 | 0.4117 | 0.944 | 0.9447 | 0.944 | 0.9440 | 0.48 |
-| 0.4601 | 0.9119 | 580 | 0.4103 | 0.943 | 0.9438 | 0.9430 | 0.9430 | 0.479 |
-| 0.4265 | 0.9277 | 590 | 0.4093 | 0.943 | 0.9438 | 0.9430 | 0.9430 | 0.479 |
-| 0.4114 | 0.9434 | 600 | 0.4092 | 0.943 | 0.9438 | 0.9430 | 0.9430 | 0.479 |
-| 0.4356 | 0.9591 | 610 | 0.4090 | 0.944 | 0.9449 | 0.944 | 0.9440 | 0.478 |
-| 0.4465 | 0.9748 | 620 | 0.4089 | 0.945 | 0.9459 | 0.9450 | 0.9450 | 0.477 |
-| 0.3974 | 0.9906 | 630 | 0.4089 | 0.945 | 0.9459 | 0.9450 | 0.9450 | 0.477 |
+| 0.339 | 0.0157 | 10 | 0.4216 | 0.945 | 0.9453 | 0.9450 | 0.9450 | 0.487 |
+| 0.3573 | 0.0314 | 20 | 0.4397 | 0.943 | 0.9430 | 0.943 | 0.9430 | 0.501 |
+| 0.4019 | 0.0472 | 30 | 0.4330 | 0.945 | 0.9452 | 0.9450 | 0.9450 | 0.489 |
+| 0.3443 | 0.0629 | 40 | 0.4368 | 0.942 | 0.9434 | 0.942 | 0.9420 | 0.472 |
+| 0.3805 | 0.0786 | 50 | 0.4335 | 0.933 | 0.9331 | 0.933 | 0.9330 | 0.507 |
+| 0.3837 | 0.0943 | 60 | 0.4273 | 0.938 | 0.9380 | 0.938 | 0.9380 | 0.498 |
+| 0.3428 | 0.1101 | 70 | 0.4313 | 0.94 | 0.9403 | 0.94 | 0.9400 | 0.488 |
+| 0.3954 | 0.1258 | 80 | 0.4323 | 0.945 | 0.9458 | 0.9450 | 0.9450 | 0.479 |
+| 0.4144 | 0.1415 | 90 | 0.4299 | 0.94 | 0.9400 | 0.94 | 0.9400 | 0.502 |
+| 0.3481 | 0.1572 | 100 | 0.4249 | 0.939 | 0.9391 | 0.9390 | 0.9390 | 0.491 |
+| 0.3825 | 0.1730 | 110 | 0.4293 | 0.942 | 0.9420 | 0.942 | 0.9420 | 0.498 |
+| 0.3605 | 0.1887 | 120 | 0.4130 | 0.949 | 0.9498 | 0.9490 | 0.9490 | 0.479 |
+| 0.4028 | 0.2044 | 130 | 0.4105 | 0.948 | 0.9490 | 0.948 | 0.9480 | 0.476 |
+| 0.3729 | 0.2201 | 140 | 0.4324 | 0.939 | 0.9391 | 0.9390 | 0.9390 | 0.507 |
+| 0.3611 | 0.2358 | 150 | 0.4255 | 0.937 | 0.9371 | 0.937 | 0.9370 | 0.491 |
+| 0.3683 | 0.2516 | 160 | 0.4290 | 0.943 | 0.9443 | 0.9430 | 0.9430 | 0.473 |
+| 0.351 | 0.2673 | 170 | 0.4215 | 0.942 | 0.9426 | 0.942 | 0.9420 | 0.482 |
+| 0.3697 | 0.2830 | 180 | 0.4280 | 0.944 | 0.9441 | 0.944 | 0.9440 | 0.492 |
+| 0.3851 | 0.2987 | 190 | 0.4251 | 0.945 | 0.9461 | 0.9450 | 0.9450 | 0.475 |
+| 0.335 | 0.3145 | 200 | 0.4276 | 0.945 | 0.9455 | 0.9450 | 0.9450 | 0.483 |
+| 0.3744 | 0.3302 | 210 | 0.4173 | 0.947 | 0.9476 | 0.9470 | 0.9470 | 0.481 |
+| 0.376 | 0.3459 | 220 | 0.4080 | 0.947 | 0.9478 | 0.9470 | 0.9470 | 0.479 |
+| 0.3856 | 0.3616 | 230 | 0.4131 | 0.947 | 0.9472 | 0.9470 | 0.9470 | 0.489 |
+| 0.4036 | 0.3774 | 240 | 0.4285 | 0.937 | 0.9370 | 0.937 | 0.9370 | 0.503 |
+| 0.3863 | 0.3931 | 250 | 0.4159 | 0.939 | 0.9396 | 0.9390 | 0.9390 | 0.481 |
+| 0.3619 | 0.4088 | 260 | 0.4212 | 0.944 | 0.9446 | 0.944 | 0.9440 | 0.482 |
+| 0.4042 | 0.4245 | 270 | 0.4233 | 0.941 | 0.9411 | 0.9410 | 0.9410 | 0.493 |
+| 0.3783 | 0.4403 | 280 | 0.4153 | 0.939 | 0.9390 | 0.9390 | 0.9390 | 0.505 |
+| 0.3744 | 0.4560 | 290 | 0.4170 | 0.943 | 0.9447 | 0.9430 | 0.9429 | 0.469 |
+| 0.4052 | 0.4717 | 300 | 0.4219 | 0.94 | 0.9423 | 0.94 | 0.9399 | 0.464 |
+| 0.3531 | 0.4874 | 310 | 0.4049 | 0.949 | 0.9493 | 0.9490 | 0.9490 | 0.487 |
+| 0.3812 | 0.5031 | 320 | 0.4042 | 0.951 | 0.9520 | 0.9510 | 0.9510 | 0.477 |
+| 0.3587 | 0.5189 | 330 | 0.4030 | 0.95 | 0.9509 | 0.95 | 0.9500 | 0.478 |
+| 0.3455 | 0.5346 | 340 | 0.4007 | 0.951 | 0.9512 | 0.951 | 0.9510 | 0.489 |
+| 0.4174 | 0.5503 | 350 | 0.3989 | 0.952 | 0.9525 | 0.952 | 0.9520 | 0.484 |
+| 0.4173 | 0.5660 | 360 | 0.4004 | 0.948 | 0.9487 | 0.948 | 0.9480 | 0.48 |
+| 0.4012 | 0.5818 | 370 | 0.3956 | 0.95 | 0.9504 | 0.95 | 0.9500 | 0.486 |
+| 0.388 | 0.5975 | 380 | 0.3968 | 0.949 | 0.9490 | 0.949 | 0.9490 | 0.495 |
+| 0.3613 | 0.6132 | 390 | 0.3978 | 0.948 | 0.9482 | 0.948 | 0.9480 | 0.49 |
+| 0.3699 | 0.6289 | 400 | 0.3988 | 0.956 | 0.9563 | 0.956 | 0.9560 | 0.488 |
+| 0.3585 | 0.6447 | 410 | 0.3967 | 0.956 | 0.9569 | 0.956 | 0.9560 | 0.478 |
+| 0.4017 | 0.6604 | 420 | 0.3888 | 0.959 | 0.9595 | 0.959 | 0.9590 | 0.483 |
+| 0.3657 | 0.6761 | 430 | 0.3898 | 0.954 | 0.9541 | 0.954 | 0.9540 | 0.494 |
+| 0.413 | 0.6918 | 440 | 0.3923 | 0.955 | 0.9550 | 0.955 | 0.9550 | 0.499 |
+| 0.3977 | 0.7075 | 450 | 0.3884 | 0.955 | 0.9551 | 0.955 | 0.9550 | 0.491 |
+| 0.4066 | 0.7233 | 460 | 0.3869 | 0.959 | 0.9593 | 0.959 | 0.9590 | 0.487 |
+| 0.3908 | 0.7390 | 470 | 0.3878 | 0.956 | 0.9561 | 0.956 | 0.9560 | 0.492 |
+| 0.4041 | 0.7547 | 480 | 0.3872 | 0.958 | 0.9584 | 0.958 | 0.9580 | 0.486 |
+| 0.4191 | 0.7704 | 490 | 0.3945 | 0.952 | 0.9534 | 0.952 | 0.9520 | 0.472 |
+| 0.3443 | 0.7862 | 500 | 0.3932 | 0.949 | 0.9500 | 0.9490 | 0.9490 | 0.477 |
+| 0.3735 | 0.8019 | 510 | 0.3934 | 0.955 | 0.9552 | 0.955 | 0.9550 | 0.489 |
+| 0.3913 | 0.8176 | 520 | 0.3965 | 0.954 | 0.9541 | 0.954 | 0.9540 | 0.494 |
+| 0.4038 | 0.8333 | 530 | 0.3949 | 0.953 | 0.9531 | 0.953 | 0.9530 | 0.493 |
+| 0.4055 | 0.8491 | 540 | 0.3933 | 0.952 | 0.9524 | 0.952 | 0.9520 | 0.486 |
+| 0.4073 | 0.8648 | 550 | 0.3932 | 0.954 | 0.9546 | 0.954 | 0.9540 | 0.482 |
+| 0.4471 | 0.8805 | 560 | 0.3944 | 0.952 | 0.9532 | 0.952 | 0.9520 | 0.474 |
+| 0.4098 | 0.8962 | 570 | 0.3942 | 0.951 | 0.9525 | 0.9510 | 0.9510 | 0.471 |
+| 0.4512 | 0.9119 | 580 | 0.3933 | 0.952 | 0.9534 | 0.952 | 0.9520 | 0.472 |
+| 0.4309 | 0.9277 | 590 | 0.3914 | 0.952 | 0.9534 | 0.952 | 0.9520 | 0.472 |
+| 0.3962 | 0.9434 | 600 | 0.3894 | 0.953 | 0.9543 | 0.9530 | 0.9530 | 0.473 |
+| 0.4242 | 0.9591 | 610 | 0.3878 | 0.953 | 0.9543 | 0.9530 | 0.9530 | 0.473 |
+| 0.3824 | 0.9748 | 620 | 0.3869 | 0.954 | 0.9552 | 0.954 | 0.9540 | 0.474 |
+| 0.3837 | 0.9906 | 630 | 0.3867 | 0.954 | 0.9552 | 0.954 | 0.9540 | 0.474 |
 
 
 ### Framework versions
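The card above reports Accuracy, Precision, Recall, F1, and a "Ratio" column at each evaluation step. As a minimal sketch of how such per-step metrics could be computed for a binary entailment task (this is not the card's actual evaluation code, which is not shown; the card does not define "Ratio", so interpreting it as the share of positive predictions is an assumption, and the reported precision/recall may additionally use weighted averaging):

```python
def evaluate(y_true, y_pred):
    """Compute binary-classification metrics from 0/1 label lists.

    "ratio" is assumed (hypothetically) to be the fraction of
    positive predictions; the model card does not define it.
    """
    assert len(y_true) == len(y_pred) and y_true
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    ratio = sum(y_pred) / len(y_pred)  # assumed: share of positive predictions
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1, "ratio": ratio}
```

On a balanced evaluation set, a "ratio" near 0.5 (as in the table, e.g. 0.474 at the final step) would indicate the model is not collapsing toward one class.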
logs/events.out.tfevents.1722355177.183c440d058f.2175.6 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e1ce10cc877cc0e59f3446c3d0a3c973288f8c8e5dacb071a5958f9616227f77
+size 609
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:4532cf6f1ce006960a920371e6b67a8b398227506747933bfc5a3d6ee3a4fc01
+oid sha256:2c64857b2cecead479cb328718e2b3610c447c0df1166ff1373b7df94a63b307
 size 498606684
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:a179bca9489ffb5a2caf811aa6548ea99927dc0427f6ca9fc316d86fa7923721
+oid sha256:60e8c24f4f67b17b31bdbe1b4bf24f198c5753f44b8877e12ee80937434ab48a
 size 5176