adriansanz
committed on
End of training

Browse files
- README.md +71 -71
- logs/events.out.tfevents.1722355177.183c440d058f.2175.6 +3 -0
- model.safetensors +1 -1
- training_args.bin +1 -1
README.md
CHANGED
@@ -9,23 +9,23 @@ metrics:
 - recall
 - f1
 model-index:
-- name:
+- name: VICH_300524_epoch_3
   results: []
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->

-#
+# VICH_300524_epoch_3

 This model is a fine-tuned version of [projecte-aina/roberta-base-ca-v2-cased-te](https://huggingface.co/projecte-aina/roberta-base-ca-v2-cased-te) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Accuracy: 0.
-- Precision: 0.
-- Recall: 0.
-- F1: 0.
-- Ratio: 0.
+- Loss: 0.3866
+- Accuracy: 0.954
+- Precision: 0.9552
+- Recall: 0.954
+- F1: 0.9540
+- Ratio: 0.474

 ## Model description

@@ -61,69 +61,69 @@ The following hyperparameters were used during training:

 | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Ratio |
 |:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:-----:|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
+| 0.339 | 0.0157 | 10 | 0.4216 | 0.945 | 0.9453 | 0.9450 | 0.9450 | 0.487 |
+| 0.3573 | 0.0314 | 20 | 0.4397 | 0.943 | 0.9430 | 0.943 | 0.9430 | 0.501 |
+| 0.4019 | 0.0472 | 30 | 0.4330 | 0.945 | 0.9452 | 0.9450 | 0.9450 | 0.489 |
+| 0.3443 | 0.0629 | 40 | 0.4368 | 0.942 | 0.9434 | 0.942 | 0.9420 | 0.472 |
+| 0.3805 | 0.0786 | 50 | 0.4335 | 0.933 | 0.9331 | 0.933 | 0.9330 | 0.507 |
+| 0.3837 | 0.0943 | 60 | 0.4273 | 0.938 | 0.9380 | 0.938 | 0.9380 | 0.498 |
+| 0.3428 | 0.1101 | 70 | 0.4313 | 0.94 | 0.9403 | 0.94 | 0.9400 | 0.488 |
+| 0.3954 | 0.1258 | 80 | 0.4323 | 0.945 | 0.9458 | 0.9450 | 0.9450 | 0.479 |
+| 0.4144 | 0.1415 | 90 | 0.4299 | 0.94 | 0.9400 | 0.94 | 0.9400 | 0.502 |
+| 0.3481 | 0.1572 | 100 | 0.4249 | 0.939 | 0.9391 | 0.9390 | 0.9390 | 0.491 |
+| 0.3825 | 0.1730 | 110 | 0.4293 | 0.942 | 0.9420 | 0.942 | 0.9420 | 0.498 |
+| 0.3605 | 0.1887 | 120 | 0.4130 | 0.949 | 0.9498 | 0.9490 | 0.9490 | 0.479 |
+| 0.4028 | 0.2044 | 130 | 0.4105 | 0.948 | 0.9490 | 0.948 | 0.9480 | 0.476 |
+| 0.3729 | 0.2201 | 140 | 0.4324 | 0.939 | 0.9391 | 0.9390 | 0.9390 | 0.507 |
+| 0.3611 | 0.2358 | 150 | 0.4255 | 0.937 | 0.9371 | 0.937 | 0.9370 | 0.491 |
+| 0.3683 | 0.2516 | 160 | 0.4290 | 0.943 | 0.9443 | 0.9430 | 0.9430 | 0.473 |
+| 0.351 | 0.2673 | 170 | 0.4215 | 0.942 | 0.9426 | 0.942 | 0.9420 | 0.482 |
+| 0.3697 | 0.2830 | 180 | 0.4280 | 0.944 | 0.9441 | 0.944 | 0.9440 | 0.492 |
+| 0.3851 | 0.2987 | 190 | 0.4251 | 0.945 | 0.9461 | 0.9450 | 0.9450 | 0.475 |
+| 0.335 | 0.3145 | 200 | 0.4276 | 0.945 | 0.9455 | 0.9450 | 0.9450 | 0.483 |
+| 0.3744 | 0.3302 | 210 | 0.4173 | 0.947 | 0.9476 | 0.9470 | 0.9470 | 0.481 |
+| 0.376 | 0.3459 | 220 | 0.4080 | 0.947 | 0.9478 | 0.9470 | 0.9470 | 0.479 |
+| 0.3856 | 0.3616 | 230 | 0.4131 | 0.947 | 0.9472 | 0.9470 | 0.9470 | 0.489 |
+| 0.4036 | 0.3774 | 240 | 0.4285 | 0.937 | 0.9370 | 0.937 | 0.9370 | 0.503 |
+| 0.3863 | 0.3931 | 250 | 0.4159 | 0.939 | 0.9396 | 0.9390 | 0.9390 | 0.481 |
+| 0.3619 | 0.4088 | 260 | 0.4212 | 0.944 | 0.9446 | 0.944 | 0.9440 | 0.482 |
+| 0.4042 | 0.4245 | 270 | 0.4233 | 0.941 | 0.9411 | 0.9410 | 0.9410 | 0.493 |
+| 0.3783 | 0.4403 | 280 | 0.4153 | 0.939 | 0.9390 | 0.9390 | 0.9390 | 0.505 |
+| 0.3744 | 0.4560 | 290 | 0.4170 | 0.943 | 0.9447 | 0.9430 | 0.9429 | 0.469 |
+| 0.4052 | 0.4717 | 300 | 0.4219 | 0.94 | 0.9423 | 0.94 | 0.9399 | 0.464 |
+| 0.3531 | 0.4874 | 310 | 0.4049 | 0.949 | 0.9493 | 0.9490 | 0.9490 | 0.487 |
+| 0.3812 | 0.5031 | 320 | 0.4042 | 0.951 | 0.9520 | 0.9510 | 0.9510 | 0.477 |
+| 0.3587 | 0.5189 | 330 | 0.4030 | 0.95 | 0.9509 | 0.95 | 0.9500 | 0.478 |
+| 0.3455 | 0.5346 | 340 | 0.4007 | 0.951 | 0.9512 | 0.951 | 0.9510 | 0.489 |
+| 0.4174 | 0.5503 | 350 | 0.3989 | 0.952 | 0.9525 | 0.952 | 0.9520 | 0.484 |
+| 0.4173 | 0.5660 | 360 | 0.4004 | 0.948 | 0.9487 | 0.948 | 0.9480 | 0.48 |
+| 0.4012 | 0.5818 | 370 | 0.3956 | 0.95 | 0.9504 | 0.95 | 0.9500 | 0.486 |
+| 0.388 | 0.5975 | 380 | 0.3968 | 0.949 | 0.9490 | 0.949 | 0.9490 | 0.495 |
+| 0.3613 | 0.6132 | 390 | 0.3978 | 0.948 | 0.9482 | 0.948 | 0.9480 | 0.49 |
+| 0.3699 | 0.6289 | 400 | 0.3988 | 0.956 | 0.9563 | 0.956 | 0.9560 | 0.488 |
+| 0.3585 | 0.6447 | 410 | 0.3967 | 0.956 | 0.9569 | 0.956 | 0.9560 | 0.478 |
+| 0.4017 | 0.6604 | 420 | 0.3888 | 0.959 | 0.9595 | 0.959 | 0.9590 | 0.483 |
+| 0.3657 | 0.6761 | 430 | 0.3898 | 0.954 | 0.9541 | 0.954 | 0.9540 | 0.494 |
+| 0.413 | 0.6918 | 440 | 0.3923 | 0.955 | 0.9550 | 0.955 | 0.9550 | 0.499 |
+| 0.3977 | 0.7075 | 450 | 0.3884 | 0.955 | 0.9551 | 0.955 | 0.9550 | 0.491 |
+| 0.4066 | 0.7233 | 460 | 0.3869 | 0.959 | 0.9593 | 0.959 | 0.9590 | 0.487 |
+| 0.3908 | 0.7390 | 470 | 0.3878 | 0.956 | 0.9561 | 0.956 | 0.9560 | 0.492 |
+| 0.4041 | 0.7547 | 480 | 0.3872 | 0.958 | 0.9584 | 0.958 | 0.9580 | 0.486 |
+| 0.4191 | 0.7704 | 490 | 0.3945 | 0.952 | 0.9534 | 0.952 | 0.9520 | 0.472 |
+| 0.3443 | 0.7862 | 500 | 0.3932 | 0.949 | 0.9500 | 0.9490 | 0.9490 | 0.477 |
+| 0.3735 | 0.8019 | 510 | 0.3934 | 0.955 | 0.9552 | 0.955 | 0.9550 | 0.489 |
+| 0.3913 | 0.8176 | 520 | 0.3965 | 0.954 | 0.9541 | 0.954 | 0.9540 | 0.494 |
+| 0.4038 | 0.8333 | 530 | 0.3949 | 0.953 | 0.9531 | 0.953 | 0.9530 | 0.493 |
+| 0.4055 | 0.8491 | 540 | 0.3933 | 0.952 | 0.9524 | 0.952 | 0.9520 | 0.486 |
+| 0.4073 | 0.8648 | 550 | 0.3932 | 0.954 | 0.9546 | 0.954 | 0.9540 | 0.482 |
+| 0.4471 | 0.8805 | 560 | 0.3944 | 0.952 | 0.9532 | 0.952 | 0.9520 | 0.474 |
+| 0.4098 | 0.8962 | 570 | 0.3942 | 0.951 | 0.9525 | 0.9510 | 0.9510 | 0.471 |
+| 0.4512 | 0.9119 | 580 | 0.3933 | 0.952 | 0.9534 | 0.952 | 0.9520 | 0.472 |
+| 0.4309 | 0.9277 | 590 | 0.3914 | 0.952 | 0.9534 | 0.952 | 0.9520 | 0.472 |
+| 0.3962 | 0.9434 | 600 | 0.3894 | 0.953 | 0.9543 | 0.9530 | 0.9530 | 0.473 |
+| 0.4242 | 0.9591 | 610 | 0.3878 | 0.953 | 0.9543 | 0.9530 | 0.9530 | 0.473 |
+| 0.3824 | 0.9748 | 620 | 0.3869 | 0.954 | 0.9552 | 0.954 | 0.9540 | 0.474 |
+| 0.3837 | 0.9906 | 630 | 0.3867 | 0.954 | 0.9552 | 0.954 | 0.9540 | 0.474 |


 ### Framework versions
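For context on the card updated above: a minimal inference sketch for the fine-tuned classifier. The repo id, the sequence-classification head, and the example sentence pair are assumptions inferred from the card (base model roberta-base-ca-v2-cased-te, an entailment-style checkpoint), not confirmed by this commit.

```python
# Minimal sketch, assuming the model is published under the model-index name
# "adriansanz/VICH_300524_epoch_3" (hypothetical repo id) with a sequence-classification head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "adriansanz/VICH_300524_epoch_3"  # assumed; adjust to the actual Hub repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# The base model is a Catalan textual-entailment checkpoint, so a premise/hypothesis
# pair is assumed here; the sentences are arbitrary examples.
premise = "L'Ajuntament ha obert el termini de sol·licituds."
hypothesis = "Es poden presentar sol·licituds a l'Ajuntament."

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)[0]
pred_id = int(probs.argmax())
print(model.config.id2label.get(pred_id, pred_id), round(float(probs[pred_id]), 4))
```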
logs/events.out.tfevents.1722355177.183c440d058f.2175.6
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:e1ce10cc877cc0e59f3446c3d0a3c973288f8c8e5dacb071a5958f9616227f77
+size 609
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:2c64857b2cecead479cb328718e2b3610c447c0df1166ff1373b7df94a63b307
 size 498606684
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:60e8c24f4f67b17b31bdbe1b4bf24f198c5753f44b8877e12ee80937434ab48a
 size 5176
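The binary entries above are Git LFS pointer files (spec v1): each records only the sha256 oid and the byte size of the real artifact, which lives in LFS storage. A minimal sketch, assuming the artifact has already been downloaded locally, of checking a file against its pointer:

```python
# Minimal sketch: verify a downloaded artifact against the oid/size of its LFS pointer.
# The local filename is an assumption; the expected values are taken from the pointer above.
import hashlib
from pathlib import Path

path = Path("training_args.bin")  # assumed local download location
expected_oid = "60e8c24f4f67b17b31bdbe1b4bf24f198c5753f44b8877e12ee80937434ab48a"
expected_size = 5176

data = path.read_bytes()
assert len(data) == expected_size, f"size mismatch: {len(data)} != {expected_size}"
assert hashlib.sha256(data).hexdigest() == expected_oid, "sha256 mismatch"
print("training_args.bin matches its LFS pointer")
```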