fydhfzh committed
Commit e4c6798
1 Parent(s): 7a31a98

End of training
README.md CHANGED
@@ -20,12 +20,12 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 3.0783
- - Accuracy: 0.2075
- - Precision: 0.1563
- - Recall: 0.2075
- - F1: 0.1504
- - Binary: 0.4396
+ - Loss: 0.5264
+ - Accuracy: 0.8625
+ - Precision: 0.8662
+ - Recall: 0.8625
+ - F1: 0.8517
+ - Binary: 0.9030
 
 ## Model description
 
@@ -44,7 +44,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - learning_rate: 1e-05
+ - learning_rate: 0.0001
 - train_batch_size: 32
 - eval_batch_size: 32
 - seed: 42
@@ -59,58 +59,58 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Binary |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|:------:|
- | No log | 0.19 | 50 | 4.4148 | 0.0243 | 0.0244 | 0.0243 | 0.0114 | 0.1755 |
- | No log | 0.38 | 100 | 4.3445 | 0.0404 | 0.0101 | 0.0404 | 0.0123 | 0.2865 |
- | No log | 0.58 | 150 | 4.2100 | 0.0350 | 0.0154 | 0.0350 | 0.0080 | 0.3043 |
- | No log | 0.77 | 200 | 4.1002 | 0.0350 | 0.0185 | 0.0350 | 0.0088 | 0.3140 |
- | No log | 0.96 | 250 | 4.0141 | 0.0512 | 0.0295 | 0.0512 | 0.0232 | 0.3253 |
- | No log | 1.15 | 300 | 3.9438 | 0.0593 | 0.0321 | 0.0593 | 0.0293 | 0.3318 |
- | No log | 1.34 | 350 | 3.8728 | 0.0647 | 0.0290 | 0.0647 | 0.0281 | 0.3372 |
- | No log | 1.53 | 400 | 3.8297 | 0.0755 | 0.0242 | 0.0755 | 0.0331 | 0.3423 |
- | No log | 1.73 | 450 | 3.7627 | 0.0620 | 0.0186 | 0.0620 | 0.0263 | 0.3385 |
- | 4.147 | 1.92 | 500 | 3.7166 | 0.0728 | 0.0364 | 0.0728 | 0.0360 | 0.3437 |
- | 4.147 | 2.11 | 550 | 3.6779 | 0.0889 | 0.0425 | 0.0889 | 0.0486 | 0.3558 |
- | 4.147 | 2.3 | 600 | 3.6396 | 0.0755 | 0.0345 | 0.0755 | 0.0407 | 0.3447 |
- | 4.147 | 2.49 | 650 | 3.6005 | 0.0889 | 0.0336 | 0.0889 | 0.0412 | 0.3550 |
- | 4.147 | 2.68 | 700 | 3.5602 | 0.0970 | 0.0314 | 0.0970 | 0.0420 | 0.3631 |
- | 4.147 | 2.88 | 750 | 3.5309 | 0.0997 | 0.0473 | 0.0997 | 0.0507 | 0.3642 |
- | 4.147 | 3.07 | 800 | 3.5331 | 0.1051 | 0.0385 | 0.1051 | 0.0490 | 0.3615 |
- | 4.147 | 3.26 | 850 | 3.4774 | 0.1105 | 0.0507 | 0.1105 | 0.0604 | 0.3701 |
- | 4.147 | 3.45 | 900 | 3.4571 | 0.1159 | 0.0568 | 0.1159 | 0.0611 | 0.3730 |
- | 4.147 | 3.64 | 950 | 3.4265 | 0.1132 | 0.0431 | 0.1132 | 0.0582 | 0.3736 |
- | 3.6862 | 3.84 | 1000 | 3.4260 | 0.0970 | 0.0406 | 0.0970 | 0.0502 | 0.3582 |
- | 3.6862 | 4.03 | 1050 | 3.3821 | 0.1105 | 0.0421 | 0.1105 | 0.0542 | 0.3709 |
- | 3.6862 | 4.22 | 1100 | 3.3825 | 0.1186 | 0.0448 | 0.1186 | 0.0578 | 0.3725 |
- | 3.6862 | 4.41 | 1150 | 3.3575 | 0.1213 | 0.0507 | 0.1213 | 0.0634 | 0.3776 |
- | 3.6862 | 4.6 | 1200 | 3.3453 | 0.1267 | 0.0659 | 0.1267 | 0.0653 | 0.3790 |
- | 3.6862 | 4.79 | 1250 | 3.3205 | 0.1321 | 0.0592 | 0.1321 | 0.0736 | 0.3871 |
- | 3.6862 | 4.99 | 1300 | 3.2912 | 0.1294 | 0.0552 | 0.1294 | 0.0724 | 0.3868 |
- | 3.6862 | 5.18 | 1350 | 3.2741 | 0.1536 | 0.0731 | 0.1536 | 0.0880 | 0.4022 |
- | 3.6862 | 5.37 | 1400 | 3.2767 | 0.1509 | 0.0723 | 0.1509 | 0.0893 | 0.3978 |
- | 3.6862 | 5.56 | 1450 | 3.2485 | 0.1509 | 0.0743 | 0.1509 | 0.0907 | 0.4003 |
- | 3.4619 | 5.75 | 1500 | 3.2421 | 0.1509 | 0.0783 | 0.1509 | 0.0855 | 0.4003 |
- | 3.4619 | 5.94 | 1550 | 3.2366 | 0.1375 | 0.0686 | 0.1375 | 0.0754 | 0.3892 |
- | 3.4619 | 6.14 | 1600 | 3.2102 | 0.1456 | 0.0959 | 0.1456 | 0.0862 | 0.3965 |
- | 3.4619 | 6.33 | 1650 | 3.1962 | 0.1456 | 0.0688 | 0.1456 | 0.0858 | 0.3957 |
- | 3.4619 | 6.52 | 1700 | 3.1917 | 0.1590 | 0.1160 | 0.1590 | 0.0994 | 0.4035 |
- | 3.4619 | 6.71 | 1750 | 3.1746 | 0.1590 | 0.0922 | 0.1590 | 0.0978 | 0.4051 |
- | 3.4619 | 6.9 | 1800 | 3.1791 | 0.1590 | 0.0671 | 0.1590 | 0.0863 | 0.4059 |
- | 3.4619 | 7.09 | 1850 | 3.1714 | 0.1725 | 0.0952 | 0.1725 | 0.1028 | 0.4135 |
- | 3.4619 | 7.29 | 1900 | 3.1427 | 0.1725 | 0.1084 | 0.1725 | 0.1090 | 0.4194 |
- | 3.4619 | 7.48 | 1950 | 3.1410 | 0.1833 | 0.1313 | 0.1833 | 0.1221 | 0.4226 |
- | 3.3361 | 7.67 | 2000 | 3.1334 | 0.1806 | 0.1385 | 0.1806 | 0.1239 | 0.4197 |
- | 3.3361 | 7.86 | 2050 | 3.1246 | 0.1806 | 0.1474 | 0.1806 | 0.1193 | 0.4208 |
- | 3.3361 | 8.05 | 2100 | 3.1151 | 0.1995 | 0.1582 | 0.1995 | 0.1388 | 0.4332 |
- | 3.3361 | 8.25 | 2150 | 3.1085 | 0.2049 | 0.1578 | 0.2049 | 0.1439 | 0.4377 |
- | 3.3361 | 8.44 | 2200 | 3.0897 | 0.2102 | 0.1546 | 0.2102 | 0.1483 | 0.4445 |
- | 3.3361 | 8.63 | 2250 | 3.0934 | 0.2210 | 0.1511 | 0.2210 | 0.1541 | 0.4469 |
- | 3.3361 | 8.82 | 2300 | 3.0906 | 0.2102 | 0.1625 | 0.2102 | 0.1535 | 0.4394 |
- | 3.3361 | 9.01 | 2350 | 3.0792 | 0.2129 | 0.1586 | 0.2129 | 0.1573 | 0.4437 |
- | 3.3361 | 9.2 | 2400 | 3.0849 | 0.2049 | 0.1442 | 0.2049 | 0.1446 | 0.4358 |
- | 3.3361 | 9.4 | 2450 | 3.0794 | 0.2102 | 0.1576 | 0.2102 | 0.1532 | 0.4396 |
- | 3.2647 | 9.59 | 2500 | 3.0801 | 0.2129 | 0.1560 | 0.2129 | 0.1552 | 0.4415 |
- | 3.2647 | 9.78 | 2550 | 3.0823 | 0.2075 | 0.1669 | 0.2075 | 0.1521 | 0.4396 |
- | 3.2647 | 9.97 | 2600 | 3.0783 | 0.2075 | 0.1563 | 0.2075 | 0.1504 | 0.4396 |
+ | No log | 0.19 | 50 | 3.8945 | 0.0566 | 0.0077 | 0.0566 | 0.0127 | 0.3326 |
+ | No log | 0.38 | 100 | 3.4610 | 0.0701 | 0.0208 | 0.0701 | 0.0174 | 0.3418 |
+ | No log | 0.58 | 150 | 3.2223 | 0.1051 | 0.0294 | 0.1051 | 0.0364 | 0.3720 |
+ | No log | 0.77 | 200 | 3.1153 | 0.1294 | 0.0504 | 0.1294 | 0.0565 | 0.3795 |
+ | No log | 0.96 | 250 | 2.8292 | 0.1914 | 0.1010 | 0.1914 | 0.1073 | 0.4315 |
+ | No log | 1.15 | 300 | 2.7080 | 0.2264 | 0.1522 | 0.2264 | 0.1461 | 0.4496 |
+ | No log | 1.34 | 350 | 2.4083 | 0.2776 | 0.1986 | 0.2776 | 0.1896 | 0.4935 |
+ | No log | 1.53 | 400 | 2.2517 | 0.3720 | 0.2762 | 0.3720 | 0.2845 | 0.5580 |
+ | No log | 1.73 | 450 | 2.1201 | 0.3908 | 0.3501 | 0.3908 | 0.3098 | 0.5712 |
+ | 3.1927 | 1.92 | 500 | 1.9149 | 0.4582 | 0.3806 | 0.4582 | 0.3781 | 0.6210 |
+ | 3.1927 | 2.11 | 550 | 1.7920 | 0.5013 | 0.4684 | 0.5013 | 0.4456 | 0.6515 |
+ | 3.1927 | 2.3 | 600 | 1.5973 | 0.5418 | 0.4910 | 0.5418 | 0.4765 | 0.6803 |
+ | 3.1927 | 2.49 | 650 | 1.5067 | 0.5957 | 0.5572 | 0.5957 | 0.5409 | 0.7162 |
+ | 3.1927 | 2.68 | 700 | 1.3985 | 0.6253 | 0.6046 | 0.6253 | 0.5740 | 0.7361 |
+ | 3.1927 | 2.88 | 750 | 1.3198 | 0.6604 | 0.6224 | 0.6604 | 0.6114 | 0.7623 |
+ | 3.1927 | 3.07 | 800 | 1.2483 | 0.6685 | 0.6709 | 0.6685 | 0.6273 | 0.7674 |
+ | 3.1927 | 3.26 | 850 | 1.1560 | 0.7116 | 0.7063 | 0.7116 | 0.6710 | 0.7973 |
+ | 3.1927 | 3.45 | 900 | 1.0992 | 0.7197 | 0.7345 | 0.7197 | 0.6872 | 0.8030 |
+ | 3.1927 | 3.64 | 950 | 1.1148 | 0.7143 | 0.7477 | 0.7143 | 0.6918 | 0.7992 |
+ | 2.0117 | 3.84 | 1000 | 0.9688 | 0.7682 | 0.7634 | 0.7682 | 0.7404 | 0.8369 |
+ | 2.0117 | 4.03 | 1050 | 0.9990 | 0.7062 | 0.7148 | 0.7062 | 0.6717 | 0.7927 |
+ | 2.0117 | 4.22 | 1100 | 0.9516 | 0.7412 | 0.7619 | 0.7412 | 0.7229 | 0.8199 |
+ | 2.0117 | 4.41 | 1150 | 0.8740 | 0.7763 | 0.7947 | 0.7763 | 0.7582 | 0.8426 |
+ | 2.0117 | 4.6 | 1200 | 0.8611 | 0.7682 | 0.7800 | 0.7682 | 0.7469 | 0.8388 |
+ | 2.0117 | 4.79 | 1250 | 0.7992 | 0.7898 | 0.8228 | 0.7898 | 0.7775 | 0.8539 |
+ | 2.0117 | 4.99 | 1300 | 0.8161 | 0.7898 | 0.8209 | 0.7898 | 0.7756 | 0.8512 |
+ | 2.0117 | 5.18 | 1350 | 0.7420 | 0.7925 | 0.8144 | 0.7925 | 0.7768 | 0.8539 |
+ | 2.0117 | 5.37 | 1400 | 0.7420 | 0.7925 | 0.8070 | 0.7925 | 0.7712 | 0.8550 |
+ | 2.0117 | 5.56 | 1450 | 0.7126 | 0.8140 | 0.8187 | 0.8140 | 0.8017 | 0.8701 |
+ | 1.5617 | 5.75 | 1500 | 0.6797 | 0.8194 | 0.8436 | 0.8194 | 0.8086 | 0.8739 |
+ | 1.5617 | 5.94 | 1550 | 0.6877 | 0.8221 | 0.8279 | 0.8221 | 0.8028 | 0.8747 |
+ | 1.5617 | 6.14 | 1600 | 0.6547 | 0.8329 | 0.8525 | 0.8329 | 0.8230 | 0.8822 |
+ | 1.5617 | 6.33 | 1650 | 0.5935 | 0.8410 | 0.8589 | 0.8410 | 0.8270 | 0.8879 |
+ | 1.5617 | 6.52 | 1700 | 0.6423 | 0.8194 | 0.8255 | 0.8194 | 0.8052 | 0.8728 |
+ | 1.5617 | 6.71 | 1750 | 0.5980 | 0.8464 | 0.8610 | 0.8464 | 0.8322 | 0.8916 |
+ | 1.5617 | 6.9 | 1800 | 0.6111 | 0.8437 | 0.8543 | 0.8437 | 0.8287 | 0.8916 |
+ | 1.5617 | 7.09 | 1850 | 0.5835 | 0.8437 | 0.8588 | 0.8437 | 0.8336 | 0.8927 |
+ | 1.5617 | 7.29 | 1900 | 0.5804 | 0.8329 | 0.8461 | 0.8329 | 0.8210 | 0.8822 |
+ | 1.5617 | 7.48 | 1950 | 0.5711 | 0.8410 | 0.8580 | 0.8410 | 0.8290 | 0.8908 |
+ | 1.3255 | 7.67 | 2000 | 0.5468 | 0.8571 | 0.8633 | 0.8571 | 0.8457 | 0.9011 |
+ | 1.3255 | 7.86 | 2050 | 0.5384 | 0.8652 | 0.8720 | 0.8652 | 0.8553 | 0.9049 |
+ | 1.3255 | 8.05 | 2100 | 0.5673 | 0.8625 | 0.8684 | 0.8625 | 0.8547 | 0.9030 |
+ | 1.3255 | 8.25 | 2150 | 0.5450 | 0.8491 | 0.8582 | 0.8491 | 0.8403 | 0.8935 |
+ | 1.3255 | 8.44 | 2200 | 0.5278 | 0.8706 | 0.8770 | 0.8706 | 0.8630 | 0.9086 |
+ | 1.3255 | 8.63 | 2250 | 0.5339 | 0.8652 | 0.8692 | 0.8652 | 0.8542 | 0.9049 |
+ | 1.3255 | 8.82 | 2300 | 0.5469 | 0.8598 | 0.8648 | 0.8598 | 0.8489 | 0.9011 |
+ | 1.3255 | 9.01 | 2350 | 0.5404 | 0.8706 | 0.8747 | 0.8706 | 0.8602 | 0.9086 |
+ | 1.3255 | 9.2 | 2400 | 0.5455 | 0.8491 | 0.8565 | 0.8491 | 0.8378 | 0.8935 |
+ | 1.3255 | 9.4 | 2450 | 0.5317 | 0.8598 | 0.8664 | 0.8598 | 0.8479 | 0.9011 |
+ | 1.1934 | 9.59 | 2500 | 0.5227 | 0.8760 | 0.8798 | 0.8760 | 0.8657 | 0.9124 |
+ | 1.1934 | 9.78 | 2550 | 0.5278 | 0.8598 | 0.8653 | 0.8598 | 0.8481 | 0.9011 |
+ | 1.1934 | 9.97 | 2600 | 0.5264 | 0.8625 | 0.8662 | 0.8625 | 0.8517 | 0.9030 |
 
 
 ### Framework versions
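In both the old and new tables, Recall equals Accuracy at every step, which is the signature of support-weighted averaging (the card does not state the averaging mode, so this is an inference). A minimal pure-Python sketch of that metric computation, using hypothetical label arrays in place of the actual evaluation set:

```python
from collections import Counter

def weighted_metrics(y_true, y_pred):
    """Accuracy plus support-weighted precision/recall/F1.

    With support weighting, recall reduces to plain accuracy, which is
    why the Recall and Accuracy columns match at every step.
    """
    n = len(y_true)
    support = Counter(y_true)
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / n
    prec = rec = f1 = 0.0
    for c in sorted(set(y_true)):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        pred_c = sum(1 for p in y_pred if p == c)
        p_c = tp / pred_c if pred_c else 0.0   # per-class precision
        r_c = tp / support[c]                  # per-class recall
        f_c = 2 * p_c * r_c / (p_c + r_c) if (p_c + r_c) else 0.0
        w = support[c] / n                     # support weight
        prec += w * p_c
        rec += w * r_c
        f1 += w * f_c
    return acc, prec, rec, f1

# Hypothetical labels standing in for the evaluation set.
y_true = [0, 0, 1, 1, 2, 2, 2, 3]
y_pred = [0, 1, 1, 1, 2, 0, 2, 3]
acc, prec, rec, f1 = weighted_metrics(y_true, y_pred)  # rec == acc
```

In practice the same numbers come from `sklearn.metrics.precision_recall_fscore_support` with `average="weighted"`; the sketch above just makes the collapse of weighted recall into accuracy explicit.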
runs/Jul10_22-50-00_LAPTOP-1GID9RGH/events.out.tfevents.1720626601.LAPTOP-1GID9RGH.24508.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:641f25ba0553a1aa635d2fe7b25ad826d990d1026c995c40d11e14998efe1a39
- size 36171
+ oid sha256:20c4cc0747056d6c0a9c739c378829b1994658191a93c1dcea74eb694e71d5dd
+ size 37569
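The tfevents change above rewrites only a Git LFS pointer stub: the binary log itself lives in LFS storage, while Git tracks this three-line text file (`version`, `oid <algo>:<hex>`, `size`). A minimal sketch of parsing that pointer format:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its version, hash, and size."""
    # Each line is "<key> <value>"; split on the first space only.
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, _, digest = fields["oid"].partition(":")
    return {
        "version": fields["version"],
        "algo": algo,
        "oid": digest,
        "size": int(fields["size"]),
    }

# The new pointer from the diff above.
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:20c4cc0747056d6c0a9c739c378829b1994658191a93c1dcea74eb694e71d5dd\n"
    "size 37569\n"
)
info = parse_lfs_pointer(pointer)  # info["size"] == 37569
```

The `size` field is the byte count of the real object, which is why the diff shows it growing from 36171 to 37569 as more training events were logged.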