wav2vec2-5Class-train-test-finetune-Medium

This model is a fine-tuned version of anderloh/Hugginhface-master-wav2vec-pretreined-5-class-train-test on the anderloh/Master5Class dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7855
  • Accuracy: 0.7448

Model description

More information needed

Intended uses & limitations

More information needed
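
Pending fuller documentation, the task shape can still be sketched: a wav2vec 2.0 encoder with a 5-way classification head over raw 16 kHz mono audio. The sketch below uses a tiny randomly initialised config (all sizes are illustrative assumptions, not this model's actual dimensions) so it runs offline; for real predictions, load the fine-tuned checkpoint with `AutoModelForAudioClassification.from_pretrained` instead.

```python
# Offline sketch of the 5-class wav2vec 2.0 audio-classification setup this
# card describes. The config sizes below are placeholders chosen to keep the
# model tiny; they are NOT the dimensions of this checkpoint.
import torch
from transformers import Wav2Vec2Config, Wav2Vec2ForSequenceClassification

config = Wav2Vec2Config(
    num_labels=5,                 # 5 classes, per the Master5Class dataset name
    hidden_size=32,               # tiny sizes: illustration only
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    conv_dim=(32, 32),            # shrunken feature encoder
    conv_stride=(5, 2),
    conv_kernel=(10, 3),
    num_conv_pos_embeddings=16,
    num_conv_pos_embedding_groups=4,
)
model = Wav2Vec2ForSequenceClassification(config)
model.eval()

waveform = torch.zeros(1, 16000)  # placeholder: 1 s of 16 kHz mono audio
with torch.no_grad():
    logits = model(input_values=waveform).logits
print(logits.shape)  # one score per class: torch.Size([1, 5])
```

The same call pattern applies to the real checkpoint; only the config/weights source changes.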

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 0
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 250.0
  • mixed_precision_training: Native AMP
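
Two of the values above are derived rather than independent: the total train batch size of 64 is the per-device batch size times the gradient accumulation steps, and (assuming the 3500 optimizer steps shown in the last row of the results table below) the warmup ratio of 0.1 corresponds to 350 linear-warmup steps.

```python
# Relating the reported hyperparameters; 3500 total steps is taken from the
# final row of the training-results table, not stated explicitly in the card.
train_batch_size = 16
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 64

max_steps = 3500
warmup_ratio = 0.1
warmup_steps = int(max_steps * warmup_ratio)
print(warmup_steps)  # 350
```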

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
1.6852 0.95 14 1.5987 0.3427
1.5721 1.97 29 1.5976 0.3427
1.5696 2.98 44 1.5957 0.3427
1.5671 4.0 59 1.5932 0.3357
1.6757 4.95 73 1.5902 0.3252
1.5595 5.97 88 1.5864 0.3217
1.5536 6.98 103 1.5818 0.3182
1.5484 8.0 118 1.5765 0.3112
1.6506 8.95 132 1.5708 0.3077
1.5317 9.97 147 1.5641 0.3007
1.5226 10.98 162 1.5564 0.2867
1.5116 12.0 177 1.5478 0.2692
1.6046 12.95 191 1.5385 0.2622
1.4822 13.97 206 1.5285 0.2552
1.4614 14.98 221 1.5184 0.2552
1.4396 16.0 236 1.5101 0.2517
1.5047 16.95 250 1.5104 0.2448
1.3741 17.97 265 1.5250 0.2517
1.3512 18.98 280 1.5329 0.2657
1.3286 20.0 295 1.5095 0.3252
1.3967 20.95 309 1.4829 0.3497
1.2779 21.97 324 1.4598 0.3846
1.2449 22.98 339 1.4284 0.4161
1.2118 24.0 354 1.4162 0.4231
1.2521 24.95 368 1.3798 0.4476
1.1183 25.97 383 1.3246 0.4790
1.0778 26.98 398 1.2761 0.4930
1.0306 28.0 413 1.2300 0.5105
1.0808 28.95 427 1.2064 0.5140
0.9767 29.97 442 1.1670 0.5524
0.9589 30.98 457 1.1264 0.5734
0.9193 32.0 472 1.1013 0.5874
0.9462 32.95 486 1.0736 0.6049
0.85 33.97 501 1.0628 0.6049
0.8294 34.98 516 1.0473 0.6189
0.8025 36.0 531 1.0030 0.6329
0.8206 36.95 545 0.9964 0.6399
0.7541 37.97 560 0.9605 0.6608
0.7413 38.98 575 0.9467 0.6643
0.709 40.0 590 0.9348 0.6678
0.7817 40.95 604 0.9366 0.6678
0.7034 41.97 619 0.9109 0.6818
0.6856 42.98 634 0.9277 0.6573
0.6625 44.0 649 0.8980 0.6783
0.7207 44.95 663 0.9050 0.6713
0.6684 45.97 678 0.8973 0.6748
0.6651 46.98 693 0.8935 0.6783
0.6451 48.0 708 0.8945 0.6748
0.6774 48.95 722 0.8879 0.6818
0.6308 49.97 737 0.8883 0.6818
0.6199 50.98 752 0.8826 0.6818
0.6379 52.0 767 0.8582 0.6923
0.6588 52.95 781 0.8825 0.6818
0.5857 53.97 796 0.8808 0.6748
0.6076 54.98 811 0.8555 0.6958
0.5934 56.0 826 0.8654 0.6888
0.6427 56.95 840 0.8616 0.6853
0.5782 57.97 855 0.8711 0.6678
0.5819 58.98 870 0.8689 0.6748
0.5918 60.0 885 0.8602 0.6923
0.5845 60.95 899 0.8459 0.6993
0.5667 61.97 914 0.8467 0.7028
0.5327 62.98 929 0.8540 0.6923
0.523 64.0 944 0.8323 0.7063
0.548 64.95 958 0.8407 0.6993
0.5399 65.97 973 0.8379 0.6993
0.5324 66.98 988 0.8119 0.7028
0.5171 68.0 1003 0.8445 0.6923
0.538 68.95 1017 0.8196 0.7098
0.5312 69.97 1032 0.8415 0.6853
0.4914 70.98 1047 0.8184 0.6958
0.5055 72.0 1062 0.8218 0.6923
0.5401 72.95 1076 0.8160 0.7028
0.4966 73.97 1091 0.8238 0.6888
0.4768 74.98 1106 0.8185 0.6993
0.4789 76.0 1121 0.8261 0.7028
0.5176 76.95 1135 0.8110 0.7098
0.466 77.97 1150 0.8141 0.6993
0.4736 78.98 1165 0.7970 0.7168
0.4785 80.0 1180 0.8062 0.7098
0.5309 80.95 1194 0.8051 0.6958
0.4571 81.97 1209 0.8024 0.7098
0.47 82.98 1224 0.8031 0.7168
0.4525 84.0 1239 0.7911 0.7133
0.5058 84.95 1253 0.7877 0.7133
0.4627 85.97 1268 0.7918 0.7063
0.4343 86.98 1283 0.7882 0.7168
0.4442 88.0 1298 0.8058 0.7133
0.4745 88.95 1312 0.7810 0.7238
0.4282 89.97 1327 0.7951 0.7098
0.4307 90.98 1342 0.7739 0.7168
0.4403 92.0 1357 0.7788 0.7203
0.4567 92.95 1371 0.7927 0.7168
0.4233 93.97 1386 0.7885 0.7203
0.4347 94.98 1401 0.7849 0.7203
0.4167 96.0 1416 0.7880 0.7238
0.4394 96.95 1430 0.7889 0.7203
0.4359 97.97 1445 0.7785 0.7203
0.4085 98.98 1460 0.7852 0.7133
0.3965 100.0 1475 0.7785 0.7273
0.445 100.95 1489 0.7826 0.7203
0.3988 101.97 1504 0.8045 0.7098
0.4129 102.98 1519 0.7686 0.7273
0.3937 104.0 1534 0.7912 0.7133
0.4356 104.95 1548 0.7922 0.7133
0.3969 105.97 1563 0.7752 0.7203
0.4051 106.98 1578 0.7917 0.7133
0.3982 108.0 1593 0.7917 0.7098
0.4117 108.95 1607 0.8071 0.7063
0.3666 109.97 1622 0.7840 0.7203
0.3894 110.98 1637 0.7790 0.7238
0.3858 112.0 1652 0.7961 0.7098
0.4037 112.95 1666 0.7822 0.7203
0.3886 113.97 1681 0.7748 0.7238
0.3762 114.98 1696 0.7782 0.7168
0.3444 116.0 1711 0.7746 0.7308
0.3961 116.95 1725 0.7842 0.7203
0.3578 117.97 1740 0.7819 0.7203
0.3578 118.98 1755 0.7806 0.7203
0.3489 120.0 1770 0.7809 0.7238
0.3622 120.95 1784 0.7947 0.7098
0.3545 121.97 1799 0.7878 0.7168
0.3361 122.98 1814 0.7855 0.7168
0.3618 124.0 1829 0.7890 0.7133
0.3472 124.95 1843 0.7810 0.7168
0.3511 125.97 1858 0.7897 0.7133
0.3389 126.98 1873 0.7923 0.7133
0.3391 128.0 1888 0.7782 0.7273
0.3746 128.95 1902 0.7838 0.7203
0.3238 129.97 1917 0.7943 0.7168
0.3601 130.98 1932 0.7863 0.7168
0.3339 132.0 1947 0.7949 0.7133
0.3805 132.95 1961 0.7823 0.7238
0.3524 133.97 1976 0.8052 0.7098
0.3103 134.98 1991 0.7809 0.7238
0.3484 136.0 2006 0.7879 0.7203
0.3424 136.95 2020 0.7875 0.7273
0.316 137.97 2035 0.7829 0.7273
0.3171 138.98 2050 0.7882 0.7203
0.3155 140.0 2065 0.7830 0.7168
0.3382 140.95 2079 0.7826 0.7273
0.3175 141.97 2094 0.7964 0.7203
0.3444 142.98 2109 0.7859 0.7238
0.3208 144.0 2124 0.7860 0.7273
0.3286 144.95 2138 0.7869 0.7273
0.3319 145.97 2153 0.7916 0.7168
0.2954 146.98 2168 0.7938 0.7238
0.3283 148.0 2183 0.7974 0.7168
0.3306 148.95 2197 0.7795 0.7308
0.3073 149.97 2212 0.7910 0.7203
0.3089 150.98 2227 0.7942 0.7203
0.2915 152.0 2242 0.7934 0.7168
0.3286 152.95 2256 0.7808 0.7308
0.2817 153.97 2271 0.7788 0.7308
0.3118 154.98 2286 0.7898 0.7273
0.3155 156.0 2301 0.7966 0.7203
0.3156 156.95 2315 0.7947 0.7203
0.2936 157.97 2330 0.7917 0.7168
0.3049 158.98 2345 0.7780 0.7308
0.2896 160.0 2360 0.7926 0.7273
0.3194 160.95 2374 0.8023 0.7203
0.2918 161.97 2389 0.7933 0.7273
0.2992 162.98 2404 0.7829 0.7413
0.3 164.0 2419 0.7946 0.7203
0.322 164.95 2433 0.7969 0.7273
0.2994 165.97 2448 0.8076 0.7238
0.2849 166.98 2463 0.7951 0.7308
0.2745 168.0 2478 0.7892 0.7343
0.2974 168.95 2492 0.7916 0.7308
0.2656 169.97 2507 0.7995 0.7343
0.295 170.98 2522 0.8026 0.7238
0.2791 172.0 2537 0.7973 0.7343
0.2836 172.95 2551 0.8023 0.7308
0.2806 173.97 2566 0.8013 0.7308
0.2661 174.98 2581 0.7965 0.7308
0.2695 176.0 2596 0.8063 0.7273
0.286 176.95 2610 0.7963 0.7238
0.2743 177.97 2625 0.7929 0.7413
0.2775 178.98 2640 0.7855 0.7448
0.2878 180.0 2655 0.7894 0.7378
0.2757 180.95 2669 0.8013 0.7273
0.3067 181.97 2684 0.8015 0.7308
0.2412 182.98 2699 0.7975 0.7273
0.2686 184.0 2714 0.8037 0.7238
0.3176 184.95 2728 0.8017 0.7308
0.269 185.97 2743 0.8060 0.7308
0.2668 186.98 2758 0.8030 0.7308
0.2761 188.0 2773 0.8007 0.7413
0.2731 188.95 2787 0.8039 0.7343
0.2678 189.97 2802 0.8054 0.7308
0.2686 190.98 2817 0.8042 0.7343
0.2721 192.0 2832 0.8063 0.7308
0.3111 192.95 2846 0.8075 0.7378
0.2525 193.97 2861 0.8114 0.7273
0.2589 194.98 2876 0.8071 0.7413
0.2597 196.0 2891 0.8161 0.7378
0.3034 196.95 2905 0.8163 0.7378
0.2721 197.97 2920 0.8151 0.7308
0.2783 198.98 2935 0.8093 0.7448
0.259 200.0 2950 0.8026 0.7413
0.2896 200.95 2964 0.8070 0.7308
0.2584 201.97 2979 0.8113 0.7308
0.2672 202.98 2994 0.8096 0.7343
0.2622 204.0 3009 0.8060 0.7413
0.2677 204.95 3023 0.8084 0.7343
0.263 205.97 3038 0.8010 0.7378
0.2608 206.98 3053 0.7989 0.7448
0.2528 208.0 3068 0.7954 0.7448
0.2553 208.95 3082 0.7965 0.7413
0.2652 209.97 3097 0.7995 0.7413
0.246 210.98 3112 0.8026 0.7378
0.2665 212.0 3127 0.8049 0.7378
0.2731 212.95 3141 0.8052 0.7378
0.2539 213.97 3156 0.8089 0.7378
0.2376 214.98 3171 0.8116 0.7378
0.2667 216.0 3186 0.8099 0.7413
0.2768 216.95 3200 0.8079 0.7413
0.2551 217.97 3215 0.8065 0.7448
0.256 218.98 3230 0.8080 0.7413
0.2387 220.0 3245 0.8076 0.7448
0.2736 220.95 3259 0.8081 0.7448
0.2551 221.97 3274 0.8092 0.7448
0.2482 222.98 3289 0.8085 0.7448
0.2432 224.0 3304 0.8085 0.7448
0.2857 224.95 3318 0.8079 0.7448
0.2434 225.97 3333 0.8079 0.7448
0.2531 226.98 3348 0.8078 0.7448
0.222 228.0 3363 0.8093 0.7448
0.2651 228.95 3377 0.8105 0.7448
0.2886 229.97 3392 0.8116 0.7448
0.2534 230.98 3407 0.8116 0.7448
0.2483 232.0 3422 0.8116 0.7448
0.272 232.95 3436 0.8118 0.7448
0.2562 233.97 3451 0.8120 0.7448
0.2531 234.98 3466 0.8122 0.7448
0.2397 236.0 3481 0.8123 0.7448
0.2573 236.95 3495 0.8123 0.7448
0.2023 237.29 3500 0.8122 0.7448

Framework versions

  • Transformers 4.39.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2