HuggingfaceTest

This model is a fine-tuned version of anderloh/Hugginhface-master-wav2vec-pretreined-5-class-train-test on the anderloh/Master5Class dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 0.8156
  • Accuracy: 0.7028
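
For quick testing, the checkpoint can be loaded with the Transformers audio-classification pipeline. This is a minimal sketch, not from the card itself: the repo id `anderloh/HuggingfaceTest` is inferred from the card title, and 16 kHz mono input is assumed, as is typical for wav2vec 2.0 checkpoints.

```python
# Minimal inference sketch (assumptions: repo id inferred from the card title,
# 16 kHz mono audio input as is typical for wav2vec 2.0 checkpoints).
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="anderloh/HuggingfaceTest",
)

# "example.wav" is a placeholder path to a local audio file.
predictions = classifier("example.wav")
print(predictions)  # [{"label": ..., "score": ...}, ...] for the 5 classes
```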

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code sketch reconstructing them follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 0
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 512
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 350.0
  • mixed_precision_training: Native AMP
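
The list above maps onto Transformers `TrainingArguments` roughly as follows. This is a sketch under stated assumptions: `output_dir` is a placeholder, `fp16=True` stands in for "Native AMP", and the actual training script is not published. Note that total_train_batch_size = 128 × 4 (gradient accumulation) = 512.

```python
# Rough reconstruction of the hyperparameters above as TrainingArguments
# (Transformers 4.39-era API). Effective batch size per optimizer step:
# 128 (train_batch_size) * 4 (gradient_accumulation_steps) = 512.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-5class",    # placeholder output directory
    learning_rate=3e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=0,
    gradient_accumulation_steps=4,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=350.0,
    fp16=True,                       # "Native AMP" mixed precision
)
```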

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.92 | 3 | 1.5989 | 0.3427 |
| No log | 1.85 | 6 | 1.5988 | 0.3427 |
| No log | 2.77 | 9 | 1.5986 | 0.3427 |
| No log | 4.0 | 13 | 1.5981 | 0.3427 |
| No log | 4.92 | 16 | 1.5976 | 0.3357 |
| No log | 5.85 | 19 | 1.5970 | 0.3427 |
| No log | 6.77 | 22 | 1.5963 | 0.3392 |
| No log | 8.0 | 26 | 1.5953 | 0.3357 |
| No log | 8.92 | 29 | 1.5943 | 0.3287 |
| No log | 9.85 | 32 | 1.5933 | 0.3287 |
| No log | 10.77 | 35 | 1.5922 | 0.3217 |
| No log | 12.0 | 39 | 1.5906 | 0.3182 |
| No log | 12.92 | 42 | 1.5892 | 0.3147 |
| No log | 13.85 | 45 | 1.5877 | 0.3007 |
| No log | 14.77 | 48 | 1.5862 | 0.2937 |
| 1.5907 | 16.0 | 52 | 1.5841 | 0.2972 |
| 1.5907 | 16.92 | 55 | 1.5824 | 0.2832 |
| 1.5907 | 17.85 | 58 | 1.5806 | 0.2797 |
| 1.5907 | 18.77 | 61 | 1.5788 | 0.2692 |
| 1.5907 | 20.0 | 65 | 1.5762 | 0.2692 |
| 1.5907 | 20.92 | 68 | 1.5740 | 0.2657 |
| 1.5907 | 21.85 | 71 | 1.5717 | 0.2552 |
| 1.5907 | 22.77 | 74 | 1.5694 | 0.2517 |
| 1.5907 | 24.0 | 78 | 1.5661 | 0.2378 |
| 1.5907 | 24.92 | 81 | 1.5635 | 0.2343 |
| 1.5907 | 25.85 | 84 | 1.5608 | 0.2238 |
| 1.5907 | 26.77 | 87 | 1.5581 | 0.2238 |
| 1.5907 | 28.0 | 91 | 1.5542 | 0.2273 |
| 1.5907 | 28.92 | 94 | 1.5511 | 0.2273 |
| 1.5907 | 29.85 | 97 | 1.5479 | 0.2273 |
| 1.5431 | 30.77 | 100 | 1.5448 | 0.2273 |
| 1.5431 | 32.0 | 104 | 1.5408 | 0.2273 |
| 1.5431 | 32.92 | 107 | 1.5380 | 0.2273 |
| 1.5431 | 33.85 | 110 | 1.5359 | 0.2273 |
| 1.5431 | 34.77 | 113 | 1.5345 | 0.2273 |
| 1.5431 | 36.0 | 117 | 1.5335 | 0.2273 |
| 1.5431 | 36.92 | 120 | 1.5341 | 0.2273 |
| 1.5431 | 37.85 | 123 | 1.5361 | 0.2273 |
| 1.5431 | 38.77 | 126 | 1.5397 | 0.2273 |
| 1.5431 | 40.0 | 130 | 1.5479 | 0.2273 |
| 1.5431 | 40.92 | 133 | 1.5564 | 0.2273 |
| 1.5431 | 41.85 | 136 | 1.5679 | 0.2273 |
| 1.5431 | 42.77 | 139 | 1.5822 | 0.2273 |
| 1.5431 | 44.0 | 143 | 1.6002 | 0.2273 |
| 1.5431 | 44.92 | 146 | 1.6109 | 0.2273 |
| 1.5431 | 45.85 | 149 | 1.6146 | 0.2273 |
| 1.4033 | 46.77 | 152 | 1.6131 | 0.2273 |
| 1.4033 | 48.0 | 156 | 1.6008 | 0.2273 |
| 1.4033 | 48.92 | 159 | 1.5862 | 0.2413 |
| 1.4033 | 49.85 | 162 | 1.5726 | 0.2692 |
| 1.4033 | 50.77 | 165 | 1.5599 | 0.2692 |
| 1.4033 | 52.0 | 169 | 1.5459 | 0.2867 |
| 1.4033 | 52.92 | 172 | 1.5383 | 0.2937 |
| 1.4033 | 53.85 | 175 | 1.5311 | 0.3147 |
| 1.4033 | 54.77 | 178 | 1.5242 | 0.3252 |
| 1.4033 | 56.0 | 182 | 1.5169 | 0.3357 |
| 1.4033 | 56.92 | 185 | 1.5103 | 0.3427 |
| 1.4033 | 57.85 | 188 | 1.5056 | 0.3462 |
| 1.4033 | 58.77 | 191 | 1.4995 | 0.3462 |
| 1.4033 | 60.0 | 195 | 1.4939 | 0.3497 |
| 1.4033 | 60.92 | 198 | 1.4870 | 0.3601 |
| 1.2485 | 61.85 | 201 | 1.4829 | 0.3671 |
| 1.2485 | 62.77 | 204 | 1.4735 | 0.3741 |
| 1.2485 | 64.0 | 208 | 1.4612 | 0.3811 |
| 1.2485 | 64.92 | 211 | 1.4492 | 0.3986 |
| 1.2485 | 65.85 | 214 | 1.4365 | 0.4126 |
| 1.2485 | 66.77 | 217 | 1.4227 | 0.4231 |
| 1.2485 | 68.0 | 221 | 1.4096 | 0.4336 |
| 1.2485 | 68.92 | 224 | 1.4010 | 0.4371 |
| 1.2485 | 69.85 | 227 | 1.3950 | 0.4406 |
| 1.2485 | 70.77 | 230 | 1.3920 | 0.4371 |
| 1.2485 | 72.0 | 234 | 1.3799 | 0.4406 |
| 1.2485 | 72.92 | 237 | 1.3669 | 0.4476 |
| 1.2485 | 73.85 | 240 | 1.3515 | 0.4545 |
| 1.2485 | 74.77 | 243 | 1.3401 | 0.4720 |
| 1.2485 | 76.0 | 247 | 1.3286 | 0.4825 |
| 1.1198 | 76.92 | 250 | 1.3175 | 0.4860 |
| 1.1198 | 77.85 | 253 | 1.3067 | 0.4895 |
| 1.1198 | 78.77 | 256 | 1.3013 | 0.4825 |
| 1.1198 | 80.0 | 260 | 1.2954 | 0.4790 |
| 1.1198 | 80.92 | 263 | 1.2897 | 0.4860 |
| 1.1198 | 81.85 | 266 | 1.2832 | 0.4860 |
| 1.1198 | 82.77 | 269 | 1.2712 | 0.4825 |
| 1.1198 | 84.0 | 273 | 1.2584 | 0.4930 |
| 1.1198 | 84.92 | 276 | 1.2516 | 0.4965 |
| 1.1198 | 85.85 | 279 | 1.2456 | 0.5 |
| 1.1198 | 86.77 | 282 | 1.2444 | 0.5105 |
| 1.1198 | 88.0 | 286 | 1.2373 | 0.5105 |
| 1.1198 | 88.92 | 289 | 1.2309 | 0.5140 |
| 1.1198 | 89.85 | 292 | 1.2219 | 0.5210 |
| 1.1198 | 90.77 | 295 | 1.2145 | 0.5210 |
| 1.1198 | 92.0 | 299 | 1.2054 | 0.5280 |
| 0.9915 | 92.92 | 302 | 1.1982 | 0.5350 |
| 0.9915 | 93.85 | 305 | 1.1913 | 0.5385 |
| 0.9915 | 94.77 | 308 | 1.1859 | 0.5455 |
| 0.9915 | 96.0 | 312 | 1.1794 | 0.5490 |
| 0.9915 | 96.92 | 315 | 1.1734 | 0.5455 |
| 0.9915 | 97.85 | 318 | 1.1638 | 0.5524 |
| 0.9915 | 98.77 | 321 | 1.1550 | 0.5524 |
| 0.9915 | 100.0 | 325 | 1.1465 | 0.5490 |
| 0.9915 | 100.92 | 328 | 1.1444 | 0.5594 |
| 0.9915 | 101.85 | 331 | 1.1359 | 0.5629 |
| 0.9915 | 102.77 | 334 | 1.1271 | 0.5664 |
| 0.9915 | 104.0 | 338 | 1.1090 | 0.5769 |
| 0.9915 | 104.92 | 341 | 1.0972 | 0.5944 |
| 0.9915 | 105.85 | 344 | 1.0901 | 0.6014 |
| 0.9915 | 106.77 | 347 | 1.0809 | 0.6084 |
| 0.8834 | 108.0 | 351 | 1.0683 | 0.6119 |
| 0.8834 | 108.92 | 354 | 1.0605 | 0.6224 |
| 0.8834 | 109.85 | 357 | 1.0563 | 0.6259 |
| 0.8834 | 110.77 | 360 | 1.0538 | 0.6224 |
| 0.8834 | 112.0 | 364 | 1.0491 | 0.6154 |
| 0.8834 | 112.92 | 367 | 1.0441 | 0.6119 |
| 0.8834 | 113.85 | 370 | 1.0358 | 0.6119 |
| 0.8834 | 114.77 | 373 | 1.0194 | 0.6224 |
| 0.8834 | 116.0 | 377 | 1.0034 | 0.6294 |
| 0.8834 | 116.92 | 380 | 0.9991 | 0.6259 |
| 0.8834 | 117.85 | 383 | 0.9960 | 0.6259 |
| 0.8834 | 118.77 | 386 | 0.9911 | 0.6294 |
| 0.8834 | 120.0 | 390 | 0.9834 | 0.6434 |
| 0.8834 | 120.92 | 393 | 0.9776 | 0.6434 |
| 0.8834 | 121.85 | 396 | 0.9773 | 0.6434 |
| 0.8834 | 122.77 | 399 | 0.9735 | 0.6434 |
| 0.7786 | 124.0 | 403 | 0.9731 | 0.6399 |
| 0.7786 | 124.92 | 406 | 0.9728 | 0.6434 |
| 0.7786 | 125.85 | 409 | 0.9657 | 0.6573 |
| 0.7786 | 126.77 | 412 | 0.9548 | 0.6573 |
| 0.7786 | 128.0 | 416 | 0.9424 | 0.6643 |
| 0.7786 | 128.92 | 419 | 0.9391 | 0.6678 |
| 0.7786 | 129.85 | 422 | 0.9418 | 0.6678 |
| 0.7786 | 130.77 | 425 | 0.9476 | 0.6608 |
| 0.7786 | 132.0 | 429 | 0.9457 | 0.6643 |
| 0.7786 | 132.92 | 432 | 0.9413 | 0.6643 |
| 0.7786 | 133.85 | 435 | 0.9334 | 0.6678 |
| 0.7786 | 134.77 | 438 | 0.9329 | 0.6678 |
| 0.7786 | 136.0 | 442 | 0.9334 | 0.6713 |
| 0.7786 | 136.92 | 445 | 0.9265 | 0.6713 |
| 0.7786 | 137.85 | 448 | 0.9187 | 0.6713 |
| 0.7133 | 138.77 | 451 | 0.9169 | 0.6678 |
| 0.7133 | 140.0 | 455 | 0.9142 | 0.6713 |
| 0.7133 | 140.92 | 458 | 0.9131 | 0.6713 |
| 0.7133 | 141.85 | 461 | 0.9161 | 0.6783 |
| 0.7133 | 142.77 | 464 | 0.9224 | 0.6678 |
| 0.7133 | 144.0 | 468 | 0.9139 | 0.6748 |
| 0.7133 | 144.92 | 471 | 0.9090 | 0.6748 |
| 0.7133 | 145.85 | 474 | 0.9073 | 0.6713 |
| 0.7133 | 146.77 | 477 | 0.9110 | 0.6608 |
| 0.7133 | 148.0 | 481 | 0.9167 | 0.6573 |
| 0.7133 | 148.92 | 484 | 0.9118 | 0.6643 |
| 0.7133 | 149.85 | 487 | 0.8996 | 0.6713 |
| 0.7133 | 150.77 | 490 | 0.8904 | 0.6748 |
| 0.7133 | 152.0 | 494 | 0.8889 | 0.6748 |
| 0.7133 | 152.92 | 497 | 0.8899 | 0.6713 |
| 0.6674 | 153.85 | 500 | 0.8874 | 0.6748 |
| 0.6674 | 154.77 | 503 | 0.8874 | 0.6748 |
| 0.6674 | 156.0 | 507 | 0.8905 | 0.6748 |
| 0.6674 | 156.92 | 510 | 0.8881 | 0.6783 |
| 0.6674 | 157.85 | 513 | 0.8829 | 0.6748 |
| 0.6674 | 158.77 | 516 | 0.8809 | 0.6783 |
| 0.6674 | 160.0 | 520 | 0.8781 | 0.6783 |
| 0.6674 | 160.92 | 523 | 0.8776 | 0.6818 |
| 0.6674 | 161.85 | 526 | 0.8796 | 0.6783 |
| 0.6674 | 162.77 | 529 | 0.8795 | 0.6818 |
| 0.6674 | 164.0 | 533 | 0.8797 | 0.6783 |
| 0.6674 | 164.92 | 536 | 0.8707 | 0.6783 |
| 0.6674 | 165.85 | 539 | 0.8697 | 0.6783 |
| 0.6674 | 166.77 | 542 | 0.8724 | 0.6783 |
| 0.6674 | 168.0 | 546 | 0.8704 | 0.6748 |
| 0.6674 | 168.92 | 549 | 0.8694 | 0.6748 |
| 0.6305 | 169.85 | 552 | 0.8740 | 0.6748 |
| 0.6305 | 170.77 | 555 | 0.8713 | 0.6748 |
| 0.6305 | 172.0 | 559 | 0.8682 | 0.6783 |
| 0.6305 | 172.92 | 562 | 0.8688 | 0.6783 |
| 0.6305 | 173.85 | 565 | 0.8693 | 0.6818 |
| 0.6305 | 174.77 | 568 | 0.8744 | 0.6783 |
| 0.6305 | 176.0 | 572 | 0.8760 | 0.6783 |
| 0.6305 | 176.92 | 575 | 0.8696 | 0.6853 |
| 0.6305 | 177.85 | 578 | 0.8669 | 0.6853 |
| 0.6305 | 178.77 | 581 | 0.8641 | 0.6853 |
| 0.6305 | 180.0 | 585 | 0.8697 | 0.6713 |
| 0.6305 | 180.92 | 588 | 0.8678 | 0.6748 |
| 0.6305 | 181.85 | 591 | 0.8621 | 0.6818 |
| 0.6305 | 182.77 | 594 | 0.8557 | 0.6888 |
| 0.6305 | 184.0 | 598 | 0.8481 | 0.6888 |
| 0.6095 | 184.92 | 601 | 0.8429 | 0.6888 |
| 0.6095 | 185.85 | 604 | 0.8413 | 0.6888 |
| 0.6095 | 186.77 | 607 | 0.8402 | 0.6923 |
| 0.6095 | 188.0 | 611 | 0.8415 | 0.6888 |
| 0.6095 | 188.92 | 614 | 0.8410 | 0.6923 |
| 0.6095 | 189.85 | 617 | 0.8389 | 0.6853 |
| 0.6095 | 190.77 | 620 | 0.8354 | 0.6853 |
| 0.6095 | 192.0 | 624 | 0.8357 | 0.6888 |
| 0.6095 | 192.92 | 627 | 0.8401 | 0.6958 |
| 0.6095 | 193.85 | 630 | 0.8449 | 0.6958 |
| 0.6095 | 194.77 | 633 | 0.8479 | 0.6958 |
| 0.6095 | 196.0 | 637 | 0.8455 | 0.6923 |
| 0.6095 | 196.92 | 640 | 0.8422 | 0.6923 |
| 0.6095 | 197.85 | 643 | 0.8425 | 0.6923 |
| 0.6095 | 198.77 | 646 | 0.8437 | 0.6923 |
| 0.5908 | 200.0 | 650 | 0.8367 | 0.6958 |
| 0.5908 | 200.92 | 653 | 0.8347 | 0.6993 |
| 0.5908 | 201.85 | 656 | 0.8287 | 0.6958 |
| 0.5908 | 202.77 | 659 | 0.8260 | 0.6923 |
| 0.5908 | 204.0 | 663 | 0.8264 | 0.6958 |
| 0.5908 | 204.92 | 666 | 0.8295 | 0.6958 |
| 0.5908 | 205.85 | 669 | 0.8302 | 0.6923 |
| 0.5908 | 206.77 | 672 | 0.8285 | 0.6923 |
| 0.5908 | 208.0 | 676 | 0.8311 | 0.6923 |
| 0.5908 | 208.92 | 679 | 0.8321 | 0.6923 |
| 0.5908 | 209.85 | 682 | 0.8306 | 0.6923 |
| 0.5908 | 210.77 | 685 | 0.8303 | 0.6923 |
| 0.5908 | 212.0 | 689 | 0.8256 | 0.6993 |
| 0.5908 | 212.92 | 692 | 0.8230 | 0.6958 |
| 0.5908 | 213.85 | 695 | 0.8194 | 0.6958 |
| 0.5908 | 214.77 | 698 | 0.8183 | 0.6958 |
| 0.5763 | 216.0 | 702 | 0.8232 | 0.6958 |
| 0.5763 | 216.92 | 705 | 0.8237 | 0.6888 |
| 0.5763 | 217.85 | 708 | 0.8196 | 0.6993 |
| 0.5763 | 218.77 | 711 | 0.8142 | 0.6993 |
| 0.5763 | 220.0 | 715 | 0.8115 | 0.6993 |
| 0.5763 | 220.92 | 718 | 0.8130 | 0.6993 |
| 0.5763 | 221.85 | 721 | 0.8156 | 0.7028 |
| 0.5763 | 222.77 | 724 | 0.8201 | 0.6958 |
| 0.5763 | 224.0 | 728 | 0.8227 | 0.6958 |
| 0.5763 | 224.92 | 731 | 0.8232 | 0.6958 |
| 0.5763 | 225.85 | 734 | 0.8198 | 0.6923 |
| 0.5763 | 226.77 | 737 | 0.8151 | 0.6923 |
| 0.5763 | 228.0 | 741 | 0.8136 | 0.6923 |
| 0.5763 | 228.92 | 744 | 0.8134 | 0.6923 |
| 0.5763 | 229.85 | 747 | 0.8123 | 0.6958 |
| 0.57 | 230.77 | 750 | 0.8095 | 0.6958 |
| 0.57 | 232.0 | 754 | 0.8082 | 0.6958 |
| 0.57 | 232.92 | 757 | 0.8084 | 0.6958 |
| 0.57 | 233.85 | 760 | 0.8114 | 0.6923 |
| 0.57 | 234.77 | 763 | 0.8130 | 0.6923 |
| 0.57 | 236.0 | 767 | 0.8154 | 0.6923 |
| 0.57 | 236.92 | 770 | 0.8160 | 0.6923 |
| 0.57 | 237.85 | 773 | 0.8126 | 0.6888 |
| 0.57 | 238.77 | 776 | 0.8114 | 0.6888 |
| 0.57 | 240.0 | 780 | 0.8041 | 0.6923 |
| 0.57 | 240.92 | 783 | 0.8006 | 0.6923 |
| 0.57 | 241.85 | 786 | 0.7987 | 0.6958 |
| 0.57 | 242.77 | 789 | 0.7977 | 0.6993 |
| 0.57 | 244.0 | 793 | 0.8001 | 0.6993 |
| 0.57 | 244.92 | 796 | 0.8044 | 0.6958 |
| 0.57 | 245.85 | 799 | 0.8082 | 0.6958 |
| 0.5456 | 246.77 | 802 | 0.8121 | 0.6888 |
| 0.5456 | 248.0 | 806 | 0.8107 | 0.6888 |
| 0.5456 | 248.92 | 809 | 0.8064 | 0.6958 |
| 0.5456 | 249.85 | 812 | 0.8042 | 0.6958 |
| 0.5456 | 250.77 | 815 | 0.8006 | 0.6958 |
| 0.5456 | 252.0 | 819 | 0.7969 | 0.6958 |
| 0.5456 | 252.92 | 822 | 0.7955 | 0.6993 |
| 0.5456 | 253.85 | 825 | 0.7973 | 0.6958 |
| 0.5456 | 254.77 | 828 | 0.8001 | 0.6958 |
| 0.5456 | 256.0 | 832 | 0.8035 | 0.6888 |
| 0.5456 | 256.92 | 835 | 0.8035 | 0.6853 |
| 0.5456 | 257.85 | 838 | 0.8012 | 0.6923 |
| 0.5456 | 258.77 | 841 | 0.8000 | 0.6923 |
| 0.5456 | 260.0 | 845 | 0.7963 | 0.6888 |
| 0.5456 | 260.92 | 848 | 0.7928 | 0.6958 |
| 0.5369 | 261.85 | 851 | 0.7919 | 0.6923 |
| 0.5369 | 262.77 | 854 | 0.7913 | 0.6888 |
| 0.5369 | 264.0 | 858 | 0.7929 | 0.6888 |
| 0.5369 | 264.92 | 861 | 0.7955 | 0.6818 |
| 0.5369 | 265.85 | 864 | 0.7963 | 0.6853 |
| 0.5369 | 266.77 | 867 | 0.7952 | 0.6888 |
| 0.5369 | 268.0 | 871 | 0.7936 | 0.6888 |
| 0.5369 | 268.92 | 874 | 0.7929 | 0.6853 |
| 0.5369 | 269.85 | 877 | 0.7933 | 0.6853 |
| 0.5369 | 270.77 | 880 | 0.7941 | 0.6853 |
| 0.5369 | 272.0 | 884 | 0.7940 | 0.6853 |
| 0.5369 | 272.92 | 887 | 0.7929 | 0.6853 |
| 0.5369 | 273.85 | 890 | 0.7930 | 0.6853 |
| 0.5369 | 274.77 | 893 | 0.7943 | 0.6853 |
| 0.5369 | 276.0 | 897 | 0.7944 | 0.6853 |
| 0.5388 | 276.92 | 900 | 0.7933 | 0.6853 |
| 0.5388 | 277.85 | 903 | 0.7914 | 0.6853 |
| 0.5388 | 278.77 | 906 | 0.7904 | 0.6853 |
| 0.5388 | 280.0 | 910 | 0.7888 | 0.6853 |
| 0.5388 | 280.92 | 913 | 0.7900 | 0.6853 |
| 0.5388 | 281.85 | 916 | 0.7906 | 0.6853 |
| 0.5388 | 282.77 | 919 | 0.7911 | 0.6853 |
| 0.5388 | 284.0 | 923 | 0.7907 | 0.6853 |
| 0.5388 | 284.92 | 926 | 0.7907 | 0.6853 |
| 0.5388 | 285.85 | 929 | 0.7905 | 0.6818 |
| 0.5388 | 286.77 | 932 | 0.7900 | 0.6818 |
| 0.5388 | 288.0 | 936 | 0.7901 | 0.6853 |
| 0.5388 | 288.92 | 939 | 0.7902 | 0.6853 |
| 0.5388 | 289.85 | 942 | 0.7910 | 0.6853 |
| 0.5388 | 290.77 | 945 | 0.7914 | 0.6888 |
| 0.5388 | 292.0 | 949 | 0.7920 | 0.6888 |
| 0.5261 | 292.92 | 952 | 0.7928 | 0.6853 |
| 0.5261 | 293.85 | 955 | 0.7932 | 0.6888 |
| 0.5261 | 294.77 | 958 | 0.7925 | 0.6888 |
| 0.5261 | 296.0 | 962 | 0.7922 | 0.6888 |
| 0.5261 | 296.92 | 965 | 0.7919 | 0.6888 |
| 0.5261 | 297.85 | 968 | 0.7922 | 0.6888 |
| 0.5261 | 298.77 | 971 | 0.7921 | 0.6888 |
| 0.5261 | 300.0 | 975 | 0.7912 | 0.6853 |
| 0.5261 | 300.92 | 978 | 0.7907 | 0.6853 |
| 0.5261 | 301.85 | 981 | 0.7896 | 0.6853 |
| 0.5261 | 302.77 | 984 | 0.7885 | 0.6888 |
| 0.5261 | 304.0 | 988 | 0.7877 | 0.6888 |
| 0.5261 | 304.92 | 991 | 0.7874 | 0.6888 |
| 0.5261 | 305.85 | 994 | 0.7876 | 0.6888 |
| 0.5261 | 306.77 | 997 | 0.7879 | 0.6888 |
| 0.5188 | 308.0 | 1001 | 0.7884 | 0.6888 |
| 0.5188 | 308.92 | 1004 | 0.7887 | 0.6888 |
| 0.5188 | 309.85 | 1007 | 0.7890 | 0.6888 |
| 0.5188 | 310.77 | 1010 | 0.7894 | 0.6888 |
| 0.5188 | 312.0 | 1014 | 0.7899 | 0.6888 |
| 0.5188 | 312.92 | 1017 | 0.7904 | 0.6888 |
| 0.5188 | 313.85 | 1020 | 0.7907 | 0.6923 |
| 0.5188 | 314.77 | 1023 | 0.7910 | 0.6923 |
| 0.5188 | 316.0 | 1027 | 0.7912 | 0.6923 |
| 0.5188 | 316.92 | 1030 | 0.7912 | 0.6923 |
| 0.5188 | 317.85 | 1033 | 0.7912 | 0.6923 |
| 0.5188 | 318.77 | 1036 | 0.7913 | 0.6923 |
| 0.5188 | 320.0 | 1040 | 0.7913 | 0.6923 |
| 0.5188 | 320.92 | 1043 | 0.7912 | 0.6923 |
| 0.5188 | 321.85 | 1046 | 0.7912 | 0.6923 |
| 0.5188 | 322.77 | 1049 | 0.7911 | 0.6923 |
| 0.5194 | 323.08 | 1050 | 0.7911 | 0.6923 |
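
The Accuracy column is the standard single-label classification accuracy reported at each evaluation step. Below is a minimal sketch of how such a metric is typically wired into a Trainer run, assuming the `evaluate` library; the actual training script is not part of this card.

```python
# Sketch of how the Accuracy column is typically produced in a Trainer run
# (assumption: the `evaluate` library's accuracy metric was used).
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # top class per example
    return accuracy.compute(predictions=predictions, references=labels)
```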

Framework versions

  • Transformers 4.39.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2

Model details

  • Format: Safetensors
  • Model size: 13M params
  • Tensor type: F32