wav2vec2-5Class-train-test-finetune-V7

This model is a fine-tuned version of anderloh/Hugginhface-master-wav2vec-pretreined-5-class-train-test on the anderloh/Master5Class dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9663
  • Accuracy: 0.6538
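
The checkpoint can be loaded for inference with the transformers audio-classification pipeline. The snippet below is a minimal sketch: the repo id "anderloh/wav2vec2-5Class-train-test-finetune-V7" is inferred from the card title and is an assumption, as is the input file name; the five class labels come from the anderloh/Master5Class dataset.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint for audio classification.
# NOTE: the repo id below is assumed from the card title; adjust if needed.
classifier = pipeline(
    "audio-classification",
    model="anderloh/wav2vec2-5Class-train-test-finetune-V7",
)

# Classify a local audio file ("example.wav" is a placeholder;
# wav2vec2 models expect 16 kHz mono audio).
predictions = classifier("example.wav")
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```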

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 128
  • eval_batch_size: 128
  • seed: 0
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 512
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 250.0
  • mixed_precision_training: Native AMP
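
For reference, the hyperparameters above map onto a transformers TrainingArguments configuration roughly as follows. This is a hedged sketch, not the author's actual training script; the output directory name is a placeholder, and the Adam betas/epsilon shown are made explicit even though they match the library defaults.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the listed hyperparameters.
# "wav2vec2-5class-finetune" is a hypothetical output directory.
training_args = TrainingArguments(
    output_dir="wav2vec2-5class-finetune",
    learning_rate=3e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=0,
    gradient_accumulation_steps=4,  # 128 * 4 = 512 total train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=250.0,
    fp16=True,  # mixed precision training (native AMP)
)
```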

Training results

Training Loss Epoch Step Validation Loss Accuracy
No log 0.92 3 1.6026 0.1608
No log 1.85 6 1.6024 0.1608
No log 2.77 9 1.6021 0.1608
No log 4.0 13 1.6014 0.1608
No log 4.92 16 1.6008 0.1608
No log 5.85 19 1.6001 0.1608
No log 6.77 22 1.5992 0.1608
No log 8.0 26 1.5978 0.1748
No log 8.92 29 1.5965 0.1888
No log 9.85 32 1.5952 0.2098
No log 10.77 35 1.5938 0.2273
No log 12.0 39 1.5916 0.2343
No log 12.92 42 1.5899 0.2692
No log 13.85 45 1.5880 0.2727
No log 14.77 48 1.5860 0.3077
No log 16.0 52 1.5833 0.3566
No log 16.92 55 1.5811 0.3881
No log 17.85 58 1.5788 0.3811
No log 18.77 61 1.5764 0.3671
No log 20.0 65 1.5731 0.3497
No log 20.92 68 1.5702 0.3287
No log 21.85 71 1.5672 0.3252
No log 22.77 74 1.5641 0.3147
No log 24.0 78 1.5597 0.3112
No log 24.92 81 1.5564 0.3077
No log 25.85 84 1.5532 0.3042
No log 26.77 87 1.5499 0.2937
No log 28.0 91 1.5454 0.2902
No log 28.92 94 1.5419 0.2867
No log 29.85 97 1.5383 0.2832
1.5563 30.77 100 1.5349 0.2762
1.5563 32.0 104 1.5304 0.2797
1.5563 32.92 107 1.5273 0.2762
1.5563 33.85 110 1.5247 0.2657
1.5563 34.77 113 1.5223 0.2517
1.5563 36.0 117 1.5194 0.2483
1.5563 36.92 120 1.5178 0.2413
1.5563 37.85 123 1.5168 0.2378
1.5563 38.77 126 1.5162 0.2448
1.5563 40.0 130 1.5162 0.2448
1.5563 40.92 133 1.5167 0.2483
1.5563 41.85 136 1.5181 0.2483
1.5563 42.77 139 1.5203 0.2587
1.5563 44.0 143 1.5227 0.2692
1.5563 44.92 146 1.5243 0.2832
1.5563 45.85 149 1.5239 0.2797
1.5563 46.77 152 1.5224 0.3007
1.5563 48.0 156 1.5170 0.3077
1.5563 48.92 159 1.5103 0.3287
1.5563 49.85 162 1.5032 0.3497
1.5563 50.77 165 1.4959 0.3601
1.5563 52.0 169 1.4857 0.3636
1.5563 52.92 172 1.4788 0.3671
1.5563 53.85 175 1.4713 0.3741
1.5563 54.77 178 1.4642 0.3811
1.5563 56.0 182 1.4553 0.3881
1.5563 56.92 185 1.4481 0.3986
1.5563 57.85 188 1.4421 0.4021
1.5563 58.77 191 1.4357 0.4126
1.5563 60.0 195 1.4284 0.4196
1.5563 60.92 198 1.4218 0.4196
1.3138 61.85 201 1.4167 0.4301
1.3138 62.77 204 1.4091 0.4301
1.3138 64.0 208 1.3995 0.4371
1.3138 64.92 211 1.3911 0.4406
1.3138 65.85 214 1.3825 0.4336
1.3138 66.77 217 1.3735 0.4441
1.3138 68.0 221 1.3632 0.4476
1.3138 68.92 224 1.3556 0.4510
1.3138 69.85 227 1.3492 0.4510
1.3138 70.77 230 1.3441 0.4510
1.3138 72.0 234 1.3352 0.4580
1.3138 72.92 237 1.3269 0.4615
1.3138 73.85 240 1.3186 0.4755
1.3138 74.77 243 1.3105 0.4755
1.3138 76.0 247 1.2992 0.4790
1.3138 76.92 250 1.2896 0.4825
1.3138 77.85 253 1.2797 0.4825
1.3138 78.77 256 1.2707 0.4860
1.3138 80.0 260 1.2587 0.4930
1.3138 80.92 263 1.2494 0.4930
1.3138 81.85 266 1.2407 0.4930
1.3138 82.77 269 1.2314 0.5105
1.3138 84.0 273 1.2205 0.5140
1.3138 84.92 276 1.2124 0.5210
1.3138 85.85 279 1.2043 0.5315
1.3138 86.77 282 1.1973 0.5350
1.3138 88.0 286 1.1870 0.5524
1.3138 88.92 289 1.1788 0.5629
1.3138 89.85 292 1.1700 0.5629
1.3138 90.77 295 1.1613 0.5699
1.3138 92.0 299 1.1498 0.5839
1.047 92.92 302 1.1411 0.5874
1.047 93.85 305 1.1330 0.5944
1.047 94.77 308 1.1261 0.5944
1.047 96.0 312 1.1161 0.6014
1.047 96.92 315 1.1084 0.6014
1.047 97.85 318 1.1003 0.6049
1.047 98.77 321 1.0926 0.6049
1.047 100.0 325 1.0821 0.6084
1.047 100.92 328 1.0754 0.6084
1.047 101.85 331 1.0690 0.6084
1.047 102.77 334 1.0637 0.6154
1.047 104.0 338 1.0549 0.6189
1.047 104.92 341 1.0478 0.6224
1.047 105.85 344 1.0420 0.6259
1.047 106.77 347 1.0370 0.6294
1.047 108.0 351 1.0308 0.6294
1.047 108.92 354 1.0263 0.6259
1.047 109.85 357 1.0231 0.6259
1.047 110.77 360 1.0204 0.6329
1.047 112.0 364 1.0167 0.6294
1.047 112.92 367 1.0145 0.6294
1.047 113.85 370 1.0119 0.6329
1.047 114.77 373 1.0077 0.6294
1.047 116.0 377 1.0012 0.6364
1.047 116.92 380 0.9975 0.6364
1.047 117.85 383 0.9938 0.6364
1.047 118.77 386 0.9913 0.6399
1.047 120.0 390 0.9886 0.6469
1.047 120.92 393 0.9870 0.6469
1.047 121.85 396 0.9861 0.6399
1.047 122.77 399 0.9857 0.6434
0.8183 124.0 403 0.9855 0.6399
0.8183 124.92 406 0.9864 0.6399
0.8183 125.85 409 0.9857 0.6399
0.8183 126.77 412 0.9818 0.6399
0.8183 128.0 416 0.9765 0.6399
0.8183 128.92 419 0.9740 0.6399
0.8183 129.85 422 0.9737 0.6434
0.8183 130.77 425 0.9754 0.6469
0.8183 132.0 429 0.9753 0.6469
0.8183 132.92 432 0.9740 0.6469
0.8183 133.85 435 0.9710 0.6469
0.8183 134.77 438 0.9686 0.6469
0.8183 136.0 442 0.9671 0.6469
0.8183 136.92 445 0.9669 0.6434
0.8183 137.85 448 0.9659 0.6399
0.8183 138.77 451 0.9662 0.6434
0.8183 140.0 455 0.9674 0.6434
0.8183 140.92 458 0.9694 0.6399
0.8183 141.85 461 0.9716 0.6469
0.8183 142.77 464 0.9739 0.6434
0.8183 144.0 468 0.9712 0.6469
0.8183 144.92 471 0.9670 0.6434
0.8183 145.85 474 0.9637 0.6434
0.8183 146.77 477 0.9625 0.6469
0.8183 148.0 481 0.9634 0.6469
0.8183 148.92 484 0.9659 0.6469
0.8183 149.85 487 0.9663 0.6469
0.8183 150.77 490 0.9649 0.6503
0.8183 152.0 494 0.9655 0.6503
0.8183 152.92 497 0.9648 0.6503
0.7321 153.85 500 0.9638 0.6503
0.7321 154.77 503 0.9631 0.6503
0.7321 156.0 507 0.9647 0.6503
0.7321 156.92 510 0.9653 0.6503
0.7321 157.85 513 0.9662 0.6503
0.7321 158.77 516 0.9679 0.6503
0.7321 160.0 520 0.9675 0.6503
0.7321 160.92 523 0.9664 0.6503
0.7321 161.85 526 0.9655 0.6503
0.7321 162.77 529 0.9642 0.6503
0.7321 164.0 533 0.9635 0.6503
0.7321 164.92 536 0.9633 0.6503
0.7321 165.85 539 0.9645 0.6503
0.7321 166.77 542 0.9649 0.6503
0.7321 168.0 546 0.9651 0.6503
0.7321 168.92 549 0.9657 0.6503
0.7321 169.85 552 0.9663 0.6538
0.7321 170.77 555 0.9653 0.6503
0.7321 172.0 559 0.9638 0.6503
0.7321 172.92 562 0.9616 0.6503
0.7321 173.85 565 0.9601 0.6503
0.7321 174.77 568 0.9610 0.6538
0.7321 176.0 572 0.9630 0.6503
0.7321 176.92 575 0.9633 0.6503
0.7321 177.85 578 0.9646 0.6503
0.7321 178.77 581 0.9655 0.6503
0.7321 180.0 585 0.9673 0.6503
0.7321 180.92 588 0.9680 0.6503
0.7321 181.85 591 0.9687 0.6503
0.7321 182.77 594 0.9692 0.6503
0.7321 184.0 598 0.9684 0.6503
0.6941 184.92 601 0.9677 0.6503
0.6941 185.85 604 0.9674 0.6503
0.6941 186.77 607 0.9671 0.6503
0.6941 188.0 611 0.9670 0.6503
0.6941 188.92 614 0.9662 0.6503
0.6941 189.85 617 0.9653 0.6503
0.6941 190.77 620 0.9645 0.6503
0.6941 192.0 624 0.9648 0.6503
0.6941 192.92 627 0.9652 0.6503
0.6941 193.85 630 0.9663 0.6503
0.6941 194.77 633 0.9662 0.6503
0.6941 196.0 637 0.9665 0.6503
0.6941 196.92 640 0.9668 0.6503
0.6941 197.85 643 0.9669 0.6469
0.6941 198.77 646 0.9674 0.6434
0.6941 200.0 650 0.9669 0.6469
0.6941 200.92 653 0.9672 0.6469
0.6941 201.85 656 0.9671 0.6469
0.6941 202.77 659 0.9673 0.6503
0.6941 204.0 663 0.9666 0.6503
0.6941 204.92 666 0.9660 0.6503
0.6941 205.85 669 0.9656 0.6503
0.6941 206.77 672 0.9651 0.6503
0.6941 208.0 676 0.9661 0.6503
0.6941 208.92 679 0.9667 0.6503
0.6941 209.85 682 0.9668 0.6503
0.6941 210.77 685 0.9669 0.6503
0.6941 212.0 689 0.9665 0.6503
0.6941 212.92 692 0.9665 0.6503
0.6941 213.85 695 0.9664 0.6503
0.6941 214.77 698 0.9663 0.6503
0.6696 216.0 702 0.9666 0.6503
0.6696 216.92 705 0.9667 0.6503
0.6696 217.85 708 0.9665 0.6503
0.6696 218.77 711 0.9663 0.6503
0.6696 220.0 715 0.9661 0.6503
0.6696 220.92 718 0.9661 0.6503
0.6696 221.85 721 0.9662 0.6503
0.6696 222.77 724 0.9664 0.6503
0.6696 224.0 728 0.9664 0.6503
0.6696 224.92 731 0.9664 0.6503
0.6696 225.85 734 0.9666 0.6503
0.6696 226.77 737 0.9666 0.6503
0.6696 228.0 741 0.9665 0.6503
0.6696 228.92 744 0.9666 0.6503
0.6696 229.85 747 0.9666 0.6503
0.6696 230.77 750 0.9666 0.6503
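
The accuracy column in the table above is the standard classification accuracy on the evaluation set. A typical way to produce it with the Trainer is a compute_metrics callback like the sketch below; this is an assumption about the setup, not the author's confirmed script.

```python
import numpy as np
import evaluate

# Accuracy metric from the evaluate library (a common choice;
# the actual training script may differ).
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is (logits, labels); take the argmax over the 5 classes.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```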

Framework versions

  • Transformers 4.39.0.dev0
  • Pytorch 2.2.1+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2