---
license: apache-2.0
base_model: jonatasgrosman/wav2vec2-large-xlsr-53-english
datasets:
  - RAVDESS
  - SAVEE
  - TESS
  - CREMA-D
tags:
  - generated_from_trainer
  - audio
  - speech
  - speech-emotion-recognition
metrics:
  - accuracy
model-index:
  - name: wav2vec2-large-xlsr-53-english-ser-cosine
    results: []
---

# wav2vec2-large-xlsr-53-english-ser-linear

This model is a fine-tuned version of [jonatasgrosman/wav2vec2-large-xlsr-53-english](https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-english) for speech emotion recognition, trained on a combination of the RAVDESS, SAVEE, TESS, and CREMA-D datasets listed in the metadata above. It achieves the following results on the evaluation set:

- Loss: 0.4643
- Accuracy: 0.8587
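
The checkpoint should load with the standard `transformers` audio-classification pipeline. A minimal inference sketch follows; the repository id is assumed to match the title above and is not confirmed by this card:

```python
# Minimal inference sketch. The repo id below is an assumption based on
# this card's title; replace it with the actual model id if it differs.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="FarhadMadadzade/wav2vec2-large-xlsr-53-english-ser-linear",
)

# wav2vec2 XLSR models expect 16 kHz mono audio; the pipeline
# resamples file inputs automatically.
predictions = classifier("speech_sample.wav")
for p in predictions:
    print(f"{p['label']}: {p['score']:.3f}")
```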

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
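
The card does not document the data preparation, but one plausible way to combine the four corpora named in the metadata with the `datasets` library is sketched below. The file paths and the six-emotion label set are illustrative assumptions, not taken from this card:

```python
# Hedged sketch of assembling the four corpora into one dataset.
# Paths and labels are hypothetical; the actual preprocessing is undocumented.
from datasets import Audio, Dataset

# Six emotions commonly shared by RAVDESS, SAVEE, TESS, and CREMA-D (assumed).
LABELS = ["angry", "disgust", "fear", "happy", "neutral", "sad"]

examples = {
    "audio": ["ravdess/angry_001.wav", "cremad/sad_042.wav"],  # hypothetical paths
    "label": [LABELS.index("angry"), LABELS.index("sad")],
}

ds = Dataset.from_dict(examples).cast_column(
    "audio", Audio(sampling_rate=16_000)  # XLSR models expect 16 kHz input
)
ds = ds.train_test_split(test_size=0.1, seed=42)
```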

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.5
- mixed_precision_training: Native AMP
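
The listed values map onto `transformers.TrainingArguments` roughly as below. Only the hyperparameters above come from this card; `output_dir` and the evaluation cadence are assumptions (the results table evaluates every 10 steps):

```python
# Sketch reconstructing the training configuration from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-53-english-ser-linear",  # assumed name
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 4 * 2 = 8
    lr_scheduler_type="linear",
    num_train_epochs=1.5,
    fp16=True,                      # "Native AMP" mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",    # assumed from the 10-step eval cadence
    eval_steps=10,
)
```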

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.8767 | 0.01 | 10 | 1.8078 | 0.1684 |
| 1.7967 | 0.02 | 20 | 1.7544 | 0.2235 |
| 1.8173 | 0.02 | 30 | 1.7072 | 0.3032 |
| 1.7604 | 0.03 | 40 | 1.7162 | 0.2227 |
| 1.7271 | 0.04 | 50 | 1.6655 | 0.3032 |
| 1.764 | 0.05 | 60 | 1.5927 | 0.3599 |
| 1.55 | 0.06 | 70 | 1.5354 | 0.3657 |
| 1.5448 | 0.07 | 80 | 1.4057 | 0.4560 |
| 1.5118 | 0.07 | 90 | 1.3551 | 0.4733 |
| 1.354 | 0.08 | 100 | 1.2319 | 0.5596 |
| 1.3675 | 0.09 | 110 | 1.1786 | 0.5735 |
| 1.4058 | 0.1 | 120 | 1.0949 | 0.6105 |
| 1.1595 | 0.11 | 130 | 1.0964 | 0.5908 |
| 1.0444 | 0.12 | 140 | 1.1262 | 0.6212 |
| 1.0483 | 0.12 | 150 | 1.0863 | 0.5982 |
| 1.0439 | 0.13 | 160 | 1.0488 | 0.6491 |
| 1.0129 | 0.14 | 170 | 0.9045 | 0.6549 |
| 1.0171 | 0.15 | 180 | 1.0276 | 0.6270 |
| 1.0867 | 0.16 | 190 | 1.0888 | 0.6023 |
| 1.0646 | 0.16 | 200 | 0.9730 | 0.6311 |
| 1.0403 | 0.17 | 210 | 0.9315 | 0.6582 |
| 0.869 | 0.18 | 220 | 0.9686 | 0.6574 |
| 0.9193 | 0.19 | 230 | 0.9076 | 0.6960 |
| 1.0266 | 0.2 | 240 | 1.0796 | 0.6565 |
| 0.8563 | 0.21 | 250 | 1.0173 | 0.6426 |
| 0.8382 | 0.21 | 260 | 0.9155 | 0.6820 |
| 0.9275 | 0.22 | 270 | 0.9397 | 0.6689 |
| 0.9402 | 0.23 | 280 | 0.8919 | 0.6861 |
| 0.8636 | 0.24 | 290 | 0.9795 | 0.6680 |
| 1.2393 | 0.25 | 300 | 0.9872 | 0.6680 |
| 0.9537 | 0.25 | 310 | 0.8181 | 0.7247 |
| 0.7361 | 0.26 | 320 | 0.8470 | 0.7025 |
| 0.8452 | 0.27 | 330 | 0.8045 | 0.7198 |
| 0.9613 | 0.28 | 340 | 0.7530 | 0.7313 |
| 0.9335 | 0.29 | 350 | 0.9019 | 0.6902 |
| 0.9414 | 0.3 | 360 | 0.8981 | 0.6795 |
| 0.7473 | 0.3 | 370 | 0.7532 | 0.7321 |
| 0.8774 | 0.31 | 380 | 0.8953 | 0.7165 |
| 0.6989 | 0.32 | 390 | 0.7381 | 0.7387 |
| 0.9826 | 0.33 | 400 | 0.7128 | 0.7403 |
| 0.783 | 0.34 | 410 | 0.8292 | 0.6952 |
| 0.9668 | 0.35 | 420 | 0.7826 | 0.7239 |
| 0.7935 | 0.35 | 430 | 0.7081 | 0.7510 |
| 0.8284 | 0.36 | 440 | 0.7304 | 0.7264 |
| 0.9404 | 0.37 | 450 | 0.6761 | 0.7650 |
| 0.7735 | 0.38 | 460 | 0.6827 | 0.7469 |
| 0.6811 | 0.39 | 470 | 0.7926 | 0.7132 |
| 0.683 | 0.39 | 480 | 0.6883 | 0.7428 |
| 0.6779 | 0.4 | 490 | 0.6608 | 0.7486 |
| 0.6329 | 0.41 | 500 | 0.6578 | 0.7617 |
| 0.5824 | 0.42 | 510 | 0.7696 | 0.7420 |
| 0.6974 | 0.43 | 520 | 0.6755 | 0.7625 |
| 0.7716 | 0.44 | 530 | 0.6453 | 0.7716 |
| 0.7463 | 0.44 | 540 | 0.6644 | 0.7642 |
| 0.7993 | 0.45 | 550 | 0.6059 | 0.7864 |
| 0.606 | 0.46 | 560 | 0.6857 | 0.7461 |
| 0.8619 | 0.47 | 570 | 0.6570 | 0.7560 |
| 0.699 | 0.48 | 580 | 0.7400 | 0.7313 |
| 0.6619 | 0.49 | 590 | 0.7014 | 0.7494 |
| 0.7696 | 0.49 | 600 | 0.6621 | 0.7584 |
| 0.6544 | 0.5 | 610 | 0.6826 | 0.7650 |
| 0.5403 | 0.51 | 620 | 0.7464 | 0.7551 |
| 0.746 | 0.52 | 630 | 0.7323 | 0.7551 |
| 0.8129 | 0.53 | 640 | 0.7221 | 0.7634 |
| 0.7245 | 0.53 | 650 | 0.6306 | 0.7790 |
| 0.7062 | 0.54 | 660 | 0.6250 | 0.7896 |
| 0.741 | 0.55 | 670 | 0.6129 | 0.7938 |
| 0.7185 | 0.56 | 680 | 0.6332 | 0.7847 |
| 0.7706 | 0.57 | 690 | 0.5988 | 0.7954 |
| 0.8147 | 0.58 | 700 | 0.7032 | 0.7781 |
| 0.5144 | 0.58 | 710 | 0.6849 | 0.7634 |
| 0.9247 | 0.59 | 720 | 0.6088 | 0.7749 |
| 0.629 | 0.6 | 730 | 0.6393 | 0.7806 |
| 0.5908 | 0.61 | 740 | 0.5696 | 0.7913 |
| 0.4951 | 0.62 | 750 | 0.6370 | 0.7765 |
| 0.6358 | 0.62 | 760 | 0.6232 | 0.7979 |
| 0.6396 | 0.63 | 770 | 0.6707 | 0.7905 |
| 0.6947 | 0.64 | 780 | 0.6981 | 0.7683 |
| 0.6748 | 0.65 | 790 | 0.6761 | 0.7765 |
| 0.5607 | 0.66 | 800 | 0.6551 | 0.7921 |
| 0.6991 | 0.67 | 810 | 0.6134 | 0.7905 |
| 0.5793 | 0.67 | 820 | 0.5633 | 0.8118 |
| 0.4755 | 0.68 | 830 | 0.6031 | 0.7929 |
| 0.7645 | 0.69 | 840 | 0.5896 | 0.7962 |
| 0.742 | 0.7 | 850 | 0.5811 | 0.8036 |
| 0.5281 | 0.71 | 860 | 0.6449 | 0.7855 |
| 0.722 | 0.72 | 870 | 0.6593 | 0.7765 |
| 0.8174 | 0.72 | 880 | 0.5410 | 0.8003 |
| 0.5373 | 0.73 | 890 | 0.5802 | 0.7954 |
| 0.3868 | 0.74 | 900 | 0.6015 | 0.7954 |
| 0.5459 | 0.75 | 910 | 0.5485 | 0.7970 |
| 0.4629 | 0.76 | 920 | 0.6961 | 0.7584 |
| 0.6952 | 0.76 | 930 | 0.5608 | 0.8053 |
| 0.8452 | 0.77 | 940 | 0.5649 | 0.8044 |
| 0.6026 | 0.78 | 950 | 0.5330 | 0.8127 |
| 0.5131 | 0.79 | 960 | 0.5971 | 0.7888 |
| 0.6814 | 0.8 | 970 | 0.5594 | 0.8061 |
| 0.6001 | 0.81 | 980 | 0.5851 | 0.7954 |
| 0.5367 | 0.81 | 990 | 0.5716 | 0.8003 |
| 0.8356 | 0.82 | 1000 | 0.6519 | 0.7683 |
| 0.502 | 0.83 | 1010 | 0.6180 | 0.7749 |
| 0.5343 | 0.84 | 1020 | 0.5377 | 0.8053 |
| 0.5288 | 0.85 | 1030 | 0.5902 | 0.7962 |
| 0.5786 | 0.86 | 1040 | 0.6221 | 0.7905 |
| 0.6272 | 0.86 | 1050 | 0.6688 | 0.7831 |
| 0.5105 | 0.87 | 1060 | 0.6209 | 0.7880 |
| 0.5806 | 0.88 | 1070 | 0.6145 | 0.7929 |
| 0.5805 | 0.89 | 1080 | 0.6150 | 0.7847 |
| 0.4812 | 0.9 | 1090 | 0.5812 | 0.8061 |
| 0.5558 | 0.9 | 1100 | 0.6388 | 0.8044 |
| 0.7507 | 0.91 | 1110 | 0.5873 | 0.8044 |
| 0.7217 | 0.92 | 1120 | 0.5404 | 0.8085 |
| 0.8146 | 0.93 | 1130 | 0.5449 | 0.8003 |
| 0.6112 | 0.94 | 1140 | 0.5038 | 0.8151 |
| 0.7305 | 0.95 | 1150 | 0.4767 | 0.8316 |
| 0.3422 | 0.95 | 1160 | 0.5178 | 0.8127 |
| 0.4644 | 0.96 | 1170 | 0.5073 | 0.8200 |
| 0.4664 | 0.97 | 1180 | 0.4988 | 0.8184 |
| 0.6223 | 0.98 | 1190 | 0.5120 | 0.8283 |
| 0.6961 | 0.99 | 1200 | 0.5217 | 0.8118 |
| 0.6706 | 1.0 | 1210 | 0.5235 | 0.8094 |
| 0.3899 | 1.0 | 1220 | 0.5085 | 0.8184 |
| 0.418 | 1.01 | 1230 | 0.5171 | 0.8135 |
| 0.5011 | 1.02 | 1240 | 0.5056 | 0.8217 |
| 0.2969 | 1.03 | 1250 | 0.5209 | 0.8217 |
| 0.5093 | 1.04 | 1260 | 0.4921 | 0.8348 |
| 0.5167 | 1.04 | 1270 | 0.5081 | 0.8274 |
| 0.6382 | 1.05 | 1280 | 0.4851 | 0.8291 |
| 0.3493 | 1.06 | 1290 | 0.4946 | 0.8324 |
| 0.3471 | 1.07 | 1300 | 0.5122 | 0.8299 |
| 0.452 | 1.08 | 1310 | 0.5592 | 0.8291 |
| 0.4362 | 1.09 | 1320 | 0.5528 | 0.8266 |
| 0.4224 | 1.09 | 1330 | 0.5571 | 0.8192 |
| 0.333 | 1.1 | 1340 | 0.5714 | 0.8110 |
| 0.2944 | 1.11 | 1350 | 0.5156 | 0.8299 |
| 0.4004 | 1.12 | 1360 | 0.5208 | 0.8340 |
| 0.6824 | 1.13 | 1370 | 0.5426 | 0.8258 |
| 0.3746 | 1.13 | 1380 | 0.4902 | 0.8365 |
| 0.3679 | 1.14 | 1390 | 0.4868 | 0.8373 |
| 0.5009 | 1.15 | 1400 | 0.5192 | 0.8283 |
| 0.5577 | 1.16 | 1410 | 0.4937 | 0.8316 |
| 0.2566 | 1.17 | 1420 | 0.5043 | 0.8250 |
| 0.6625 | 1.18 | 1430 | 0.5416 | 0.8209 |
| 0.3251 | 1.18 | 1440 | 0.5146 | 0.8291 |
| 0.4306 | 1.19 | 1450 | 0.5313 | 0.8266 |
| 0.3159 | 1.2 | 1460 | 0.5308 | 0.8291 |
| 0.3598 | 1.21 | 1470 | 0.4869 | 0.8439 |
| 0.5498 | 1.22 | 1480 | 0.4670 | 0.8537 |
| 0.4947 | 1.23 | 1490 | 0.4928 | 0.8463 |
| 0.3948 | 1.23 | 1500 | 0.4816 | 0.8455 |
| 0.3137 | 1.24 | 1510 | 0.4755 | 0.8439 |
| 0.3525 | 1.25 | 1520 | 0.4972 | 0.8389 |
| 0.4821 | 1.26 | 1530 | 0.4954 | 0.8381 |
| 0.6099 | 1.27 | 1540 | 0.5096 | 0.8324 |
| 0.3172 | 1.27 | 1550 | 0.5029 | 0.8389 |
| 0.29 | 1.28 | 1560 | 0.4852 | 0.8455 |
| 0.288 | 1.29 | 1570 | 0.4916 | 0.8496 |
| 0.3771 | 1.3 | 1580 | 0.4734 | 0.8505 |
| 0.3106 | 1.31 | 1590 | 0.4746 | 0.8431 |
| 0.3494 | 1.32 | 1600 | 0.5069 | 0.8431 |
| 0.3183 | 1.32 | 1610 | 0.5155 | 0.8398 |
| 0.4353 | 1.33 | 1620 | 0.5242 | 0.8332 |
| 0.6207 | 1.34 | 1630 | 0.5161 | 0.8340 |
| 0.3241 | 1.35 | 1640 | 0.5037 | 0.8406 |
| 0.3646 | 1.36 | 1650 | 0.4890 | 0.8439 |
| 0.2341 | 1.37 | 1660 | 0.4884 | 0.8496 |
| 0.4874 | 1.37 | 1670 | 0.4688 | 0.8562 |
| 0.6701 | 1.38 | 1680 | 0.4589 | 0.8554 |
| 0.391 | 1.39 | 1690 | 0.4684 | 0.8537 |
| 0.3333 | 1.4 | 1700 | 0.4738 | 0.8513 |
| 0.2449 | 1.41 | 1710 | 0.4753 | 0.8488 |
| 0.361 | 1.41 | 1720 | 0.4946 | 0.8496 |
| 0.2229 | 1.42 | 1730 | 0.4971 | 0.8463 |
| 0.5915 | 1.43 | 1740 | 0.4904 | 0.8513 |
| 0.1812 | 1.44 | 1750 | 0.4782 | 0.8537 |
| 0.5887 | 1.45 | 1760 | 0.4702 | 0.8570 |
| 0.2823 | 1.46 | 1770 | 0.4665 | 0.8570 |
| 0.3397 | 1.46 | 1780 | 0.4673 | 0.8546 |
| 0.4727 | 1.47 | 1790 | 0.4638 | 0.8578 |
| 0.3303 | 1.48 | 1800 | 0.4636 | 0.8578 |
| 0.4544 | 1.49 | 1810 | 0.4646 | 0.8587 |
| 0.366 | 1.5 | 1820 | 0.4643 | 0.8587 |

### Framework versions

- Transformers 4.40.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.18.1.dev0
- Tokenizers 0.15.2