
intent_classify

This model is a fine-tuned version of facebook/wav2vec2-base-960h on the minds14 dataset. It achieves the following results on the evaluation set:

  • Loss: 3.1513
  • Accuracy: 0.0885

Model description

This checkpoint is facebook/wav2vec2-base-960h, a wav2vec 2.0 base model trained on 960 hours of LibriSpeech, with an audio-classification head fine-tuned to predict the intent of a spoken utterance from MInDS-14.
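
No usage example was provided. The following is a minimal inference sketch, assuming the checkpoint loads with AutoModelForAudioClassification; the repo id shown is a placeholder for wherever this checkpoint is hosted:

```python
import numpy as np
import torch
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

# Placeholder repo id; substitute the actual path of this checkpoint.
model_id = "your-username/intent_classify"

feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)

# Placeholder input: one second of silence. Replace with a real waveform
# sampled at 16 kHz, the rate wav2vec2-base-960h expects.
waveform = np.zeros(16_000, dtype=np.float32)

inputs = feature_extractor(waveform, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its intent label.
print(model.config.id2label[int(logits.argmax(dim=-1))])
```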

Intended uses & limitations

The model is intended to classify short spoken utterances into the 14 MInDS-14 intent classes. Note that the final evaluation accuracy of 0.0885 is barely above the 1/14 ≈ 0.0714 chance level and that validation loss rises throughout training, so this checkpoint should be treated as a training artifact rather than a usable intent classifier.

Training and evaluation data

The model was fine-tuned and evaluated on MInDS-14 (minds14), a spoken intent-detection dataset with 14 e-banking intent classes. The card does not record the language subset or split sizes, but the training log below shows 8 optimizer steps per epoch (at most 512 training examples at batch size 64), and every reported accuracy is a multiple of 1/113, implying an evaluation set of 113 examples.
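
MInDS-14 is hosted on the Hugging Face Hub as PolyAI/minds14. A minimal loading-and-preprocessing sketch, assuming the en-US subset (an assumption; the card does not say which subset or split was used):

```python
from datasets import Audio, load_dataset
from transformers import AutoFeatureExtractor

# Assumption: the en-US subset; the card does not record which one was used.
minds = load_dataset("PolyAI/minds14", name="en-US", split="train")

# Resample the audio column to the 16 kHz rate the base model expects.
minds = minds.cast_column("audio", Audio(sampling_rate=16_000))

feature_extractor = AutoFeatureExtractor.from_pretrained("facebook/wav2vec2-base-960h")

def preprocess(batch):
    audio = batch["audio"]
    # Truncate to one second to keep batches uniform; an illustrative choice.
    return feature_extractor(
        audio["array"],
        sampling_rate=audio["sampling_rate"],
        max_length=16_000,
        truncation=True,
    )

encoded = minds.map(preprocess, remove_columns="audio")
```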

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 64
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 200
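
These settings line up one-to-one with transformers.TrainingArguments. A hedged reconstruction follows; output_dir is a placeholder, and the per-epoch evaluation cadence is inferred from the training log below rather than recorded in the card:

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters as TrainingArguments.
# output_dir is a placeholder; evaluation_strategy is inferred from the
# per-epoch validation rows in the training log, not stated in the card.
training_args = TrainingArguments(
    output_dir="intent_classify",
    learning_rate=3e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=200,
    evaluation_strategy="epoch",
)
```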

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 8 | 2.6393 | 0.1239 |
| 2.6231 | 2.0 | 16 | 2.6391 | 0.1239 |
| 2.6382 | 3.0 | 24 | 2.6394 | 0.1239 |
| 2.6373 | 4.0 | 32 | 2.6409 | 0.0796 |
| 2.6421 | 5.0 | 40 | 2.6438 | 0.0619 |
| 2.6421 | 6.0 | 48 | 2.6447 | 0.0442 |
| 2.6388 | 7.0 | 56 | 2.6447 | 0.0265 |
| 2.6228 | 8.0 | 64 | 2.6457 | 0.0531 |
| 2.6305 | 9.0 | 72 | 2.6459 | 0.0442 |
| 2.6417 | 10.0 | 80 | 2.6500 | 0.0177 |
| 2.6417 | 11.0 | 88 | 2.6496 | 0.0354 |
| 2.6399 | 12.0 | 96 | 2.6453 | 0.0708 |
| 2.6355 | 13.0 | 104 | 2.6516 | 0.0354 |
| 2.652 | 14.0 | 112 | 2.6526 | 0.0796 |
| 2.6323 | 15.0 | 120 | 2.6592 | 0.0619 |
| 2.6323 | 16.0 | 128 | 2.6540 | 0.0619 |
| 2.6254 | 17.0 | 136 | 2.6519 | 0.0619 |
| 2.6377 | 18.0 | 144 | 2.6567 | 0.1150 |
| 2.6283 | 19.0 | 152 | 2.6649 | 0.0796 |
| 2.6192 | 20.0 | 160 | 2.6708 | 0.0796 |
| 2.6192 | 21.0 | 168 | 2.6700 | 0.0265 |
| 2.6289 | 22.0 | 176 | 2.6723 | 0.0177 |
| 2.6487 | 23.0 | 184 | 2.6700 | 0.0442 |
| 2.6324 | 24.0 | 192 | 2.6723 | 0.0265 |
| 2.6376 | 25.0 | 200 | 2.6639 | 0.0708 |
| 2.6376 | 26.0 | 208 | 2.6750 | 0.0177 |
| 2.6223 | 27.0 | 216 | 2.6837 | 0.0708 |
| 2.6323 | 28.0 | 224 | 2.6734 | 0.0796 |
| 2.6218 | 29.0 | 232 | 2.6737 | 0.0973 |
| 2.6231 | 30.0 | 240 | 2.6764 | 0.0708 |
| 2.6231 | 31.0 | 248 | 2.6793 | 0.0796 |
| 2.6228 | 32.0 | 256 | 2.6872 | 0.0442 |
| 2.6107 | 33.0 | 264 | 2.6816 | 0.0796 |
| 2.6078 | 34.0 | 272 | 2.6828 | 0.0796 |
| 2.6453 | 35.0 | 280 | 2.6746 | 0.0885 |
| 2.6453 | 36.0 | 288 | 2.6760 | 0.0885 |
| 2.6027 | 37.0 | 296 | 2.6885 | 0.0708 |
| 2.6179 | 38.0 | 304 | 2.6784 | 0.0973 |
| 2.5933 | 39.0 | 312 | 2.6808 | 0.0708 |
| 2.5909 | 40.0 | 320 | 2.6947 | 0.0796 |
| 2.5909 | 41.0 | 328 | 2.7123 | 0.0619 |
| 2.6179 | 42.0 | 336 | 2.7139 | 0.0619 |
| 2.5969 | 43.0 | 344 | 2.7160 | 0.0973 |
| 2.5966 | 44.0 | 352 | 2.7054 | 0.1062 |
| 2.6293 | 45.0 | 360 | 2.7260 | 0.0796 |
| 2.6293 | 46.0 | 368 | 2.6994 | 0.0973 |
| 2.6023 | 47.0 | 376 | 2.7039 | 0.0885 |
| 2.605 | 48.0 | 384 | 2.6680 | 0.0885 |
| 2.597 | 49.0 | 392 | 2.7001 | 0.0796 |
| 2.5936 | 50.0 | 400 | 2.7036 | 0.0796 |
| 2.5936 | 51.0 | 408 | 2.6866 | 0.0973 |
| 2.5823 | 52.0 | 416 | 2.7055 | 0.0973 |
| 2.5902 | 53.0 | 424 | 2.7130 | 0.0885 |
| 2.5793 | 54.0 | 432 | 2.7249 | 0.1062 |
| 2.5972 | 55.0 | 440 | 2.7253 | 0.1062 |
| 2.5972 | 56.0 | 448 | 2.6929 | 0.0885 |
| 2.5913 | 57.0 | 456 | 2.7252 | 0.0973 |
| 2.5937 | 58.0 | 464 | 2.7137 | 0.0796 |
| 2.5435 | 59.0 | 472 | 2.7252 | 0.0885 |
| 2.5475 | 60.0 | 480 | 2.7306 | 0.0708 |
| 2.5475 | 61.0 | 488 | 2.7158 | 0.0885 |
| 2.5591 | 62.0 | 496 | 2.7398 | 0.0531 |
| 2.6105 | 63.0 | 504 | 2.7323 | 0.0885 |
| 2.5593 | 64.0 | 512 | 2.7302 | 0.0885 |
| 2.574 | 65.0 | 520 | 2.7228 | 0.0973 |
| 2.574 | 66.0 | 528 | 2.7453 | 0.0885 |
| 2.549 | 67.0 | 536 | 2.7483 | 0.0796 |
| 2.543 | 68.0 | 544 | 2.7298 | 0.0531 |
| 2.5406 | 69.0 | 552 | 2.7341 | 0.0442 |
| 2.5245 | 70.0 | 560 | 2.7785 | 0.0708 |
| 2.5245 | 71.0 | 568 | 2.8005 | 0.0796 |
| 2.5764 | 72.0 | 576 | 2.7709 | 0.0708 |
| 2.529 | 73.0 | 584 | 2.7896 | 0.0796 |
| 2.5398 | 74.0 | 592 | 2.7806 | 0.0708 |
| 2.5436 | 75.0 | 600 | 2.7939 | 0.0796 |
| 2.5436 | 76.0 | 608 | 2.8015 | 0.0708 |
| 2.505 | 77.0 | 616 | 2.7643 | 0.0885 |
| 2.495 | 78.0 | 624 | 2.7971 | 0.0973 |
| 2.5473 | 79.0 | 632 | 2.8064 | 0.0796 |
| 2.5033 | 80.0 | 640 | 2.7837 | 0.0973 |
| 2.5033 | 81.0 | 648 | 2.7731 | 0.0885 |
| 2.5207 | 82.0 | 656 | 2.8325 | 0.0796 |
| 2.4956 | 83.0 | 664 | 2.7837 | 0.0885 |
| 2.494 | 84.0 | 672 | 2.8120 | 0.1150 |
| 2.4778 | 85.0 | 680 | 2.8099 | 0.0885 |
| 2.4778 | 86.0 | 688 | 2.7721 | 0.0885 |
| 2.4767 | 87.0 | 696 | 2.7981 | 0.1062 |
| 2.5287 | 88.0 | 704 | 2.8048 | 0.0531 |
| 2.4601 | 89.0 | 712 | 2.8174 | 0.0531 |
| 2.4073 | 90.0 | 720 | 2.8352 | 0.0708 |
| 2.4073 | 91.0 | 728 | 2.8099 | 0.0708 |
| 2.4156 | 92.0 | 736 | 2.8100 | 0.0973 |
| 2.4669 | 93.0 | 744 | 2.8282 | 0.0796 |
| 2.486 | 94.0 | 752 | 2.8443 | 0.0708 |
| 2.4439 | 95.0 | 760 | 2.8270 | 0.0796 |
| 2.4439 | 96.0 | 768 | 2.8432 | 0.0531 |
| 2.4477 | 97.0 | 776 | 2.8190 | 0.0885 |
| 2.4279 | 98.0 | 784 | 2.8173 | 0.0885 |
| 2.4116 | 99.0 | 792 | 2.8400 | 0.1150 |
| 2.3758 | 100.0 | 800 | 2.8620 | 0.0619 |
| 2.3758 | 101.0 | 808 | 2.8632 | 0.0442 |
| 2.4604 | 102.0 | 816 | 2.8518 | 0.0619 |
| 2.3987 | 103.0 | 824 | 2.8547 | 0.0354 |
| 2.3744 | 104.0 | 832 | 2.8221 | 0.0796 |
| 2.3797 | 105.0 | 840 | 2.8379 | 0.0531 |
| 2.3797 | 106.0 | 848 | 2.8498 | 0.0973 |
| 2.3711 | 107.0 | 856 | 2.8532 | 0.0796 |
| 2.3897 | 108.0 | 864 | 2.8078 | 0.1239 |
| 2.3108 | 109.0 | 872 | 2.8513 | 0.0885 |
| 2.2791 | 110.0 | 880 | 2.8794 | 0.1062 |
| 2.2791 | 111.0 | 888 | 2.8573 | 0.1062 |
| 2.3665 | 112.0 | 896 | 2.8454 | 0.1239 |
| 2.2993 | 113.0 | 904 | 2.8753 | 0.1062 |
| 2.3283 | 114.0 | 912 | 2.9077 | 0.1150 |
| 2.3286 | 115.0 | 920 | 2.9115 | 0.1150 |
| 2.3286 | 116.0 | 928 | 2.9227 | 0.0973 |
| 2.2489 | 117.0 | 936 | 2.9066 | 0.1062 |
| 2.2867 | 118.0 | 944 | 2.8912 | 0.0973 |
| 2.2914 | 119.0 | 952 | 2.9061 | 0.1239 |
| 2.2754 | 120.0 | 960 | 2.9024 | 0.1062 |
| 2.2754 | 121.0 | 968 | 2.9197 | 0.1062 |
| 2.2747 | 122.0 | 976 | 2.9232 | 0.0708 |
| 2.2423 | 123.0 | 984 | 2.9418 | 0.0973 |
| 2.2614 | 124.0 | 992 | 2.9615 | 0.0796 |
| 2.2761 | 125.0 | 1000 | 2.9344 | 0.1062 |
| 2.2761 | 126.0 | 1008 | 2.9412 | 0.0796 |
| 2.2377 | 127.0 | 1016 | 2.9950 | 0.1062 |
| 2.2496 | 128.0 | 1024 | 2.9659 | 0.1062 |
| 2.1857 | 129.0 | 1032 | 2.9819 | 0.1062 |
| 2.2464 | 130.0 | 1040 | 2.9840 | 0.1062 |
| 2.2464 | 131.0 | 1048 | 3.0030 | 0.0708 |
| 2.2035 | 132.0 | 1056 | 3.0148 | 0.0885 |
| 2.2497 | 133.0 | 1064 | 2.9930 | 0.0885 |
| 2.2877 | 134.0 | 1072 | 2.9949 | 0.0796 |
| 2.2939 | 135.0 | 1080 | 2.9780 | 0.0708 |
| 2.2939 | 136.0 | 1088 | 2.9763 | 0.0973 |
| 2.1982 | 137.0 | 1096 | 2.9887 | 0.0973 |
| 2.2871 | 138.0 | 1104 | 3.0152 | 0.0619 |
| 2.1358 | 139.0 | 1112 | 2.9990 | 0.0973 |
| 2.2026 | 140.0 | 1120 | 3.0051 | 0.0708 |
| 2.2026 | 141.0 | 1128 | 3.0082 | 0.0885 |
| 2.2605 | 142.0 | 1136 | 3.0037 | 0.0885 |
| 2.2208 | 143.0 | 1144 | 2.9893 | 0.0796 |
| 2.1251 | 144.0 | 1152 | 3.0048 | 0.0796 |
| 2.2273 | 145.0 | 1160 | 3.0431 | 0.0619 |
| 2.2273 | 146.0 | 1168 | 3.0515 | 0.0708 |
| 2.167 | 147.0 | 1176 | 3.0180 | 0.0708 |
| 2.0854 | 148.0 | 1184 | 3.0075 | 0.1062 |
| 2.1752 | 149.0 | 1192 | 3.0240 | 0.0973 |
| 2.0978 | 150.0 | 1200 | 3.0161 | 0.0885 |
| 2.0978 | 151.0 | 1208 | 3.0221 | 0.0885 |
| 2.2208 | 152.0 | 1216 | 3.0202 | 0.0796 |
| 2.0802 | 153.0 | 1224 | 3.0016 | 0.0708 |
| 2.1248 | 154.0 | 1232 | 3.0306 | 0.0796 |
| 2.0654 | 155.0 | 1240 | 3.0219 | 0.0885 |
| 2.0654 | 156.0 | 1248 | 3.0431 | 0.0885 |
| 2.0384 | 157.0 | 1256 | 2.9748 | 0.1062 |
| 2.24 | 158.0 | 1264 | 3.0428 | 0.1062 |
| 2.1095 | 159.0 | 1272 | 3.0469 | 0.0885 |
| 2.0334 | 160.0 | 1280 | 3.0766 | 0.0708 |
| 2.0334 | 161.0 | 1288 | 3.0622 | 0.0796 |
| 2.0276 | 162.0 | 1296 | 3.0810 | 0.0796 |
| 2.1454 | 163.0 | 1304 | 3.0658 | 0.0973 |
| 2.1115 | 164.0 | 1312 | 3.0959 | 0.0973 |
| 1.9748 | 165.0 | 1320 | 3.0532 | 0.1062 |
| 1.9748 | 166.0 | 1328 | 3.0925 | 0.0708 |
| 2.0136 | 167.0 | 1336 | 3.1021 | 0.0796 |
| 2.0998 | 168.0 | 1344 | 3.0993 | 0.0885 |
| 2.1276 | 169.0 | 1352 | 3.1212 | 0.0708 |
| 2.0504 | 170.0 | 1360 | 3.1114 | 0.0708 |
| 2.0504 | 171.0 | 1368 | 3.1104 | 0.0796 |
| 2.0975 | 172.0 | 1376 | 3.1190 | 0.0885 |
| 2.0216 | 173.0 | 1384 | 3.1361 | 0.0796 |
| 2.0501 | 174.0 | 1392 | 3.1312 | 0.0885 |
| 1.9737 | 175.0 | 1400 | 3.1344 | 0.0973 |
| 1.9737 | 176.0 | 1408 | 3.1300 | 0.0973 |
| 1.9742 | 177.0 | 1416 | 3.1306 | 0.0973 |
| 1.9977 | 178.0 | 1424 | 3.1490 | 0.0885 |
| 1.976 | 179.0 | 1432 | 3.1558 | 0.0973 |
| 2.0002 | 180.0 | 1440 | 3.1514 | 0.0973 |
| 2.0002 | 181.0 | 1448 | 3.1465 | 0.1062 |
| 2.004 | 182.0 | 1456 | 3.1450 | 0.1062 |
| 2.0281 | 183.0 | 1464 | 3.1352 | 0.1062 |
| 2.0258 | 184.0 | 1472 | 3.1503 | 0.1062 |
| 2.0263 | 185.0 | 1480 | 3.1394 | 0.0973 |
| 2.0263 | 186.0 | 1488 | 3.1274 | 0.1062 |
| 1.9759 | 187.0 | 1496 | 3.1319 | 0.1062 |
| 1.961 | 188.0 | 1504 | 3.1519 | 0.1062 |
| 2.0122 | 189.0 | 1512 | 3.1557 | 0.0973 |
| 2.0037 | 190.0 | 1520 | 3.1491 | 0.0885 |
| 2.0037 | 191.0 | 1528 | 3.1503 | 0.0885 |
| 1.9606 | 192.0 | 1536 | 3.1489 | 0.0885 |
| 1.9304 | 193.0 | 1544 | 3.1491 | 0.0885 |
| 2.0565 | 194.0 | 1552 | 3.1515 | 0.0973 |
| 2.0293 | 195.0 | 1560 | 3.1481 | 0.0885 |
| 2.0293 | 196.0 | 1568 | 3.1498 | 0.0885 |
| 2.0206 | 197.0 | 1576 | 3.1510 | 0.0885 |
| 1.9536 | 198.0 | 1584 | 3.1517 | 0.0885 |
| 2.0261 | 199.0 | 1592 | 3.1512 | 0.0885 |
| 2.1627 | 200.0 | 1600 | 3.1513 | 0.0885 |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.2
  • Datasets 2.12.0
  • Tokenizers 0.13.2