---
base_model: Harveenchadha/vakyansh-wav2vec2-odia-orm-100
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vakyansh-wav2vec2-odia-orm-100-audio-abuse-feature
  results: []
---

# vakyansh-wav2vec2-odia-orm-100-audio-abuse-feature

This model is a fine-tuned version of [Harveenchadha/vakyansh-wav2vec2-odia-orm-100](https://huggingface.co/Harveenchadha/vakyansh-wav2vec2-odia-orm-100) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7299
- Accuracy: 0.7014
- Macro F1-score: 0.6792
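
As a minimal inference sketch (assuming the checkpoint exposes a standard audio-classification head loadable via `AutoModelForAudioClassification` and expects 16 kHz mono audio; the repo id and audio file name below are placeholders, not part of the original card):

```python
import torch
import librosa
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

# Placeholder repo id; replace with the actual location of this fine-tuned checkpoint.
model_id = "your-namespace/vakyansh-wav2vec2-odia-orm-100-audio-abuse-feature"

feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)
model.eval()

# Load a clip and resample to the rate the feature extractor expects (typically 16 kHz).
speech, _ = librosa.load("clip.wav", sr=feature_extractor.sampling_rate)

inputs = feature_extractor(speech, sampling_rate=feature_extractor.sampling_rate, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_id, predicted_id))
```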

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
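
The listed settings map onto `transformers.TrainingArguments` roughly as sketched below (the `output_dir` is a placeholder; the Adam betas and epsilon above are the library defaults, so they are not set explicitly):

```python
from transformers import TrainingArguments

# Hedged reproduction sketch; only the hyperparameters reported above are set.
training_args = TrainingArguments(
    output_dir="vakyansh-wav2vec2-odia-orm-100-audio-abuse-feature",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # effective train batch size: 16 * 4 = 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```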

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro F1-score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------------:|
| 6.7078 | 0.78 | 10 | 6.6948 | 0.0 | 0.0 |
| 6.6539 | 1.57 | 20 | 6.5580 | 0.2 | 0.0342 |
| 6.5111 | 2.35 | 30 | 6.3377 | 0.5726 | 0.3641 |
| 6.268 | 3.14 | 40 | 6.0361 | 0.5726 | 0.3641 |
| 6.0748 | 3.92 | 50 | 5.7417 | 0.5726 | 0.3641 |
| 5.8205 | 4.71 | 60 | 5.4985 | 0.5726 | 0.3641 |
| 5.6051 | 5.49 | 70 | 5.2743 | 0.5726 | 0.3641 |
| 5.3589 | 6.27 | 80 | 5.0823 | 0.5726 | 0.3641 |
| 5.2019 | 7.06 | 90 | 4.8953 | 0.5726 | 0.3641 |
| 5.0528 | 7.84 | 100 | 4.7077 | 0.5726 | 0.3641 |
| 4.868 | 8.63 | 110 | 4.5244 | 0.5726 | 0.3641 |
| 4.7081 | 9.41 | 120 | 4.3347 | 0.5726 | 0.3641 |
| 4.437 | 10.2 | 130 | 4.1455 | 0.5726 | 0.3641 |
| 4.3225 | 10.98 | 140 | 3.9551 | 0.5726 | 0.3641 |
| 4.0945 | 11.76 | 150 | 3.7694 | 0.5726 | 0.3641 |
| 4.014 | 12.55 | 160 | 3.5710 | 0.5726 | 0.3641 |
| 3.8491 | 13.33 | 170 | 3.3814 | 0.5726 | 0.3641 |
| 3.4724 | 14.12 | 180 | 3.1873 | 0.5726 | 0.3641 |
| 3.2728 | 14.9 | 190 | 2.9999 | 0.5726 | 0.3641 |
| 3.1948 | 15.69 | 200 | 2.8224 | 0.5726 | 0.3641 |
| 2.9968 | 16.47 | 210 | 2.6368 | 0.5726 | 0.3641 |
| 2.6739 | 17.25 | 220 | 2.4462 | 0.5726 | 0.3641 |
| 2.561 | 18.04 | 230 | 2.2871 | 0.5726 | 0.3641 |
| 2.5101 | 18.82 | 240 | 2.1260 | 0.5726 | 0.3641 |
| 2.3307 | 19.61 | 250 | 1.9620 | 0.5726 | 0.3641 |
| 2.1022 | 20.39 | 260 | 1.8260 | 0.5726 | 0.3641 |
| 1.9909 | 21.18 | 270 | 1.6933 | 0.5726 | 0.3641 |
| 1.766 | 21.96 | 280 | 1.5644 | 0.5726 | 0.3641 |
| 1.7143 | 22.75 | 290 | 1.4669 | 0.5726 | 0.3641 |
| 1.5073 | 23.53 | 300 | 1.3482 | 0.5726 | 0.3641 |
| 1.6055 | 24.31 | 310 | 1.2643 | 0.5726 | 0.3641 |
| 1.321 | 25.1 | 320 | 1.1930 | 0.5726 | 0.3641 |
| 1.2165 | 25.88 | 330 | 1.1128 | 0.5726 | 0.3641 |
| 1.1484 | 26.67 | 340 | 1.0493 | 0.6712 | 0.6033 |
| 1.1413 | 27.45 | 350 | 0.9925 | 0.7096 | 0.6737 |
| 1.0462 | 28.24 | 360 | 0.9471 | 0.6877 | 0.6190 |
| 0.9667 | 29.02 | 370 | 0.9209 | 0.7123 | 0.6869 |
| 0.9918 | 29.8 | 380 | 0.8892 | 0.7205 | 0.6953 |
| 0.9112 | 30.59 | 390 | 0.8414 | 0.7123 | 0.6705 |
| 0.8666 | 31.37 | 400 | 0.8291 | 0.7123 | 0.6836 |
| 0.8096 | 32.16 | 410 | 0.8284 | 0.6959 | 0.6501 |
| 0.7987 | 32.94 | 420 | 0.7729 | 0.7425 | 0.7270 |
| 0.7529 | 33.73 | 430 | 0.7542 | 0.7260 | 0.7023 |
| 0.7605 | 34.51 | 440 | 0.7535 | 0.7260 | 0.7043 |
| 0.7011 | 35.29 | 450 | 0.7882 | 0.6959 | 0.6891 |
| 0.6868 | 36.08 | 460 | 0.7378 | 0.7260 | 0.7013 |
| 0.6858 | 36.86 | 470 | 0.7518 | 0.7096 | 0.6865 |
| 0.7546 | 37.65 | 480 | 0.7163 | 0.7342 | 0.7108 |
| 0.6717 | 38.43 | 490 | 0.7158 | 0.7397 | 0.7158 |
| 0.7048 | 39.22 | 500 | 0.7755 | 0.6575 | 0.6487 |
| 0.6767 | 40.0 | 510 | 0.7469 | 0.7068 | 0.6798 |
| 0.6621 | 40.78 | 520 | 0.7166 | 0.7205 | 0.7020 |
| 0.6639 | 41.57 | 530 | 0.7143 | 0.7151 | 0.6934 |
| 0.5988 | 42.35 | 540 | 0.7547 | 0.6767 | 0.6661 |
| 0.6179 | 43.14 | 550 | 0.7394 | 0.7014 | 0.6820 |
| 0.7033 | 43.92 | 560 | 0.7312 | 0.6986 | 0.6757 |
| 0.6076 | 44.71 | 570 | 0.7331 | 0.6904 | 0.6674 |
| 0.602 | 45.49 | 580 | 0.7341 | 0.6932 | 0.6718 |
| 0.545 | 46.27 | 590 | 0.7363 | 0.6932 | 0.6738 |
| 0.5881 | 47.06 | 600 | 0.7299 | 0.7014 | 0.6792 |
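
For reference, a `compute_metrics` function along these lines (a sketch assuming single-label classification; not taken from the original training code) would produce the Accuracy and Macro F1-score columns above:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    # eval_pred is the EvalPrediction the Trainer passes in: model logits plus reference labels.
    predictions = np.argmax(eval_pred.predictions, axis=-1)
    labels = eval_pred.label_ids
    return {
        "accuracy": accuracy_score(labels, predictions),
        "macro_f1": f1_score(labels, predictions, average="macro"),
    }
```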

### Framework versions

- Transformers 4.33.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3