---
base_model: Harveenchadha/vakyansh-wav2vec2-bengali-bnm-200
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vakyansh-wav2vec2-bengali-bnm-200-audio-abuse-feature
  results: []
---

# vakyansh-wav2vec2-bengali-bnm-200-audio-abuse-feature

This model is a fine-tuned version of [Harveenchadha/vakyansh-wav2vec2-bengali-bnm-200](https://huggingface.co/Harveenchadha/vakyansh-wav2vec2-bengali-bnm-200) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8024
- Accuracy: 0.6459
- Macro F1-score: 0.6339
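
The training script is not included in the card, so the exact metric code is unknown. Below is a minimal sketch of a `Trainer`-style `compute_metrics` callback, assuming the Accuracy and Macro F1-score above were computed with scikit-learn; the metric keys are illustrative.

```python
# Hypothetical compute_metrics callback for transformers.Trainer.
# Assumes the reported metrics came from scikit-learn; the original
# training script is not part of this card.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, preds),
        "macro_f1-score": f1_score(labels, preds, average="macro"),
    }
```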

## Model description

No model description was provided. Judging from the model name, tags, and base checkpoint, this appears to be the Vakyansh Bengali wav2vec2 acoustic model fine-tuned with a sequence-classification head for detecting abusive speech in audio.

## Intended uses & limitations

More information needed
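
In the absence of documented usage, the sketch below shows one plausible way to run inference, assuming the checkpoint loads through the standard audio-classification auto classes (e.g. `Wav2Vec2ForSequenceClassification`) and expects 16 kHz mono audio; the repo id and audio file name are placeholders.

```python
# Hypothetical inference sketch; the repo id below is a placeholder for
# wherever this fine-tuned checkpoint is hosted.
import librosa
import torch
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "your-namespace/vakyansh-wav2vec2-bengali-bnm-200-audio-abuse-feature"
extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)
model.eval()

# wav2vec2 checkpoints expect 16 kHz mono input.
speech, _ = librosa.load("clip.wav", sr=16_000, mono=True)
inputs = extractor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label[pred_id])
```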

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
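
For reference, the hyperparameters above map onto `transformers.TrainingArguments` roughly as sketched below; `output_dir` and any option not listed in the card are assumptions.

```python
# Approximate reconstruction of the listed hyperparameters; options not
# stated in the card (e.g. output_dir, eval/save strategy) are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vakyansh-wav2vec2-bengali-bnm-200-audio-abuse-feature",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,  # effective train batch size: 16 * 4 = 64
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
)
```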

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro F1-score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------------:|
| 6.7346        | 0.77  | 10   | 6.7291          | 0.0      | 0.0            |
| 6.6926        | 1.54  | 20   | 6.6304          | 0.0027   | 0.0004         |
| 6.5587        | 2.31  | 30   | 6.4243          | 0.5730   | 0.2830         |
| 6.3449        | 3.08  | 40   | 6.1051          | 0.5649   | 0.5571         |
| 6.1232        | 3.85  | 50   | 5.7862          | 0.4216   | 0.3430         |
| 5.8191        | 4.62  | 60   | 5.5131          | 0.4027   | 0.2908         |
| 5.592         | 5.38  | 70   | 5.2719          | 0.5216   | 0.5022         |
| 5.3414        | 6.15  | 80   | 5.0558          | 0.6189   | 0.6186         |
| 5.1331        | 6.92  | 90   | 4.8552          | 0.6865   | 0.6852         |
| 4.98          | 7.69  | 100  | 4.6603          | 0.6568   | 0.6567         |
| 4.7844        | 8.46  | 110  | 4.4634          | 0.6703   | 0.6702         |
| 4.7028        | 9.23  | 120  | 4.2715          | 0.6568   | 0.6567         |
| 4.4476        | 10.0  | 130  | 4.0733          | 0.6297   | 0.6280         |
| 4.2098        | 10.77 | 140  | 3.8749          | 0.6108   | 0.6041         |
| 4.0715        | 11.54 | 150  | 3.6803          | 0.5027   | 0.4564         |
| 3.8545        | 12.31 | 160  | 3.4603          | 0.6649   | 0.6648         |
| 3.708         | 13.08 | 170  | 3.2559          | 0.6541   | 0.6534         |
| 3.4318        | 13.85 | 180  | 3.0493          | 0.6676   | 0.6675         |
| 3.1874        | 14.62 | 190  | 2.8456          | 0.6838   | 0.6837         |
| 3.1887        | 15.38 | 200  | 2.6625          | 0.5595   | 0.5384         |
| 2.8359        | 16.15 | 210  | 2.4679          | 0.5757   | 0.5604         |
| 2.6265        | 16.92 | 220  | 2.2662          | 0.6892   | 0.6841         |
| 2.4536        | 17.69 | 230  | 2.0843          | 0.6649   | 0.6644         |
| 2.2288        | 18.46 | 240  | 1.9218          | 0.6459   | 0.6431         |
| 2.2955        | 19.23 | 250  | 1.7633          | 0.6595   | 0.6578         |
| 1.9739        | 20.0  | 260  | 1.6105          | 0.6730   | 0.6671         |
| 1.8575        | 20.77 | 270  | 1.4855          | 0.6378   | 0.6351         |
| 1.607         | 21.54 | 280  | 1.3582          | 0.6649   | 0.6646         |
| 1.4831        | 22.31 | 290  | 1.2425          | 0.6676   | 0.6646         |
| 1.4484        | 23.08 | 300  | 1.1522          | 0.6703   | 0.6660         |
| 1.2517        | 23.85 | 310  | 1.0688          | 0.6595   | 0.6554         |
| 1.2793        | 24.62 | 320  | 1.0006          | 0.6541   | 0.6523         |
| 1.0722        | 25.38 | 330  | 0.9486          | 0.6568   | 0.6543         |
| 0.9888        | 26.15 | 340  | 0.9292          | 0.6135   | 0.6135         |
| 0.9134        | 26.92 | 350  | 0.8580          | 0.6514   | 0.6492         |
| 0.9208        | 27.69 | 360  | 0.8352          | 0.6649   | 0.6646         |
| 0.966         | 28.46 | 370  | 0.8220          | 0.6162   | 0.6160         |
| 0.8746        | 29.23 | 380  | 0.8064          | 0.6568   | 0.6420         |
| 0.8619        | 30.0  | 390  | 0.7856          | 0.6405   | 0.5942         |
| 0.841         | 30.77 | 400  | 0.7612          | 0.6459   | 0.6020         |
| 0.7629        | 31.54 | 410  | 0.7441          | 0.6459   | 0.6434         |
| 0.6736        | 32.31 | 420  | 0.7610          | 0.6568   | 0.6562         |
| 0.6579        | 33.08 | 430  | 0.7624          | 0.6514   | 0.6456         |
| 0.7514        | 33.85 | 440  | 0.7374          | 0.6649   | 0.6467         |
| 0.6579        | 34.62 | 450  | 0.7503          | 0.6541   | 0.6471         |
| 0.6864        | 35.38 | 460  | 0.8286          | 0.5892   | 0.5889         |
| 0.6863        | 36.15 | 470  | 0.7393          | 0.6541   | 0.6396         |
| 0.6224        | 36.92 | 480  | 0.7427          | 0.6541   | 0.6507         |
| 0.6255        | 37.69 | 490  | 0.7495          | 0.6405   | 0.6268         |
| 0.5295        | 38.46 | 500  | 0.7787          | 0.6486   | 0.6385         |
| 0.5549        | 39.23 | 510  | 0.7909          | 0.6378   | 0.6360         |
| 0.5752        | 40.0  | 520  | 0.7631          | 0.6459   | 0.6361         |
| 0.616         | 40.77 | 530  | 0.7636          | 0.6432   | 0.6390         |
| 0.5038        | 41.54 | 540  | 0.7847          | 0.6514   | 0.6372         |
| 0.5935        | 42.31 | 550  | 0.7837          | 0.6595   | 0.6461         |
| 0.5453        | 43.08 | 560  | 0.7804          | 0.6405   | 0.6330         |
| 0.5378        | 43.85 | 570  | 0.7928          | 0.6514   | 0.6338         |
| 0.4852        | 44.62 | 580  | 0.8249          | 0.6324   | 0.6285         |
| 0.5198        | 45.38 | 590  | 0.8065          | 0.6459   | 0.6186         |
| 0.5067        | 46.15 | 600  | 0.8210          | 0.6162   | 0.6107         |
| 0.5533        | 46.92 | 610  | 0.8053          | 0.6432   | 0.6300         |
| 0.6282        | 47.69 | 620  | 0.7970          | 0.6459   | 0.6316         |
| 0.5617        | 48.46 | 630  | 0.8095          | 0.6243   | 0.6165         |
| 0.5016        | 49.23 | 640  | 0.8038          | 0.6378   | 0.6274         |
| 0.467         | 50.0  | 650  | 0.8024          | 0.6459   | 0.6339         |


### Framework versions

- Transformers 4.33.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3