---
base_model: Harveenchadha/vakyansh-wav2vec2-malayalam-mlm-8
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vakyansh-wav2vec2-malayalam-mlm-8-audio-abuse-feature
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vakyansh-wav2vec2-malayalam-mlm-8-audio-abuse-feature

This model is a fine-tuned version of [Harveenchadha/vakyansh-wav2vec2-malayalam-mlm-8](https://huggingface.co/Harveenchadha/vakyansh-wav2vec2-malayalam-mlm-8) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5558
- Accuracy: 0.8118
- Macro F1-score: 0.7576
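
The macro F1-score reported above is the unweighted mean of the per-class F1 scores, so each class counts equally regardless of how many examples it has. A minimal pure-Python sketch (the toy labels below are hypothetical, not from this model's evaluation set):

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: unweighted mean of per-class F1 scores."""
    labels = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)

# Toy binary example: class 0 gets F1 = 0.8, class 1 gets F1 = 0.8
print(macro_f1([0, 0, 0, 1, 1], [0, 0, 1, 1, 1]))  # → 0.8
```

Because it weights classes equally, macro F1 sits below plain accuracy here (0.7576 vs. 0.8118), which is typical when the label distribution is imbalanced.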

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
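
The derived values in this list follow directly from the base hyperparameters; a short sketch of the arithmetic (the total-step count of 650 is read off the final row of the results table below):

```python
# Effective (total) train batch size = per-device batch size x accumulation steps
train_batch_size = 16
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # → 64

# Linear warmup length from the warmup ratio and the total optimizer steps
warmup_ratio = 0.1
total_steps = 650  # final step logged in the training results table
warmup_steps = int(warmup_ratio * total_steps)
print(warmup_steps)  # → 65
```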

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Macro F1-score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------------:|
| 6.7625        | 0.77  | 10   | 6.7667          | 0.0      | 0.0            |
| 6.6935        | 1.54  | 20   | 6.5968          | 0.2527   | 0.0191         |
| 6.5445        | 2.31  | 30   | 6.3912          | 0.6909   | 0.4086         |
| 6.338         | 3.08  | 40   | 6.0995          | 0.6909   | 0.4086         |
| 6.1044        | 3.85  | 50   | 5.7554          | 0.6909   | 0.4086         |
| 5.755         | 4.62  | 60   | 5.4099          | 0.6909   | 0.4086         |
| 5.4695        | 5.38  | 70   | 5.1276          | 0.6909   | 0.4086         |
| 5.1723        | 6.15  | 80   | 4.8795          | 0.6909   | 0.4086         |
| 4.9648        | 6.92  | 90   | 4.6629          | 0.6909   | 0.4086         |
| 4.7414        | 7.69  | 100  | 4.4481          | 0.6909   | 0.4086         |
| 4.5793        | 8.46  | 110  | 4.2398          | 0.6909   | 0.4086         |
| 4.4455        | 9.23  | 120  | 4.0339          | 0.6909   | 0.4086         |
| 4.2287        | 10.0  | 130  | 3.8247          | 0.6909   | 0.4086         |
| 3.9367        | 10.77 | 140  | 3.6164          | 0.6909   | 0.4086         |
| 3.7916        | 11.54 | 150  | 3.4090          | 0.6909   | 0.4086         |
| 3.6112        | 12.31 | 160  | 3.2043          | 0.6909   | 0.4086         |
| 3.408         | 13.08 | 170  | 3.0023          | 0.6909   | 0.4086         |
| 3.1359        | 13.85 | 180  | 2.8029          | 0.6909   | 0.4086         |
| 2.9607        | 14.62 | 190  | 2.6125          | 0.6909   | 0.4086         |
| 2.83          | 15.38 | 200  | 2.4336          | 0.6909   | 0.4086         |
| 2.4853        | 16.15 | 210  | 2.2649          | 0.6909   | 0.4086         |
| 2.3841        | 16.92 | 220  | 2.1059          | 0.6909   | 0.4086         |
| 2.2296        | 17.69 | 230  | 1.9583          | 0.6909   | 0.4086         |
| 1.9631        | 18.46 | 240  | 1.8302          | 0.6909   | 0.4086         |
| 2.0456        | 19.23 | 250  | 1.7146          | 0.6909   | 0.4086         |
| 1.8406        | 20.0  | 260  | 1.6100          | 0.6909   | 0.4086         |
| 1.7127        | 20.77 | 270  | 1.5130          | 0.6909   | 0.4086         |
| 1.5241        | 21.54 | 280  | 1.4264          | 0.6909   | 0.4086         |
| 1.4366        | 22.31 | 290  | 1.3458          | 0.6909   | 0.4086         |
| 1.4368        | 23.08 | 300  | 1.2710          | 0.6909   | 0.4086         |
| 1.2664        | 23.85 | 310  | 1.2024          | 0.6909   | 0.4086         |
| 1.2681        | 24.62 | 320  | 1.1391          | 0.6909   | 0.4086         |
| 1.1518        | 25.38 | 330  | 1.0791          | 0.6909   | 0.4086         |
| 1.0681        | 26.15 | 340  | 1.0221          | 0.6909   | 0.4086         |
| 1.014         | 26.92 | 350  | 0.9679          | 0.6909   | 0.4086         |
| 0.9918        | 27.69 | 360  | 0.9197          | 0.6909   | 0.4086         |
| 1.0046        | 28.46 | 370  | 0.8839          | 0.6909   | 0.4086         |
| 0.9582        | 29.23 | 380  | 0.8422          | 0.6909   | 0.4086         |
| 0.927         | 30.0  | 390  | 0.8017          | 0.6909   | 0.4086         |
| 0.8853        | 30.77 | 400  | 0.7666          | 0.6909   | 0.4086         |
| 0.7872        | 31.54 | 410  | 0.7353          | 0.6909   | 0.4086         |
| 0.7773        | 32.31 | 420  | 0.7032          | 0.6909   | 0.4086         |
| 0.7163        | 33.08 | 430  | 0.6929          | 0.6909   | 0.4086         |
| 0.7686        | 33.85 | 440  | 0.6617          | 0.6909   | 0.4086         |
| 0.7504        | 34.62 | 450  | 0.6623          | 0.6909   | 0.4086         |
| 0.7491        | 35.38 | 460  | 0.6333          | 0.6909   | 0.4086         |
| 0.6688        | 36.15 | 470  | 0.6115          | 0.6962   | 0.4348         |
| 0.6785        | 36.92 | 480  | 0.5968          | 0.6909   | 0.4086         |
| 0.6511        | 37.69 | 490  | 0.5879          | 0.6909   | 0.4086         |
| 0.5906        | 38.46 | 500  | 0.5855          | 0.8253   | 0.7679         |
| 0.6           | 39.23 | 510  | 0.5837          | 0.8065   | 0.7299         |
| 0.604         | 40.0  | 520  | 0.5683          | 0.8226   | 0.7699         |
| 0.6269        | 40.77 | 530  | 0.5697          | 0.8065   | 0.7362         |
| 0.5643        | 41.54 | 540  | 0.5628          | 0.8199   | 0.7687         |
| 0.6269        | 42.31 | 550  | 0.5650          | 0.8145   | 0.7570         |
| 0.5965        | 43.08 | 560  | 0.5666          | 0.8065   | 0.7473         |
| 0.5578        | 43.85 | 570  | 0.5683          | 0.8065   | 0.7401         |
| 0.5571        | 44.62 | 580  | 0.5607          | 0.8172   | 0.7690         |
| 0.5511        | 45.38 | 590  | 0.5566          | 0.8145   | 0.7618         |
| 0.5404        | 46.15 | 600  | 0.5587          | 0.8091   | 0.7482         |
| 0.5708        | 46.92 | 610  | 0.5541          | 0.8172   | 0.7660         |
| 0.62          | 47.69 | 620  | 0.5524          | 0.8145   | 0.7618         |
| 0.6095        | 48.46 | 630  | 0.5573          | 0.8065   | 0.7438         |
| 0.5282        | 49.23 | 640  | 0.5559          | 0.8145   | 0.7586         |
| 0.5307        | 50.0  | 650  | 0.5558          | 0.8118   | 0.7576         |


### Framework versions

- Transformers 4.33.0
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3