---
license: cc-by-sa-4.0
tags:
- generated_from_trainer
datasets:
- common_voice
metrics:
- wer
model-index:
- name: wav2vec2-detect-toxic-th
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: common_voice
      type: common_voice
      config: th
      split: validation
      args: th
    metrics:
    - name: Wer
      type: wer
      value: 0.4550641940085592
---


# wav2vec2-detect-toxic-th

This model is a fine-tuned version of [airesearch/wav2vec2-large-xlsr-53-th](https://huggingface.co/airesearch/wav2vec2-large-xlsr-53-th) on the Thai (`th`) configuration of the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5867
- Wer: 0.4551

## Model description

This is a CTC-based wav2vec2 model for Thai automatic speech recognition, obtained by fine-tuning [airesearch/wav2vec2-large-xlsr-53-th](https://huggingface.co/airesearch/wav2vec2-large-xlsr-53-th) (a Thai fine-tune of XLSR-53) on Common Voice Thai. The model name suggests it is meant to serve as the transcription stage of a Thai toxic-speech detection pipeline, but the card records no further details about that downstream task.

## Intended uses & limitations

The model transcribes 16 kHz Thai speech to text. With a validation WER of roughly 0.46, transcriptions contain frequent word-level errors, so any downstream use (such as the keyword-based toxicity screening implied by the model name) should be robust to noisy output. It has only been evaluated on the Common Voice `th` validation split; performance on other audio domains is unknown.
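
A minimal inference sketch with 🤗 Transformers follows. The Hub repo id and the `example.wav` path are placeholders, and the 16 kHz resampling step reflects the input rate wav2vec2 models expect:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Load the fine-tuned model and its processor.
# "wav2vec2-detect-toxic-th" is a placeholder repo id; prepend the owner's
# namespace when loading from the Hugging Face Hub.
processor = Wav2Vec2Processor.from_pretrained("wav2vec2-detect-toxic-th")
model = Wav2Vec2ForCTC.from_pretrained("wav2vec2-detect-toxic-th")
model.eval()

# Read an audio file (placeholder path) and resample to 16 kHz if needed.
waveform, sample_rate = torchaudio.load("example.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

# CTC inference with greedy decoding of the logits.
inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```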

## Training and evaluation data

The model was fine-tuned and evaluated on the Thai (`th`) configuration of the common_voice dataset; the validation split was used for evaluation, as recorded in the model index above.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 30
- num_epochs: 100
- mixed_precision_training: Native AMP
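
These settings map directly onto `TrainingArguments`; a minimal sketch is shown below. The output directory and the evaluation/logging cadence are assumptions (the 100-step evaluation interval is inferred from the results table):

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="wav2vec2-detect-toxic-th",
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=30,
    num_train_epochs=100,
    fp16=True,                    # "Native AMP" mixed-precision training
    evaluation_strategy="steps",  # assumed: the table reports eval every 100 steps
    eval_steps=100,
    logging_steps=100,            # assumed logging cadence
)
```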

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 5.4333        | 3.23  | 100  | 3.3662          | 1.0    |
| 3.3254        | 6.45  | 200  | 3.2575          | 1.0    |
| 2.5091        | 9.68  | 300  | 1.2965          | 0.5571 |
| 1.1749        | 12.9  | 400  | 1.0687          | 0.5464 |
| 0.9091        | 16.13 | 500  | 1.0564          | 0.4872 |
| 0.756         | 19.35 | 600  | 1.0998          | 0.4757 |
| 0.6527        | 22.58 | 700  | 1.1492          | 0.4829 |
| 0.5879        | 25.81 | 800  | 1.1916          | 0.4786 |
| 0.5184        | 29.03 | 900  | 1.2662          | 0.4815 |
| 0.4688        | 32.26 | 1000 | 1.2109          | 0.4864 |
| 0.4587        | 35.48 | 1100 | 1.3144          | 0.4722 |
| 0.4005        | 38.71 | 1200 | 1.3111          | 0.4686 |
| 0.3851        | 41.94 | 1300 | 1.3420          | 0.4786 |
| 0.3563        | 45.16 | 1400 | 1.3679          | 0.4743 |
| 0.3591        | 48.39 | 1500 | 1.4444          | 0.4643 |
| 0.325         | 51.61 | 1600 | 1.4076          | 0.4722 |
| 0.3409        | 54.84 | 1700 | 1.4586          | 0.4629 |
| 0.3019        | 58.06 | 1800 | 1.4579          | 0.4529 |
| 0.292         | 61.29 | 1900 | 1.4887          | 0.4522 |
| 0.2729        | 64.52 | 2000 | 1.4966          | 0.4608 |
| 0.2656        | 67.74 | 2100 | 1.5232          | 0.4593 |
| 0.2575        | 70.97 | 2200 | 1.4984          | 0.4508 |
| 0.2532        | 74.19 | 2300 | 1.5332          | 0.4544 |
| 0.2474        | 77.42 | 2400 | 1.5301          | 0.4529 |
| 0.2539        | 80.65 | 2500 | 1.5214          | 0.4601 |
| 0.2526        | 83.87 | 2600 | 1.5413          | 0.4572 |
| 0.2601        | 87.1  | 2700 | 1.5553          | 0.4608 |
| 0.2315        | 90.32 | 2800 | 1.5768          | 0.4515 |
| 0.2477        | 93.55 | 2900 | 1.5787          | 0.4650 |
| 0.2363        | 96.77 | 3000 | 1.5900          | 0.4565 |
| 0.242         | 100.0 | 3100 | 1.5867          | 0.4551 |
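The Wer column is the word error rate (substitutions + insertions + deletions, divided by the number of reference words) on the validation split. As a reference, it can be reproduced from decoded transcriptions with the `evaluate` library (the strings below are illustrative only, not real model output):

```python
import evaluate

wer_metric = evaluate.load("wer")  # uses the jiwer backend

# Illustrative hypothesis/reference pair: one substitution over five words.
predictions = ["this is a sample transcription"]
references = ["this is the sample transcription"]
print(wer_metric.compute(predictions=predictions, references=references))  # 0.2
```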


### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 1.16.1
- Tokenizers 0.13.3