DRAGOO committed
Commit f87a4fd · Parent: ff85c7f

update model card README.md

Files changed (1): README.md (added, +112 lines)
 
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: assis
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# assis

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3440
- Wer: 1

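The card does not include a usage snippet. The sketch below shows one plausible way to run inference with the `transformers` API; the repository id `DRAGOO/assis` is an assumption (the card only gives the model name `assis`), and it presumes the checkpoint ships its own processor and vocabulary files.

```python
# Minimal inference sketch (not from the original card). "DRAGOO/assis" is an
# assumed repository id; replace it with the actual checkpoint location.
import numpy as np
import torch
from transformers import AutoProcessor, Wav2Vec2ForCTC

model_id = "DRAGOO/assis"  # placeholder repo id

processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-base expects 16 kHz mono float audio; one second of silence
# stands in for a real recording here.
speech = np.zeros(16000, dtype=np.float32)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```
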
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 3000
- training_steps: 5000
- mixed_precision_training: Native AMP

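As a rough reference only, the hyperparameters above map approximately onto the following `transformers.TrainingArguments`. This is a reconstruction, not the original training script; the output directory is a placeholder, and the evaluation/logging cadence is inferred from the results table below (one evaluation every 100 steps).

```python
# Approximate TrainingArguments matching the list above (transformers 4.28).
# Reconstruction only; the original training script is not part of this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-assis",  # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,     # 8 x 2 = total train batch size of 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=3000,
    max_steps=5000,
    fp16=True,                         # "Native AMP" mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",       # assumed: results are logged every 100 steps
    eval_steps=100,
    logging_steps=100,
)
```
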
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 16.8292 | 1.56 | 100 | 16.7197 | 1 |
| 15.1534 | 3.12 | 200 | 14.3410 | 1 |
| 10.7755 | 4.69 | 300 | 9.9820 | 1 |
| 6.4859 | 6.25 | 400 | 6.1913 | 1 |
| 4.0464 | 7.81 | 500 | 3.8280 | 1 |
| 3.3418 | 9.38 | 600 | 3.2733 | 1 |
| 3.217 | 10.94 | 700 | 3.1409 | 1 |
| 3.0927 | 12.5 | 800 | 3.0469 | 1 |
| 3.0235 | 14.06 | 900 | 3.0015 | 1 |
| 2.9902 | 15.62 | 1000 | 2.9748 | 1 |
| 2.945 | 17.19 | 1100 | 2.9550 | 1 |
| 2.9293 | 18.75 | 1200 | 2.9262 | 1 |
| 2.9139 | 20.31 | 1300 | 2.9230 | 1 |
| 2.9084 | 21.88 | 1400 | 2.9067 | 1 |
| 2.8941 | 23.44 | 1500 | 2.9077 | 1 |
| 2.8883 | 25.0 | 1600 | 2.8858 | 1 |
| 2.872 | 26.56 | 1700 | 2.8709 | 1 |
| 2.8641 | 28.12 | 1800 | 2.8587 | 1 |
| 2.8548 | 29.69 | 1900 | 2.8537 | 1 |
| 2.8396 | 31.25 | 2000 | 2.8371 | 1 |
| 2.7043 | 32.81 | 2100 | 2.6063 | 1 |
| 2.3905 | 34.38 | 2200 | 2.2233 | 1 |
| 1.9862 | 35.94 | 2300 | 1.7478 | 1 |
| 1.5463 | 37.5 | 2400 | 1.3176 | 1 |
| 1.218 | 39.06 | 2500 | 0.9948 | 1 |
| 0.9606 | 40.62 | 2600 | 0.7820 | 1 |
| 0.7923 | 42.19 | 2700 | 0.6577 | 1 |
| 0.6811 | 43.75 | 2800 | 0.5650 | 1 |
| 0.5927 | 45.31 | 2900 | 0.5204 | 1 |
| 0.5449 | 46.88 | 3000 | 0.4857 | 1 |
| 0.4876 | 48.44 | 3100 | 0.4526 | 1 |
| 0.4646 | 50.0 | 3200 | 0.4281 | 1 |
| 0.4374 | 51.56 | 3300 | 0.4376 | 1 |
| 0.3952 | 53.12 | 3400 | 0.4075 | 1 |
| 0.3952 | 54.69 | 3500 | 0.3937 | 1 |
| 0.3558 | 56.25 | 3600 | 0.3875 | 1 |
| 0.3527 | 57.81 | 3700 | 0.3775 | 1 |
| 0.3349 | 59.38 | 3800 | 0.3701 | 1 |
| 0.3264 | 60.94 | 3900 | 0.3576 | 1 |
| 0.3108 | 62.5 | 4000 | 0.3644 | 1 |
| 0.3104 | 64.06 | 4100 | 0.3548 | 1 |
| 0.3012 | 65.62 | 4200 | 0.3510 | 1 |
| 0.3027 | 67.19 | 4300 | 0.3486 | 1 |
| 0.2967 | 68.75 | 4400 | 0.3431 | 1 |
| 0.2892 | 70.31 | 4500 | 0.3391 | 1 |
| 0.296 | 71.88 | 4600 | 0.3427 | 1 |
| 0.2821 | 73.44 | 4700 | 0.3469 | 1 |
| 0.2701 | 75.0 | 4800 | 0.3428 | 1 |
| 0.2825 | 76.56 | 4900 | 0.3426 | 1 |
| 0.2549 | 78.12 | 5000 | 0.3440 | 1 |

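The card does not state how the Wer column was computed. A common choice with the `transformers` Trainer is the `evaluate` library's `wer` metric, sketched below with illustrative strings that are not taken from this model's data.

```python
# Hedged example of computing word error rate with the `evaluate` library.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["hello world"]        # decoded model output (illustrative)
references = ["hello there world"]   # ground-truth transcript (illustrative)
print(wer_metric.compute(predictions=predictions, references=references))  # ~0.33
```
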
### Framework versions

- Transformers 4.28.0
- PyTorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
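
A quick way to check that a local environment matches the versions pinned above (this prints installed versions only; installation itself is not covered by this card):

```python
# Print installed versions to compare against the list above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.28.0
print("PyTorch:", torch.__version__)              # expected 2.0.1+cu117
print("Datasets:", datasets.__version__)          # expected 2.12.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.13.3
```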