qmeeus committed
Commit: d7727b9
1 Parent(s): e28c0d6

update model card README.md

Files changed (1)
  1. README.md +104 -0
README.md ADDED
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- sacrebleu
- wer
model-index:
- name: la-whisper-small-covost2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# la-whisper-small-covost2

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset (the model name suggests CoVoST 2).
It achieves the following results on the evaluation set:
- Loss: 1.5845
- Sacrebleu: 2090.6716
- Wer: 73.0006

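The card reports SacreBLEU alongside WER, so the checkpoint is evaluated as a sequence-to-sequence speech model. A minimal loading sketch, assuming the weights are published on the Hub as `qmeeus/la-whisper-small-covost2` (a hypothetical repo id inferred from the commit author and model name) and that the repo ships the standard Whisper processor files:

```python
import numpy as np
import torch
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "qmeeus/la-whisper-small-covost2"  # assumption: Hub repo id inferred from author + model name

# Assumption: the fine-tuned repo includes the usual Whisper processor files.
# If it does not, fall back to the base checkpoint's processor:
# processor = WhisperProcessor.from_pretrained("openai/whisper-small")
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)
model.eval()

# Whisper expects 16 kHz mono audio; this placeholder is one second of silence.
audio = np.zeros(16_000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    generated_ids = model.generate(inputs.input_features, max_new_tokens=128)

print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```
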
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- training_steps: 2000
- mixed_precision_training: Native AMP

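These values line up with standard `transformers` `Seq2SeqTrainingArguments` fields. A reconstruction sketch under that assumption (the training script itself is not part of this repository; `output_dir` and the 50-step evaluation interval are inferred from the results table below, not documented):

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: the reported hyperparameters expressed as Seq2SeqTrainingArguments.
# Adam betas/epsilon match the library defaults, so they are not set explicitly.
training_args = Seq2SeqTrainingArguments(
    output_dir="./la-whisper-small-covost2",  # assumption
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=16,   # effective train batch size: 2 * 16 = 32
    lr_scheduler_type="linear",
    warmup_steps=100,
    max_steps=2000,
    seed=42,
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=50,                    # inferred from the 50-step eval cadence below
    predict_with_generate=True,       # required to compute SacreBLEU/WER at eval time
)
```
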
### Training results

| Training Loss | Epoch | Step | Validation Loss | Sacrebleu | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:--------:|
| 2.6943 | 0.11 | 50 | 2.1667 | 118.2640 | 686.6897 |
| 1.5505 | 0.23 | 100 | 1.6016 | 259.9307 | 165.6116 |
| 1.4093 | 0.34 | 150 | 1.5858 | 496.7335 | 197.0106 |
| 1.3209 | 0.45 | 200 | 1.5648 | 724.2491 | 121.8795 |
| 1.2941 | 0.56 | 250 | 1.5596 | 820.1241 | 161.7159 |
| 1.2078 | 0.68 | 300 | 1.5074 | 1022.0043 | 140.3875 |
| 1.1532 | 0.79 | 350 | 1.4972 | 174.8350 | 610.3716 |
| 1.0167 | 0.9 | 400 | 1.4551 | 1904.0921 | 82.7635 |
| 0.8842 | 1.01 | 450 | 1.4296 | 1883.6113 | 81.3906 |
| 0.5619 | 1.13 | 500 | 1.4333 | 1817.9440 | 84.9312 |
| 0.5523 | 1.24 | 550 | 1.4237 | 1517.1744 | 104.0918 |
| 0.4881 | 1.35 | 600 | 1.4413 | 1650.1807 | 97.2067 |
| 0.471 | 1.46 | 650 | 1.3961 | 1885.0014 | 82.2664 |
| 0.4412 | 1.58 | 700 | 1.3986 | 2145.9786 | 72.0469 |
| 0.4625 | 1.69 | 750 | 1.3885 | 1837.7812 | 87.4472 |
| 0.4195 | 1.8 | 800 | 1.4095 | 1909.2655 | 78.6920 |
| 0.4532 | 1.91 | 850 | 1.3891 | 1925.2238 | 82.0162 |
| 0.3201 | 2.03 | 900 | 1.4415 | 1919.2020 | 80.4437 |
| 0.1955 | 2.14 | 950 | 1.4410 | 1540.5046 | 101.0145 |
| 0.2111 | 2.25 | 1000 | 1.4345 | 1735.9648 | 90.9269 |
| 0.1981 | 2.36 | 1050 | 1.4597 | 1730.3250 | 91.5356 |
| 0.2052 | 2.48 | 1100 | 1.4439 | 2143.3630 | 72.4933 |
| 0.1886 | 2.59 | 1150 | 1.4702 | 1965.5005 | 77.7519 |
| 0.1918 | 2.7 | 1200 | 1.4518 | 2057.4517 | 75.4929 |
| 0.1755 | 2.81 | 1250 | 1.4788 | 1954.2237 | 78.2997 |
| 0.1769 | 2.93 | 1300 | 1.4588 | 1774.1464 | 91.9279 |
| 0.1104 | 3.04 | 1350 | 1.5281 | 1838.1999 | 84.7317 |
| 0.0718 | 3.15 | 1400 | 1.5133 | 2058.0955 | 76.0306 |
| 0.0855 | 3.26 | 1450 | 1.5271 | 1720.1072 | 89.1346 |
| 0.0717 | 3.38 | 1500 | 1.5289 | 2007.5163 | 75.9291 |
| 0.0707 | 3.49 | 1550 | 1.5366 | 2149.6478 | 71.9523 |
| 0.0704 | 3.6 | 1600 | 1.5355 | 2179.5147 | 69.8759 |
| 0.0676 | 3.71 | 1650 | 1.5393 | 2086.2197 | 73.2474 |
| 0.0748 | 3.83 | 1700 | 1.5398 | 1879.1610 | 80.7277 |
| 0.0695 | 3.94 | 1750 | 1.5351 | 2001.8476 | 78.8306 |
| 0.033 | 4.05 | 1800 | 1.5807 | 1892.0435 | 82.2630 |
| 0.0317 | 4.16 | 1850 | 1.5843 | 1967.1172 | 78.7765 |
| 0.0302 | 4.28 | 1900 | 1.5848 | 1969.6753 | 79.1248 |
| 0.0337 | 4.39 | 1950 | 1.5808 | 2062.9546 | 74.1537 |
| 0.0306 | 4.5 | 2000 | 1.5845 | 2090.6716 | 73.0006 |

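The Sacrebleu and Wer columns are the kind of numbers a `compute_metrics` hook produces at each evaluation step. A minimal sketch using the `evaluate` library (the strings are hypothetical, not actual model output, and whether the original run used `evaluate` is an assumption):

```python
import evaluate

# Hypothetical prediction/reference pair, purely for illustration.
predictions = ["this is an example translation"]
references = ["this is the example translation"]

sacrebleu = evaluate.load("sacrebleu")
wer = evaluate.load("wer")

# sacrebleu expects one list of reference strings per prediction.
bleu_score = sacrebleu.compute(predictions=predictions,
                               references=[[ref] for ref in references])["score"]
# The card's WER values look like percentages, hence the factor of 100.
wer_score = 100 * wer.compute(predictions=predictions, references=references)

print(f"sacrebleu = {bleu_score:.4f}, wer = {wer_score:.4f}")
```
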
### Framework versions

- Transformers 4.28.0.dev0
- Pytorch 2.0.0
- Datasets 2.10.1
- Tokenizers 0.13.2