Umong committed on
Commit
c76e131
1 Parent(s): a30768a

End of training

Files changed (3)
  1. README.md +122 -0
  2. pytorch_model.bin +1 -1
  3. training_args.bin +1 -1
README.md ADDED
@@ -0,0 +1,122 @@
+ ---
+ license: apache-2.0
+ base_model: facebook/wav2vec2-xls-r-300m
+ tags:
+ - generated_from_trainer
+ metrics:
+ - wer
+ model-index:
+ - name: wav2vec2-xls-r-300m-bengali-macro
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # wav2vec2-xls-r-300m-bengali-macro
+
+ This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 2.3787
+ - Wer: 0.88
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0003
+ - train_batch_size: 16
+ - eval_batch_size: 1
+ - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 32
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 500
+ - num_epochs: 1
+
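The hyperparameters above determine the effective batch size and the learning-rate schedule. A minimal plain-Python sketch, assuming 30000 total optimizer steps (read off the final row of the results table, epoch 1.0):

```python
# Effective (total) train batch size, as reported above.
train_batch_size = 16
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 32

def lr_at_step(step, base_lr=3e-4, warmup_steps=500, total_steps=30000):
    """Linear warmup to base_lr, then linear decay to 0.

    Mirrors lr_scheduler_type: linear with lr_scheduler_warmup_steps: 500;
    total_steps=30000 is an assumption taken from the results table.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

This is the schedule shape the Trainer's linear scheduler produces, not the Trainer's own code.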
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:-----:|:-----:|:---------------:|:------:|
+ | 4.686 | 0.02 | 500 | 2.9368 | 0.9254 |
+ | 1.465 | 0.03 | 1000 | 1.6714 | 0.88 |
+ | 1.2139 | 0.05 | 1500 | 1.6254 | 0.8292 |
+ | 1.1463 | 0.07 | 2000 | 1.5170 | 0.8292 |
+ | 1.12 | 0.08 | 2500 | 1.4973 | 0.7966 |
+ | 1.0766 | 0.1 | 3000 | 1.5682 | 0.8129 |
+ | 1.0547 | 0.12 | 3500 | 1.3838 | 0.7458 |
+ | 1.0163 | 0.13 | 4000 | 1.6073 | 0.8685 |
+ | 1.0149 | 0.15 | 4500 | 1.3993 | 0.7247 |
+ | 1.0125 | 0.17 | 5000 | 1.4888 | 0.7749 |
+ | 0.9882 | 0.18 | 5500 | 1.3766 | 0.7444 |
+ | 0.9736 | 0.2 | 6000 | 1.5816 | 0.8027 |
+ | 0.9737 | 0.22 | 6500 | 1.5761 | 0.7783 |
+ | 0.9445 | 0.23 | 7000 | 1.3593 | 0.7505 |
+ | 0.9335 | 0.25 | 7500 | 1.3453 | 0.7247 |
+ | 0.931 | 0.27 | 8000 | 1.4024 | 0.7397 |
+ | 0.9389 | 0.28 | 8500 | 1.5973 | 0.8508 |
+ | 0.9152 | 0.3 | 9000 | 1.4021 | 0.7193 |
+ | 0.9042 | 0.32 | 9500 | 1.3642 | 0.7620 |
+ | 0.8962 | 0.33 | 10000 | 1.4298 | 0.7383 |
+ | 0.8767 | 0.35 | 10500 | 1.4478 | 0.7580 |
+ | 0.8853 | 0.37 | 11000 | 1.3255 | 0.7302 |
+ | 0.8739 | 0.38 | 11500 | 1.3791 | 0.7431 |
+ | 0.8597 | 0.4 | 12000 | 1.5847 | 0.8325 |
+ | 0.8815 | 0.42 | 12500 | 1.6785 | 0.8163 |
+ | 0.8736 | 0.43 | 13000 | 1.6222 | 0.7871 |
+ | 0.8643 | 0.45 | 13500 | 1.8635 | 0.8502 |
+ | 0.84 | 0.46 | 14000 | 1.4343 | 0.7803 |
+ | 0.8323 | 0.48 | 14500 | 1.7500 | 0.8427 |
+ | 0.8223 | 0.5 | 15000 | 1.6916 | 0.8278 |
+ | 0.827 | 0.51 | 15500 | 2.6214 | 0.9085 |
+ | 0.8149 | 0.53 | 16000 | 1.6750 | 0.8169 |
+ | 0.8149 | 0.55 | 16500 | 1.7646 | 0.8142 |
+ | 0.8032 | 0.56 | 17000 | 2.1347 | 0.8617 |
+ | 0.8005 | 0.58 | 17500 | 1.7216 | 0.8122 |
+ | 0.7956 | 0.6 | 18000 | 2.3053 | 0.8936 |
+ | 0.7888 | 0.61 | 18500 | 1.7773 | 0.8359 |
+ | 0.7919 | 0.63 | 19000 | 2.2394 | 0.8597 |
+ | 0.7888 | 0.65 | 19500 | 1.5470 | 0.7403 |
+ | 0.7721 | 0.66 | 20000 | 1.6034 | 0.7593 |
+ | 0.7603 | 0.68 | 20500 | 1.6808 | 0.7803 |
+ | 0.751 | 0.7 | 21000 | 1.7942 | 0.8217 |
+ | 0.7555 | 0.71 | 21500 | 1.9897 | 0.8441 |
+ | 0.7583 | 0.73 | 22000 | 2.3329 | 0.8576 |
+ | 0.7346 | 0.75 | 22500 | 2.2255 | 0.8515 |
+ | 0.754 | 0.76 | 23000 | 2.2606 | 0.8861 |
+ | 0.7309 | 0.78 | 23500 | 2.0292 | 0.8529 |
+ | 0.7351 | 0.8 | 24000 | 2.4471 | 0.8942 |
+ | 0.7456 | 0.81 | 24500 | 2.1406 | 0.8224 |
+ | 0.7229 | 0.83 | 25000 | 2.4474 | 0.8888 |
+ | 0.7253 | 0.85 | 25500 | 2.0324 | 0.8441 |
+ | 0.7109 | 0.86 | 26000 | 2.2594 | 0.8671 |
+ | 0.7316 | 0.88 | 26500 | 2.3887 | 0.8827 |
+ | 0.716 | 0.9 | 27000 | 2.4739 | 0.8915 |
+ | 0.7264 | 0.91 | 27500 | 2.4291 | 0.8922 |
+ | 0.701 | 0.93 | 28000 | 2.3306 | 0.8936 |
+ | 0.7025 | 0.95 | 28500 | 2.3172 | 0.8834 |
+ | 0.6963 | 0.96 | 29000 | 2.4020 | 0.8841 |
+ | 0.6952 | 0.98 | 29500 | 2.4324 | 0.8895 |
+ | 0.6985 | 1.0 | 30000 | 2.3787 | 0.88 |
+
+
+ ### Framework versions
+
+ - Transformers 4.33.0
+ - Pytorch 2.0.0
+ - Datasets 2.14.5
+ - Tokenizers 0.13.3
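The Wer column in the results above is the word error rate: word-level edit distance divided by the number of reference words. Purely for illustration (the Trainer typically computes this via a metrics library such as `evaluate`/`jiwer`, not this code), a minimal self-contained sketch:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via word-level Levenshtein distance."""
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edits needed to turn the first i ref words into the first j hyp words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)
```

A final Wer of 0.88 therefore means roughly 88 word-level errors per 100 reference words on the evaluation set.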
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:fa97a2fa461ce91f8389308876a572047ca586cc44f0aea763e6ae105381ce86
+ oid sha256:9a80999e7500f4fcfbd6b854148cbf50ce21261a883692f12551b1633f5e69ab
  size 1262172461
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:6fd2c7fb8086564fc235a828e3410d3ae70c5becf38faf09a311c0b15787308d
+ oid sha256:a5d0d41e1b5c17aa282b3f6f61fb419faa8e0754e6c388f9628ccc03db229cbe
  size 4027
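The `pytorch_model.bin` and `training_args.bin` diffs above are Git LFS pointer files (spec v1): the repository stores only a sha256 oid and a byte size, while the actual blob lives in LFS storage. A sketch of how such a pointer text is derived for a local file (illustrative helper, not part of git-lfs itself):

```python
import hashlib
import os

def lfs_pointer(path: str) -> str:
    """Build a Git LFS pointer (spec v1) for a local file.

    The pointer records the sha256 of the file contents and its size in
    bytes, matching the version/oid/size lines shown in the diffs above.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in 1 MiB chunks so large model files don't load into memory at once.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    size = os.path.getsize(path)
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{h.hexdigest()}\n"
        f"size {size}\n"
    )
```

This explains why the `pytorch_model.bin` diff changes only the oid line: the retrained weights have new contents (new sha256) but the same serialized size.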