versae committed
Commit 3f0c5b3 · 1 Parent(s): a15989f

update model card README.md

README.md CHANGED
@@ -1,3 +1,105 @@
- ---
- license: apache-2.0
- ---
+ ---
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: wav2vec2-xls-r-300m-npsc-bokmaal
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # wav2vec2-xls-r-300m-npsc-bokmaal
+
+ This model was trained on the NPSC (Norwegian Parliamentary Speech Corpus) Bokmål dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.1663
+ - Wer: 0.0932
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
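Although the card leaves intended uses unspecified, the checkpoint appears to be a CTC speech-recognition model for Norwegian Bokmål, so a minimal transcription sketch with the Transformers ASR pipeline is shown below. The repository namespace and the audio file are placeholders, and 16 kHz mono input is assumed (the standard rate for wav2vec 2.0 checkpoints).

```python
# Illustrative sketch: transcribe audio with the fine-tuned checkpoint.
# "<org>/wav2vec2-xls-r-300m-npsc-bokmaal" is a placeholder repository id and
# "sample.wav" a placeholder file; wav2vec 2.0 models expect 16 kHz mono audio.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="<org>/wav2vec2-xls-r-300m-npsc-bokmaal",
)

print(asr("sample.wav")["text"])
```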
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (see the sketch after this list):
+ - learning_rate: 5e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 32
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 500
+ - num_epochs: 15.0
+ - mixed_precision_training: Native AMP
+
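A sketch of how these settings would map onto `transformers.TrainingArguments`. The output directory is a placeholder, `fp16=True` stands in for the Native AMP entry, and the actual fine-tuning script is not included in the card.

```python
# Sketch only: TrainingArguments mirroring the hyperparameters listed above.
# output_dir is a placeholder; the real training script is not part of this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-300m-npsc-bokmaal",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=15.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```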
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:-----:|:-----:|:---------------:|:------:|
+ | 0.0969 | 0.32 | 500 | 0.1773 | 0.1054 |
+ | 0.0929 | 0.64 | 1000 | 0.1672 | 0.1061 |
+ | 0.1018 | 0.97 | 1500 | 0.1770 | 0.1067 |
+ | 0.0871 | 1.29 | 2000 | 0.1832 | 0.1087 |
+ | 0.0908 | 1.61 | 2500 | 0.1830 | 0.1101 |
+ | 0.0975 | 1.93 | 3000 | 0.1848 | 0.1100 |
+ | 0.0936 | 2.26 | 3500 | 0.1853 | 0.1113 |
+ | 0.1025 | 2.58 | 4000 | 0.1958 | 0.1149 |
+ | 0.0989 | 2.9 | 4500 | 0.1776 | 0.1123 |
+ | 0.0946 | 3.22 | 5000 | 0.1825 | 0.1097 |
+ | 0.0859 | 3.55 | 5500 | 0.1864 | 0.1072 |
+ | 0.0867 | 3.87 | 6000 | 0.1886 | 0.1081 |
+ | 0.0783 | 4.19 | 6500 | 0.1883 | 0.1063 |
+ | 0.0804 | 4.51 | 7000 | 0.1831 | 0.1063 |
+ | 0.0797 | 4.84 | 7500 | 0.1884 | 0.1058 |
+ | 0.0705 | 5.16 | 8000 | 0.1802 | 0.1057 |
+ | 0.0795 | 5.48 | 8500 | 0.1854 | 0.1038 |
+ | 0.0711 | 5.8 | 9000 | 0.1766 | 0.1032 |
+ | 0.0973 | 6.13 | 9500 | 0.1663 | 0.1014 |
+ | 0.087 | 6.45 | 10000 | 0.1664 | 0.1014 |
+ | 0.0962 | 6.77 | 10500 | 0.1631 | 0.1009 |
+ | 0.0857 | 7.09 | 11000 | 0.1659 | 0.1002 |
+ | 0.0882 | 7.41 | 11500 | 0.1668 | 0.1007 |
+ | 0.0784 | 7.74 | 12000 | 0.1688 | 0.0996 |
+ | 0.0838 | 8.06 | 12500 | 0.1675 | 0.0984 |
+ | 0.0863 | 8.38 | 13000 | 0.1639 | 0.0979 |
+ | 0.0763 | 8.7 | 13500 | 0.1638 | 0.0980 |
+ | 0.0822 | 9.03 | 14000 | 0.1709 | 0.0972 |
+ | 0.0769 | 9.35 | 14500 | 0.1700 | 0.0965 |
+ | 0.0838 | 9.67 | 15000 | 0.1703 | 0.0974 |
+ | 0.0799 | 9.99 | 15500 | 0.1667 | 0.0957 |
+ | 0.0712 | 10.32 | 16000 | 0.1754 | 0.0960 |
+ | 0.0737 | 10.64 | 16500 | 0.1725 | 0.0968 |
+ | 0.0851 | 10.96 | 17000 | 0.1733 | 0.0958 |
+ | 0.076 | 11.28 | 17500 | 0.1682 | 0.0954 |
+ | 0.0712 | 11.61 | 18000 | 0.1713 | 0.0943 |
+ | 0.0745 | 11.93 | 18500 | 0.1662 | 0.0951 |
+ | 0.0864 | 12.25 | 19000 | 0.1692 | 0.0947 |
+ | 0.0937 | 12.57 | 19500 | 0.1624 | 0.0943 |
+ | 0.0915 | 12.89 | 20000 | 0.1678 | 0.0942 |
+ | 0.0926 | 13.22 | 20500 | 0.1641 | 0.0945 |
+ | 0.0912 | 13.54 | 21000 | 0.1665 | 0.0937 |
+ | 0.0917 | 13.86 | 21500 | 0.1648 | 0.0936 |
+ | 0.094 | 14.18 | 22000 | 0.1635 | 0.0935 |
+ | 0.0864 | 14.51 | 22500 | 0.1678 | 0.0934 |
+ | 0.0899 | 14.83 | 23000 | 0.1663 | 0.0932 |
+
+
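The Wer column is word error rate on the validation set, so the final score of 0.0932 corresponds to roughly 9.3% of words transcribed incorrectly. A small sketch of computing WER with the `jiwer` package; the transcripts below are invented placeholders, not NPSC data.

```python
# Sketch: word error rate as reported in the table above, computed with jiwer.
# The reference and predicted transcripts are illustrative placeholders.
from jiwer import wer

references = ["dette er et eksempel", "stortinget er samlet"]
predictions = ["dette er et eksempel", "stortinget er samla"]

# wer() returns (substitutions + deletions + insertions) / reference word count
print(f"WER: {wer(references, predictions):.4f}")
```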
+ ### Framework versions
+
+ - Transformers 4.17.0.dev0
+ - PyTorch 1.10.2+cu113
+ - Datasets 1.18.4.dev0
+ - Tokenizers 0.11.0
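Reproducing the run is easier with a comparable environment. A small sketch that prints the locally installed versions for comparison with the list above (the `.dev0` versions imply installs from source at the time of training):

```python
# Sketch: print installed versions to compare against the card's "Framework versions".
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)  # CUDA builds show as e.g. 1.10.2+cu113
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)
```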
wandb/run-20220208_002217-138c86e3/files/output.log CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:8a3ac6c970c4daeb0fc2f193ef101ff7103d11d3abeb3d5fdbb4eb6ae96fae2d
- size 16608528
+ oid sha256:55c39ebc157eb864999914b6d72b9f6ba259a47bdac42fca0eed801f95783f29
+ size 16752141
wandb/run-20220208_002217-138c86e3/logs/debug-internal.log CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:f46aada3f1b53c9f03f85ac13f1c3e12e2208c9a76a643a2a486a1e2d0e8d41c
- size 7519732
+ oid sha256:6df2ae69aab690e77a827cf0fcc1db173a98eba215855932a5fa9444dedab13d
+ size 7567112
wandb/run-20220208_002217-138c86e3/run-138c86e3.wandb CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b78966d35e098f2efb0083aeffab28f645af6e6a1b218dd8350a3b75aa9ff71f
- size 196962183
+ oid sha256:376ebf6b85b281548b50f549897f0decdf38f315399edb80c52da7bab2ccdc8a
+ size 197260009
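The wandb files in this commit are stored as Git LFS pointers, so each hunk only shows the object's sha256 and size changing. A sketch of verifying a pulled object against its pointer, using the values from the output.log hunk above; the local path assumes the repository has been cloned and `git lfs pull` has been run.

```python
# Sketch: check a downloaded LFS object against the sha256/size in its pointer file.
import hashlib
from pathlib import Path

# Values taken from the output.log pointer above; path assumes `git lfs pull` was run.
path = Path("wandb/run-20220208_002217-138c86e3/files/output.log")
expected_sha256 = "55c39ebc157eb864999914b6d72b9f6ba259a47bdac42fca0eed801f95783f29"
expected_size = 16752141

data = path.read_bytes()
assert len(data) == expected_size, "size mismatch"
assert hashlib.sha256(data).hexdigest() == expected_sha256, "sha256 mismatch"
print("LFS object matches its pointer")
```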