omartariq612 committed on
Commit a174ad5
1 Parent(s): edd511f

Model save

Files changed (3):
  1. README.md +67 -0
  2. generation_config.json +102 -0
  3. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,67 @@
+ ---
+ tags:
+ - generated_from_trainer
+ metrics:
+ - wer
+ model-index:
+ - name: quran-whisper-ar-tiny-1
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # quran-whisper-ar-tiny-1
+
+ This model was trained from scratch on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.5858
+ - Wer: 164.8352
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0001
+ - train_batch_size: 16
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - training_steps: 45
+ - mixed_precision_training: Native AMP
+
+ ### Training results
+
+ | Training Loss | Epoch  | Step | Validation Loss | Wer      |
+ |:-------------:|:------:|:----:|:---------------:|:--------:|
+ | 2.2643        | 0.3333 | 5    | 4.2666          | 209.8901 |
+ | 3.5798        | 0.6667 | 10   | 1.9318          | 173.6264 |
+ | 1.2993        | 1.0    | 15   | 1.0968          | 102.1978 |
+ | 0.8612        | 0.6667 | 20   | 0.9798          | 145.0549 |
+ | 0.7235        | 0.8333 | 25   | 0.7984          | 131.8681 |
+ | 0.5694        | 1.0    | 30   | 0.7010          | 127.4725 |
+ | 0.5781        | 0.7778 | 35   | 0.7017          | 378.0220 |
+ | 0.5704        | 0.8889 | 40   | 0.6511          | 341.7582 |
+ | 0.4247        | 1.0    | 45   | 0.5858          | 164.8352 |
+
+
+ ### Framework versions
+
+ - Transformers 4.41.2
+ - Pytorch 2.1.2
+ - Datasets 2.19.2
+ - Tokenizers 0.19.1
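Note on the metric: the card reports WER values above 100 (e.g. 164.8352, 378.0220). This is not a bug in the table. WER is (substitutions + deletions + insertions) divided by the number of reference words, so a hypothesis that inserts many spurious words can score past 100%. A minimal sketch of the computation (word-level Levenshtein distance; libraries such as `jiwer` or `evaluate` compute the same quantity):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / reference length.

    Computed as the word-level Levenshtein distance divided by the number of
    reference words, so values above 1.0 (100%) are possible when the
    hypothesis inserts many extra words.
    """
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit-distance table over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution
            )
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, a one-word reference transcribed as three words scores a WER of 2.0 (200%), since two insertions are charged against a single reference word.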
generation_config.json ADDED
@@ -0,0 +1,102 @@
+ {
+   "begin_suppress_tokens": [
+     220,
+     50257
+   ],
+   "bos_token_id": 50257,
+   "decoder_start_token_id": 50258,
+   "eos_token_id": 50257,
+   "max_length": 448,
+   "pad_token_id": 50257,
+   "suppress_tokens": [
+     1,
+     2,
+     7,
+     8,
+     9,
+     10,
+     14,
+     25,
+     26,
+     27,
+     28,
+     29,
+     31,
+     58,
+     59,
+     60,
+     61,
+     62,
+     63,
+     90,
+     91,
+     92,
+     93,
+     359,
+     503,
+     522,
+     542,
+     873,
+     893,
+     902,
+     918,
+     922,
+     931,
+     1350,
+     1853,
+     1982,
+     2460,
+     2627,
+     3246,
+     3253,
+     3268,
+     3536,
+     3846,
+     3961,
+     4183,
+     4667,
+     6585,
+     6647,
+     7273,
+     9061,
+     9383,
+     10428,
+     10929,
+     11938,
+     12033,
+     12331,
+     12562,
+     13793,
+     14157,
+     14635,
+     15265,
+     15618,
+     16553,
+     16604,
+     18362,
+     18956,
+     20075,
+     21675,
+     22520,
+     26130,
+     26161,
+     26435,
+     28279,
+     29464,
+     31650,
+     32302,
+     32470,
+     36865,
+     42863,
+     47425,
+     49870,
+     50254,
+     50258,
+     50358,
+     50359,
+     50360,
+     50361,
+     50362
+   ],
+   "transformers_version": "4.41.2"
+ }
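The `suppress_tokens` and `begin_suppress_tokens` lists above tell Whisper's decoder never to emit those token ids; transformers applies this with a logits processor that sets each suppressed id's score to negative infinity before the next token is chosen. A minimal sketch of the mechanism (pure Python, illustrative names; this mirrors, but is not, the transformers implementation):

```python
import math

def suppress(logits: list[float], suppress_ids: list[int]) -> list[float]:
    """Set the logits of suppressed token ids to -inf so they can never
    be chosen by argmax or sampling (the idea behind suppress_tokens)."""
    out = list(logits)
    for i in suppress_ids:
        out[i] = -math.inf
    return out

def greedy_pick(logits: list[float]) -> int:
    """One greedy decoding step: take the highest-scoring token id."""
    return max(range(len(logits)), key=lambda i: logits[i])

# Token 2 has the highest raw logit, but it is on the suppress list,
# so decoding falls back to the next-best token (id 3).
logits = [0.1, 0.5, 3.0, 1.2]
picked = greedy_pick(suppress(logits, suppress_ids=[2]))
```

`begin_suppress_tokens` works the same way but is applied only at the first generation step, which is why it can ban the space token (220) and end-of-text (50257) from opening a transcript.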
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:1273c0e3239b9bba08cb20ba4ddb60e90c84bc42a212169143929f527fc9d8ae
+ oid sha256:481139b035d68eeb280a27122f63d46fd007a33c2dfe5f7f6427c0e9dbcf6952
  size 151061672