pere committed on
Commit e707816
1 Parent(s): 5b68fe2
README.md ADDED
@@ -0,0 +1,176 @@
---
license: apache-2.0
tags:
- whisper-event
- generated_from_trainer
datasets:
- NbAiLab/NCC_S
metrics:
- wer
model-index:
- name: Whisper Tiny Norwegian Bokmål
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: NbAiLab/NCC_S
      type: NbAiLab/NCC_S
      config: 'no'
      split: validation
      args: 'no'
    metrics:
    - name: Wer
      type: wer
      value: 24.878197320341048
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Whisper Tiny Norwegian Bokmål

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the NbAiLab/NCC_S dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5100
- Wer: 24.8782

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-06
- train_batch_size: 256
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant_with_warmup
- lr_scheduler_warmup_steps: 1000
- training_steps: 100000
- mixed_precision_training: Native AMP

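The `constant_with_warmup` schedule above ramps the learning rate linearly over the first 1000 steps and then holds it at 3e-06 for the rest of training. A minimal sketch of that schedule (not the Trainer's actual implementation, just the shape it produces):

```python
# Constant-with-warmup schedule: linear ramp for `warmup_steps`,
# then a flat rate (mirrors learning_rate=3e-06, warmup_steps=1000 above).
def lr_at_step(step: int, peak_lr: float = 3e-06, warmup_steps: int = 1000) -> float:
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr

print(lr_at_step(0))      # still zero at the first step
print(lr_at_step(50000))  # constant at the peak rate after warmup
```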
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:------:|:---------------:|:-------:|
| 1.8819 | 0.01 | 1000 | 1.1869 | 61.9671 |
| 1.6425 | 0.02 | 2000 | 0.9991 | 53.6541 |
| 1.548 | 0.03 | 3000 | 0.9147 | 50.2132 |
| 1.4636 | 0.04 | 4000 | 0.8605 | 47.0767 |
| 1.4113 | 0.05 | 5000 | 0.8253 | 45.7369 |
| 1.3484 | 0.01 | 6000 | 0.7946 | 43.4531 |
| 1.3127 | 0.02 | 7000 | 0.7740 | 42.2655 |
| 1.2994 | 0.03 | 8000 | 0.7551 | 40.8952 |
| 1.265 | 0.04 | 9000 | 0.7378 | 39.8599 |
| 1.2458 | 0.05 | 10000 | 0.7257 | 39.8904 |
| 1.2257 | 0.06 | 11000 | 0.7114 | 39.7990 |
| 1.2126 | 0.07 | 12000 | 0.6972 | 37.8806 |
| 1.1971 | 0.08 | 13000 | 0.6871 | 37.3021 |
| 1.1786 | 1.01 | 14000 | 0.6786 | 37.4239 |
| 1.1486 | 1.02 | 15000 | 0.6703 | 36.9976 |
| 1.1505 | 1.03 | 16000 | 0.6647 | 36.3581 |
| 1.1238 | 1.04 | 17000 | 0.6559 | 36.3886 |
| 1.1184 | 1.05 | 18000 | 0.6509 | 36.5104 |
| 1.115 | 1.06 | 19000 | 0.6452 | 35.9927 |
| 1.1013 | 1.07 | 20000 | 0.6382 | 34.5006 |
| 1.0969 | 1.08 | 21000 | 0.6331 | 34.3484 |
| 1.0784 | 2.0 | 22000 | 0.6304 | 34.2875 |
| 1.0774 | 2.01 | 23000 | 0.6249 | 34.1048 |
| 1.0719 | 2.02 | 24000 | 0.6194 | 33.8307 |
| 1.0638 | 2.03 | 25000 | 0.6158 | 32.9781 |
| 1.0592 | 2.04 | 26000 | 0.6105 | 32.6431 |
| 1.0493 | 2.05 | 27000 | 0.6041 | 32.7345 |
| 1.047 | 2.06 | 28000 | 0.6040 | 32.7649 |
| 1.0323 | 2.07 | 29000 | 0.5984 | 31.6078 |
| 1.0189 | 3.0 | 30000 | 0.5957 | 31.3033 |
| 1.0078 | 3.01 | 31000 | 0.5924 | 31.4251 |
| 1.0146 | 3.02 | 32000 | 0.5940 | 31.3033 |
| 1.0128 | 3.03 | 33000 | 0.5892 | 31.0292 |
| 1.0025 | 3.04 | 34000 | 0.5873 | 31.1815 |
| 0.999 | 3.05 | 35000 | 0.5838 | 30.6334 |
| 1.0045 | 3.06 | 36000 | 0.5799 | 30.4202 |
| 1.0005 | 3.07 | 37000 | 0.5770 | 30.1766 |
| 1.0017 | 3.08 | 38000 | 0.5733 | 29.6590 |
| 0.9878 | 4.01 | 39000 | 0.5745 | 30.2680 |
| 0.9854 | 4.02 | 40000 | 0.5720 | 30.0548 |
| 0.9624 | 4.03 | 41000 | 0.5703 | 29.5981 |
| 0.9639 | 4.04 | 42000 | 0.5681 | 29.5067 |
| 0.9569 | 4.05 | 43000 | 0.5679 | 29.6285 |
| 0.9682 | 4.06 | 44000 | 0.5643 | 29.5676 |
| 0.9539 | 4.07 | 45000 | 0.5601 | 29.5676 |
| 0.946 | 4.08 | 46000 | 0.5562 | 29.7199 |
| 0.9429 | 5.01 | 47000 | 0.5592 | 29.2935 |
| 0.9462 | 5.02 | 48000 | 0.5540 | 29.0804 |
| 0.9312 | 5.03 | 49000 | 0.5535 | 29.2935 |
| 0.9462 | 5.04 | 50000 | 0.5536 | 28.6845 |
| 0.922 | 5.05 | 51000 | 0.5539 | 28.7150 |
| 0.9253 | 5.06 | 52000 | 0.5510 | 28.8368 |
| 0.9065 | 0.01 | 53000 | 0.5493 | 28.5932 |
| 0.9096 | 0.02 | 54000 | 0.5490 | 28.5018 |
| 0.9329 | 0.03 | 55000 | 0.5483 | 28.2887 |
| 0.9181 | 0.04 | 56000 | 0.5471 | 27.9842 |
| 0.914 | 0.05 | 57000 | 0.5457 | 28.4105 |
| 0.9149 | 0.06 | 58000 | 0.5449 | 27.5883 |
| 0.9092 | 0.07 | 59000 | 0.5405 | 27.8319 |
| 0.9101 | 0.08 | 60000 | 0.5402 | 27.3447 |
| 0.9046 | 1.01 | 61000 | 0.5374 | 27.5579 |
| 0.8917 | 1.02 | 62000 | 0.5390 | 27.7406 |
| 0.8993 | 1.03 | 63000 | 0.5386 | 27.4056 |
| 0.8875 | 1.04 | 64000 | 0.5361 | 26.8575 |
| 0.8892 | 1.05 | 65000 | 0.5358 | 27.3447 |
| 0.8929 | 1.06 | 66000 | 0.5346 | 26.7357 |
| 0.8703 | 0.01 | 67000 | 0.5332 | 26.8270 |
| 0.8709 | 0.02 | 68000 | 0.5336 | 26.7052 |
| 0.8917 | 0.03 | 69000 | 0.5329 | 27.0706 |
| 0.8867 | 0.04 | 70000 | 0.5323 | 26.3398 |
| 0.8778 | 0.05 | 71000 | 0.5315 | 27.2838 |
| 0.8757 | 0.06 | 72000 | 0.5317 | 26.2485 |
| 0.8726 | 0.07 | 73000 | 0.5269 | 26.6443 |
| 0.8792 | 0.08 | 74000 | 0.5268 | 26.1571 |
| 0.8706 | 1.01 | 75000 | 0.5247 | 26.1571 |
| 0.8585 | 1.02 | 76000 | 0.5265 | 26.3703 |
| 0.8659 | 1.03 | 77000 | 0.5262 | 26.7357 |
| 0.8551 | 1.04 | 78000 | 0.5249 | 26.0658 |
| 0.8572 | 1.05 | 79000 | 0.5249 | 26.2789 |
| 0.8612 | 1.06 | 80000 | 0.5235 | 25.7613 |
| 0.8598 | 1.07 | 81000 | 0.5208 | 25.7004 |
| 0.8686 | 1.08 | 82000 | 0.5214 | 25.7004 |
| 0.8503 | 2.0 | 83000 | 0.5214 | 25.7004 |
| 0.8545 | 2.01 | 84000 | 0.5215 | 28.2278 |
| 0.8594 | 2.02 | 85000 | 0.5186 | 25.6699 |
| 0.86 | 2.03 | 86000 | 0.5196 | 25.5786 |
| 0.8514 | 2.04 | 87000 | 0.5203 | 25.1827 |
| 0.8505 | 2.05 | 88000 | 0.5164 | 28.0146 |
| 0.8512 | 2.06 | 89000 | 0.5174 | 25.0914 |
| 0.8495 | 2.07 | 90000 | 0.5141 | 25.5481 |
| 0.8381 | 3.0 | 91000 | 0.5130 | 24.9695 |
| 0.8253 | 3.01 | 92000 | 0.5147 | 25.5786 |
| 0.8387 | 3.02 | 93000 | 0.5168 | 24.9086 |
| 0.8425 | 3.03 | 94000 | 0.5135 | 25.2436 |
| 0.8339 | 3.04 | 95000 | 0.5162 | 25.6699 |
| 0.8402 | 3.05 | 96000 | 0.5147 | 25.7308 |
| 0.8396 | 3.06 | 97000 | 0.5143 | 25.6699 |
| 0.8432 | 3.07 | 98000 | 0.5100 | 24.8782 |
| 0.844 | 3.08 | 99000 | 0.5100 | 25.0609 |
| 0.8333 | 4.01 | 100000 | 0.5128 | 24.9695 |

### Framework versions

- Transformers 4.26.0.dev0
- Pytorch 1.13.0+cu117
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2
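The Wer figures reported in this card count word-level substitutions, insertions, and deletions against the number of reference words. The evaluation itself used the standard `wer` metric; the sketch below is only an illustration of the computation, via word-level edit distance:

```python
# Word error rate: edit distance over words, as a percentage of the
# reference length (substitutions, insertions, deletions all cost 1).
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return 100.0 * dp[len(ref)][len(hyp)] / len(ref)

print(wer("det er en fin dag", "det er en dag"))  # one deletion out of five words -> 20.0
```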
added_tokens.json ADDED
@@ -0,0 +1,109 @@
{
  "<|af|>": 50327,
  "<|am|>": 50334,
  "<|ar|>": 50272,
  "<|as|>": 50350,
  "<|az|>": 50304,
  "<|ba|>": 50355,
  "<|be|>": 50330,
  "<|bg|>": 50292,
  "<|bn|>": 50302,
  "<|bo|>": 50347,
  "<|br|>": 50309,
  "<|bs|>": 50315,
  "<|ca|>": 50270,
  "<|cs|>": 50283,
  "<|cy|>": 50297,
  "<|da|>": 50285,
  "<|de|>": 50261,
  "<|el|>": 50281,
  "<|endoftext|>": 50257,
  "<|en|>": 50259,
  "<|es|>": 50262,
  "<|et|>": 50307,
  "<|eu|>": 50310,
  "<|fa|>": 50300,
  "<|fi|>": 50277,
  "<|fo|>": 50338,
  "<|fr|>": 50265,
  "<|gl|>": 50319,
  "<|gu|>": 50333,
  "<|haw|>": 50352,
  "<|ha|>": 50354,
  "<|hi|>": 50276,
  "<|hr|>": 50291,
  "<|ht|>": 50339,
  "<|hu|>": 50286,
  "<|hy|>": 50312,
  "<|id|>": 50275,
  "<|is|>": 50311,
  "<|it|>": 50274,
  "<|iw|>": 50279,
  "<|ja|>": 50266,
  "<|jw|>": 50356,
  "<|ka|>": 50329,
  "<|kk|>": 50316,
  "<|km|>": 50323,
  "<|kn|>": 50306,
  "<|ko|>": 50264,
  "<|la|>": 50294,
  "<|lb|>": 50345,
  "<|ln|>": 50353,
  "<|lo|>": 50336,
  "<|lt|>": 50293,
  "<|lv|>": 50301,
  "<|mg|>": 50349,
  "<|mi|>": 50295,
  "<|mk|>": 50308,
  "<|ml|>": 50296,
  "<|mn|>": 50314,
  "<|mr|>": 50320,
  "<|ms|>": 50282,
  "<|mt|>": 50343,
  "<|my|>": 50346,
  "<|ne|>": 50313,
  "<|nl|>": 50271,
  "<|nn|>": 50342,
  "<|nocaptions|>": 50362,
  "<|notimestamps|>": 50363,
  "<|no|>": 50288,
  "<|oc|>": 50328,
  "<|pa|>": 50321,
  "<|pl|>": 50269,
  "<|ps|>": 50340,
  "<|pt|>": 50267,
  "<|ro|>": 50284,
  "<|ru|>": 50263,
  "<|sa|>": 50344,
  "<|sd|>": 50332,
  "<|si|>": 50322,
  "<|sk|>": 50298,
  "<|sl|>": 50305,
  "<|sn|>": 50324,
  "<|so|>": 50326,
  "<|sq|>": 50317,
  "<|sr|>": 50303,
  "<|startoflm|>": 50360,
  "<|startofprev|>": 50361,
  "<|startoftranscript|>": 50258,
  "<|su|>": 50357,
  "<|sv|>": 50273,
  "<|sw|>": 50318,
  "<|ta|>": 50287,
  "<|te|>": 50299,
  "<|tg|>": 50331,
  "<|th|>": 50289,
  "<|tk|>": 50341,
  "<|tl|>": 50348,
  "<|transcribe|>": 50359,
  "<|translate|>": 50358,
  "<|tr|>": 50268,
  "<|tt|>": 50351,
  "<|uk|>": 50280,
  "<|ur|>": 50290,
  "<|uz|>": 50337,
  "<|vi|>": 50278,
  "<|yi|>": 50335,
  "<|yo|>": 50325,
  "<|zh|>": 50260
}
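The map above assigns fixed vocabulary ids to Whisper's special tokens; Norwegian is `<|no|>` (50288). A small sketch of looking ids up with a subset of these entries (Whisper decoder prompts typically start with start-of-transcript, language, task, and timestamp tokens):

```python
# Subset of the added-token map above; ids are fixed in Whisper's vocabulary.
added_tokens = {
    "<|startoftranscript|>": 50258,
    "<|no|>": 50288,
    "<|transcribe|>": 50359,
    "<|notimestamps|>": 50363,
}
# Reverse map (id -> token), handy when inspecting decoder prompts.
id_to_token = {v: k for k, v in added_tokens.items()}

# A typical prompt for Norwegian transcription without timestamps:
prompt_ids = [50258, 50288, 50359, 50363]
print([id_to_token[i] for i in prompt_ids])
```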
all_results.json ADDED
@@ -0,0 +1,12 @@
{
  "epoch": 4.01,
  "eval_loss": 0.5099794864654541,
  "eval_runtime": 37.0125,
  "eval_samples_per_second": 4.35,
  "eval_steps_per_second": 0.081,
  "eval_wer": 24.878197320341048,
  "train_loss": 0.2913371723175049,
  "train_runtime": 623384.3788,
  "train_samples_per_second": 41.066,
  "train_steps_per_second": 0.16
}
config.json ADDED
@@ -0,0 +1,41 @@
{
  "_name_or_path": "openai/whisper-tiny",
  "activation_dropout": 0.0,
  "activation_function": "gelu",
  "architectures": [
    "WhisperForConditionalGeneration"
  ],
  "attention_dropout": 0.0,
  "begin_suppress_tokens": [
    220,
    50257
  ],
  "bos_token_id": 50257,
  "d_model": 384,
  "decoder_attention_heads": 6,
  "decoder_ffn_dim": 1536,
  "decoder_layerdrop": 0.0,
  "decoder_layers": 4,
  "decoder_start_token_id": 50258,
  "dropout": 0.0,
  "encoder_attention_heads": 6,
  "encoder_ffn_dim": 1536,
  "encoder_layerdrop": 0.0,
  "encoder_layers": 4,
  "eos_token_id": 50257,
  "forced_decoder_ids": null,
  "init_std": 0.02,
  "is_encoder_decoder": true,
  "max_length": 448,
  "max_source_positions": 1500,
  "max_target_positions": 448,
  "model_type": "whisper",
  "num_hidden_layers": 4,
  "num_mel_bins": 80,
  "pad_token_id": 50257,
  "scale_embedding": false,
  "torch_dtype": "float32",
  "transformers_version": "4.26.0.dev0",
  "use_cache": false,
  "vocab_size": 51865
}
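The config above fixes the attention geometry of the tiny model: with `d_model` 384 split across 6 heads, each head is 64-dimensional. A quick check, parsing a subset of the config values copied from above:

```python
import json

# Subset of config.json above; just enough to derive the per-head size.
config = json.loads("""{
  "d_model": 384,
  "encoder_attention_heads": 6,
  "decoder_attention_heads": 6,
  "encoder_layers": 4,
  "decoder_layers": 4
}""")

# Per-head dimension = model width divided by number of attention heads.
head_dim = config["d_model"] // config["encoder_attention_heads"]
print(head_dim)  # 64
```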
eval_results.json ADDED
@@ -0,0 +1,8 @@
{
  "epoch": 4.01,
  "eval_loss": 0.5099794864654541,
  "eval_runtime": 37.0125,
  "eval_samples_per_second": 4.35,
  "eval_steps_per_second": 0.081,
  "eval_wer": 24.878197320341048
}
merges.txt ADDED
The diff for this file is too large to render. See raw diff
normalizer.json ADDED
@@ -0,0 +1,1742 @@
{
  "accessorise": "accessorize",
  "accessorised": "accessorized",
  "accessorises": "accessorizes",
  "accessorising": "accessorizing",
  "acclimatisation": "acclimatization",
  "acclimatise": "acclimatize",
  "acclimatised": "acclimatized",
  "acclimatises": "acclimatizes",
  "acclimatising": "acclimatizing",
  "accoutrements": "accouterments",
  "aeon": "eon",
  "aeons": "eons",
  "aerogramme": "aerogram",
  "aerogrammes": "aerograms",
  "aeroplane": "airplane",
  "aeroplanes": "airplanes",
  "aesthete": "esthete",
  "aesthetes": "esthetes",
  "aesthetic": "esthetic",
  "aesthetically": "esthetically",
  "aesthetics": "esthetics",
  "aetiology": "etiology",
  "ageing": "aging",
  "aggrandisement": "aggrandizement",
  "agonise": "agonize",
  "agonised": "agonized",
  "agonises": "agonizes",
  "agonising": "agonizing",
  "agonisingly": "agonizingly",
  "almanack": "almanac",
  "almanacks": "almanacs",
  "aluminium": "aluminum",
  "amortisable": "amortizable",
  "amortisation": "amortization",
  "amortisations": "amortizations",
  "amortise": "amortize",
  "amortised": "amortized",
  "amortises": "amortizes",
  "amortising": "amortizing",
  "amphitheatre": "amphitheater",
  "amphitheatres": "amphitheaters",
  "anaemia": "anemia",
  "anaemic": "anemic",
  "anaesthesia": "anesthesia",
  "anaesthetic": "anesthetic",
  "anaesthetics": "anesthetics",
  "anaesthetise": "anesthetize",
  "anaesthetised": "anesthetized",
  "anaesthetises": "anesthetizes",
  "anaesthetising": "anesthetizing",
  "anaesthetist": "anesthetist",
  "anaesthetists": "anesthetists",
  "anaesthetize": "anesthetize",
  "anaesthetized": "anesthetized",
  "anaesthetizes": "anesthetizes",
  "anaesthetizing": "anesthetizing",
  "analogue": "analog",
  "analogues": "analogs",
  "analyse": "analyze",
  "analysed": "analyzed",
  "analyses": "analyzes",
  "analysing": "analyzing",
  "anglicise": "anglicize",
  "anglicised": "anglicized",
  "anglicises": "anglicizes",
  "anglicising": "anglicizing",
  "annualised": "annualized",
  "antagonise": "antagonize",
  "antagonised": "antagonized",
  "antagonises": "antagonizes",
  "antagonising": "antagonizing",
  "apologise": "apologize",
  "apologised": "apologized",
  "apologises": "apologizes",
  "apologising": "apologizing",
  "appal": "appall",
  "appals": "appalls",
  "appetiser": "appetizer",
  "appetisers": "appetizers",
  "appetising": "appetizing",
  "appetisingly": "appetizingly",
  "arbour": "arbor",
  "arbours": "arbors",
  "archaeologically": "archeologically",
  "archaeologist": "archeologist",
  "archaeologists": "archeologists",
  "archaeology": "archeology",
  "archeological": "archaeological",
  "ardour": "ardor",
  "armour": "armor",
  "armoured": "armored",
  "armourer": "armorer",
  "armourers": "armorers",
  "armouries": "armories",
  "armoury": "armory",
  "artefact": "artifact",
  "artefacts": "artifacts",
  "authorise": "authorize",
  "authorised": "authorized",
  "authorises": "authorizes",
  "authorising": "authorizing",
  "axe": "ax",
  "backpedalled": "backpedaled",
  "backpedalling": "backpedaling",
  "bannister": "banister",
  "bannisters": "banisters",
  "baptise": "baptize",
  "baptised": "baptized",
  "baptises": "baptizes",
  "baptising": "baptizing",
  "bastardise": "bastardize",
  "bastardised": "bastardized",
  "bastardises": "bastardizes",
  "bastardising": "bastardizing",
  "battleax": "battleaxe",
  "baulk": "balk",
  "baulked": "balked",
  "baulking": "balking",
  "baulks": "balks",
  "bedevilled": "bedeviled",
  "bedevilling": "bedeviling",
  "behaviour": "behavior",
  "behavioural": "behavioral",
  "behaviourism": "behaviorism",
  "behaviourist": "behaviorist",
  "behaviourists": "behaviorists",
  "behaviours": "behaviors",
  "behove": "behoove",
  "behoved": "behooved",
  "behoves": "behooves",
  "bejewelled": "bejeweled",
  "belabour": "belabor",
  "belaboured": "belabored",
  "belabouring": "belaboring",
  "belabours": "belabors",
  "bevelled": "beveled",
  "bevvies": "bevies",
  "bevvy": "bevy",
  "biassed": "biased",
  "biassing": "biasing",
  "bingeing": "binging",
  "bougainvillaea": "bougainvillea",
  "bougainvillaeas": "bougainvilleas",
  "bowdlerise": "bowdlerize",
  "bowdlerised": "bowdlerized",
  "bowdlerises": "bowdlerizes",
  "bowdlerising": "bowdlerizing",
  "breathalyse": "breathalyze",
  "breathalysed": "breathalyzed",
  "breathalyser": "breathalyzer",
  "breathalysers": "breathalyzers",
  "breathalyses": "breathalyzes",
  "breathalysing": "breathalyzing",
  "brutalise": "brutalize",
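normalizer.json maps British spellings to their American equivalents so that WER is not inflated by pure spelling variants during evaluation. A sketch of applying such a map with whole-word matching, using a small subset of the entries above (the `normalize_spelling` helper is illustrative, not part of the repository):

```python
import re

# Subset of the spelling map above (British -> American).
spelling_map = {
    "analyse": "analyze",
    "behaviour": "behavior",
    "aeroplane": "airplane",
}
# Match whole words only, so e.g. "analyses" is not mangled by "analyse".
pattern = re.compile(r"\b(" + "|".join(map(re.escape, spelling_map)) + r")\b")

def normalize_spelling(text: str) -> str:
    return pattern.sub(lambda m: spelling_map[m.group(1)], text)

print(normalize_spelling("the aeroplane behaviour we analyse"))
# -> the airplane behavior we analyze
```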