hyojin99 committed
Commit 7cc7985
1 parent: 6684504

End of training

README.md CHANGED
@@ -2,12 +2,12 @@
 language:
 - ko
 license: apache-2.0
+base_model: openai/whisper-base
 tags:
 - hf-asr-leaderboard
 - generated_from_trainer
 datasets:
 - hyojin99/EBRC
-base_model: openai/whisper-base
 model-index:
 - name: ft_model
   results: []
@@ -20,8 +20,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on the EBRC dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.4271
-- Cer: 19.8362
+- Loss: 0.4127
+- Cer: 18.2381
 
 ## Model description
 
@@ -47,24 +47,30 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 50
-- training_steps: 6000
+- training_steps: 12000
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Cer     |
-|:-------------:|:-----:|:----:|:---------------:|:-------:|
-| 0.5853        | 0.4   | 1000 | 0.5518          | 34.2215 |
-| 0.4595        | 0.8   | 2000 | 0.4852          | 23.2105 |
-| 0.313         | 1.2   | 3000 | 0.4593          | 21.0304 |
-| 0.327         | 1.6   | 4000 | 0.4400          | 21.5894 |
-| 0.311         | 2.0   | 5000 | 0.4277          | 19.6565 |
-| 0.2514        | 2.4   | 6000 | 0.4271          | 19.8362 |
+| Training Loss | Epoch | Step  | Validation Loss | Cer     |
+|:-------------:|:-----:|:-----:|:---------------:|:-------:|
+| 0.5845        | 0.4   | 1000  | 0.5512          | 34.8150 |
+| 0.4598        | 0.8   | 2000  | 0.4840          | 22.6069 |
+| 0.3077        | 1.2   | 3000  | 0.4570          | 20.7128 |
+| 0.3212        | 1.6   | 4000  | 0.4381          | 21.4198 |
+| 0.3027        | 2.0   | 5000  | 0.4181          | 20.1164 |
+| 0.219         | 2.4   | 6000  | 0.4180          | 19.6479 |
+| 0.2373        | 2.8   | 7000  | 0.4089          | 18.6477 |
+| 0.1342        | 3.2   | 8000  | 0.4127          | 18.3603 |
+| 0.1601        | 3.6   | 9000  | 0.4104          | 18.4824 |
+| 0.1489        | 4.0   | 10000 | 0.4084          | 18.0628 |
+| 0.1308        | 4.4   | 11000 | 0.4134          | 18.2856 |
+| 0.114         | 4.8   | 12000 | 0.4127          | 18.2381 |
 
 
 ### Framework versions
 
-- Transformers 4.39.0.dev0
+- Transformers 4.40.0.dev0
 - Pytorch 2.2.1+cu121
 - Datasets 2.18.0
 - Tokenizers 0.15.2
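The updated card reports a 12,000-step run with an evaluation CER of 18.24. Below is a minimal inference and scoring sketch, assuming the checkpoint is published as `hyojin99/ft_model` (the card only gives the model-index name `ft_model`) and that a local audio file and reference transcript are available as placeholders.

```python
# Hedged sketch: transcribe Korean audio with the fine-tuned Whisper checkpoint,
# then score it with CER, the metric reported in the card.
# The repo id, audio path, and reference text are assumptions, not from the card.
from transformers import pipeline
import evaluate

asr = pipeline(
    "automatic-speech-recognition",
    model="hyojin99/ft_model",   # assumed repo id (model-index name: ft_model)
    chunk_length_s=30,           # Whisper's 30-second window
)

# The card lists language: ko, so force Korean transcription.
result = asr(
    "sample.wav",                # placeholder audio file
    generate_kwargs={"language": "korean", "task": "transcribe"},
)
print(result["text"])

# Character error rate against a reference transcript (placeholder string).
cer = evaluate.load("cer")
print(cer.compute(predictions=[result["text"]], references=["참조 전사"]))
```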
generation_config.json CHANGED
@@ -167,5 +167,5 @@
     "transcribe": 50359,
     "translate": 50358
   },
-  "transformers_version": "4.39.0.dev0"
+  "transformers_version": "4.40.0.dev0"
 }
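The generation config change only bumps the recorded Transformers version; the task-to-token mapping ("transcribe": 50359, "translate": 50358) is unchanged. As a quick check, the sketch below shows how those task ids surface in the decoder prompt via the processor; the repo id is the same assumption as above.

```python
# Hedged sketch: the task ids in generation_config.json are the decoder prompt
# tokens Whisper uses to select a task. Repo id "hyojin99/ft_model" is assumed.
from transformers import WhisperProcessor

processor = WhisperProcessor.from_pretrained("hyojin99/ft_model")
prompt_ids = processor.get_decoder_prompt_ids(language="ko", task="transcribe")
print(prompt_ids)  # positions 1..n of the forced decoder ids; includes 50359 for "transcribe"
```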
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:a6481514d8f5f2b1089092606fa5d4f5b8396dfd88c374d47d650ffe2e578252
+oid sha256:f02a4a3e107b610f62c7fc239847e118e278e957e0017fd3973fa2c26bbdfe29
 size 290403936
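The weights change is a Git LFS pointer update: same 290,403,936-byte file size, new object hash. A small sketch for verifying a downloaded copy against the oid recorded above; the local path is a placeholder.

```python
# Hedged sketch: confirm a downloaded model.safetensors matches the sha256 oid
# from the Git LFS pointer in this commit. The local file path is assumed.
import hashlib

EXPECTED = "f02a4a3e107b610f62c7fc239847e118e278e957e0017fd3973fa2c26bbdfe29"

h = hashlib.sha256()
with open("model.safetensors", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

assert h.hexdigest() == EXPECTED, "checksum mismatch"
print("ok:", h.hexdigest())
```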
runs/Mar20_21-44-59_b2a4d70b3dce/events.out.tfevents.1710971100.b2a4d70b3dce.406.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:bf7abe2f9cd2844d82a3f66d7bae09fa2d7d8cea86607db58af928c69b9560ae
-size 135912
+oid sha256:9a521355fd8d8a240185808e7578e61a65fae1ea5f83eb733613053537afb016
+size 136266
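The TensorBoard event file under runs/ grew slightly with the extended run. A sketch for reading the logged curves back out of that directory; the scalar tag name "eval/cer" is an assumption about what the Trainer logged.

```python
# Hedged sketch: list and read scalar curves from the updated TensorBoard log.
# The tag name "eval/cer" is an assumption; inspect Tags() to see what exists.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

acc = EventAccumulator("runs/Mar20_21-44-59_b2a4d70b3dce")  # run dir from this commit
acc.Reload()

print(acc.Tags()["scalars"])             # all logged scalar tags
for event in acc.Scalars("eval/cer"):    # assumed tag name
    print(event.step, event.value)
```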