---
language:
- ro
license: apache-2.0
base_model: openai/whisper-small
tags:
- hf-asr-leaderboard
- generated_from_trainer
datasets:
- iulik-pisik/horoscop_neti
- iulik-pisik/audio_vreme
metrics:
- wer
model-index:
- name: Whisper Small - finetuned on weather and horoscope
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: Vreme ProTV and Horoscop Neti
      type: iulik-pisik/audio_vreme
      config: default
      split: test
      args: 'config: ro, split: test'
    metrics:
    - name: Wer
      type: wer
      value: 8.51
pipeline_tag: automatic-speech-recognition
---

# Whisper Small - finetuned on weather and horoscope

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the Vreme ProTV and Horoscop Neti datasets.
It achieves the following results on the evaluation set:

- Loss: 0.0004
- Wer: 8.51

## Model description

This is a fine-tuned version of the Whisper Small model, adapted for Romanian automatic speech recognition (ASR) in the domains of weather forecasts and horoscopes. It was trained on two custom datasets to improve transcription of Romanian speech in these contexts.
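
As a quick usage sketch (not part of the original card), the model can be loaded with the `transformers` ASR pipeline; the model ID below is a placeholder for this repository's Hub path, and the audio filename is a made-up example:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; replace the model ID with this
# repository's actual Hub path (the one below is a placeholder).
asr = pipeline(
    "automatic-speech-recognition",
    model="iulik-pisik/whisper-small-ro",  # hypothetical repo ID
)

# Whisper is multilingual, so pin the language and task rather than
# letting the model guess them from the first seconds of audio.
result = asr(
    "weather_clip.wav",  # example filename; any mono audio file works
    generate_kwargs={"language": "romanian", "task": "transcribe"},
)
print(result["text"])
```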

## Training procedure

The model was fine-tuned from the pre-trained Whisper Small checkpoint. Two custom datasets were used: audio recordings of weather forecasts and horoscopes in Romanian.

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 3000
- mixed_precision_training: Native AMP
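
For reference, these hyperparameters roughly correspond to a `Seq2SeqTrainingArguments` configuration like the one below. This is a sketch, not the original training script; the output directory and eval cadence are assumptions:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of training arguments matching the card's hyperparameters.
# output_dir and the eval cadence are assumptions, not from the card.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-ro",  # hypothetical path
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=3000,
    fp16=True,  # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=1000,  # matches the eval points in the results table
    predict_with_generate=True,
)
```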

### Training results

| Epoch | Step | Validation Loss | WER     |
|:-----:|:----:|:---------------:|:-------:|
| 3.85  | 1000 | 0.0332          | 9.1945  |
| 7.69  | 2000 | 0.0035          | 10.845  |
| 11.54 | 3000 | 0.0005          | 8.4679  |
| 15.38 | 4000 | 0.0004          | 8.5127  |
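
The WER values above are word error rates reported as percentages. As a minimal sketch (the strings below are toy examples, not from the evaluation set), the metric can be computed with the `evaluate` library:

```python
import evaluate

# Word error rate via the `evaluate` library; strings are toy examples.
wer_metric = evaluate.load("wer")
wer = wer_metric.compute(
    references=["prognoza meteo pentru maine"],
    predictions=["prognoza meteo pentru mine"],
)
print(f"WER: {100 * wer:.2f}%")  # scaled to a percentage, as in this card
```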

### Framework versions

- Transformers 4.39.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2