---
license: apache-2.0
---
# wav2vec2-xlsr-530-serbian-colab

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on a juznevesti Serbian dataset.
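Below is a minimal transcription sketch using the model for Serbian speech recognition. The repository id `cminja/wav2vec2-sft-sr` and the audio file path are assumptions based on this repo's name; audio should be 16 kHz mono, matching the sampling rate of the wav2vec2-xls-r-300m base model.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Repository id is an assumption based on this repo's name; adjust if needed.
MODEL_ID = "cminja/wav2vec2-sft-sr"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# Load and resample an audio file to 16 kHz mono (path is a placeholder).
speech, _ = librosa.load("example.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding to text.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```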
## Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
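As a sketch, these values map onto 🤗 Transformers `TrainingArguments` roughly as follows; the `output_dir` is a placeholder, and the total train batch size of 32 is the per-device batch size of 16 times the 2 gradient-accumulation steps.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xlsr-serbian-colab",  # placeholder output directory
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999), epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
)
```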
## Framework versions

- Transformers 4.20.0
- Pytorch 1.12.0
- Datasets 2.4.0
- Tokenizers 0.12.1