---
license: apache-2.0
base_model: Bisher/wav2vec2_ASV_deepfake_audio_detection_DF_finetune_frozen
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - precision
  - recall
  - f1
model-index:
  - name: wav2vec2_ASV_deepfake_audio_detection_DF_finetune_frozen_chngd_classifier
    results: []
---

# wav2vec2_ASV_deepfake_audio_detection_DF_finetune_frozen_chngd_classifier

This model is a fine-tuned version of Bisher/wav2vec2_ASV_deepfake_audio_detection_DF_finetune_frozen on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.6355
- Accuracy: 0.9146
- Precision: 0.9161
- Recall: 0.9146
- F1: 0.8866
- Tp: 330
- Tn: 17889
- Fn: 1677
- Fp: 24
- Eer: 0.1639
- Min Tdcf: 0.0357
- Auc Roc: 0.9189

## Model description

More information needed

## Intended uses & limitations

More information needed
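
No usage snippet is provided. A minimal inference sketch, assuming the checkpoint loads with the standard transformers audio-classification classes and expects 16 kHz mono input; the spoof class index and decision threshold below are illustrative, not from this card:

```python
def decide(probs, spoof_index=0, threshold=0.5):
    """Map class probabilities to a hard label. The spoof class index and the
    threshold are assumptions; check model.config.id2label for the real mapping."""
    return "spoof" if probs[spoof_index] >= threshold else "bonafide"

def classify(waveform, sampling_rate=16000):
    """Return per-class probabilities for one mono waveform as a list of floats.
    Requires `pip install torch transformers`; downloads the checkpoint on first use."""
    import torch
    from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

    model_id = "Bisher/wav2vec2_ASV_deepfake_audio_detection_DF_finetune_frozen_chngd_classifier"
    extractor = AutoFeatureExtractor.from_pretrained(model_id)
    model = AutoModelForAudioClassification.from_pretrained(model_id)
    inputs = extractor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.softmax(dim=-1).squeeze(0).tolist()
```

Once the label mapping is confirmed, `decide(classify(waveform))` yields a hard label.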

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0005
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
- mixed_precision_training: Native AMP
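
The total train batch size is not set directly; it follows from the per-step batch size and gradient accumulation (values copied from the list above):

```python
# Effective (total) batch size = per-device batch size x accumulation steps.
train_batch_size = 128
gradient_accumulation_steps = 4

total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # → 512, matching the value reported above
```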

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Tp | Tn | Fn | Fp | Eer | Min Tdcf | Auc Roc |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:--:|:--:|:--:|:--:|:--:|:---:|:--------:|:-------:|
| 0.6678 | 0.0206 | 5 | 0.6581 | 0.9410 | 0.9381 | 0.9410 | 0.9331 | 963 | 17782 | 1044 | 131 | 0.0717 | 0.0322 | 0.9729 |
| 0.6124 | 0.0412 | 10 | 0.5702 | 0.9060 | 0.9082 | 0.9060 | 0.8681 | 145 | 17902 | 1862 | 11 | 0.1978 | 0.0322 | 0.8640 |
| 0.5335 | 0.0619 | 15 | 0.5016 | 0.9016 | 0.9093 | 0.9016 | 0.8573 | 48 | 17912 | 1959 | 1 | 0.3503 | 0.0363 | 0.6924 |
| 0.4592 | 0.0825 | 20 | 0.4335 | 0.9094 | 0.9108 | 0.9094 | 0.8759 | 221 | 17895 | 1786 | 18 | 0.1774 | 0.0323 | 0.8683 |
| 0.3927 | 0.1031 | 25 | 0.3781 | 0.9104 | 0.9110 | 0.9104 | 0.8782 | 244 | 17891 | 1763 | 22 | 0.2297 | 0.0315 | 0.8110 |
| 0.3231 | 0.1237 | 30 | 0.3201 | 0.9138 | 0.9151 | 0.9138 | 0.8851 | 314 | 17889 | 1693 | 24 | 0.2900 | 0.0309 | 0.7424 |
| 0.2519 | 0.1443 | 35 | 0.2804 | 0.9196 | 0.9215 | 0.9196 | 0.8960 | 431 | 17887 | 1576 | 26 | 0.1141 | 0.0295 | 0.9340 |
| 0.1963 | 0.1649 | 40 | 0.2395 | 0.9346 | 0.9350 | 0.9346 | 0.9216 | 751 | 17866 | 1256 | 47 | 0.0898 | 0.0286 | 0.9623 |
| 0.1423 | 0.1856 | 45 | 0.3794 | 0.9048 | 0.9032 | 0.9048 | 0.8659 | 127 | 17897 | 1880 | 16 | 0.0901 | 0.0298 | 0.9659 |
| 0.1046 | 0.2062 | 50 | 0.3194 | 0.9287 | 0.9286 | 0.9287 | 0.9124 | 636 | 17863 | 1371 | 50 | 0.0751 | 0.0318 | 0.9767 |
| 0.0681 | 0.2268 | 55 | 0.4859 | 0.9021 | 0.9055 | 0.9021 | 0.8586 | 60 | 17909 | 1947 | 4 | 0.1709 | 0.0378 | 0.9015 |
| 0.0473 | 0.2474 | 60 | 0.5605 | 0.9100 | 0.9101 | 0.9100 | 0.8774 | 237 | 17890 | 1770 | 23 | 0.7055 | 0.0382 | 0.3149 |
| 0.0323 | 0.2680 | 65 | 0.5107 | 0.9164 | 0.9178 | 0.9164 | 0.8900 | 367 | 17887 | 1640 | 26 | 0.0703 | 0.0337 | 0.9791 |
| 0.0339 | 0.2887 | 70 | 0.8921 | 0.9026 | 0.9009 | 0.9026 | 0.8604 | 77 | 17903 | 1930 | 10 | 0.8316 | 0.0435 | 0.1773 |
| 0.0423 | 0.3093 | 75 | 0.8964 | 0.9030 | 0.8998 | 0.9030 | 0.8615 | 87 | 17900 | 1920 | 13 | 0.0732 | 0.0327 | 0.9753 |
| 0.0456 | 0.3299 | 80 | 1.0843 | 0.9013 | 0.8935 | 0.9013 | 0.8574 | 51 | 17902 | 1956 | 11 | 0.8520 | 0.0478 | 0.1126 |
| 0.0712 | 0.3505 | 85 | 0.8587 | 0.9023 | 0.8998 | 0.9023 | 0.8597 | 71 | 17903 | 1936 | 10 | 0.8665 | 0.0480 | 0.0990 |
| 0.0629 | 0.3711 | 90 | 0.4810 | 0.9267 | 0.9278 | 0.9267 | 0.9087 | 583 | 17877 | 1424 | 36 | 0.0848 | 0.0328 | 0.9683 |
| 0.0477 | 0.3918 | 95 | 0.9415 | 0.9094 | 0.9114 | 0.9094 | 0.8757 | 218 | 17897 | 1789 | 16 | 0.1219 | 0.0408 | 0.8890 |
| 0.0484 | 0.4124 | 100 | 0.7774 | 0.9150 | 0.9170 | 0.9150 | 0.8873 | 336 | 17891 | 1671 | 22 | 0.6906 | 0.0383 | 0.3129 |
| 0.0449 | 0.4330 | 105 | 0.3949 | 0.9197 | 0.9199 | 0.9197 | 0.8967 | 444 | 17876 | 1563 | 37 | 0.6527 | 0.0363 | 0.3629 |
| 0.0567 | 0.4536 | 110 | 0.5853 | 0.9232 | 0.9212 | 0.9232 | 0.9040 | 540 | 17850 | 1467 | 63 | 0.2192 | 0.0355 | 0.8158 |
| 0.0416 | 0.4742 | 115 | 0.7031 | 0.9036 | 0.9054 | 0.9036 | 0.8626 | 95 | 17905 | 1912 | 8 | 0.7633 | 0.0408 | 0.2549 |
| 0.1778 | 0.4948 | 120 | 0.5440 | 0.9033 | 0.9093 | 0.9033 | 0.8613 | 83 | 17910 | 1924 | 3 | 0.7389 | 0.0398 | 0.2838 |
| 0.036 | 0.5155 | 125 | 0.5825 | 0.9161 | 0.9187 | 0.9161 | 0.8892 | 356 | 17893 | 1651 | 20 | 0.1538 | 0.0366 | 0.9078 |
| 0.0797 | 0.5361 | 130 | 0.5864 | 0.9027 | 0.9059 | 0.9027 | 0.8601 | 73 | 17908 | 1934 | 5 | 0.2236 | 0.0407 | 0.7721 |
| 0.0669 | 0.5567 | 135 | 0.4264 | 0.9036 | 0.9079 | 0.9036 | 0.8623 | 92 | 17908 | 1915 | 5 | 0.1597 | 0.0370 | 0.8791 |
| 0.0353 | 0.5773 | 140 | 0.6355 | 0.9146 | 0.9161 | 0.9146 | 0.8866 | 330 | 17889 | 1677 | 24 | 0.1639 | 0.0357 | 0.9189 |
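
The aggregate metrics can be cross-checked against the confusion counts in each row. For the final evaluation row (step 140), accuracy falls straight out of Tp/Tn/Fn/Fp, while recall for the class counted by Tp is far below the Recall column (which appears to be a weighted average across the imbalanced classes):

```python
# Confusion counts from the final evaluation row (step 140) above.
tp, tn, fn, fp = 330, 17889, 1677, 24

accuracy = (tp + tn) / (tp + tn + fn + fp)
print(round(accuracy, 4))  # → 0.9146, matching the reported accuracy

# Per-class recall for the Tp class, much lower than the weighted
# recall in the table because that class is a small minority:
recall_tp_class = tp / (tp + fn)
print(round(recall_tp_class, 4))  # → 0.1644
```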

### Framework versions

- Transformers 4.44.0
- Pytorch 2.4.0
- Datasets 2.21.0
- Tokenizers 0.19.1