
finetuned_helsinki_peft_model__en_to_ar

This model is a PEFT adapter fine-tuned from Helsinki-NLP/opus-mt-en-ar for English-to-Arabic translation on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.7775
  • BLEU: 27.1364
  • Gen Len (mean generated token count): 13.5265

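BLEU and Gen Len here are the usual outputs of a translation `compute_metrics` hook: corpus BLEU via sacrebleu and the mean length of the generated sequences in tokens. The snippet below is a minimal sketch of such a hook using the 🤗 `evaluate` library and the base model's tokenizer; it is not necessarily the exact evaluation code used for this card.

```python
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-ar")
bleu = evaluate.load("sacrebleu")

def compute_metrics(eval_preds):
    """Typical translation metrics: corpus BLEU and mean generation length."""
    preds, labels = eval_preds
    # Replace the -100 padding used for the loss with the pad token before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = bleu.compute(
        predictions=[p.strip() for p in decoded_preds],
        references=[[l.strip()] for l in decoded_labels],
    )
    # Gen Len: average number of non-pad tokens in each generated sequence.
    gen_len = np.mean([np.count_nonzero(p != tokenizer.pad_token_id) for p in preds])
    return {"bleu": round(result["score"], 4), "gen_len": round(float(gen_len), 4)}
```
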
Model description

More information needed

Intended uses & limitations

More information needed

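As a minimal illustration of the intended use (English-to-Arabic translation), the adapter can be loaded on top of the frozen base model with PEFT. This is a sketch, assuming the adapter is published on the Hub as mido545/finetuned_helsinki_peft_model__en_to_ar; substitute a local path if you have the adapter files on disk.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import PeftModel

base_id = "Helsinki-NLP/opus-mt-en-ar"
adapter_id = "mido545/finetuned_helsinki_peft_model__en_to_ar"  # assumed Hub id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(base_id)
# Attach the fine-tuned PEFT adapter to the frozen base weights.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

text = "How are you today?"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```

If the adapter is LoRA-based, `model.merge_and_unload()` folds the adapter weights into the base model so it can be served without the PEFT wrapper.
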
Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a configuration sketch follows the list:

  • learning_rate: 2e-05
  • train_batch_size: 24
  • eval_batch_size: 24
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 10000
  • mixed_precision_training: Native AMP

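The hyperparameters above map roughly onto the `Seq2SeqTrainingArguments` below. This is a reconstruction sketch, not the original training script: the PEFT/LoRA settings (r, lora_alpha, target modules) are not reported in this card, so the `LoraConfig` values are placeholders.

```python
from transformers import AutoModelForSeq2SeqLM, Seq2SeqTrainingArguments
from peft import LoraConfig, get_peft_model, TaskType

base_model = AutoModelForSeq2SeqLM.from_pretrained("Helsinki-NLP/opus-mt-en-ar")

# Placeholder LoRA settings: the actual r / alpha / target modules are not reported in this card.
peft_config = LoraConfig(task_type=TaskType.SEQ_2_SEQ_LM, r=16, lora_alpha=32, lora_dropout=0.1)
model = get_peft_model(base_model, peft_config)

# Mirrors the hyperparameters listed above.
training_args = Seq2SeqTrainingArguments(
    output_dir="finetuned_helsinki_peft_model__en_to_ar",
    learning_rate=2e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=10000,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=500,                 # matches the 500-step evaluation cadence in the results table
    predict_with_generate=True,
)
```

The Trainer's default optimizer uses betas=(0.9, 0.999) and epsilon=1e-08, matching the values listed above, so no explicit optimizer override is needed.
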
Training results

| Training Loss | Epoch | Step  | Validation Loss | BLEU    | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|
| 3.0607        | 0.2   | 500   | 1.8457          | 26.8804 | 13.4975 |
| 2.7235        | 0.4   | 1000  | 1.8179          | 26.9476 | 13.5825 |
| 2.7011        | 0.6   | 1500  | 1.8063          | 26.8946 | 13.632  |
| 2.641         | 0.8   | 2000  | 1.7996          | 27.0619 | 13.613  |
| 2.7115        | 1.0   | 2500  | 1.7959          | 26.9972 | 13.616  |
| 2.694         | 1.2   | 3000  | 1.7931          | 27.0648 | 13.587  |
| 2.6653        | 1.4   | 3500  | 1.7906          | 27.058  | 13.5555 |
| 2.6602        | 1.6   | 4000  | 1.7882          | 26.9729 | 13.5755 |
| 2.6234        | 1.8   | 4500  | 1.7838          | 27.0022 | 13.566  |
| 2.5851        | 2.0   | 5000  | 1.7827          | 26.9623 | 13.561  |
| 2.5532        | 2.2   | 5500  | 1.7811          | 27.1004 | 13.5305 |
| 2.6314        | 2.4   | 6000  | 1.7800          | 26.9216 | 13.4905 |
| 2.6261        | 2.6   | 6500  | 1.7792          | 26.9747 | 13.52   |
| 2.6228        | 2.8   | 7000  | 1.7787          | 26.9506 | 13.529  |
| 2.7118        | 3.0   | 7500  | 1.7781          | 26.9363 | 13.5615 |
| 2.6205        | 3.2   | 8000  | 1.7777          | 27.0652 | 13.555  |
| 2.5799        | 3.4   | 8500  | 1.7775          | 27.0987 | 13.5325 |
| 2.6044        | 3.6   | 9000  | 1.7776          | 27.1322 | 13.5225 |
| 2.6391        | 3.8   | 9500  | 1.7775          | 27.1541 | 13.5285 |
| 2.6589        | 4.0   | 10000 | 1.7775          | 27.1364 | 13.5265 |

Framework versions

  • PEFT 0.10.1.dev0
  • Transformers 4.40.2
  • Pytorch 2.3.0+cu118
  • Datasets 2.19.1
  • Tokenizers 0.19.1