---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - opus100
metrics:
  - bleu
model-index:
  - name: opus-mt-ar-en-finetuned-ar-to-en
    results:
      - task:
          name: Sequence-to-sequence Language Modeling
          type: text2text-generation
        dataset:
          name: opus100
          type: opus100
          args: ar-en
        metrics:
          - name: Bleu
            type: bleu
            value: 46.8089
---

# opus-mt-ar-en-finetuned-ar-to-en

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-ar-en](https://huggingface.co/Helsinki-NLP/opus-mt-ar-en) on the opus100 dataset. It achieves the following results on the evaluation set:

- Loss: 1.0713
- Bleu: 46.8089
- Gen Len: 14.1755
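
A minimal inference sketch with the transformers library is shown below. The hub id is an assumption based on this repository's name; substitute the actual model id if it differs.

```python
# Minimal Arabic-to-English inference sketch for this model.
# The model id is an assumption based on the repository name.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "PontifexMaximus/Arabic2"  # assumed hub id; adjust if needed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate a short Arabic sentence ("Hello, world") to English.
inputs = tokenizer("مرحبا بالعالم", return_tensors="pt")
outputs = model.generate(**inputs, max_length=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```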

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
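
The metadata above records the opus100 `ar-en` configuration. A minimal sketch of loading it with the datasets library, assuming the standard hub layout for opus100:

```python
# Load the opus100 Arabic-English pairs named in the model-index metadata.
from datasets import load_dataset

raw_datasets = load_dataset("opus100", "ar-en")
print(raw_datasets["train"][0])
# e.g. {'translation': {'ar': '...', 'en': '...'}}
```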

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
- mixed_precision_training: Native AMP
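
As a rough guide, these settings map onto the transformers Trainer API as sketched below. `output_dir`, `evaluation_strategy`, and `predict_with_generate` are assumptions, not values recorded in the card; the Adam betas and epsilon match the Trainer defaults.

```python
# Sketch of the hyperparameters above as Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="opus-mt-ar-en-finetuned-ar-to-en",  # assumed
    learning_rate=2e-06,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=16,
    fp16=True,                    # Native AMP mixed-precision training
    evaluation_strategy="epoch",  # assumed; matches the per-epoch results below
    predict_with_generate=True,   # assumed; needed for Bleu/Gen Len metrics
)
```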

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu    | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|
| No log        | 1.0   | 312  | 1.2132          | 43.7663 | 14.4193 |
| 1.3072        | 2.0   | 624  | 1.1869          | 44.1712 | 14.4054 |
| 1.3072        | 3.0   | 936  | 1.1675          | 44.5448 | 14.2182 |
| 1.2535        | 4.0   | 1248 | 1.1510          | 44.8762 | 14.2004 |
| 1.2309        | 5.0   | 1560 | 1.1375          | 45.2067 | 14.1375 |
| 1.2309        | 6.0   | 1872 | 1.1251          | 45.4479 | 14.1887 |
| 1.21          | 7.0   | 2184 | 1.1145          | 45.7117 | 14.2103 |
| 1.21          | 8.0   | 2496 | 1.1051          | 45.951  | 14.1665 |
| 1.1896        | 9.0   | 2808 | 1.0968          | 46.1647 | 14.178  |
| 1.1837        | 10.0  | 3120 | 1.0899          | 46.342  | 14.1819 |
| 1.1837        | 11.0  | 3432 | 1.0842          | 46.4735 | 14.1672 |
| 1.1589        | 12.0  | 3744 | 1.0795          | 46.561  | 14.1729 |
| 1.1523        | 13.0  | 4056 | 1.0759          | 46.6884 | 14.1706 |
| 1.1523        | 14.0  | 4368 | 1.0733          | 46.7542 | 14.1735 |
| 1.1524        | 15.0  | 4680 | 1.0718          | 46.7835 | 14.1712 |
| 1.1524        | 16.0  | 4992 | 1.0713          | 46.8089 | 14.1755 |
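
Translation fine-tuning scripts typically score with sacreBLEU; a minimal scoring sketch using the metric API available in this Datasets version, with illustrative strings only:

```python
# Score hypothetical translations with sacreBLEU via datasets.load_metric.
from datasets import load_metric

bleu = load_metric("sacrebleu")
predictions = ["The weather is nice today."]   # hypothetical model outputs
references = [["The weather is nice today."]]  # one reference list per prediction
result = bleu.compute(predictions=predictions, references=references)
print(round(result["score"], 4))
```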

### Framework versions

- Transformers 4.19.2
- Pytorch 1.7.1+cu110
- Datasets 2.2.2
- Tokenizers 0.12.1