
North-T5-large_NO-QA-idun-20epoch

This model is a fine-tuned version of north/t5_large_NCC_modern on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.5362
  • Rouge1: 38.769
  • Rouge2: 15.9471
  • RougeL: 26.5124
  • RougeLsum: 34.9947
  • Gen Len: 94.6489
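
A minimal loading sketch with the transformers seq2seq API is given below. The Hub id Akselssss/North-T5-large_NO-QA-idun-20epoch is taken from this repository; the expected input format (task prefix, context/question layout) is not documented here, so the example input is only an assumption.

```python
# Minimal sketch: load the fine-tuned checkpoint and generate an answer.
# The input format below is an assumption; adapt it to the actual training data.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "Akselssss/North-T5-large_NO-QA-idun-20epoch"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # Norwegian context/question text; exact formatting is an assumption
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```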

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
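
For reference, a sketch of how these values could be expressed with Seq2SeqTrainingArguments is shown below. The output directory is a placeholder, the evaluation cadence is an assumption inferred from the per-epoch results table, and the actual training script is not part of this card.

```python
# Sketch only: maps the hyperparameters above onto Seq2SeqTrainingArguments.
# Model, tokenizer, and datasets are assumed to be defined elsewhere.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="north-t5-large-no-qa",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=4,      # effective train batch size: 8
    num_train_epochs=20,
    lr_scheduler_type="linear",
    seed=42,
    predict_with_generate=True,         # required to report ROUGE / Gen Len
    evaluation_strategy="epoch",        # assumption; eval cadence is not stated
)
```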

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len  |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
| No log        | 0.98  | 46   | 4.5352          | 17.0492 | 4.4759  | 10.456  | 15.2577   | 118.1702 |
| No log        | 1.99  | 93   | 1.9786          | 31.9727 | 10.2837 | 18.7297 | 28.8476   | 82.0426  |
| No log        | 2.99  | 140  | 1.6381          | 33.0332 | 11.5675 | 20.361  | 29.3026   | 74.0     |
| No log        | 4.0   | 187  | 1.5769          | 35.8586 | 13.2608 | 23.5819 | 32.6021   | 90.3298  |
| No log        | 4.98  | 233  | 1.5411          | 37.644  | 14.5558 | 24.9032 | 33.9629   | 94.6702  |
| No log        | 5.99  | 280  | 1.5349          | 36.9237 | 14.2153 | 24.9174 | 33.2155   | 85.8404  |
| No log        | 6.99  | 327  | 1.5120          | 38.1967 | 15.2791 | 25.5664 | 34.4922   | 92.9362  |
| No log        | 8.0   | 374  | 1.5094          | 38.8448 | 15.6077 | 26.2711 | 34.9747   | 93.0851  |
| No log        | 8.98  | 420  | 1.5133          | 38.0596 | 15.1036 | 26.38   | 34.2785   | 90.2021  |
| No log        | 9.99  | 467  | 1.5221          | 38.465  | 14.8936 | 26.014  | 34.6291   | 97.8085  |
| 2.5538        | 10.99 | 514  | 1.5207          | 39.806  | 16.0433 | 27.2048 | 35.8647   | 97.9149  |
| 2.5538        | 12.0  | 561  | 1.5194          | 38.1513 | 15.4085 | 26.1441 | 34.4682   | 88.4787  |
| 2.5538        | 12.98 | 607  | 1.5199          | 38.2157 | 15.363  | 26.0975 | 34.5609   | 94.9043  |
| 2.5538        | 13.99 | 654  | 1.5243          | 38.6499 | 15.4096 | 25.9533 | 34.6419   | 94.0638  |
| 2.5538        | 14.99 | 701  | 1.5297          | 38.1416 | 15.1179 | 26.139  | 34.6999   | 93.7447  |
| 2.5538        | 16.0  | 748  | 1.5320          | 38.6153 | 15.6661 | 26.3465 | 34.9576   | 95.7979  |
| 2.5538        | 16.98 | 794  | 1.5318          | 38.0022 | 15.5531 | 25.9628 | 34.2994   | 93.3936  |
| 2.5538        | 17.99 | 841  | 1.5364          | 37.9608 | 15.396  | 25.7531 | 34.3565   | 92.2553  |
| 2.5538        | 18.99 | 888  | 1.5352          | 38.9808 | 16.235  | 26.7723 | 35.2738   | 95.3085  |
| 2.5538        | 19.68 | 920  | 1.5362          | 38.769  | 15.9471 | 26.5124 | 34.9947   | 94.6489  |
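
ROUGE scores in this scale (0-100) are the kind produced by the Hugging Face evaluate library's rouge metric inside a compute_metrics function. The sketch below is illustrative only; the predictions and references are placeholders, and the actual evaluation script and dataset are not documented in this card.

```python
# Sketch: computing ROUGE scores of the kind reported above.
# Placeholder predictions/references stand in for decoded model outputs and gold answers.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["generert svar fra modellen"]      # placeholder model outputs
references = ["fasitsvar fra evalueringssettet"]  # placeholder gold answers
scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
print({k: round(v * 100, 4) for k, v in scores.items()})  # rouge1, rouge2, rougeL, rougeLsum
```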

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.3.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2