
dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.7

This model is a fine-tuned version of microsoft/dit-base on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 3.1844
  • Accuracy: 0.18
  • Brier Loss: 0.8763
  • NLL: 6.0873
  • F1 Micro: 0.18
  • F1 Macro: 0.0306
  • ECE: 0.2492
  • AURC: 0.8505
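
For context, a minimal inference sketch using the Transformers Auto classes; the repo id and image path below are hypothetical placeholders, not confirmed by this card:

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "your-org/dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.7"  # hypothetical repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("document.png").convert("RGB")  # hypothetical document image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```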

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 16
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 25
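
A hedged sketch of the same configuration expressed as Transformers TrainingArguments (Transformers 4.26.1); the output_dir is a hypothetical placeholder, and the Adam settings mirror the Trainer defaults listed above:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above; output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="dit-tiny_tobacco3482_kd_CEKD_t5.0_a0.7",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=16,  # 16 x 16 = 256 effective train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=25,
    adam_beta1=0.9,    # Adam betas=(0.9, 0.999), epsilon=1e-08 as listed
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```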

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Brier Loss | NLL | F1 Micro | F1 Macro | ECE | AURC |
|---|---|---|---|---|---|---|---|---|---|---|
| No log | 0.96 | 3 | 3.3625 | 0.145 | 0.8999 | 10.1577 | 0.145 | 0.0253 | 0.2220 | 0.8466 |
| No log | 1.96 | 6 | 3.3300 | 0.145 | 0.8947 | 10.5652 | 0.145 | 0.0253 | 0.2237 | 0.8468 |
| No log | 2.96 | 9 | 3.2822 | 0.14 | 0.8870 | 8.5877 | 0.14 | 0.0453 | 0.2040 | 0.8325 |
| No log | 3.96 | 12 | 3.2442 | 0.16 | 0.8812 | 6.5385 | 0.16 | 0.0327 | 0.2208 | 0.8814 |
| No log | 4.96 | 15 | 3.2219 | 0.155 | 0.8784 | 7.1527 | 0.155 | 0.0271 | 0.2352 | 0.8898 |
| No log | 5.96 | 18 | 3.2105 | 0.185 | 0.8778 | 8.7319 | 0.185 | 0.0517 | 0.2548 | 0.8944 |
| No log | 6.96 | 21 | 3.2032 | 0.18 | 0.8778 | 8.8034 | 0.18 | 0.0308 | 0.2478 | 0.8527 |
| No log | 7.96 | 24 | 3.1980 | 0.18 | 0.8779 | 8.1814 | 0.18 | 0.0306 | 0.2635 | 0.8527 |
| No log | 8.96 | 27 | 3.1937 | 0.18 | 0.8777 | 7.0314 | 0.18 | 0.0306 | 0.2618 | 0.8529 |
| No log | 9.96 | 30 | 3.1915 | 0.18 | 0.8776 | 6.9166 | 0.18 | 0.0306 | 0.2591 | 0.8537 |
| No log | 10.96 | 33 | 3.1900 | 0.18 | 0.8774 | 6.8864 | 0.18 | 0.0306 | 0.2551 | 0.8535 |
| No log | 11.96 | 36 | 3.1889 | 0.18 | 0.8773 | 6.5148 | 0.18 | 0.0306 | 0.2547 | 0.8532 |
| No log | 12.96 | 39 | 3.1881 | 0.18 | 0.8771 | 6.1469 | 0.18 | 0.0306 | 0.2543 | 0.8530 |
| No log | 13.96 | 42 | 3.1872 | 0.18 | 0.8769 | 6.1318 | 0.18 | 0.0306 | 0.2538 | 0.8525 |
| No log | 14.96 | 45 | 3.1865 | 0.18 | 0.8768 | 6.0783 | 0.18 | 0.0306 | 0.2501 | 0.8525 |
| No log | 15.96 | 48 | 3.1859 | 0.18 | 0.8766 | 6.0654 | 0.18 | 0.0306 | 0.2500 | 0.8520 |
| No log | 16.96 | 51 | 3.1855 | 0.18 | 0.8766 | 6.0809 | 0.18 | 0.0306 | 0.2459 | 0.8516 |
| No log | 17.96 | 54 | 3.1855 | 0.18 | 0.8766 | 6.0610 | 0.18 | 0.0306 | 0.2497 | 0.8515 |
| No log | 18.96 | 57 | 3.1854 | 0.18 | 0.8766 | 6.0659 | 0.18 | 0.0306 | 0.2579 | 0.8515 |
| No log | 19.96 | 60 | 3.1850 | 0.18 | 0.8764 | 6.0737 | 0.18 | 0.0306 | 0.2656 | 0.8513 |
| No log | 20.96 | 63 | 3.1848 | 0.18 | 0.8764 | 6.0869 | 0.18 | 0.0306 | 0.2575 | 0.8510 |
| No log | 21.96 | 66 | 3.1846 | 0.18 | 0.8764 | 6.1423 | 0.18 | 0.0306 | 0.2533 | 0.8510 |
| No log | 22.96 | 69 | 3.1845 | 0.18 | 0.8763 | 6.1047 | 0.18 | 0.0306 | 0.2532 | 0.8505 |
| No log | 23.96 | 72 | 3.1845 | 0.18 | 0.8763 | 6.0895 | 0.18 | 0.0306 | 0.2532 | 0.8504 |
| No log | 24.96 | 75 | 3.1844 | 0.18 | 0.8763 | 6.0873 | 0.18 | 0.0306 | 0.2492 | 0.8505 |
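
The Brier Loss and ECE columns above are standard calibration metrics. A minimal NumPy sketch of how they are typically computed (assuming equal-width confidence bins for ECE; this is not necessarily the exact evaluation code used here):

```python
import numpy as np

def brier_score(probs, labels):
    """Mean squared error between predicted probabilities and one-hot targets."""
    onehot = np.eye(probs.shape[1])[labels]
    return ((probs - onehot) ** 2).sum(axis=1).mean()

def expected_calibration_error(probs, labels, n_bins=10):
    """Weighted average gap between per-bin confidence and per-bin accuracy."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            ece += mask.mean() * abs(accuracies[mask].mean() - confidences[mask].mean())
    return float(ece)
```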

Framework versions

  • Transformers 4.26.1
  • Pytorch 1.13.1.post200
  • Datasets 2.9.0
  • Tokenizers 0.13.2