---
license: apache-2.0
base_model: judy93536/distilroberta-rbm231k-ep20-op40
tags:
  - generated_from_trainer
datasets:
  - financial_phrasebank
metrics:
  - accuracy
model-index:
  - name: distilroberta-rbm231k-ep20-op40-all-agree_2p2k
    results:
      - task:
          name: Text Classification
          type: text-classification
        dataset:
          name: financial_phrasebank
          type: financial_phrasebank
          config: sentences_allagree
          split: train
          args: sentences_allagree
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.9602649006622517
---

# distilroberta-rbm231k-ep20-op40-all-agree_2p2k

This model is a fine-tuned version of [judy93536/distilroberta-rbm231k-ep20-op40](https://huggingface.co/judy93536/distilroberta-rbm231k-ep20-op40) on the financial_phrasebank dataset. It achieves the following results on the evaluation set:

- Loss: 0.1320
- Accuracy: 0.9603
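A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub under this repository name (the example sentence is illustrative, not from the evaluation set):

```python
from transformers import pipeline

# Load the fine-tuned sentiment classifier from the Hub.
classifier = pipeline(
    "text-classification",
    model="judy93536/distilroberta-rbm231k-ep20-op40-all-agree_2p2k",
)

# Returns a list of {"label": ..., "score": ...} dicts.
print(classifier("Operating profit rose to EUR 13.1 mn from EUR 8.7 mn."))
```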

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
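The card does not document how the data was split. A plausible loading sketch for the all-agree subset is below; the 80/20 train/eval split is an assumption inferred from the step counts in the results table (114 steps/epoch at batch size 16 implies roughly 1,800 training sentences out of the 2,264 in this config), not something stated here:

```python
from datasets import load_dataset

# "sentences_allagree" keeps only sentences where every annotator
# agreed on the label (2,264 sentences in total).
ds = load_dataset("financial_phrasebank", "sentences_allagree")

# Assumed 80/20 train/eval split -- not documented in this card.
splits = ds["train"].train_test_split(test_size=0.2, seed=42)
print(splits["train"].num_rows, splits["test"].num_rows)
```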

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1.253335054745316e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.4
- num_epochs: 30
- mixed_precision_training: Native AMP
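The unusually large warmup ratio (0.4) means the learning rate is still rising for a large share of training. A quick sanity-check of the schedule these settings imply (the steps-per-epoch value is taken from the results table):

```python
# Sanity-check the training schedule implied by the hyperparameters.
steps_per_epoch = 114      # from the results table
num_epochs = 30
warmup_ratio = 0.4

total_steps = steps_per_epoch * num_epochs
warmup_steps = int(total_steps * warmup_ratio)

print(total_steps)    # 3420 -- matches the final step in the table
print(warmup_steps)   # 1368 -- LR is still warming up through epoch 12
```

This is consistent with the slow accuracy climb in the table: validation accuracy only starts improving sharply once warmup ends around step 1368 (epoch 12).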

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 114  | 1.0789          | 0.4327   |
| No log        | 2.0   | 228  | 1.0442          | 0.6115   |
| No log        | 3.0   | 342  | 0.9709          | 0.6137   |
| No log        | 4.0   | 456  | 0.8693          | 0.6115   |
| 1.0223        | 5.0   | 570  | 0.8346          | 0.6115   |
| 1.0223        | 6.0   | 684  | 0.7876          | 0.6115   |
| 1.0223        | 7.0   | 798  | 0.7355          | 0.6203   |
| 1.0223        | 8.0   | 912  | 0.6974          | 0.6733   |
| 0.7904        | 9.0   | 1026 | 0.6535          | 0.7219   |
| 0.7904        | 10.0  | 1140 | 0.6045          | 0.7550   |
| 0.7904        | 11.0  | 1254 | 0.5653          | 0.7770   |
| 0.7904        | 12.0  | 1368 | 0.5122          | 0.7859   |
| 0.7904        | 13.0  | 1482 | 0.4652          | 0.7881   |
| 0.5806        | 14.0  | 1596 | 0.4319          | 0.7991   |
| 0.5806        | 15.0  | 1710 | 0.3951          | 0.8057   |
| 0.5806        | 16.0  | 1824 | 0.3557          | 0.8168   |
| 0.5806        | 17.0  | 1938 | 0.3174          | 0.8565   |
| 0.3751        | 18.0  | 2052 | 0.2652          | 0.9007   |
| 0.3751        | 19.0  | 2166 | 0.2188          | 0.9404   |
| 0.3751        | 20.0  | 2280 | 0.1797          | 0.9470   |
| 0.3751        | 21.0  | 2394 | 0.1822          | 0.9492   |
| 0.1873        | 22.0  | 2508 | 0.1523          | 0.9514   |
| 0.1873        | 23.0  | 2622 | 0.1425          | 0.9581   |
| 0.1873        | 24.0  | 2736 | 0.1394          | 0.9581   |
| 0.1873        | 25.0  | 2850 | 0.1396          | 0.9603   |
| 0.1873        | 26.0  | 2964 | 0.1345          | 0.9603   |
| 0.1072        | 27.0  | 3078 | 0.1334          | 0.9603   |
| 0.1072        | 28.0  | 3192 | 0.1322          | 0.9603   |
| 0.1072        | 29.0  | 3306 | 0.1316          | 0.9603   |
| 0.1072        | 30.0  | 3420 | 0.1320          | 0.9603   |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0