---
license: apache-2.0
base_model: ai-forever/ruBert-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - recall
  - precision
  - f1
model-index:
  - name: training_results
    results: []
---

# training_results

This model is a fine-tuned version of [ai-forever/ruBert-base](https://huggingface.co/ai-forever/ruBert-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 2.1162
- Accuracy: 0.7193
- Recall: 0.7342
- Precision: 0.7262
- F1: 0.7271
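
The metric set (accuracy, recall, precision, F1) indicates a sequence-classification head on top of ruBert-base. A minimal inference sketch, assuming the checkpoint is published under a hypothetical repo id `logiczmaksimka/training_results` and the task is single-label text classification:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical repo id; substitute the actual checkpoint path.
MODEL_ID = "logiczmaksimka/training_results"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

# ruBert-base is a Russian BERT, so the model expects Russian input text.
text = "Пример входного текста."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Index of the highest-scoring class; label names depend on the training data.
print(logits.argmax(dim=-1).item())
```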

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
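
For reproducibility, here is a sketch of how these settings map onto `transformers.TrainingArguments`. The output directory and the per-epoch evaluation strategy are assumptions (the latter is consistent with the per-epoch rows in the results table below), not values taken from the original run:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="training_results",   # assumed; not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",     # assumed from the per-epoch results table
)
```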

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Recall | Precision | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| No log        | 1.0   | 200  | 0.7908          | 0.7251   | 0.6546 | 0.6651    | 0.6511 |
| No log        | 2.0   | 400  | 0.9332          | 0.6988   | 0.6728 | 0.7010    | 0.6670 |
| 0.6701        | 3.0   | 600  | 1.0779          | 0.7427   | 0.7348 | 0.7912    | 0.7473 |
| 0.6701        | 4.0   | 800  | 1.2092          | 0.7427   | 0.6989 | 0.7237    | 0.7037 |
| 0.1446        | 5.0   | 1000 | 1.5001          | 0.7281   | 0.7049 | 0.7784    | 0.7231 |
| 0.1446        | 6.0   | 1200 | 1.5468          | 0.7368   | 0.7078 | 0.7808    | 0.7256 |
| 0.1446        | 7.0   | 1400 | 1.7923          | 0.7193   | 0.7284 | 0.7360    | 0.7230 |
| 0.0354        | 8.0   | 1600 | 1.7583          | 0.7339   | 0.7357 | 0.7462    | 0.7357 |
| 0.0354        | 9.0   | 1800 | 1.7942          | 0.7485   | 0.7203 | 0.7576    | 0.7308 |
| 0.0259        | 10.0  | 2000 | 1.9056          | 0.7310   | 0.7059 | 0.7337    | 0.7143 |
| 0.0259        | 11.0  | 2200 | 2.0351          | 0.7018   | 0.7047 | 0.6977    | 0.6952 |
| 0.0259        | 12.0  | 2400 | 1.6337          | 0.7602   | 0.7246 | 0.7861    | 0.7338 |
| 0.0262        | 13.0  | 2600 | 1.9012          | 0.7251   | 0.7056 | 0.7611    | 0.7148 |
| 0.0262        | 14.0  | 2800 | 2.0006          | 0.7339   | 0.7022 | 0.7625    | 0.7166 |
| 0.0302        | 15.0  | 3000 | 2.2857          | 0.6842   | 0.7072 | 0.6874    | 0.6766 |
| 0.0302        | 16.0  | 3200 | 2.0855          | 0.7310   | 0.7168 | 0.7557    | 0.7232 |
| 0.0302        | 17.0  | 3400 | 2.1281          | 0.7398   | 0.6868 | 0.7629    | 0.7132 |
| 0.0321        | 18.0  | 3600 | 2.1162          | 0.7193   | 0.7342 | 0.7262    | 0.7271 |
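
The headline evaluation results match the final logged row (epoch 18, step 3600), even though `num_epochs` was set to 100. Metrics like these are typically produced by a `compute_metrics` callback passed to `transformers.Trainer`; a scikit-learn sketch follows. The macro averaging is an assumption, since the card does not state how recall, precision, and F1 were aggregated:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Metric callback for transformers.Trainer (a sketch, not the original code)."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # average="macro" is an assumption; the card does not specify the averaging.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "recall": recall,
        "precision": precision,
        "f1": f1,
    }
```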

## Framework versions

- Transformers 4.34.0
- Pytorch 2.1.0+cu121
- Datasets 2.14.5
- Tokenizers 0.14.1