---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - f1
  - accuracy
model-index:
  - name: distilBERT-finetuned-resumes-sections
    results: []
---

# distilBERT-finetuned-resumes-sections

This model is a fine-tuned version of [Geotrend/distilbert-base-en-fr-cased](https://huggingface.co/Geotrend/distilbert-base-en-fr-cased) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0450
- F1: 0.9585
- ROC AUC: 0.9774
- Accuracy: 0.9557
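
The classification head is not documented above, but the F1 / ROC AUC / accuracy metric combination is typical of a multi-label setup. Below is a minimal inference sketch; the checkpoint id `has-abi/distilBERT-finetuned-resumes-sections`, the multi-label assumption, and the 0.5 decision threshold are assumptions, not documented behaviour.

```python
# Minimal inference sketch (assumptions: checkpoint id, multi-label head, 0.5 threshold).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "has-abi/distilBERT-finetuned-resumes-sections"  # assumed Hub id

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

text = "Led a team of 5 engineers to deliver a customer-facing analytics dashboard."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Sigmoid + threshold assumes a multi-label classification head.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```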

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
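
If the model was trained with the Hugging Face `Trainer` API (not confirmed above), the listed hyperparameters would map onto `TrainingArguments` roughly as in the sketch below; `output_dir` and `evaluation_strategy` are illustrative assumptions.

```python
from transformers import TrainingArguments

# Sketch mapping the hyperparameters above onto TrainingArguments
# (assumes the Trainer API; output_dir and evaluation_strategy are illustrative).
training_args = TrainingArguments(
    output_dir="distilBERT-finetuned-resumes-sections",
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",  # per-epoch validation, consistent with the results table below
)
```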

### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     | ROC AUC | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:-------:|:--------:|
| 0.0518        | 1.0   | 1174  | 0.0368          | 0.9406 | 0.9635  | 0.9302   |
| 0.0251        | 2.0   | 2348  | 0.0346          | 0.9375 | 0.9653  | 0.9289   |
| 0.0136        | 3.0   | 3522  | 0.0343          | 0.9475 | 0.9707  | 0.9425   |
| 0.0096        | 4.0   | 4696  | 0.0326          | 0.9539 | 0.9737  | 0.9468   |
| 0.007         | 5.0   | 5870  | 0.0357          | 0.9521 | 0.9740  | 0.9480   |
| 0.007         | 6.0   | 7044  | 0.0389          | 0.9509 | 0.9725  | 0.9472   |
| 0.0034        | 7.0   | 8218  | 0.0403          | 0.9532 | 0.9746  | 0.9510   |
| 0.0033        | 8.0   | 9392  | 0.0422          | 0.9493 | 0.9722  | 0.9468   |
| 0.0024        | 9.0   | 10566 | 0.0425          | 0.9512 | 0.9733  | 0.9485   |
| 0.0023        | 10.0  | 11740 | 0.0431          | 0.9537 | 0.9743  | 0.9502   |
| 0.0019        | 11.0  | 12914 | 0.0457          | 0.9501 | 0.9719  | 0.9463   |
| 0.002         | 12.0  | 14088 | 0.0428          | 0.9560 | 0.9751  | 0.9536   |
| 0.0012        | 13.0  | 15262 | 0.0435          | 0.9569 | 0.9761  | 0.9553   |
| 0.001         | 14.0  | 16436 | 0.0464          | 0.9565 | 0.9759  | 0.9544   |
| 0.001         | 15.0  | 17610 | 0.0460          | 0.9574 | 0.9766  | 0.9549   |
| 0.0007        | 16.0  | 18784 | 0.0450          | 0.9585 | 0.9774  | 0.9557   |
| 0.0003        | 17.0  | 19958 | 0.0481          | 0.9572 | 0.9764  | 0.9553   |
| 0.0005        | 18.0  | 21132 | 0.0478          | 0.9576 | 0.9764  | 0.9557   |
| 0.0005        | 19.0  | 22306 | 0.0483          | 0.9574 | 0.9766  | 0.9553   |
| 0.0005        | 20.0  | 23480 | 0.0481          | 0.9576 | 0.9766  | 0.9557   |

### Framework versions

- Transformers 4.21.1
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1