---
license: mit
tags:
  - generated_from_trainer
model-index:
  - name: camembert-ner-finetuned-jul
    results: []
---

# camembert-ner-finetuned-jul

This model is a fine-tuned version of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.1016
- Overall Precision: 0.7951
- Overall Recall: 0.7864
- Overall F1: 0.7907
- Overall Accuracy: 0.9834

Per-entity scores on the evaluation set (support = number of gold entities):

| Entity | Precision | Recall | F1     | Support |
|:-------|----------:|-------:|-------:|--------:|
| LOC    |    0.7561 | 0.7848 | 0.7702 |     316 |
| MISC   |    0.5682 | 0.4464 | 0.5000 |      56 |
| ORG    |    0.8185 | 0.8185 | 0.8185 |     303 |
| PER    |    0.8457 | 0.8168 | 0.8310 |     322 |
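
These per-entity numbers follow the output format of the seqeval metric. Below is a minimal sketch of computing such scores with the `evaluate` library (an assumption about the tooling used; the tag sequences are illustrative, not drawn from this model's data):

```python
# Minimal sketch: per-entity NER scores in the seqeval output format,
# computed via the evaluate library. Tag sequences are illustrative only.
import evaluate

seqeval = evaluate.load("seqeval")

references = [["B-PER", "I-PER", "O", "B-LOC", "O"]]
predictions = [["B-PER", "I-PER", "O", "B-ORG", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
# results holds one dict per entity type, e.g.
#   {'PER': {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 1}, ...}
# plus overall_precision, overall_recall, overall_f1, overall_accuracy.
print(results)
```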

## Model description

More information needed

## Intended uses & limitations

More information needed
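
Pending a fuller description, here is a minimal inference sketch using the `transformers` token-classification pipeline. The hub id `fgiauna/camembert-ner-finetuned-jul` is inferred from the card title and uploader, and may not match the actual repository name:

```python
# Minimal inference sketch using the transformers token-classification
# pipeline. The hub id is inferred from the card and may differ.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="fgiauna/camembert-ner-finetuned-jul",  # assumed hub id
    aggregation_strategy="simple",  # merge subword pieces into entity spans
)

# CamemBERT is a French model, so a French example sentence:
print(ner("Emmanuel Macron a visité le siège de Renault à Boulogne-Billancourt."))
```

With `aggregation_strategy="simple"`, the pipeline returns one dict per predicted entity span (entity group, score, word, character offsets) rather than one per token.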

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
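
As a sketch, these settings map onto `transformers` `TrainingArguments` roughly as follows; `output_dir` and `evaluation_strategy` are assumptions (the card reports metrics once per epoch), and the Adam betas/epsilon listed above are the library defaults:

```python
# Sketch mapping the hyperparameters above onto TrainingArguments.
# output_dir and evaluation_strategy are assumptions, not from the card.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="camembert-ner-finetuned-jul",  # assumed directory name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    evaluation_strategy="epoch",  # assumption: metrics are reported per epoch
)
```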

### Training results

Per-entity cells show precision / recall / F1; support is constant across epochs (LOC 316, MISC 56, ORG 303, PER 322).

| Training Loss | Epoch | Step | Validation Loss | Loc (P/R/F1) | Misc (P/R/F1) | Org (P/R/F1) | Per (P/R/F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------------:|:------------------------:|:------------------------:|:------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| No log        | 1.0   | 476  | 0.0747          | 0.6298 / 0.7215 / 0.6726 | 0.6667 / 0.2857 / 0.4000 | 0.6744 / 0.7723 / 0.7200 | 0.7070 / 0.7795 / 0.7415 | 0.6700            | 0.7312         | 0.6993     | 0.9795           |
| 0.1172        | 2.0   | 952  | 0.0640          | 0.7471 / 0.8228 / 0.7831 | 0.6364 / 0.3750 / 0.4719 | 0.7660 / 0.7888 / 0.7772 | 0.8139 / 0.8012 / 0.8075 | 0.7703            | 0.7803         | 0.7753     | 0.9839           |
| 0.0503        | 3.0   | 1428 | 0.0718          | 0.7078 / 0.8354 / 0.7663 | 0.7500 / 0.4286 / 0.5455 | 0.7640 / 0.8119 / 0.7872 | 0.8024 / 0.8199 / 0.8111 | 0.7557            | 0.8004         | 0.7774     | 0.9831           |
| 0.0273        | 4.0   | 1904 | 0.0859          | 0.7601 / 0.7722 / 0.7661 | 0.5897 / 0.4107 / 0.4842 | 0.7500 / 0.8515 / 0.7975 | 0.8179 / 0.7950 / 0.8063 | 0.7679            | 0.7834         | 0.7756     | 0.9823           |
| 0.0179        | 5.0   | 2380 | 0.0924          | 0.7653 / 0.7532 / 0.7592 | 0.5750 / 0.4107 / 0.4792 | 0.8117 / 0.8251 / 0.8183 | 0.8265 / 0.8137 / 0.8200 | 0.7920            | 0.7753         | 0.7836     | 0.9834           |
| 0.0126        | 6.0   | 2856 | 0.0881          | 0.7198 / 0.7722 / 0.7450 | 0.5333 / 0.4286 / 0.4752 | 0.8245 / 0.8218 / 0.8231 | 0.8097 / 0.8323 / 0.8208 | 0.7719            | 0.7874         | 0.7795     | 0.9830           |
| 0.0094        | 7.0   | 3332 | 0.0988          | 0.7571 / 0.7595 / 0.7583 | 0.5750 / 0.4107 / 0.4792 | 0.8306 / 0.8251 / 0.8278 | 0.8213 / 0.8137 / 0.8175 | 0.7932            | 0.7773         | 0.7852     | 0.9833           |
| 0.0057        | 8.0   | 3808 | 0.1023          | 0.7704 / 0.7753 / 0.7729 | 0.6154 / 0.4286 / 0.5053 | 0.8284 / 0.8284 / 0.8284 | 0.8436 / 0.8043 / 0.8235 | 0.8056            | 0.7813         | 0.7933     | 0.9838           |
| 0.0058        | 9.0   | 4284 | 0.1005          | 0.7584 / 0.7848 / 0.7714 | 0.5556 / 0.4464 / 0.4950 | 0.8239 / 0.8185 / 0.8212 | 0.8539 / 0.8168 / 0.8349 | 0.7992            | 0.7864         | 0.7927     | 0.9836           |
| 0.0045        | 10.0  | 4760 | 0.1016          | 0.7561 / 0.7848 / 0.7702 | 0.5682 / 0.4464 / 0.5000 | 0.8185 / 0.8185 / 0.8185 | 0.8457 / 0.8168 / 0.8310 | 0.7951            | 0.7864         | 0.7907     | 0.9834           |

### Framework versions

- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3