---
license: mit
tags:
- generated_from_trainer
model-index:
- name: camembert-ner-finetuned-jul
  results: []
---

# camembert-ner-finetuned-jul

This model is a fine-tuned version of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on an unspecified dataset (the Trainer did not record a dataset name).
It achieves the following results on the evaluation set:
- Loss: 0.1879
- Overall Precision: 0.6566
- Overall Recall: 0.6885
- Overall F1: 0.6722
- Overall Accuracy: 0.9400

Per-entity results (rounded to four decimals; the truncated type names Er, Isc, and Oc most likely correspond to PER, MISC, and LOC: when labels carry no B-/I- prefix, seqeval treats the first character as the chunk prefix and reports the remainder as the entity type):

| Entity | Precision | Recall | F1     | Support |
|:------:|:---------:|:------:|:------:|:-------:|
| Er     | 0.6521    | 0.7567 | 0.7005 | 966     |
| Isc    | 0.6760    | 0.6976 | 0.6866 | 1597    |
| Oc     | 0.6144    | 0.5714 | 0.5921 | 686     |

## Model description

More information needed

## Intended uses & limitations

More information needed
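
Pending details from the author, here is a minimal usage sketch with the `transformers` pipeline; the repository namespace (`your-username`) is a placeholder, and loading from a local checkpoint directory works the same way:

```python
from transformers import pipeline

# "your-username" is a placeholder: point this at the actual Hub repo
# or at a local directory containing the fine-tuned checkpoint.
ner = pipeline(
    "token-classification",
    model="your-username/camembert-ner-finetuned-jul",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)

# French example input, since the base model is a French NER model.
print(ner("Emmanuel Macron a visité Marseille en juillet."))
```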

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
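
These values map directly onto the Hugging Face `Trainer` API. Below is a minimal sketch of an equivalent configuration, not the author's actual script; `output_dir` and the per-epoch evaluation setting are assumptions, and the Adam betas/epsilon listed above are the `Trainer` defaults, so they need no explicit argument:

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; output_dir and
# evaluation_strategy are assumptions (not recorded in the card).
training_args = TrainingArguments(
    output_dir="camembert-ner-finetuned-jul",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    evaluation_strategy="epoch",  # assumed: metrics are reported once per epoch
    # Optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default.
)
```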

### Training results

| Training Loss | Epoch | Step | Validation Loss | Er P / R / F1 | Isc P / R / F1 | Oc P / R / F1 | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------------:|:------------------------:|:------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.2687        | 1.0   | 654  | 0.2022          | 0.6098 / 0.6294 / 0.6195 | 0.6235 / 0.6656 / 0.6439 | 0.5618 / 0.4971 / 0.5275 | 0.6080            | 0.6193         | 0.6136     | 0.9325           |
| 0.1623        | 2.0   | 1308 | 0.1819          | 0.6176 / 0.7940 / 0.6947 | 0.6880 / 0.6544 / 0.6707 | 0.6374 / 0.5510 / 0.5911 | 0.6530            | 0.6741         | 0.6633     | 0.9390           |
| 0.128         | 3.0   | 1962 | 0.1879          | 0.6521 / 0.7567 / 0.7005 | 0.6760 / 0.6976 / 0.6866 | 0.6144 / 0.5714 / 0.5921 | 0.6566            | 0.6885         | 0.6722     | 0.9400           |

Values are rounded to four decimals. Per-entity support is constant across epochs: Er 966, Isc 1597, Oc 686.

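For context, per-type metrics of this shape are typically produced by a seqeval-based `compute_metrics` hook passed to the `Trainer`. A sketch under that assumption follows; the label list is hypothetical (the real one lives in the model's `id2label` config):

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
# Hypothetical label list; replace with the labels from the model config.
label_list = ["O", "B-PER", "I-PER", "B-MISC", "I-MISC", "B-LOC", "I-LOC"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Skip special tokens, which the collator labels with -100.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```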

### Framework versions

- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3