---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: bert-base-uncased_ai4privacy_en
  results: []
---

# bert-base-uncased_ai4privacy_en

This model is a fine-tuned version of [google-bert/bert-base-uncased](https://huggingface.co/google-bert/bert-base-uncased) on the English subset of the [ai4privacy/pii-masking-200k](https://huggingface.co/datasets/ai4privacy/pii-masking-200k) dataset.
It achieves the following results on the evaluation set (a sketch of how such entity-level metrics can be computed follows the list):
- Loss: 0.0702
- Overall Precision: 0.9259
- Overall Recall: 0.9464
- Overall F1: 0.9360
- Overall Accuracy: 0.9706
- Accountname F1: 0.9924
- Accountnumber F1: 0.9905
- Age F1: 0.9339
- Amount F1: 0.9278
- Bic F1: 0.9598
- Bitcoinaddress F1: 0.9801
- Buildingnumber F1: 0.9091
- City F1: 0.9564
- Companyname F1: 0.9908
- County F1: 0.9853
- Creditcardcvv F1: 0.9639
- Creditcardissuer F1: 0.9868
- Creditcardnumber F1: 0.8929
- Currency F1: 0.7726
- Currencycode F1: 0.8608
- Currencyname F1: 0.3650
- Currencysymbol F1: 0.9536
- Date F1: 0.8590
- Dob F1: 0.6490
- Email F1: 0.9945
- Ethereumaddress F1: 0.9986
- Eyecolor F1: 0.9688
- Firstname F1: 0.9790
- Gender F1: 0.9832
- Height F1: 0.9906
- Iban F1: 1.0
- Ip F1: 0.1025
- Ipv4 F1: 0.8217
- Ipv6 F1: 0.7506
- Jobarea F1: 0.9306
- Jobtitle F1: 0.9938
- Jobtype F1: 0.9508
- Lastname F1: 0.9480
- Litecoinaddress F1: 0.9345
- Mac F1: 1.0
- Maskednumber F1: 0.8609
- Middlename F1: 0.9601
- Nearbygpscoordinate F1: 1.0
- Ordinaldirection F1: 0.9784
- Password F1: 0.9839
- Phoneimei F1: 0.9986
- Phonenumber F1: 0.9903
- Pin F1: 0.9390
- Prefix F1: 0.9441
- Secondaryaddress F1: 0.9945
- Sex F1: 0.9780
- Ssn F1: 0.9898
- State F1: 0.9805
- Street F1: 0.9693
- Time F1: 0.9843
- Url F1: 0.9984
- Useragent F1: 0.9918
- Username F1: 0.9909
- Vehiclevin F1: 0.9856
- Vehiclevrm F1: 0.9653
- Zipcode F1: 0.8990
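
The scores above are entity-level precision/recall/F1 of the kind produced by `seqeval`; that this library was used is an assumption, as the card does not name it. A minimal sketch with hypothetical BIO tag sequences:

```python
import evaluate  # pip install evaluate seqeval

seqeval = evaluate.load("seqeval")

# Hypothetical predictions/references in BIO format; the real label set
# corresponds to the entity classes listed above.
predictions = [["O", "B-FIRSTNAME", "I-FIRSTNAME", "O", "B-CITY"]]
references = [["O", "B-FIRSTNAME", "I-FIRSTNAME", "O", "B-CITY"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_f1"])       # overall entity-level F1
print(results["FIRSTNAME"]["f1"])  # per-entity F1, as reported above
```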

## Model description

A `bert-base-uncased` encoder with a token-classification head that tags personally identifiable information (PII) in English text. Each token is assigned one of the entity classes listed above (names, addresses, account and card numbers, cryptocurrency addresses, and so on) or a non-entity class.

## Intended uses & limitations

The model is intended for detecting (and subsequently masking or redacting) PII in English text. Performance varies widely by entity type: most classes reach F1 above 0.9, but Currencyname (F1 0.3650), Ip (F1 0.1025), and Dob (F1 0.6490) are much weaker and should not be relied on without review. The training data is English-only, so the model is not expected to work on other languages.
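
A minimal inference sketch using the `transformers` pipeline; the model id below is this card's name and may not match the actual Hub id:

```python
from transformers import pipeline

# Assumption: the checkpoint is published under this id; adjust as needed.
pii_detector = pipeline(
    "token-classification",
    model="bert-base-uncased_ai4privacy_en",
    aggregation_strategy="simple",  # merge sub-word tokens into entity spans
)

text = "Hi, I'm Jane Doe and my email is jane.doe@example.com."
for entity in pii_detector(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```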

## Training and evaluation data

Training and evaluation used the English subset of [ai4privacy/pii-masking-200k](https://huggingface.co/datasets/ai4privacy/pii-masking-200k), which provides token-level annotations for the entity classes listed above. As with any BERT token-classification setup, word-level labels must be re-aligned to WordPiece sub-tokens; a sketch of that step follows.
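
A minimal sketch of that alignment, assuming BIO-style word-level labels and the common convention of masking special tokens and non-first sub-words with `-100` so the loss ignores them:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-uncased")

# Hypothetical pre-tokenized example with word-level BIO labels.
words = ["Jane", "Doe", "lives", "in", "Lyon"]
labels = ["B-FIRSTNAME", "B-LASTNAME", "O", "O", "B-CITY"]

encoding = tokenizer(words, is_split_into_words=True)
aligned, previous = [], None
for word_id in encoding.word_ids():
    if word_id is None:
        aligned.append(-100)             # [CLS]/[SEP]: ignored by the loss
    elif word_id != previous:
        aligned.append(labels[word_id])  # first sub-word keeps the word label
    else:
        aligned.append(-100)             # later sub-words are masked out
    previous = word_id

print(list(zip(encoding.tokens(), aligned)))
```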

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 2
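
A sketch of equivalent `TrainingArguments`; `output_dir` and `evaluation_strategy` are assumptions, the rest mirrors the list above (the Adam betas and epsilon shown are the Trainer's optimizer defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-uncased_ai4privacy_en",  # assumption
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.2,
    num_train_epochs=2,
    evaluation_strategy="epoch",  # assumption: the table reports per-epoch eval
)
```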

### Training results

| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Accountname F1 | Accountnumber F1 | Age F1 | Amount F1 | Bic F1 | Bitcoinaddress F1 | Buildingnumber F1 | City F1 | Companyname F1 | County F1 | Creditcardcvv F1 | Creditcardissuer F1 | Creditcardnumber F1 | Currency F1 | Currencycode F1 | Currencyname F1 | Currencysymbol F1 | Date F1 | Dob F1 | Email F1 | Ethereumaddress F1 | Eyecolor F1 | Firstname F1 | Gender F1 | Height F1 | Iban F1 | Ip F1  | Ipv4 F1 | Ipv6 F1 | Jobarea F1 | Jobtitle F1 | Jobtype F1 | Lastname F1 | Litecoinaddress F1 | Mac F1 | Maskednumber F1 | Middlename F1 | Nearbygpscoordinate F1 | Ordinaldirection F1 | Password F1 | Phoneimei F1 | Phonenumber F1 | Pin F1 | Prefix F1 | Secondaryaddress F1 | Sex F1 | Ssn F1 | State F1 | Street F1 | Time F1 | Url F1 | Useragent F1 | Username F1 | Vehiclevin F1 | Vehiclevrm F1 | Zipcode F1 |
|:-------------:|:-----:|:----:|:---------------:|:-----------------:|:--------------:|:----------:|:----------------:|:--------------:|:----------------:|:------:|:---------:|:------:|:-----------------:|:-----------------:|:-------:|:--------------:|:---------:|:----------------:|:-------------------:|:-------------------:|:-----------:|:---------------:|:---------------:|:-----------------:|:-------:|:------:|:--------:|:------------------:|:-----------:|:------------:|:---------:|:---------:|:-------:|:------:|:-------:|:-------:|:----------:|:-----------:|:----------:|:-----------:|:------------------:|:------:|:---------------:|:-------------:|:----------------------:|:-------------------:|:-----------:|:------------:|:--------------:|:------:|:---------:|:-------------------:|:------:|:------:|:--------:|:---------:|:-------:|:------:|:------------:|:-----------:|:-------------:|:-------------:|:----------:|
| 0.0939        | 1.0   | 4350 | 0.0801          | 0.8951            | 0.9309         | 0.9126     | 0.9666           | 0.9840         | 0.9896           | 0.9182 | 0.8769    | 0.9127 | 0.9627            | 0.8770            | 0.9616  | 0.9847         | 0.9712    | 0.9373           | 0.9813              | 0.8406              | 0.3934      | 0.7451          | 0.1372          | 0.9266            | 0.8354  | 0.5796 | 0.9920   | 0.9877             | 0.9037      | 0.9642       | 0.9789    | 0.9906    | 0.9874  | 0.0    | 0.8416  | 0.8087  | 0.8854     | 0.9825      | 0.9426     | 0.9213      | 0.9015             | 0.9806 | 0.7978          | 0.9543        | 1.0                    | 0.9828              | 0.9689      | 0.9917       | 0.9777         | 0.8764 | 0.9340    | 0.9913              | 0.9761 | 0.9949 | 0.9553   | 0.9561    | 0.9723  | 0.9921 | 0.9906       | 0.9779      | 0.9942        | 0.9684        | 0.8522     |
| 0.0644        | 2.0   | 8700 | 0.0702          | 0.9259            | 0.9464         | 0.9360     | 0.9706           | 0.9924         | 0.9905           | 0.9339 | 0.9278    | 0.9598 | 0.9801            | 0.9091            | 0.9564  | 0.9908         | 0.9853    | 0.9639           | 0.9868              | 0.8929              | 0.7726      | 0.8608          | 0.3650          | 0.9536            | 0.8590  | 0.6490 | 0.9945   | 0.9986             | 0.9688      | 0.9790       | 0.9832    | 0.9906    | 1.0     | 0.1025 | 0.8217  | 0.7506  | 0.9306     | 0.9938      | 0.9508     | 0.9480      | 0.9345             | 1.0    | 0.8609          | 0.9601        | 1.0                    | 0.9784              | 0.9839      | 0.9986       | 0.9903         | 0.9390 | 0.9441    | 0.9945              | 0.9780 | 0.9898 | 0.9805   | 0.9693    | 0.9843  | 0.9984 | 0.9918       | 0.9909      | 0.9856        | 0.9653        | 0.8990     |


### Framework versions

- Transformers 4.26.1
- Pytorch 2.0.0.post101
- Datasets 2.10.1
- Tokenizers 0.13.3