---
license: mit
base_model: microsoft/deberta-v3-base
tags:
- generated_from_trainer
model-index:
- name: deberta-v3-base_finetuned_ai4privacy_v2
  results: []
---

# deberta-v3-base_finetuned_ai4privacy_v2

This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0211
- Overall Precision: 0.9722
- Overall Recall: 0.9792
- Overall F1: 0.9757
- Overall Accuracy: 0.9915
- Accountname F1: 0.9993
- Accountnumber F1: 0.9986
- Age F1: 0.9884
- Amount F1: 0.9984
- Bic F1: 0.9942
- Bitcoinaddress F1: 0.9974
- Buildingnumber F1: 0.9898
- City F1: 1.0
- Companyname F1: 1.0
- County F1: 0.9976
- Creditcardcvv F1: 0.9541
- Creditcardissuer F1: 0.9970
- Creditcardnumber F1: 0.9754
- Currency F1: 0.8966
- Currencycode F1: 0.9946
- Currencyname F1: 0.7697
- Currencysymbol F1: 0.9958
- Date F1: 0.9778
- Dob F1: 0.9546
- Email F1: 1.0
- Ethereumaddress F1: 1.0
- Eyecolor F1: 0.9925
- Firstname F1: 0.9947
- Gender F1: 1.0
- Height F1: 1.0
- Iban F1: 0.9978
- Ip F1: 0.5404
- Ipv4 F1: 0.8455
- Ipv6 F1: 0.8855
- Jobarea F1: 0.9091
- Jobtitle F1: 1.0
- Jobtype F1: 0.9672
- Lastname F1: 0.9855
- Litecoinaddress F1: 0.9949
- Mac F1: 0.9965
- Maskednumber F1: 0.9836
- Middlename F1: 0.7385
- Nearbygpscoordinate F1: 1.0
- Ordinaldirection F1: 1.0
- Password F1: 1.0
- Phoneimei F1: 0.9978
- Phonenumber F1: 0.9975
- Pin F1: 0.9820
- Prefix F1: 0.9872
- Secondaryaddress F1: 1.0
- Sex F1: 0.9916
- Ssn F1: 0.9960
- State F1: 0.9967
- Street F1: 0.9991
- Time F1: 1.0
- Url F1: 1.0
- Useragent F1: 0.9981
- Username F1: 1.0
- Vehiclevin F1: 0.9950
- Vehiclevrm F1: 0.9870
- Zipcode F1: 0.9966

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy | Accountname F1 | Accountnumber F1 | Age F1 | Amount F1 | Bic F1 | Bitcoinaddress F1 | Buildingnumber F1 | City F1 | Companyname F1 | County F1 | Creditcardcvv F1 | Creditcardissuer F1 | Creditcardnumber F1 | Currency F1 | Currencycode F1 | Currencyname F1 | Currencysymbol F1 | Date F1 | Dob F1 | Email F1 | Ethereumaddress F1 | Eyecolor F1 | Firstname F1 | Gender F1 | Height F1 | Iban F1 | Ip F1 | Ipv4 F1 | Ipv6 F1 | Jobarea F1 | Jobtitle F1 | Jobtype F1 | Lastname F1 | Litecoinaddress F1 | Mac F1 | Maskednumber F1 | Middlename F1 | Nearbygpscoordinate F1 | Ordinaldirection F1 | Password F1 | Phoneimei F1 | Phonenumber F1 | Pin F1 | Prefix F1 | Secondaryaddress F1 | Sex F1 | Ssn F1 | State F1 | Street F1 | Time F1 | Url F1 | Useragent F1 | Username F1 | Vehiclevin F1 | Vehiclevrm F1 | Zipcode F1 |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.5081 | 1.0 | 669 | 0.1618 | 0.7172 | 0.7690 | 0.7422 | 0.9464 | 0.9134 | 0.7806 | 0.7878 | 0.3447 | 0.6647 | 0.8155 | 0.7413 | 0.6679 | 0.8354 | 0.7719 | 0.0 | 0.0 | 0.3622 | 0.4985 | 0.0 | 0.0 | 0.0166 | 0.8014 | 0.3108 | 0.9665 | 0.9878 | 0.0 | 0.7814 | 0.7444 | 0.9244 | 0.9059 | 0.0 | 0.8207 | 0.7064 | 0.0 | 0.8649 | 0.0 | 0.2521 | 0.0 | 0.9755 | 0.5147 | 0.0 | 0.9918 | 0.0 | 0.9443 | 0.9839 | 0.9700 | 0.1919 | 0.6695 | 0.8664 | 0.0 | 0.8882 | 0.4056 | 0.6466 | 0.9538 | 0.9870 | 0.9519 | 0.7703 | 0.9380 | 0.7398 | 0.6931 |
| 0.1884 | 2.0 | 1338 | 0.1022 | 0.8709 | 0.8877 | 0.8792 | 0.9648 | 0.9565 | 0.9263 | 0.8838 | 0.9202 | 0.8272 | 0.8935 | 0.8713 | 0.9326 | 0.9596 | 0.9403 | 0.0 | 0.9699 | 0.8038 | 0.6842 | 0.4724 | 0.3972 | 0.7884 | 0.8431 | 0.5134 | 0.9913 | 0.9830 | 0.8794 | 0.9270 | 0.9272 | 0.9718 | 0.9341 | 0.0 | 0.8180 | 0.8136 | 0.0 | 0.9206 | 0.0882 | 0.5360 | 0.7641 | 0.9789 | 0.6638 | 0.0 | 0.9930 | 0.0 | 0.9487 | 0.9646 | 0.9624 | 0.8082 | 0.9381 | 0.9679 | 0.9333 | 0.9392 | 0.8885 | 0.9371 | 0.9681 | 0.9883 | 0.9481 | 0.9683 | 0.7937 | 0.7858 | 0.8649 |
| 0.0915 | 3.0 | 2007 | 0.0748 | 0.8934 | 0.9163 | 0.9047 | 0.9702 | 0.9783 | 0.9562 | 0.8920 | 0.9582 | 0.9474 | 0.9315 | 0.9056 | 0.9587 | 0.9785 | 0.9757 | 0.6190 | 0.9699 | 0.8209 | 0.7763 | 0.8342 | 0.0359 | 0.9223 | 0.7257 | 0.6549 | 0.9954 | 0.9818 | 0.8056 | 0.9518 | 0.9519 | 0.9887 | 0.9806 | 0.0039 | 0.8328 | 0.7923 | 0.0 | 0.9657 | 0.7634 | 0.6688 | 0.8525 | 0.9790 | 0.7631 | 0.0 | 0.9883 | 0.0 | 0.9646 | 0.9957 | 0.9829 | 0.9054 | 0.9413 | 0.9776 | 0.9421 | 0.9663 | 0.9659 | 0.9498 | 0.9682 | 0.9925 | 0.9944 | 0.9913 | 0.9497 | 0.9762 | 0.9148 |
| 0.0782 | 4.0 | 2676 | 0.0632 | 0.9004 | 0.9286 | 0.9143 | 0.9622 | 0.9958 | 0.9832 | 0.9433 | 0.9646 | 0.9855 | 0.9655 | 0.9527 | 0.9808 | 0.9904 | 0.9707 | 0.5823 | 0.9670 | 0.8964 | 0.7717 | 0.8830 | 0.3387 | 0.9780 | 0.8780 | 0.6913 | 0.9776 | 0.9951 | 0.8857 | 0.9675 | 0.9865 | 0.9859 | 0.9742 | 0.2961 | 0.8395 | 0.2833 | 0.6471 | 0.9952 | 0.9355 | 0.8277 | 0.8808 | 0.9756 | 0.8859 | 0.0 | 0.9883 | 0.0 | 0.9838 | 0.9946 | 0.9853 | 0.9066 | 0.9351 | 0.9942 | 0.9134 | 0.9911 | 0.9493 | 0.9869 | 0.9768 | 0.9979 | 0.9944 | 0.9953 | 0.9572 | 0.9848 | 0.9489 |
| 0.067 | 5.0 | 3345 | 0.0517 | 0.9389 | 0.9496 | 0.9442 | 0.9767 | 0.9936 | 0.9867 | 0.9579 | 0.9584 | 0.9884 | 0.9746 | 0.9542 | 0.9958 | 0.9887 | 0.9903 | 0.8952 | 0.9970 | 0.9397 | 0.8107 | 0.9140 | 0.6029 | 0.9848 | 0.9160 | 0.8132 | 0.9969 | 0.9951 | 0.9412 | 0.9731 | 0.9942 | 0.9915 | 0.9870 | 0.2084 | 0.7981 | 0.8064 | 0.7568 | 0.9827 | 0.9134 | 0.8692 | 0.8289 | 0.9877 | 0.9392 | 0.0513 | 0.9883 | 0.6667 | 0.9722 | 0.9904 | 0.9943 | 0.9591 | 0.9676 | 0.9903 | 0.9748 | 0.9921 | 0.9790 | 0.9904 | 0.9787 | 0.9952 | 0.9926 | 0.9893 | 0.9749 | 0.9891 | 0.9653 |
| 0.0568 | 6.0 | 4014 | 0.0472 | 0.9539 | 0.9594 | 0.9567 | 0.9797 | 0.9986 | 0.9986 | 0.9508 | 0.9703 | 0.9884 | 0.9722 | 0.9755 | 0.9975 | 0.9965 | 0.9904 | 0.84 | 0.9670 | 0.95 | 0.8498 | 0.9565 | 0.6278 | 0.9944 | 0.9165 | 0.8625 | 0.9985 | 0.9927 | 0.9630 | 0.9894 | 0.9962 | 0.9944 | 0.9913 | 0.1921 | 0.8345 | 0.8222 | 0.8293 | 0.9979 | 0.9516 | 0.9465 | 0.9340 | 0.9772 | 0.9563 | 0.6667 | 0.9977 | 0.6667 | 0.9985 | 0.9978 | 0.9975 | 0.9704 | 0.9686 | 0.9971 | 0.9748 | 0.9931 | 0.9886 | 0.9948 | 0.9892 | 1.0 | 0.9963 | 0.9980 | 0.9749 | 0.9935 | 0.9830 |
| 0.0482 | 7.0 | 4683 | 0.0426 | 0.9632 | 0.9678 | 0.9655 | 0.9826 | 0.9993 | 0.9972 | 0.9707 | 0.9926 | 0.9942 | 0.9602 | 0.9847 | 0.9992 | 0.9983 | 0.9952 | 0.9444 | 0.9970 | 0.9627 | 0.8634 | 0.9730 | 0.72 | 0.9958 | 0.9586 | 0.9102 | 1.0 | 0.9951 | 0.9925 | 0.9917 | 0.9943 | 1.0 | 0.9935 | 0.2256 | 0.8430 | 0.8488 | 0.8571 | 1.0 | 0.9516 | 0.9499 | 0.9198 | 0.9930 | 0.9704 | 0.6866 | 0.9977 | 0.9091 | 1.0 | 0.9978 | 0.9975 | 0.9880 | 0.9797 | 0.9971 | 0.9915 | 0.9940 | 0.9951 | 1.0 | 0.9873 | 1.0 | 0.9963 | 0.9993 | 0.9950 | 0.9870 | 0.9863 |
| 0.042 | 8.0 | 5352 | 0.0288 | 0.9697 | 0.9749 | 0.9723 | 0.9874 | 0.9993 | 0.9986 | 0.9808 | 0.9959 | 0.9942 | 0.9932 | 0.9911 | 1.0 | 1.0 | 0.9976 | 0.9444 | 0.9970 | 0.9716 | 0.8907 | 0.9730 | 0.7561 | 0.9944 | 0.9725 | 0.9429 | 1.0 | 0.9976 | 0.9925 | 0.9927 | 0.9971 | 1.0 | 0.9957 | 0.4056 | 0.8498 | 0.8565 | 0.9091 | 1.0 | 0.9672 | 0.9757 | 0.9848 | 0.9965 | 0.9808 | 0.7164 | 1.0 | 1.0 | 1.0 | 0.9978 | 0.9975 | 0.9880 | 0.9872 | 0.9971 | 0.9831 | 0.9960 | 0.9967 | 0.9991 | 0.9941 | 1.0 | 0.9981 | 1.0 | 0.9950 | 0.9892 | 0.9966 |
| 0.0327 | 9.0 | 6021 | 0.0237 | 0.9688 | 0.9778 | 0.9733 | 0.9899 | 0.9993 | 0.9986 | 0.9872 | 0.9984 | 0.9942 | 0.9957 | 0.9898 | 1.0 | 1.0 | 1.0 | 0.9444 | 0.9970 | 0.9746 | 0.8958 | 0.9946 | 0.7761 | 0.9958 | 0.9787 | 0.9557 | 1.0 | 1.0 | 0.9925 | 0.9947 | 1.0 | 1.0 | 0.9957 | 0.5173 | 0.8455 | 0.8317 | 0.9091 | 1.0 | 0.9672 | 0.9871 | 0.9949 | 0.9965 | 0.9818 | 0.7576 | 1.0 | 1.0 | 1.0 | 0.9978 | 0.9975 | 0.9820 | 0.9879 | 0.9990 | 0.9916 | 0.9960 | 0.9967 | 0.9991 | 0.9971 | 1.0 | 0.9981 | 1.0 | 0.9950 | 0.9913 | 0.9966 |
| 0.0267 | 10.0 | 6690 | 0.0211 | 0.9722 | 0.9792 | 0.9757 | 0.9915 | 0.9993 | 0.9986 | 0.9884 | 0.9984 | 0.9942 | 0.9974 | 0.9898 | 1.0 | 1.0 | 0.9976 | 0.9541 | 0.9970 | 0.9754 | 0.8966 | 0.9946 | 0.7697 | 0.9958 | 0.9778 | 0.9546 | 1.0 | 1.0 | 0.9925 | 0.9947 | 1.0 | 1.0 | 0.9978 | 0.5404 | 0.8455 | 0.8855 | 0.9091 | 1.0 | 0.9672 | 0.9855 | 0.9949 | 0.9965 | 0.9836 | 0.7385 | 1.0 | 1.0 | 1.0 | 0.9978 | 0.9975 | 0.9820 | 0.9872 | 1.0 | 0.9916 | 0.9960 | 0.9967 | 0.9991 | 1.0 | 1.0 | 0.9981 | 1.0 | 0.9950 | 0.9870 | 0.9966 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0
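The per-entity precision, recall, and F1 figures above are entity-level (seqeval-style) metrics: a predicted entity counts as a true positive only when both its type and its exact token span match the gold annotation. A minimal sketch of that scoring, using illustrative BIO label names rather than the model's actual label set:

```python
# Sketch of entity-level precision/recall/F1 (seqeval-style scoring, as
# typically reported for token classification). The BIO label names below
# are illustrative only.

def extract_entities(tags):
    """Collect (type, start, end) spans from a BIO tag sequence."""
    entities, etype, start = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the final span
        # Close the open entity when the run of matching I- tags ends.
        if etype is not None and (not tag.startswith("I-") or tag[2:] != etype):
            entities.append((etype, start, i))
            etype, start = None, None
        if tag.startswith("B-"):
            etype, start = tag[2:], i
    return entities

def entity_scores(true_tags, pred_tags):
    """Micro precision/recall/F1 over exact (type, span) matches."""
    gold = set(extract_entities(true_tags))
    pred = set(extract_entities(pred_tags))
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

p, r, f = entity_scores(
    ["B-FIRSTNAME", "I-FIRSTNAME", "O", "B-EMAIL"],
    ["B-FIRSTNAME", "I-FIRSTNAME", "O", "O"],
)
# One of two gold entities is recovered exactly: P=1.0, R=0.5, F1≈0.667
```

Exact-span matching explains why fragmented or boundary-shifted predictions (e.g. the low Ip and Middlename F1 above) are penalized even when most of their tokens are tagged correctly.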