# training
This model is a fine-tuned version of microsoft/deberta-base on the cynthiachan/FeedRef_10pct dataset. It achieves the following results on the evaluation set:
- Loss: 0.0810
- Attackid Precision: 1.0
- Attackid Recall: 1.0
- Attackid F1: 1.0
- Attackid Number: 6
- Cve Precision: 1.0
- Cve Recall: 1.0
- Cve F1: 1.0
- Cve Number: 11
- Defenderthreat Precision: 0.0
- Defenderthreat Recall: 0.0
- Defenderthreat F1: 0.0
- Defenderthreat Number: 2
- Domain Precision: 1.0
- Domain Recall: 0.9565
- Domain F1: 0.9778
- Domain Number: 23
- Email Precision: 1.0
- Email Recall: 1.0
- Email F1: 1.0
- Email Number: 3
- Filepath Precision: 0.8841
- Filepath Recall: 0.8788
- Filepath F1: 0.8815
- Filepath Number: 165
- Hostname Precision: 1.0
- Hostname Recall: 1.0
- Hostname F1: 1.0
- Hostname Number: 12
- Ipv4 Precision: 1.0
- Ipv4 Recall: 1.0
- Ipv4 F1: 1.0
- Ipv4 Number: 12
- Md5 Precision: 0.8333
- Md5 Recall: 0.9615
- Md5 F1: 0.8929
- Md5 Number: 52
- Sha1 Precision: 0.6667
- Sha1 Recall: 0.8571
- Sha1 F1: 0.75
- Sha1 Number: 7
- Sha256 Precision: 0.9565
- Sha256 Recall: 1.0
- Sha256 F1: 0.9778
- Sha256 Number: 44
- Uri Precision: 0.0
- Uri Recall: 0.0
- Uri F1: 0.0
- Uri Number: 1
- Overall Precision: 0.9014
- Overall Recall: 0.9201
- Overall F1: 0.9107
- Overall Accuracy: 0.9851
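The per-entity metrics above indicate a token-classification (NER) model for threat-report indicators such as file paths, hashes, domains, and IP addresses. Below is a minimal inference sketch using the standard `transformers` pipeline; the model identifier is a placeholder, since the exact Hub repository id is not stated in this card.

```python
from transformers import pipeline

# Placeholder id: replace with this checkpoint's Hub repo id or a local path.
ner = pipeline(
    "token-classification",
    model="your-username/deberta-base-feedref-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

text = r"The dropper wrote C:\Windows\Temp\payload.exe and beaconed to 192.0.2.10."
for ent in ner(text):
    print(ent["entity_group"], ent["word"], round(ent["score"], 3))
```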
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
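As a rough illustration, these settings correspond to the following Hugging Face `TrainingArguments`; this is a sketch, not the exact training script, and the output directory name is hypothetical. Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer, so it needs no explicit argument here.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta-base-feedref-ner",  # hypothetical output directory name
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
)
```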
### Training results
Training Loss | Epoch | Step | Validation Loss | Attackid Precision | Attackid Recall | Attackid F1 | Attackid Number | Cve Precision | Cve Recall | Cve F1 | Cve Number | Defenderthreat Precision | Defenderthreat Recall | Defenderthreat F1 | Defenderthreat Number | Domain Precision | Domain Recall | Domain F1 | Domain Number | Email Precision | Email Recall | Email F1 | Email Number | Filepath Precision | Filepath Recall | Filepath F1 | Filepath Number | Hostname Precision | Hostname Recall | Hostname F1 | Hostname Number | Ipv4 Precision | Ipv4 Recall | Ipv4 F1 | Ipv4 Number | Md5 Precision | Md5 Recall | Md5 F1 | Md5 Number | Sha1 Precision | Sha1 Recall | Sha1 F1 | Sha1 Number | Sha256 Precision | Sha256 Recall | Sha256 F1 | Sha256 Number | Uri Precision | Uri Recall | Uri F1 | Uri Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.3797 | 0.37 | 500 | 0.1998 | 0.0 | 0.0 | 0.0 | 6 | 0.0 | 0.0 | 0.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.0286 | 0.0435 | 0.0345 | 23 | 0.0 | 0.0 | 0.0 | 3 | 0.5108 | 0.7152 | 0.5960 | 165 | 0.1774 | 0.9167 | 0.2973 | 12 | 0.4 | 0.5 | 0.4444 | 12 | 0.3194 | 0.4423 | 0.3710 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.4588 | 0.8864 | 0.6047 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.3875 | 0.5858 | 0.4664 | 0.9593 |
0.1713 | 0.75 | 1000 | 0.1619 | 0.6 | 0.5 | 0.5455 | 6 | 0.5 | 0.6364 | 0.56 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.6957 | 0.6957 | 0.6957 | 23 | 0.0 | 0.0 | 0.0 | 3 | 0.6879 | 0.6545 | 0.6708 | 165 | 0.5217 | 1.0 | 0.6857 | 12 | 0.5714 | 1.0 | 0.7273 | 12 | 0.6667 | 0.8846 | 0.7603 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.7692 | 0.9091 | 0.8333 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.6685 | 0.7219 | 0.6942 | 0.9664 |
0.1152 | 1.12 | 1500 | 0.1096 | 0.8333 | 0.8333 | 0.8333 | 6 | 1.0 | 1.0 | 1.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.7826 | 0.7826 | 0.7826 | 23 | 1.0 | 1.0 | 1.0 | 3 | 0.7202 | 0.8424 | 0.7765 | 165 | 1.0 | 1.0 | 1.0 | 12 | 0.4444 | 1.0 | 0.6154 | 12 | 0.6944 | 0.9615 | 0.8065 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.8723 | 0.9318 | 0.9011 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.7312 | 0.8609 | 0.7908 | 0.9751 |
0.1089 | 1.5 | 2000 | 0.1243 | 1.0 | 1.0 | 1.0 | 6 | 0.9167 | 1.0 | 0.9565 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.9048 | 0.8261 | 0.8636 | 23 | 1.0 | 1.0 | 1.0 | 3 | 0.8011 | 0.8788 | 0.8382 | 165 | 0.6667 | 1.0 | 0.8 | 12 | 0.9091 | 0.8333 | 0.8696 | 12 | 0.7812 | 0.9615 | 0.8621 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.7857 | 1.0 | 0.88 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.8065 | 0.8876 | 0.8451 | 0.9750 |
0.0947 | 1.87 | 2500 | 0.0913 | 0.75 | 1.0 | 0.8571 | 6 | 1.0 | 1.0 | 1.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.8462 | 0.9565 | 0.8980 | 23 | 0.3333 | 0.6667 | 0.4444 | 3 | 0.8035 | 0.8424 | 0.8225 | 165 | 0.6 | 1.0 | 0.7500 | 12 | 1.0 | 1.0 | 1.0 | 12 | 0.7969 | 0.9808 | 0.8793 | 52 | 0.0 | 0.0 | 0.0 | 7 | 0.8302 | 1.0 | 0.9072 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.7952 | 0.8846 | 0.8375 | 0.9792 |
0.0629 | 2.25 | 3000 | 0.0940 | 1.0 | 0.8333 | 0.9091 | 6 | 1.0 | 1.0 | 1.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.9565 | 0.9565 | 0.9565 | 23 | 1.0 | 1.0 | 1.0 | 3 | 0.8671 | 0.8303 | 0.8483 | 165 | 1.0 | 1.0 | 1.0 | 12 | 1.0 | 1.0 | 1.0 | 12 | 0.9273 | 0.9808 | 0.9533 | 52 | 0.25 | 0.1429 | 0.1818 | 7 | 0.8776 | 0.9773 | 0.9247 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.8946 | 0.8787 | 0.8866 | 0.9825 |
0.0442 | 2.62 | 3500 | 0.1012 | 1.0 | 1.0 | 1.0 | 6 | 0.9167 | 1.0 | 0.9565 | 11 | 0.0 | 0.0 | 0.0 | 2 | 0.9091 | 0.8696 | 0.8889 | 23 | 0.75 | 1.0 | 0.8571 | 3 | 0.8182 | 0.8727 | 0.8446 | 165 | 1.0 | 1.0 | 1.0 | 12 | 1.0 | 1.0 | 1.0 | 12 | 0.92 | 0.8846 | 0.9020 | 52 | 0.5 | 1.0 | 0.6667 | 7 | 0.9565 | 1.0 | 0.9778 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.8616 | 0.9024 | 0.8815 | 0.9818 |
0.0401 | 3.0 | 4000 | 0.0810 | 1.0 | 1.0 | 1.0 | 6 | 1.0 | 1.0 | 1.0 | 11 | 0.0 | 0.0 | 0.0 | 2 | 1.0 | 0.9565 | 0.9778 | 23 | 1.0 | 1.0 | 1.0 | 3 | 0.8841 | 0.8788 | 0.8815 | 165 | 1.0 | 1.0 | 1.0 | 12 | 1.0 | 1.0 | 1.0 | 12 | 0.8333 | 0.9615 | 0.8929 | 52 | 0.6667 | 0.8571 | 0.75 | 7 | 0.9565 | 1.0 | 0.9778 | 44 | 0.0 | 0.0 | 0.0 | 1 | 0.9014 | 0.9201 | 0.9107 | 0.9851 |
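The per-entity and overall precision/recall/F1 figures reported above are entity-level metrics in the style of `seqeval`. A small self-contained sketch of how such scores are computed from IOB2 tag sequences (the example sequences below are made up, using entity types from this card):

```python
from seqeval.metrics import classification_report, f1_score, precision_score, recall_score

# Hypothetical gold and predicted tag sequences in IOB2 format.
y_true = [["B-Filepath", "I-Filepath", "O", "B-Ipv4", "O", "B-Md5"]]
y_pred = [["B-Filepath", "I-Filepath", "O", "B-Ipv4", "O", "O"]]

print(precision_score(y_true, y_pred))        # overall precision
print(recall_score(y_true, y_pred))           # overall recall
print(f1_score(y_true, y_pred))               # overall F1
print(classification_report(y_true, y_pred))  # per-entity breakdown, as in the table above
```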
### Framework versions
- Transformers 4.21.2
- Pytorch 1.12.1+cu102
- Datasets 2.4.0
- Tokenizers 0.12.1