Lachin committed
Commit 50fe31b
1 parent: f2adeac

Lachin/skill_classification
README.md CHANGED
@@ -18,10 +18,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.5200
- - F1: 0.4973
- - Roc Auc: 0.6802
- - Accuracy: 0.3210
+ - Loss: 0.4789
+ - F1: 0.5701
+ - Roc Auc: 0.7465
+ - Accuracy: 0.2801
 
 ## Model description
 
@@ -40,62 +40,78 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - learning_rate: 5e-05
+ - learning_rate: 0.0001
 - train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 30
+ - num_epochs: 50
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | F1 | Roc Auc | Accuracy |
 |:-------------:|:-----:|:-----:|:---------------:|:------:|:-------:|:--------:|
- | 0.4307 | 0.68 | 500 | 0.4058 | 0.5119 | 0.6801 | 0.3991 |
- | 0.3923 | 1.35 | 1000 | 0.4189 | 0.5473 | 0.7017 | 0.4203 |
- | 0.3699 | 2.03 | 1500 | 0.4344 | 0.4529 | 0.6505 | 0.3375 |
- | 0.2839 | 2.7 | 2000 | 0.4574 | 0.4345 | 0.6402 | 0.3415 |
- | 0.2339 | 3.38 | 2500 | 0.5325 | 0.5061 | 0.6804 | 0.3604 |
- | 0.1835 | 4.05 | 3000 | 0.5906 | 0.5145 | 0.6889 | 0.3368 |
- | 0.121 | 4.73 | 3500 | 0.6763 | 0.4772 | 0.6658 | 0.3178 |
- | 0.0916 | 5.41 | 4000 | 0.7520 | 0.5172 | 0.6919 | 0.3438 |
- | 0.0718 | 6.08 | 4500 | 0.8309 | 0.5115 | 0.6903 | 0.3170 |
- | 0.0519 | 6.76 | 5000 | 0.8800 | 0.4762 | 0.6652 | 0.3107 |
- | 0.0396 | 7.43 | 5500 | 0.9691 | 0.4753 | 0.6683 | 0.2555 |
- | 0.0319 | 8.11 | 6000 | 1.0172 | 0.4945 | 0.6787 | 0.3147 |
- | 0.0289 | 8.78 | 6500 | 1.0847 | 0.4983 | 0.6815 | 0.3186 |
- | 0.026 | 9.46 | 7000 | 1.1104 | 0.4889 | 0.6733 | 0.3352 |
- | 0.0186 | 10.14 | 7500 | 1.1639 | 0.4787 | 0.6682 | 0.3123 |
- | 0.0207 | 10.81 | 8000 | 1.1984 | 0.4715 | 0.6645 | 0.2981 |
- | 0.0151 | 11.49 | 8500 | 1.1868 | 0.4792 | 0.6681 | 0.3068 |
- | 0.0136 | 12.16 | 9000 | 1.2526 | 0.5127 | 0.6906 | 0.3178 |
- | 0.0131 | 12.84 | 9500 | 1.2530 | 0.4893 | 0.6736 | 0.3162 |
- | 0.0124 | 13.51 | 10000 | 1.2486 | 0.4978 | 0.6796 | 0.3139 |
- | 0.0106 | 14.19 | 10500 | 1.3166 | 0.5045 | 0.6831 | 0.3352 |
- | 0.01 | 14.86 | 11000 | 1.3071 | 0.4876 | 0.6734 | 0.3155 |
- | 0.0091 | 15.54 | 11500 | 1.3456 | 0.4716 | 0.6637 | 0.3060 |
- | 0.0075 | 16.22 | 12000 | 1.3318 | 0.4940 | 0.6762 | 0.3297 |
- | 0.0085 | 16.89 | 12500 | 1.4009 | 0.4912 | 0.6770 | 0.3052 |
- | 0.0079 | 17.57 | 13000 | 1.3524 | 0.4906 | 0.6740 | 0.3328 |
- | 0.0055 | 18.24 | 13500 | 1.4259 | 0.4761 | 0.6668 | 0.3044 |
- | 0.0057 | 18.92 | 14000 | 1.4156 | 0.4832 | 0.6724 | 0.2997 |
- | 0.0034 | 19.59 | 14500 | 1.4554 | 0.4779 | 0.6674 | 0.3139 |
- | 0.0048 | 20.27 | 15000 | 1.4369 | 0.4729 | 0.6641 | 0.3044 |
- | 0.0039 | 20.95 | 15500 | 1.4665 | 0.4914 | 0.6775 | 0.3021 |
- | 0.0033 | 21.62 | 16000 | 1.4586 | 0.4875 | 0.6734 | 0.3131 |
- | 0.0024 | 22.3 | 16500 | 1.4830 | 0.4884 | 0.6742 | 0.3155 |
- | 0.0034 | 22.97 | 17000 | 1.4911 | 0.4932 | 0.6771 | 0.3210 |
- | 0.003 | 23.65 | 17500 | 1.4832 | 0.5045 | 0.6848 | 0.3226 |
- | 0.002 | 24.32 | 18000 | 1.4772 | 0.4679 | 0.6613 | 0.3028 |
- | 0.0028 | 25.0 | 18500 | 1.4742 | 0.5071 | 0.6858 | 0.3368 |
- | 0.0018 | 25.68 | 19000 | 1.4784 | 0.4972 | 0.6789 | 0.3328 |
- | 0.0022 | 26.35 | 19500 | 1.4965 | 0.4997 | 0.6808 | 0.3328 |
- | 0.0021 | 27.03 | 20000 | 1.5143 | 0.4858 | 0.6723 | 0.3194 |
- | 0.0016 | 27.7 | 20500 | 1.5262 | 0.4868 | 0.6731 | 0.3186 |
- | 0.0021 | 28.38 | 21000 | 1.5200 | 0.4973 | 0.6802 | 0.3210 |
- | 0.0015 | 29.05 | 21500 | 1.5159 | 0.4862 | 0.6722 | 0.3186 |
- | 0.0017 | 29.73 | 22000 | 1.5102 | 0.4876 | 0.6727 | 0.3249 |
+ | 0.2983 | 0.82 | 500 | 0.2704 | 0.2291 | 0.5655 | 0.0767 |
+ | 0.2502 | 1.65 | 1000 | 0.2516 | 0.3179 | 0.5999 | 0.1206 |
+ | 0.2354 | 2.47 | 1500 | 0.2390 | 0.3442 | 0.6093 | 0.1651 |
+ | 0.2169 | 3.29 | 2000 | 0.2327 | 0.4040 | 0.6366 | 0.2022 |
+ | 0.1988 | 4.12 | 2500 | 0.2310 | 0.4561 | 0.6669 | 0.2127 |
+ | 0.1809 | 4.94 | 3000 | 0.2332 | 0.4599 | 0.6655 | 0.2226 |
+ | 0.1637 | 5.77 | 3500 | 0.2331 | 0.5096 | 0.7112 | 0.2226 |
+ | 0.1499 | 6.59 | 4000 | 0.2331 | 0.5159 | 0.7101 | 0.2239 |
+ | 0.1384 | 7.41 | 4500 | 0.2404 | 0.5121 | 0.6987 | 0.2319 |
+ | 0.1253 | 8.24 | 5000 | 0.2443 | 0.5177 | 0.7048 | 0.2288 |
+ | 0.1108 | 9.06 | 5500 | 0.2509 | 0.5352 | 0.7272 | 0.2319 |
+ | 0.0974 | 9.88 | 6000 | 0.2669 | 0.5309 | 0.7214 | 0.2375 |
+ | 0.0844 | 10.71 | 6500 | 0.2650 | 0.5420 | 0.7334 | 0.2393 |
+ | 0.076 | 11.53 | 7000 | 0.2793 | 0.5263 | 0.7158 | 0.2344 |
+ | 0.0672 | 12.36 | 7500 | 0.2904 | 0.5453 | 0.7340 | 0.2369 |
+ | 0.0607 | 13.18 | 8000 | 0.3024 | 0.5424 | 0.7270 | 0.2529 |
+ | 0.0549 | 14.0 | 8500 | 0.3026 | 0.5524 | 0.7311 | 0.2684 |
+ | 0.0464 | 14.83 | 9000 | 0.3211 | 0.5538 | 0.7386 | 0.2505 |
+ | 0.0411 | 15.65 | 9500 | 0.3292 | 0.5591 | 0.7408 | 0.2672 |
+ | 0.0356 | 16.47 | 10000 | 0.3417 | 0.5633 | 0.7537 | 0.2492 |
+ | 0.0335 | 17.3 | 10500 | 0.3447 | 0.5601 | 0.7463 | 0.2536 |
+ | 0.0295 | 18.12 | 11000 | 0.3447 | 0.5678 | 0.7465 | 0.2715 |
+ | 0.0262 | 18.95 | 11500 | 0.3539 | 0.5642 | 0.7437 | 0.2653 |
+ | 0.0237 | 19.77 | 12000 | 0.3709 | 0.5631 | 0.7393 | 0.2801 |
+ | 0.0206 | 20.59 | 12500 | 0.3715 | 0.5617 | 0.7443 | 0.2783 |
+ | 0.0181 | 21.42 | 13000 | 0.3783 | 0.5672 | 0.7513 | 0.2641 |
+ | 0.0192 | 22.24 | 13500 | 0.3931 | 0.5622 | 0.7402 | 0.2672 |
+ | 0.0173 | 23.06 | 14000 | 0.3902 | 0.5665 | 0.7471 | 0.2709 |
+ | 0.0166 | 23.89 | 14500 | 0.4031 | 0.5649 | 0.7452 | 0.2740 |
+ | 0.0141 | 24.71 | 15000 | 0.4120 | 0.5632 | 0.7421 | 0.2764 |
+ | 0.0131 | 25.54 | 15500 | 0.4071 | 0.5644 | 0.7428 | 0.2845 |
+ | 0.013 | 26.36 | 16000 | 0.4122 | 0.5668 | 0.7412 | 0.2857 |
+ | 0.0121 | 27.18 | 16500 | 0.4253 | 0.5714 | 0.7505 | 0.2771 |
+ | 0.0109 | 28.01 | 17000 | 0.4323 | 0.5687 | 0.7462 | 0.2764 |
+ | 0.0112 | 28.83 | 17500 | 0.4433 | 0.5600 | 0.7401 | 0.2839 |
+ | 0.0099 | 29.65 | 18000 | 0.4374 | 0.5670 | 0.7446 | 0.2814 |
+ | 0.0106 | 30.48 | 18500 | 0.4395 | 0.5644 | 0.7488 | 0.2690 |
+ | 0.0104 | 31.3 | 19000 | 0.4369 | 0.5724 | 0.7498 | 0.2752 |
+ | 0.0085 | 32.13 | 19500 | 0.4469 | 0.5660 | 0.7430 | 0.2777 |
+ | 0.0093 | 32.95 | 20000 | 0.4483 | 0.5698 | 0.7463 | 0.2808 |
+ | 0.0085 | 33.77 | 20500 | 0.4549 | 0.5704 | 0.7580 | 0.2653 |
+ | 0.0093 | 34.6 | 21000 | 0.4579 | 0.5664 | 0.7420 | 0.2863 |
+ | 0.009 | 35.42 | 21500 | 0.4560 | 0.5726 | 0.7486 | 0.2808 |
+ | 0.0075 | 36.24 | 22000 | 0.4650 | 0.5635 | 0.7502 | 0.2715 |
+ | 0.0081 | 37.07 | 22500 | 0.4647 | 0.5659 | 0.7502 | 0.2715 |
+ | 0.0074 | 37.89 | 23000 | 0.4662 | 0.5674 | 0.7503 | 0.2758 |
+ | 0.0077 | 38.71 | 23500 | 0.4710 | 0.5676 | 0.7460 | 0.2771 |
+ | 0.0065 | 39.54 | 24000 | 0.4701 | 0.5659 | 0.7461 | 0.2801 |
+ | 0.0076 | 40.36 | 24500 | 0.4673 | 0.5687 | 0.7452 | 0.2777 |
+ | 0.0075 | 41.19 | 25000 | 0.4692 | 0.5643 | 0.7430 | 0.2715 |
+ | 0.0071 | 42.01 | 25500 | 0.4743 | 0.5697 | 0.7490 | 0.2771 |
+ | 0.0071 | 42.83 | 26000 | 0.4705 | 0.5678 | 0.7459 | 0.2703 |
+ | 0.0063 | 43.66 | 26500 | 0.4711 | 0.5682 | 0.7448 | 0.2777 |
+ | 0.0071 | 44.48 | 27000 | 0.4722 | 0.5671 | 0.7442 | 0.2715 |
+ | 0.0061 | 45.3 | 27500 | 0.4714 | 0.5680 | 0.7441 | 0.2789 |
+ | 0.0065 | 46.13 | 28000 | 0.4781 | 0.5712 | 0.7487 | 0.2764 |
+ | 0.0067 | 46.95 | 28500 | 0.4770 | 0.5699 | 0.7439 | 0.2764 |
+ | 0.0065 | 47.78 | 29000 | 0.4790 | 0.5697 | 0.7463 | 0.2789 |
+ | 0.006 | 48.6 | 29500 | 0.4782 | 0.5698 | 0.7463 | 0.2801 |
+ | 0.0058 | 49.42 | 30000 | 0.4789 | 0.5701 | 0.7465 | 0.2801 |
 
 
 ### Framework versions
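
In the results above, the reported F1 (micro-averaged over individual labels) sits well above the reported Accuracy, which for multi-label classification is typically exact-match (subset) accuracy: a prediction only counts if all 24 labels are correct at once. A minimal pure-Python sketch (the label vectors are illustrative, not data from this run) shows why the two metrics diverge:

```python
# Illustrative multi-label metrics: micro-F1 counts each label decision,
# subset accuracy only counts rows where the whole label vector matches.

def micro_f1(y_true, y_pred):
    """Micro-averaged F1 over binary label matrices (lists of 0/1 rows)."""
    tp = sum(t and p for rt, rp in zip(y_true, y_pred) for t, p in zip(rt, rp))
    fp = sum((not t) and p for rt, rp in zip(y_true, y_pred) for t, p in zip(rt, rp))
    fn = sum(t and (not p) for rt, rp in zip(y_true, y_pred) for t, p in zip(rt, rp))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def subset_accuracy(y_true, y_pred):
    """Fraction of rows whose predicted label vector matches exactly."""
    return sum(rt == rp for rt, rp in zip(y_true, y_pred)) / len(y_true)

y_true = [[1, 0, 1], [0, 1, 0], [1, 1, 0]]
y_pred = [[1, 0, 0], [0, 1, 0], [1, 0, 0]]

print(micro_f1(y_true, y_pred))        # 0.75: most individual label calls are right
print(subset_accuracy(y_true, y_pred)) # ~0.33: only one row matches completely
```

With many labels per example, missing even one label zeroes out that row for subset accuracy while barely moving micro-F1, which is consistent with the 0.5701 F1 vs 0.2801 Accuracy gap reported here.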
config.json CHANGED
@@ -10,22 +10,58 @@
 "hidden_dropout_prob": 0.1,
 "hidden_size": 768,
 "id2label": {
-   "0": "happy",
-   "1": "sad",
-   "2": "anger",
-   "3": "surprise",
-   "4": "disgust",
-   "5": "fear"
+   "0": "agile",
+   "1": "analysis",
+   "2": "architecture",
+   "3": "css",
+   "4": "database",
+   "5": "developer",
+   "6": "development",
+   "7": "html",
+   "8": "java",
+   "9": "javascript",
+   "10": "jquery",
+   "11": "linux",
+   "12": "manager",
+   "13": "networking",
+   "14": "oracle",
+   "15": "programming",
+   "16": "project management",
+   "17": "python",
+   "18": "security",
+   "19": "sql",
+   "20": "testing",
+   "21": "unix",
+   "22": "windows",
+   "23": "xml"
 },
 "initializer_range": 0.02,
 "intermediate_size": 3072,
 "label2id": {
-   "anger": 2,
-   "disgust": 4,
-   "fear": 5,
-   "happy": 0,
-   "sad": 1,
-   "surprise": 3
+   "agile": 0,
+   "analysis": 1,
+   "architecture": 2,
+   "css": 3,
+   "database": 4,
+   "developer": 5,
+   "development": 6,
+   "html": 7,
+   "java": 8,
+   "javascript": 9,
+   "jquery": 10,
+   "linux": 11,
+   "manager": 12,
+   "networking": 13,
+   "oracle": 14,
+   "programming": 15,
+   "project management": 16,
+   "python": 17,
+   "security": 18,
+   "sql": 19,
+   "testing": 20,
+   "unix": 21,
+   "windows": 22,
+   "xml": 23
 },
 "layer_norm_eps": 1e-12,
 "max_position_embeddings": 512,
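
This change swaps a 6-way emotion label map for 24 skill tags. Since the README reports per-label F1 and ROC AUC, predictions are presumably made per label: apply an independent sigmoid to each label's logit, keep indices above a threshold, and map them through `id2label`. A hedged sketch of that decoding step, using made-up logits and a truncated copy of the label map (a real run would take the logits from the fine-tuned model's output):

```python
import math

# First four entries of the id2label map from the new config (truncated
# here for brevity; the full map has 24 skill labels).
ID2LABEL = {0: "agile", 1: "analysis", 2: "architecture", 3: "css"}

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_skills(logits, threshold=0.5):
    """Multi-label decoding: each label gets its own sigmoid, not a
    shared softmax, so any number of skills can fire at once."""
    return [ID2LABEL[i] for i, z in enumerate(logits) if sigmoid(z) >= threshold]

# Made-up logits for illustration only.
print(predict_skills([2.1, -1.3, 0.4, -3.0]))  # ['agile', 'architecture']
```

Lowering the threshold trades precision for recall, which in turn shifts the F1/Accuracy balance discussed in the model card.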
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:19b68ec206040d3e607b19373fffe9cae690f496f760bfee49c3cf09989d9951
- size 437970952
+ oid sha256:9511cb3dfc8cf57136757f38111ea391bcea7e1254041fb65ad7e0c49c7a7ff1
+ size 438026320
runs/Jan01_05-27-32_99ea7efe281d/events.out.tfevents.1704087008.99ea7efe281d.395.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dabc4f25664592ab0c62d4cadb328214e24c79f7c01cd6d5a043bf4810d94f41
+ size 36346
runs/Jan01_07-38-06_99ea7efe281d/events.out.tfevents.1704094698.99ea7efe281d.395.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:804e2bd47a51e32e422fbb477ea2761a69a5a72171c3968ba1c65befea84adda
+ size 40579
runs/Jan01_07-38-06_99ea7efe281d/events.out.tfevents.1704101796.99ea7efe281d.395.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0d6ca0a9f6f1c900c627bb55497d4704d0a7cbb9879808aab6f7cfd83656c0b3
+ size 516
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:8e08720faad6ba3e84539950f8029b94aff1a5828eadcc49f3c44cae89760249
+ oid sha256:7675e2932b4c967008c586a38936d3c87221d567b32d1cff808038411e856e5a
 size 4600
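
The model.safetensors, training_args.bin, and event-file entries above are Git LFS pointer files, not the binaries themselves: three "key value" lines giving the spec version, a sha256 oid, and the byte size of the real artifact. A small sketch that parses the updated model.safetensors pointer shown above:

```python
# Parse a Git LFS pointer file (three "key value" lines) into a dict.
# The pointer text is copied from the model.safetensors entry in this commit.
POINTER = """version https://git-lfs.github.com/spec/v1
oid sha256:9511cb3dfc8cf57136757f38111ea391bcea7e1254041fb65ad7e0c49c7a7ff1
size 438026320
"""

def parse_lfs_pointer(text):
    # Each line is "<key> <value>"; split only on the first space.
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    fields["size"] = int(fields["size"])  # byte count of the real file
    return fields

info = parse_lfs_pointer(POINTER)
print(info["size"])  # 438026320 -- roughly the 438 MB BERT-base checkpoint
print(info["oid"])   # sha256:9511cb3d...
```

The size bump from 437970952 to 438026320 bytes is consistent with the classifier head growing from 6 to 24 output labels.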