satyanshu404 committed
Commit: 076397c
Parent(s): 6ad89ac
End of training
README.md
ADDED
@@ -0,0 +1,160 @@
---
license: mit
base_model: facebook/bart-large-cnn
tags:
- generated_from_trainer
model-index:
- name: bart-large-cnn-prompt_generation
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bart-large-cnn-prompt_generation

This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.5934
- Actual score: 0.8766
- Prediction score: 1.3535
- Score difference: -0.4769

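The snippet below is a minimal inference sketch, not the authors' training or evaluation code. It assumes the checkpoint is published as `satyanshu404/bart-large-cnn-prompt_generation` (inferred from the model name above); adjust the repo id and the placeholder input text as needed.

```python
# Hedged sketch: load the fine-tuned checkpoint and generate a prompt for an input text.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

repo_id = "satyanshu404/bart-large-cnn-prompt_generation"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

text = "..."  # placeholder: the input document to generate a prompt for
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
output_ids = model.generate(**inputs)  # generation defaults come from generation_config.json
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
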
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 3e-07
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP

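As a rough, hedged sketch (the original training script is not part of this commit), the hyperparameters above map onto `Seq2SeqTrainingArguments` roughly as follows; `output_dir` and `evaluation_strategy` are assumptions:

```python
# Hedged sketch of training arguments matching the hyperparameters listed above.
# output_dir and evaluation_strategy are assumptions, not taken from the original run.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-large-cnn-prompt_generation",  # assumed output directory
    learning_rate=3e-7,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # mixed_precision_training: Native AMP
    evaluation_strategy="epoch",  # assumed; the results table reports metrics once per epoch
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the default optimizer, so it is not set explicitly.
```
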
### Training results

| Training Loss | Epoch | Step | Validation Loss | Actual score | Prediction score | Score difference |
|:-------------:|:-----:|:----:|:---------------:|:------------:|:----------------:|:----------------:|
| No log | 1.0 | 15 | 3.6224 | 0.8766 | -0.4105 | 1.2871 |
| No log | 2.0 | 30 | 3.5086 | 0.8766 | -0.2477 | 1.1243 |
| No log | 3.0 | 45 | 3.3524 | 0.8766 | -0.3119 | 1.1886 |
| No log | 4.0 | 60 | 3.2496 | 0.8766 | -0.1139 | 0.9905 |
| No log | 5.0 | 75 | 3.1300 | 0.8766 | -0.3163 | 1.1929 |
| No log | 6.0 | 90 | 3.0445 | 0.8766 | -0.4738 | 1.3504 |
| No log | 7.0 | 105 | 2.9855 | 0.8766 | -0.5561 | 1.4327 |
| No log | 8.0 | 120 | 2.9429 | 0.8766 | -0.6262 | 1.5028 |
| No log | 9.0 | 135 | 2.9103 | 0.8766 | -0.4633 | 1.3399 |
| No log | 10.0 | 150 | 2.8818 | 0.8766 | -0.5404 | 1.4170 |
| No log | 11.0 | 165 | 2.8567 | 0.8766 | -0.7534 | 1.6300 |
| No log | 12.0 | 180 | 2.8327 | 0.8766 | -0.7283 | 1.6049 |
| No log | 13.0 | 195 | 2.8114 | 0.8766 | -0.5976 | 1.4742 |
| No log | 14.0 | 210 | 2.7917 | 0.8766 | -0.7693 | 1.6460 |
| No log | 15.0 | 225 | 2.7749 | 0.8766 | -0.5831 | 1.4597 |
| No log | 16.0 | 240 | 2.7596 | 0.8766 | -0.5963 | 1.4729 |
| No log | 17.0 | 255 | 2.7458 | 0.8766 | -0.5232 | 1.3998 |
| No log | 18.0 | 270 | 2.7329 | 0.8766 | -0.1795 | 1.0562 |
| No log | 19.0 | 285 | 2.7211 | 0.8766 | -0.2189 | 1.0955 |
| No log | 20.0 | 300 | 2.7111 | 0.8766 | -0.3411 | 1.2177 |
| No log | 21.0 | 315 | 2.7022 | 0.8766 | -0.3058 | 1.1824 |
| No log | 22.0 | 330 | 2.6936 | 0.8766 | -0.3270 | 1.2036 |
| No log | 23.0 | 345 | 2.6853 | 0.8766 | -0.1728 | 1.0494 |
| No log | 24.0 | 360 | 2.6771 | 0.8766 | -0.2413 | 1.1179 |
| No log | 25.0 | 375 | 2.6700 | 0.8766 | 0.0077 | 0.8689 |
| No log | 26.0 | 390 | 2.6641 | 0.8766 | -0.0744 | 0.9510 |
| No log | 27.0 | 405 | 2.6589 | 0.8766 | 0.0078 | 0.8689 |
| No log | 28.0 | 420 | 2.6540 | 0.8766 | 0.0711 | 0.8055 |
| No log | 29.0 | 435 | 2.6493 | 0.8766 | 0.2289 | 0.6477 |
| No log | 30.0 | 450 | 2.6443 | 0.8766 | 0.1096 | 0.7670 |
| No log | 31.0 | 465 | 2.6393 | 0.8766 | 0.1335 | 0.7431 |
| No log | 32.0 | 480 | 2.6355 | 0.8766 | 0.3491 | 0.5275 |
| No log | 33.0 | 495 | 2.6321 | 0.8766 | 0.4268 | 0.4498 |
| 2.6272 | 34.0 | 510 | 2.6288 | 0.8766 | 0.3806 | 0.4960 |
| 2.6272 | 35.0 | 525 | 2.6258 | 0.8766 | 0.8496 | 0.0271 |
| 2.6272 | 36.0 | 540 | 2.6231 | 0.8766 | 0.6446 | 0.2321 |
| 2.6272 | 37.0 | 555 | 2.6204 | 0.8766 | 0.6268 | 0.2498 |
| 2.6272 | 38.0 | 570 | 2.6176 | 0.8766 | 0.8588 | 0.0178 |
| 2.6272 | 39.0 | 585 | 2.6159 | 0.8766 | 0.9990 | -0.1224 |
| 2.6272 | 40.0 | 600 | 2.6132 | 0.8766 | 1.0628 | -0.1862 |
| 2.6272 | 41.0 | 615 | 2.6111 | 0.8766 | 0.9146 | -0.0380 |
| 2.6272 | 42.0 | 630 | 2.6092 | 0.8766 | 1.0457 | -0.1691 |
| 2.6272 | 43.0 | 645 | 2.6078 | 0.8766 | 0.9640 | -0.0874 |
| 2.6272 | 44.0 | 660 | 2.6059 | 0.8766 | 1.0378 | -0.1612 |
| 2.6272 | 45.0 | 675 | 2.6047 | 0.8766 | 1.0599 | -0.1833 |
| 2.6272 | 46.0 | 690 | 2.6034 | 0.8766 | 1.1746 | -0.2980 |
| 2.6272 | 47.0 | 705 | 2.6019 | 0.8766 | 1.1497 | -0.2730 |
| 2.6272 | 48.0 | 720 | 2.6002 | 0.8766 | 1.2987 | -0.4221 |
| 2.6272 | 49.0 | 735 | 2.5988 | 0.8766 | 1.2149 | -0.3383 |
| 2.6272 | 50.0 | 750 | 2.5982 | 0.8766 | 1.2456 | -0.3690 |
| 2.6272 | 51.0 | 765 | 2.5973 | 0.8766 | 1.2476 | -0.3709 |
| 2.6272 | 52.0 | 780 | 2.5958 | 0.8766 | 1.2934 | -0.4168 |
| 2.6272 | 53.0 | 795 | 2.5948 | 0.8766 | 1.2370 | -0.3604 |
| 2.6272 | 54.0 | 810 | 2.5937 | 0.8766 | 1.2163 | -0.3397 |
| 2.6272 | 55.0 | 825 | 2.5926 | 0.8766 | 1.2636 | -0.3869 |
| 2.6272 | 56.0 | 840 | 2.5923 | 0.8766 | 1.3040 | -0.4273 |
| 2.6272 | 57.0 | 855 | 2.5921 | 0.8766 | 1.3694 | -0.4928 |
| 2.6272 | 58.0 | 870 | 2.5916 | 0.8766 | 1.1951 | -0.3185 |
| 2.6272 | 59.0 | 885 | 2.5916 | 0.8766 | 1.3291 | -0.4525 |
| 2.6272 | 60.0 | 900 | 2.5914 | 0.8766 | 1.3288 | -0.4521 |
| 2.6272 | 61.0 | 915 | 2.5914 | 0.8766 | 1.3867 | -0.5101 |
| 2.6272 | 62.0 | 930 | 2.5916 | 0.8766 | 1.4165 | -0.5399 |
| 2.6272 | 63.0 | 945 | 2.5915 | 0.8766 | 1.4103 | -0.5337 |
| 2.6272 | 64.0 | 960 | 2.5910 | 0.8766 | 1.3960 | -0.5194 |
| 2.6272 | 65.0 | 975 | 2.5908 | 0.8766 | 1.3134 | -0.4368 |
| 2.6272 | 66.0 | 990 | 2.5903 | 0.8766 | 1.3638 | -0.4872 |
| 1.9897 | 67.0 | 1005 | 2.5900 | 0.8766 | 1.3875 | -0.5109 |
| 1.9897 | 68.0 | 1020 | 2.5901 | 0.8766 | 1.2404 | -0.3637 |
| 1.9897 | 69.0 | 1035 | 2.5900 | 0.8766 | 1.4162 | -0.5396 |
| 1.9897 | 70.0 | 1050 | 2.5899 | 0.8766 | 1.4048 | -0.5281 |
| 1.9897 | 71.0 | 1065 | 2.5900 | 0.8766 | 1.3967 | -0.5201 |
| 1.9897 | 72.0 | 1080 | 2.5900 | 0.8766 | 1.4208 | -0.5442 |
| 1.9897 | 73.0 | 1095 | 2.5903 | 0.8766 | 1.4418 | -0.5651 |
| 1.9897 | 74.0 | 1110 | 2.5903 | 0.8766 | 1.4656 | -0.5890 |
| 1.9897 | 75.0 | 1125 | 2.5905 | 0.8766 | 1.4504 | -0.5738 |
| 1.9897 | 76.0 | 1140 | 2.5910 | 0.8766 | 1.3669 | -0.4903 |
| 1.9897 | 77.0 | 1155 | 2.5912 | 0.8766 | 1.3362 | -0.4595 |
| 1.9897 | 78.0 | 1170 | 2.5917 | 0.8766 | 1.3196 | -0.4430 |
| 1.9897 | 79.0 | 1185 | 2.5918 | 0.8766 | 1.3537 | -0.4770 |
| 1.9897 | 80.0 | 1200 | 2.5921 | 0.8766 | 1.3136 | -0.4370 |
| 1.9897 | 81.0 | 1215 | 2.5923 | 0.8766 | 1.3806 | -0.5039 |
| 1.9897 | 82.0 | 1230 | 2.5926 | 0.8766 | 1.3900 | -0.5134 |
| 1.9897 | 83.0 | 1245 | 2.5924 | 0.8766 | 1.3907 | -0.5141 |
| 1.9897 | 84.0 | 1260 | 2.5924 | 0.8766 | 1.3785 | -0.5019 |
| 1.9897 | 85.0 | 1275 | 2.5926 | 0.8766 | 1.4009 | -0.5243 |
| 1.9897 | 86.0 | 1290 | 2.5928 | 0.8766 | 1.4108 | -0.5342 |
| 1.9897 | 87.0 | 1305 | 2.5929 | 0.8766 | 1.3947 | -0.5180 |
| 1.9897 | 88.0 | 1320 | 2.5929 | 0.8766 | 1.3845 | -0.5078 |
| 1.9897 | 89.0 | 1335 | 2.5928 | 0.8766 | 1.4045 | -0.5279 |
| 1.9897 | 90.0 | 1350 | 2.5929 | 0.8766 | 1.3804 | -0.5038 |
| 1.9897 | 91.0 | 1365 | 2.5931 | 0.8766 | 1.3962 | -0.5195 |
| 1.9897 | 92.0 | 1380 | 2.5931 | 0.8766 | 1.3801 | -0.5034 |
| 1.9897 | 93.0 | 1395 | 2.5932 | 0.8766 | 1.3664 | -0.4897 |
| 1.9897 | 94.0 | 1410 | 2.5933 | 0.8766 | 1.3716 | -0.4950 |
| 1.9897 | 95.0 | 1425 | 2.5933 | 0.8766 | 1.3935 | -0.5169 |
| 1.9897 | 96.0 | 1440 | 2.5933 | 0.8766 | 1.3676 | -0.4910 |
| 1.9897 | 97.0 | 1455 | 2.5934 | 0.8766 | 1.3914 | -0.5148 |
| 1.9897 | 98.0 | 1470 | 2.5933 | 0.8766 | 1.3912 | -0.5146 |
| 1.9897 | 99.0 | 1485 | 2.5934 | 0.8766 | 1.3930 | -0.5164 |
| 1.7966 | 100.0 | 1500 | 2.5934 | 0.8766 | 1.3535 | -0.4769 |

### Framework versions

- Transformers 4.35.0
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1
generation_config.json
ADDED
@@ -0,0 +1,15 @@
{
  "bos_token_id": 0,
  "decoder_start_token_id": 2,
  "early_stopping": true,
  "eos_token_id": 2,
  "forced_bos_token_id": 0,
  "forced_eos_token_id": 2,
  "length_penalty": 2.0,
  "max_length": 142,
  "min_length": 56,
  "no_repeat_ngram_size": 3,
  "num_beams": 4,
  "pad_token_id": 1,
  "transformers_version": "4.35.0"
}
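
These defaults (4-beam search, length penalty 2.0, output length between 56 and 142 tokens) are applied automatically by `model.generate` when the model is loaded from this repo, and appear to be carried over from the `facebook/bart-large-cnn` summarization defaults. Below is a hedged sketch of setting the same values explicitly; the repo id and input text are assumptions/placeholders.

```python
# Hedged sketch: build a GenerationConfig mirroring generation_config.json and pass it
# explicitly to generate(); normally these defaults are loaded automatically.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, GenerationConfig

repo_id = "satyanshu404/bart-large-cnn-prompt_generation"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

gen_config = GenerationConfig(
    num_beams=4,
    early_stopping=True,
    length_penalty=2.0,
    max_length=142,
    min_length=56,
    no_repeat_ngram_size=3,
)
inputs = tokenizer("...", return_tensors="pt")  # placeholder input text
output_ids = model.generate(**inputs, generation_config=gen_config)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
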
runs/Nov05_12-12-16_34d2c58fe16d/events.out.tfevents.1699186345.34d2c58fe16d.725.0
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:96f69b76c93c15a703ad9a0647351d2a53d07fc2eec1df9117fbc287b012f5ab
+size 50754
runs/Nov05_12-12-16_34d2c58fe16d/events.out.tfevents.1699193765.34d2c58fe16d.725.1
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:58ed37aacd31f0fd73ae6590c3db148f2073a7c4c8b182e7f665f47ec3005a5a
size 534