# fine-tuned-BioBART-50-epochs-1024-input-160-output
This model is a fine-tuned version of [GanjinZero/biobart-base](https://huggingface.co/GanjinZero/biobart-base) on an unspecified dataset. It achieves the following results on the evaluation set (an inference sketch follows the metrics):
- Loss: 1.6492
- ROUGE-1: 0.173
- ROUGE-2: 0.0346
- ROUGE-L: 0.1373
- ROUGE-Lsum: 0.1364
- Gen Len: 40.05 (mean generated length, in tokens)
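The card ships without usage code. Below is a minimal inference sketch; the Hub ID is taken from the model's repository name, and the 1024-token input / 160-token output caps are inferred from that name rather than stated in the card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hub ID from the repository name; length caps inferred from the model name.
model_id = "tanatapanun/fine-tuned-BioBART-50-epochs-1024-input-160-output"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # a biomedical passage to summarize
inputs = tokenizer(text, max_length=1024, truncation=True, return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=160, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Beam search (`num_beams=4`) is an assumption; the card does not record the generation settings behind the reported Gen Len of 40.05.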
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
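These map directly onto `Seq2SeqTrainingArguments` in Transformers 4.36; a sketch follows, with the values the card does not report (output directory, evaluation cadence) marked as assumptions:

```python
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="fine-tuned-BioBART-50-epochs-1024-input-160-output",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumed from the per-epoch results below
    predict_with_generate=True,   # assumed; needed for ROUGE/Gen Len at eval
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer
# defaults (adam_beta1, adam_beta2, adam_epsilon), so nothing extra is set.
```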
### Training results
| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len |
|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 151 | 8.7694 | 0.0 | 0.0 | 0.0 | 0.0 | 14.0 |
No log | 2.0 | 302 | 4.3319 | 0.0047 | 0.0003 | 0.0047 | 0.0047 | 4.35 |
No log | 3.0 | 453 | 1.6898 | 0.1088 | 0.0334 | 0.099 | 0.0996 | 15.96 |
6.0134 | 4.0 | 604 | 1.4547 | 0.1004 | 0.0207 | 0.0776 | 0.0769 | 24.61 |
6.0134 | 5.0 | 755 | 1.3712 | 0.1532 | 0.0285 | 0.1197 | 0.1194 | 38.39 |
6.0134 | 6.0 | 906 | 1.3235 | 0.1144 | 0.0282 | 0.0901 | 0.0913 | 23.41 |
1.2432 | 7.0 | 1057 | 1.2835 | 0.1011 | 0.025 | 0.0784 | 0.0798 | 25.49 |
1.2432 | 8.0 | 1208 | 1.2733 | 0.1536 | 0.0387 | 0.1169 | 0.1183 | 38.0 |
1.2432 | 9.0 | 1359 | 1.2842 | 0.1386 | 0.0244 | 0.1162 | 0.1162 | 20.83 |
0.7926 | 10.0 | 1510 | 1.2752 | 0.1812 | 0.0353 | 0.1352 | 0.1363 | 45.95 |
0.7926 | 11.0 | 1661 | 1.2846 | 0.1804 | 0.0378 | 0.1452 | 0.1464 | 31.63 |
0.7926 | 12.0 | 1812 | 1.2998 | 0.1899 | 0.0432 | 0.1346 | 0.1348 | 48.98 |
0.7926 | 13.0 | 1963 | 1.3226 | 0.1809 | 0.0474 | 0.143 | 0.1438 | 33.78 |
0.4817 | 14.0 | 2114 | 1.3471 | 0.1425 | 0.0341 | 0.1024 | 0.1028 | 37.38 |
0.4817 | 15.0 | 2265 | 1.3651 | 0.1805 | 0.0315 | 0.1402 | 0.1412 | 33.77 |
0.4817 | 16.0 | 2416 | 1.3818 | 0.1469 | 0.0333 | 0.1188 | 0.1191 | 30.55 |
0.2578 | 17.0 | 2567 | 1.3936 | 0.1734 | 0.0353 | 0.1339 | 0.133 | 36.63 |
0.2578 | 18.0 | 2718 | 1.4192 | 0.1988 | 0.0471 | 0.1576 | 0.1587 | 40.01 |
0.2578 | 19.0 | 2869 | 1.4183 | 0.1852 | 0.0378 | 0.1449 | 0.1444 | 39.72 |
0.1232 | 20.0 | 3020 | 1.4483 | 0.1625 | 0.0442 | 0.1285 | 0.1296 | 36.7 |
0.1232 | 21.0 | 3171 | 1.4582 | 0.1771 | 0.0408 | 0.1321 | 0.1329 | 41.33 |
0.1232 | 22.0 | 3322 | 1.4860 | 0.1813 | 0.0429 | 0.1458 | 0.1458 | 40.09 |
0.1232 | 23.0 | 3473 | 1.5091 | 0.1616 | 0.0373 | 0.1273 | 0.1269 | 37.73 |
0.0543 | 24.0 | 3624 | 1.4922 | 0.1914 | 0.0371 | 0.1429 | 0.143 | 45.71 |
0.0543 | 25.0 | 3775 | 1.5290 | 0.1642 | 0.0388 | 0.1307 | 0.1315 | 36.5 |
0.0543 | 26.0 | 3926 | 1.5310 | 0.1929 | 0.0428 | 0.1524 | 0.1521 | 40.69 |
0.0278 | 27.0 | 4077 | 1.5282 | 0.1691 | 0.0414 | 0.1355 | 0.1362 | 39.25 |
0.0278 | 28.0 | 4228 | 1.5424 | 0.1749 | 0.0424 | 0.1404 | 0.1408 | 44.13 |
0.0278 | 29.0 | 4379 | 1.5573 | 0.1922 | 0.0364 | 0.1549 | 0.1548 | 41.01 |
0.0174 | 30.0 | 4530 | 1.5614 | 0.1635 | 0.0358 | 0.1313 | 0.1318 | 39.58 |
0.0174 | 31.0 | 4681 | 1.5683 | 0.187 | 0.0427 | 0.1508 | 0.1508 | 39.52 |
0.0174 | 32.0 | 4832 | 1.5910 | 0.172 | 0.0262 | 0.1312 | 0.13 | 39.9 |
0.0174 | 33.0 | 4983 | 1.5748 | 0.1828 | 0.0429 | 0.1471 | 0.1483 | 38.88 |
0.0118 | 34.0 | 5134 | 1.5834 | 0.1702 | 0.034 | 0.1327 | 0.1321 | 38.71 |
0.0118 | 35.0 | 5285 | 1.5935 | 0.1987 | 0.0451 | 0.1576 | 0.1577 | 40.79 |
0.0118 | 36.0 | 5436 | 1.5993 | 0.193 | 0.0407 | 0.156 | 0.1555 | 41.14 |
0.009 | 37.0 | 5587 | 1.6120 | 0.1818 | 0.0393 | 0.1406 | 0.1408 | 40.82 |
0.009 | 38.0 | 5738 | 1.6203 | 0.1699 | 0.034 | 0.1344 | 0.1353 | 40.08 |
0.009 | 39.0 | 5889 | 1.6201 | 0.1866 | 0.0419 | 0.1446 | 0.1443 | 40.17 |
0.0068 | 40.0 | 6040 | 1.6161 | 0.1708 | 0.0279 | 0.136 | 0.1365 | 42.42 |
0.0068 | 41.0 | 6191 | 1.6334 | 0.1753 | 0.0396 | 0.14 | 0.14 | 38.92 |
0.0068 | 42.0 | 6342 | 1.6321 | 0.1806 | 0.0397 | 0.1448 | 0.1449 | 37.77 |
0.0068 | 43.0 | 6493 | 1.6399 | 0.1881 | 0.0373 | 0.1508 | 0.1499 | 40.48 |
0.0055 | 44.0 | 6644 | 1.6371 | 0.1847 | 0.0364 | 0.1486 | 0.1479 | 39.22 |
0.0055 | 45.0 | 6795 | 1.6421 | 0.1879 | 0.0368 | 0.1499 | 0.1491 | 40.72 |
0.0055 | 46.0 | 6946 | 1.6471 | 0.1862 | 0.0381 | 0.1484 | 0.1483 | 40.26 |
0.0044 | 47.0 | 7097 | 1.6503 | 0.1719 | 0.036 | 0.1362 | 0.1351 | 40.92 |
0.0044 | 48.0 | 7248 | 1.6493 | 0.1711 | 0.036 | 0.1375 | 0.1377 | 39.36 |
0.0044 | 49.0 | 7399 | 1.6492 | 0.1738 | 0.0353 | 0.1375 | 0.1365 | 40.71 |
0.004 | 50.0 | 7550 | 1.6492 | 0.173 | 0.0346 | 0.1373 | 0.1364 | 40.05 |
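Training loss falls to 0.004 while validation loss bottoms out at 1.2733 around epoch 8 and climbs back to 1.65, so the final checkpoint sits well past the overfitting point; ROUGE-1 peaks near 0.199 at epochs 18 and 35. The per-epoch ROUGE and Gen Len columns imply a `compute_metrics` hook; a sketch assuming the `evaluate` library was used:

```python
import numpy as np
import evaluate
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("GanjinZero/biobart-base")
rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # -100 marks ignored label positions; restore the pad id before decoding
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    result = rouge.compute(predictions=decoded_preds, references=decoded_labels)
    # "Gen Len" is the mean number of non-pad tokens in the generated outputs
    result["gen_len"] = float(
        np.mean([np.count_nonzero(p != tokenizer.pad_token_id) for p in preds])
    )
    return {k: round(v, 4) for k, v in result.items()}
```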
### Framework versions
- Transformers 4.36.2
- PyTorch 1.12.1+cu113
- Datasets 2.16.1
- Tokenizers 0.15.0