---
license: apache-2.0
base_model: google/flan-t5-base
tags:
- generated_from_trainer
datasets:
- hdfs_log_summary_dataset
metrics:
- rouge
model-index:
- name: flan-log-sage
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: hdfs_log_summary_dataset
type: hdfs_log_summary_dataset
config: default
split: train
args: default
metrics:
- name: Rouge1
type: rouge
value: 0.4709
pipeline_tag: summarization
---
# flan-log-sage
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the `hdfs_log_summary_dataset` dataset.
It achieves the following results on the evaluation set (a sketch for recomputing such metrics appears after the list):
- Loss: 1.5181
- Rouge1: 0.4709
- Rouge2: 0.1615
- Rougel: 0.3748
- Rougelsum: 0.3905
- Gen Len: 19.0
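
The ROUGE scores above come from the Trainer's per-epoch evaluation loop. Below is a minimal sketch of recomputing such scores with the `evaluate` library; the prediction and reference strings are hypothetical placeholders, not samples from the dataset:

```python
# Minimal sketch: compute ROUGE with the `evaluate` library.
# The strings below are hypothetical placeholders, not dataset samples.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["Block blk_38865049064139660 finished replication."]     # hypothetical model output
references = ["Replication of block blk_38865049064139660 completed."]  # hypothetical gold summary
print(rouge.compute(predictions=predictions, references=references))
# -> {'rouge1': ..., 'rouge2': ..., 'rougeL': ..., 'rougeLsum': ...}
```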
## Model description
flan-log-sage is [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) fine-tuned as a text2text summarizer for HDFS log data: given an excerpt of HDFS log lines, it generates a short natural-language summary (generated summaries average about 19 tokens on the evaluation set).
## Intended uses & limitations
The model is intended for summarizing HDFS log excerpts, for example as a building block in log triage or monitoring pipelines. Limitations worth noting: the training log below shows 12 steps per epoch at batch size 4, which suggests a training set of roughly 48 examples; the reported ROUGE was computed on the `train` split per the metadata above; and generated summaries are short (Gen Len ≈ 19 tokens). It is therefore unlikely to generalize well to non-HDFS logs or to long-form summaries. A minimal inference sketch follows.
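
A minimal inference sketch with 🤗 Transformers; the Hub repo id and the example log line are assumptions, not part of this card:

```python
# Minimal inference sketch. The repo id below is an assumption; substitute
# the actual Hub id or a local checkpoint directory.
from transformers import pipeline

summarizer = pipeline("summarization", model="IrwinD/flan-log-sage")  # hypothetical repo id

log_excerpt = (
    "081109 203615 148 INFO dfs.DataNode$PacketResponder: "
    "PacketResponder 1 for block blk_38865049064139660 terminating"
)
print(summarizer(log_excerpt, max_new_tokens=20)[0]["summary_text"])
```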
## Training and evaluation data
Fine-tuning and evaluation used `hdfs_log_summary_dataset`, which (going by its name and the task type) pairs HDFS log text with reference summaries. Per the metadata above, the reported ROUGE scores were computed on the `train` split.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
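
A sketch of training arguments matching the list above; `output_dir`, the per-epoch evaluation cadence, and the use of the seq2seq trainer (implied by the generation-based ROUGE/Gen Len metrics) are assumptions. Adam betas and epsilon are the library defaults, so they are not set explicitly:

```python
# Sketch of Seq2SeqTrainingArguments matching the listed hyperparameters.
# output_dir and evaluation cadence are assumptions, not from this card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-log-sage",    # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,                     # "Native AMP" mixed precision
    evaluation_strategy="epoch",   # assumed; the results table logs metrics once per epoch
    predict_with_generate=True,    # needed to report ROUGE and Gen Len during evaluation
)
```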
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| No log | 1.0 | 12 | 2.9597 | 0.1985 | 0.0098 | 0.1629 | 0.1658 | 18.8 |
| No log | 2.0 | 24 | 2.5389 | 0.3028 | 0.0271 | 0.2401 | 0.2492 | 17.8 |
| No log | 3.0 | 36 | 2.2506 | 0.3349 | 0.0688 | 0.2549 | 0.2789 | 19.0 |
| No log | 4.0 | 48 | 2.0524 | 0.4046 | 0.0982 | 0.3249 | 0.3409 | 19.0 |
| No log | 5.0 | 60 | 1.9082 | 0.4479 | 0.1438 | 0.3449 | 0.3617 | 19.0 |
| No log | 6.0 | 72 | 1.8325 | 0.4564 | 0.1577 | 0.3402 | 0.3562 | 18.8 |
| No log | 7.0 | 84 | 1.7565 | 0.4441 | 0.1456 | 0.3335 | 0.351 | 19.0 |
| No log | 8.0 | 96 | 1.7091 | 0.4691 | 0.1732 | 0.3486 | 0.3667 | 19.0 |
| No log | 9.0 | 108 | 1.6683 | 0.4847 | 0.1645 | 0.3589 | 0.3667 | 19.0 |
| No log | 10.0 | 120 | 1.5987 | 0.4847 | 0.1727 | 0.3667 | 0.3667 | 19.0 |
| No log | 11.0 | 132 | 1.5606 | 0.4684 | 0.1935 | 0.3746 | 0.3751 | 19.0 |
| No log | 12.0 | 144 | 1.5245 | 0.4749 | 0.193 | 0.3817 | 0.3894 | 19.0 |
| No log | 13.0 | 156 | 1.4859 | 0.5163 | 0.2289 | 0.3802 | 0.3879 | 19.0 |
| No log | 14.0 | 168 | 1.4950 | 0.4404 | 0.1522 | 0.3474 | 0.3474 | 19.0 |
| No log | 15.0 | 180 | 1.4552 | 0.4609 | 0.1865 | 0.3573 | 0.362 | 19.0 |
| No log | 16.0 | 192 | 1.4501 | 0.4521 | 0.1685 | 0.342 | 0.3423 | 19.0 |
| No log | 17.0 | 204 | 1.3955 | 0.4763 | 0.1769 | 0.3788 | 0.379 | 19.0 |
| No log | 18.0 | 216 | 1.4192 | 0.4602 | 0.199 | 0.3168 | 0.3178 | 19.0 |
| No log | 19.0 | 228 | 1.3750 | 0.411 | 0.1258 | 0.3168 | 0.3269 | 19.0 |
| No log | 20.0 | 240 | 1.3660 | 0.5038 | 0.2293 | 0.3638 | 0.3649 | 19.0 |
| No log | 21.0 | 252 | 1.3610 | 0.4508 | 0.1364 | 0.3319 | 0.3397 | 19.0 |
| No log | 22.0 | 264 | 1.3437 | 0.4495 | 0.1225 | 0.3217 | 0.3239 | 19.0 |
| No log | 23.0 | 276 | 1.3394 | 0.4495 | 0.1225 | 0.3217 | 0.3239 | 19.0 |
| No log | 24.0 | 288 | 1.3716 | 0.4499 | 0.1459 | 0.3562 | 0.3727 | 19.0 |
| No log | 25.0 | 300 | 1.3673 | 0.4427 | 0.1585 | 0.3704 | 0.3784 | 19.0 |
| No log | 26.0 | 312 | 1.3225 | 0.4427 | 0.1585 | 0.3704 | 0.3784 | 19.0 |
| No log | 27.0 | 324 | 1.3041 | 0.4308 | 0.1457 | 0.3426 | 0.352 | 19.0 |
| No log | 28.0 | 336 | 1.3350 | 0.4508 | 0.1459 | 0.3562 | 0.3647 | 19.0 |
| No log | 29.0 | 348 | 1.3438 | 0.4243 | 0.1256 | 0.3364 | 0.3439 | 19.0 |
| No log | 30.0 | 360 | 1.3332 | 0.4302 | 0.1262 | 0.3394 | 0.3474 | 19.0 |
| No log | 31.0 | 372 | 1.3551 | 0.4647 | 0.1385 | 0.3595 | 0.3595 | 19.0 |
| No log | 32.0 | 384 | 1.3822 | 0.4647 | 0.1385 | 0.3595 | 0.3595 | 19.0 |
| No log | 33.0 | 396 | 1.3978 | 0.4647 | 0.1385 | 0.3595 | 0.3595 | 19.0 |
| No log | 34.0 | 408 | 1.4044 | 0.4469 | 0.1331 | 0.3518 | 0.3518 | 19.0 |
| No log | 35.0 | 420 | 1.3828 | 0.4614 | 0.1369 | 0.357 | 0.3727 | 19.0 |
| No log | 36.0 | 432 | 1.3797 | 0.4551 | 0.1369 | 0.357 | 0.3727 | 19.0 |
| No log | 37.0 | 444 | 1.3528 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| No log | 38.0 | 456 | 1.3716 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| No log | 39.0 | 468 | 1.4217 | 0.4429 | 0.124 | 0.3449 | 0.3606 | 19.0 |
| No log | 40.0 | 480 | 1.4128 | 0.4429 | 0.124 | 0.3449 | 0.3606 | 19.0 |
| No log | 41.0 | 492 | 1.3495 | 0.4429 | 0.124 | 0.3449 | 0.3606 | 19.0 |
| 1.33 | 42.0 | 504 | 1.3608 | 0.4397 | 0.1117 | 0.348 | 0.3636 | 19.0 |
| 1.33 | 43.0 | 516 | 1.4052 | 0.4605 | 0.1246 | 0.3688 | 0.3845 | 19.0 |
| 1.33 | 44.0 | 528 | 1.3969 | 0.4605 | 0.1435 | 0.3688 | 0.3845 | 19.0 |
| 1.33 | 45.0 | 540 | 1.3768 | 0.4551 | 0.1369 | 0.357 | 0.3727 | 19.0 |
| 1.33 | 46.0 | 552 | 1.3903 | 0.4429 | 0.124 | 0.3449 | 0.3606 | 19.0 |
| 1.33 | 47.0 | 564 | 1.3829 | 0.4458 | 0.1395 | 0.3547 | 0.3628 | 19.0 |
| 1.33 | 48.0 | 576 | 1.3972 | 0.4551 | 0.1369 | 0.357 | 0.3727 | 19.0 |
| 1.33 | 49.0 | 588 | 1.4015 | 0.4429 | 0.124 | 0.3449 | 0.3606 | 19.0 |
| 1.33 | 50.0 | 600 | 1.3791 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 51.0 | 612 | 1.4205 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 52.0 | 624 | 1.4269 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 53.0 | 636 | 1.3988 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 54.0 | 648 | 1.4126 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 55.0 | 660 | 1.4178 | 0.4429 | 0.124 | 0.3449 | 0.3606 | 19.0 |
| 1.33 | 56.0 | 672 | 1.4674 | 0.4332 | 0.1189 | 0.3408 | 0.3565 | 19.0 |
| 1.33 | 57.0 | 684 | 1.4871 | 0.4543 | 0.1403 | 0.3546 | 0.3703 | 19.0 |
| 1.33 | 58.0 | 696 | 1.4709 | 0.4547 | 0.1365 | 0.3567 | 0.3723 | 19.0 |
| 1.33 | 59.0 | 708 | 1.4891 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 60.0 | 720 | 1.5033 | 0.4398 | 0.1109 | 0.3289 | 0.3446 | 19.0 |
| 1.33 | 61.0 | 732 | 1.4830 | 0.4398 | 0.1109 | 0.3289 | 0.3446 | 19.0 |
| 1.33 | 62.0 | 744 | 1.4642 | 0.4246 | 0.1042 | 0.335 | 0.3507 | 19.0 |
| 1.33 | 63.0 | 756 | 1.4480 | 0.4246 | 0.1042 | 0.335 | 0.3507 | 19.0 |
| 1.33 | 64.0 | 768 | 1.4312 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 65.0 | 780 | 1.4761 | 0.4378 | 0.1247 | 0.3458 | 0.3615 | 19.0 |
| 1.33 | 66.0 | 792 | 1.4705 | 0.4378 | 0.1247 | 0.3458 | 0.3615 | 19.0 |
| 1.33 | 67.0 | 804 | 1.4665 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 68.0 | 816 | 1.4700 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 69.0 | 828 | 1.4753 | 0.4493 | 0.124 | 0.3515 | 0.3669 | 19.0 |
| 1.33 | 70.0 | 840 | 1.4910 | 0.4351 | 0.113 | 0.3354 | 0.351 | 19.0 |
| 1.33 | 71.0 | 852 | 1.4857 | 0.4586 | 0.1505 | 0.3589 | 0.3746 | 19.0 |
| 1.33 | 72.0 | 864 | 1.4965 | 0.4481 | 0.1399 | 0.3585 | 0.3727 | 19.0 |
| 1.33 | 73.0 | 876 | 1.5141 | 0.4481 | 0.1399 | 0.3585 | 0.3727 | 19.0 |
| 1.33 | 74.0 | 888 | 1.5162 | 0.4407 | 0.1358 | 0.3534 | 0.3687 | 19.0 |
| 1.33 | 75.0 | 900 | 1.5005 | 0.4523 | 0.1439 | 0.3525 | 0.3682 | 19.0 |
| 1.33 | 76.0 | 912 | 1.4910 | 0.417 | 0.1126 | 0.3258 | 0.3396 | 19.0 |
| 1.33 | 77.0 | 924 | 1.4811 | 0.4174 | 0.1143 | 0.3375 | 0.3513 | 19.0 |
| 1.33 | 78.0 | 936 | 1.4698 | 0.4312 | 0.1281 | 0.3534 | 0.3687 | 19.0 |
| 1.33 | 79.0 | 948 | 1.4688 | 0.4298 | 0.1281 | 0.3522 | 0.3666 | 19.0 |
| 1.33 | 80.0 | 960 | 1.4665 | 0.4312 | 0.1281 | 0.3534 | 0.3687 | 19.0 |
| 1.33 | 81.0 | 972 | 1.4879 | 0.4601 | 0.1469 | 0.3684 | 0.3838 | 19.0 |
| 1.33 | 82.0 | 984 | 1.4899 | 0.4601 | 0.1469 | 0.3684 | 0.3838 | 19.0 |
| 1.33 | 83.0 | 996 | 1.4859 | 0.4601 | 0.1469 | 0.3684 | 0.3838 | 19.0 |
| 0.5425 | 84.0 | 1008 | 1.4906 | 0.4645 | 0.1549 | 0.3684 | 0.3838 | 19.0 |
| 0.5425 | 85.0 | 1020 | 1.4987 | 0.4547 | 0.1424 | 0.3567 | 0.3723 | 19.0 |
| 0.5425 | 86.0 | 1032 | 1.4982 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 87.0 | 1044 | 1.4928 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 88.0 | 1056 | 1.4995 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 89.0 | 1068 | 1.4994 | 0.4547 | 0.1424 | 0.3567 | 0.3723 | 19.0 |
| 0.5425 | 90.0 | 1080 | 1.5050 | 0.4547 | 0.1424 | 0.3567 | 0.3723 | 19.0 |
| 0.5425 | 91.0 | 1092 | 1.5118 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 92.0 | 1104 | 1.5085 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 93.0 | 1116 | 1.5093 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 94.0 | 1128 | 1.5149 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 95.0 | 1140 | 1.5164 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 96.0 | 1152 | 1.5165 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 97.0 | 1164 | 1.5167 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 98.0 | 1176 | 1.5171 | 0.4611 | 0.149 | 0.363 | 0.3787 | 19.0 |
| 0.5425 | 99.0 | 1188 | 1.5180 | 0.4709 | 0.1615 | 0.3748 | 0.3905 | 19.0 |
| 0.5425 | 100.0 | 1200 | 1.5181 | 0.4709 | 0.1615 | 0.3748 | 0.3905 | 19.0 |
### Framework versions
- Transformers 4.39.0
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2