---
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: meeting-sensai-2
  results: []
---

# meeting-sensai-2

This model is a fine-tuned version of [raquelclemente/meeting-sensai](https://huggingface.co/raquelclemente/meeting-sensai) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7555
- Rouge1: 0.3039
- Rouge2: 0.1453
- Rougel: 0.2534
- Rougelsum: 0.2529

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0005
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 5
- num_epochs: 4
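For reference, these settings correspond roughly to the following `Seq2SeqTrainingArguments` in 🤗 Transformers. This is a minimal sketch, not the original training script: the dataset, preprocessing, and metric wiring are not documented on this card, and the evaluation cadence (`eval_steps=30`) is inferred from the results table below.

```python
# A minimal sketch, assuming a Seq2SeqTrainer setup; the dataset and
# metric function are placeholders, since they are not documented here.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base_model = "raquelclemente/meeting-sensai"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSeq2SeqLM.from_pretrained(base_model)

args = Seq2SeqTrainingArguments(
    output_dir="meeting-sensai-2",
    learning_rate=5e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=16,  # 2 x 16 = total train batch size of 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=5,
    num_train_epochs=4,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the optimizer defaults.
    evaluation_strategy="steps",     # inferred: the table logs eval every 30 steps
    eval_steps=30,
    predict_with_generate=True,      # ROUGE requires generated summaries
)

# trainer = Seq2SeqTrainer(
#     model=model,
#     args=args,
#     train_dataset=...,   # not documented on this card
#     eval_dataset=...,    # not documented on this card
#     tokenizer=tokenizer,
#     compute_metrics=..., # e.g. a ROUGE function; see the sketch at the end
# )
# trainer.train()
```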
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
| No log        | 0.09  | 30   | 3.1865          | 0.1936 | 0.0509 | 0.1606 | 0.1611    |
| No log        | 0.18  | 60   | 2.9928          | 0.2636 | 0.0961 | 0.1951 | 0.1969    |
| No log        | 0.28  | 90   | 2.9472          | 0.2667 | 0.0921 | 0.2097 | 0.2115    |
| No log        | 0.37  | 120  | 2.8506          | 0.2755 | 0.1151 | 0.2243 | 0.2243    |
| No log        | 0.46  | 150  | 2.8883          | 0.2505 | 0.1134 | 0.2057 | 0.2056    |
| No log        | 0.55  | 180  | 2.8464          | 0.2946 | 0.1229 | 0.2449 | 0.2446    |
| No log        | 0.65  | 210  | 2.8274          | 0.2668 | 0.1005 | 0.2112 | 0.2120    |
| No log        | 0.74  | 240  | 2.8153          | 0.2960 | 0.1268 | 0.2392 | 0.2384    |
| No log        | 0.83  | 270  | 2.7803          | 0.2940 | 0.1203 | 0.2323 | 0.2312    |
| No log        | 0.92  | 300  | 2.8129          | 0.2966 | 0.1083 | 0.2447 | 0.2444    |
| No log        | 1.02  | 330  | 2.7478          | 0.2977 | 0.1152 | 0.2388 | 0.2382    |
| No log        | 1.11  | 360  | 2.7482          | 0.2905 | 0.1135 | 0.2379 | 0.2369    |
| No log        | 1.2   | 390  | 2.7646          | 0.3215 | 0.1260 | 0.2590 | 0.2594    |
| No log        | 1.29  | 420  | 2.7763          | 0.3164 | 0.1273 | 0.2536 | 0.2524    |
| No log        | 1.38  | 450  | 2.8300          | 0.2867 | 0.1198 | 0.2182 | 0.2186    |
| No log        | 1.48  | 480  | 2.7683          | 0.3313 | 0.1567 | 0.2647 | 0.2643    |
| 2.7437        | 1.57  | 510  | 2.7669          | 0.3004 | 0.1313 | 0.2545 | 0.2541    |
| 2.7437        | 1.66  | 540  | 2.7242          | 0.2960 | 0.1394 | 0.2404 | 0.2409    |
| 2.7437        | 1.75  | 570  | 2.7565          | 0.3042 | 0.1172 | 0.2475 | 0.2463    |
| 2.7437        | 1.85  | 600  | 2.7866          | 0.2994 | 0.1300 | 0.2375 | 0.2373    |
| 2.7437        | 1.94  | 630  | 2.7293          | 0.3122 | 0.1306 | 0.2611 | 0.2581    |
| 2.7437        | 2.03  | 660  | 2.7398          | 0.3194 | 0.1314 | 0.2504 | 0.2506    |
| 2.7437        | 2.12  | 690  | 2.7183          | 0.3109 | 0.1374 | 0.2591 | 0.2588    |
| 2.7437        | 2.22  | 720  | 2.7929          | 0.3184 | 0.1454 | 0.2562 | 0.2562    |
| 2.7437        | 2.31  | 750  | 2.8156          | 0.3360 | 0.1566 | 0.2613 | 0.2611    |
| 2.7437        | 2.4   | 780  | 2.7750          | 0.3125 | 0.1364 | 0.2496 | 0.2494    |
| 2.7437        | 2.49  | 810  | 2.8071          | 0.2928 | 0.1434 | 0.2501 | 0.2491    |
| 2.7437        | 2.58  | 840  | 2.7322          | 0.3043 | 0.1403 | 0.2488 | 0.2484    |
| 2.7437        | 2.68  | 870  | 2.7449          | 0.3006 | 0.1437 | 0.2521 | 0.2516    |
| 2.7437        | 2.77  | 900  | 2.7425          | 0.3029 | 0.1479 | 0.2545 | 0.2543    |
| 2.7437        | 2.86  | 930  | 2.7242          | 0.3028 | 0.1355 | 0.2318 | 0.2307    |
| 2.7437        | 2.95  | 960  | 2.7232          | 0.3100 | 0.1474 | 0.2449 | 0.2443    |
| 2.7437        | 3.05  | 990  | 2.7787          | 0.3036 | 0.1457 | 0.2465 | 0.2467    |
| 2.1872        | 3.14  | 1020 | 2.7759          | 0.2957 | 0.1371 | 0.2394 | 0.2375    |
| 2.1872        | 3.23  | 1050 | 2.7896          | 0.3105 | 0.1391 | 0.2403 | 0.2375    |
| 2.1872        | 3.32  | 1080 | 2.7724          | 0.3121 | 0.1498 | 0.2452 | 0.2453    |
| 2.1872        | 3.42  | 1110 | 2.7639          | 0.3204 | 0.1534 | 0.2556 | 0.2563    |
| 2.1872        | 3.51  | 1140 | 2.7673          | 0.3103 | 0.1541 | 0.2529 | 0.2521    |
| 2.1872        | 3.6   | 1170 | 2.7644          | 0.3059 | 0.1399 | 0.2450 | 0.2448    |
| 2.1872        | 3.69  | 1200 | 2.7443          | 0.3186 | 0.1484 | 0.2554 | 0.2562    |
| 2.1872        | 3.78  | 1230 | 2.7517          | 0.3045 | 0.1444 | 0.2522 | 0.2519    |
| 2.1872        | 3.88  | 1260 | 2.7501          | 0.3039 | 0.1453 | 0.2534 | 0.2529    |
| 2.1872        | 3.97  | 1290 | 2.7555          | 0.3039 | 0.1453 | 0.2534 | 0.2529    |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.0
- Datasets 2.1.0
- Tokenizers 0.13.2
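## How to use

The card does not document the architecture, so the following is a minimal inference sketch under the assumption that this is a sequence-to-sequence summarization checkpoint (consistent with the ROUGE evaluation above); the model id and generation settings are illustrative, not documented values.

```python
# Minimal sketch, assuming a seq2seq summarization checkpoint; check the
# checkpoint's config for the actual architecture and maximum input length.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "raquelclemente/meeting-sensai-2"  # assumed Hub id for this model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

transcript = (
    "Alice: Let's move the launch to Friday. "
    "Bob: Agreed. I'll update the release plan and notify QA."
)
inputs = tokenizer(transcript, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```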
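ROUGE scores like those reported above can be recomputed with the `evaluate` library; this sketch uses placeholder texts, since the evaluation set is not documented on this card.

```python
# Sketch of the ROUGE computation; `predictions`/`references` are placeholders
# standing in for the undocumented evaluation set.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["the team moved the launch to friday"]              # model outputs
references = ["the launch was rescheduled to friday by the team"]  # gold summaries
print(rouge.compute(predictions=predictions, references=references))
# {'rouge1': ..., 'rouge2': ..., 'rougeL': ..., 'rougeLsum': ...}
```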