---
license: mit
base_model: facebook/bart-large-cnn
tags:
- generated_from_trainer
metrics:
- rouge
- bleu
model-index:
- name: HealthScienceBARTMainSections
  results: []
---

# HealthScienceBARTMainSections

This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.8753
- ROUGE-1: 57.7808
- ROUGE-2: 23.8942
- ROUGE-L: 42.2385
- ROUGE-Lsum: 54.2817
- BERTScore precision: 83.4992
- BERTScore recall: 85.0665
- BERTScore F1: 84.2729
- BLEU: 0.1888
- Gen Len: 234.2871
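
As a quick usage sketch (not an official example from the model authors), the checkpoint can be loaded with the standard `transformers` summarization pipeline. The repo id below is a placeholder; substitute the actual Hub path or a local directory:

```python
# Minimal inference sketch. The repo id is a placeholder, not a confirmed
# location for this checkpoint; replace it with the real Hub path.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="your-username/HealthScienceBARTMainSections",  # hypothetical repo id
)

article = "..."  # a health-science article section to summarize

# The reported Gen Len averages ~234 tokens, so allow fairly long outputs.
result = summarizer(article, max_length=256, min_length=64, truncation=True)
print(result[0]["summary_text"])
```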

## Model description

A sequence-to-sequence summarization model obtained by fine-tuning [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn). Judging by its name, it appears to target summarization of the main sections of health-science articles, but the training data and task details were not recorded by the Trainer.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `Seq2SeqTrainingArguments` configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
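
As referenced above, these values map onto `transformers` `Seq2SeqTrainingArguments` roughly as follows. This is a reconstruction from the reported values, not the original training script; the output directory is a placeholder:

```python
# Approximate Seq2SeqTrainingArguments matching the reported hyperparameters.
# Reconstructed from the list above; the actual training script is unpublished.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="HealthScienceBARTMainSections",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,  # effective train batch size: 1 * 16 = 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, as reported above,
    # matches the library defaults:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    predict_with_generate=True,  # assumption: needed for ROUGE/BLEU at eval time
)
```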

### Training results

| Training Loss | Epoch  | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | BERTScore P | BERTScore R | BERTScore F1 | BLEU   | Gen Len  |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------------------:|:----------------:|:------------:|:------:|:--------:|
| 5.7855        | 0.0835 | 100  | 5.6539          | 48.5675 | 17.0972 | 33.6039 | 45.6212   | 80.1089             | 82.0896          | 81.0838      | 0.1318 | 234.2871 |
| 5.2884        | 0.1671 | 200  | 5.1333          | 51.1112 | 18.5297 | 35.438  | 47.6657   | 80.5123             | 82.7897          | 81.6311      | 0.1454 | 234.2871 |
| 4.9931        | 0.2506 | 300  | 4.8074          | 52.1485 | 19.5231 | 36.7572 | 48.8489   | 81.0844             | 83.2739          | 82.1612      | 0.1534 | 234.2871 |
| 4.6538        | 0.3342 | 400  | 4.5657          | 52.6901 | 20.3073 | 37.8168 | 49.4253   | 81.3851             | 83.5259          | 82.438       | 0.1580 | 234.2871 |
| 4.435         | 0.4177 | 500  | 4.3963          | 54.6727 | 20.9559 | 39.1501 | 51.7226   | 82.5534             | 83.8521          | 83.1954      | 0.1618 | 234.2871 |
| 4.4327        | 0.5013 | 600  | 4.2367          | 55.3497 | 21.849  | 39.8267 | 51.9765   | 82.6498             | 84.2237          | 83.4265      | 0.1698 | 234.2871 |
| 4.2704        | 0.5848 | 700  | 4.1312          | 56.2031 | 22.5317 | 40.6962 | 52.8736   | 82.966              | 84.4887          | 83.7178      | 0.1762 | 234.2871 |
| 4.2211        | 0.6684 | 800  | 4.0373          | 56.1405 | 22.9558 | 41.2482 | 52.772    | 82.9397             | 84.6224          | 83.7695      | 0.1800 | 234.2871 |
| 4.0727        | 0.7519 | 900  | 3.9672          | 57.5881 | 23.6311 | 41.712  | 53.9676   | 83.2595             | 84.8583          | 84.0486      | 0.1850 | 234.2871 |
| 4.0741        | 0.8355 | 1000 | 3.9182          | 57.2156 | 23.6916 | 42.074  | 53.7327   | 83.3537             | 84.9605          | 84.1466      | 0.1868 | 234.2871 |
| 3.8563        | 0.9190 | 1100 | 3.8753          | 57.7808 | 23.8942 | 42.2385 | 54.2817   | 83.4992             | 85.0665          | 84.2729      | 0.1888 | 234.2871 |
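
Metrics with these names are what the Hugging Face `evaluate` library produces. A minimal sketch of computing them on a toy prediction/reference pair follows; this is not the card's original evaluation code, and the scaling applied to the reported scores is an inference:

```python
# Sketch of computing the reported metric families with the `evaluate`
# library. Note that the table appears to report ROUGE/BERTScore scaled
# by 100, while BLEU is left on the raw 0-1 scale.
import evaluate

rouge = evaluate.load("rouge")
bleu = evaluate.load("bleu")
bertscore = evaluate.load("bertscore")

predictions = ["the model generated summary"]  # toy example
references = ["the reference summary"]

print(rouge.compute(predictions=predictions, references=references))
print(bleu.compute(predictions=predictions, references=[[r] for r in references]))
print(bertscore.compute(predictions=predictions, references=references, lang="en"))
```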


### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1