debbiesoon committed
Commit 837eccb
1 Parent(s): f556069

Update README.md

![SGH logo.png](https://s3.amazonaws.com/moonup/production/uploads/1667139981930-631feef1124782a19eff4243.png)

Files changed (1)
  1. README.md +2 -16
README.md CHANGED
@@ -24,9 +24,6 @@ model-index:
     value: 0.3914
 ---
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
-
 # bart_large_summarise_v3
 
 This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on the multi_news dataset.
@@ -40,18 +37,11 @@ It achieves the following results on the evaluation set:
 
 ## Model description
 
-More information needed
+This model was created to generate summaries of news articles.
 
 ## Intended uses & limitations
 
-encoder_max_length = 1024
-decoder_max_length = 512
-
-## Training and evaluation data
-
-More information needed
-
-## Training procedure
+The model takes up to maximum article length of 1024 characters and generates a summary of maximum length of 512 characters.
 
 ### Training hyperparameters
 
@@ -66,10 +56,6 @@ The following hyperparameters were used during training:
 - num_epochs: 10
 - label_smoothing_factor: 0.1
 
-### Training results
-
-
-
 ### Framework versions
 
 - Transformers 4.23.1
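The updated "Intended uses & limitations" section documents hard input and output limits (1024-character articles, 512-character summaries). A minimal sketch of how a caller might enforce the input limit before invoking the model; the helper name `clip_article` is hypothetical and not part of the model card, and the limits are taken at face value as characters, as the card states:

```python
# Hypothetical helper (not from the model card): clip an over-long article
# to the documented encoder limit before summarisation.
ENCODER_MAX_LENGTH = 1024  # maximum article length, per the model card
DECODER_MAX_LENGTH = 512   # maximum summary length, per the model card


def clip_article(text: str, limit: int = ENCODER_MAX_LENGTH) -> str:
    """Truncate an article that exceeds the encoder's maximum length."""
    return text if len(text) <= limit else text[:limit]
```

Note that when calling the model through the `transformers` summarization pipeline, truncation is more commonly delegated to the tokenizer (via its `truncation` and `max_length` arguments), where lengths are counted in tokens rather than characters; the removed lines `encoder_max_length = 1024` / `decoder_max_length = 512` suggest the original limits were token counts.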