Files changed (1)
  1. README.md +53 -1
README.md CHANGED
@@ -8,11 +8,63 @@ datasets:
  - xsum
  thumbnail: https://huggingface.co/front/thumbnails/distilbart_medium.png
  ---
- ### Usage
+ # Distilbart-cnn-12-6
+
+ ## Table of Contents
+ - [Model Details](#model-details)
+ - [How to Get Started With the Model](#how-to-get-started-with-the-model)
+ - [Uses](#uses)
+ - [Risks, Limitations and Biases](#risks-limitations-and-biases)
+ - [Training](#training)
+ - [Evaluation](#evaluation)
+
+ ## Model Details
+ - **Model Description:** A distilled version of BART fine-tuned for summarization on the CNN/DailyMail dataset.
+ - **Developed by:** Sam Shleifer
+ - **Model Type:** Summarization
+ - **Language(s):** English
+ - **License:** Apache-2.0
+ - **Parent Model:** See the [BART large CNN model](https://huggingface.co/facebook/bart-large-cnn) for more information about the BART large-sized model, which was similarly trained on the [CNN Dailymail](https://huggingface.co/datasets/cnn_dailymail) dataset.
+ - **Resources for more information:**
+   - [BART documentation](https://huggingface.co/docs/transformers/model_doc/bart#transformers.BartForConditionalGeneration)
+   - [BART paper](https://arxiv.org/abs/1910.13461)
+
+ ## How to Get Started With the Model
+
+ ```python
+ from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
+
+ tokenizer = AutoTokenizer.from_pretrained("sshleifer/distilbart-cnn-12-6")
+ model = AutoModelForSeq2SeqLM.from_pretrained("sshleifer/distilbart-cnn-12-6")
+ ```
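+
+ Once loaded, the model can produce a summary with `model.generate`. The snippet below is a minimal sketch: the article text is a placeholder, and the beam-search settings (`num_beams=4`, `max_length=142`, `min_length=56`) are illustrative choices, not prescribed values.
+
+ ```python
+ # Placeholder input text; substitute any article to summarize.
+ ARTICLE = "The tower is 324 metres tall, about the same height as an 81-storey building."
+
+ # Tokenize, truncating to the model's maximum input length.
+ inputs = tokenizer(ARTICLE, return_tensors="pt", max_length=1024, truncation=True)
+
+ # Generate a summary with beam search (settings here are illustrative).
+ summary_ids = model.generate(
+     inputs["input_ids"], num_beams=4, max_length=142, min_length=56
+ )
+ print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])
+ ```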
+
+ ## Uses
+
+ #### Direct Use
+
+ This model can be used for text summarization.
+
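+ For instance, here is a sketch of direct use through the `pipeline` API; the article text is again a placeholder, and the length bounds are illustrative rather than required settings:
+
+ ```python
+ from transformers import pipeline
+
+ # Build a summarization pipeline backed by this checkpoint.
+ summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")
+
+ text = "Long news article text goes here ..."  # placeholder article
+ print(summarizer(text, max_length=130, min_length=30, do_sample=False))
+ ```
+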
+ ## Risks, Limitations and Biases
+
+ ### Limitations
+
+ This model makes use of the [CNN Dailymail](https://huggingface.co/datasets/cnn_dailymail) dataset, an English-language dataset containing just over 300k unique news articles written by journalists at CNN and the Daily Mail.
+ The BCP-47 code for English as generally spoken in the United States is en-US, and the BCP-47 code for English as generally spoken in the United Kingdom is en-GB. It is unknown whether other varieties of English are represented in the data.
+
+ ### Biases
+
+ [Bordia and Bowman (2019)](https://www.aclweb.org/anthology/N19-3002.pdf) explore measuring gender bias and debiasing techniques in the CNN / Dailymail dataset, the Penn Treebank, and WikiText-2. They find that, by their metric, the CNN / Dailymail dataset has slightly lower gender bias than the other datasets, but it still shows evidence of gender bias around words such as 'fragile'.
+
+ Further information, for example regarding uses, out-of-scope uses, and the training procedure for the CNN Dailymail dataset, is available in its [dataset card](https://huggingface.co/datasets/cnn_dailymail).
+
+ ## Training

  This checkpoint should be loaded into `BartForConditionalGeneration.from_pretrained`. See the [BART docs](https://huggingface.co/transformers/model_doc/bart.html?#transformers.BartForConditionalGeneration) for more information.
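+
+ A minimal sketch of that load call, equivalent to the `AutoModelForSeq2SeqLM` example above:
+
+ ```python
+ from transformers import BartForConditionalGeneration
+
+ # Load the checkpoint with the BART-specific class, as suggested above.
+ model = BartForConditionalGeneration.from_pretrained("sshleifer/distilbart-cnn-12-6")
+ ```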
 
+ ## Evaluation
+
  ### Metrics for DistilBART models

  | Model Name | MM Params | Inference Time (MS) | Speedup | Rouge 2 | Rouge-L |