Commit 78ff981 by lidiya (parent: 16fad28): Add README


---
language: en
tags:
- bart
- seq2seq
- summarization
license: apache-2.0
datasets:
- samsum
widget:
- text: |
    Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
    Philipp: Sure you can use the new Hugging Face Deep Learning Container.
    Jeff: ok.
    Jeff: and how can I get started?
    Jeff: where can I find documentation?
    Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
model-index:
- name: bart-base-samsum
  results:
  - task:
      name: Abstractive Text Summarization
      type: abstractive-text-summarization
    dataset:
      name: "SAMSum Corpus: A Human-annotated Dialogue Dataset for Abstractive Summarization"
      type: samsum
    metrics:
    - name: Validation ROUGE-1
      type: rouge-1
      value: 46.6619
    - name: Validation ROUGE-2
      type: rouge-2
      value: 23.3285
    - name: Validation ROUGE-L
      type: rouge-l
      value: 39.4811
    - name: Test ROUGE-1
      type: rouge-1
      value: 44.9932
    - name: Test ROUGE-2
      type: rouge-2
      value: 21.7286
    - name: Test ROUGE-L
      type: rouge-l
      value: 38.1921
---
## `bart-base-samsum`

This model was obtained by fine-tuning `facebook/bart-base` on the SAMSum dataset.

## Hyperparameters
```json
{
  "dataset_name": "samsum",
  "do_eval": true,
  "do_predict": true,
  "do_train": true,
  "fp16": true,
  "learning_rate": 2e-05,
  "model_name_or_path": "facebook/bart-base",
  "num_train_epochs": 1,
  "output_dir": "/opt/ml/model",
  "per_device_eval_batch_size": 4,
  "per_device_train_batch_size": 4,
  "predict_with_generate": true,
  "gradient_accumulation_steps": 2,
  "weight_decay": 0.01
}
```
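The effective batch size per optimizer step implied by these hyperparameters can be checked with a quick calculation (a minimal sketch; the values come from the JSON above, and single-device training is assumed):

```python
# Effective batch size per optimizer step = per-device batch size
# * gradient accumulation steps * number of devices.
per_device_train_batch_size = 4
gradient_accumulation_steps = 2
num_devices = 1  # assumption: one GPU

effective_batch_size = (
    per_device_train_batch_size * gradient_accumulation_steps * num_devices
)
print(effective_batch_size)  # 8
```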
## Usage

```python
from transformers import pipeline

summarizer = pipeline("summarization", model="lidiya/bart-base-samsum")

conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
Philipp: Sure you can use the new Hugging Face Deep Learning Container.
Jeff: ok.
Jeff: and how can I get started?
Jeff: where can I find documentation?
Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
'''
summarizer(conversation)
```
## Results

| key | value |
| --- | ----- |
| eval_rouge1 | 46.6619 |
| eval_rouge2 | 23.3285 |
| eval_rougeL | 39.4811 |
| eval_rougeLsum | 43.0482 |
| test_rouge1 | 44.9932 |
| test_rouge2 | 21.7286 |
| test_rougeL | 38.1921 |
| test_rougeLsum | 41.2672 |