lidiya committed on
Commit 64dffd6 • 1 Parent(s): 78ff981

Update README

Files changed (1): README.md (+20 -14)
README.md CHANGED
@@ -46,6 +46,25 @@ model-index:
---
## `bart-base-samsum`
This model was obtained by fine-tuning `facebook/bart-base` on the SAMSum dataset.
## Hyperparameters
```json
{
@@ -57,7 +76,6 @@ This model was obtained by fine-tuning `facebook/bart-base` on the SAMSum dataset.
"learning_rate": 2e-05,
"model_name_or_path": "facebook/bart-base",
"num_train_epochs": 1,
- "output_dir": "/opt/ml/model",
"per_device_eval_batch_size": 4,
"per_device_train_batch_size": 4,
"predict_with_generate": true,
@@ -65,19 +83,7 @@ This model was obtained by fine-tuning `facebook/bart-base` on the SAMSum dataset.
"weight_decay": 0.01
}
```
- ## Usage
- ```python
- from transformers import pipeline
- summarizer = pipeline("summarization", model="lidiya/bart-base-samsum")
- conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
- Philipp: Sure you can use the new Hugging Face Deep Learning Container.
- Jeff: ok.
- Jeff: and how can I get started?
- Jeff: where can I find documentation?
- Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
- '''
- nlp(conversation)
- ```
## Results
| key | value |
| --- | ----- |
---
## `bart-base-samsum`
This model was obtained by fine-tuning `facebook/bart-base` on the SAMSum dataset.
+
+ ## Usage
+ ```python
+ from transformers import pipeline
+
+ summarizer = pipeline("summarization", model="lidiya/bart-base-samsum")
+ conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
+ Philipp: Sure you can use the new Hugging Face Deep Learning Container.
+ Jeff: ok.
+ Jeff: and how can I get started?
+ Jeff: where can I find documentation?
+ Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
+ '''
+ summarizer(conversation)
+ ```
+
+ ## Training procedure
+ - Colab notebook: https://colab.research.google.com/drive/1RInRjLLso9E2HG_xjA6j8JO3zXzSCBRF?usp=sharing
+
## Hyperparameters
```json
{
"learning_rate": 2e-05,
"model_name_or_path": "facebook/bart-base",
"num_train_epochs": 1,
"per_device_eval_batch_size": 4,
"per_device_train_batch_size": 4,
"predict_with_generate": true,
"weight_decay": 0.01
}
```
+
## Results
| key | value |
| --- | ----- |
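
The hyperparameters section above is a JSON fragment (the diff elides some keys between hunks). A minimal sketch of reading such a config with only the standard library; the `effective_batch` calculation is an illustrative assumption (single device, no gradient accumulation), not something stated in the model card:

```python
import json

# Hyperparameters as listed in the model card (fragment only; the diff
# omits some keys). Note a trailing comma before "}" would make this
# invalid JSON, so none is used here.
raw = """
{
  "learning_rate": 2e-05,
  "model_name_or_path": "facebook/bart-base",
  "num_train_epochs": 1,
  "per_device_eval_batch_size": 4,
  "per_device_train_batch_size": 4,
  "predict_with_generate": true,
  "weight_decay": 0.01
}
"""

config = json.loads(raw)

# Illustrative assumption: with one device and no gradient accumulation,
# the effective train batch size equals the per-device size.
effective_batch = config["per_device_train_batch_size"] * 1
print(config["model_name_or_path"], effective_batch)
```

These keys mirror the `Seq2SeqTrainingArguments` names used by the `transformers` trainer, so a dict like this can be unpacked into that class when reproducing the run.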