Update README.md

README.md (changed):

````diff
@@ -20,6 +20,9 @@ State-of-the-art lightweight pretrained Transformer-based encoder-decoder model
 Model trained on dataset CNN-DailyMail News with input length = 512, output length = 150
 
 ## How to use
 
+Input for model: prefix + input text
+
+Example: 'summarize: ' + 'Ever noticed how plane seats.....'
 
 ```python
 from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
````