---
license: mit
widget:
- text: 'writeWiki: Jupiter'
- text: 'writeWiki: Sri Lanka'
- text: 'writeWiki: Language Model'
language:
- en
datasets:
- wikipedia
---

### Fine-tuned T5 base model on the Simple English Wikipedia dataset

This model is fine-tuned for article generation on articles from Simple English Wikipedia, using around 25,000 articles for training.

### How to use

Each prompt must begin with the **"writeWiki: "** prefix.

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

```py
>>> from transformers import pipeline
>>> generator = pipeline('text2text-generation', model='Suchinthana/T5-Base-Wikigen')
>>> generator("writeWiki: Microcontroller", do_sample=True, max_length=250)
```
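If you need more control over generation than the pipeline offers, you can also load the tokenizer and model directly. This is a minimal sketch using the standard `transformers` seq2seq API; the generation parameters shown (`do_sample`, `max_length`) mirror the pipeline example above:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and model from the Hub
tokenizer = AutoTokenizer.from_pretrained('Suchinthana/T5-Base-Wikigen')
model = AutoModelForSeq2SeqLM.from_pretrained('Suchinthana/T5-Base-Wikigen')

# Remember the required "writeWiki: " prefix on every prompt
prompt = "writeWiki: Microcontroller"
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling makes the output different on each run
outputs = model.generate(**inputs, do_sample=True, max_length=250)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because `do_sample=True` is set, each run produces a different article; pass a fixed seed via `transformers.set_seed` if you need reproducible output.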