---
license: mit
widget:
  - text: 'writeWiki: Jupiter'
  - text: 'writeWiki: Sri Lanka'
  - text: 'writeWiki: Language Model'
language:
  - en
datasets:
  - wikipedia
---

# Fine-tuned T5 base model with the Simple English Wikipedia dataset

This model is fine-tuned on articles from Simple English Wikipedia for article generation. Around 25,000 articles were used for training.
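
The exact training pipeline is not described in this card. As a rough illustration only, prefixed (prompt, target) pairs could be built from the Hugging Face `wikipedia` dataset; the `20220301.simple` configuration and the field handling below are assumptions, not the author's actual preprocessing.

```python
# Sketch only: build "writeWiki: <title>" -> article-text pairs
# from Simple English Wikipedia (config name is an assumption).
from datasets import load_dataset

dataset = load_dataset("wikipedia", "20220301.simple", split="train")

def to_pair(example):
    # Prepend the task prefix to the article title; the target is the article body.
    return {
        "input_text": "writeWiki: " + example["title"],
        "target_text": example["text"],
    }

pairs = dataset.map(to_pair, remove_columns=dataset.column_names)
print(pairs[0]["input_text"])
```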

## How to use

Each prompt must begin with the `writeWiki: ` prefix.

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

```python
>>> from transformers import pipeline
>>> generator = pipeline('text2text-generation', model='Suchinthana/T5-Base-Wikigen')
>>> generator("writeWiki: Microcontroller", do_sample=True, max_length=250)
```
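
If you prefer not to use the `pipeline` helper, you can load the tokenizer and model directly and call `generate`. This is a minimal sketch; the sampling settings below mirror the pipeline example and are illustrative, not the author's recommended values.

```python
# Sketch: direct generation without the pipeline helper.
# Sampling parameters are illustrative assumptions, not tuned settings.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Suchinthana/T5-Base-Wikigen")
model = AutoModelForSeq2SeqLM.from_pretrained("Suchinthana/T5-Base-Wikigen")

inputs = tokenizer("writeWiki: Microcontroller", return_tensors="pt")
outputs = model.generate(**inputs, do_sample=True, max_length=250)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```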