---
datasets:
  - squad
tags:
  - question-generation
widget:
  - text: "<hl> 42 <hl> is the answer to life, the universe and everything. </s>"
  - text: "Python is a programming language. It is developed by <hl> Guido Van Rossum <hl>. </s>"
  - text: "Simple is better than <hl> complex <hl>. </s>"
license: mit
---

## T5 for question-generation

This is a t5-small model trained for the answer-aware question generation task. The answer spans are highlighted within the text with special highlight (`<hl>`) tokens.

You can play with the model using the inference API: just highlight the answer spans with `<hl>` tokens and end the text with `</s>`. For example:

`<hl> 42 <hl> is the answer to life, the universe and everything. </s>`
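
If you prefer to query the checkpoint directly with `transformers` instead of the inference API, a minimal sketch is shown below; the Hub id `valhalla/t5-small-qg-hl` and the generation settings are assumptions based on this card's name, not something stated here.

```python
# Minimal sketch: call the checkpoint directly with transformers.
# The Hub id below is an assumption based on this card's name.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "valhalla/t5-small-qg-hl"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Highlight the answer span with <hl> tokens and end the text with </s>,
# mirroring the inference API example above.
text = "<hl> 42 <hl> is the answer to life, the universe and everything. </s>"
inputs = tokenizer(text, return_tensors="pt")

output_ids = model.generate(**inputs, max_length=32, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
# Expected: a question such as "What is the answer to life, the universe and everything?"
```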

For more details, see [this repo](https://github.com/patil-suraj/question_generation).

## Model in action 🚀

You'll need to clone the repo.


```python
from pipelines import pipeline  # pipelines.py from the cloned repo

nlp = pipeline("question-generation")
nlp("42 is the answer to life, universe and everything.")
# => [{'answer': '42', 'question': 'What is the answer to life, universe and everything?'}]
```
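
The pipeline above loads a default checkpoint; if it also accepts an explicit `model` argument (an assumption here, check the repo's documentation), this card's checkpoint could be selected directly:

```python
# Hypothetical: select this card's checkpoint explicitly, assuming the
# repo's pipeline exposes a `model` keyword argument.
nlp = pipeline("question-generation", model="valhalla/t5-small-qg-hl")
nlp("Python is a programming language. It is developed by Guido Van Rossum.")
```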