
T5 for question-generation

This is a t5-base model trained for the answer-aware question generation task. The answer spans are highlighted within the text with special highlight tokens.

You can play with the model using the inference API: just highlight the answer spans with <hl> tokens and end the text with </s>. For example:

<hl> 42 <hl> is the answer to life, the universe and everything. </s>
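
If you want to call the model directly with the transformers library instead of the inference API, a minimal sketch is shown below; the generation settings and the output in the final comment are illustrative assumptions, not part of this model card.

# Minimal sketch: querying the model directly with transformers.
# Generation settings (num_beams, max_length) are illustrative.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("valhalla/t5-base-qg-hl")
model = AutoModelForSeq2SeqLM.from_pretrained("valhalla/t5-base-qg-hl")

# Highlight the answer span with <hl> tokens and end the text with </s>
text = "<hl> 42 <hl> is the answer to life, the universe and everything. </s>"

inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, num_beams=4, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
# e.g. "What is the answer to life, the universe and everything?"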

For more details, see this repo.

Model in action 🚀

You'll need to clone the repo.


# Load the question-generation pipeline from the cloned repo
from pipelines import pipeline

nlp = pipeline("question-generation", model="valhalla/t5-base-qg-hl")
nlp("42 is the answer to life, universe and everything.")
# => [{'answer': '42', 'question': 'What is the answer to life, universe and everything?'}]
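
Note that, unlike the raw inference API input shown above, the pipeline accepts plain text: as the example suggests, it selects candidate answer spans and adds the <hl> and </s> markers itself before generating the questions.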