Contributed by valhalla (Suraj Patil)

How to use this model directly from the 🤗/transformers library:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("valhalla/t5-base-e2e-qg")
model = AutoModelForSeq2SeqLM.from_pretrained("valhalla/t5-base-e2e-qg")
```

T5 for question-generation

This is a t5-base model trained for the end-to-end question generation task. Simply input the text and the model will generate multiple questions from it.

You can play with the model using the inference API: just enter the text and see the results!

For more details, see this repo.

Model in action πŸš€

You'll need to clone the repo.

Open In Colab

```python
from pipelines import pipeline

text = "Python is an interpreted, high-level, general-purpose programming language. Created by Guido van Rossum \
and first released in 1991, Python's design philosophy emphasizes code \
readability with its notable use of significant whitespace."

nlp = pipeline("e2e-qg", model="valhalla/t5-base-e2e-qg")
nlp(text)
# => [
#  'Who created Python?',
#  'When was Python first released?',
#  "What is Python's design philosophy?"
# ]
```
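
If you'd rather not depend on the repo's pipeline wrapper, something roughly equivalent can be sketched with plain transformers. Note the assumptions here: the `generate questions:` task prefix and the `<sep>` separator between generated questions are conventions these e2e-qg checkpoints appear to follow, but you should verify both against the repo before relying on them; the generation parameters (`num_beams`, `max_length`) are illustrative, not the repo's exact settings.

```python
def format_e2e_input(text: str) -> str:
    """Prepend the task prefix the e2e-qg checkpoints are assumed to expect."""
    return "generate questions: " + text.strip()


def split_questions(decoded: str) -> list:
    """Split decoded output on the assumed '<sep>' separator, dropping blanks."""
    return [q.strip() for q in decoded.split("<sep>") if q.strip()]


def generate_questions(text: str, model_name: str = "valhalla/t5-base-e2e-qg") -> list:
    """End-to-end sketch; downloads the checkpoint on first call."""
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    inputs = tokenizer(format_e2e_input(text), return_tensors="pt", truncation=True)
    outputs = model.generate(**inputs, max_length=256, num_beams=4)

    # Assumes '<sep>' survives decoding; if your tokenizer registers it as a
    # special token, decode with skip_special_tokens=False and strip pad/eos.
    decoded = tokenizer.decode(outputs[0], skip_special_tokens=True)
    return split_questions(decoded)
```

Call `generate_questions(text)` with the same Python snippet used above to compare its output against the repo's `e2e-qg` pipeline.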