```python
from transformers import MT5ForConditionalGeneration, AutoTokenizer

# Load the mT5 question-generation checkpoint; the tokenizer comes from the base mT5 model
model = MT5ForConditionalGeneration.from_pretrained("Parth/mT5-question-generator")
tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")
```
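
A minimal generation sketch follows. The `answer: ... context: ...` prompt format is an assumption borrowed from comparable T5 question-generation checkpoints and is not documented here; adjust the input string to whatever format this model was actually trained on.

```python
from transformers import MT5ForConditionalGeneration, AutoTokenizer

model = MT5ForConditionalGeneration.from_pretrained("Parth/mT5-question-generator")
tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")

# Assumed prompt format: "answer: <answer> context: <context>"
text = "answer: Paris context: Paris is the capital and most populous city of France."
inputs = tokenizer(text, return_tensors="pt")

# Beam search tends to give more fluent questions than greedy decoding
output_ids = model.generate(**inputs, max_length=64, num_beams=4, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```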
