MTL-data-to-text

The MTL-data-to-text model was proposed in MVP: Multi-task Supervised Pre-training for Natural Language Generation by Tianyi Tang, Junyi Li, Wayne Xin Zhao and Ji-Rong Wen.

Detailed information and instructions can be found at https://github.com/RUCAIBox/MVP.

Model Description

MTL-data-to-text is supervised pre-trained on a mixture of labeled data-to-text datasets. It is a variant (Single) of our main MVP model and follows a standard Transformer encoder-decoder architecture.
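
As a quick sanity check, the architecture can be inspected from the loaded checkpoint's configuration. This is a minimal sketch assuming the standard transformers API; the attribute values are whatever the released MvpConfig specifies:

>>> from transformers import MvpForConditionalGeneration

>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mtl-data-to-text")
>>> model.config.is_encoder_decoder  # True: encoder-decoder, not decoder-only
>>> model.config.encoder_layers, model.config.decoder_layers  # layer counts recorded in the checkpoint's MvpConfig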

MTL-data-to-text is designed specifically for data-to-text generation tasks, such as KG-to-text generation (WebNLG, DART), table-to-text generation (WikiBio, ToTTo), and generation from meaning representations (MR-to-text, e.g. E2E).
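
All of these tasks feed the model structured input that has been linearized into a flat string. As an illustration only, KG triples can be flattened as below; linearize_triples is a hypothetical helper, not part of transformers, and the "head | relation | tail [SEP]" format simply mirrors the example in the next section:

>>> # Hypothetical helper (not part of the library): flatten KG triples into
>>> # the "head | relation | tail [SEP] ..." string format used in the example below.
>>> def linearize_triples(triples):
...     return "Describe the following data: " + " [SEP] ".join(
...         " | ".join(triple) for triple in triples
...     )
>>> linearize_triples([("Iron Man", "instance of", "Superhero"), ("Stan Lee", "creator", "Iron Man")])
'Describe the following data: Iron Man | instance of | Superhero [SEP] Stan Lee | creator | Iron Man'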

Example

>>> from transformers import MvpTokenizer, MvpForConditionalGeneration

>>> tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")  # MVP variants share the base MVP tokenizer
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mtl-data-to-text")

>>> inputs = tokenizer(
...     "Describe the following data: Iron Man | instance of | Superhero [SEP] Stan Lee | creator | Iron Man",
...     return_tensors="pt",
... )
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['Iron Man is a fictional superhero appearing in American comic books published by Marvel Comics.']
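
Decoding can be tuned through the standard generate() keyword arguments. The beam width and length limit below are illustrative values, not settings recommended by the MVP authors:

>>> generated_ids = model.generate(**inputs, num_beams=5, max_length=100)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)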

Related Models

MVP: https://huggingface.co/RUCAIBox/mvp.

Prompt-based models:

Multi-task models:

Citation

@article{tang2022mvp,
  title={MVP: Multi-task Supervised Pre-training for Natural Language Generation},
  author={Tang, Tianyi and Li, Junyi and Zhao, Wayne Xin and Wen, Ji-Rong},
  journal={arXiv preprint arXiv:2206.12131},
  year={2022},
  url={https://arxiv.org/abs/2206.12131},
}