---
license: apache-2.0
language:
- en
tags:
- text-generation
- text2text-generation
pipeline_tag: text2text-generation
widget:
- text: "Generate the question based on the answer: boxing [X_SEP] A bolo punch is a punch used in martial arts . A hook is a punch in boxing ."
  example_title: "Example1"
- text: "Generate the question based on the answer: Arthur 's Magazine [X_SEP] Arthur 's Magazine ( 1844–1846 ) was an American literary periodical published in Philadelphia in the 19th century . First for Women is a woman 's magazine published by Bauer Media Group in the USA ."
  example_title: "Example2"
---

# MTL-question-generation

The MTL-question-generation model was proposed in [**MVP: Multi-task Supervised Pre-training for Natural Language Generation**](https://github.com/RUCAIBox/MVP/blob/main/paper.pdf) by Tianyi Tang, Junyi Li, Wayne Xin Zhao, and Ji-Rong Wen.

Detailed information and instructions can be found at [https://github.com/RUCAIBox/MVP](https://github.com/RUCAIBox/MVP).

## Model Description

MTL-question-generation was supervised pre-trained on a mixture of labeled question generation datasets. It is a variant (Single) of our main [MVP](https://huggingface.co/RUCAIBox/mvp) model and follows a standard Transformer encoder-decoder architecture.

MTL-question-generation is specially designed for question generation tasks, such as SQuAD and CoQA.

## Example
```python
>>> from transformers import MvpTokenizer, MvpForConditionalGeneration

>>> tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mtl-question-generation")

>>> inputs = tokenizer(
...     "Generate the question based on the answer: boxing [X_SEP] A bolo punch is a punch used in martial arts . A hook is a punch in boxing .",
...     return_tensors="pt",
... )
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['A bolo punch and a hook are both punches used in what sport?']
```
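The input string above follows a fixed prompt format: the instruction, the target answer, the special `[X_SEP]` separator, and then the supporting passage. A minimal helper for composing such inputs might look like the sketch below (the function name is illustrative, not part of the MVP codebase):

```python
def build_qg_input(answer: str, passage: str) -> str:
    """Compose a question-generation prompt: instruction,
    target answer, [X_SEP] separator, then the supporting passage."""
    return f"Generate the question based on the answer: {answer} [X_SEP] {passage}"

# Reproduces the first widget example above
print(build_qg_input("boxing", "A bolo punch is a punch used in martial arts . A hook is a punch in boxing ."))
```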

## Citation