---
license: apache-2.0
language:
- en
tags:
- text-generation
- text2text-generation
pipeline_tag: text2text-generation
widget:
- text: "Answer the following question: From which country did Angola achieve independence in 1975?"
  example_title: "Example1"
- text: "Answer the following question: what is ce certified [X_SEP] The CE marking is the manufacturer's declaration that the product meets the requirements of the applicable EC directives. Officially, CE is an abbreviation of Conformité Européenne, meaning European conformity."
  example_title: "Example2"
---

# MVP-question-answering
The MVP-question-answering model was proposed in [**MVP: Multi-task Supervised Pre-training for Natural Language Generation**](https://github.com/RUCAIBox/MVP/blob/main/paper.pdf) by Tianyi Tang, Junyi Li, Wayne Xin Zhao, and Ji-Rong Wen.

Detailed information and instructions can be found at [https://github.com/RUCAIBox/MVP](https://github.com/RUCAIBox/MVP).

## Model Description
MVP-question-answering is a prompt-based model: the backbone MVP model is further equipped with prompts pre-trained on labeled question answering datasets. It is a variant (MVP+S) of our main MVP model. It follows a Transformer encoder-decoder architecture with layer-wise prompts.

MVP-question-answering is specially designed for question answering tasks, such as reading comprehension (SQuAD), conversational question answering (CoQA), and closed-book question answering (Natural Questions).

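As the widget examples above show, model inputs are plain strings: an instruction-prefixed question, optionally followed by a supporting passage after an `[X_SEP]` separator. A minimal helper for building such inputs (the function name is hypothetical, for illustration only):

```python
from typing import Optional


def build_qa_input(question: str, context: Optional[str] = None) -> str:
    """Build an input string in the format used by the widget examples above."""
    prompt = f"Answer the following question: {question}"
    if context is not None:
        # [X_SEP] separates the question from the supporting passage
        prompt += f" [X_SEP] {context}"
    return prompt


# Closed-book question (no passage)
print(build_qa_input("From which country did Angola achieve independence in 1975?"))
# Reading-comprehension question with a supporting passage
print(build_qa_input("what is ce certified", "The CE marking is the manufacturer's declaration that the product meets the requirements of the applicable EC directives."))
```

The resulting string can be passed directly to the tokenizer, as in the example below.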
## Example
```python
>>> from transformers import MvpTokenizer, MvpForConditionalGeneration

>>> tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp-question-answering")

>>> inputs = tokenizer(
...     "Answer the following question: From which country did Angola achieve independence in 1975?",
...     return_tensors="pt",
... )
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['Portugal']
```

## Citation