---
license: mit
datasets:
- mrqa
language:
- en
metrics:
- squad
library_name: adapter-transformers
pipeline_tag: question-answering
---

This is the MADE Adapter for the HotpotQA partition of the MRQA 2019 Shared Task dataset. The adapter was created by Friedman et al. (2021) and should be used with this encoder: https://huggingface.co/UKP-SQuARE/MADE_Encoder

The UKP-SQuARE team created this model repository to simplify the deployment of this model on the UKP-SQuARE platform. The GitHub repository of the original authors is https://github.com/princeton-nlp/MADE

This model contains the same weights as https://huggingface.co/princeton-nlp/MADE/resolve/main/made_tuned_adapters/HotpotQA/model.pt; the only difference is that our repository follows the standard AdapterHub format. You can therefore load the model as follows:

```python
from transformers import RobertaForQuestionAnswering, RobertaTokenizerFast, pipeline

# Load the MADE encoder and attach the HotpotQA adapter to it
model = RobertaForQuestionAnswering.from_pretrained("UKP-SQuARE/MADE_Encoder")
model.load_adapter("UKP-SQuARE/MADE_HotpotQA_Adapter", source="hf")
model.set_active_adapters("HotpotQA")

tokenizer = RobertaTokenizerFast.from_pretrained("UKP-SQuARE/MADE_Encoder")

# Run extractive question answering with the adapter-equipped model
pipe = pipeline("question-answering", model=model, tokenizer=tokenizer)
pipe({"question": "What is the capital of Germany?", "context": "The capital of Germany is Berlin."})
```
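
For this toy example, the pipeline should return the standard `transformers` question-answering output, a dictionary with `score`, `start`, `end`, and `answer` keys, where the expected `answer` is "Berlin".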

Note that you need the adapter-transformers library (https://adapterhub.ml), a drop-in fork of transformers that adds adapter support.
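
If it is not already installed, it can be installed from PyPI (assuming a standard pip environment):

```bash
pip install adapter-transformers
```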

Please refer to the original publication for more information.

Citation:
*Single-dataset Experts for Multi-dataset Question Answering* (Friedman et al., EMNLP 2021)