---
language: en
tags:
- pytorch
- question-answering
datasets:
- squad2
metrics:
- exact
- f1
widget:
- text: "What discipline did Winkelmann create?"
  context: "Johann Joachim Winckelmann was a German art historian and archaeologist. He was a pioneering Hellenist who first articulated the difference between Greek, Greco-Roman and Roman art. The prophet and founding hero of modern archaeology, Winckelmann was one of the founders of scientific archaeology and first applied the categories of style on a large, systematic basis to the history of art."
---

# bart-large-finetuned-squad2

## Model description

This model is based on [facebook/bart-large](https://huggingface.co/facebook/bart-large) and was fine-tuned on [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/). The corresponding papers can be found [here (model)](https://arxiv.org/pdf/1910.13461.pdf) and [here (data)](https://arxiv.org/abs/1806.03822).

## How to use

```python
from transformers import pipeline

model_name = "phiyodr/bart-large-finetuned-squad2"
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
inputs = {
    'question': 'What discipline did Winkelmann create?',
    'context': 'Johann Joachim Winckelmann was a German art historian and archaeologist. He was a pioneering Hellenist who first articulated the difference between Greek, Greco-Roman and Roman art. "The prophet and founding hero of modern archaeology", Winckelmann was one of the founders of scientific archaeology and first applied the categories of style on a large, systematic basis to the history of art.'
}
nlp(inputs)
```
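
Under the hood, an extractive QA pipeline scores candidate (start, end) token pairs from the model's start and end logits and returns the highest-scoring valid span. A minimal sketch of that selection step, using made-up logits and whitespace tokens instead of the real BART tokenizer (function name and toy values are illustrative, not the pipeline's actual internals):

```python
def best_span(start_logits, end_logits, tokens, max_answer_len=15):
    """Pick the (start, end) pair with the highest combined logit score."""
    best_s, best_e, best_score = 0, 0, float("-inf")
    for s in range(len(tokens)):
        # only consider spans of bounded length that end at or after s
        for e in range(s, min(s + max_answer_len, len(tokens))):
            score = start_logits[s] + end_logits[e]
            if score > best_score:
                best_s, best_e, best_score = s, e, score
    return " ".join(tokens[best_s:best_e + 1])

# Toy example: the logits peak on "scientific archaeology".
tokens = ["Winckelmann", "founded", "scientific", "archaeology", "."]
start_logits = [0.1, 0.2, 5.0, 0.3, -1.0]
end_logits = [0.0, 0.1, 0.2, 4.0, -1.0]
print(best_span(start_logits, end_logits, tokens))  # scientific archaeology
```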

## Training procedure

```
{
  "base_model": "facebook/bart-large",
  "do_lower_case": True,
  "learning_rate": 3e-5,
  "num_train_epochs": 4,
  "max_seq_length": 384,
  "doc_stride": 128,
  "max_query_length": 64,
  "batch_size": 96
}
```
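
`max_seq_length` and `doc_stride` control how contexts longer than the model's input budget are handled: the context is split into overlapping windows so every token appears in at least one window. A simplified sketch of that windowing (the special-token budget of 3 is an assumption for illustration; the real preprocessing lives in the transformers SQuAD utilities and depends on the tokenizer):

```python
def split_into_windows(doc_tokens, max_seq_length=384, max_query_length=64, doc_stride=128):
    """Split a long context into overlapping token windows.

    Assumes 3 special tokens alongside the question; the exact
    budget depends on the tokenizer, so this is illustrative only.
    """
    window = max_seq_length - max_query_length - 3  # tokens left for context
    spans = []
    start = 0
    while start < len(doc_tokens):
        spans.append(doc_tokens[start:start + window])
        if start + window >= len(doc_tokens):
            break
        start += doc_stride  # each window starts doc_stride tokens later
    return spans

# A 700-token context yields overlapping windows 128 tokens apart.
windows = split_into_windows(list(range(700)))
print([w[0] for w in windows])  # [0, 128, 256, 384]
```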

## Eval results

- Data: [dev-v2.0.json](https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v2.0.json)
- Script: [evaluate-v2.0.py](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/) (original script from [here](https://github.com/huggingface/transformers/blob/master/examples/question-answering/README.md))

```
{
  "exact": 81.96748926134929,
  "f1": 85.93825235371045,
  "total": 11873,
  "HasAns_exact": 78.71120107962213,
  "HasAns_f1": 86.6641144054667,
  "HasAns_total": 5928,
  "NoAns_exact": 85.21446593776282,
  "NoAns_f1": 85.21446593776282,
  "NoAns_total": 5945
}
```
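
The overall scores are the example-weighted averages of the HasAns and NoAns splits, which gives a quick sanity check on the numbers above:

```python
# Recombine the per-split scores into the overall exact-match score.
has_ans_exact, has_ans_total = 78.71120107962213, 5928
no_ans_exact, no_ans_total = 85.21446593776282, 5945

total = has_ans_total + no_ans_total  # 11873
overall_exact = (has_ans_exact * has_ans_total + no_ans_exact * no_ans_total) / total
print(round(overall_exact, 5))  # 81.96749
```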