---
language: "en"
tags:
- pytorch
- question-answering
datasets:
- squad2
metrics:
- exact
- f1
widget:
- text: "What discipline did Winkelmann create?"
  context: "Johann Joachim Winckelmann was a German art historian and archaeologist. He was a pioneering Hellenist who first articulated the difference between Greek, Greco-Roman and Roman art. 'The prophet and founding hero of modern archaeology', Winckelmann was one of the founders of scientific archaeology and first applied the categories of style on a large, systematic basis to the history of art."
---

# roberta-large-finetuned-squad2

## Model description

This model is based on [roberta-large](https://huggingface.co/roberta-large) and was fine-tuned on [SQuAD2.0](https://rajpurkar.github.io/SQuAD-explorer/). The corresponding papers can be found [here (model)](https://arxiv.org/abs/1907.11692) and [here (data)](https://arxiv.org/abs/1806.03822).


## How to use

```python
from transformers import pipeline

model_name = "phiyodr/roberta-large-finetuned-squad2"

# Load the question-answering pipeline with the fine-tuned model
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)

inputs = {
    'question': 'What discipline did Winkelmann create?',
    'context': 'Johann Joachim Winckelmann was a German art historian and archaeologist. He was a pioneering Hellenist who first articulated the difference between Greek, Greco-Roman and Roman art. "The prophet and founding hero of modern archaeology", Winckelmann was one of the founders of scientific archaeology and first applied the categories of style on a large, systematic basis to the history of art.'
}
nlp(inputs)
```


## Training procedure

```
{
  "base_model": "roberta-large",
  "do_lower_case": True,
  "learning_rate": 3e-5,
  "num_train_epochs": 4,
  "max_seq_length": 384,
  "doc_stride": 128,
  "max_query_length": 64,
  "batch_size": 96
}
```
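
The `max_seq_length` and `doc_stride` settings control how contexts longer than the model's input window are split into overlapping chunks during preprocessing. A minimal sketch of that sliding-window idea (`split_into_windows` is an illustrative helper, not a library function; the real preprocessing additionally reserves room in each window for the question and special tokens):

```python
def split_into_windows(tokens, max_tokens_per_window, doc_stride):
    """Cover a token sequence with fixed-size windows, advancing by doc_stride.

    Consecutive windows overlap by (max_tokens_per_window - doc_stride) tokens,
    so an answer span near a window boundary still appears whole in some window.
    """
    windows = []
    start = 0
    while True:
        windows.append(tokens[start:start + max_tokens_per_window])
        if start + max_tokens_per_window >= len(tokens):
            break  # this window already reaches the end of the context
        start += doc_stride
    return windows

# Example: a 500-token context with 300-token windows and a stride of 128
# yields windows starting at tokens 0, 128 and 256.
```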

## Eval results

- Data: [dev-v2.0.json](https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v2.0.json)
- Script: [evaluate-v2.0.py](https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/) (original script from [here](https://github.com/huggingface/transformers/blob/master/examples/question-answering/README.md))

```
{
  "exact": 84.38473848227069,
  "f1": 87.89711571225455,
  "total": 11873,
  "HasAns_exact": 80.9885290148448,
  "HasAns_f1": 88.02335608157898,
  "HasAns_total": 5928,
  "NoAns_exact": 87.77123633305298,
  "NoAns_f1": 87.77123633305298,
  "NoAns_total": 5945
}
```
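
The `exact` and `f1` numbers above follow the official SQuAD metric definitions: exact match after answer normalization, and token-level F1 between prediction and gold answer. A minimal sketch of both metrics for a single prediction/answer pair (mirroring the normalization steps in the official evaluation script; function names here are illustrative):

```python
import collections
import re
import string

def normalize_answer(s):
    """Lowercase, drop punctuation and articles, collapse whitespace."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in set(string.punctuation))
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())

def exact_match(prediction, truth):
    """1 if the normalized strings are identical, else 0."""
    return int(normalize_answer(prediction) == normalize_answer(truth))

def f1_score(prediction, truth):
    """Token-overlap F1 between normalized prediction and gold answer."""
    pred_tokens = normalize_answer(prediction).split()
    truth_tokens = normalize_answer(truth).split()
    common = collections.Counter(pred_tokens) & collections.Counter(truth_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(truth_tokens)
    return 2 * precision * recall / (precision + recall)
```

The corpus-level scores are averages of these per-example values (with SQuAD2.0 additionally crediting empty predictions on unanswerable questions, as in the `NoAns_*` rows above).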