Trained with [run.sh](https://huggingface.co/madlag/bert-large-uncased-whole-word-masking-finetuned-squadv2/blob/main/run.sh), using the `transformers/examples/question_answering` code.
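The linked `run.sh` holds the actual hyper-parameters; as a rough, hypothetical sketch only, an invocation of the stock `run_qa.py` example script might look like the following (flag values here are illustrative assumptions, except the epoch count, which matches the `"epoch": 2.0` in the results):

```shell
# Hypothetical invocation of the transformers question-answering example.
# The real hyper-parameters live in the linked run.sh.
python run_qa.py \
  --model_name_or_path bert-large-uncased-whole-word-masking \
  --dataset_name squad_v2 \
  --version_2_with_negative \
  --do_train \
  --do_eval \
  --num_train_epochs 2 \
  --output_dir ./bert-large-wwm-squadv2
```

`--version_2_with_negative` is what switches the example to SQuAD v2 scoring, where unanswerable questions must be predicted as having no answer.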
Evaluation results: F1 = 85.85, a marked improvement over the 81.9 reported in the original BERT paper, thanks to the "whole-word-masking" variant.

```
{
  "HasAns_exact": 80.58367071524967,
  "HasAns_f1": 86.64594807945029,
  "HasAns_total": 5928,
  "NoAns_exact": 85.06307821698907,
  "NoAns_f1": 85.06307821698907,
  "NoAns_total": 5945,
  "best_exact": 82.82658131895899,
  "best_exact_thresh": 0.0,
  "best_f1": 85.85337995578023,
  "best_f1_thresh": 0.0,
  "epoch": 2.0,
  "eval_samples": 12134,
  "exact": 82.82658131895899,
  "f1": 85.85337995578037,
  "total": 11873
}
```
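As a sanity check on these numbers: the overall `exact` and `f1` scores are the sample-weighted averages of the answerable (`HasAns`) and unanswerable (`NoAns`) subset scores. A minimal sketch, using the counts and scores from the results above:

```python
# Subset scores and example counts, copied from the evaluation results above.
has_ans = {"exact": 80.58367071524967, "f1": 86.64594807945029, "total": 5928}
no_ans = {"exact": 85.06307821698907, "f1": 85.06307821698907, "total": 5945}
total = has_ans["total"] + no_ans["total"]  # 11873 SQuAD v2 dev examples

def weighted(metric):
    # Overall score = average of subset scores, weighted by example count.
    return (has_ans[metric] * has_ans["total"]
            + no_ans[metric] * no_ans["total"]) / total

print(round(weighted("exact"), 2))  # 82.83, matching "exact" above
print(round(weighted("f1"), 2))     # 85.85, matching "f1" above
```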