batterydata committed
Commit e3d5280 · 1 parent: f20d59f

Update README.md

README.md CHANGED
@@ -1,11 +1,59 @@
  ---
- language:
- tags:
- - question answering
  license: apache-2.0
- datasets:
---
language: en
tags: question answering
license: apache-2.0
datasets: squad
metrics: squad
---

# BERT-base-cased for QA

**Language model:** bert-base-cased

**Language:** English

**Downstream-task:** Extractive QA

**Training data:** SQuAD v1

**Eval data:** SQuAD v1

**Code:** See [example](https://github.com/ShuHuang/batterybert)

**Infrastructure:** 8x DGX A100
## Hyperparameters

```
batch_size = 32
n_epochs = 2
base_LM_model = "bert-base-cased"
max_seq_len = 386
learning_rate = 5e-5
doc_stride = 128
max_query_length = 64
```
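As a rough guide, these values correspond to the following Hugging Face `Trainer`-style configuration. This is a minimal sketch, not the authors' training script (which lives in the repository linked above); the `output_dir` name and the `tokenize_example` helper are illustrative only.

```python
from transformers import AutoTokenizer, TrainingArguments

# Hyperparameters from the list above, expressed as Trainer settings.
training_args = TrainingArguments(
    output_dir="bert-base-cased-squad-v1",   # hypothetical output directory
    per_device_train_batch_size=32,          # batch_size
    num_train_epochs=2,                      # n_epochs
    learning_rate=5e-5,
)

# SQuAD-style preprocessing: long contexts are split into overlapping
# windows of max_seq_len tokens with a doc_stride-token overlap.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize_example(question: str, context: str):
    return tokenizer(
        question,
        context,
        max_length=386,             # max_seq_len
        stride=128,                 # doc_stride
        truncation="only_second",   # truncate only the context, never the question
        return_overflowing_tokens=True,
        return_offsets_mapping=True,
        padding="max_length",
    )
```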
## Performance

Evaluated on the SQuAD v1.0 dev set.

```
"exact": 81.30,
"f1": 88.58,
```

Evaluated on the battery device dataset.

```
"precision": 67.02,
"recall": 80.15,
```
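SQuAD-style exact-match and F1 scores can be reproduced with the Hugging Face `evaluate` library's `squad` metric. The snippet below is a minimal sketch with toy inputs (the id and answer offsets are made up), not the actual dev-set run.

```python
import evaluate

# The "squad" metric implements the official SQuAD v1 evaluation
# (exact match and token-level F1).
squad_metric = evaluate.load("squad")

# Toy single-example input; a real evaluation loops over the full dev set.
predictions = [{"id": "example-1", "prediction_text": "graphite"}]
references = [{"id": "example-1",
               "answers": {"text": ["graphite"], "answer_start": [42]}}]

print(squad_metric.compute(predictions=predictions, references=references))
# e.g. {'exact_match': 100.0, 'f1': 100.0}
```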
## Usage

### In Transformers

```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline

model_name = "batterydata/bert-base-cased-squad-v1"

# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
    'question': 'Why is model conversion important?',
    'context': 'The option to convert models between FARM and transformers gives freedom to the user and lets people easily switch between frameworks.'
}
res = nlp(QA_input)

# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
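If you want explicit control over answer decoding, the model and tokenizer loaded in step b) can be used directly. The following continuation is a minimal sketch (not part of the original card) that greedily picks the highest-scoring start and end tokens:

```python
# c) Manual inference with the model & tokenizer from step b), reusing QA_input.
import torch

inputs = tokenizer(QA_input['question'], QA_input['context'], return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely start/end positions and decode the answer span.
start = torch.argmax(outputs.start_logits)
end = torch.argmax(outputs.end_logits) + 1
answer = tokenizer.decode(inputs["input_ids"][0][start:end])
print(answer)
```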
## Authors

Shu Huang: `sh2009 [at] cam.ac.uk`

Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation

BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement