---
license: afl-3.0
---

# MWP-BERT

NAACL 2022 Findings Paper: MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving

[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/mwp-bert-a-strong-baseline-for-math-word/math-word-problem-solving-on-mathqa)](https://paperswithcode.com/sota/math-word-problem-solving-on-mathqa?p=mwp-bert-a-strong-baseline-for-math-word)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/mwp-bert-a-strong-baseline-for-math-word/math-word-problem-solving-on-math23k)](https://paperswithcode.com/sota/math-word-problem-solving-on-math23k?p=mwp-bert-a-strong-baseline-for-math-word)

GitHub link: https://github.com/LZhenwen/MWP-BERT/

Please use the "hfl/chinese-bert-wwm-ext" tokenizer with this model.
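
A minimal loading sketch with Hugging Face `transformers`, pairing the recommended tokenizer with this checkpoint. The repository id `invokerliang/MWP-BERT` below is a placeholder assumption; substitute the actual id shown on this model card.

```python
from transformers import AutoTokenizer, AutoModel

# Recommended tokenizer (see note above).
tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-bert-wwm-ext")

# Placeholder repository id -- replace with this model card's actual id.
model = AutoModel.from_pretrained("invokerliang/MWP-BERT")

# Encode a Chinese math word problem and get contextual representations.
problem = "小明有3个苹果，又买了5个，一共有多少个苹果？"
inputs = tokenizer(problem, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)
```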

## Citation

```
@inproceedings{liang2022mwp,
  title={MWP-BERT: Numeracy-Augmented Pre-training for Math Word Problem Solving},
  author={Liang, Zhenwen and Zhang, Jipeng and Wang, Lei and Qin, Wei and Lan, Yunshi and Shao, Jie and Zhang, Xiangliang},
  booktitle={Findings of NAACL 2022},
  pages={997--1009},
  year={2022}
}
```