nielsr HF staff committed
Commit 958399a
1 Parent(s): d585440

Update README.md

Files changed (1): README.md (+66 −65)

README.md (after this commit):
---
language: en
tags:
- tapex
- table-question-answering
license: mit
---

# TAPEX (large-sized model)

TAPEX was proposed in [TAPEX: Table Pre-training via Learning a Neural SQL Executor](https://arxiv.org/abs/2107.07653) by Qian Liu, Bei Chen, Jiaqi Guo, Morteza Ziyadi, Zeqi Lin, Weizhu Chen, Jian-Guang Lou. The original repo can be found [here](https://github.com/microsoft/Table-Pretraining).

## Model description

TAPEX (**Ta**ble **P**re-training via **Ex**ecution) is a conceptually simple and empirically powerful pre-training approach that endows existing models with *table reasoning* skills. TAPEX realizes table pre-training by learning a neural SQL executor over a synthetic corpus, which is obtained by automatically synthesizing executable SQL queries.
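
To make that concrete, here is a small illustrative sketch (not the authors' actual data pipeline) of what one synthetic pre-training pair looks like: a real SQL engine executes a sampled query over a table, and the model is trained to map the flattened (query, table) input to the execution result.

```python
# Illustrative only: shows the shape of a synthetic pre-training pair for a
# "neural SQL executor". The linearization format below is approximate, not
# TAPEX's exact one.
import sqlite3

import pandas as pd

table = pd.DataFrame({
    "year": [1896, 2012],
    "city": ["athens", "london"],
})

# Execute the sampled SQL query with an actual SQL engine to get the target.
conn = sqlite3.connect(":memory:")
table.to_sql("t", conn, index=False)
sql = "select city from t where year = 2012"
answer = conn.execute(sql).fetchone()[0]  # "london"

# Flatten the table into a single string so a seq2seq model can read it.
flat = "col : " + " | ".join(table.columns) + " " + " ".join(
    f"row {i + 1} : " + " | ".join(str(v) for v in row)
    for i, row in enumerate(table.itertuples(index=False))
)

source = sql + " " + flat  # encoder input
target = answer            # decoder output the model learns to generate
print(source, "->", target)
```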

TAPEX is based on the BART architecture, the transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.

This model is the `tapex-large` model fine-tuned on the [TabFact](https://huggingface.co/datasets/tab_fact) dataset.
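
As a quick sanity check (a sketch, not part of the original card), you can load the checkpoint and confirm it is indeed a BART encoder-decoder with a classification head on top:

```python
from transformers import BartForSequenceClassification

model = BartForSequenceClassification.from_pretrained("microsoft/tapex-large-finetuned-tabfact")
print(type(model.model.encoder).__name__)  # BartEncoder  (bidirectional)
print(type(model.model.decoder).__name__)  # BartDecoder  (autoregressive)
print(model.config.id2label)               # the fact-verification labels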

## Intended Uses

You can use the model for table fact verification.

### How to Use

Here is how to use this model in transformers:

```python
from transformers import TapexTokenizer, BartForSequenceClassification
import pandas as pd

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-finetuned-tabfact")
model = BartForSequenceClassification.from_pretrained("microsoft/tapex-large-finetuned-tabfact")

data = {
    "year": [1896, 1900, 1904, 2004, 2008, 2012],
    "city": ["athens", "paris", "st. louis", "athens", "beijing", "london"]
}
table = pd.DataFrame.from_dict(data)

# tapex accepts uncased input since it is pre-trained on the uncased corpus
query = "beijing hosts the olympic games in 2012"
encoding = tokenizer(table=table, query=query, return_tensors="pt")

outputs = model(**encoding)

# the table says london hosted in 2012, so the statement is refuted
output_id = int(outputs.logits[0].argmax(dim=0))
print(model.config.id2label[output_id])
# Refused
```
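
If you also want a confidence score rather than just the argmax label (a small addition, not in the original card), apply a softmax over the logits:

```python
import torch

# probabilities over the verification labels for the example above
probs = torch.softmax(outputs.logits, dim=-1)[0]
for i, p in enumerate(probs.tolist()):
    print(model.config.id2label[i], f"{p:.3f}")
```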

### How to Eval

Please find the evaluation script [here](https://github.com/SivilTaram/transformers/tree/add_tapex_bis/examples/research_projects/tapex).
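
If you just want a quick accuracy estimate in Python rather than the full script, a rough sketch follows. It assumes the `tab_fact` dataset stores each table as newline-separated rows with `#`-separated cells, and that the model's label ids line up with the dataset's labels; verify both against the dataset card before relying on the numbers.

```python
# Rough accuracy estimate on a handful of TabFact validation examples.
# NOT the official evaluation script linked above.
import pandas as pd
from datasets import load_dataset
from transformers import TapexTokenizer, BartForSequenceClassification

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-finetuned-tabfact")
model = BartForSequenceClassification.from_pretrained("microsoft/tapex-large-finetuned-tabfact")

dataset = load_dataset("tab_fact", "tab_fact", split="validation[:16]")

correct = 0
for example in dataset:
    # assumed TabFact layout: first row is the header, cells joined by '#'
    rows = [r.split("#") for r in example["table_text"].strip().split("\n")]
    table = pd.DataFrame(rows[1:], columns=rows[0])
    encoding = tokenizer(table=table, query=example["statement"],
                         truncation=True, return_tensors="pt")
    pred = int(model(**encoding).logits[0].argmax(dim=0))
    correct += int(pred == example["label"])  # assumes 0/1 label alignment

print(f"accuracy on {len(dataset)} examples: {correct / len(dataset):.2f}")
```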

### BibTeX entry and citation info

```bibtex
@inproceedings{
liu2022tapex,
title={{TAPEX}: Table Pre-training via Learning a Neural {SQL} Executor},
author={Qian Liu and Bei Chen and Jiaqi Guo and Morteza Ziyadi and Zeqi Lin and Weizhu Chen and Jian-Guang Lou},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=O50443AsCP}
}
```