Qian Liu committed
Commit e81963a
1 Parent(s): 9a7c476

Update Tapex README.md

Files changed (1): README.md +57 -1
README.md CHANGED
@@ -1,5 +1,61 @@
---
license: mit
---
- TAPEX-base model pre-trained-only model. This model was proposed in [TAPEX: Table Pre-training via Learning a Neural SQL Executor](https://arxiv.org/abs/2107.07653) by Qian Liu, Bei Chen, Jiaqi Guo, Morteza Ziyadi, Zeqi Lin, Weizhu Chen, Jian-Guang Lou. Original repo can be found [here](https://github.com/microsoft/Table-Pretraining).

+ # TAPEX (base-sized model)
+
+ TAPEX was proposed in [TAPEX: Table Pre-training via Learning a Neural SQL Executor](https://openreview.net/forum?id=O50443AsCP) by Qian Liu, Bei Chen, Jiaqi Guo, Morteza Ziyadi, Zeqi Lin, Weizhu Chen, Jian-Guang Lou. The original repo can be found [here](https://github.com/microsoft/Table-Pretraining).
+
+ ## Model description
+
+ TAPEX (**Ta**ble **P**re-training via **Ex**ecution) is a conceptually simple and empirically powerful pre-training approach to empower existing models with *table reasoning* skills. TAPEX realizes table pre-training by learning a neural SQL executor over a synthetic corpus, which is obtained by automatically synthesizing executable SQL queries.
+
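+ As a rough illustration (the real synthetic corpus lives in the original repo; the field names below are hypothetical), each pre-training example pairs an executable SQL query and a flattened table as input with the query's execution result as the target:
+
+ ```python
+ # Illustrative sketch of one synthetic pre-training pair, not the actual corpus format
+ synthetic_example = {
+     "source": "select city where year = 2008 col : year | city row 1 : 2004 | athens row 2 : 2008 | beijing",
+     "target": "beijing",  # the execution result TAPEX learns to generate
+ }
+ ```
+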
+ TAPEX is based on the BART architecture: a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
+
+ ## Intended Uses
+
+ You can use the raw model for simulating neural SQL execution, i.e., employing TAPEX to execute a SQL query on a given table. However, the model is mostly meant to be fine-tuned on a supervised dataset. Currently, TAPEX can be fine-tuned to tackle table question answering and table fact verification tasks. See the [model hub](https://huggingface.co/models?search=tapex) for fine-tuned versions on a task that interests you.
+
+ ### How to Use
+
+ Here is how to use this model in transformers:
+
+ ```python
+ from transformers import TapexTokenizer, BartForConditionalGeneration
+ import pandas as pd
+
+ tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-base")
+ model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-base")
+
+ data = {
+     "year": [1896, 1900, 1904, 2004, 2008, 2012],
+     "city": ["athens", "paris", "st. louis", "athens", "beijing", "london"]
+ }
+ table = pd.DataFrame.from_dict(data)
+
+ # TAPEX accepts uncased input, since it was pre-trained on an uncased corpus
+ query = "select year where city = beijing"
+ encoding = tokenizer(table=table, query=query, return_tensors="pt")
+
+ outputs = model.generate(**encoding)
+
+ print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
+ # ['2008']
+ ```
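+
+ You can also decode the encoded input to see what the model actually consumes: TapexTokenizer flattens the table into a linearized sequence appended after the query. The format shown in the comment below is illustrative and may differ slightly across transformers versions:
+
+ ```python
+ # Inspect the linearized (query + flattened table) input sequence
+ print(tokenizer.decode(encoding["input_ids"][0], skip_special_tokens=True))
+ # Roughly: "select year where city = beijing col : year | city row 1 : 1896 | athens row 2 : ..."
+ ```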
+
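+ The raw model only executes SQL. To ask natural-language questions, use a checkpoint fine-tuned for table question answering; the sketch below assumes the microsoft/tapex-base-finetuned-wtq checkpoint from the model hub and otherwise mirrors the SQL example above:
+
+ ```python
+ # Sketch assuming a WikiTableQuestions fine-tuned checkpoint on the hub
+ tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-base-finetuned-wtq")
+ model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-base-finetuned-wtq")
+
+ question = "in which year did beijing host the olympic games?"
+ encoding = tokenizer(table=table, query=question, return_tensors="pt")
+ outputs = model.generate(**encoding)
+
+ print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
+ # expected: ['2008'] (exact output may vary)
+ ```
+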
+ ### How to Fine-tune
+
+ Please find the fine-tuning script [here](https://github.com/SivilTaram/transformers/tree/add_tapex_bis/examples/research_projects/tapex).
+
+ ### BibTeX entry and citation info
+
+ ```bibtex
+ @inproceedings{
+ liu2022tapex,
+ title={{TAPEX}: Table Pre-training via Learning a Neural {SQL} Executor},
+ author={Qian Liu and Bei Chen and Jiaqi Guo and Morteza Ziyadi and Zeqi Lin and Weizhu Chen and Jian-Guang Lou},
+ booktitle={International Conference on Learning Representations},
+ year={2022},
+ url={https://openreview.net/forum?id=O50443AsCP}
+ }
+ ```