scottschmidt committed
Commit efb1f2a
1 Parent(s): 85ba1ba

encoder-encoder -> encoder-decoder


In the [paper](https://arxiv.org/abs/2107.07653):
`TAPEX is conceptually simple and easy to implement. In this paper, we regard the pre-training as a sequence generation task and employ an encoder-decoder model`

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -14,7 +14,7 @@ TAPEX was proposed in [TAPEX: Table Pre-training via Learning a Neural SQL Execu
 
 TAPEX (**Ta**ble **P**re-training via **Ex**ecution) is a conceptually simple and empirically powerful pre-training approach to empower existing models with *table reasoning* skills. TAPEX realizes table pre-training by learning a neural SQL executor over a synthetic corpus, which is obtained by automatically synthesizing executable SQL queries.
 
-TAPEX is based on the BART architecture, the transformer encoder-encoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
+TAPEX is based on the BART architecture, the transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
 
 ## Intended Uses
 
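
To make the encoder-decoder point concrete: in 🤗 Transformers, TAPEX checkpoints load directly as BART-style seq2seq models. A minimal sketch, assuming the `microsoft/tapex-base` checkpoint and a Transformers version that ships `TapexTokenizer`; the exact generated output is illustrative:

```python
import pandas as pd
from transformers import TapexTokenizer, BartForConditionalGeneration

# TAPEX reuses BART's encoder-decoder architecture, so the checkpoint
# loads into BartForConditionalGeneration rather than an encoder-only class.
tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-base")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-base")

# A small table plus an SQL query; the tokenizer flattens the table
# into the encoder's input sequence.
data = {"year": [1896, 1900, 1904], "city": ["athens", "paris", "st. louis"]}
table = pd.DataFrame.from_dict(data)
query = "select city where year = 1900"

encoding = tokenizer(table=table, query=query, return_tensors="pt")

# The autoregressive decoder generates the execution result as text.
outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))  # e.g. [' paris']
```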