Qian Liu committed • Commit f1b17e5 • 1 Parent(s): 08b6fcc

Update README.md

README.md CHANGED
@@ -17,34 +17,8 @@ TAPEX is based on the BART architecture, the transformer encoder-decoder (seq2se
 
 ## Intended Uses
 
-
-
-### How to Use
-
-Here is how to use this model in transformers:
-
-```python
-from transformers import TapexTokenizer, BartForConditionalGeneration
-import pandas as pd
-
-tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large")
-model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large")
-
-data = {
-    "year": [1896, 1900, 1904, 2004, 2008, 2012],
-    "city": ["athens", "paris", "st. louis", "athens", "beijing", "london"]
-}
-table = pd.DataFrame.from_dict(data)
-
-# tapex accepts uncased input since it is pre-trained on the uncased corpus
-query = "select year where city = beijing"
-encoding = tokenizer(table=table, query=query, return_tensors="pt")
-
-outputs = model.generate(**encoding)
-
-print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
-# ['2008']
-```
+⚠️ This model checkpoint is **ONLY** intended for fine-tuning on downstream tasks, and you **CANNOT** use it to simulate neural SQL execution, i.e., to employ TAPEX to execute a SQL query on a given table. The checkpoint that can neurally execute SQL queries is available [here](https://huggingface.co/microsoft/tapex-large-sql-execution).
+> This separation into two models for the two intended uses is due to a known issue in BART-large; we recommend readers see [this comment](https://github.com/huggingface/transformers/issues/15559#issuecomment-1062880564) for more details.
 
 ### How to Fine-tune
 
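
For readers landing on this commit: the deleted example still illustrates the API correctly; it was only pointed at the wrong checkpoint. Below is a minimal sketch of the same neural SQL execution flow, retargeted at the microsoft/tapex-large-sql-execution checkpoint that the new note recommends. It simply swaps the checkpoint name into the removed code; nothing else is assumed.

```python
# Neural SQL execution with the dedicated checkpoint: the example removed
# above, with microsoft/tapex-large replaced by the SQL-execution model.
from transformers import TapexTokenizer, BartForConditionalGeneration
import pandas as pd

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large-sql-execution")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large-sql-execution")

data = {
    "year": [1896, 1900, 1904, 2004, 2008, 2012],
    "city": ["athens", "paris", "st. louis", "athens", "beijing", "london"],
}
table = pd.DataFrame.from_dict(data)

# tapex accepts uncased input since it is pre-trained on an uncased corpus
query = "select year where city = beijing"
encoding = tokenizer(table=table, query=query, return_tensors="pt")

outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
# expected output: ['2008']
```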
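Since the checkpoint is now documented as fine-tuning-only, here is a rough sketch of what one supervised training step could look like. It assumes that passing `answer=` to `TapexTokenizer` produces `labels` alongside `input_ids`/`attention_mask`, as described in the transformers TAPEX documentation; the table, question, and answer are toy data, not part of this commit.

```python
# Sketch of a single supervised fine-tuning step on a table-QA pair.
# Assumption: answer= yields labels, so the model returns a loss directly.
import pandas as pd
import torch
from transformers import TapexTokenizer, BartForConditionalGeneration

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large")

table = pd.DataFrame.from_dict({
    "year": [1896, 1900, 1904, 2004, 2008, 2012],
    "city": ["athens", "paris", "st. louis", "athens", "beijing", "london"],
})
question = "in which year did beijing host the olympic games?"  # toy example
answer = "2008"

encoding = tokenizer(table=table, query=question, answer=answer, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()
loss = model(**encoding).loss  # labels present, so a seq2seq loss is returned
loss.backward()
optimizer.step()
optimizer.zero_grad()
```

In practice one would iterate this over batches from a downstream dataset (e.g., table question answering or fact verification) with a dataloader and a learning-rate schedule; the single step above only shows where the tokenizer's table flattening and the seq2seq loss fit together.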