
tscholak/1zha5ono

Fine-tuned weights for PICARD - Parsing Incrementally for Constrained Auto-Regressive Decoding from Language Models based on t5.1.1.lm100k.base.

Training Data

The model has been fine-tuned on the 7,000 training examples of the Spider text-to-SQL dataset. It addresses Spider's zero-shot text-to-SQL translation task, which means it must generalize to SQL databases that were not seen during training.

Training Objective

This model was initialized with t5.1.1.lm100k.base and fine-tuned with the text-to-text generation objective.

Questions are always grounded in a database schema, and the model is trained to predict the SQL query that would be used to answer the question. The input to the model is composed of the user's natural language question, the database identifier, and a list of tables and their columns:

[question] | [db_id] | [table] : [column] ( [content] , [content] ) , [column] ( ... ) , [...] | [table] : ... | ...
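For illustration, this serialization could be produced by a small helper like the following. The function name and the example schema are hypothetical, not part of the official implementation, and whether database content is sampled into the parentheses depends on configuration:

```python
def serialize_input(question, db_id, schema):
    """Serialize a question and database schema into the model's input format.

    `schema` maps each table name to a list of (column, contents) pairs,
    where contents is a (possibly empty) list of sampled database values.
    """
    tables = []
    for table, columns in schema.items():
        cols = " , ".join(
            f"{col} ( {' , '.join(vals)} )" if vals else col
            for col, vals in columns
        )
        tables.append(f"{table} : {cols}")
    return f"{question} | {db_id} | " + " | ".join(tables)

# Hypothetical example, not taken verbatim from Spider:
print(serialize_input(
    "How many singers are there?",
    "concert_singer",
    {"singer": [("singer_id", []), ("name", ["Joe", "Ana"])]},
))
# → How many singers are there? | concert_singer | singer : singer_id , name ( Joe , Ana )
```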

The model outputs the database identifier and the SQL query that will be executed on the database to answer the user's question:

[db_id] | [sql]
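Recovering the SQL query from a raw prediction then amounts to splitting on the first separator (a minimal sketch; the helper name is hypothetical):

```python
def parse_output(prediction):
    """Split a raw model prediction into (db_id, sql)."""
    db_id, _, sql = prediction.partition(" | ")
    return db_id.strip(), sql.strip()

db_id, sql = parse_output("concert_singer | select count(*) from singer")
# db_id == "concert_singer", sql == "select count(*) from singer"
```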

Performance

Out of the box, this model achieves 59.4% exact-set match accuracy and 60.0% execution accuracy on the Spider development set.

Using the PICARD constrained decoding method (see the official PICARD implementation), the model's performance can be improved to 66.6% exact-set match accuracy and 68.4% execution accuracy on the Spider development set.
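PICARD's central idea is to reject beam hypotheses whose partial output can no longer be extended to valid SQL. The toy sketch below illustrates that filtering step with a deliberately simplistic viability check (balanced parentheses); the real implementation uses an incremental SQL parser, and all names here are hypothetical:

```python
def prune_invalid(prefix_ok, hypotheses):
    """Keep only beam hypotheses whose partial decode can still become valid."""
    return [h for h in hypotheses if prefix_ok(h)]

def balanced_prefix(s):
    """Toy stand-in for PICARD's incremental parser: a prefix is viable
    as long as its closing parentheses never outnumber the opening ones."""
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False
    return True

beams = ["select count ( *", "select count ) *", "select name"]
print(prune_invalid(balanced_prefix, beams))
# → ['select count ( *', 'select name']
```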

Usage

Please see the official PICARD repository for scripts and Docker images that support evaluating and serving this model.

References

  1. PICARD - Parsing Incrementally for Constrained Auto-Regressive Decoding from Language Models

  2. Official PICARD code

Citation

@inproceedings{Scholak2021:PICARD,
  author = {Torsten Scholak and Nathan Schucher and Dzmitry Bahdanau},
  title = "{PICARD}: Parsing Incrementally for Constrained Auto-Regressive Decoding from Language Models",
  booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
  month = nov,
  year = "2021",
  publisher = "Association for Computational Linguistics",
  url = "https://aclanthology.org/2021.emnlp-main.779",
  pages = "9895--9901",
}