
This is a t5-small model fine-tuned on the wikisql dataset for the English-to-SQL text2text translation task.

To load the model (required packages: !pip install transformers sentencepiece):

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("dbernsohn/t5_wikisql_en2SQL")
model = AutoModelForSeq2SeqLM.from_pretrained("dbernsohn/t5_wikisql_en2SQL")

You can then use the model to translate plain-English questions into SQL queries.

query = "what are the names of all the people in the USA?"
input_text = f"translate English to Sql: {query} </s>"
features = tokenizer([input_text], return_tensors='pt')

# Move the model to the GPU first; drop the .cuda() calls to run on CPU instead.
model = model.cuda()
output = model.generate(input_ids=features['input_ids'].cuda(),
                        attention_mask=features['attention_mask'].cuda())

tokenizer.decode(output[0], skip_special_tokens=True)
# Output: "SELECT Name FROM table WHERE Country = USA"

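For convenience, the snippet above can be wrapped in a small helper that batches questions and runs on whichever device the model currently sits on. This is only a usage sketch around the published checkpoint; the translate_to_sql name and the max_length setting are illustrative choices, not part of the original card.

def translate_to_sql(questions, max_length=64):
    # Run on whichever device the model currently lives on (CPU or GPU).
    device = next(model.parameters()).device
    inputs = tokenizer(
        [f"translate English to Sql: {q} </s>" for q in questions],
        return_tensors="pt",
        padding=True,
    ).to(device)
    outputs = model.generate(**inputs, max_length=max_length)
    return [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

translate_to_sql(["what are the names of all the people in the USA?"])
# e.g. ['SELECT Name FROM table WHERE Country = USA'] (the example output above)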
The whole training process and hyperparameters can be found in my GitHub repo.
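For illustration only, here is a minimal sketch of how wikisql examples could be turned into text2text pairs for this kind of fine-tuning. It assumes the Hugging Face wikisql dataset with its question and sql.human_readable fields; the actual preprocessing and hyperparameters are the ones in the GitHub repo.

from datasets import load_dataset

# Assumption: the Hugging Face "wikisql" dataset (may require trust_remote_code
# in recent datasets versions); the repo's own preprocessing may differ.
wikisql = load_dataset("wikisql", split="train")

def to_text2text(example):
    return {
        "input_text": f"translate English to Sql: {example['question']} </s>",
        "target_text": example["sql"]["human_readable"],
    }

train_pairs = wikisql.map(to_text2text)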

Created by Dor Bernsohn
