SQL Query Generation Model

This model is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.3 specialized for SQL query generation.

Model Details

  • Base Model: mistralai/Mistral-7B-Instruct-v0.3
  • Training Method: LoRA (Rank=16, Alpha=32)
  • Task: SQL query generation from natural language instructions
  • Training Framework: Unsloth
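
For reference, below is a minimal peft LoraConfig consistent with the hyperparameters listed above (rank 16, alpha 32). The card names Unsloth as the training framework; the sketch uses the underlying peft API for clarity, and the target modules and dropout are illustrative assumptions rather than confirmed training settings.

from peft import LoraConfig

# Hypothetical LoRA setup matching the listed hyperparameters (r=16, alpha=32).
# target_modules and lora_dropout are assumptions, not the confirmed training configuration.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)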

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model
tokenizer = AutoTokenizer.from_pretrained("exaler/aaa-2-sql-2")
model = AutoModelForCausalLM.from_pretrained("exaler/aaa-2-sql-2", torch_dtype="auto", device_map="auto")

# Format your prompt
instruction = """You are an expert SQL query generator. Database schema:
Table: [dbo].[Users]
Columns: [ID], [Name], [Email], [CreatedDate]
Table: [dbo].[Orders]
Columns: [OrderID], [UserID], [Amount], [Status], [OrderDate]
"""

input_text = "Find all users who placed orders with amount greater than 1000"

prompt = f"<s>[INST] {instruction}\n\n{input_text} [/INST]"

# Generate SQL
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).to(model.device)  # prompt already contains the <s> BOS token
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=False)  # greedy decoding for deterministic SQL
response = tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True)
print(response)
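
If the repository hosts only the LoRA adapter rather than merged weights, the adapter can instead be loaded explicitly on top of the base model. A minimal sketch using peft follows; whether the adapter is published separately from merged weights is an assumption.

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model, then attach the fine-tuned LoRA adapter on top of it
base_model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.3", torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base_model, "exaler/aaa-2-sql-2")
tokenizer = AutoTokenizer.from_pretrained("exaler/aaa-2-sql-2")

# Optionally merge the adapter into the base weights for faster inference:
# model = model.merge_and_unload()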

Training Dataset

The model was trained on a custom dataset of SQL queries with their corresponding natural language descriptions.
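
For illustration, a hypothetical training record in the instruction/input/output layout implied by the prompt format above; the field names and example values are assumptions, not actual dataset contents.

# Hypothetical training record; field names and values are illustrative only
example = {
    "instruction": "You are an expert SQL query generator. Database schema:\n"
                   "Table: [dbo].[Orders]\n"
                   "Columns: [OrderID], [UserID], [Amount], [Status], [OrderDate]",
    "input": "Count the orders placed in 2024",
    "output": "SELECT COUNT(*) FROM [dbo].[Orders] WHERE YEAR([OrderDate]) = 2024;",
}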

Limitations

  • The model is optimized for the specific SQL database schema it was trained on
  • Performance may vary for database schemas significantly different from the training data