Impossible to prompt

#38
by HuggySSO - opened

I am sorry, but I cannot get this model to produce sensible output from my prompts.
I am using the model provided through Ollama: https://ollama.com/library/sqlcoder
The prompt format in Ollama's Modelfile is:

TEMPLATE """{{ .Prompt }}"""
PARAMETER stop "<|endoftext|>"

My test prompt looks like this:

if prompt := st.chat_input("Enter prompt here.."):

db_schema = f"""
CREATE TABLE stadium (
stadium_id number, --unique id for each stadium
location text, --city and state
name text, --name of the stadium
capacity number, --maximum number of people that can be seated
highest number, --highest number of people that have attended a concert
lowest number, --lowest number of people that have attended a concert
average number --average number of people that have attended a concert
)

CREATE TABLE singer (
singer_id number, --unique id for each singer
name text, --name of the singer
country text, --country of origin of the singer
song_name text, --name of the song
song_release_year text, --year the song was released
age number, --age of the singer
is_male bool --whether or not the singer is male
)

CREATE TABLE concert (
concert_id number, --unique id for each concert
concert_name text, --name of the concert
theme text, --theme of the concert
stadium_id text, --id of the stadium where the concert takes place
year text --year the concert takes place
)

CREATE TABLE singer_in_concert (
concert_id number, --id of the concert
singer_id text --id of the singer
)

-- stadium.stadium_id can be joined with concert.stadium_id
-- singer.singer_id can be joined with singer_in_concert.singer_id
-- concert.concert_id can be joined with singer_in_concert.concert_id
"""

prompt: "Among the artists having concerts in year 2020, which artist has a song whose title equals the name of the stadium on which the concert takes place?"

prompt_format_sqlcoder = f"""
### Task
Generate a single SQL query without any additional information to answer [QUESTION]{prompt}[/QUESTION]
    
### Database Schema
The query will run on a database with the following schema:
{db_schema}

### Answer
Given the database schema, here is the SQL query that answers [QUESTION]{prompt}[/QUESTION]
[SQL]

"""

Then I call the Ollama API with the following data:

data = {
    "model": model,
    "prompt": prompt_format_sqlcoder,
    "stream": stream
}
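
A minimal sketch of such a call, assuming the default local Ollama endpoint and the Python requests library (with stream set to False so the whole completion comes back in one JSON payload):

import requests

# Minimal sketch: send the prompt to a locally running Ollama instance.
# Assumes the default endpoint http://localhost:11434 and stream=False.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": model, "prompt": prompt_format_sqlcoder, "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])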
It does not follow the instructions, hallucinates tables, and always generates completely different queries, even with the exact same prompt.

I would appreciate some help with the prompt format for use with Ollama.

Defog.ai org

Hi @HuggySSO, the link you shared seems to point to the 4-bit quantized version of our model. Just to confirm, were you testing the half-precision or the 4-bit quantized version?

I've not used Ollama before, but when testing our model in half-precision on vLLM, we got the following result with this prompt:

prompt = "### Task\nGenerate a SQL query to answer the following question:\n`Among the artists having concerts in year 2020, which artist has a song whose title equals the name of the stadium on which the concert takes place?`\n\n### Schema\nCREATE TABLE stadium (\nstadium_id number, --unique id for each stadium\nlocation text, --city and state\nname text, --name of the stadium\ncapacity number, --maximum number of people that can be seated\nhighest number, --highest number of people that have attended a concert\nlowest number, --lowest number of people that have attended a concert\naverage number --average number of people that have attended a concert\n)\n\nCREATE TABLE singer (\nsinger_id number, --unique id for each singer\nname text, --name of the singer\ncountry text, --country of origin of the singer\nsong_name text, --name of the song\nsong_release_year text, --year the song was released\nage number, --age of the singer\nis_male bool --whether or not the singer is male\n)\n\nCREATE TABLE concert (\nconcert_id number, --unique id for each concert\nconcert_name text, --name of the concert\ntheme text, --theme of the concert\nstadium_id text, --id of the stadium where the concert takes place\nyear text --year the concert takes place\n)\n\nCREATE TABLE singer_in_concert (\nconcert_id number, --id of the concert\nsinger_id text --id of the singer\n)\n\n-- stadium.stadium_id can be joined with concert.stadium_id\n-- singer.singer_id can be joined with singer_in_concert.singer_id\n-- concert.concert_id can be joined with singer_in_concert.concert_id\n\n### Answer\nGiven the database schema above, here is the SQL query that answers the question:\n```sql\n"

Result:

 SELECT s.name FROM singer s JOIN singer_in_concert sic ON s.singer_id = sic.singer_id JOIN concert c ON sic.concert_id = c.concert_id JOIN stadium st ON c.stadium_id = st.stadium_id WHERE s.song_name = st.name AND c.year = '2020'

Does this result match what you expected?
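
For reference, here is a rough sketch of that half-precision vLLM setup (the model name and decoding parameters are assumptions, not our exact configuration):

from vllm import LLM, SamplingParams

# Minimal sketch: load the model in half-precision and decode greedily.
# The model name is an assumption -- point it at the checkpoint you are testing.
llm = LLM(model="defog/sqlcoder-7b", dtype="half")
params = SamplingParams(
    temperature=0,    # greedy decoding, so the same prompt yields the same SQL
    max_tokens=300,
    stop=["```"],     # stop at the closing fence, since the prompt ends with ```sql
)
outputs = llm.generate([prompt], params)
print(outputs[0].outputs[0].text)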

I was testing different models: the one from the Ollama link (15B) and the Hugging Face version sqlcoder-7b-q5_k_m.gguf.
With this prompt format the answers are now much more coherent. Thanks a lot!
Unfortunately there is no indentation in the answer; I don't know if that is determined by the prompt.

Defog.ai org

Glad to hear! Yes, we intentionally avoid indentation in the answer (during finetuning), since it would increase the number of tokens to generate and therefore the latency and compute demands. We often pass the SQL directly to a SQL client (e.g. psycopg2 in Python), so we don't need it to be formatted. That said, if you would like to format the results, you can do so using
sqlparse.format(my_sql_string, reindent=True)
Feel free to check out their docs for more formatting options:
https://sqlparse.readthedocs.io/en/stable/
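
As a quick sketch using the generated query above (keyword_case is optional):

import sqlparse

# Re-indent the single-line SQL returned by the model.
raw_sql = "SELECT s.name FROM singer s JOIN singer_in_concert sic ON s.singer_id = sic.singer_id JOIN concert c ON sic.concert_id = c.concert_id JOIN stadium st ON c.stadium_id = st.stadium_id WHERE s.song_name = st.name AND c.year = '2020'"
print(sqlparse.format(raw_sql, reindent=True, keyword_case="upper"))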

wongjingping changed discussion status to closed
