Is function calling chain possible?

#4
by aisensiy - opened

ChatGPT has an interesting use case where it supports function calling chains. In other words, it knows how to combine two functions (I haven't tried scenarios with more functions) to achieve more complex functionality. I also ran a similar test case on Llama 7B, and the results are as follows.

 <s>[INST] <<SYS>>
You are a helpful research assistant. The following functions are available for you to fetch further data to answer user questions, if relevant:


    {
        "name": "get_latitude_and_longitude",
        "description": "Get the current location's latitude and longitude",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "location name, like 北京, this must be Chinese",
                }
            },
            "required": ["location"],
        },
    }
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given latitude and longitude",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {
                    "type": "string",
                    "description": "latitude of the location",
                },
                "longitude": {
                    "type": "string",
                    "description": "longitude of the location",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["latitude", "longitude"],
        },
    }


To call a function, respond - immediately and only - with a JSON object of the following format:
{
    "function": "function_name",
    "arguments": {
        "argument1": "argument_value",
        "argument2": "argument_value"
    }
}
<</SYS>>

Tell me the weather of London now.[/INST]
{
    "function": "get_current_weather",
    "arguments": {
        "latitude": "51.507493",
        "longitude": "-0.127861",
        "unit": "celsius"
    }
}

It seems that it does not know to call get_latitude_and_longitude first to obtain the latitude and longitude.
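
For reference, this is the check I have in mind: a chain-aware model should begin with the geocoding call rather than supplying coordinates itself. A tiny sketch of that check (my own illustration, not part of the original test; the reply string is the Llama 7B output shown above):

    import json

    def starts_with_geocode(model_reply: str) -> bool:
        """Return True if the model begins the chain with the geocoding call."""
        try:
            call = json.loads(model_reply)
        except json.JSONDecodeError:
            return False
        return call.get("function") == "get_latitude_and_longitude"

    # The Llama 7B reply above jumps straight to get_current_weather, supplying
    # London's coordinates from its own knowledge instead of calling the tool.
    reply = '{"function": "get_current_weather", "arguments": {"latitude": "51.507493", "longitude": "-0.127861", "unit": "celsius"}}'
    print(starts_with_geocode(reply))  # False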

I also tried this with Llama 70B in HuggingChat. It also failed.

https://huggingface.co/chat/conversation/650efed7a1a636d518b6e230

<s>[INST] <<SYS>>
(same system prompt as above)
<</SYS>>

Tell me the weather of London [/INST]

But with Falcon 180B it works, which is quite impressive.

https://huggingface.co/chat/conversation/650eef2534717bb7a3343a09

<s>[INST] <<SYS>>
(same system prompt as above)
<</SYS>>

Tell me the weather of London [/INST]

function: get_latitude_and_longitude
arguments:
location: 伦敦

{"latitude": "51.507493", "longitude": "-0.127861"}

function: get_current_weather
arguments:
latitude: 51.507493
longitude: -0.127861
unit: celsius
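
The two functions referenced in the prompt are never defined in this thread. To replay the Falcon 180B exchange end to end you would need something like the following stubs (hypothetical; a real version would call a geocoding API and a weather API):

    import json

    def get_latitude_and_longitude(location: str) -> dict:
        # Hypothetical stub: a real implementation would query a geocoding API.
        known = {"伦敦": {"latitude": "51.507493", "longitude": "-0.127861"}}
        return known.get(location, {"latitude": "0", "longitude": "0"})

    def get_current_weather(latitude: str, longitude: str, unit: str = "celsius") -> dict:
        # Hypothetical stub: a real implementation would query a weather API.
        return {"temperature": 17, "unit": unit}

    # Replaying the Falcon 180B chain shown above:
    coords = get_latitude_and_longitude("伦敦")
    print(json.dumps(coords))  # becomes the first function response fed back to the model
    print(json.dumps(get_current_weather(coords["latitude"], coords["longitude"], "celsius")))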

I haven't purchased the 13B or 70B models yet, but I'm very curious whether they can pass this test :-).

This is a very interesting feature, but I cannot find any documentation around function calling chains on OpenAI API.

Have you done tests using ChatGPT only?

Yes, it is not documented, but it works. I tested it with ChatGPT using a tool named Chainlit; the interaction flow is very similar to the Falcon 180B case I pasted above.
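
For context, the undocumented ChatGPT behaviour is just the normal function-calling loop applied twice: execute whatever function_call the model returns, append the result as a "function" message, and call the API again until you get a plain answer. A minimal sketch using the older pre-1.0 openai SDK (the model name and the local tool implementations are assumptions on my part, not from the original test):

    import json
    import openai  # pre-1.0 SDK, current at the time of this thread

    # Hypothetical local implementations of the two tools.
    def get_latitude_and_longitude(location):
        return {"latitude": "51.507493", "longitude": "-0.127861"}

    def get_current_weather(latitude, longitude, unit="celsius"):
        return {"temperature": 17, "unit": unit}

    TOOLS = {"get_latitude_and_longitude": get_latitude_and_longitude,
             "get_current_weather": get_current_weather}

    functions = [  # same two schemas as in the system prompt above, trimmed here
        {"name": "get_latitude_and_longitude", "parameters": {"type": "object",
            "properties": {"location": {"type": "string"}}, "required": ["location"]}},
        {"name": "get_current_weather", "parameters": {"type": "object",
            "properties": {"latitude": {"type": "string"}, "longitude": {"type": "string"}},
            "required": ["latitude", "longitude"]}},
    ]

    messages = [{"role": "user", "content": "Tell me the weather of London now."}]

    while True:
        response = openai.ChatCompletion.create(model="gpt-3.5-turbo-0613",
                                                messages=messages, functions=functions)
        message = response["choices"][0]["message"]
        if not message.get("function_call"):
            print(message["content"])  # final natural-language answer
            break
        name = message["function_call"]["name"]
        args = json.loads(message["function_call"]["arguments"])
        result = TOOLS[name](**args)
        # Feed the result back; the model may then request the next function in the chain.
        messages.append(message)
        messages.append({"role": "function", "name": name, "content": json.dumps(result)})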

@aisensiy can you share a screenshot of what you did with Falcon 180B?

I'm wondering whether you are sending the conversation to the LLM twice if there are two function calls in the chain? I assume that's the approach.

I'm guessing we could get the untrained 70B to work with a little more prompt engineering, e.g. after you get the first function call, you can prepend something like "Make use of the following function response to answer the question. If needed, make another function call".

On weaker models there may be a risk of getting into infinite loops, but you could auto-detect that and just exit.
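
A rough sketch of that guard, assuming the same raw-prompt setup as in the transcripts above (the re-prompt wording follows the suggestion in the previous paragraph; how results are stitched back into the Llama prompt is my own guess):

    import json

    def run_with_guard(generate, prompt, functions, max_rounds=5):
        """Drive the chain, exiting on repeated calls or too many rounds.

        `generate` is any callable mapping a prompt string to a model completion;
        `functions` maps function names to Python callables.
        """
        seen_calls = set()
        for _ in range(max_rounds):
            reply = generate(prompt).strip()
            try:
                call = json.loads(reply)
            except json.JSONDecodeError:
                return reply  # model answered in plain text; chain complete
            fingerprint = json.dumps(call, sort_keys=True)
            if fingerprint in seen_calls:
                return "Stopped: the model repeated the same function call."
            seen_calls.add(fingerprint)
            result = functions[call["function"]](**call["arguments"])
            # Append the result and invite a further call if needed.
            prompt += (f"{reply}</s><s>[INST] Make use of the following function response "
                       f"to answer the question. If needed, make another function call: "
                       f"{json.dumps(result)} [/INST]")
        return "Stopped: too many function-call rounds."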

Let me run a quick test on the fine-tuned models tomorrow and see what kind of results I get!

I'm wondering whether you are sending the conversation to the LLM twice if there are two function calls in the chain?

Yes, twice, which is the same interaction flow as with ChatGPT.

https://hf.co/chat/r/uMerr7V

Any update on this topic?

Howdy @aisensiy, I just tested this morning and Llama 70B fails at this chained-calling task. It also fails when I add text to the system message suggesting that it may sometimes need to call functions in series.

I tried the model quantized with bitsandbytes and also in fp16 precision. Both failed.

It seems this specific case would need to be trained for. Or, I could create a Falcon 180B model for function calling. LMK if that's of interest.

RonanMcGovern changed discussion status to closed

BTW, I tried a few things today with Llama 70B like:

  1. Adding a "usage" field to the function metadata stating that get_latitude_and_longitude is often used before calling get_current_weather.

  2. Explicitly putting in the function metadata for get_current_weather that the latitude function has to be called first.

Let me know @aisensiy if you come up with other ideas. Obviously training on chains would be worth a shot and I'll aim to try it.
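
For concreteness, the kind of metadata tweak described above would look roughly like this (my own reconstruction combining both ideas, not the exact metadata that was tested):

    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given latitude and longitude. "
                       "Call get_latitude_and_longitude first to obtain the coordinates.",
        "usage": "Typically called after get_latitude_and_longitude.",
        "parameters": {
            "type": "object",
            "properties": {
                "latitude": {"type": "string", "description": "latitude of the location"},
                "longitude": {"type": "string", "description": "longitude of the location"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["latitude", "longitude"],
        },
    }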

RonanMcGovern changed discussion status to open
RonanMcGovern changed discussion status to closed
