How to make the model decide when not to use tools/call functions and provide a normal chat response?
#1 opened by jeril
I deployed your model using TGI (Hugging Face). The model provides responses when the question is related to tool calling. The following is the code I used:
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="functionary")
response = client.chat.completions.create(
    model="meetkai/functionary-small-v2.5",
    messages=[
        {"role": "user", "content": "What is the weather for Istanbul?"}
    ],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA"
                    }
                },
                "required": ["location"]
            }
        }
    }],
    tool_choice="auto"
)
print(response)
This gives the following output:
ChatCompletion(id='', choices=[Choice(finish_reason='eos_token', index=0, logprobs=None, message=ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='0', function=Function(arguments={'location': 'Istanbul'}, name='openweathermap', description=None), type='function')]))], created=1721757239, model='meetkai/functionary-small-v2.5', object='chat.completion', service_tier=None, system_fingerprint='2.1.2-dev0-sha-9935720', usage=CompletionUsage(completion_tokens=23, prompt_tokens=19, total_tokens=42))
But when I changed the question to "What is 4 + 4?", I expected the answer 8; instead I got the following error:
Traceback (most recent call last):
File "/eph/nvme0/azureml/cr/j/5fab296a307947099e421be9e45eb265/exe/wd/logs/test2.py", line 5, in <module>
response = client.chat.completions.create(
File "/opt/conda/lib/python3.10/site-packages/openai/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
File "/opt/conda/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 646, in create
return self._post(
File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 1266, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 942, in request
return self._request(
File "/opt/conda/lib/python3.10/site-packages/openai/_base_client.py", line 1046, in _request
raise self._make_status_error_from_response(err.response) from None
openai.UnprocessableEntityError: Error code: 422 - {'error': 'Tool error: No function found in generated text', 'error_type': 'tool_error'}
Could you please help me understand how to make the model decide when not to use tools/call functions and instead give a normal chat response?
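As an aside: even when tool_choice="auto" works as intended, the client code still has to branch on whether the assistant message carries tool calls or plain text. A minimal sketch of that branching (the dicts below are simplified stand-ins for the OpenAI client's ChatCompletionMessage objects, not the real classes):

```python
import json

def handle_message(message: dict):
    """Branch on whether the assistant message contains tool calls
    or a plain-text reply. `message` is a simplified dict stand-in
    for the ChatCompletionMessage returned by the OpenAI client."""
    tool_calls = message.get("tool_calls")
    if tool_calls:
        # The model decided to call a function: collect (name, args) pairs.
        results = []
        for call in tool_calls:
            name = call["function"]["name"]
            args = call["function"]["arguments"]
            # The OpenAI spec says arguments is a JSON string, but the
            # TGI output above shows a dict, so normalize both cases.
            if isinstance(args, str):
                args = json.loads(args)
            results.append((name, args))
        return ("tool", results)
    # The model answered directly: use the text content.
    return ("chat", message["content"])

# Tool-call response, as in the Istanbul example above
print(handle_message({
    "content": None,
    "tool_calls": [{"function": {"name": "get_current_weather",
                                 "arguments": {"location": "Istanbul"}}}],
}))

# Plain chat response, as in "What is 4 + 4?"
print(handle_message({"content": "4 + 4 = 8", "tool_calls": None}))
```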
Hi, can you provide more details about your dependencies so I can reproduce this error? I tried "What is 4 + 4?" and it worked fine:
ChatCompletion(
id='cmpl-fbcff334c54f406983371d64ab9bbe8f',
choices=[
Choice(
finish_reason='stop',
index=0,
logprobs=None,
message=ChatCompletionMessage(
content='4 + 4 = 8',
role='assistant',
function_call=None,
tool_calls=None,
tool_call_id=None,
name=None
)
)
],
created=1721962674,
model='meetkai/functionary-small-v2.5',
object='chat.completion',
service_tier=None,
system_fingerprint=None,
usage=CompletionUsage(completion_tokens=8, prompt_tokens=118, total_tokens=126)
)
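For the dependency details asked for above, one quick way to collect the installed client-side versions is via the standard library (a sketch; extend the package list with whatever else your environment pins, e.g. the TGI server version from its startup logs):

```python
from importlib.metadata import PackageNotFoundError, version

def report_versions(packages):
    """Return {package_name: installed version, or 'not installed'}."""
    out = {}
    for pkg in packages:
        try:
            out[pkg] = version(pkg)
        except PackageNotFoundError:
            out[pkg] = "not installed"
    return out

# e.g. the client-side package from the traceback above
print(report_versions(["openai"]))
```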