Ollama?
Sorry if this is a dumb question - but can this be used with/in Ollama?
If so - how do I get it in?
Thanks for showing interest! - Here you go: https://ollama.com/ajindal/llama3.1-storm
Thanks - I'll download it now and give it a spin!
Downloaded "PULL" OK
adam@xenomorph:~$ ollama pull ajindal/llama3.1-storm:8b
pulling manifest
pulling 85c52e1d24f8... 100% ▕██████████████████████████████████████████████████████▏ 4.7 GB
pulling 75357d685f23... 100% ▕██████████████████████████████████████████████████████▏   28 B
pulling 1970553b62f4... 100% ▕██████████████████████████████████████████████████████▏ 1.7 KB
pulling 418d4136a93a... 100% ▕██████████████████████████████████████████████████████▏  12 KB
pulling 56bb8bd477a5... 100% ▕██████████████████████████████████████████████████████▏   96 B
pulling 04d5e68bf551... 100% ▕██████████████████████████████████████████████████████▏  559 B
verifying sha256 digest
writing manifest
removing any unused layers
success
- but running it throws an error:
ollama run ajindal/llama3.1-storm:8b
Error: template: :12: function "json" not defined
Not sure what to do next.
Are you on the latest Ollama? I am able to run it.
Can you paste your complete command/input so I can reproduce it?
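On the machine that errors, you can check which build is installed with:
ollama --version
The chat template for this model uses template functions (that is where the "json" in the error message comes from) that older Ollama releases don't define, so an out-of-date install would explain it.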
I am able to run this example:
import ollama

tools = [
    {
        'type': 'function',
        'function': {
            'name': 'get_current_weather',
            'description': 'Get the current weather for a city',
            'parameters': {
                'type': 'object',
                'properties': {
                    'city': {
                        'type': 'string',
                        'description': 'The name of the city',
                    },
                },
                'required': ['city'],
            },
        },
    },
    {
        'type': 'function',
        'function': {
            'name': 'get_places_to_vist',
            'description': 'Get places to visit in a city',
            'parameters': {
                'type': 'object',
                'properties': {
                    'city': {
                        'type': 'string',
                        'description': 'The name of the city',
                    },
                },
                'required': ['city'],
            },
        },
    },
]

response = ollama.chat(
    model='ajindal/llama3.1-storm:8b',
    messages=[
        {'role': 'system', 'content': 'You are a helpful assistant.'},
        {'role': 'user', 'content': 'What is the weather in Toronto and San Francisco?'}
    ],
    tools=tools,
)

print(response['message'])
Expected output:
{'role': 'assistant', 'content': "<tool_call>{'tool_name': 'get_current_weather', 'tool_arguments': {'city': 'Toronto'}}</tool_call>\n<tool_call>{'tool_name': 'get_current_weather', 'tool_arguments': {'city': 'San Francisco'}}</tool_call>"}
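In case it's useful, here is a rough sketch of how that content string could be parsed back into structured calls. It assumes the <tool_call> tags and single-quoted dicts shown in the expected output above, which is why ast.literal_eval is used rather than json.loads:

import ast
import re

# 'response' is the ollama.chat() result from the example above
content = response['message']['content']

# Pull out each <tool_call>...</tool_call> payload and evaluate it as a Python literal
calls = [ast.literal_eval(m)
         for m in re.findall(r'<tool_call>(.*?)</tool_call>', content, re.DOTALL)]

for call in calls:
    print(call['tool_name'], call['tool_arguments'])
    # e.g. get_current_weather {'city': 'Toronto'}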
The version was the issue. I have multiple machines running ollama - and it looks like I've not updated one of the Debian Linux ones in a while.
I've updated and all looks good. Thanks again for your help!