Dataset schema (GitHub issues from ollama/ollama; one issue per row, 33 columns):

column                      type            stats
url                         string          lengths 51 to 54
repository_url              string          1 distinct value
labels_url                  string          lengths 65 to 68
comments_url                string          lengths 60 to 63
events_url                  string          lengths 58 to 61
html_url                    string          lengths 39 to 44
id                          int64           1.78B to 2.82B
node_id                     string          lengths 18 to 19
number                      int64           1 to 8.69k
title                       string          lengths 1 to 382
user                        dict
labels                      list            lengths 0 to 5
state                       string          2 distinct values
locked                      bool            1 class
assignee                    dict
assignees                   list            lengths 0 to 2
milestone                   null
comments                    int64           0 to 323
created_at                  timestamp[s]
updated_at                  timestamp[s]
closed_at                   timestamp[s]
author_association          string          4 distinct values
sub_issues_summary          dict
active_lock_reason          null
draft                       bool            2 classes
pull_request                dict
body                        string          lengths 2 to 118k
closed_by                   dict
reactions                   dict
timeline_url                string          lengths 60 to 63
performed_via_github_app    null
state_reason                string          4 distinct values
is_pull_request             bool            2 classes
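The type and range annotations above are the automatic column statistics the Hugging Face dataset viewer emits, so the rows below can presumably be loaded with the `datasets` library. A minimal sketch, assuming a hypothetical dataset id (`example/ollama-github-issues` is a placeholder, not a real name):

```
# Minimal sketch: load a GitHub-issues dump like the one below with the
# Hugging Face `datasets` library. The dataset id is a placeholder.
from datasets import load_dataset

ds = load_dataset("example/ollama-github-issues", split="train")

row = ds[0]
print(row["number"], row["title"], row["state"])   # scalar columns
print(row["user"]["login"])                        # `user` is a nested dict
print([lab["name"] for lab in row["labels"]])      # `labels` is a list of dicts
```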

=== Issue #6768: Model update history on ollama.com ===
url: https://api.github.com/repos/ollama/ollama/issues/6768
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/6768/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/6768/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/6768/events
html_url: https://github.com/ollama/ollama/issues/6768
id: 2521074994 | node_id: I_kwDOJ0Z1Ps6WRI0y | number: 6768
user: { "login": "vYLQs6", "id": 143073604, "node_id": "U_kgDOCIchRA", "avatar_url": "https://avatars.githubusercontent.com/u/143073604?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vYLQs6", "html_url": "https://github.com/vYLQs6", "followers_url": "https://api.github.com/users/vYLQs6/follower...
labels: [ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
state: open | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 4
created_at: 2024-09-12T00:44:15 | updated_at: 2024-10-08T02:56:06 | closed_at: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  ### Would be nice if we can see what has been updated for a model on the ollama.com
  ![](https://github.com/user-attachments/assets/f6c08a45-e58e-443b-b4e0-2e763239aa2a)
closed_by: null
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/6768/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/6768/timeline
state_reason: null | is_pull_request: false

=== Issue #5549: Account removal on ollama.com ===
url: https://api.github.com/repos/ollama/ollama/issues/5549
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/5549/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/5549/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/5549/events
html_url: https://github.com/ollama/ollama/issues/5549
id: 2396442078 | node_id: I_kwDOJ0Z1Ps6O1s3e | number: 5549
user: { "login": "mak448a", "id": 94062293, "node_id": "U_kgDOBZtG1Q", "avatar_url": "https://avatars.githubusercontent.com/u/94062293?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mak448a", "html_url": "https://github.com/mak448a", "followers_url": "https://api.github.com/users/mak448a/follow...
labels: [ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 1
created_at: 2024-07-08T19:54:19 | updated_at: 2024-07-09T04:36:27 | closed_at: 2024-07-09T04:36:27
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body: Please add a "remove account" button on the website. Thank you!
closed_by: { "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/us...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/5549/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/5549/timeline
state_reason: completed | is_pull_request: false

=== Issue #6114: llama3-groq-tool-use can't request 2 tools at once but llama3.1 could do it ===
url: https://api.github.com/repos/ollama/ollama/issues/6114
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/6114/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/6114/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/6114/events
html_url: https://github.com/ollama/ollama/issues/6114
id: 2441618319 | node_id: I_kwDOJ0Z1Ps6RiCOP | number: 6114
user: { "login": "Hor1zonZzz", "id": 105845016, "node_id": "U_kgDOBk8RGA", "avatar_url": "https://avatars.githubusercontent.com/u/105845016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Hor1zonZzz", "html_url": "https://github.com/Hor1zonZzz", "followers_url": "https://api.github.com/users/Hor...
labels: [ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
state: open | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 0
created_at: 2024-08-01T06:41:33 | updated_at: 2024-08-01T06:41:33 | closed_at: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  ### What is the issue?
  **My code is following**
  model = ChatOllama(model="llama3.1")
  from langchain_core.pydantic_v1 import BaseModel, Field
  def add(a: int, b: int) -> int:
      """Add two integers.
      Args:
          a: First integer
          b: Second integer
      """
      return a + b
  def multiply(a: int, b:...
closed_by: null
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/6114/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/6114/timeline
state_reason: null | is_pull_request: false
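The code in the body is cut off, but the scenario it describes, binding two tools and asking a question that needs both, can be sketched as below. This assumes the `langchain-ollama` package; it is an illustration of the report, not a fix.

```
# Sketch of the multi-tool request described above (assumes `langchain-ollama`).
from langchain_ollama import ChatOllama

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

llm = ChatOllama(model="llama3.1").bind_tools([add, multiply])
msg = llm.invoke("What is 3 + 4, and what is 3 * 4?")

# The issue reports llama3.1 emits both calls here, while
# llama3-groq-tool-use emits only one.
for call in msg.tool_calls:
    print(call["name"], call["args"])
```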

=== Pull request #1684: Guard integration tests with a tag ===
url: https://api.github.com/repos/ollama/ollama/issues/1684
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/1684/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/1684/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/1684/events
html_url: https://github.com/ollama/ollama/pull/1684
id: 2054596822 | node_id: PR_kwDOJ0Z1Ps5isCaw | number: 1684
user: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
labels: []
state: closed | locked: false | author_association: COLLABORATOR
assignee: null
assignees: []
milestone: null | comments: 0
created_at: 2023-12-23T00:33:48 | updated_at: 2023-12-23T00:43:44 | closed_at: 2023-12-23T00:43:41
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/1684", "html_url": "https://github.com/ollama/ollama/pull/1684", "diff_url": "https://github.com/ollama/ollama/pull/1684.diff", "patch_url": "https://github.com/ollama/ollama/pull/1684.patch", "merged_at": "2023-12-23T00:43:41" }
body: This should help CI avoid running the integration test logic in a container where it's not currently possible.
closed_by: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/1684/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/1684/timeline
state_reason: null | is_pull_request: true

=== Pull request #4619: Fix download retry issue ===
url: https://api.github.com/repos/ollama/ollama/issues/4619
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/4619/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/4619/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/4619/events
html_url: https://github.com/ollama/ollama/pull/4619
id: 2316015016 | node_id: PR_kwDOJ0Z1Ps5wf4cx | number: 4619
user: { "login": "noxer", "id": 566185, "node_id": "MDQ6VXNlcjU2NjE4NQ==", "avatar_url": "https://avatars.githubusercontent.com/u/566185?v=4", "gravatar_id": "", "url": "https://api.github.com/users/noxer", "html_url": "https://github.com/noxer", "followers_url": "https://api.github.com/users/noxer/followers"...
labels: []
state: closed | locked: false | author_association: CONTRIBUTOR
assignee: null
assignees: []
milestone: null | comments: 7
created_at: 2024-05-24T18:31:46 | updated_at: 2024-08-02T13:17:50 | closed_at: 2024-05-25T00:21:57
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/4619", "html_url": "https://github.com/ollama/ollama/pull/4619", "diff_url": "https://github.com/ollama/ollama/pull/4619.diff", "patch_url": "https://github.com/ollama/ollama/pull/4619.patch", "merged_at": "2024-05-25T00:21:57" }
body: Partial downloaded chunks currently resume incorrectly as the code tries to always download the full size of the chunk rather than the remaining size. Fixes #4520
closed_by: { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/4619/reactions", "total_count": 3, "+1": 1, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/4619/timeline
state_reason: null | is_pull_request: true
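The description pins down the bug class: on resume, the downloader re-requested a chunk's full size instead of only the bytes still missing. A minimal sketch of the corrected arithmetic in Python with `requests` (illustrative only; Ollama's actual downloader is Go):

```
# Sketch: resume a partially downloaded chunk with an HTTP Range request.
# The bug class fixed in #4619: asking for the chunk's full size again
# instead of only the bytes still missing.
import os
import requests

def resume_chunk(url: str, path: str, start: int, size: int) -> None:
    done = os.path.getsize(path) if os.path.exists(path) else 0
    if done >= size:
        return  # chunk already complete
    # Request only the remaining bytes, not the full chunk again.
    headers = {"Range": f"bytes={start + done}-{start + size - 1}"}
    with requests.get(url, headers=headers, stream=True, timeout=30) as r:
        r.raise_for_status()
        with open(path, "ab") as f:
            for block in r.iter_content(chunk_size=65536):
                f.write(block)
```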

=== Issue #3979: Debian RISCV Build Failed ===
url: https://api.github.com/repos/ollama/ollama/issues/3979
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/3979/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/3979/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/3979/events
html_url: https://github.com/ollama/ollama/issues/3979
id: 2267100130 | node_id: I_kwDOJ0Z1Ps6HITPi | number: 3979
user: { "login": "HougeLangley", "id": 1161594, "node_id": "MDQ6VXNlcjExNjE1OTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HougeLangley", "html_url": "https://github.com/HougeLangley", "followers_url": "https://api.github.com...
labels: [ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 4
created_at: 2024-04-27T17:03:28 | updated_at: 2024-04-30T03:02:37 | closed_at: 2024-04-30T03:02:37
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  ### What is the issue?
  ```
  # sipeed @ lpi4a in ~/ollama on git:main o [0:57:45]
  $ go build .
  # github.com/chewxy/math32
  ../go/pkg/mod/github.com/chewxy/math32@v1.0.8/exp.go:3:6: missing function body
  ../go/pkg/mod/github.com/chewxy/math32@v1.0.8/exp.go:57:6: missing function body
  ../go/pkg/mod/github.com/chewxy...
  ```
closed_by: { "login": "HougeLangley", "id": 1161594, "node_id": "MDQ6VXNlcjExNjE1OTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HougeLangley", "html_url": "https://github.com/HougeLangley", "followers_url": "https://api.github.com...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/3979/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/3979/timeline
state_reason: not_planned | is_pull_request: false

=== Pull request #1449: Get interviewed/interrogated on nearly any subject ===
url: https://api.github.com/repos/ollama/ollama/issues/1449
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/1449/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/1449/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/1449/events
html_url: https://github.com/ollama/ollama/pull/1449
id: 2034150596 | node_id: PR_kwDOJ0Z1Ps5hmjOe | number: 1449
user: { "login": "stephenwithav", "id": 54563, "node_id": "MDQ6VXNlcjU0NTYz", "avatar_url": "https://avatars.githubusercontent.com/u/54563?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stephenwithav", "html_url": "https://github.com/stephenwithav", "followers_url": "https://api.github.com/user...
labels: []
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 2
created_at: 2023-12-10T00:34:32 | updated_at: 2023-12-11T17:37:43 | closed_at: 2023-12-11T17:37:42
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/1449", "html_url": "https://github.com/ollama/ollama/pull/1449", "diff_url": "https://github.com/ollama/ollama/pull/1449.diff", "patch_url": "https://github.com/ollama/ollama/pull/1449.patch", "merged_at": null }
body: A useful model to test your understanding of a subject. Good to prepare for job interviews.
closed_by: { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/1449/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/1449/timeline
state_reason: null | is_pull_request: true

=== Pull request #55: fix run generate ===
url: https://api.github.com/repos/ollama/ollama/issues/55
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/55/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/55/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/55/events
html_url: https://github.com/ollama/ollama/pull/55
id: 1794020638 | node_id: PR_kwDOJ0Z1Ps5U8V1c | number: 55
user: { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
labels: []
state: closed | locked: false | author_association: CONTRIBUTOR
assignee: null
assignees: []
milestone: null | comments: 1
created_at: 2023-07-07T18:27:02 | updated_at: 2023-07-07T18:38:00 | closed_at: 2023-07-07T18:37:56
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/55", "html_url": "https://github.com/ollama/ollama/pull/55", "diff_url": "https://github.com/ollama/ollama/pull/55.diff", "patch_url": "https://github.com/ollama/ollama/pull/55.patch", "merged_at": "2023-07-07T18:37:56" }
body: This fixes the run request where struct defaults are used instead of real defaults. This also removes the existence check for pulled images which @BruceMacD will address server side
closed_by: { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/55/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/55/timeline
state_reason: null | is_pull_request: true

=== Issue #7886: Classify tool call vs. content earlier and stream to user ===
url: https://api.github.com/repos/ollama/ollama/issues/7886
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/7886/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/7886/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/7886/events
html_url: https://github.com/ollama/ollama/issues/7886
id: 2706480373 | node_id: I_kwDOJ0Z1Ps6hUZz1 | number: 7886
user: { "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
labels: [ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
state: open | locked: false | author_association: CONTRIBUTOR
assignee: { "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
assignees: [ { "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "htt...
milestone: null | comments: 4
created_at: 2024-11-30T01:41:42 | updated_at: 2024-12-14T16:47:05 | closed_at: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body: https://github.com/ollama/ollama/issues/5796#issuecomment-2508374342
closed_by: null
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/7886/reactions", "total_count": 7, "+1": 6, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/7886/timeline
state_reason: null | is_pull_request: false

=== Pull request #826: show: no template system if empty ===
url: https://api.github.com/repos/ollama/ollama/issues/826
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/826/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/826/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/826/events
html_url: https://github.com/ollama/ollama/pull/826
id: 1948391834 | node_id: PR_kwDOJ0Z1Ps5dEbmA | number: 826
user: { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
labels: []
state: closed | locked: false | author_association: CONTRIBUTOR
assignee: null
assignees: []
milestone: null | comments: 0
created_at: 2023-10-17T22:26:55 | updated_at: 2023-10-18T20:11:11 | closed_at: 2023-10-18T20:11:10
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/826", "html_url": "https://github.com/ollama/ollama/pull/826", "diff_url": "https://github.com/ollama/ollama/pull/826.diff", "patch_url": "https://github.com/ollama/ollama/pull/826.patch", "merged_at": "2023-10-18T20:11:10" }
body:
  This prevents show outputs like this:
  ```
  ollama run mistral
  >>> /show modelfile
  # Modelfile generated by "ollama show"
  # To build a new Modelfile based on this one, replace the FROM line with:
  # FROM mistral:latest
  FROM registry.ollama.ai/library/mistral:latest
  TEMPLATE """[INST] {{ .Prompt }} [/INST]
  """...
  ```
closed_by: { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/826/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/826/timeline
state_reason: null | is_pull_request: true

=== Issue #6575: no way ===
url: https://api.github.com/repos/ollama/ollama/issues/6575
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/6575/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/6575/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/6575/events
html_url: https://github.com/ollama/ollama/issues/6575
id: 2498629790 | node_id: I_kwDOJ0Z1Ps6U7hCe | number: 6575
user: { "login": "Klgor1803", "id": 89669610, "node_id": "MDQ6VXNlcjg5NjY5NjEw", "avatar_url": "https://avatars.githubusercontent.com/u/89669610?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Klgor1803", "html_url": "https://github.com/Klgor1803", "followers_url": "https://api.github.com/users/...
labels: [ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 1
created_at: 2024-08-31T04:57:43 | updated_at: 2024-08-31T05:01:30 | closed_at: 2024-08-31T05:01:30
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  ### What is the issue?
  Moondream latest
  >>> help, im dying
  1. Helping you to understand the concept of a hash table and its implementation in Python.
  ### OS
  Linux
  ### GPU
  Other
  ### CPU
  Other
  ### Ollama version
  0.3.6
closed_by: { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/6575/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/6575/timeline
state_reason: completed | is_pull_request: false

=== Issue #6515: "ollama run qwen2" return "the resource allocation failed" ===
url: https://api.github.com/repos/ollama/ollama/issues/6515
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/6515/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/6515/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/6515/events
html_url: https://github.com/ollama/ollama/issues/6515
id: 2486802246 | node_id: I_kwDOJ0Z1Ps6UOZdG | number: 6515
user: { "login": "fenggaobj", "id": 13727907, "node_id": "MDQ6VXNlcjEzNzI3OTA3", "avatar_url": "https://avatars.githubusercontent.com/u/13727907?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fenggaobj", "html_url": "https://github.com/fenggaobj", "followers_url": "https://api.github.com/users/...
labels: [ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6430601766, "node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg...
state: closed | locked: false | author_association: NONE
assignee: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
assignees: [ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
milestone: null | comments: 6
created_at: 2024-08-26T12:55:23 | updated_at: 2024-08-27T21:04:50 | closed_at: 2024-08-27T21:04:29
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  ### What is the issue?
  When executing "ollama run qwen2" in the Nvidia Jetson AGX Orin environment, a "terminated" error was returned. Could you please help me identify the cause?
  ```
  (ollama) nvidia@ubuntu:~$ ollama run qwen2
  Error: llama runner process has terminated: CUDA error: the resource allocation failed...
  ```
closed_by: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/6515/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/6515/timeline
state_reason: completed | is_pull_request: false

=== Issue #6961: UNABLE TO USE GPU FOR OLLAMA MODELS ===
url: https://api.github.com/repos/ollama/ollama/issues/6961
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/6961/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/6961/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/6961/events
html_url: https://github.com/ollama/ollama/issues/6961
id: 2548690210 | node_id: I_kwDOJ0Z1Ps6X6e0i | number: 6961
user: { "login": "Paramjethwa", "id": 142441855, "node_id": "U_kgDOCH19fw", "avatar_url": "https://avatars.githubusercontent.com/u/142441855?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Paramjethwa", "html_url": "https://github.com/Paramjethwa", "followers_url": "https://api.github.com/users/...
labels: [ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" }, { "id": 6430601766, "node_id": "LA_kwDOJ0Z1Ps8AAAABf0...
state: closed | locked: false | author_association: NONE
assignee: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
assignees: [ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
milestone: null | comments: 1
created_at: 2024-09-25T18:19:53 | updated_at: 2024-10-23T00:12:03 | closed_at: 2024-10-23T00:12:03
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  ### What is the issue?
  ollama is not utilizing GPU this is what i get in Ubuntu terminal
  ```
  [+] Running 2/0
   ✔ Container local_multimodal_ai-ollama-1  Created  0.0s
   ✔ Container local_multimodal_ai-app-1     Created  ...
  ```
closed_by: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://api.github.com/users/dhilt...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/6961/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/6961/timeline
state_reason: completed | is_pull_request: false

=== Issue #2215: Batching ===
url: https://api.github.com/repos/ollama/ollama/issues/2215
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/2215/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/2215/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/2215/events
html_url: https://github.com/ollama/ollama/issues/2215
id: 2102800559 | node_id: I_kwDOJ0Z1Ps59VjCv | number: 2215
user: { "login": "varunshenoy", "id": 10859091, "node_id": "MDQ6VXNlcjEwODU5MDkx", "avatar_url": "https://avatars.githubusercontent.com/u/10859091?v=4", "gravatar_id": "", "url": "https://api.github.com/users/varunshenoy", "html_url": "https://github.com/varunshenoy", "followers_url": "https://api.github.com/...
labels: []
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 2
created_at: 2024-01-26T19:47:21 | updated_at: 2024-01-27T07:44:35 | closed_at: 2024-01-26T23:38:34
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body: Is there any plan to support batching prompts in Ollama? Thank you! Would love to use this to automate some local workflows with higher throughput.
closed_by: { "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/2215/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/2215/timeline
state_reason: completed | is_pull_request: false
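Ollama exposed no batch endpoint at the time, so the usual route to higher throughput is client-side concurrency against the documented `/api/generate` endpoint. A minimal sketch, assuming a local server on the default port:

```
# Sketch: client-side prompt batching via concurrent requests to /api/generate.
from concurrent.futures import ThreadPoolExecutor

import requests

def generate(prompt: str) -> str:
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "mistral", "prompt": prompt, "stream": False},
        timeout=300,
    )
    r.raise_for_status()
    return r.json()["response"]

prompts = ["Summarize X.", "Summarize Y.", "Summarize Z."]
with ThreadPoolExecutor(max_workers=3) as pool:
    for out in pool.map(generate, prompts):
        print(out)
```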

=== Pull request #6185: Add systemd socket ===
url: https://api.github.com/repos/ollama/ollama/issues/6185
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/6185/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/6185/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/6185/events
html_url: https://github.com/ollama/ollama/pull/6185
id: 2449209513 | node_id: PR_kwDOJ0Z1Ps53e34z | number: 6185
user: { "login": "Nicholas42", "id": 16197255, "node_id": "MDQ6VXNlcjE2MTk3MjU1", "avatar_url": "https://avatars.githubusercontent.com/u/16197255?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Nicholas42", "html_url": "https://github.com/Nicholas42", "followers_url": "https://api.github.com/use...
labels: []
state: closed | locked: false | author_association: CONTRIBUTOR
assignee: null
assignees: []
milestone: null | comments: 1
created_at: 2024-08-05T18:26:33 | updated_at: 2024-11-23T20:57:50 | closed_at: 2024-11-23T20:57:50
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/6185", "html_url": "https://github.com/ollama/ollama/pull/6185", "diff_url": "https://github.com/ollama/ollama/pull/6185.diff", "patch_url": "https://github.com/ollama/ollama/pull/6185.patch", "merged_at": null }
body: This enables the use of systemd sockets with this project. A systemd socket will bind to the specified port and start the service when needed (i.e. when a request comes in). Hence, you can have the service whenever you need it, but don't need to run it all the time.
closed_by: { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/6185/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/6185/timeline
state_reason: null | is_pull_request: true
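Under socket activation, systemd binds the port itself and passes the already-listening socket to the service as file descriptor 3 (the `sd_listen_fds` convention), starting the service only when a request arrives. A minimal sketch of the receiving side in Python, to illustrate the mechanism rather than the Go change in this PR:

```
# Sketch: accept a systemd-activated socket (sd_listen_fds convention).
# systemd passes listening sockets starting at fd 3 and sets LISTEN_FDS.
import os
import socket

SD_LISTEN_FDS_START = 3

if os.environ.get("LISTEN_FDS"):
    # Adopt the socket systemd already bound for us.
    srv = socket.socket(fileno=SD_LISTEN_FDS_START)
else:
    # Fallback: bind ourselves when started outside systemd.
    srv = socket.create_server(("127.0.0.1", 11434))

conn, _ = srv.accept()  # first request wakes the service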

=== Pull request #289: First draft of API Docs ===
url: https://api.github.com/repos/ollama/ollama/issues/289
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/289/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/289/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/289/events
html_url: https://github.com/ollama/ollama/pull/289
id: 1837431370 | node_id: PR_kwDOJ0Z1Ps5XO-RO | number: 289
user: { "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.git...
labels: []
state: closed | locked: false | author_association: CONTRIBUTOR
assignee: null
assignees: []
milestone: null | comments: 1
created_at: 2023-08-04T23:10:03 | updated_at: 2023-08-07T20:46:23 | closed_at: 2023-08-07T20:46:22
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/289", "html_url": "https://github.com/ollama/ollama/pull/289", "diff_url": "https://github.com/ollama/ollama/pull/289.diff", "patch_url": "https://github.com/ollama/ollama/pull/289.patch", "merged_at": "2023-08-07T20:46:22" }
body: null
closed_by: { "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.git...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/289/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/289/timeline
state_reason: null | is_pull_request: true

=== Pull request #2272: Default threads enviornment variable override ===
url: https://api.github.com/repos/ollama/ollama/issues/2272
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/2272/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/2272/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/2272/events
html_url: https://github.com/ollama/ollama/pull/2272
id: 2107362807 | node_id: PR_kwDOJ0Z1Ps5lbNwe | number: 2272
user: { "login": "lainedfles", "id": 126992880, "node_id": "U_kgDOB5HB8A", "avatar_url": "https://avatars.githubusercontent.com/u/126992880?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lainedfles", "html_url": "https://github.com/lainedfles", "followers_url": "https://api.github.com/users/lai...
labels: []
state: closed | locked: false | author_association: CONTRIBUTOR
assignee: null
assignees: []
milestone: null | comments: 2
created_at: 2024-01-30T09:36:44 | updated_at: 2024-03-21T20:47:37 | closed_at: 2024-03-21T20:47:12
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/2272", "html_url": "https://github.com/ollama/ollama/pull/2272", "diff_url": "https://github.com/ollama/ollama/pull/2272.diff", "patch_url": "https://github.com/ollama/ollama/pull/2272.patch", "merged_at": null }
body: Expose opts.NumThread as env variable OLLAMA_THREADS for override.
closed_by: { "login": "lainedfles", "id": 126992880, "node_id": "U_kgDOB5HB8A", "avatar_url": "https://avatars.githubusercontent.com/u/126992880?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lainedfles", "html_url": "https://github.com/lainedfles", "followers_url": "https://api.github.com/users/lai...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/2272/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/2272/timeline
state_reason: null | is_pull_request: true
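The PR was closed unmerged (note `merged_at: null`); it proposed a server-side `OLLAMA_THREADS` override for `opts.NumThread`. The per-request equivalent that the public API does document is the `num_thread` option. A sketch combining the two ideas (reading the PR's proposed variable name client-side is purely illustrative, not shipped behavior):

```
# Sketch: per-request thread override via the documented `num_thread` option.
# OLLAMA_THREADS mirrors the unmerged PR's proposed variable name and is
# read client-side here only for illustration.
import os
import requests

threads = int(os.environ.get("OLLAMA_THREADS", "0"))  # 0 = let the server decide

payload = {"model": "mistral", "prompt": "Say hi.", "stream": False}
if threads > 0:
    payload["options"] = {"num_thread": threads}

r = requests.post("http://localhost:11434/api/generate", json=payload, timeout=300)
r.raise_for_status()
print(r.json()["response"])
```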

=== Pull request #1817: only pull gguf model if already exists ===
url: https://api.github.com/repos/ollama/ollama/issues/1817
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/1817/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/1817/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/1817/events
html_url: https://github.com/ollama/ollama/pull/1817
id: 2068171270 | node_id: PR_kwDOJ0Z1Ps5jW1JK | number: 1817
user: { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
labels: []
state: closed | locked: false | author_association: CONTRIBUTOR
assignee: null
assignees: []
milestone: null | comments: 0
created_at: 2024-01-05T23:17:13 | updated_at: 2024-01-05T23:50:01 | closed_at: 2024-01-05T23:50:00
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/1817", "html_url": "https://github.com/ollama/ollama/pull/1817", "diff_url": "https://github.com/ollama/ollama/pull/1817.diff", "patch_url": "https://github.com/ollama/ollama/pull/1817.patch", "merged_at": "2024-01-05T23:50:00" }
body: null
closed_by: { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/1817/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/1817/timeline
state_reason: null | is_pull_request: true

=== Issue #4405: Add model GEITje Ultra / Dutch models ===
url: https://api.github.com/repos/ollama/ollama/issues/4405
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/4405/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/4405/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/4405/events
html_url: https://github.com/ollama/ollama/issues/4405
id: 2293281464 | node_id: I_kwDOJ0Z1Ps6IsLK4 | number: 4405
user: { "login": "thisisawesome1994", "id": 58063460, "node_id": "MDQ6VXNlcjU4MDYzNDYw", "avatar_url": "https://avatars.githubusercontent.com/u/58063460?v=4", "gravatar_id": "", "url": "https://api.github.com/users/thisisawesome1994", "html_url": "https://github.com/thisisawesome1994", "followers_url": "https...
labels: [ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
state: open | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 2
created_at: 2024-05-13T16:25:37 | updated_at: 2024-05-13T21:52:10 | closed_at: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  Hi,
  Would anyone be able to add GEITje Ultra to ollama? It is a Dutch oriented model based on mistral/mixtral or llama3. The llama3 is called llama3 Dutch.
  I dont know yet where I would be able to find it, but I suppose you can as it has been opensourced according to this Dutch Tech website; https://tweakers.net/revie...
closed_by: null
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/4405/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/4405/timeline
state_reason: null | is_pull_request: false

=== Issue #389: Microsoft/guidance-ai integration with Ollama ===
url: https://api.github.com/repos/ollama/ollama/issues/389
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/389/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/389/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/389/events
html_url: https://github.com/ollama/ollama/issues/389
id: 1858183618 | node_id: I_kwDOJ0Z1Ps5uwaHC | number: 389
user: { "login": "JanMP", "id": 13262398, "node_id": "MDQ6VXNlcjEzMjYyMzk4", "avatar_url": "https://avatars.githubusercontent.com/u/13262398?v=4", "gravatar_id": "", "url": "https://api.github.com/users/JanMP", "html_url": "https://github.com/JanMP", "followers_url": "https://api.github.com/users/JanMP/follow...
labels: [ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5667396205, "node_id": ...
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 3
created_at: 2023-08-20T16:39:57 | updated_at: 2023-12-04T19:19:02 | closed_at: 2023-12-04T19:19:02
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body: https://github.com/guidance-ai/guidance seems to be an easy and efficient way to generate tightly controlled output (like e.g json). Is there a way to use it with models provided by ollama?
closed_by: { "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.git...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/389/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/389/timeline
state_reason: completed | is_pull_request: false
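guidance drives generation token-by-token, which Ollama did not expose at the time; the closest documented control for "tightly controlled output (like e.g json)" is the API's `format: "json"` switch. A sketch (generic JSON mode, not an actual guidance-ai integration):

```
# Sketch: constrained JSON output via the API's documented `format` field
# (generic JSON mode; not a guidance-ai integration).
import json
import requests

r = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",
        "prompt": "List two colors as a JSON object with a `colors` array.",
        "format": "json",
        "stream": False,
    },
    timeout=300,
)
r.raise_for_status()
print(json.loads(r.json()["response"]))
```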

=== Issue #7825: Tool behavior in stream mode ===
url: https://api.github.com/repos/ollama/ollama/issues/7825
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/7825/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/7825/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/7825/events
html_url: https://github.com/ollama/ollama/issues/7825
id: 2689120303 | node_id: I_kwDOJ0Z1Ps6gSLgv | number: 7825
user: { "login": "jwnder", "id": 24688121, "node_id": "MDQ6VXNlcjI0Njg4MTIx", "avatar_url": "https://avatars.githubusercontent.com/u/24688121?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jwnder", "html_url": "https://github.com/jwnder", "followers_url": "https://api.github.com/users/jwnder/fo...
labels: [ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 2
created_at: 2024-11-25T04:45:51 | updated_at: 2024-11-27T17:38:51 | closed_at: 2024-11-27T17:38:50
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  When the api is called as in the example: https://github.com/ollama/ollama/blob/main/docs/api.md#chat-request-with-tools
  But the stream is enabled (stream: true)
  The response doesn't contain tool_calls, only content are present.
  Please add tool_calls in the reply to differentiate between content / tool_calls
closed_by: { "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/7825/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/7825/timeline
state_reason: completed | is_pull_request: false
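A minimal reproduction of the report: stream a `/api/chat` request with a tool attached and scan each chunk for `message.tool_calls` (request shape follows the API docs linked in the body; the weather tool is an invented example):

```
# Sketch: stream /api/chat with a tool attached and watch each chunk for
# tool_calls (the issue reports they were missing when stream=true).
import json
import requests

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

with requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "Weather in Paris?"}],
        "tools": tools,
        "stream": True,
    },
    stream=True,
    timeout=300,
) as r:
    for line in r.iter_lines():
        if not line:
            continue
        msg = json.loads(line).get("message", {})
        if msg.get("tool_calls"):
            print("tool_calls:", msg["tool_calls"])
```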

=== Issue #2876: REST APIs Request Cancellation ===
url: https://api.github.com/repos/ollama/ollama/issues/2876
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/2876/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/2876/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/2876/events
html_url: https://github.com/ollama/ollama/issues/2876
id: 2164721493 | node_id: I_kwDOJ0Z1Ps6BBwdV | number: 2876
user: { "login": "mAlaliSy", "id": 14933812, "node_id": "MDQ6VXNlcjE0OTMzODEy", "avatar_url": "https://avatars.githubusercontent.com/u/14933812?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mAlaliSy", "html_url": "https://github.com/mAlaliSy", "followers_url": "https://api.github.com/users/mAl...
labels: []
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 3
created_at: 2024-03-02T09:12:34 | updated_at: 2024-03-25T01:18:11 | closed_at: 2024-03-12T01:52:14
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  Hi there, I am using Ollama and I found it awesome. One question, when calling Ollama using REST APIs (i.e. generate API), if the client cancels the HTTP request, will Ollama stop processing the request?
  I found this issue here for JS client library https://github.com/ollama/ollama-js/issues/39 but it doesn't me...
closed_by: { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/2876/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/2876/timeline
state_reason: completed | is_pull_request: false
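From the client side, cancellation amounts to closing the HTTP connection mid-stream; whether the server then stops generating is exactly what the question asks. A sketch of triggering that condition:

```
# Sketch: cancel an in-flight streaming generation by closing the connection.
# Whether the server stops generating when the client disconnects is the
# behavior this issue asks about.
import requests

r = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "mistral", "prompt": "Write a long story.", "stream": True},
    stream=True,
    timeout=300,
)
for i, line in enumerate(r.iter_lines()):
    if i >= 5:          # give up after a few chunks
        r.close()       # closes the socket; signals cancellation upstream
        break
```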

=== Issue #6460: glm-4v-9b ===
url: https://api.github.com/repos/ollama/ollama/issues/6460
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/6460/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/6460/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/6460/events
html_url: https://github.com/ollama/ollama/issues/6460
id: 2480214243 | node_id: I_kwDOJ0Z1Ps6T1RDj | number: 6460
user: { "login": "sdcb", "id": 1317141, "node_id": "MDQ6VXNlcjEzMTcxNDE=", "avatar_url": "https://avatars.githubusercontent.com/u/1317141?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sdcb", "html_url": "https://github.com/sdcb", "followers_url": "https://api.github.com/users/sdcb/followers", ...
labels: [ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
state: open | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 4
created_at: 2024-08-22T08:55:21 | updated_at: 2024-09-26T10:13:27 | closed_at: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body: GLM4-9b models have been added, however GLM-4v-9b is still missing, please also add: https://huggingface.co/THUDM/glm-4v-9b
closed_by: null
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/6460/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/6460/timeline
state_reason: null | is_pull_request: false

=== Pull request #2454: Update rocm versions ===
url: https://api.github.com/repos/ollama/ollama/issues/2454
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/2454/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/2454/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/2454/events
html_url: https://github.com/ollama/ollama/pull/2454
id: 2129198385 | node_id: PR_kwDOJ0Z1Ps5mlhCL | number: 2454
user: { "login": "mkesper", "id": 3063558, "node_id": "MDQ6VXNlcjMwNjM1NTg=", "avatar_url": "https://avatars.githubusercontent.com/u/3063558?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mkesper", "html_url": "https://github.com/mkesper", "followers_url": "https://api.github.com/users/mkesper/...
labels: []
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 2
created_at: 2024-02-11T22:23:26 | updated_at: 2024-03-27T21:35:36 | closed_at: 2024-03-27T21:35:36
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/2454", "html_url": "https://github.com/ollama/ollama/pull/2454", "diff_url": "https://github.com/ollama/ollama/pull/2454.diff", "patch_url": "https://github.com/ollama/ollama/pull/2454.patch", "merged_at": null }
body: Update rocm build to use version 6 and bump version 6 to 6.0.2.
closed_by: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/2454/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/2454/timeline
state_reason: null | is_pull_request: true

=== Pull request #26: fix run arg parser ===
url: https://api.github.com/repos/ollama/ollama/issues/26
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/26/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/26/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/26/events
html_url: https://github.com/ollama/ollama/pull/26
id: 1781634020 | node_id: PR_kwDOJ0Z1Ps5USN17 | number: 26
user: { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
labels: []
state: closed | locked: false | author_association: CONTRIBUTOR
assignee: null
assignees: []
milestone: null | comments: 0
created_at: 2023-06-29T23:32:45 | updated_at: 2023-06-29T23:33:58 | closed_at: 2023-06-29T23:33:54
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: false | performed_via_github_app: null
pull_request: { "url": "https://api.github.com/repos/ollama/ollama/pulls/26", "html_url": "https://github.com/ollama/ollama/pull/26", "diff_url": "https://github.com/ollama/ollama/pull/26.diff", "patch_url": "https://github.com/ollama/ollama/pull/26.patch", "merged_at": "2023-06-29T23:33:54" }
body: the bug has no adverse effects other than to the reader. for clarity, rename it to run_parser
closed_by: { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/26/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/26/timeline
state_reason: null | is_pull_request: true

=== Issue #8549: error: unsupported op 'CPY' ===
url: https://api.github.com/repos/ollama/ollama/issues/8549
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/8549/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/8549/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/8549/events
html_url: https://github.com/ollama/ollama/issues/8549
id: 2807138095 | node_id: I_kwDOJ0Z1Ps6nUYcv | number: 8549
user: { "login": "devlux76", "id": 86517969, "node_id": "MDQ6VXNlcjg2NTE3OTY5", "avatar_url": "https://avatars.githubusercontent.com/u/86517969?v=4", "gravatar_id": "", "url": "https://api.github.com/users/devlux76", "html_url": "https://github.com/devlux76", "followers_url": "https://api.github.com/users/dev...
labels: [ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
state: open | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 1
created_at: 2025-01-23T15:00:30 | updated_at: 2025-01-24T01:58:36 | closed_at: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  ### What is the issue?
  I keep getting crashes on various vision models. This one is from the MiniCPM-V 2.6 straight off ollama
  ```
  time=2025-01-23T07:51:20.484-07:00 level=WARN source=runner.go:129 msg="truncating input prompt" limit=2048 prompt=3390 keep=4 new=2048
  time=2025-01-23T07:51:20.484-07:00 level=DEBUG sour...
  ```
closed_by: null
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/8549/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/8549/timeline
state_reason: null | is_pull_request: false

=== Issue #6892: Build CPU only image ===
url: https://api.github.com/repos/ollama/ollama/issues/6892
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/6892/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/6892/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/6892/events
html_url: https://github.com/ollama/ollama/issues/6892
id: 2538933143 | node_id: I_kwDOJ0Z1Ps6XVQuX | number: 6892
user: { "login": "mgiessing", "id": 40735330, "node_id": "MDQ6VXNlcjQwNzM1MzMw", "avatar_url": "https://avatars.githubusercontent.com/u/40735330?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mgiessing", "html_url": "https://github.com/mgiessing", "followers_url": "https://api.github.com/users/...
labels: [ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5755339642, "node_id": ...
state: open | locked: false | author_association: NONE
assignee: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
assignees: [ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
milestone: null | comments: 1
created_at: 2024-09-20T14:10:33 | updated_at: 2024-09-20T23:23:12 | closed_at: null
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  ### What is the issue?
  I'd be interested in only building a CPU docker image for ppc arch. I tried to do that for arm64 but it doesn't work perfectly either so I'm wondering if that is possible at all with only one big Dockerfile as it is now?
  Ideally I'd like to have something like `PLATFORM=linux/arm64 TARGETARCH...
closed_by: null
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/6892/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/6892/timeline
state_reason: null | is_pull_request: false

=== Issue #3204: Ollama cannot connect after Lobechat updating to 0.137.0 and later versions ===
url: https://api.github.com/repos/ollama/ollama/issues/3204
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/3204/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/3204/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/3204/events
html_url: https://github.com/ollama/ollama/issues/3204
id: 2191020872 | node_id: I_kwDOJ0Z1Ps6CmFNI | number: 3204
user: { "login": "cheungpatrick", "id": 37861978, "node_id": "MDQ6VXNlcjM3ODYxOTc4", "avatar_url": "https://avatars.githubusercontent.com/u/37861978?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cheungpatrick", "html_url": "https://github.com/cheungpatrick", "followers_url": "https://api.githu...
labels: [ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 0
created_at: 2024-03-18T01:03:34 | updated_at: 2024-03-18T01:09:46 | closed_at: 2024-03-18T01:04:17
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body: null
closed_by: { "login": "cheungpatrick", "id": 37861978, "node_id": "MDQ6VXNlcjM3ODYxOTc4", "avatar_url": "https://avatars.githubusercontent.com/u/37861978?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cheungpatrick", "html_url": "https://github.com/cheungpatrick", "followers_url": "https://api.githu...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/3204/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/3204/timeline
state_reason: completed | is_pull_request: false

=== Issue #7794: llama3.2-vision:90b unquantized? ===
url: https://api.github.com/repos/ollama/ollama/issues/7794
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/7794/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/7794/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/7794/events
html_url: https://github.com/ollama/ollama/issues/7794
id: 2682630474 | node_id: I_kwDOJ0Z1Ps6f5bFK | number: 7794
user: { "login": "eggsbenedicto", "id": 189337649, "node_id": "U_kgDOC0kQMQ", "avatar_url": "https://avatars.githubusercontent.com/u/189337649?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eggsbenedicto", "html_url": "https://github.com/eggsbenedicto", "followers_url": "https://api.github.com/...
labels: [ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 2
created_at: 2024-11-22T09:39:47 | updated_at: 2024-11-23T10:47:37 | closed_at: 2024-11-23T10:47:37
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body:
  Hi,
  First of all, thanks for your work. I tried out the llama3.2-vision:90b model on ollama and it seems to underperform the version available on the build.nvidia.com API, with the same prompt and settings.
  Is this because it is a quantized model? The ollama documentation on the official website says it has "Qua...
closed_by: { "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/7794/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/7794/timeline
state_reason: completed | is_pull_request: false

=== Issue #4350: Configurable model loading timeout ===
url: https://api.github.com/repos/ollama/ollama/issues/4350
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/4350/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/4350/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/4350/events
html_url: https://github.com/ollama/ollama/issues/4350
id: 2290775096 | node_id: I_kwDOJ0Z1Ps6IinQ4 | number: 4350
user: { "login": "ProjectMoon", "id": 183856, "node_id": "MDQ6VXNlcjE4Mzg1Ng==", "avatar_url": "https://avatars.githubusercontent.com/u/183856?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ProjectMoon", "html_url": "https://github.com/ProjectMoon", "followers_url": "https://api.github.com/user...
labels: [ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
state: closed | locked: false | author_association: NONE
assignee: null
assignees: []
milestone: null | comments: 10
created_at: 2024-05-11T08:14:04 | updated_at: 2024-06-14T20:43:41 | closed_at: 2024-05-23T21:06:03
sub_issues_summary: { "total": 0, "completed": 0, "percent_completed": 0 }
active_lock_reason: null | draft: null | performed_via_github_app: null
pull_request: null
body: The model loading timeout, the time to wait for the llama runner, is hard coded. It would be nice to be able to configure this to increase or decrease it (for me, mostly increase). This would allow experimenting with big models that take forever to load, but might run fine once loaded.
closed_by: { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
reactions: { "url": "https://api.github.com/repos/ollama/ollama/issues/4350/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
timeline_url: https://api.github.com/repos/ollama/ollama/issues/4350/timeline
state_reason: completed | is_pull_request: false
https://api.github.com/repos/ollama/ollama/issues/4246
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4246/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4246/comments
https://api.github.com/repos/ollama/ollama/issues/4246/events
https://github.com/ollama/ollama/issues/4246
2,284,486,889
I_kwDOJ0Z1Ps6IKoDp
4,246
llama3-chinese
{ "login": "enryteam", "id": 20081090, "node_id": "MDQ6VXNlcjIwMDgxMDkw", "avatar_url": "https://avatars.githubusercontent.com/u/20081090?v=4", "gravatar_id": "", "url": "https://api.github.com/users/enryteam", "html_url": "https://github.com/enryteam", "followers_url": "https://api.github.com/users/enr...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
2
2024-05-08T01:23:33
2024-06-10T03:57:31
2024-06-10T03:57:09
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://github.com/UnicomAI/Unichat-llama3-Chinese
{ "login": "enryteam", "id": 20081090, "node_id": "MDQ6VXNlcjIwMDgxMDkw", "avatar_url": "https://avatars.githubusercontent.com/u/20081090?v=4", "gravatar_id": "", "url": "https://api.github.com/users/enryteam", "html_url": "https://github.com/enryteam", "followers_url": "https://api.github.com/users/enr...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4246/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4246/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7730
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7730/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7730/comments
https://api.github.com/repos/ollama/ollama/issues/7730/events
https://github.com/ollama/ollama/issues/7730
2,670,491,801
I_kwDOJ0Z1Ps6fLHiZ
7,730
Can a model be started by using its ID?
{ "login": "qq1273834091", "id": 87972019, "node_id": "MDQ6VXNlcjg3OTcyMDE5", "avatar_url": "https://avatars.githubusercontent.com/u/87972019?v=4", "gravatar_id": "", "url": "https://api.github.com/users/qq1273834091", "html_url": "https://github.com/qq1273834091", "followers_url": "https://api.github.c...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
1
2024-11-19T02:05:51
2024-12-23T07:56:05
2024-12-23T07:56:05
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
<img width="765" alt="image" src="https://github.com/user-attachments/assets/f16a1038-2847-419f-b7c0-1700e0c9f58f">
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7730/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7730/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/838
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/838/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/838/comments
https://api.github.com/repos/ollama/ollama/issues/838/events
https://github.com/ollama/ollama/issues/838
1,949,564,678
I_kwDOJ0Z1Ps50M_8G
838
how to use ollama with open-interpreter?
{ "login": "wuyongyi", "id": 23444520, "node_id": "MDQ6VXNlcjIzNDQ0NTIw", "avatar_url": "https://avatars.githubusercontent.com/u/23444520?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wuyongyi", "html_url": "https://github.com/wuyongyi", "followers_url": "https://api.github.com/users/wuy...
[]
closed
false
null
[]
null
12
2023-10-18T11:55:02
2024-01-01T15:42:57
2023-10-19T23:21:18
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I noticed that open-interpreter utilizes litellm to communicate with LLMs. While litellm can use ollama as a backend to respond to prompts, I have been unable to find a way to use ollama within open-interpreter. Does anyone have any experience or knowledge regarding this?
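A minimal sketch of the litellm path the poster mentions, assuming a local ollama server on the default port; how open-interpreter exposes this wiring is exactly the open question here:

```python
from litellm import completion

# litellm routes "ollama/<model>" to a local ollama server; this shows only
# the litellm side of the wiring, with the default server address assumed.
response = completion(
    model="ollama/llama2",
    api_base="http://localhost:11434",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)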
{ "login": "wuyongyi", "id": 23444520, "node_id": "MDQ6VXNlcjIzNDQ0NTIw", "avatar_url": "https://avatars.githubusercontent.com/u/23444520?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wuyongyi", "html_url": "https://github.com/wuyongyi", "followers_url": "https://api.github.com/users/wuy...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/838/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/838/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1942
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1942/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1942/comments
https://api.github.com/repos/ollama/ollama/issues/1942/events
https://github.com/ollama/ollama/issues/1942
2,077,976,485
I_kwDOJ0Z1Ps5722el
1,942
There seems to be no way to query the ollama API with an already defined modelfile
{ "login": "Leopere", "id": 1068374, "node_id": "MDQ6VXNlcjEwNjgzNzQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1068374?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Leopere", "html_url": "https://github.com/Leopere", "followers_url": "https://api.github.com/users/Leopere/...
[]
closed
false
null
[]
null
5
2024-01-12T02:57:25
2024-01-15T05:35:22
2024-01-13T01:16:58
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
There seems to be no way to query the ollama API with an already defined modelfile
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1942/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1942/timeline
null
not_planned
false
https://api.github.com/repos/ollama/ollama/issues/5699
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5699/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5699/comments
https://api.github.com/repos/ollama/ollama/issues/5699/events
https://github.com/ollama/ollama/issues/5699
2,408,265,012
I_kwDOJ0Z1Ps6PizU0
5,699
Qwen/Qwen2-7B-Instruct
{ "login": "zh19990906", "id": 59323683, "node_id": "MDQ6VXNlcjU5MzIzNjgz", "avatar_url": "https://avatars.githubusercontent.com/u/59323683?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zh19990906", "html_url": "https://github.com/zh19990906", "followers_url": "https://api.github.com/use...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
1
2024-07-15T09:22:51
2024-07-15T10:01:30
2024-07-15T10:01:30
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Qwen/Qwen2-7B-Instruct
{ "login": "zh19990906", "id": 59323683, "node_id": "MDQ6VXNlcjU5MzIzNjgz", "avatar_url": "https://avatars.githubusercontent.com/u/59323683?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zh19990906", "html_url": "https://github.com/zh19990906", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5699/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5699/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8079
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8079/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8079/comments
https://api.github.com/repos/ollama/ollama/issues/8079/events
https://github.com/ollama/ollama/pull/8079
2,737,090,415
PR_kwDOJ0Z1Ps6FFPul
8,079
cmd: enable use of structured outputs
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
[]
open
false
null
[]
null
4
2024-12-12T23:52:51
2024-12-23T15:55:42
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/8079", "html_url": "https://github.com/ollama/ollama/pull/8079", "diff_url": "https://github.com/ollama/ollama/pull/8079.diff", "patch_url": "https://github.com/ollama/ollama/pull/8079.patch", "merged_at": null }
Some cleanup to enable structured outputs in the command line. Closes: https://github.com/ollama/ollama/pull/7973
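For context, a hedged sketch of the structured-outputs mechanism this PR surfaces in the CLI: the HTTP API's `format` field accepting a JSON schema. The model name is illustrative, and this shows the API side rather than the new CLI wiring itself.

```python
import requests

# "format" carrying a JSON schema constrains the model's output to that shape.
schema = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}
resp = requests.post("http://localhost:11434/api/chat", json={
    "model": "llama3.1",
    "stream": False,
    "messages": [{"role": "user", "content": "Invent a fictional person as JSON."}],
    "format": schema,
}).json()
print(resp["message"]["content"])  # JSON text conforming to the schema
```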
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8079/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8079/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1596
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1596/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1596/comments
https://api.github.com/repos/ollama/ollama/issues/1596/events
https://github.com/ollama/ollama/pull/1596
2,047,913,575
PR_kwDOJ0Z1Ps5iVIx9
1,596
First take at a community resources page of blogs, tutorials, videos
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.git...
[]
closed
false
null
[]
null
1
2023-12-19T04:46:48
2024-02-20T01:55:35
2024-02-20T01:55:35
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1596", "html_url": "https://github.com/ollama/ollama/pull/1596", "diff_url": "https://github.com/ollama/ollama/pull/1596.diff", "patch_url": "https://github.com/ollama/ollama/pull/1596.patch", "merged_at": null }
We need a community page in the docs for blogs, videos, and tutorials. Tools that use Ollama will still go on the front readme.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1596/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1596/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1441
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1441/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1441/comments
https://api.github.com/repos/ollama/ollama/issues/1441/events
https://github.com/ollama/ollama/issues/1441
2,033,356,163
I_kwDOJ0Z1Ps55Mo2D
1,441
Custom model asks itself questions and responds.
{ "login": "stephenwithav", "id": 54563, "node_id": "MDQ6VXNlcjU0NTYz", "avatar_url": "https://avatars.githubusercontent.com/u/54563?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stephenwithav", "html_url": "https://github.com/stephenwithav", "followers_url": "https://api.github.com/user...
[]
closed
false
null
[]
null
4
2023-12-08T21:53:50
2023-12-10T00:11:20
2023-12-10T00:05:54
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
The Modelfile: ``` FROM llama2:13b # set the temperature to 1 [higher is more creative, lower is more coherent] PARAMETER temperature 1 PARAMETER stop "### System" TEMPLATE """ {{- if .First }} ### System: You are an expert at {{ .Prompt }}. You will ask me a question, wait for my response, and then evalu...
{ "login": "stephenwithav", "id": 54563, "node_id": "MDQ6VXNlcjU0NTYz", "avatar_url": "https://avatars.githubusercontent.com/u/54563?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stephenwithav", "html_url": "https://github.com/stephenwithav", "followers_url": "https://api.github.com/user...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1441/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1441/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1896
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1896/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1896/comments
https://api.github.com/repos/ollama/ollama/issues/1896/events
https://github.com/ollama/ollama/pull/1896
2,074,430,143
PR_kwDOJ0Z1Ps5jsBmn
1,896
Increase minimum CUDA memory allocation overhead and fix minimum overhead for multi-gpu
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[]
closed
false
null
[]
null
1
2024-01-10T13:55:02
2024-01-11T00:08:52
2024-01-11T00:08:51
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1896", "html_url": "https://github.com/ollama/ollama/pull/1896", "diff_url": "https://github.com/ollama/ollama/pull/1896.diff", "patch_url": "https://github.com/ollama/ollama/pull/1896.patch", "merged_at": "2024-01-11T00:08:51" }
Fixes https://github.com/jmorganca/ollama/issues/1887
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1896/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1896/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/8599
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8599/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8599/comments
https://api.github.com/repos/ollama/ollama/issues/8599/events
https://github.com/ollama/ollama/issues/8599
2,811,958,419
I_kwDOJ0Z1Ps6nmxST
8,599
Error: an error was encountered while running the model: unexpected EOF (8x H100, deepseek-r1:671b)
{ "login": "jwatte", "id": 481909, "node_id": "MDQ6VXNlcjQ4MTkwOQ==", "avatar_url": "https://avatars.githubusercontent.com/u/481909?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jwatte", "html_url": "https://github.com/jwatte", "followers_url": "https://api.github.com/users/jwatte/follow...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
open
false
null
[]
null
5
2025-01-27T02:25:04
2025-01-29T18:11:15
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I'm using a server with 8xH100 GPUs, trying to run the deepseek-r1:671b model. This works for a fair bit, say about 1000-2000 generated tokens, and then it ends with: `Error: an error was encountered while running the model: unexpected EOF` I don't quite know how to debug this -- is there a way...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8599/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8599/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/710
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/710/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/710/comments
https://api.github.com/repos/ollama/ollama/issues/710/events
https://github.com/ollama/ollama/pull/710
1,928,659,054
PR_kwDOJ0Z1Ps5cB7au
710
Update llama.cpp gguf to latest
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[]
closed
false
null
[]
null
2
2023-10-05T16:22:46
2023-10-17T20:55:18
2023-10-17T20:55:17
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/710", "html_url": "https://github.com/ollama/ollama/pull/710", "diff_url": "https://github.com/ollama/ollama/pull/710.diff", "patch_url": "https://github.com/ollama/ollama/pull/710.patch", "merged_at": "2023-10-17T20:55:17" }
- Update 0001-remove-warm-up-logging.patch. There have been some bug fixes and improvements upstream; this updates the llama.cpp gguf runner to the latest to get these into our next release.
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/710/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/710/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/552
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/552/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/552/comments
https://api.github.com/repos/ollama/ollama/issues/552/events
https://github.com/ollama/ollama/pull/552
1,901,787,330
PR_kwDOJ0Z1Ps5anWjj
552
Docker Cuda File update & Documentation Addition.
{ "login": "thekevshow", "id": 1961133, "node_id": "MDQ6VXNlcjE5NjExMzM=", "avatar_url": "https://avatars.githubusercontent.com/u/1961133?v=4", "gravatar_id": "", "url": "https://api.github.com/users/thekevshow", "html_url": "https://github.com/thekevshow", "followers_url": "https://api.github.com/users...
[]
closed
false
null
[]
null
2
2023-09-18T21:36:52
2023-10-24T23:13:28
2023-10-24T23:13:27
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/552", "html_url": "https://github.com/ollama/ollama/pull/552", "diff_url": "https://github.com/ollama/ollama/pull/552.diff", "patch_url": "https://github.com/ollama/ollama/pull/552.patch", "merged_at": null }
Adds the ability to have CUDA work on Docker with the provided Ubuntu image, along with a docker.md documenting commands around Docker usage.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/552/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/552/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7257
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7257/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7257/comments
https://api.github.com/repos/ollama/ollama/issues/7257/events
https://github.com/ollama/ollama/issues/7257
2,598,045,173
I_kwDOJ0Z1Ps6a2wX1
7,257
Return Triggered Stop Sequence
{ "login": "someone13574", "id": 81528246, "node_id": "MDQ6VXNlcjgxNTI4MjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/81528246?v=4", "gravatar_id": "", "url": "https://api.github.com/users/someone13574", "html_url": "https://github.com/someone13574", "followers_url": "https://api.github.c...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 7706482389, "node_id": ...
open
false
null
[]
null
1
2024-10-18T17:31:38
2024-12-18T08:01:22
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
It would be extremely useful if the API response contained *which* stop sequence was triggered when multiple are listed. For example, if you have the model's default stop sequence and a custom one that you want to trigger an action, you currently need to carefully choose the text leading up to it so you can determ...
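To make the gap concrete, a sketch of today's behavior (endpoint and fields as in the public API; both stop strings below are made up for illustration):

```python
import requests

# The response's "done_reason" reports only "stop", so when several stop
# sequences are configured you cannot tell which one actually fired.
resp = requests.post("http://localhost:11434/api/generate", json={
    "model": "llama3",
    "prompt": "Answer, then emit one of the stop markers.",
    "stream": False,
    "options": {"stop": ["</answer>", "### ACTION"]},
}).json()
print(resp.get("done_reason"))  # "stop" regardless of which sequence matched
```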
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7257/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7257/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/2361
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2361/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2361/comments
https://api.github.com/repos/ollama/ollama/issues/2361/events
https://github.com/ollama/ollama/issues/2361
2,118,149,065
I_kwDOJ0Z1Ps5-QGPJ
2,361
Linux installer default path
{ "login": "arabek", "id": 2504890, "node_id": "MDQ6VXNlcjI1MDQ4OTA=", "avatar_url": "https://avatars.githubusercontent.com/u/2504890?v=4", "gravatar_id": "", "url": "https://api.github.com/users/arabek", "html_url": "https://github.com/arabek", "followers_url": "https://api.github.com/users/arabek/foll...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5755339642, "node_id": ...
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
0
2024-02-05T10:11:29
2024-08-19T18:14:25
2024-08-19T18:14:25
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
For some reason, the author decided that it'd be good to install to `/usr/local/bin`, `/usr/bin`, or even `/bin`, system-wide. ``` for BINDIR in /usr/local/bin /usr/bin /bin; do echo $PATH | grep -q $BINDIR && break || continue done status "Installing ollama to $BINDIR..." ``` And then there's the syst...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2361/reactions", "total_count": 7, "+1": 7, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2361/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/765
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/765/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/765/comments
https://api.github.com/repos/ollama/ollama/issues/765/events
https://github.com/ollama/ollama/issues/765
1,939,755,292
I_kwDOJ0Z1Ps5znlEc
765
How to run custom fine-tuned llama2 model into ollama?
{ "login": "aswinjose89", "id": 6614386, "node_id": "MDQ6VXNlcjY2MTQzODY=", "avatar_url": "https://avatars.githubusercontent.com/u/6614386?v=4", "gravatar_id": "", "url": "https://api.github.com/users/aswinjose89", "html_url": "https://github.com/aswinjose89", "followers_url": "https://api.github.com/us...
[]
closed
false
null
[]
null
4
2023-10-12T11:09:10
2023-12-04T20:11:58
2023-12-04T20:11:58
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.git...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/765/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/765/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1602
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1602/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1602/comments
https://api.github.com/repos/ollama/ollama/issues/1602/events
https://github.com/ollama/ollama/issues/1602
2,048,322,335
I_kwDOJ0Z1Ps56Fusf
1,602
ML research
{ "login": "lihourchhin", "id": 22294314, "node_id": "MDQ6VXNlcjIyMjk0MzE0", "avatar_url": "https://avatars.githubusercontent.com/u/22294314?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lihourchhin", "html_url": "https://github.com/lihourchhin", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
1
2023-12-19T10:04:09
2023-12-23T01:46:38
2023-12-23T01:46:38
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "lihourchhin", "id": 22294314, "node_id": "MDQ6VXNlcjIyMjk0MzE0", "avatar_url": "https://avatars.githubusercontent.com/u/22294314?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lihourchhin", "html_url": "https://github.com/lihourchhin", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1602/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1602/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8062
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8062/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8062/comments
https://api.github.com/repos/ollama/ollama/issues/8062/events
https://github.com/ollama/ollama/issues/8062
2,734,792,165
I_kwDOJ0Z1Ps6jAZ3l
8,062
llama3.1 tool calling issue with role 'system'
{ "login": "Miaozxje", "id": 72405743, "node_id": "MDQ6VXNlcjcyNDA1NzQz", "avatar_url": "https://avatars.githubusercontent.com/u/72405743?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Miaozxje", "html_url": "https://github.com/Miaozxje", "followers_url": "https://api.github.com/users/Mia...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
[ { "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "htt...
null
5
2024-12-12T04:47:00
2025-01-15T18:51:30
2025-01-15T18:51:30
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? The 'system' role is not giving the right behaviour based on what it has been defined with. It automatically generates hallucinated params that invoke a function although the instruction said not to. e.g.: `curl --location 'http://localhost:11434/api/chat' \ --header 'Content-Type: application/...
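Since the curl is truncated, here is a hedged reconstruction of the request shape being described; the tool definition is hypothetical and exists only to illustrate the report:

```python
import requests

# Hypothetical tool; only the request shape matters for reproducing the report.
resp = requests.post("http://localhost:11434/api/chat", json={
    "model": "llama3.1",
    "stream": False,
    "messages": [
        {"role": "system", "content": "Only call a tool when the user explicitly asks for it."},
        {"role": "user", "content": "Hello"},  # no tool should be called here
    ],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a city",
            "parameters": {"type": "object", "properties": {"city": {"type": "string"}}},
        },
    }],
}).json()
print(resp["message"].get("tool_calls"))  # reportedly populated despite the instruction
```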
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8062/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8062/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3824
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3824/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3824/comments
https://api.github.com/repos/ollama/ollama/issues/3824/events
https://github.com/ollama/ollama/issues/3824
2,256,695,329
I_kwDOJ0Z1Ps6GgnAh
3,824
Server error when submitting a request through OpenAI client
{ "login": "mishushakov", "id": 10400064, "node_id": "MDQ6VXNlcjEwNDAwMDY0", "avatar_url": "https://avatars.githubusercontent.com/u/10400064?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mishushakov", "html_url": "https://github.com/mishushakov", "followers_url": "https://api.github.com/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
7
2024-04-22T14:30:26
2024-05-06T09:43:28
2024-04-22T21:58:11
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Using Vercel AI SDK and llama2 Request: ```ts const content = [ { type: 'text', text: prompt }, { type: 'text', text: page.content }, ] const result = await experimental_generateObject({ model, schema, messages: [{ role: 'user', content }], temperature, ...
{ "login": "mishushakov", "id": 10400064, "node_id": "MDQ6VXNlcjEwNDAwMDY0", "avatar_url": "https://avatars.githubusercontent.com/u/10400064?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mishushakov", "html_url": "https://github.com/mishushakov", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3824/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3824/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7063
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7063/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7063/comments
https://api.github.com/repos/ollama/ollama/issues/7063/events
https://github.com/ollama/ollama/issues/7063
2,559,384,211
I_kwDOJ0Z1Ps6YjRqT
7,063
Support setting `num_ctx` in openai api via extra query parameter
{ "login": "fzyzcjy", "id": 5236035, "node_id": "MDQ6VXNlcjUyMzYwMzU=", "avatar_url": "https://avatars.githubusercontent.com/u/5236035?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fzyzcjy", "html_url": "https://github.com/fzyzcjy", "followers_url": "https://api.github.com/users/fzyzcjy/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 7706482389, "node_id": ...
open
false
null
[]
null
2
2024-10-01T14:15:06
2024-11-21T04:04:53
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi, thanks for the package! It would be great if num_ctx could be set via the OpenAI API. The OpenAI API seems to allow extra request parameters, and other packages like vLLM make use of them to support custom args.
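For reference, the client-side shape this implies: the OpenAI Python client can attach arbitrary fields via `extra_body` (an `extra_query` also exists). Whether ollama's OpenAI-compatible endpoint honors a `num_ctx` sent this way is precisely what is being requested, so treat this as a sketch, not working behavior:

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")
# extra_body attaches arbitrary JSON fields to the request; whether the server
# reads num_ctx from here is the open feature request, not confirmed behavior.
resp = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Hi"}],
    extra_body={"num_ctx": 8192},
)
print(resp.choices[0].message.content)
```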
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7063/reactions", "total_count": 7, "+1": 7, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7063/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/4364
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4364/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4364/comments
https://api.github.com/repos/ollama/ollama/issues/4364/events
https://github.com/ollama/ollama/issues/4364
2,290,951,173
I_kwDOJ0Z1Ps6IjSQF
4,364
support for deepseek v2
{ "login": "olumolu", "id": 162728301, "node_id": "U_kgDOCbMJbQ", "avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4", "gravatar_id": "", "url": "https://api.github.com/users/olumolu", "html_url": "https://github.com/olumolu", "followers_url": "https://api.github.com/users/olumolu/foll...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
2
2024-05-11T15:05:08
2024-06-11T22:12:35
2024-06-11T22:12:35
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://huggingface.co/deepseek-ai/DeepSeek-V2 Support for DeepSeek V2.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4364/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4364/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8521
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8521/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8521/comments
https://api.github.com/repos/ollama/ollama/issues/8521/events
https://github.com/ollama/ollama/pull/8521
2,802,865,685
PR_kwDOJ0Z1Ps6IjElV
8,521
Blackwell codegen Support
{ "login": "johnnynunez", "id": 22727137, "node_id": "MDQ6VXNlcjIyNzI3MTM3", "avatar_url": "https://avatars.githubusercontent.com/u/22727137?v=4", "gravatar_id": "", "url": "https://api.github.com/users/johnnynunez", "html_url": "https://github.com/johnnynunez", "followers_url": "https://api.github.com/...
[]
open
false
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[ { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.g...
null
4
2025-01-21T21:38:54
2025-01-29T13:36:26
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/8521", "html_url": "https://github.com/ollama/ollama/pull/8521", "diff_url": "https://github.com/ollama/ollama/pull/8521.diff", "patch_url": "https://github.com/ollama/ollama/pull/8521.patch", "merged_at": null }
10.0: Blackwell B100/B200; 12.0: Blackwell RTX 50 series
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8521/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8521/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4208
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4208/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4208/comments
https://api.github.com/repos/ollama/ollama/issues/4208/events
https://github.com/ollama/ollama/pull/4208
2,281,790,854
PR_kwDOJ0Z1Ps5ur3lA
4,208
Fix stale test logic
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[]
closed
false
null
[]
null
0
2024-05-06T21:16:57
2024-05-06T21:23:15
2024-05-06T21:23:12
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4208", "html_url": "https://github.com/ollama/ollama/pull/4208", "diff_url": "https://github.com/ollama/ollama/pull/4208.diff", "patch_url": "https://github.com/ollama/ollama/pull/4208.patch", "merged_at": "2024-05-06T21:23:12" }
The model processing was recently changed to be deferred, but this test scenario hadn't been adjusted for that change in behavior.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4208/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4208/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2654
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2654/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2654/comments
https://api.github.com/repos/ollama/ollama/issues/2654/events
https://github.com/ollama/ollama/pull/2654
2,147,753,449
PR_kwDOJ0Z1Ps5nk8hW
2,654
reset with `init_vars` ahead of each cpu build in `gen_windows.ps1`
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[]
closed
false
null
[]
null
0
2024-02-21T21:34:47
2024-02-21T21:35:35
2024-02-21T21:35:34
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2654", "html_url": "https://github.com/ollama/ollama/pull/2654", "diff_url": "https://github.com/ollama/ollama/pull/2654.diff", "patch_url": "https://github.com/ollama/ollama/pull/2654.patch", "merged_at": "2024-02-21T21:35:34" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2654/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2654/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3302
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3302/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3302/comments
https://api.github.com/repos/ollama/ollama/issues/3302/events
https://github.com/ollama/ollama/pull/3302
2,203,359,647
PR_kwDOJ0Z1Ps5qiOu2
3,302
Fix Execution Error when /tmp is mounted with noexec flag for Issue #2436
{ "login": "jshbmllr", "id": 27757825, "node_id": "MDQ6VXNlcjI3NzU3ODI1", "avatar_url": "https://avatars.githubusercontent.com/u/27757825?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jshbmllr", "html_url": "https://github.com/jshbmllr", "followers_url": "https://api.github.com/users/jsh...
[]
closed
false
null
[]
null
2
2024-03-22T21:31:01
2024-11-21T16:23:42
2024-11-21T16:23:42
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3302", "html_url": "https://github.com/ollama/ollama/pull/3302", "diff_url": "https://github.com/ollama/ollama/pull/3302.diff", "patch_url": "https://github.com/ollama/ollama/pull/3302.patch", "merged_at": null }
In relation to https://github.com/ollama/ollama/issues/2436, which remains unresolved, this pull request introduces a fix similar to the one in https://github.com/ollama/ollama/pull/2403. The issue arises on Linux systems where the /tmp directory is mounted with the noexec flag, preventing the execution of libraries an...
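A commonly suggested workaround sketch for the same symptom, assuming the `OLLAMA_TMPDIR` environment variable is honored for runner extraction (check your version; the path below is illustrative):

```python
import os
import subprocess

# OLLAMA_TMPDIR is assumed to control where runner libraries are extracted;
# pointing it at an exec-allowed filesystem sidesteps a noexec-mounted /tmp.
tmpdir = "/var/lib/ollama/tmp"  # illustrative path
os.makedirs(tmpdir, exist_ok=True)
subprocess.run(["ollama", "serve"], env=dict(os.environ, OLLAMA_TMPDIR=tmpdir), check=True)
```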
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3302/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3302/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4389
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4389/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4389/comments
https://api.github.com/repos/ollama/ollama/issues/4389/events
https://github.com/ollama/ollama/issues/4389
2,291,856,245
I_kwDOJ0Z1Ps6ImvN1
4,389
Can we add whisper to ollama?
{ "login": "JenuelDev", "id": 31676163, "node_id": "MDQ6VXNlcjMxNjc2MTYz", "avatar_url": "https://avatars.githubusercontent.com/u/31676163?v=4", "gravatar_id": "", "url": "https://api.github.com/users/JenuelDev", "html_url": "https://github.com/JenuelDev", "followers_url": "https://api.github.com/users/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
1
2024-05-13T05:28:23
2024-05-13T06:20:35
2024-05-13T06:20:35
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Is it possible to add Whisper to ollama?
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4389/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4389/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8081
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8081/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8081/comments
https://api.github.com/repos/ollama/ollama/issues/8081/events
https://github.com/ollama/ollama/issues/8081
2,737,459,420
I_kwDOJ0Z1Ps6jKlDc
8,081
Android: new Go toolchain not available
{ "login": "fxmbsw7", "id": 39368685, "node_id": "MDQ6VXNlcjM5MzY4Njg1", "avatar_url": "https://avatars.githubusercontent.com/u/39368685?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fxmbsw7", "html_url": "https://github.com/fxmbsw7", "followers_url": "https://api.github.com/users/fxmbsw...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
10
2024-12-13T05:44:24
2025-01-17T21:25:02
2024-12-20T22:07:44
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Maybe a new version comes soon. go: downloading go1.23.4 (android/arm64) go: download go1.23.4 for android/arm64: toolchain not available ### OS Linux ### GPU Other ### CPU Other ### Ollama version 0 git today
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8081/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8081/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4359
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4359/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4359/comments
https://api.github.com/repos/ollama/ollama/issues/4359/events
https://github.com/ollama/ollama/issues/4359
2,290,877,903
I_kwDOJ0Z1Ps6IjAXP
4,359
Mistral is not using GPU, but Llama3 is utilizing GPU properly
{ "login": "itinance", "id": 1758597, "node_id": "MDQ6VXNlcjE3NTg1OTc=", "avatar_url": "https://avatars.githubusercontent.com/u/1758597?v=4", "gravatar_id": "", "url": "https://api.github.com/users/itinance", "html_url": "https://github.com/itinance", "followers_url": "https://api.github.com/users/itina...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6430601766, "node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg...
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
3
2024-05-11T12:34:53
2024-08-22T09:18:01
2024-06-02T00:19:27
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? When running `mistral:latest` or `stablelm2:latest`, ollama is not utilizing the GPU on Ubuntu with an NVIDIA graphics card. Running the llama3 70b model uses the GPU very well. Command **nvidia-smi** on `ollama run mistral:latest`: ``` +--------------------------------------------------------------...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4359/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4359/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/362
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/362/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/362/comments
https://api.github.com/repos/ollama/ollama/issues/362/events
https://github.com/ollama/ollama/issues/362
1,853,901,253
I_kwDOJ0Z1Ps5ugEnF
362
How to get (log) conditional probability of next word given a context in Ollama?
{ "login": "HeningWang", "id": 62840739, "node_id": "MDQ6VXNlcjYyODQwNzM5", "avatar_url": "https://avatars.githubusercontent.com/u/62840739?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HeningWang", "html_url": "https://github.com/HeningWang", "followers_url": "https://api.github.com/use...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
3
2023-08-16T21:01:57
2024-02-20T00:51:59
2024-02-20T00:51:59
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi, I'm new to Ollama. I'd like to get the (log) conditional probability of the next word given a context, as with other LLMs. I cannot find this usage in the tutorial or API. I'm thankful if anybody can help me with that. Sorry if this question is too basic or not appropriate for an issue. Best
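ollama did not expose token log-probabilities when this was filed (the issue was closed without the feature). As a plainly labeled substitute, a sketch of computing log P(next token | context) with Hugging Face transformers instead:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

context = "The capital of France is"
ids = tok(context, return_tensors="pt").input_ids
with torch.no_grad():
    logits = model(ids).logits[0, -1]        # distribution over the next token
logprobs = torch.log_softmax(logits, dim=-1)
word_id = tok(" Paris").input_ids[0]         # " Paris" is a single GPT-2 token
print(logprobs[word_id].item())              # log P(" Paris" | context)
```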
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/362/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/362/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7365
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7365/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7365/comments
https://api.github.com/repos/ollama/ollama/issues/7365/events
https://github.com/ollama/ollama/issues/7365
2,614,984,824
I_kwDOJ0Z1Ps6b3YB4
7,365
Unable to pull IQ4_NL quants from HF
{ "login": "Mushoz", "id": 18422243, "node_id": "MDQ6VXNlcjE4NDIyMjQz", "avatar_url": "https://avatars.githubusercontent.com/u/18422243?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Mushoz", "html_url": "https://github.com/Mushoz", "followers_url": "https://api.github.com/users/Mushoz/fo...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
4
2024-10-25T19:51:52
2024-10-26T12:38:20
2024-10-25T21:02:44
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Pulling existing IQ4_NL quants from HF seems to fail: ollama run hf.co/bartowski/Replete-LLM-V2.5-Qwen-32b-GGUF:IQ4_NL pulling manifest Error: pull model manifest: 400: The specified tag is not a valid quantization scheme. Please use another tag or "latest" Any other quant does work fine...
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7365/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7365/timeline
null
not_planned
false
https://api.github.com/repos/ollama/ollama/issues/8365
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8365/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8365/comments
https://api.github.com/repos/ollama/ollama/issues/8365/events
https://github.com/ollama/ollama/issues/8365
2,778,091,842
I_kwDOJ0Z1Ps6lllFC
8,365
When I use multiple GPUs, utilization is very low. How can I configure it to maximize GPU utilization and reduce inference time?
{ "login": "RoRui", "id": 95675024, "node_id": "U_kgDOBbPikA", "avatar_url": "https://avatars.githubusercontent.com/u/95675024?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RoRui", "html_url": "https://github.com/RoRui", "followers_url": "https://api.github.com/users/RoRui/followers", ...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
4
2025-01-09T15:10:55
2025-01-24T09:42:35
2025-01-24T09:42:35
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
The graphics card I am using is a Tesla M60 16G, and the model I am using is qwen2.5:14b. When I use only one GPU core, GPU utilization can reach 100%. Writing an 800-word article takes about 50 seconds. Then I configured the environment variable OLLAMA_SCHED_SPREAD=1 and used 2 GPU cores, and the util...
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8365/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8365/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5114
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5114/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5114/comments
https://api.github.com/repos/ollama/ollama/issues/5114/events
https://github.com/ollama/ollama/issues/5114
2,359,727,198
I_kwDOJ0Z1Ps6MppRe
5,114
Ollama not loading on GPU with Docker on latest version, but works on 0.1.31, which doesn't have multi-user concurrency
{ "login": "bluenevus", "id": 3675043, "node_id": "MDQ6VXNlcjM2NzUwNDM=", "avatar_url": "https://avatars.githubusercontent.com/u/3675043?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bluenevus", "html_url": "https://github.com/bluenevus", "followers_url": "https://api.github.com/users/bl...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
4
2024-06-18T12:18:59
2024-06-19T01:28:11
2024-06-19T01:28:11
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?
Absolutely excited to see you have multi-user concurrency. I set up Ollama on Docker with 8 GPUs. I could get 2 models to run on GPU, each with its own container: Llava and Llamaguard2. No other models would load onto a GPU even when the other GPUs were idle. I tried --gpus= 2, I tried...
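For reference, a minimal multi-GPU Docker invocation, assuming the official image and the NVIDIA container toolkit are installed; `--gpus all` exposes every GPU to one container rather than pinning models to separate containers:

```console
$ docker run -d --gpus all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
$ docker exec ollama nvidia-smi
```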
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5114/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5114/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2486
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2486/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2486/comments
https://api.github.com/repos/ollama/ollama/issues/2486/events
https://github.com/ollama/ollama/pull/2486
2,133,449,776
PR_kwDOJ0Z1Ps5mz87-
2,486
self extend support
{ "login": "cognitivetech", "id": 55156785, "node_id": "MDQ6VXNlcjU1MTU2Nzg1", "avatar_url": "https://avatars.githubusercontent.com/u/55156785?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cognitivetech", "html_url": "https://github.com/cognitivetech", "followers_url": "https://api.githu...
[]
closed
false
null
[]
null
5
2024-02-14T02:43:21
2024-08-11T01:35:06
2024-08-11T01:35:05
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2486", "html_url": "https://github.com/ollama/ollama/pull/2486", "diff_url": "https://github.com/ollama/ollama/pull/2486.diff", "patch_url": "https://github.com/ollama/ollama/pull/2486.patch", "merged_at": null }
Trying to add support for self-extend as discussed here: https://github.com/ollama/ollama/issues/1964. I was hoping it would be as simple as adding these parameters, as I've seen done in a previous commit, but I was copying moves from an older configuration of the source. Obviously I'm missing something... probably ...
{ "login": "cognitivetech", "id": 55156785, "node_id": "MDQ6VXNlcjU1MTU2Nzg1", "avatar_url": "https://avatars.githubusercontent.com/u/55156785?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cognitivetech", "html_url": "https://github.com/cognitivetech", "followers_url": "https://api.githu...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2486/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2486/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6705
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6705/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6705/comments
https://api.github.com/repos/ollama/ollama/issues/6705/events
https://github.com/ollama/ollama/issues/6705
2,512,765,259
I_kwDOJ0Z1Ps6VxcFL
6,705
what the heck is this??
{ "login": "isriam", "id": 9697950, "node_id": "MDQ6VXNlcjk2OTc5NTA=", "avatar_url": "https://avatars.githubusercontent.com/u/9697950?v=4", "gravatar_id": "", "url": "https://api.github.com/users/isriam", "html_url": "https://github.com/isriam", "followers_url": "https://api.github.com/users/isriam/foll...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-09-09T02:34:07
2024-09-09T02:46:07
2024-09-09T02:43:12
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?
I just downloaded and ran llama3.1:8b-text-q6_K and got the following response from a "hello". llama3.1:8b-text-q6_K , my name is kylie and i am 12 years old. I have been through a lot of hard times in the past few months. I was bullied for being who i am, i lost my best friend, because she ...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6705/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6705/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7913
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7913/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7913/comments
https://api.github.com/repos/ollama/ollama/issues/7913/events
https://github.com/ollama/ollama/pull/7913
2,713,652,061
PR_kwDOJ0Z1Ps6D0dID
7,913
wip: next ollama runner
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[]
open
false
null
[]
null
1
2024-12-03T00:04:27
2025-01-29T23:08:44
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
true
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7913", "html_url": "https://github.com/ollama/ollama/pull/7913", "diff_url": "https://github.com/ollama/ollama/pull/7913.diff", "patch_url": "https://github.com/ollama/ollama/pull/7913.patch", "merged_at": null }
implement llama and mllama model architectures in go using ggml (through cgo)

```console
$ go run model/cmd/main.go [-cache] [-n 2048] path/to/model <path/to/prompt
```
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7913/reactions", "total_count": 9, "+1": 1, "-1": 0, "laugh": 1, "hooray": 2, "confused": 0, "heart": 2, "rocket": 2, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/7913/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7375
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7375/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7375/comments
https://api.github.com/repos/ollama/ollama/issues/7375/events
https://github.com/ollama/ollama/issues/7375
2,616,101,638
I_kwDOJ0Z1Ps6b7osG
7,375
use arm64 extensions ?
{ "login": "fxmbsw7", "id": 39368685, "node_id": "MDQ6VXNlcjM5MzY4Njg1", "avatar_url": "https://avatars.githubusercontent.com/u/39368685?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fxmbsw7", "html_url": "https://github.com/fxmbsw7", "followers_url": "https://api.github.com/users/fxmbsw...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
9
2024-10-26T19:34:44
2024-11-05T22:31:15
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Like NEON and other arm64 extensions? Greets.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7375/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7375/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6867
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6867/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6867/comments
https://api.github.com/repos/ollama/ollama/issues/6867/events
https://github.com/ollama/ollama/issues/6867
2,535,154,576
I_kwDOJ0Z1Ps6XG2OQ
6,867
ollama import does not work
{ "login": "CPLACKY", "id": 174754354, "node_id": "U_kgDOCmqKMg", "avatar_url": "https://avatars.githubusercontent.com/u/174754354?v=4", "gravatar_id": "", "url": "https://api.github.com/users/CPLACKY", "html_url": "https://github.com/CPLACKY", "followers_url": "https://api.github.com/users/CPLACKY/foll...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-09-19T02:46:39
2024-12-02T22:58:20
2024-12-02T22:58:20
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?
<img width="806" alt="1" src="https://github.com/user-attachments/assets/646b7aca-dbcd-4227-bfef-4a1d54c7a912">
<img width="861" alt="ollama" src="https://github.com/user-attachments/assets/9be387e9-2d7e-4ec0-8118-3d174b998a7d">
Doesn't work T_T. How can I fix this?
### OS
_No response_
##...
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6867/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6867/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4920
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4920/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4920/comments
https://api.github.com/repos/ollama/ollama/issues/4920/events
https://github.com/ollama/ollama/issues/4920
2,341,278,402
I_kwDOJ0Z1Ps6LjRLC
4,920
Update docs/tutorials/windows.md for Windows Uninstall
{ "login": "Suvoo", "id": 52796258, "node_id": "MDQ6VXNlcjUyNzk2MjU4", "avatar_url": "https://avatars.githubusercontent.com/u/52796258?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Suvoo", "html_url": "https://github.com/Suvoo", "followers_url": "https://api.github.com/users/Suvoo/follow...
[ { "id": 5667396191, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw", "url": "https://api.github.com/repos/ollama/ollama/labels/documentation", "name": "documentation", "color": "0075ca", "default": true, "description": "Improvements or additions to documentation" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
0
2024-06-07T23:17:39
2024-09-05T22:57:39
2024-09-05T22:57:39
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
The [Linux Tutorial](https://github.com/ollama/ollama/blob/main/docs/linux.md) has instructions for uninstalling Ollama from the system. Request to add similar instructions for the [Windows Tutorial](https://github.com/ollama/ollama/blob/main/docs/windows.md).

## Uninstall

remove ollama : open up powershell as adm...
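A sketch of what such a section could contain, assuming the default installer locations and process names (not an official uninstall procedure):

```powershell
# Stop the tray app and the server first (process names assumed from the default install)
Stop-Process -Name "ollama app","ollama" -ErrorAction SilentlyContinue
# Remove the installed program and the downloaded models
Remove-Item -Recurse -Force "$env:LOCALAPPDATA\Programs\Ollama"
Remove-Item -Recurse -Force "$env:USERPROFILE\.ollama"
```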
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4920/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4920/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5279
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5279/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5279/comments
https://api.github.com/repos/ollama/ollama/issues/5279/events
https://github.com/ollama/ollama/pull/5279
2,373,361,557
PR_kwDOJ0Z1Ps5ziKdE
5,279
use timestamp from challenge, fallback to local time
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[]
closed
false
null
[]
null
0
2024-06-25T18:25:23
2025-01-29T19:20:12
2025-01-29T19:20:12
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5279", "html_url": "https://github.com/ollama/ollama/pull/5279", "diff_url": "https://github.com/ollama/ollama/pull/5279.diff", "patch_url": "https://github.com/ollama/ollama/pull/5279.patch", "merged_at": null }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5279/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5279/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/679
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/679/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/679/comments
https://api.github.com/repos/ollama/ollama/issues/679/events
https://github.com/ollama/ollama/pull/679
1,922,697,753
PR_kwDOJ0Z1Ps5btt7P
679
`Modelfile` syntax highlighting
{ "login": "jamesbraza", "id": 8990777, "node_id": "MDQ6VXNlcjg5OTA3Nzc=", "avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jamesbraza", "html_url": "https://github.com/jamesbraza", "followers_url": "https://api.github.com/users...
[]
closed
false
null
[]
null
1
2023-10-02T20:56:26
2023-10-06T20:00:17
2023-10-06T19:59:45
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/679", "html_url": "https://github.com/ollama/ollama/pull/679", "diff_url": "https://github.com/ollama/ollama/pull/679.diff", "patch_url": "https://github.com/ollama/ollama/pull/679.patch", "merged_at": "2023-10-06T19:59:45" }
Pertains to https://github.com/jmorganca/ollama/issues/649:
- Highlighted `Modelfile` in `modelfile.md`
- Made it clear the name can be lowercase
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/679/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/679/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6960
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6960/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6960/comments
https://api.github.com/repos/ollama/ollama/issues/6960/events
https://github.com/ollama/ollama/issues/6960
2,548,676,876
I_kwDOJ0Z1Ps6X6bkM
6,960
Please add support for Molmo-7B, new SOTA multimodal model from Allen AI
{ "login": "robert-mcdermott", "id": 7399563, "node_id": "MDQ6VXNlcjczOTk1NjM=", "avatar_url": "https://avatars.githubusercontent.com/u/7399563?v=4", "gravatar_id": "", "url": "https://api.github.com/users/robert-mcdermott", "html_url": "https://github.com/robert-mcdermott", "followers_url": "https://ap...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
6
2024-09-25T18:12:05
2024-11-13T09:17:20
2024-11-13T09:17:19
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://huggingface.co/allenai/Molmo-7B-O-0924
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6960/reactions", "total_count": 17, "+1": 17, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6960/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8626
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8626/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8626/comments
https://api.github.com/repos/ollama/ollama/issues/8626/events
https://github.com/ollama/ollama/issues/8626
2,814,742,635
I_kwDOJ0Z1Ps6nxZBr
8,626
Tool Support for vision models
{ "login": "abdarwish23", "id": 135508500, "node_id": "U_kgDOCBOyFA", "avatar_url": "https://avatars.githubusercontent.com/u/135508500?v=4", "gravatar_id": "", "url": "https://api.github.com/users/abdarwish23", "html_url": "https://github.com/abdarwish23", "followers_url": "https://api.github.com/users/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
2
2025-01-28T06:22:42
2025-01-28T07:05:05
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Ollama vision models do not support tool calling. This is very important, especially for browser-use or computer-use use cases. Can we have tool support for vision models? ...
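For context, a sketch of the kind of request this would enable, reusing the documented `tools` field of /api/chat; the model name and tool definition here are hypothetical, and the report above is that vision models currently do not act on them:

```console
$ curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2-vision",
  "messages": [{"role": "user", "content": "Click the search box in this screenshot"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "click",
      "description": "Click at screen coordinates",
      "parameters": {
        "type": "object",
        "properties": {"x": {"type": "number"}, "y": {"type": "number"}},
        "required": ["x", "y"]
      }
    }
  }]
}'
```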
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8626/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8626/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/4602
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4602/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4602/comments
https://api.github.com/repos/ollama/ollama/issues/4602/events
https://github.com/ollama/ollama/issues/4602
2,314,224,724
I_kwDOJ0Z1Ps6J8ERU
4,602
Ollama pull model failed.
{ "login": "HougeLangley", "id": 1161594, "node_id": "MDQ6VXNlcjExNjE1OTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HougeLangley", "html_url": "https://github.com/HougeLangley", "followers_url": "https://api.github.com...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-05-24T02:53:25
2024-05-24T03:54:15
2024-05-24T03:54:15
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?
I have tried different network environments, including using and not using proxies. The same problem still exists in all these attempts.

```
╭─hougelangley at Arch-Legion in ~ 24-05-24 - 10:47:25
╰─○ ollama pull aya:8b-23-q6_K
pulling manifest
pulling a3c15cdb0e0f... 98% ▕██████████████...
```
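Two usual first steps, sketched under the assumption of a default install: pulls are resumable, so rerunning the same command continues from the failed layer, and if a proxy is in play it must be set on the server process (which performs the download), not the client:

```console
$ ollama pull aya:8b-23-q6_K
# if behind a proxy (address here is hypothetical), set it where the server runs
$ HTTPS_PROXY=http://proxy.example.com:8080 ollama serve
```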
{ "login": "HougeLangley", "id": 1161594, "node_id": "MDQ6VXNlcjExNjE1OTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HougeLangley", "html_url": "https://github.com/HougeLangley", "followers_url": "https://api.github.com...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4602/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4602/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4815
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4815/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4815/comments
https://api.github.com/repos/ollama/ollama/issues/4815/events
https://github.com/ollama/ollama/issues/4815
2,333,601,183
I_kwDOJ0Z1Ps6LF-2f
4,815
Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: connectex: No connections could be made because the target machine actively refused them.
{ "login": "aledepaulaaa", "id": 88629170, "node_id": "MDQ6VXNlcjg4NjI5MTcw", "avatar_url": "https://avatars.githubusercontent.com/u/88629170?v=4", "gravatar_id": "", "url": "https://api.github.com/users/aledepaulaaa", "html_url": "https://github.com/aledepaulaaa", "followers_url": "https://api.github.c...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5860134234, "node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg...
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
1
2024-06-04T13:50:53
2024-09-26T02:39:02
2024-09-26T02:39:02
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?
Good morning, does anyone know what this problem is? I have already installed and downloaded the model correctly. I'm using phi3, but when running I get this error. I tried running other models and the error is the same. Did the service go down by any chance?
![erro_ollama](https://github.com/...
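That connectex error generally means nothing is listening on the port, i.e. the server isn't running. A minimal check, assuming the default address:

```console
$ ollama serve
# in another terminal
$ curl http://127.0.0.1:11434/
Ollama is running
```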
{ "login": "aledepaulaaa", "id": 88629170, "node_id": "MDQ6VXNlcjg4NjI5MTcw", "avatar_url": "https://avatars.githubusercontent.com/u/88629170?v=4", "gravatar_id": "", "url": "https://api.github.com/users/aledepaulaaa", "html_url": "https://github.com/aledepaulaaa", "followers_url": "https://api.github.c...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4815/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4815/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2026
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2026/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2026/comments
https://api.github.com/repos/ollama/ollama/issues/2026/events
https://github.com/ollama/ollama/pull/2026
2,085,197,602
PR_kwDOJ0Z1Ps5kQs1u
2,026
fix: normalize name path before splitting
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[]
closed
false
null
[]
null
0
2024-01-17T00:49:42
2024-01-17T00:58:43
2024-01-17T00:58:42
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2026", "html_url": "https://github.com/ollama/ollama/pull/2026", "diff_url": "https://github.com/ollama/ollama/pull/2026.diff", "patch_url": "https://github.com/ollama/ollama/pull/2026.patch", "merged_at": "2024-01-17T00:58:42" }
During pruning, the input to ParseModelPath is a file path, which on Windows will cause the split to not work as expected. It's still necessary to split on `/` because the most common case is a URL path, which is platform agnostic.
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2026/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2026/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4061
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4061/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4061/comments
https://api.github.com/repos/ollama/ollama/issues/4061/events
https://github.com/ollama/ollama/issues/4061
2,272,453,743
I_kwDOJ0Z1Ps6HcuRv
4,061
[FEATURE] Add llamascript to community projects
{ "login": "zanderlewis", "id": 158775116, "node_id": "U_kgDOCXa3TA", "avatar_url": "https://avatars.githubusercontent.com/u/158775116?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zanderlewis", "html_url": "https://github.com/zanderlewis", "followers_url": "https://api.github.com/users/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
1
2024-04-30T21:09:23
2024-05-09T21:03:01
2024-05-09T21:03:01
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
# LlamaScript

[pypi](https://pypi.org/project/llamascript/) [repo](https://github.com/WolfTheDeveloper/llamascript)

## Why?

llamascript allows the creation of no-code AI chatbots using Ollama.

## Example

A basic chatbot using llamascript:

```llamascript
USE llama3
PROMPT Why is the sky blue?
CHAT ...
```
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4061/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4061/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6750
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6750/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6750/comments
https://api.github.com/repos/ollama/ollama/issues/6750/events
https://github.com/ollama/ollama/issues/6750
2,519,228,722
I_kwDOJ0Z1Ps6WKGEy
6,750
mattw/loganalyzer cannot be run with ollama run
{ "login": "syuan-Boom", "id": 178369897, "node_id": "U_kgDOCqG1aQ", "avatar_url": "https://avatars.githubusercontent.com/u/178369897?v=4", "gravatar_id": "", "url": "https://api.github.com/users/syuan-Boom", "html_url": "https://github.com/syuan-Boom", "followers_url": "https://api.github.com/users/syu...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
8
2024-09-11T09:51:06
2024-09-19T12:17:17
2024-09-11T18:32:50
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I want to use mattw/loganalyzer. The tutorial is in example/python_loganalyzer, but it does not work. Why? Please answer.
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6750/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6750/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6678
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6678/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6678/comments
https://api.github.com/repos/ollama/ollama/issues/6678/events
https://github.com/ollama/ollama/issues/6678
2,510,497,119
I_kwDOJ0Z1Ps6VoyVf
6,678
OLLAMA_LOAD_TIMEOUT env variable not being applied
{ "login": "YetheSamartaka", "id": 55753928, "node_id": "MDQ6VXNlcjU1NzUzOTI4", "avatar_url": "https://avatars.githubusercontent.com/u/55753928?v=4", "gravatar_id": "", "url": "https://api.github.com/users/YetheSamartaka", "html_url": "https://github.com/YetheSamartaka", "followers_url": "https://api.gi...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
7
2024-09-06T13:53:39
2024-09-13T20:00:14
2024-09-06T15:34:32
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?
OLLAMA_LOAD_TIMEOUT env variable is not applied at all. When I specify it using docker -e OLLAMA_LOAD_TIMEOUT=60 and then inspect logs, this variable is missing there completely. Other variables might be missing there as well. Here is the text from logs: `routes.go:1125: INFO server config e...
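A sketch of how the variable is typically passed and verified, assuming the official image; OLLAMA_LOAD_TIMEOUT takes a duration such as 10m, and a recognized variable should appear in the startup "server config" log line:

```console
$ docker run -d -e OLLAMA_LOAD_TIMEOUT=10m -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
$ docker logs ollama 2>&1 | grep OLLAMA_LOAD_TIMEOUT
```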
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6678/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6678/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1721
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1721/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1721/comments
https://api.github.com/repos/ollama/ollama/issues/1721/events
https://github.com/ollama/ollama/issues/1721
2,056,599,222
I_kwDOJ0Z1Ps56lTa2
1,721
How to enable Ollama to read the contents of a directory?
{ "login": "oliverbob", "id": 23272429, "node_id": "MDQ6VXNlcjIzMjcyNDI5", "avatar_url": "https://avatars.githubusercontent.com/u/23272429?v=4", "gravatar_id": "", "url": "https://api.github.com/users/oliverbob", "html_url": "https://github.com/oliverbob", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
3
2023-12-26T17:56:19
2024-05-10T00:21:19
2024-05-10T00:21:19
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Since it can read '/home/user/whateverfile.it.is', would it be possible for ollama to be able to read an entire directory, or a 'repo' for that matter, so we can talk to it? If it is not yet a feature, maybe it's neat to add this for developers to help us quickly solve our coding challenges. Thanks.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1721/reactions", "total_count": 10, "+1": 6, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 4 }
https://api.github.com/repos/ollama/ollama/issues/1721/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5972
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5972/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5972/comments
https://api.github.com/repos/ollama/ollama/issues/5972/events
https://github.com/ollama/ollama/issues/5972
2,431,386,784
I_kwDOJ0Z1Ps6Q7ASg
5,972
Integration with GNOME (add-on idea)
{ "login": "TheoThePerson", "id": 139904624, "node_id": "U_kgDOCFbGcA", "avatar_url": "https://avatars.githubusercontent.com/u/139904624?v=4", "gravatar_id": "", "url": "https://api.github.com/users/TheoThePerson", "html_url": "https://github.com/TheoThePerson", "followers_url": "https://api.github.com/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
4
2024-07-26T04:18:10
2024-11-11T18:48:26
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I started working on a GNOME extension to connect Ollama to GNOME. This is my first GNOME extension, and with my limited knowledge of JavaScript I've run into some issues and cannot make any progress (ChatGPT and C# knowledge could only get me so far). I'd love to collaborate if anybody is interested. Here is the git h...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5972/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5972/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/123
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/123/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/123/comments
https://api.github.com/repos/ollama/ollama/issues/123/events
https://github.com/ollama/ollama/issues/123
1,811,551,065
I_kwDOJ0Z1Ps5r-hNZ
123
Support for Intel Macs
{ "login": "VicariousVision", "id": 36631296, "node_id": "MDQ6VXNlcjM2NjMxMjk2", "avatar_url": "https://avatars.githubusercontent.com/u/36631296?v=4", "gravatar_id": "", "url": "https://api.github.com/users/VicariousVision", "html_url": "https://github.com/VicariousVision", "followers_url": "https://api...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
3
2023-07-19T09:26:17
2023-08-23T17:43:45
2023-08-23T17:43:45
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi! When can we expect to hear feedback regarding the future of this? Maybe I could help out?
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/123/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/123/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1789
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1789/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1789/comments
https://api.github.com/repos/ollama/ollama/issues/1789/events
https://github.com/ollama/ollama/issues/1789
2,066,371,488
I_kwDOJ0Z1Ps57KlOg
1,789
Azure Container build failed
{ "login": "questsin", "id": 1900759, "node_id": "MDQ6VXNlcjE5MDA3NTk=", "avatar_url": "https://avatars.githubusercontent.com/u/1900759?v=4", "gravatar_id": "", "url": "https://api.github.com/users/questsin", "html_url": "https://github.com/questsin", "followers_url": "https://api.github.com/users/quest...
[]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
5
2024-01-04T21:38:29
2024-03-12T18:07:10
2024-03-12T18:06:47
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
failed to build on Azure Containers

2024-01-04 16:33:33.786 [info] Step 6/21 : ADD https://dl.google.com/go/go1.21.3.linux-$TARGETARCH.tar.gz /tmp/go1.21.3.tar.gz
2024-01-04 16:33:33.786 [info] ADD failed: failed to GET https://dl.google.com/go/go1.21.3.linux-.tar.gz with status 404 Not Found: <!DOCTYPE html>
2024...
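$TARGETARCH is an automatic build argument that only BuildKit populates; the classic builder leaves it empty, which is why the URL collapses to go1.21.3.linux-.tar.gz and 404s. A sketch of two ways around it, assuming the Dockerfile declares ARG TARGETARCH:

```console
$ docker buildx build --platform linux/amd64 -t ollama:local .
# or, without buildx, pass the architecture explicitly
$ docker build --build-arg TARGETARCH=amd64 -t ollama:local .
```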
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1789/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1789/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8056
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8056/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8056/comments
https://api.github.com/repos/ollama/ollama/issues/8056/events
https://github.com/ollama/ollama/issues/8056
2,734,312,547
I_kwDOJ0Z1Ps6i-kxj
8,056
Create Endpoint Appears Broken
{ "login": "mcmah309", "id": 56412856, "node_id": "MDQ6VXNlcjU2NDEyODU2", "avatar_url": "https://avatars.githubusercontent.com/u/56412856?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mcmah309", "html_url": "https://github.com/mcmah309", "followers_url": "https://api.github.com/users/mcm...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-12-12T00:08:48
2024-12-14T16:37:03
2024-12-14T16:37:03
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?
The create endpoint seems broken. The CLI seems to work fine though.

```console
henry@nixos:~/work/my_packages/rust/olinker (master)$ curl http://localhost:11434/api/create -d '{ "model": "mario", "path": "/tmp/Modelfile" }'
{"error":"error reading modelfile: open /tmp/Modelfile: no su...
```
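One explanation consistent with the error: the "path" is opened by the server process, so a client-side /tmp/Modelfile is only visible if server and client share a filesystem and the server can read it. A sketch that sidesteps the filesystem by sending the Modelfile content inline, assuming the API version in use still accepts the "modelfile" field:

```console
$ curl http://localhost:11434/api/create -d '{
  "model": "mario",
  "modelfile": "FROM llama3\nSYSTEM You are Mario from Super Mario Bros."
}'
```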
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8056/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8056/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6680
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6680/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6680/comments
https://api.github.com/repos/ollama/ollama/issues/6680/events
https://github.com/ollama/ollama/pull/6680
2,511,042,569
PR_kwDOJ0Z1Ps56swBp
6,680
adding Archyve to community integrations list
{ "login": "nickthecook", "id": 5835543, "node_id": "MDQ6VXNlcjU4MzU1NDM=", "avatar_url": "https://avatars.githubusercontent.com/u/5835543?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nickthecook", "html_url": "https://github.com/nickthecook", "followers_url": "https://api.github.com/us...
[]
closed
false
null
[]
null
0
2024-09-06T19:05:57
2024-09-06T21:06:02
2024-09-06T21:06:02
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6680", "html_url": "https://github.com/ollama/ollama/pull/6680", "diff_url": "https://github.com/ollama/ollama/pull/6680.diff", "patch_url": "https://github.com/ollama/ollama/pull/6680.patch", "merged_at": "2024-09-06T21:06:02" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6680/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6680/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5808
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5808/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5808/comments
https://api.github.com/repos/ollama/ollama/issues/5808/events
https://github.com/ollama/ollama/issues/5808
2,420,691,980
I_kwDOJ0Z1Ps6QSNQM
5,808
please add `https://huggingface.co/nvidia/Nemotron-4-340B-Instruct` to `https://ollama.com/library`
{ "login": "hemangjoshi37a", "id": 12392345, "node_id": "MDQ6VXNlcjEyMzkyMzQ1", "avatar_url": "https://avatars.githubusercontent.com/u/12392345?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hemangjoshi37a", "html_url": "https://github.com/hemangjoshi37a", "followers_url": "https://api.gi...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
3
2024-07-20T07:53:46
2024-09-04T04:28:12
2024-09-04T04:28:12
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Please add [nvidia/Nemotron-4-340B-Instruct](https://huggingface.co/nvidia/Nemotron-4-340B-Instruct) to `https://ollama.com/library`
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5808/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5808/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/708
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/708/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/708/comments
https://api.github.com/repos/ollama/ollama/issues/708/events
https://github.com/ollama/ollama/pull/708
1,927,607,119
PR_kwDOJ0Z1Ps5b-Xeb
708
Use Go 1.21 in the Dockerfile
{ "login": "xyproto", "id": 52813, "node_id": "MDQ6VXNlcjUyODEz", "avatar_url": "https://avatars.githubusercontent.com/u/52813?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xyproto", "html_url": "https://github.com/xyproto", "followers_url": "https://api.github.com/users/xyproto/follower...
[]
closed
false
null
[]
null
0
2023-10-05T07:44:06
2023-10-05T07:44:42
2023-10-05T07:44:42
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/708", "html_url": "https://github.com/ollama/ollama/pull/708", "diff_url": "https://github.com/ollama/ollama/pull/708.diff", "patch_url": "https://github.com/ollama/ollama/pull/708.patch", "merged_at": null }
I have tested Ollama with Go 1.21 on macOS and Arch Linux and everything works here. This commit bumps `FROM golang:1.20` to `FROM golang:1.21`.
{ "login": "xyproto", "id": 52813, "node_id": "MDQ6VXNlcjUyODEz", "avatar_url": "https://avatars.githubusercontent.com/u/52813?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xyproto", "html_url": "https://github.com/xyproto", "followers_url": "https://api.github.com/users/xyproto/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/708/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/708/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5840
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5840/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5840/comments
https://api.github.com/repos/ollama/ollama/issues/5840/events
https://github.com/ollama/ollama/issues/5840
2,421,830,011
I_kwDOJ0Z1Ps6QWjF7
5,840
Crash on startup when trying to clean up unused files
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[ { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.g...
null
1
2024-07-22T03:25:37
2024-08-07T18:29:33
2024-08-07T18:29:32
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?

```
% ollama serve
2024/07/21 20:24:39 routes.go:1096: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRA...
```
{ "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://api.github.com/users...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5840/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5840/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2470
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2470/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2470/comments
https://api.github.com/repos/ollama/ollama/issues/2470/events
https://github.com/ollama/ollama/issues/2470
2,131,237,147
I_kwDOJ0Z1Ps5_CBkb
2,470
system message isn't being overridden when using the chat-completion API
{ "login": "jukofyork", "id": 69222624, "node_id": "MDQ6VXNlcjY5MjIyNjI0", "avatar_url": "https://avatars.githubusercontent.com/u/69222624?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jukofyork", "html_url": "https://github.com/jukofyork", "followers_url": "https://api.github.com/users/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[ { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api...
null
9
2024-02-13T00:36:49
2024-02-16T16:45:14
2024-02-16T14:14:25
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Sorry if this has been mentioned already (searching the Issues for "system" brings up 100s of pages):

```
{
  "model": "mixtral:32k-test",
  "messages": [
    {
      "role": "system",
      "content": "You are an AI assistant for the Eclipse IDE. Your objective is to assist users in writing and analyzing sour...
```
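A minimal request sketch for comparison; the expectation is that a leading system-role message overrides the Modelfile SYSTEM for that call (model name reused from the report, message text shortened):

```console
$ curl http://localhost:11434/api/chat -d '{
  "model": "mixtral:32k-test",
  "messages": [
    {"role": "system", "content": "You are an AI assistant for the Eclipse IDE."},
    {"role": "user", "content": "Explain this compiler error."}
  ]
}'
```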
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2470/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2470/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6123
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6123/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6123/comments
https://api.github.com/repos/ollama/ollama/issues/6123/events
https://github.com/ollama/ollama/pull/6123
2,442,833,625
PR_kwDOJ0Z1Ps53JCmA
6,123
llama: Runtime selection of new or old runners
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[]
closed
false
null
[]
null
0
2024-08-01T16:03:33
2024-08-01T22:51:54
2024-08-01T22:51:51
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6123", "html_url": "https://github.com/ollama/ollama/pull/6123", "diff_url": "https://github.com/ollama/ollama/pull/6123.diff", "patch_url": "https://github.com/ollama/ollama/pull/6123.patch", "merged_at": "2024-08-01T22:51:51" }
This change pulls out the ~minimal set of changes from #5287 to be able to build locally and run either the C++ or Go runner. Carries #6122. This won't be ready to merge to main until other build-rigging changes from the other PR are factored in.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6123/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6123/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1388
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1388/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1388/comments
https://api.github.com/repos/ollama/ollama/issues/1388/events
https://github.com/ollama/ollama/issues/1388
2,025,488,080
I_kwDOJ0Z1Ps54un7Q
1,388
Description of models in the ollama page
{ "login": "lfoppiano", "id": 15426, "node_id": "MDQ6VXNlcjE1NDI2", "avatar_url": "https://avatars.githubusercontent.com/u/15426?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lfoppiano", "html_url": "https://github.com/lfoppiano", "followers_url": "https://api.github.com/users/lfoppiano/...
[]
closed
false
null
[]
null
6
2023-12-05T07:44:19
2025-01-27T23:09:38
2023-12-06T01:33:21
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I cannot find the part of the documentation that explains the naming convention of the models. For example: https://ollama.ai/library/starling-lm:7b-alpha-q4_K_M. It's clear that q4 indicates the bits of quantization, but what do K and M mean? Thanks
{ "login": "lfoppiano", "id": 15426, "node_id": "MDQ6VXNlcjE1NDI2", "avatar_url": "https://avatars.githubusercontent.com/u/15426?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lfoppiano", "html_url": "https://github.com/lfoppiano", "followers_url": "https://api.github.com/users/lfoppiano/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1388/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1388/timeline
null
completed
false
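A hedged note on the question recorded above: in the llama.cpp tag convention, which Ollama's quantization suffixes follow, `K` marks the k-quant family and the trailing `S`/`M`/`L` a size/quality variant, so `q4_K_M` is a 4-bit k-quant, medium variant. The mapping below is an illustrative sketch of that convention, not an Ollama API:

```go
package main

import "fmt"

// Illustrative decoding of llama.cpp-style quantization suffixes. The entries
// reflect common convention (k-quants with S/M/L size variants) and are an
// informal summary, not an official table.
var variants = map[string]string{
	"q4_0":   "4-bit, legacy round-to-nearest quantization",
	"q4_K_S": "4-bit k-quant, small variant (smaller file, lower quality)",
	"q4_K_M": "4-bit k-quant, medium variant (balanced size and quality)",
	"q5_K_M": "5-bit k-quant, medium variant",
	"q6_K":   "6-bit k-quant",
}

func main() {
	fmt.Println("q4_K_M:", variants["q4_K_M"])
}
```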
https://api.github.com/repos/ollama/ollama/issues/7017
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7017/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7017/comments
https://api.github.com/repos/ollama/ollama/issues/7017/events
https://github.com/ollama/ollama/issues/7017
2,554,045,486
I_kwDOJ0Z1Ps6YO6Qu
7,017
amd-llama-135M
{ "login": "olumolu", "id": 162728301, "node_id": "U_kgDOCbMJbQ", "avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4", "gravatar_id": "", "url": "https://api.github.com/users/olumolu", "html_url": "https://github.com/olumolu", "followers_url": "https://api.github.com/users/olumolu/foll...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
1
2024-09-28T06:41:35
2024-09-28T22:01:58
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://huggingface.co/amd/AMD-Llama-135m Fully open source, with an open-source license and an open-source dataset.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7017/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7017/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/3686
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3686/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3686/comments
https://api.github.com/repos/ollama/ollama/issues/3686/events
https://github.com/ollama/ollama/issues/3686
2,247,104,339
I_kwDOJ0Z1Ps6F8BdT
3,686
ollama run dbrx:132b-instruct-q2_K Error: exception error loading model architecture: unknown model architecture: 'dbrx'
{ "login": "wengbenjue", "id": 5533297, "node_id": "MDQ6VXNlcjU1MzMyOTc=", "avatar_url": "https://avatars.githubusercontent.com/u/5533297?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wengbenjue", "html_url": "https://github.com/wengbenjue", "followers_url": "https://api.github.com/users...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-04-17T00:52:25
2024-04-17T01:05:35
2024-04-17T01:05:35
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? ollama run dbrx:132b-instruct-q2_K Error: exception error loading model architecture: unknown model architecture: 'dbrx' ### What did you expect to see? ollama run dbrx:132b-instruct-q2_K ### Steps to reproduce _No response_ ### Are there any recent changes that introduced the issue? _No ...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3686/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3686/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5544
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5544/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5544/comments
https://api.github.com/repos/ollama/ollama/issues/5544/events
https://github.com/ollama/ollama/issues/5544
2,395,781,480
I_kwDOJ0Z1Ps6OzLlo
5,544
OpenAI v1/completion inserts prompt template
{ "login": "chigkim", "id": 22120994, "node_id": "MDQ6VXNlcjIyMTIwOTk0", "avatar_url": "https://avatars.githubusercontent.com/u/22120994?v=4", "gravatar_id": "", "url": "https://api.github.com/users/chigkim", "html_url": "https://github.com/chigkim", "followers_url": "https://api.github.com/users/chigki...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
5
2024-07-08T14:19:35
2024-09-25T11:57:27
2024-07-08T22:05:47
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? If you use the new completion feature (not chat.completion), it should be completely free-form, including the prompt format. Ollama inserts the prompt template even if I use the OpenAI completion call (client.completions.create). For example, if I use phi3, the debug output shows: time=2024-07-06T12:23:40.094-04...
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjha...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5544/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5544/timeline
null
completed
false
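For the pass-through behavior the report above expected, Ollama's native /api/generate endpoint documents a `raw` flag that skips the model's template so the prompt is sent verbatim. A minimal sketch, assuming a local server with `phi3` pulled:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// "raw": true asks the server not to apply the model's prompt template
	// (documented for /api/generate), so the prompt below goes through as-is.
	body, _ := json.Marshal(map[string]any{
		"model":  "phi3",
		"prompt": "Once upon a time",
		"raw":    true,
		"stream": false,
	})
	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var out struct {
		Response string `json:"response"`
	}
	json.NewDecoder(resp.Body).Decode(&out)
	fmt.Println(out.Response)
}
```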
https://api.github.com/repos/ollama/ollama/issues/256
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/256/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/256/comments
https://api.github.com/repos/ollama/ollama/issues/256/events
https://github.com/ollama/ollama/issues/256
1,833,276,864
I_kwDOJ0Z1Ps5tRZXA
256
running `/show` in the CLI doesn't show parameters inherited from a parent modelfile
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2023-08-02T14:14:53
2023-08-02T15:46:39
2023-08-02T15:46:39
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
``` ollama run my-custom-model ...
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/256/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/256/timeline
null
completed
false
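One way to inspect a model's effective settings is the documented /api/show endpoint; the bug recorded above was that the CLI's `/show` omitted parameters inherited from a parent Modelfile. A minimal sketch, assuming a local server and the reporter's `my-custom-model` tag (the `name` request field follows the API docs of that era):

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Ask the server for model details, then print the "parameters" block,
	// which should include values inherited from the parent Modelfile.
	body, _ := json.Marshal(map[string]string{"name": "my-custom-model"})
	resp, err := http.Post("http://localhost:11434/api/show", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var out struct {
		Parameters string `json:"parameters"`
	}
	json.NewDecoder(resp.Body).Decode(&out)
	fmt.Println(out.Parameters)
}
```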
https://api.github.com/repos/ollama/ollama/issues/3268
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3268/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3268/comments
https://api.github.com/repos/ollama/ollama/issues/3268/events
https://github.com/ollama/ollama/issues/3268
2,197,202,536
I_kwDOJ0Z1Ps6C9qZo
3,268
When can the grok model be integrated?
{ "login": "honestAnt", "id": 15356873, "node_id": "MDQ6VXNlcjE1MzU2ODcz", "avatar_url": "https://avatars.githubusercontent.com/u/15356873?v=4", "gravatar_id": "", "url": "https://api.github.com/users/honestAnt", "html_url": "https://github.com/honestAnt", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
1
2024-03-20T10:38:21
2024-03-21T08:59:23
2024-03-21T08:59:22
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What model would you like? https://github.com/xai-org/grok-1
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3268/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3268/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2630
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2630/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2630/comments
https://api.github.com/repos/ollama/ollama/issues/2630/events
https://github.com/ollama/ollama/issues/2630
2,146,355,038
I_kwDOJ0Z1Ps5_7sde
2,630
OpenAI API adds both system prompts from model card and from request
{ "login": "hoblin", "id": 28090, "node_id": "MDQ6VXNlcjI4MDkw", "avatar_url": "https://avatars.githubusercontent.com/u/28090?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hoblin", "html_url": "https://github.com/hoblin", "followers_url": "https://api.github.com/users/hoblin/followers", ...
[]
closed
false
null
[]
null
3
2024-02-21T10:16:25
2024-02-23T13:34:31
2024-02-23T13:34:31
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hey there. Is there any way to override the model's default system prompt when I use the OpenAI API endpoint? The request had a system prompt `CUSTOM_SYSTEM_PROMPT` and a user message `Hello.` Here is the resulting prompt from the server.log file: ```server.log time=2024-02-21T12:09:22.158+02:00 level=DEBUG source...
{ "login": "hoblin", "id": 28090, "node_id": "MDQ6VXNlcjI4MDkw", "avatar_url": "https://avatars.githubusercontent.com/u/28090?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hoblin", "html_url": "https://github.com/hoblin", "followers_url": "https://api.github.com/users/hoblin/followers", ...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2630/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2630/timeline
null
completed
false
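A minimal sketch of the request pattern at issue, assuming a local Ollama server's OpenAI-compatible endpoint: the reporter expected a request-supplied system message to replace the Modelfile's default SYSTEM prompt rather than be appended to it. The model tag and strings mirror the report and are illustrative:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// System message supplied in the request; the expectation above is that
	// this overrides, not extends, the Modelfile's SYSTEM prompt.
	body, _ := json.Marshal(map[string]any{
		"model": "llama2",
		"messages": []map[string]string{
			{"role": "system", "content": "CUSTOM_SYSTEM_PROMPT"},
			{"role": "user", "content": "Hello."},
		},
	})
	resp, err := http.Post("http://localhost:11434/v1/chat/completions", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	raw, _ := io.ReadAll(resp.Body)
	fmt.Println(string(raw))
}
```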
https://api.github.com/repos/ollama/ollama/issues/5315
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5315/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5315/comments
https://api.github.com/repos/ollama/ollama/issues/5315/events
https://github.com/ollama/ollama/issues/5315
2,376,554,162
I_kwDOJ0Z1Ps6Np1ay
5,315
Support for Ascend NPU hardware
{ "login": "JingWoo", "id": 21989093, "node_id": "MDQ6VXNlcjIxOTg5MDkz", "avatar_url": "https://avatars.githubusercontent.com/u/21989093?v=4", "gravatar_id": "", "url": "https://api.github.com/users/JingWoo", "html_url": "https://github.com/JingWoo", "followers_url": "https://api.github.com/users/JingWo...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
10
2024-06-27T02:02:26
2024-11-22T09:35:42
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
The Huawei Ascend AI processor is an AI chip based on Huawei's Da Vinci architecture. It performs well on large-scale data processing and complex computing tasks. Currently, the llama.cpp project is being adapted to the Ascend series of AI processors. I'm also adapting Ollama to support the Ascend series of AI processors t...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5315/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5315/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/187
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/187/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/187/comments
https://api.github.com/repos/ollama/ollama/issues/187/events
https://github.com/ollama/ollama/issues/187
1,817,272,676
I_kwDOJ0Z1Ps5sUWFk
187
Error: stream: registry responded with code 416:
{ "login": "codazoda", "id": 527246, "node_id": "MDQ6VXNlcjUyNzI0Ng==", "avatar_url": "https://avatars.githubusercontent.com/u/527246?v=4", "gravatar_id": "", "url": "https://api.github.com/users/codazoda", "html_url": "https://github.com/codazoda", "followers_url": "https://api.github.com/users/codazod...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[ { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api...
null
3
2023-07-23T19:36:58
2023-08-30T16:44:58
2023-08-30T16:44:57
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I'm getting the following error when I try to run llama2 with ollama. I'm on an M1 Max with 64G of RAM running Ventura 13.4.1. ``` $ ollama run llama2 pulling manifest Error: stream: registry responded with code 416: ``` I have a feeling something happened with the internet connection when I originally tried t...
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/187/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/187/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1371
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1371/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1371/comments
https://api.github.com/repos/ollama/ollama/issues/1371/events
https://github.com/ollama/ollama/issues/1371
2,023,271,104
I_kwDOJ0Z1Ps54mKrA
1,371
Unable to Pull on IPv6 system
{ "login": "jabhishek87", "id": 2536001, "node_id": "MDQ6VXNlcjI1MzYwMDE=", "avatar_url": "https://avatars.githubusercontent.com/u/2536001?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jabhishek87", "html_url": "https://github.com/jabhishek87", "followers_url": "https://api.github.com/us...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2023-12-04T08:13:14
2024-07-24T21:19:59
2024-07-24T21:19:58
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
user@host:~$ ollama run mistral --verbose --insecure pulling manifest Error: Head "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/6a/6ae28029995007a3ee8d0b8556d50f3b59b831074cf19c84de87acf51fb54054/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c7...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1371/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1371/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/232
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/232/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/232/comments
https://api.github.com/repos/ollama/ollama/issues/232/events
https://github.com/ollama/ollama/pull/232
1,825,239,309
PR_kwDOJ0Z1Ps5Wl58t
232
Allow specifying stop conditions in Modelfile
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[]
closed
false
null
[]
null
0
2023-07-27T21:13:37
2023-07-28T16:31:09
2023-07-28T16:31:08
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/232", "html_url": "https://github.com/ollama/ollama/pull/232", "diff_url": "https://github.com/ollama/ollama/pull/232.diff", "patch_url": "https://github.com/ollama/ollama/pull/232.patch", "merged_at": "2023-07-28T16:31:08" }
This is useful for Modelfiles that define a format. Multi-value parameters are set by listing them in quotes. Example Modelfile: ``` FROM llama2 PARAMETER temperature 1 PARAMETER stop "AI Cat:" "Dog:" TEMPLATE """ {{- if .First }} <<SYS>> {{ .System }} <</SYS>> Dog: woof woof woof AI Cat: meow meow ...
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/232/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/232/timeline
null
null
true
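Stop sequences can also be supplied per request through the documented `options` field of /api/generate, equivalent in effect to the Modelfile `PARAMETER stop` lines this PR added. A minimal sketch, assuming a local server; the prompt and stop strings echo the PR's example:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Stop sequences set at request time via "options.stop"; generation halts
	// when the model emits either string.
	body, _ := json.Marshal(map[string]any{
		"model":  "llama2",
		"prompt": "Dog: woof woof woof AI Cat:",
		"stream": false,
		"options": map[string]any{
			"stop": []string{"AI Cat:", "Dog:"},
		},
	})
	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var out struct {
		Response string `json:"response"`
	}
	json.NewDecoder(resp.Body).Decode(&out)
	fmt.Println(out.Response)
}
```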
https://api.github.com/repos/ollama/ollama/issues/827
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/827/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/827/comments
https://api.github.com/repos/ollama/ollama/issues/827/events
https://github.com/ollama/ollama/pull/827
1,948,394,268
PR_kwDOJ0Z1Ps5dEcIF
827
model: native gotemplate adapter template
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[]
closed
false
null
[]
null
0
2023-10-17T22:29:11
2023-10-18T20:11:26
2023-10-18T20:11:25
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/827", "html_url": "https://github.com/ollama/ollama/pull/827", "diff_url": "https://github.com/ollama/ollama/pull/827.diff", "patch_url": "https://github.com/ollama/ollama/pull/827.patch", "merged_at": "2023-10-18T20:11:25" }
Use a Go template `range` instead of string concatenation
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/827/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/827/timeline
null
null
true
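A minimal Go illustration of the technique named in the PR title above: rendering with a `text/template` `range` over messages instead of concatenating strings in Go code. The template text here is illustrative, not the PR's actual adapter template:

```go
package main

import (
	"os"
	"text/template"
)

type Message struct {
	Role    string
	Content string
}

func main() {
	// One template ranges over the whole message slice, replacing the
	// per-message string concatenation the PR removed.
	tmpl := template.Must(template.New("prompt").Parse(
		"{{- range . }}[{{ .Role }}]: {{ .Content }}\n{{ end }}"))
	msgs := []Message{
		{Role: "system", Content: "You are concise."},
		{Role: "user", Content: "Hello."},
	}
	if err := tmpl.Execute(os.Stdout, msgs); err != nil {
		panic(err)
	}
}
```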