Column schema (33 fields per record):

column                      type           values / range
url                         string         length 51–54
repository_url              string         1 distinct value
labels_url                  string         length 65–68
comments_url                string         length 60–63
events_url                  string         length 58–61
html_url                    string         length 39–44
id                          int64          1.78B–2.82B
node_id                     string         length 18–19
number                      int64          1–8.69k
title                       string         length 1–382
user                        dict
labels                      list           length 0–5
state                       string         2 distinct values
locked                      bool           1 class
assignee                    dict
assignees                   list           length 0–2
milestone                   null
comments                    int64          0–323
created_at                  timestamp[s]
updated_at                  timestamp[s]
closed_at                   timestamp[s]
author_association          string         4 distinct values
sub_issues_summary          dict
active_lock_reason          null
draft                       bool           2 classes
pull_request                dict
body                        string         length 2–118k
closed_by                   dict
reactions                   dict
timeline_url                string         length 60–63
performed_via_github_app    null
state_reason                string         4 distinct values
is_pull_request             bool           2 classes
----- issue #1205 -----
https://api.github.com/repos/ollama/ollama/issues/1205
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1205/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1205/comments
https://api.github.com/repos/ollama/ollama/issues/1205/events
https://github.com/ollama/ollama/issues/1205
2,002,257,814
I_kwDOJ0Z1Ps53WAeW
1,205
/admin Page Auth Key not working
{ "login": "Asher9971", "id": 11883647, "node_id": "MDQ6VXNlcjExODgzNjQ3", "avatar_url": "https://avatars.githubusercontent.com/u/11883647?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Asher9971", "html_url": "https://github.com/Asher9971", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
3
2023-11-20T13:52:10
2023-12-08T23:30:07
2023-11-20T16:23:56
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
After a fresh install i need to input an Auth Key on the admin page. ![image](https://github.com/jmorganca/ollama/assets/11883647/a36825fe-27e7-4153-b8d3-be886fc7bcb9) I generated one and pasted it in the compose-file. `cheshire-cat-core: image: ghcr.io/cheshire-cat-ai/core:latest container_name: chesh...
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1205/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1205/timeline
null
completed
false
----- issue #2323 -----
https://api.github.com/repos/ollama/ollama/issues/2323
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2323/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2323/comments
https://api.github.com/repos/ollama/ollama/issues/2323/events
https://github.com/ollama/ollama/issues/2323
2,114,580,332
I_kwDOJ0Z1Ps5-Ce9s
2,323
RFE: provide checksum for artefacts released
{ "login": "truatpasteurdotfr", "id": 8300215, "node_id": "MDQ6VXNlcjgzMDAyMTU=", "avatar_url": "https://avatars.githubusercontent.com/u/8300215?v=4", "gravatar_id": "", "url": "https://api.github.com/users/truatpasteurdotfr", "html_url": "https://github.com/truatpasteurdotfr", "followers_url": "https:/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers"...
[ { "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/...
null
1
2024-02-02T10:19:29
2024-03-14T19:53:26
2024-03-14T19:53:26
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi, would it be possible to provide sha256 checksums/signature to verify the downloaded artefacts when a new version is released? Just to be sure that the executables have not modified. :D Thanks Tru
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2323/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2323/timeline
null
completed
false
----- pull request #5106 -----
https://api.github.com/repos/ollama/ollama/issues/5106
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5106/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5106/comments
https://api.github.com/repos/ollama/ollama/issues/5106/events
https://github.com/ollama/ollama/pull/5106
2,358,702,658
PR_kwDOJ0Z1Ps5yxFAP
5,106
Tighten up memory prediction logging
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[]
closed
false
null
[]
null
0
2024-06-18T02:11:21
2024-06-18T16:24:50
2024-06-18T16:24:38
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5106", "html_url": "https://github.com/ollama/ollama/pull/5106", "diff_url": "https://github.com/ollama/ollama/pull/5106.diff", "patch_url": "https://github.com/ollama/ollama/pull/5106.patch", "merged_at": "2024-06-18T16:24:38" }
Prior to this change, we logged the memory prediction multiple times as the scheduler iterates to find a suitable configuration, which can be confusing since only the last log before the server starts is actually valid. This now logs once just before starting the server on the final configuration. It also reports what ...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5106/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5106/timeline
null
null
true
----- issue #6034 -----
https://api.github.com/repos/ollama/ollama/issues/6034
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6034/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6034/comments
https://api.github.com/repos/ollama/ollama/issues/6034/events
https://github.com/ollama/ollama/issues/6034
2,434,236,468
I_kwDOJ0Z1Ps6RF4A0
6,034
can't import DarkIdol-Llama-3.1-Instruct-1.2-Uncensored:8b_Q8_0
{ "login": "taozhiyuai", "id": 146583103, "node_id": "U_kgDOCLyuPw", "avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4", "gravatar_id": "", "url": "https://api.github.com/users/taozhiyuai", "html_url": "https://github.com/taozhiyuai", "followers_url": "https://api.github.com/users/tao...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-07-29T01:10:30
2024-08-02T07:34:08
2024-08-02T07:34:08
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? 1. modelfile FROM ./DarkIdol-Llama-3.1-8B-Instruct-1.2-Uncensored.Q8_0.gguf TEMPLATE "{{- if .Messages }} {{- if .System }}<|start_header_id|>system<|end_header_id|> {{ .System }}<|eot_id|> {{- end }} {{- range .Messages }}<|start_header_id|>{{ .Role }}<|end_header_id|> {{ .Content ...
{ "login": "taozhiyuai", "id": 146583103, "node_id": "U_kgDOCLyuPw", "avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4", "gravatar_id": "", "url": "https://api.github.com/users/taozhiyuai", "html_url": "https://github.com/taozhiyuai", "followers_url": "https://api.github.com/users/tao...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6034/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6034/timeline
null
completed
false
----- issue #162 -----
https://api.github.com/repos/ollama/ollama/issues/162
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/162/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/162/comments
https://api.github.com/repos/ollama/ollama/issues/162/events
https://github.com/ollama/ollama/issues/162
1,815,859,719
I_kwDOJ0Z1Ps5sO9IH
162
Don't automatically start on startup / have an option to disable this
{ "login": "gregsadetsky", "id": 1017304, "node_id": "MDQ6VXNlcjEwMTczMDQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1017304?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gregsadetsky", "html_url": "https://github.com/gregsadetsky", "followers_url": "https://api.github.com...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
26
2023-07-21T13:57:28
2025-01-23T17:15:04
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
EDIT: [there's a PR for it now](https://github.com/ollama/ollama/pull/7097)! --- Thanks for making this! I noticed that on macOS (I suppose it's the same on Windows), the app sets itself to open at login. This is done here: https://github.com/jmorganca/ollama/blob/91cd54016c47b71223e8263c44250766874e05cf/app/sr...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/162/reactions", "total_count": 54, "+1": 52, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 }
https://api.github.com/repos/ollama/ollama/issues/162/timeline
null
null
false
----- pull request #3315 -----
https://api.github.com/repos/ollama/ollama/issues/3315
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3315/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3315/comments
https://api.github.com/repos/ollama/ollama/issues/3315/events
https://github.com/ollama/ollama/pull/3315
2,204,023,930
PR_kwDOJ0Z1Ps5qkYjJ
3,315
Added [N,y] prompt to confirm the deletion of a model
{ "login": "Icelain", "id": 50962640, "node_id": "MDQ6VXNlcjUwOTYyNjQw", "avatar_url": "https://avatars.githubusercontent.com/u/50962640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Icelain", "html_url": "https://github.com/Icelain", "followers_url": "https://api.github.com/users/Icelai...
[]
closed
false
null
[]
null
3
2024-03-23T19:50:37
2024-03-31T20:40:16
2024-03-31T17:13:37
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3315", "html_url": "https://github.com/ollama/ollama/pull/3315", "diff_url": "https://github.com/ollama/ollama/pull/3315.diff", "patch_url": "https://github.com/ollama/ollama/pull/3315.patch", "merged_at": null }
Fixes #3108
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3315/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3315/timeline
null
null
true
----- issue #8520 -----
https://api.github.com/repos/ollama/ollama/issues/8520
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8520/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8520/comments
https://api.github.com/repos/ollama/ollama/issues/8520/events
https://github.com/ollama/ollama/issues/8520
2,802,264,444
I_kwDOJ0Z1Ps6nByl8
8,520
$OLLAMA_MODELS no longer respected?
{ "login": "yuimbo", "id": 83395410, "node_id": "MDQ6VXNlcjgzMzk1NDEw", "avatar_url": "https://avatars.githubusercontent.com/u/83395410?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yuimbo", "html_url": "https://github.com/yuimbo", "followers_url": "https://api.github.com/users/yuimbo/fo...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
4
2025-01-21T16:12:04
2025-01-22T09:34:06
2025-01-22T09:33:53
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I've been using the OLLAMA_MODELS variable to store my models on an external drive. I can see that this is set: ``` $ echo $OLLAMA_MODELS /Volumes/bigdrive/ollama/models ``` I can see that my models are stored there: ``` $ tree $OLLAMA_MODELS /Volumes/bigdrive/ollama/models ├── blobs │   ├─...
{ "login": "yuimbo", "id": 83395410, "node_id": "MDQ6VXNlcjgzMzk1NDEw", "avatar_url": "https://avatars.githubusercontent.com/u/83395410?v=4", "gravatar_id": "", "url": "https://api.github.com/users/yuimbo", "html_url": "https://github.com/yuimbo", "followers_url": "https://api.github.com/users/yuimbo/fo...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8520/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8520/timeline
null
completed
false
----- issue #7402 -----
https://api.github.com/repos/ollama/ollama/issues/7402
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7402/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7402/comments
https://api.github.com/repos/ollama/ollama/issues/7402/events
https://github.com/ollama/ollama/issues/7402
2,619,012,896
I_kwDOJ0Z1Ps6cGvcg
7,402
ollama run aya-expanse:32b gives nonsensical output
{ "login": "lefromage", "id": 757997, "node_id": "MDQ6VXNlcjc1Nzk5Nw==", "avatar_url": "https://avatars.githubusercontent.com/u/757997?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lefromage", "html_url": "https://github.com/lefromage", "followers_url": "https://api.github.com/users/lefr...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
open
false
null
[]
null
3
2024-10-28T17:08:55
2024-10-28T21:17:57
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? aya-expanse 8b runs fine, but 32b produces nonsensical output as shown below ollama run aya-expanse:8b >>> hi Hello! How can I help you today? ollama run aya-expanse:32b >>> hi L<PAD>KJ<PAD>OLE6IEGU;F<B9DN:FM4VNOUSV7I=5<UNHBGUQTUR=GOG;<PAD>LRN<CLE<;7BV@>T:8ND5>>;<34<PAD>LR;C;D7...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7402/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7402/timeline
null
null
false
----- issue #3189 -----
https://api.github.com/repos/ollama/ollama/issues/3189
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3189/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3189/comments
https://api.github.com/repos/ollama/ollama/issues/3189/events
https://github.com/ollama/ollama/issues/3189
2,190,445,483
I_kwDOJ0Z1Ps6Cj4ur
3,189
Add support for amd Radeon 780M gfx1103 - override works
{ "login": "thbley", "id": 941223, "node_id": "MDQ6VXNlcjk0MTIyMw==", "avatar_url": "https://avatars.githubusercontent.com/u/941223?v=4", "gravatar_id": "", "url": "https://api.github.com/users/thbley", "html_url": "https://github.com/thbley", "followers_url": "https://api.github.com/users/thbley/follow...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5755339642, "node_id": ...
open
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
14
2024-03-17T02:29:13
2025-01-29T22:49:46
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What are you trying to do? Please support GPU acceleration using "AMD Ryzen 7 PRO 7840U w/ Radeon 780M Graphics" on Linux (Ubuntu 22.04). Newer notebooks are shipped with AMD 7840U and support setting VRAM from 1GB to 8GB in the bios. With GPU acceleration only 1 vCPU is used and user experience with 7B models ...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3189/reactions", "total_count": 24, "+1": 24, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3189/timeline
null
null
false
----- issue #4717 -----
https://api.github.com/repos/ollama/ollama/issues/4717
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4717/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4717/comments
https://api.github.com/repos/ollama/ollama/issues/4717/events
https://github.com/ollama/ollama/issues/4717
2,325,127,482
I_kwDOJ0Z1Ps6KlqE6
4,717
phi3:medium-128k doesn't use the full context window by default
{ "login": "derluke", "id": 6739699, "node_id": "MDQ6VXNlcjY3Mzk2OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6739699?v=4", "gravatar_id": "", "url": "https://api.github.com/users/derluke", "html_url": "https://github.com/derluke", "followers_url": "https://api.github.com/users/derluke/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
1
2024-05-30T08:57:43
2024-05-30T16:21:44
2024-05-30T16:21:43
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I was playing with the new phi3:medium-128k model and was surprised to see it struggled to keep track of my earlier questions, or handle long documents. But on the bright side it was surprisingly fast. After a little digging I found out how to specify the context size using a new model file. I decided to give it a g...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4717/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4717/timeline
null
completed
false
----- issue #6177 -----
https://api.github.com/repos/ollama/ollama/issues/6177
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6177/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6177/comments
https://api.github.com/repos/ollama/ollama/issues/6177/events
https://github.com/ollama/ollama/issues/6177
2,448,758,778
I_kwDOJ0Z1Ps6R9Rf6
6,177
run OI with OLLAMA SERVER IN NETWORK
{ "login": "RM-S2", "id": 174100356, "node_id": "U_kgDOCmCPhA", "avatar_url": "https://avatars.githubusercontent.com/u/174100356?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RM-S2", "html_url": "https://github.com/RM-S2", "followers_url": "https://api.github.com/users/RM-S2/followers", ...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-08-05T14:52:49
2024-08-08T15:48:38
2024-08-08T15:48:38
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? i try to run OI with ollama server in another server computer in my network.and the command start OI is: interpreter --model ollama/llama3.1 --api_base "http://192.168.3.13:11434" --api_key "fake_key" but i get error say cant fine ollama in my computer ,what i thought is wrong because OI sho...
{ "login": "RM-S2", "id": 174100356, "node_id": "U_kgDOCmCPhA", "avatar_url": "https://avatars.githubusercontent.com/u/174100356?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RM-S2", "html_url": "https://github.com/RM-S2", "followers_url": "https://api.github.com/users/RM-S2/followers", ...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6177/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6177/timeline
null
completed
false
----- issue #156 -----
https://api.github.com/repos/ollama/ollama/issues/156
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/156/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/156/comments
https://api.github.com/repos/ollama/ollama/issues/156/events
https://github.com/ollama/ollama/issues/156
1,815,137,426
I_kwDOJ0Z1Ps5sMMyS
156
Fine-tuning support
{ "login": "shrikrishnaholla", "id": 1164410, "node_id": "MDQ6VXNlcjExNjQ0MTA=", "avatar_url": "https://avatars.githubusercontent.com/u/1164410?v=4", "gravatar_id": "", "url": "https://api.github.com/users/shrikrishnaholla", "html_url": "https://github.com/shrikrishnaholla", "followers_url": "https://ap...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
[ { "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/us...
null
20
2023-07-21T04:33:31
2024-10-16T18:36:41
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
First of all, thanks for building this tool and releasing it as open source. I like that the interfaces seem similar to `docker`. I also like the idea of Modelfile. Maybe it could also be used to define a finetuning process. That would also allow making the build process be part of a CI/CD routine and would allow bu...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/156/reactions", "total_count": 32, "+1": 27, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 3, "eyes": 2 }
https://api.github.com/repos/ollama/ollama/issues/156/timeline
null
null
false
----- issue #4433 -----
https://api.github.com/repos/ollama/ollama/issues/4433
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4433/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4433/comments
https://api.github.com/repos/ollama/ollama/issues/4433/events
https://github.com/ollama/ollama/issues/4433
2,296,066,016
I_kwDOJ0Z1Ps6I2y_g
4,433
GPU layer control / prioritisation
{ "login": "AncientMystic", "id": 62780271, "node_id": "MDQ6VXNlcjYyNzgwMjcx", "avatar_url": "https://avatars.githubusercontent.com/u/62780271?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AncientMystic", "html_url": "https://github.com/AncientMystic", "followers_url": "https://api.githu...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
0
2024-05-14T18:05:59
2024-05-14T18:05:59
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Would it be possible to add into the configuration of ollama something similar to LM studio to control the gpu utilisation? Also would it be possible to fine tune ollama to somehow only load certain layers to the gpu similar to unsloth? Possibly a way to load accessed and adjacent layers maybe with configuration on...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4433/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4433/timeline
null
null
false
----- pull request #1529 -----
https://api.github.com/repos/ollama/ollama/issues/1529
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1529/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1529/comments
https://api.github.com/repos/ollama/ollama/issues/1529/events
https://github.com/ollama/ollama/pull/1529
2,042,606,963
PR_kwDOJ0Z1Ps5iDRk6
1,529
README with Enchanted iOS App
{ "login": "gluonfield", "id": 5672094, "node_id": "MDQ6VXNlcjU2NzIwOTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/5672094?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gluonfield", "html_url": "https://github.com/gluonfield", "followers_url": "https://api.github.com/users...
[]
closed
false
null
[]
null
0
2023-12-14T22:35:25
2023-12-15T19:37:29
2023-12-15T19:37:29
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1529", "html_url": "https://github.com/ollama/ollama/pull/1529", "diff_url": "https://github.com/ollama/ollama/pull/1529.diff", "patch_url": "https://github.com/ollama/ollama/pull/1529.patch", "merged_at": "2023-12-15T19:37:29" }
I have just released iOS mobile App for Ollama and wanted to share with the community. A lot of improvements are coming up soon.
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1529/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1529/timeline
null
null
true
----- issue #4309 -----
https://api.github.com/repos/ollama/ollama/issues/4309
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4309/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4309/comments
https://api.github.com/repos/ollama/ollama/issues/4309/events
https://github.com/ollama/ollama/issues/4309
2,288,939,443
I_kwDOJ0Z1Ps6IbnGz
4,309
I have uploaded this model, but it is not shown on my page.
{ "login": "taozhiyuai", "id": 146583103, "node_id": "U_kgDOCLyuPw", "avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4", "gravatar_id": "", "url": "https://api.github.com/users/taozhiyuai", "html_url": "https://github.com/taozhiyuai", "followers_url": "https://api.github.com/users/tao...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6573197867, "node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw...
closed
false
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[ { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api...
null
1
2024-05-10T05:06:12
2024-05-10T08:52:15
2024-05-10T08:52:15
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? <img width="1091" alt="截屏2024-05-10 13 00 11" src="https://github.com/ollama/ollama/assets/146583103/f809d253-4deb-4224-99f8-3a20501ad869"> I have uploaded this model, but it is not shown on my page. ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version 1.34
{ "login": "taozhiyuai", "id": 146583103, "node_id": "U_kgDOCLyuPw", "avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4", "gravatar_id": "", "url": "https://api.github.com/users/taozhiyuai", "html_url": "https://github.com/taozhiyuai", "followers_url": "https://api.github.com/users/tao...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4309/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4309/timeline
null
completed
false
----- pull request #705 -----
https://api.github.com/repos/ollama/ollama/issues/705
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/705/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/705/comments
https://api.github.com/repos/ollama/ollama/issues/705/events
https://github.com/ollama/ollama/pull/705
1,927,045,435
PR_kwDOJ0Z1Ps5b8evF
705
Fix go test./... issue: fmt.Println arg list ends with redundant newline
{ "login": "xyproto", "id": 52813, "node_id": "MDQ6VXNlcjUyODEz", "avatar_url": "https://avatars.githubusercontent.com/u/52813?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xyproto", "html_url": "https://github.com/xyproto", "followers_url": "https://api.github.com/users/xyproto/follower...
[]
closed
false
null
[]
null
0
2023-10-04T22:02:22
2023-10-05T20:09:41
2023-10-05T15:11:05
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/705", "html_url": "https://github.com/ollama/ollama/pull/705", "diff_url": "https://github.com/ollama/ollama/pull/705.diff", "patch_url": "https://github.com/ollama/ollama/pull/705.patch", "merged_at": "2023-10-05T15:11:05" }
`go test ./...` currently fails with: ``` # github.com/jmorganca/ollama/cmd cmd/cmd.go:690:7: fmt.Println arg list ends with redundant newline cmd/cmd.go:698:7: fmt.Println arg list ends with redundant newline cmd/cmd.go:704:7: fmt.Println arg list ends with redundant newline cmd/cmd.go:710:7: fmt.Println arg l...
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/705/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/705/timeline
null
null
true
----- pull request #5675 -----
https://api.github.com/repos/ollama/ollama/issues/5675
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5675/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5675/comments
https://api.github.com/repos/ollama/ollama/issues/5675/events
https://github.com/ollama/ollama/pull/5675
2,407,007,425
PR_kwDOJ0Z1Ps51TA0O
5,675
Add Kerlig AI, an app for macOS
{ "login": "Jaarson", "id": 16690523, "node_id": "MDQ6VXNlcjE2NjkwNTIz", "avatar_url": "https://avatars.githubusercontent.com/u/16690523?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Jaarson", "html_url": "https://github.com/Jaarson", "followers_url": "https://api.github.com/users/Jaarso...
[]
closed
false
null
[]
null
0
2024-07-13T15:16:17
2024-07-13T15:33:47
2024-07-13T15:33:47
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5675", "html_url": "https://github.com/ollama/ollama/pull/5675", "diff_url": "https://github.com/ollama/ollama/pull/5675.diff", "patch_url": "https://github.com/ollama/ollama/pull/5675.patch", "merged_at": "2024-07-13T15:33:46" }
null
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/us...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5675/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5675/timeline
null
null
true
----- pull request #3600 -----
https://api.github.com/repos/ollama/ollama/issues/3600
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3600/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3600/comments
https://api.github.com/repos/ollama/ollama/issues/3600/events
https://github.com/ollama/ollama/pull/3600
2,238,285,742
PR_kwDOJ0Z1Ps5sZATO
3,600
mixtral mem
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[]
closed
false
null
[]
null
0
2024-04-11T18:10:55
2024-04-11T19:23:38
2024-04-11T19:23:37
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3600", "html_url": "https://github.com/ollama/ollama/pull/3600", "diff_url": "https://github.com/ollama/ollama/pull/3600.diff", "patch_url": "https://github.com/ollama/ollama/pull/3600.patch", "merged_at": "2024-04-11T19:23:37" }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3600/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3600/timeline
null
null
true
----- issue #7778 -----
https://api.github.com/repos/ollama/ollama/issues/7778
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7778/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7778/comments
https://api.github.com/repos/ollama/ollama/issues/7778/events
https://github.com/ollama/ollama/issues/7778
2,679,493,210
I_kwDOJ0Z1Ps6ftdJa
7,778
tool_choice parameter
{ "login": "nicho2", "id": 11471811, "node_id": "MDQ6VXNlcjExNDcxODEx", "avatar_url": "https://avatars.githubusercontent.com/u/11471811?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nicho2", "html_url": "https://github.com/nicho2", "followers_url": "https://api.github.com/users/nicho2/fo...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
open
false
null
[]
null
4
2024-11-21T13:31:30
2024-12-11T08:58:27
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Hello, I use the model **Mixtral 8*22B Q4_0.** I want to use **function calling** but the model don't send very good the tool to call (2 times / 10 (in tool_call tag)) so i add the parameter : "tool_choice": "required", but it's seems have no effect . Is this capacity is take into...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7778/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7778/timeline
null
null
false
----- issue #4495 -----
https://api.github.com/repos/ollama/ollama/issues/4495
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4495/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4495/comments
https://api.github.com/repos/ollama/ollama/issues/4495/events
https://github.com/ollama/ollama/issues/4495
2,302,215,298
I_kwDOJ0Z1Ps6JOQSC
4,495
gemma 2.0
{ "login": "olumolu", "id": 162728301, "node_id": "U_kgDOCbMJbQ", "avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4", "gravatar_id": "", "url": "https://api.github.com/users/olumolu", "html_url": "https://github.com/olumolu", "followers_url": "https://api.github.com/users/olumolu/foll...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
1
2024-05-17T09:16:33
2024-07-10T18:03:19
2024-07-10T18:03:19
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://developers.googleblog.com/en/gemma-family-and-toolkit-expansion-io-2024/
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4495/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4495/timeline
null
completed
false
----- issue #7400 -----
https://api.github.com/repos/ollama/ollama/issues/7400
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7400/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7400/comments
https://api.github.com/repos/ollama/ollama/issues/7400/events
https://github.com/ollama/ollama/issues/7400
2,618,727,343
I_kwDOJ0Z1Ps6cFpuv
7,400
Creating embeddings using the REST API is much slower than performing the same operation using Sentence Transformers
{ "login": "sebovzeoueb", "id": 7989595, "node_id": "MDQ6VXNlcjc5ODk1OTU=", "avatar_url": "https://avatars.githubusercontent.com/u/7989595?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sebovzeoueb", "html_url": "https://github.com/sebovzeoueb", "followers_url": "https://api.github.com/us...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5808482718, "node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng...
open
false
{ "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://api.github.com/users...
[ { "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://...
null
11
2024-10-28T15:13:57
2024-10-30T16:44:52
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I'm working on a RAG written in Python, and we're using ollama as the chatbot LLM provider. It's running in a Docker container and the Python app makes REST API calls to it. We have so far been using Sentence Transformers to create embeddings for documents that get ingested into the RAG and the ...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7400/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7400/timeline
null
null
false
----- issue #1395 -----
https://api.github.com/repos/ollama/ollama/issues/1395
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1395/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1395/comments
https://api.github.com/repos/ollama/ollama/issues/1395/events
https://github.com/ollama/ollama/issues/1395
2,027,411,394
I_kwDOJ0Z1Ps5419fC
1,395
Model filenames (are incompatible with other programs)
{ "login": "marco-trovato", "id": 18162107, "node_id": "MDQ6VXNlcjE4MTYyMTA3", "avatar_url": "https://avatars.githubusercontent.com/u/18162107?v=4", "gravatar_id": "", "url": "https://api.github.com/users/marco-trovato", "html_url": "https://github.com/marco-trovato", "followers_url": "https://api.githu...
[]
closed
false
null
[]
null
1
2023-12-06T00:42:55
2023-12-06T01:16:26
2023-12-06T01:16:25
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I already have a folder with several LLM models, each one can be 20-40 GB. ollama is unable to load them, i have to pull them again one by one, and they will so they will get saved by ollama according to their HASH, i.e.: `.../ollama_models/blobs/sha256:843d506c69eed7ece9a1584965be88421d9774a82bffd59e992d5a73eac2...
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/us...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1395/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1395/timeline
null
completed
false
----- pull request #5887 -----
https://api.github.com/repos/ollama/ollama/issues/5887
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5887/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5887/comments
https://api.github.com/repos/ollama/ollama/issues/5887/events
https://github.com/ollama/ollama/pull/5887
2,426,042,598
PR_kwDOJ0Z1Ps52Qr7q
5,887
cmd/server: utilizing OS copy to transfer blobs if the server is local
{ "login": "joshyan1", "id": 76125168, "node_id": "MDQ6VXNlcjc2MTI1MTY4", "avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4", "gravatar_id": "", "url": "https://api.github.com/users/joshyan1", "html_url": "https://github.com/joshyan1", "followers_url": "https://api.github.com/users/jos...
[]
open
false
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
[ { "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/us...
null
1
2024-07-23T20:13:08
2024-11-21T18:22:06
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5887", "html_url": "https://github.com/ollama/ollama/pull/5887", "diff_url": "https://github.com/ollama/ollama/pull/5887.diff", "patch_url": "https://github.com/ollama/ollama/pull/5887.patch", "merged_at": null }
This PR looks to utilize local copies to a local server prior to posting the blob through the server
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5887/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5887/timeline
null
null
true
----- issue #3062 -----
https://api.github.com/repos/ollama/ollama/issues/3062
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3062/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3062/comments
https://api.github.com/repos/ollama/ollama/issues/3062/events
https://github.com/ollama/ollama/issues/3062
2,179,912,174
I_kwDOJ0Z1Ps6B7tHu
3,062
Ubuntu: Snap installation
{ "login": "MartinsRepo", "id": 10252728, "node_id": "MDQ6VXNlcjEwMjUyNzI4", "avatar_url": "https://avatars.githubusercontent.com/u/10252728?v=4", "gravatar_id": "", "url": "https://api.github.com/users/MartinsRepo", "html_url": "https://github.com/MartinsRepo", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
1
2024-03-11T18:26:44
2024-03-12T01:50:19
2024-03-12T01:50:19
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Installing Ollama with: sudo snap install ollama --beta is working correctly. Ollama list is showing it'working. Changing the default folder with: sudo snap set ollama models=/path to my new ollama model storage/ is accepted. Another ollama list gives: Error: could not connect to ollama app, is it running? After...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3062/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3062/timeline
null
completed
false
----- issue #4586 -----
https://api.github.com/repos/ollama/ollama/issues/4586
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4586/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4586/comments
https://api.github.com/repos/ollama/ollama/issues/4586/events
https://github.com/ollama/ollama/issues/4586
2,312,025,800
I_kwDOJ0Z1Ps6JzrbI
4,586
Installation path issue
{ "login": "SnowWindDancing", "id": 60132911, "node_id": "MDQ6VXNlcjYwMTMyOTEx", "avatar_url": "https://avatars.githubusercontent.com/u/60132911?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SnowWindDancing", "html_url": "https://github.com/SnowWindDancing", "followers_url": "https://api...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5860134234, "node_id": ...
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
1
2024-05-23T06:02:13
2024-05-23T17:48:52
2024-05-23T17:48:40
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I hope to specify the installation directory for the Windows version installation
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4586/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4586/timeline
null
completed
false
----- issue #1890 -----
https://api.github.com/repos/ollama/ollama/issues/1890
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1890/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1890/comments
https://api.github.com/repos/ollama/ollama/issues/1890/events
https://github.com/ollama/ollama/issues/1890
2,074,014,980
I_kwDOJ0Z1Ps57nvUE
1,890
A way to update all downloaded models
{ "login": "Zig1375", "id": 2699034, "node_id": "MDQ6VXNlcjI2OTkwMzQ=", "avatar_url": "https://avatars.githubusercontent.com/u/2699034?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Zig1375", "html_url": "https://github.com/Zig1375", "followers_url": "https://api.github.com/users/Zig1375/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
13
2024-01-10T10:03:21
2024-12-06T21:38:59
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I'd like to have a way to update all downloaded models. Right now I have to pull each model separately.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1890/reactions", "total_count": 13, "+1": 12, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/1890/timeline
null
null
false
----- pull request #6675 -----
https://api.github.com/repos/ollama/ollama/issues/6675
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6675/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6675/comments
https://api.github.com/repos/ollama/ollama/issues/6675/events
https://github.com/ollama/ollama/pull/6675
2,510,033,036
PR_kwDOJ0Z1Ps56pVKS
6,675
Bugfix for #6656 (Fixed redirect check if direct URL is already Present)
{ "login": "Tobix99", "id": 22603015, "node_id": "MDQ6VXNlcjIyNjAzMDE1", "avatar_url": "https://avatars.githubusercontent.com/u/22603015?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Tobix99", "html_url": "https://github.com/Tobix99", "followers_url": "https://api.github.com/users/Tobix9...
[]
open
false
null
[]
null
4
2024-09-06T09:49:49
2024-12-29T19:40:49
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6675", "html_url": "https://github.com/ollama/ollama/pull/6675", "diff_url": "https://github.com/ollama/ollama/pull/6675.diff", "patch_url": "https://github.com/ollama/ollama/pull/6675.patch", "merged_at": null }
Sorry, this is a Bugfix for my old PR from [yesterday](https://github.com/ollama/ollama/pull/6656#issue-2507674135). I hadn't tested it thoroughly yesterday and noticed another bug in the logic. With this new logic it should return - the requestURL on Status OK - the Redirect URL on Status TemporaryRedirect - and...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6675/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6675/timeline
null
null
true
----- pull request #2441 -----
https://api.github.com/repos/ollama/ollama/issues/2441
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2441/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2441/comments
https://api.github.com/repos/ollama/ollama/issues/2441/events
https://github.com/ollama/ollama/pull/2441
2,128,273,316
PR_kwDOJ0Z1Ps5mimt0
2,441
Allow Tauri requests by default (tauri://)
{ "login": "da-z", "id": 3681019, "node_id": "MDQ6VXNlcjM2ODEwMTk=", "avatar_url": "https://avatars.githubusercontent.com/u/3681019?v=4", "gravatar_id": "", "url": "https://api.github.com/users/da-z", "html_url": "https://github.com/da-z", "followers_url": "https://api.github.com/users/da-z/followers", ...
[]
closed
false
null
[]
null
1
2024-02-10T10:12:09
2024-11-21T05:53:16
2024-11-21T05:53:16
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2441", "html_url": "https://github.com/ollama/ollama/pull/2441", "diff_url": "https://github.com/ollama/ollama/pull/2441.diff", "patch_url": "https://github.com/ollama/ollama/pull/2441.patch", "merged_at": null }
In preparation of maybe supporting `tauri://` schema by default, I refactored a bit the CORS part of the config and added a test.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2441/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2441/timeline
null
null
true
----- issue #8535 -----
https://api.github.com/repos/ollama/ollama/issues/8535
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8535/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8535/comments
https://api.github.com/repos/ollama/ollama/issues/8535/events
https://github.com/ollama/ollama/issues/8535
2,804,389,409
I_kwDOJ0Z1Ps6nJ5Yh
8,535
Failsafe model download method?
{ "login": "paboum", "id": 54635274, "node_id": "MDQ6VXNlcjU0NjM1Mjc0", "avatar_url": "https://avatars.githubusercontent.com/u/54635274?v=4", "gravatar_id": "", "url": "https://api.github.com/users/paboum", "html_url": "https://github.com/paboum", "followers_url": "https://api.github.com/users/paboum/fo...
[]
closed
false
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers"...
[ { "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/...
null
10
2025-01-22T13:25:05
2025-01-30T09:12:47
2025-01-30T00:09:32
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I'm trying to pull the famous deepseek-r1 model today: ``` time=2025-01-22T14:22:30.734+01:00 level=INFO source=download.go:370 msg="4cd576d9aa16 part 23 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection." time=2025-01-22T14:22:30.734+01:00 level=INFO source=downl...
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers"...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8535/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8535/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6893
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6893/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6893/comments
https://api.github.com/repos/ollama/ollama/issues/6893/events
https://github.com/ollama/ollama/issues/6893
2,538,984,231
I_kwDOJ0Z1Ps6XVdMn
6,893
Llama3.170b through web api gives different quality then command line
{ "login": "remco-pc", "id": 8077908, "node_id": "MDQ6VXNlcjgwNzc5MDg=", "avatar_url": "https://avatars.githubusercontent.com/u/8077908?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remco-pc", "html_url": "https://github.com/remco-pc", "followers_url": "https://api.github.com/users/remco...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-09-20T14:35:37
2024-09-20T16:41:14
2024-09-20T16:41:13
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? through web i ask this: can you give an svg circle example ? Below is simple SVG (alable Graphics) example draws a red: ``` <svg width="100 height="100 ="50"="50 r="40 stroke="green stroke-width4" fillred" /> svg> ``Here's breakdown of the used in this circle example- `` and ``: Thes...
{ "login": "remco-pc", "id": 8077908, "node_id": "MDQ6VXNlcjgwNzc5MDg=", "avatar_url": "https://avatars.githubusercontent.com/u/8077908?v=4", "gravatar_id": "", "url": "https://api.github.com/users/remco-pc", "html_url": "https://github.com/remco-pc", "followers_url": "https://api.github.com/users/remco...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6893/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6893/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2184
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2184/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2184/comments
https://api.github.com/repos/ollama/ollama/issues/2184/events
https://github.com/ollama/ollama/issues/2184
2,099,916,090
I_kwDOJ0Z1Ps59Ki06
2,184
docker swarm service create doesn't use GPU
{ "login": "go-laoji", "id": 92168729, "node_id": "U_kgDOBX5iGQ", "avatar_url": "https://avatars.githubusercontent.com/u/92168729?v=4", "gravatar_id": "", "url": "https://api.github.com/users/go-laoji", "html_url": "https://github.com/go-laoji", "followers_url": "https://api.github.com/users/go-laoji/fo...
[]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
4
2024-01-25T09:12:33
2025-01-27T15:41:37
2024-03-27T20:46:39
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
``` docker service create \ --name ollama \ --mount type=bind,source=/tmp/ollama,destination=/root/.ollama \ --constraint node.role==worker \ --generic-resource "GPU=2" \ --mount type=bind,source=/dev/nvidia0,target=/dev/nvidia0 \ --mount type=bind,source=/dev/nvidiactl,target=/dev/nvidiact...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2184/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2184/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4533
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4533/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4533/comments
https://api.github.com/repos/ollama/ollama/issues/4533/events
https://github.com/ollama/ollama/pull/4533
2,305,071,842
PR_kwDOJ0Z1Ps5v6Qyq
4,533
Move the parser back + handle utf16 files
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
[]
closed
false
null
[]
null
0
2024-05-20T04:45:53
2024-05-20T18:26:46
2024-05-20T18:26:46
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4533", "html_url": "https://github.com/ollama/ollama/pull/4533", "diff_url": "https://github.com/ollama/ollama/pull/4533.diff", "patch_url": "https://github.com/ollama/ollama/pull/4533.patch", "merged_at": "2024-05-20T18:26:46" }
This moves the parser back to `parser/` and also adds support for decoding utf16le and utf16be files. Fixes #4503
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4533/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4533/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5210
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5210/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5210/comments
https://api.github.com/repos/ollama/ollama/issues/5210/events
https://github.com/ollama/ollama/pull/5210
2,367,552,559
PR_kwDOJ0Z1Ps5zPOgW
5,210
cabelo@opensuse.org - Add LTO
{ "login": "cabelo", "id": 675645, "node_id": "MDQ6VXNlcjY3NTY0NQ==", "avatar_url": "https://avatars.githubusercontent.com/u/675645?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cabelo", "html_url": "https://github.com/cabelo", "followers_url": "https://api.github.com/users/cabelo/follow...
[]
closed
false
null
[]
null
0
2024-06-22T04:40:59
2024-08-23T01:58:47
2024-08-23T01:58:47
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5210", "html_url": "https://github.com/ollama/ollama/pull/5210", "diff_url": "https://github.com/ollama/ollama/pull/5210.diff", "patch_url": "https://github.com/ollama/ollama/pull/5210.patch", "merged_at": null }
null
{ "login": "cabelo", "id": 675645, "node_id": "MDQ6VXNlcjY3NTY0NQ==", "avatar_url": "https://avatars.githubusercontent.com/u/675645?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cabelo", "html_url": "https://github.com/cabelo", "followers_url": "https://api.github.com/users/cabelo/follow...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5210/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5210/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7244
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7244/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7244/comments
https://api.github.com/repos/ollama/ollama/issues/7244/events
https://github.com/ollama/ollama/issues/7244
2,595,506,668
I_kwDOJ0Z1Ps6atEns
7,244
Pulling models from private OCI Registries
{ "login": "mitja", "id": 234870, "node_id": "MDQ6VXNlcjIzNDg3MA==", "avatar_url": "https://avatars.githubusercontent.com/u/234870?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mitja", "html_url": "https://github.com/mitja", "followers_url": "https://api.github.com/users/mitja/followers"...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
5
2024-10-17T18:57:43
2025-01-19T18:40:04
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
According to #2388 it should be possible to push and pull models to a Docker/OCI registry (without authentication). Even though it's an unsupported feature, I find it very useful and would like to contribute a short description of how to do this. Potential use cases are - organisation-internal registries for org...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7244/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7244/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/4460
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4460/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4460/comments
https://api.github.com/repos/ollama/ollama/issues/4460/events
https://github.com/ollama/ollama/pull/4460
2,298,939,196
PR_kwDOJ0Z1Ps5vlpNE
4,460
fix the cpu estimatedTotal memory + get the expiry time for loading models
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
[]
closed
false
null
[]
null
0
2024-05-15T22:17:28
2024-05-15T22:29:39
2024-05-15T22:29:39
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4460", "html_url": "https://github.com/ollama/ollama/pull/4460", "diff_url": "https://github.com/ollama/ollama/pull/4460.diff", "patch_url": "https://github.com/ollama/ollama/pull/4460.patch", "merged_at": null }
null
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4460/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4460/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1643
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1643/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1643/comments
https://api.github.com/repos/ollama/ollama/issues/1643/events
https://github.com/ollama/ollama/issues/1643
2,051,245,632
I_kwDOJ0Z1Ps56Q4ZA
1,643
Example to run ollama on OpenShift
{ "login": "jeremyssc", "id": 143193860, "node_id": "U_kgDOCIj3BA", "avatar_url": "https://avatars.githubusercontent.com/u/143193860?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jeremyssc", "html_url": "https://github.com/jeremyssc", "followers_url": "https://api.github.com/users/jeremy...
[ { "id": 5667396191, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw", "url": "https://api.github.com/repos/ollama/ollama/labels/documentation", "name": "documentation", "color": "0075ca", "default": true, "description": "Improvements or additions to documentation" }, { "id": 6677677816, ...
closed
false
null
[]
null
2
2023-12-20T20:47:50
2024-05-10T00:25:13
2024-05-10T00:25:13
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hello, I ran into a permission problem when running the Kubernetes example on OpenShift, since the example didn't create a persistent volume claim and a volume. Attached to this issue you will find a txt file with the manifests I used to make it work, in case it helps. [openshift-ollama-example.txt](https://git...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1643/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1643/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2011
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2011/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2011/comments
https://api.github.com/repos/ollama/ollama/issues/2011/events
https://github.com/ollama/ollama/issues/2011
2,083,121,620
I_kwDOJ0Z1Ps58KenU
2,011
Parameters loaded from Modelfile are cast to int in /show parameters
{ "login": "nathanpbell", "id": 3697, "node_id": "MDQ6VXNlcjM2OTc=", "avatar_url": "https://avatars.githubusercontent.com/u/3697?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nathanpbell", "html_url": "https://github.com/nathanpbell", "followers_url": "https://api.github.com/users/nathan...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-01-16T06:44:17
2024-01-16T18:35:25
2024-01-16T18:35:25
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
It appears if I set float value parameters in the Modelfile, when I run that model and run `/show parameters` those floats get cast to ints. ### Steps to reproduce Create a Modelfile: ``` FROM mistral:text PARAMETER num_ctx 32000 PARAMETER seed 42 PARAMETER num_predict 128 PARAMETER temperature 0.7 PARAMET...
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2011/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2011/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6022
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6022/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6022/comments
https://api.github.com/repos/ollama/ollama/issues/6022/events
https://github.com/ollama/ollama/issues/6022
2,433,675,409
I_kwDOJ0Z1Ps6RDvCR
6,022
ollama version is 0.0.0 (windows preview)
{ "login": "dispather", "id": 62810211, "node_id": "MDQ6VXNlcjYyODEwMjEx", "avatar_url": "https://avatars.githubusercontent.com/u/62810211?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dispather", "html_url": "https://github.com/dispather", "followers_url": "https://api.github.com/users/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-07-28T00:46:25
2024-07-28T13:07:01
2024-07-28T13:07:01
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I installed the Windows preview of ollama and found the GPU is not working when using ollama. So I checked the ollama version. C:\Users\mightyhun\AppData\Local\Programs\Ollama>ollama -v ollama version is 0.0.0 Warning: client version is 0.3.0 It shows a mismatch of the ollama version, and I checked ollam...
{ "login": "dispather", "id": 62810211, "node_id": "MDQ6VXNlcjYyODEwMjEx", "avatar_url": "https://avatars.githubusercontent.com/u/62810211?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dispather", "html_url": "https://github.com/dispather", "followers_url": "https://api.github.com/users/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6022/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6022/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4900
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4900/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4900/comments
https://api.github.com/repos/ollama/ollama/issues/4900/events
https://github.com/ollama/ollama/issues/4900
2,339,921,364
I_kwDOJ0Z1Ps6LeF3U
4,900
MiniCPM-Llama3-V-2_5
{ "login": "kotaxyz", "id": 105466290, "node_id": "U_kgDOBklJsg", "avatar_url": "https://avatars.githubusercontent.com/u/105466290?v=4", "gravatar_id": "", "url": "https://api.github.com/users/kotaxyz", "html_url": "https://github.com/kotaxyz", "followers_url": "https://api.github.com/users/kotaxyz/foll...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
19
2024-06-07T08:43:55
2024-08-13T03:22:20
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
This is the best open source vision model I have ever tried. We need support for it in ollama.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4900/reactions", "total_count": 17, "+1": 14, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 3, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4900/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/1703
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1703/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1703/comments
https://api.github.com/repos/ollama/ollama/issues/1703/events
https://github.com/ollama/ollama/issues/1703
2,055,421,973
I_kwDOJ0Z1Ps56g0AV
1,703
Error: llama runner process has terminated. when running dolphin-mixtral
{ "login": "G-only1", "id": 96492140, "node_id": "U_kgDOBcBabA", "avatar_url": "https://avatars.githubusercontent.com/u/96492140?v=4", "gravatar_id": "", "url": "https://api.github.com/users/G-only1", "html_url": "https://github.com/G-only1", "followers_url": "https://api.github.com/users/G-only1/follow...
[]
closed
false
null
[]
null
9
2023-12-25T06:21:44
2024-01-08T21:42:05
2024-01-08T21:42:05
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
When I run `ollama run dolphin-mixtral`, it gives the error: Error: llama runner process has terminated.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1703/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1703/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1285
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1285/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1285/comments
https://api.github.com/repos/ollama/ollama/issues/1285/events
https://github.com/ollama/ollama/issues/1285
2,011,848,986
I_kwDOJ0Z1Ps536mEa
1,285
Support `GPT2LMHeadModel` architecture
{ "login": "jhagelback", "id": 3829669, "node_id": "MDQ6VXNlcjM4Mjk2Njk=", "avatar_url": "https://avatars.githubusercontent.com/u/3829669?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jhagelback", "html_url": "https://github.com/jhagelback", "followers_url": "https://api.github.com/users...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5789807732, "node_id": ...
closed
false
null
[]
null
1
2023-11-27T09:23:49
2024-12-23T03:19:27
2024-12-23T03:19:27
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Any plans on supporting _GPT2LMHeadModel_ architecture?
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1285/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1285/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7843
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7843/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7843/comments
https://api.github.com/repos/ollama/ollama/issues/7843/events
https://github.com/ollama/ollama/issues/7843
2,695,221,887
I_kwDOJ0Z1Ps6gpdJ_
7,843
New Tool Calling issues.
{ "login": "AssassinUKG", "id": 5285547, "node_id": "MDQ6VXNlcjUyODU1NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/5285547?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AssassinUKG", "html_url": "https://github.com/AssassinUKG", "followers_url": "https://api.github.com/us...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
0
2024-11-26T16:02:31
2024-11-26T16:08:20
2024-11-26T16:08:20
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? In my image below you can see me using the code functions provided by the ollama python examples on GitHub, specifically the chat.py example (added client for local ollama) Sometimes the values are treated as strings and other times as integer, although the function definition is an int typ...
{ "login": "AssassinUKG", "id": 5285547, "node_id": "MDQ6VXNlcjUyODU1NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/5285547?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AssassinUKG", "html_url": "https://github.com/AssassinUKG", "followers_url": "https://api.github.com/us...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7843/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7843/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6863
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6863/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6863/comments
https://api.github.com/repos/ollama/ollama/issues/6863/events
https://github.com/ollama/ollama/issues/6863
2,534,994,308
I_kwDOJ0Z1Ps6XGPGE
6,863
Qwen/Qwen2.5-Coder-7B
{ "login": "wuweinero", "id": 32291523, "node_id": "MDQ6VXNlcjMyMjkxNTIz", "avatar_url": "https://avatars.githubusercontent.com/u/32291523?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wuweinero", "html_url": "https://github.com/wuweinero", "followers_url": "https://api.github.com/users/...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
4
2024-09-19T00:24:37
2024-09-21T02:36:24
2024-09-20T17:36:15
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://huggingface.co/Qwen/Qwen2.5-Coder-7B
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6863/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6863/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2938
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2938/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2938/comments
https://api.github.com/repos/ollama/ollama/issues/2938/events
https://github.com/ollama/ollama/issues/2938
2,169,706,145
I_kwDOJ0Z1Ps6BUxah
2,938
Windows install path
{ "login": "pozzo-balbi", "id": 3755138, "node_id": "MDQ6VXNlcjM3NTUxMzg=", "avatar_url": "https://avatars.githubusercontent.com/u/3755138?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pozzo-balbi", "html_url": "https://github.com/pozzo-balbi", "followers_url": "https://api.github.com/us...
[ { "id": 5860134234, "node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg", "url": "https://api.github.com/repos/ollama/ollama/labels/windows", "name": "windows", "color": "0052CC", "default": false, "description": "" } ]
closed
false
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers"...
[ { "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/...
null
13
2024-03-05T16:51:30
2024-11-23T19:07:45
2024-03-21T13:20:16
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi, please add an option to choose an installation path, e.g. `c:\program files\ollama`, during install. Installing under the user's home directory is a bad idea security-wise. Thanks
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2938/reactions", "total_count": 25, "+1": 25, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2938/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2953
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2953/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2953/comments
https://api.github.com/repos/ollama/ollama/issues/2953/events
https://github.com/ollama/ollama/issues/2953
2,171,510,787
I_kwDOJ0Z1Ps6BbqAD
2,953
EOF of starcoder2:15b on Ollama 0.1.28
{ "login": "owenzhao", "id": 2182896, "node_id": "MDQ6VXNlcjIxODI4OTY=", "avatar_url": "https://avatars.githubusercontent.com/u/2182896?v=4", "gravatar_id": "", "url": "https://api.github.com/users/owenzhao", "html_url": "https://github.com/owenzhao", "followers_url": "https://api.github.com/users/owenz...
[]
closed
false
null
[]
null
38
2024-03-06T13:26:53
2024-03-21T20:06:50
2024-03-12T00:11:48
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Mac mini M1 16GB 512GB macOS Sonoma 14.4 (23E214) ```bash ollama run starcoder2:15b pulling manifest pulling dc5deb763c38... 100% ▕████████████████████████████████████████████████▏ 9.1 GB pulling 4ec42cd966c9... 100% ▕████████████████████████████████████████████████▏ 12 KB ...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2953/reactions", "total_count": 17, "+1": 17, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2953/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/22
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/22/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/22/comments
https://api.github.com/repos/ollama/ollama/issues/22/events
https://github.com/ollama/ollama/issues/22
1,781,579,627
I_kwDOJ0Z1Ps5qML9r
22
add a flag to override template prompts
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
2
2023-06-29T22:11:13
2023-07-24T20:50:17
2023-07-24T20:50:17
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/22/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/22/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3531
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3531/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3531/comments
https://api.github.com/repos/ollama/ollama/issues/3531/events
https://github.com/ollama/ollama/issues/3531
2,230,171,167
I_kwDOJ0Z1Ps6E7bYf
3,531
Installation failure on linux due to directory `/usr/share/ollama` not exists
{ "login": "hualet", "id": 2023967, "node_id": "MDQ6VXNlcjIwMjM5Njc=", "avatar_url": "https://avatars.githubusercontent.com/u/2023967?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hualet", "html_url": "https://github.com/hualet", "followers_url": "https://api.github.com/users/hualet/foll...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
2
2024-04-08T03:20:25
2024-05-05T00:35:28
2024-05-05T00:34:54
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Installation failure on linux due to directory `/usr/share/ollama` not exists ➜ ~ curl -fsSL https://ollama.com/install.sh | sh >>> Downloading ollama... ######################################################################## 100.0%##O#- # ...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3531/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3531/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1396
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1396/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1396/comments
https://api.github.com/repos/ollama/ollama/issues/1396/events
https://github.com/ollama/ollama/issues/1396
2,027,758,522
I_kwDOJ0Z1Ps543SO6
1,396
Continuous batching support
{ "login": "Huvinesh-Rajendran-12", "id": 81321926, "node_id": "MDQ6VXNlcjgxMzIxOTI2", "avatar_url": "https://avatars.githubusercontent.com/u/81321926?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Huvinesh-Rajendran-12", "html_url": "https://github.com/Huvinesh-Rajendran-12", "followers_...
[]
closed
false
null
[]
null
14
2023-12-06T06:28:28
2024-09-04T03:35:49
2024-09-04T03:35:49
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Does Ollama support continuous batching for concurrent requests? I couldn't find anything in the documentation.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1396/reactions", "total_count": 16, "+1": 11, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 5 }
https://api.github.com/repos/ollama/ollama/issues/1396/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/605
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/605/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/605/comments
https://api.github.com/repos/ollama/ollama/issues/605/events
https://github.com/ollama/ollama/pull/605
1,913,902,538
PR_kwDOJ0Z1Ps5bQJq9
605
do not unload nouveau driver
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[]
closed
false
null
[]
null
0
2023-09-26T16:37:51
2023-09-26T16:53:06
2023-09-26T16:53:05
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/605", "html_url": "https://github.com/ollama/ollama/pull/605", "diff_url": "https://github.com/ollama/ollama/pull/605.diff", "patch_url": "https://github.com/ollama/ollama/pull/605.patch", "merged_at": "2023-09-26T16:53:05" }
Unloading this driver on a desktop kills the display, which is not optimal. Instead, inform the user that they need to reboot.
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/605/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/605/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4578
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4578/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4578/comments
https://api.github.com/repos/ollama/ollama/issues/4578/events
https://github.com/ollama/ollama/pull/4578
2,311,037,663
PR_kwDOJ0Z1Ps5wOyDI
4,578
add phi 3 medium & moondream 2 in readme
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/us...
[]
closed
false
null
[]
null
0
2024-05-22T16:53:22
2024-05-22T16:53:46
2024-05-22T16:53:45
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4578", "html_url": "https://github.com/ollama/ollama/pull/4578", "diff_url": "https://github.com/ollama/ollama/pull/4578.diff", "patch_url": "https://github.com/ollama/ollama/pull/4578.patch", "merged_at": "2024-05-22T16:53:45" }
null
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/us...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4578/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4578/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1301
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1301/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1301/comments
https://api.github.com/repos/ollama/ollama/issues/1301/events
https://github.com/ollama/ollama/pull/1301
2,014,300,064
PR_kwDOJ0Z1Ps5gi_q2
1,301
Correct MacOS Host Port in FAQ
{ "login": "ToasterUwU", "id": 43654377, "node_id": "MDQ6VXNlcjQzNjU0Mzc3", "avatar_url": "https://avatars.githubusercontent.com/u/43654377?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ToasterUwU", "html_url": "https://github.com/ToasterUwU", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
0
2023-11-28T12:11:17
2023-11-29T18:26:58
2023-11-29T16:44:04
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1301", "html_url": "https://github.com/ollama/ollama/pull/1301", "diff_url": "https://github.com/ollama/ollama/pull/1301.diff", "patch_url": "https://github.com/ollama/ollama/pull/1301.patch", "merged_at": "2023-11-29T16:44:04" }
For some reason, the port for macOS in this how-to was different from the one mentioned before and the one used after in the Linux example. Skimming over this and copy-pasting it as a Mac user would result in the ollama program running on a different port, making it unreachable unless the port is changed in all o...
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1301/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1301/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/746
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/746/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/746/comments
https://api.github.com/repos/ollama/ollama/issues/746/events
https://github.com/ollama/ollama/issues/746
1,934,101,601
I_kwDOJ0Z1Ps5zSAxh
746
Support multi-modal models
{ "login": "arian81", "id": 35879206, "node_id": "MDQ6VXNlcjM1ODc5MjA2", "avatar_url": "https://avatars.githubusercontent.com/u/35879206?v=4", "gravatar_id": "", "url": "https://api.github.com/users/arian81", "html_url": "https://github.com/arian81", "followers_url": "https://api.github.com/users/arian8...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
[ { "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/us...
null
21
2023-10-10T01:14:13
2024-03-06T13:12:22
2023-12-16T09:00:54
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
This is currently one of the best open source multi-modal models based on llama 7. It would be nice to be able to host it in ollama. https://llava-vl.github.io/
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/746/reactions", "total_count": 23, "+1": 23, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/746/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8494
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8494/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8494/comments
https://api.github.com/repos/ollama/ollama/issues/8494/events
https://github.com/ollama/ollama/issues/8494
2,798,081,134
I_kwDOJ0Z1Ps6mx1Ru
8,494
Can the /API/Chat interface support session related parameters?
{ "login": "lx687", "id": 192780267, "node_id": "U_kgDOC32X6w", "avatar_url": "https://avatars.githubusercontent.com/u/192780267?v=4", "gravatar_id": "", "url": "https://api.github.com/users/lx687", "html_url": "https://github.com/lx687", "followers_url": "https://api.github.com/users/lx687/followers", ...
[]
closed
false
null
[]
null
3
2025-01-20T03:21:54
2025-01-24T09:30:06
2025-01-24T09:30:06
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I want to control the Q&A behavior of the same user through session ID data. Does the current interface support it? ![Image](https://github.com/user-attachments/assets/407a0515-8118-4612-958f-2ca6f6f74fbc)
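The /api/chat endpoint itself is stateless, so a per-user "session" is normally kept on the client side by resending the accumulated message history with each request. Below is a minimal client-side sketch, assuming the documented /api/chat request shape (model, messages, stream); the session map, session ID, and model name are illustrative, not parameters of the API.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// sessions keeps the chat history per session ID on the client side;
// the server does not track sessions itself.
var sessions = map[string][]Message{}

func chat(sessionID, prompt string) (string, error) {
	sessions[sessionID] = append(sessions[sessionID], Message{Role: "user", Content: prompt})

	body, err := json.Marshal(map[string]any{
		"model":    "llama3.2", // hypothetical model name
		"messages": sessions[sessionID],
		"stream":   false,
	})
	if err != nil {
		return "", err
	}
	resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var out struct {
		Message Message `json:"message"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	// Remember the assistant reply so the next turn carries the full context.
	sessions[sessionID] = append(sessions[sessionID], out.Message)
	return out.Message.Content, nil
}

func main() {
	reply, err := chat("user-42", "Hello!")
	if err != nil {
		panic(err)
	}
	fmt.Println(reply)
}
```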
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8494/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8494/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/479
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/479/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/479/comments
https://api.github.com/repos/ollama/ollama/issues/479/events
https://github.com/ollama/ollama/pull/479
1,884,836,473
PR_kwDOJ0Z1Ps5ZuhiR
479
update dockerfile
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[]
closed
false
null
[]
null
0
2023-09-06T22:25:53
2023-09-06T22:44:25
2023-09-06T22:44:24
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/479", "html_url": "https://github.com/ollama/ollama/pull/479", "diff_url": "https://github.com/ollama/ollama/pull/479.diff", "patch_url": "https://github.com/ollama/ollama/pull/479.patch", "merged_at": "2023-09-06T22:44:24" }
``` docker build -t ollama . docker run -d -p 11434:11434 -v $HOME/.ollama:/home/ollama/.ollama ollama ``` This container image does not build with GPU support. That'll come later, after #454
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/479/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/479/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2725
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2725/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2725/comments
https://api.github.com/repos/ollama/ollama/issues/2725/events
https://github.com/ollama/ollama/issues/2725
2,152,229,961
I_kwDOJ0Z1Ps6ASGxJ
2,725
Ping api endpoint for more efficient network scanning
{ "login": "danemadsen", "id": 11537699, "node_id": "MDQ6VXNlcjExNTM3Njk5", "avatar_url": "https://avatars.githubusercontent.com/u/11537699?v=4", "gravatar_id": "", "url": "https://api.github.com/users/danemadsen", "html_url": "https://github.com/danemadsen", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
1
2024-02-24T09:27:14
2024-03-01T02:13:17
2024-03-01T02:13:16
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Currently I'm using the /api/tags endpoint for automated scanning of the network to find ollama. This is working fine, but it may be better to have a dedicated ping endpoint for this kind of operation.
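A minimal sketch of the kind of lightweight probe this request is about, assuming Ollama's default port 11434 and that a running server answers the root path with HTTP 200; the host list and helper name are hypothetical, chosen just for illustration.

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOllama reports whether host:port answers like an Ollama server.
// It only checks for an HTTP 200 on the root path, which is cheaper than
// fetching and parsing the /api/tags model list.
func probeOllama(host string, port int) bool {
	client := http.Client{Timeout: 2 * time.Second}
	resp, err := client.Get(fmt.Sprintf("http://%s:%d/", host, port))
	if err != nil {
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode == http.StatusOK
}

func main() {
	// Hypothetical hosts to scan; replace with the subnet you enumerate.
	for _, host := range []string{"127.0.0.1", "192.168.1.10"} {
		if probeOllama(host, 11434) {
			fmt.Println("ollama found at", host)
		}
	}
}
```

Hitting the root path avoids transferring the full model list that /api/tags returns, which is the overhead the request is trying to avoid.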
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2725/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2725/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3588
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3588/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3588/comments
https://api.github.com/repos/ollama/ollama/issues/3588/events
https://github.com/ollama/ollama/pull/3588
2,237,005,372
PR_kwDOJ0Z1Ps5sUmHe
3,588
remove header while getting model list
{ "login": "deepakdeore2004", "id": 313430, "node_id": "MDQ6VXNlcjMxMzQzMA==", "avatar_url": "https://avatars.githubusercontent.com/u/313430?v=4", "gravatar_id": "", "url": "https://api.github.com/users/deepakdeore2004", "html_url": "https://github.com/deepakdeore2004", "followers_url": "https://api.git...
[]
closed
false
null
[]
null
2
2024-04-11T06:31:14
2024-06-10T03:27:11
2024-06-10T01:57:41
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3588", "html_url": "https://github.com/ollama/ollama/pull/3588", "diff_url": "https://github.com/ollama/ollama/pull/3588.diff", "patch_url": "https://github.com/ollama/ollama/pull/3588.patch", "merged_at": null }
null
{ "login": "deepakdeore2004", "id": 313430, "node_id": "MDQ6VXNlcjMxMzQzMA==", "avatar_url": "https://avatars.githubusercontent.com/u/313430?v=4", "gravatar_id": "", "url": "https://api.github.com/users/deepakdeore2004", "html_url": "https://github.com/deepakdeore2004", "followers_url": "https://api.git...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3588/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3588/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1812
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1812/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1812/comments
https://api.github.com/repos/ollama/ollama/issues/1812/events
https://github.com/ollama/ollama/issues/1812
2,067,828,105
I_kwDOJ0Z1Ps57QI2J
1,812
IMPROVEMENT: Proper calcuation of the KV cache size inside of gpu::NumGPU() instead of the 3/4 magic number...
{ "login": "jukofyork", "id": 69222624, "node_id": "MDQ6VXNlcjY5MjIyNjI0", "avatar_url": "https://avatars.githubusercontent.com/u/69222624?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jukofyork", "html_url": "https://github.com/jukofyork", "followers_url": "https://api.github.com/users/...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[ { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api...
null
3
2024-01-05T18:13:55
2024-01-08T21:42:01
2024-01-08T21:42:01
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
See: https://github.com/jmorganca/ollama/issues/1800#issuecomment-1878955910 Feel free to pull out the stuff from that thread - it's only in there as I did quite a lot of research on this to try to figure out the OOM errors.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1812/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1812/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8275
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8275/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8275/comments
https://api.github.com/repos/ollama/ollama/issues/8275/events
https://github.com/ollama/ollama/issues/8275
2,764,530,224
I_kwDOJ0Z1Ps6kx2Iw
8,275
Magnet download
{ "login": "Zig-VS-TypeScript-VS", "id": 192610801, "node_id": "U_kgDOC3sB8Q", "avatar_url": "https://avatars.githubusercontent.com/u/192610801?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Zig-VS-TypeScript-VS", "html_url": "https://github.com/Zig-VS-TypeScript-VS", "followers_url": "ht...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
1
2024-12-31T16:38:09
2025-01-08T17:42:39
2025-01-08T17:42:39
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Support magnet downloads for models. Magnet downloads for ollama would save bandwidth, reduce disk wear, and give faster speeds. ### Need to do - The ollama site generates a magnet link for each model. - ollama seeds the magnets.
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/us...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8275/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8275/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1727
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1727/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1727/comments
https://api.github.com/repos/ollama/ollama/issues/1727/events
https://github.com/ollama/ollama/issues/1727
2,057,088,840
I_kwDOJ0Z1Ps56nK9I
1,727
ollama doesn't use system RAM
{ "login": "DrGood01", "id": 130962326, "node_id": "U_kgDOB85Tlg", "avatar_url": "https://avatars.githubusercontent.com/u/130962326?v=4", "gravatar_id": "", "url": "https://api.github.com/users/DrGood01", "html_url": "https://github.com/DrGood01", "followers_url": "https://api.github.com/users/DrGood01/...
[ { "id": 6430601766, "node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg", "url": "https://api.github.com/repos/ollama/ollama/labels/nvidia", "name": "nvidia", "color": "8CDB00", "default": false, "description": "Issues relating to Nvidia GPUs and CUDA" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
29
2023-12-27T08:43:44
2025-01-15T16:12:00
2024-05-16T23:11:45
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I'm running Ollama on an Ubuntu 22 Linux laptop with 32 GB of RAM and an NVIDIA GTX 1650. Ollama loads the models exclusively into the graphics card's VRAM and doesn't use any of the system RAM at all. Very frustrating, as it exits with "Error: llama runner exited, you may not have enough available memory to run this model" ...
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1727/reactions", "total_count": 15, "+1": 15, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1727/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8186
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8186/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8186/comments
https://api.github.com/repos/ollama/ollama/issues/8186/events
https://github.com/ollama/ollama/issues/8186
2,753,145,864
I_kwDOJ0Z1Ps6kGawI
8,186
mllama doesn't support parallel requests yet - llama3.2-vision:11b for Standard_NC24ads_A100_v4
{ "login": "breddy-lgamerica", "id": 90788463, "node_id": "MDQ6VXNlcjkwNzg4NDYz", "avatar_url": "https://avatars.githubusercontent.com/u/90788463?v=4", "gravatar_id": "", "url": "https://api.github.com/users/breddy-lgamerica", "html_url": "https://github.com/breddy-lgamerica", "followers_url": "https://...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
6
2024-12-20T17:18:28
2025-01-13T01:43:48
2025-01-13T01:43:48
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? We are running the ollama container in a Kubernetes cluster in Azure on Standard_NC24ads_A100_v4 with the mllama model llama3.2-vision:11b, and we keep getting the error "mllama doesn't support parallel requests yet". How do we fix this? ### OS Linux, Docker ### GPU Nvidia ### CPU _No re...
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8186/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8186/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2678
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2678/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2678/comments
https://api.github.com/repos/ollama/ollama/issues/2678/events
https://github.com/ollama/ollama/issues/2678
2,149,095,107
I_kwDOJ0Z1Ps6AGJbD
2,678
Understanding Modelfile template with respect to conversational history
{ "login": "nikhil0360", "id": 43106856, "node_id": "MDQ6VXNlcjQzMTA2ODU2", "avatar_url": "https://avatars.githubusercontent.com/u/43106856?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nikhil0360", "html_url": "https://github.com/nikhil0360", "followers_url": "https://api.github.com/use...
[ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" } ]
closed
false
null
[]
null
2
2024-02-22T13:36:39
2024-03-17T17:57:03
2024-03-17T17:57:03
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
[proposed Label] question Hello, I want to understand how the conversational history is fed back into the template from the Modelfile. For example, for llama2:chat ``` TEMPLATE """[INST] <<SYS>>{{ .System }}<</SYS>> {{ .Prompt }} [/INST] """ ``` I am able to do conversational question answering ...
{ "login": "nikhil0360", "id": 43106856, "node_id": "MDQ6VXNlcjQzMTA2ODU2", "avatar_url": "https://avatars.githubusercontent.com/u/43106856?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nikhil0360", "html_url": "https://github.com/nikhil0360", "followers_url": "https://api.github.com/use...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2678/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2678/timeline
null
completed
false
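The record above asks how prior turns end up inside the llama2-style `[INST]` template. As a rough illustration only (not Ollama's actual implementation), here is a minimal Python sketch of how a client could flatten a chat history into that format; the function name and example turns are invented for the sketch:

```python
# Illustrative only: one common way to flatten (user, assistant) turns into the
# llama2-chat [INST] format shown in the record above, producing a single prompt string.
def build_llama2_prompt(system: str, turns: list, new_user_msg: str) -> str:
    """turns is a list of (user_message, assistant_reply) pairs from earlier in the chat."""
    prompt = ""
    for i, (user_msg, assistant_reply) in enumerate(turns):
        sys_block = f"<<SYS>>{system}<</SYS>>\n\n" if i == 0 else ""
        prompt += f"[INST] {sys_block}{user_msg} [/INST] {assistant_reply} "
    # The newest user message has no reply yet, so it closes the prompt.
    sys_block = f"<<SYS>>{system}<</SYS>>\n\n" if not turns else ""
    prompt += f"[INST] {sys_block}{new_user_msg} [/INST]"
    return prompt

if __name__ == "__main__":
    history = [("What is the capital of France?", "Paris.")]
    print(build_llama2_prompt("You are a helpful assistant.", history, "And of Spain?"))
```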
https://api.github.com/repos/ollama/ollama/issues/3790
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3790/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3790/comments
https://api.github.com/repos/ollama/ollama/issues/3790/events
https://github.com/ollama/ollama/pull/3790
2,254,874,782
PR_kwDOJ0Z1Ps5tQ7bN
3,790
use vanity imports
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers"...
[]
closed
false
null
[]
null
1
2024-04-21T03:04:46
2024-04-21T03:30:17
2024-04-21T03:30:17
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3790", "html_url": "https://github.com/ollama/ollama/pull/3790", "diff_url": "https://github.com/ollama/ollama/pull/3790.diff", "patch_url": "https://github.com/ollama/ollama/pull/3790.patch", "merged_at": null }
null
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers"...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3790/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3790/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/8664
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8664/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8664/comments
https://api.github.com/repos/ollama/ollama/issues/8664/events
https://github.com/ollama/ollama/issues/8664
2,818,425,629
I_kwDOJ0Z1Ps6n_cMd
8,664
Wrong GPU size calculation for the `command-r7b:7b` model
{ "login": "vvidovic", "id": 3177210, "node_id": "MDQ6VXNlcjMxNzcyMTA=", "avatar_url": "https://avatars.githubusercontent.com/u/3177210?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vvidovic", "html_url": "https://github.com/vvidovic", "followers_url": "https://api.github.com/users/vvido...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6849881759, "node_id": "LA_kwDOJ0Z1Ps8AAAABmEjmnw...
open
false
null
[]
null
4
2025-01-29T14:44:48
2025-01-30T07:47:04
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I wasn't able to run the `command-r7b:7b` model while all other, larger models were running successfully. After some investigation and trial and error, I realized I could fix this issue by creating a new model that offloads fewer layers to the GPU. Initial state: ``` $ nvidia-smi Wed Jan 29 ...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8664/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8664/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6199
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6199/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6199/comments
https://api.github.com/repos/ollama/ollama/issues/6199/events
https://github.com/ollama/ollama/issues/6199
2,450,769,631
I_kwDOJ0Z1Ps6SE8bf
6,199
Ollama crashes with Deepseek-Coder-V2-Lite-Instruct
{ "login": "shockme", "id": 470676, "node_id": "MDQ6VXNlcjQ3MDY3Ng==", "avatar_url": "https://avatars.githubusercontent.com/u/470676?v=4", "gravatar_id": "", "url": "https://api.github.com/users/shockme", "html_url": "https://github.com/shockme", "followers_url": "https://api.github.com/users/shockme/fo...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
9
2024-08-06T12:31:24
2024-10-31T18:19:26
2024-10-31T18:19:26
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? The output is cut in the middle of generation. Here's the log: ``` Aug 06 15:10:46 user-desktop systemd[4465]: Started Ollama Service. Aug 06 15:10:46 user-desktop ollama[13639]: 2024/08/06 15:10:46 routes.go:1108: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VI...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6199/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6199/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4839
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4839/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4839/comments
https://api.github.com/repos/ollama/ollama/issues/4839/events
https://github.com/ollama/ollama/issues/4839
2,336,357,770
I_kwDOJ0Z1Ps6LQf2K
4,839
/api/list shows Start of CE 'expires_at'
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjha...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjha...
[ { "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.git...
null
0
2024-06-05T16:31:56
2024-06-05T18:19:25
2024-06-05T18:19:25
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Should not return the field ### OS _No response_ ### GPU _No response_ ### CPU _No response_ ### Ollama version _No response_
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjha...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4839/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4839/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4994
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4994/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4994/comments
https://api.github.com/repos/ollama/ollama/issues/4994/events
https://github.com/ollama/ollama/issues/4994
2,347,973,132
I_kwDOJ0Z1Ps6L8zoM
4,994
support for recurrent gemma
{ "login": "olumolu", "id": 162728301, "node_id": "U_kgDOCbMJbQ", "avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4", "gravatar_id": "", "url": "https://api.github.com/users/olumolu", "html_url": "https://github.com/olumolu", "followers_url": "https://api.github.com/users/olumolu/foll...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
3
2024-06-12T06:58:20
2024-06-27T18:26:32
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://huggingface.co/google/recurrentgemma-2b-it Support for recurrent gemma
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4994/reactions", "total_count": 4, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4994/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/2404
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2404/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2404/comments
https://api.github.com/repos/ollama/ollama/issues/2404/events
https://github.com/ollama/ollama/pull/2404
2,124,244,788
PR_kwDOJ0Z1Ps5mU6oE
2,404
Add GBNF grammar support
{ "login": "jquesnelle", "id": 687076, "node_id": "MDQ6VXNlcjY4NzA3Ng==", "avatar_url": "https://avatars.githubusercontent.com/u/687076?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jquesnelle", "html_url": "https://github.com/jquesnelle", "followers_url": "https://api.github.com/users/j...
[]
closed
false
null
[]
null
10
2024-02-08T02:28:09
2024-12-05T00:42:25
2024-12-05T00:42:24
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2404", "html_url": "https://github.com/ollama/ollama/pull/2404", "diff_url": "https://github.com/ollama/ollama/pull/2404.diff", "patch_url": "https://github.com/ollama/ollama/pull/2404.patch", "merged_at": null }
This is an updated version of #1606 that accounts for changes to the code since it was originally submitted. Adds support for llama.cpp's GBNF grammars, which enable very specific steering of model outputs. This feature is already used on the backend when the `format` option is set to `json`, but this allows any ...
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2404/reactions", "total_count": 37, "+1": 28, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 9, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2404/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/8272
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8272/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8272/comments
https://api.github.com/repos/ollama/ollama/issues/8272/events
https://github.com/ollama/ollama/issues/8272
2,764,069,808
I_kwDOJ0Z1Ps6kwFuw
8,272
Ollama models give slow inference with the Continue extension on VS Code Community Edition.
{ "login": "ENUMERA8OR", "id": 65213780, "node_id": "MDQ6VXNlcjY1MjEzNzgw", "avatar_url": "https://avatars.githubusercontent.com/u/65213780?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ENUMERA8OR", "html_url": "https://github.com/ENUMERA8OR", "followers_url": "https://api.github.com/use...
[]
closed
false
null
[]
null
1
2024-12-31T07:45:17
2025-01-13T01:49:18
2025-01-13T01:49:18
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Please tell me how to troubleshoot this issue. I want to increase the model inference speed in VS Code. Any suggestions would be helpful.
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8272/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8272/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6646
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6646/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6646/comments
https://api.github.com/repos/ollama/ollama/issues/6646/events
https://github.com/ollama/ollama/issues/6646
2,506,522,708
I_kwDOJ0Z1Ps6VZoBU
6,646
POST /v1/chat/completions returns 404 not 400 for model not found
{ "login": "codefromthecrypt", "id": 64215, "node_id": "MDQ6VXNlcjY0MjE1", "avatar_url": "https://avatars.githubusercontent.com/u/64215?v=4", "gravatar_id": "", "url": "https://api.github.com/users/codefromthecrypt", "html_url": "https://github.com/codefromthecrypt", "followers_url": "https://api.github...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-09-05T00:22:03
2024-09-09T22:21:47
2024-09-09T22:21:46
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? POST /v1/chat/completions returns 404 not 400 for model not found. Semantically, the better code here is 400, as it is an invalid argument on a correct route. Using 404 messages on a route that exists is confusing and had me doubting if the routes were mounted or not. This seems to be the sam...
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6646/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6646/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1026
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1026/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1026/comments
https://api.github.com/repos/ollama/ollama/issues/1026/events
https://github.com/ollama/ollama/pull/1026
1,980,680,803
PR_kwDOJ0Z1Ps5exUCq
1,026
Update client.py
{ "login": "eltociear", "id": 22633385, "node_id": "MDQ6VXNlcjIyNjMzMzg1", "avatar_url": "https://avatars.githubusercontent.com/u/22633385?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eltociear", "html_url": "https://github.com/eltociear", "followers_url": "https://api.github.com/users/...
[]
closed
false
null
[]
null
0
2023-11-07T06:59:18
2023-11-07T17:55:47
2023-11-07T17:55:47
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1026", "html_url": "https://github.com/ollama/ollama/pull/1026", "diff_url": "https://github.com/ollama/ollama/pull/1026.diff", "patch_url": "https://github.com/ollama/ollama/pull/1026.patch", "merged_at": "2023-11-07T17:55:47" }
recieve -> receive
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1026/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1026/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2204
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2204/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2204/comments
https://api.github.com/repos/ollama/ollama/issues/2204/events
https://github.com/ollama/ollama/issues/2204
2,101,882,474
I_kwDOJ0Z1Ps59SC5q
2,204
Questions about context size
{ "login": "swip3798", "id": 33018263, "node_id": "MDQ6VXNlcjMzMDE4MjYz", "avatar_url": "https://avatars.githubusercontent.com/u/33018263?v=4", "gravatar_id": "", "url": "https://api.github.com/users/swip3798", "html_url": "https://github.com/swip3798", "followers_url": "https://api.github.com/users/swi...
[]
closed
false
null
[]
null
9
2024-01-26T09:30:45
2024-12-16T06:37:58
2024-05-10T01:06:39
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Before I start, thank you for this amazing project! It's really great to run LLMs on my own hardware this easily. I am currently building a small story writing application that uses ollama to have a "cowriter" AI, that will write along with the user, similar to how AIDungeon or NovelAI work. Since the stories have n...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2204/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2204/timeline
null
completed
false
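The record above concerns context size for a long-form cowriter application. For reference, a minimal sketch of setting the context window per request through Ollama's REST API; the model name and prompt are placeholders, and `num_ctx` is the documented option controlling the context length:

```python
# Sketch: raise the context window per request via Ollama's /api/generate endpoint.
import json
import urllib.request

payload = {
    "model": "llama2",                      # placeholder model name
    "prompt": "Continue the story: ...",    # placeholder prompt
    "stream": False,
    "options": {"num_ctx": 4096},           # tokens of context the runner allocates
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```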
https://api.github.com/repos/ollama/ollama/issues/8594
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8594/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8594/comments
https://api.github.com/repos/ollama/ollama/issues/8594/events
https://github.com/ollama/ollama/issues/8594
2,811,633,348
I_kwDOJ0Z1Ps6nlh7E
8,594
Ollama stops accessing the GPU and reverts to CPU after running for extended periods
{ "login": "loca5790", "id": 96643826, "node_id": "U_kgDOBcKq8g", "avatar_url": "https://avatars.githubusercontent.com/u/96643826?v=4", "gravatar_id": "", "url": "https://api.github.com/users/loca5790", "html_url": "https://github.com/loca5790", "followers_url": "https://api.github.com/users/loca5790/fo...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
6
2025-01-26T15:52:00
2025-01-27T16:01:22
2025-01-27T15:55:33
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I have ollama set to keep the model persistent in VRAM because of my Home Assistant usage. I moved to an RTX 3090, and after sometimes 12 hours and other times more than a day, Ollama will stop using the GPU and revert to CPU only. It then gets stuck spooling the CPU up for hours at a time without generating any...
{ "login": "loca5790", "id": 96643826, "node_id": "U_kgDOBcKq8g", "avatar_url": "https://avatars.githubusercontent.com/u/96643826?v=4", "gravatar_id": "", "url": "https://api.github.com/users/loca5790", "html_url": "https://github.com/loca5790", "followers_url": "https://api.github.com/users/loca5790/fo...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8594/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8594/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5281
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5281/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5281/comments
https://api.github.com/repos/ollama/ollama/issues/5281/events
https://github.com/ollama/ollama/issues/5281
2,373,571,955
I_kwDOJ0Z1Ps6NedVz
5,281
update /show to work like command line show
{ "login": "iplayfast", "id": 751306, "node_id": "MDQ6VXNlcjc1MTMwNg==", "avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4", "gravatar_id": "", "url": "https://api.github.com/users/iplayfast", "html_url": "https://github.com/iplayfast", "followers_url": "https://api.github.com/users/ipla...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjha...
[ { "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.git...
null
2
2024-06-25T20:07:58
2024-06-28T20:15:53
2024-06-28T20:15:53
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I really like the new ``` ollama show <model> ``` feature. When running ollama from the command line or via the URL API, it would be nice to be able to get the same type of info without actually loading the model and requesting all the individual sections. Currently ``` >>> /show Available Commands: /show info Sh...
{ "login": "royjhan", "id": 65097070, "node_id": "MDQ6VXNlcjY1MDk3MDcw", "avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4", "gravatar_id": "", "url": "https://api.github.com/users/royjhan", "html_url": "https://github.com/royjhan", "followers_url": "https://api.github.com/users/royjha...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5281/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5281/timeline
null
completed
false
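The record above asks for `ollama show`-style output without loading the model. A minimal sketch of querying the `/api/show` endpoint for that metadata; the model name is a placeholder, and the exact request/response fields can differ between Ollama versions:

```python
# Sketch: fetch model metadata (template, parameters, etc.) without loading the model.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/show",
    # Older servers may expect {"name": ...} instead of {"model": ...}.
    data=json.dumps({"model": "llama3"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    info = json.loads(resp.read())

# Typical keys include "modelfile", "parameters", "template", and "details".
for key in ("parameters", "template"):
    print(f"--- {key} ---\n{info.get(key, '<missing>')}")
```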
https://api.github.com/repos/ollama/ollama/issues/3577
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3577/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3577/comments
https://api.github.com/repos/ollama/ollama/issues/3577/events
https://github.com/ollama/ollama/issues/3577
2,235,580,063
I_kwDOJ0Z1Ps6FQD6f
3,577
Error when running command-r-plus
{ "login": "taozhiyuai", "id": 146583103, "node_id": "U_kgDOCLyuPw", "avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4", "gravatar_id": "", "url": "https://api.github.com/users/taozhiyuai", "html_url": "https://github.com/taozhiyuai", "followers_url": "https://api.github.com/users/tao...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-04-10T13:17:09
2024-04-12T19:14:03
2024-04-12T19:14:03
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I imported the command-r-plus GGUF successfully, but got an error when running it. <img width="837" alt="截屏2024-04-10 21 14 08" src="https://github.com/ollama/ollama/assets/146583103/37fbd1d2-7b75-432b-be86-af2c31b414d7"> ### What did you expect to see? _No response_ ### Steps to reproduce _No response_ ##...
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3577/reactions", "total_count": 3, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/3577/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1577
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1577/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1577/comments
https://api.github.com/repos/ollama/ollama/issues/1577/events
https://github.com/ollama/ollama/issues/1577
2,046,062,096
I_kwDOJ0Z1Ps559G4Q
1,577
ValueError: Error raised by inference API HTTP code: 500, {"error":"failed to generate embedding"}
{ "login": "doanaktar", "id": 66390064, "node_id": "MDQ6VXNlcjY2MzkwMDY0", "avatar_url": "https://avatars.githubusercontent.com/u/66390064?v=4", "gravatar_id": "", "url": "https://api.github.com/users/doanaktar", "html_url": "https://github.com/doanaktar", "followers_url": "https://api.github.com/users/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA...
closed
false
null
[]
null
11
2023-12-18T08:47:20
2024-06-30T18:05:59
2024-05-07T00:06:43
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
When I'm trying Ollama for document chat, I get an API error when it tries to create the vectorstore. ```python from langchain.llms import Ollama from langchain.document_loaders import WebBaseLoader from langchain.embeddings import OllamaEmbeddings from langchain.vectorstores import Chroma from langchain.chains import ...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1577/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1577/timeline
null
completed
false
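The record above ends with a truncated LangChain snippet. Below is a minimal, self-contained sketch built from the same imports, isolating the embedding step where the HTTP 500 was reported; the URL, chunk sizes, splitter, and model name are placeholder assumptions, and the import paths match the older `langchain` package used in the original snippet:

```python
# Minimal reproduction sketch for the "failed to generate embedding" error path.
from langchain.document_loaders import WebBaseLoader
from langchain.embeddings import OllamaEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma

docs = WebBaseLoader("https://example.com").load()
splits = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

embeddings = OllamaEmbeddings(model="llama2")             # must be a model the local server has pulled
vectorstore = Chroma.from_documents(splits, embeddings)   # the embedding call is where the 500 surfaced
print(vectorstore.similarity_search("test", k=1))
```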
https://api.github.com/repos/ollama/ollama/issues/122
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/122/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/122/comments
https://api.github.com/repos/ollama/ollama/issues/122/events
https://github.com/ollama/ollama/issues/122
1,811,445,670
I_kwDOJ0Z1Ps5r-Hem
122
show ollama version
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/us...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
6
2023-07-19T08:32:07
2023-08-22T16:51:13
2023-08-22T16:51:12
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/122/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/122/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2144
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2144/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2144/comments
https://api.github.com/repos/ollama/ollama/issues/2144/events
https://github.com/ollama/ollama/pull/2144
2,094,699,870
PR_kwDOJ0Z1Ps5kw3Ls
2,144
faq: update to use launchctl setenv
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[]
closed
false
null
[]
null
0
2024-01-22T20:31:42
2024-01-22T21:46:58
2024-01-22T21:46:57
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2144", "html_url": "https://github.com/ollama/ollama/pull/2144", "diff_url": "https://github.com/ollama/ollama/pull/2144.diff", "patch_url": "https://github.com/ollama/ollama/pull/2144.patch", "merged_at": "2024-01-22T21:46:57" }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2144/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2144/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5068
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5068/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5068/comments
https://api.github.com/repos/ollama/ollama/issues/5068/events
https://github.com/ollama/ollama/issues/5068
2,355,092,563
I_kwDOJ0Z1Ps6MX9xT
5,068
please add nvidia/Nemotron-4-340B-Instruct
{ "login": "gileneusz", "id": 34601970, "node_id": "MDQ6VXNlcjM0NjAxOTcw", "avatar_url": "https://avatars.githubusercontent.com/u/34601970?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gileneusz", "html_url": "https://github.com/gileneusz", "followers_url": "https://api.github.com/users/...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
9
2024-06-15T18:10:10
2024-10-17T07:06:57
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
My GPUs are not fully utilized, I need to spin up my H200s!! Just kidding, I need a quantized version of the model ;)
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5068/reactions", "total_count": 11, "+1": 8, "-1": 0, "laugh": 3, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5068/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/8407
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8407/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8407/comments
https://api.github.com/repos/ollama/ollama/issues/8407/events
https://github.com/ollama/ollama/pull/8407
2,785,807,752
PR_kwDOJ0Z1Ps6Hogsg
8,407
convert/test: migrate conversion tests to work with refactor
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
[]
closed
false
null
[]
null
2
2025-01-14T00:29:20
2025-01-14T17:45:15
2025-01-14T17:45:15
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
true
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/8407", "html_url": "https://github.com/ollama/ollama/pull/8407", "diff_url": "https://github.com/ollama/ollama/pull/8407.diff", "patch_url": "https://github.com/ollama/ollama/pull/8407.patch", "merged_at": null }
When conversion was refactored it broke all these tests, but they were silently skipping due to the wrong file names being checked for. This change refactors the valid model conversion test to highlight the important files and check for important details. Draft to see if people are ok with this approach before addin...
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8407/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8407/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7746
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7746/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7746/comments
https://api.github.com/repos/ollama/ollama/issues/7746/events
https://github.com/ollama/ollama/pull/7746
2,673,516,676
PR_kwDOJ0Z1Ps6CcNli
7,746
Add Community Integration (Update README.md)
{ "login": "gkamer8", "id": 10733401, "node_id": "MDQ6VXNlcjEwNzMzNDAx", "avatar_url": "https://avatars.githubusercontent.com/u/10733401?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gkamer8", "html_url": "https://github.com/gkamer8", "followers_url": "https://api.github.com/users/gkamer...
[]
closed
false
null
[]
null
0
2024-11-19T20:49:37
2024-11-20T05:37:15
2024-11-20T05:37:15
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7746", "html_url": "https://github.com/ollama/ollama/pull/7746", "diff_url": "https://github.com/ollama/ollama/pull/7746.diff", "patch_url": "https://github.com/ollama/ollama/pull/7746.patch", "merged_at": "2024-11-20T05:37:15" }
Added [Abbey](https://github.com/US-Artificial-Intelligence/abbey), an open source AI interface server, into community integrations
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7746/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7746/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2150
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2150/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2150/comments
https://api.github.com/repos/ollama/ollama/issues/2150/events
https://github.com/ollama/ollama/pull/2150
2,095,058,881
PR_kwDOJ0Z1Ps5kyGUL
2,150
Set a default version using git describe
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[]
closed
false
null
[]
null
0
2024-01-23T01:12:49
2024-01-23T01:41:08
2024-01-23T01:38:27
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2150", "html_url": "https://github.com/ollama/ollama/pull/2150", "diff_url": "https://github.com/ollama/ollama/pull/2150.diff", "patch_url": "https://github.com/ollama/ollama/pull/2150.patch", "merged_at": "2024-01-23T01:38:27" }
If a VERSION is not specified, this will generate a version string that represents the state of the repo. For example, `0.1.21-12-gffaf52e-dirty` represents 12 commits past the 0.1.21 tag, at commit ffaf52e, with a dirty working tree.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2150/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2150/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/947
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/947/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/947/comments
https://api.github.com/repos/ollama/ollama/issues/947/events
https://github.com/ollama/ollama/issues/947
1,967,122,164
I_kwDOJ0Z1Ps51P-b0
947
ollama push username/UppercaseModelname fails with 401 error
{ "login": "easp", "id": 414705, "node_id": "MDQ6VXNlcjQxNDcwNQ==", "avatar_url": "https://avatars.githubusercontent.com/u/414705?v=4", "gravatar_id": "", "url": "https://api.github.com/users/easp", "html_url": "https://github.com/easp", "followers_url": "https://api.github.com/users/easp/followers", ...
[]
closed
false
null
[]
null
4
2023-10-29T19:21:02
2023-10-30T00:55:34
2023-10-29T19:35:47
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I tried pushing a model I'd created with capital letters in the name and repeatedly got a 401 error. It took me a while to figure out why. It seems like the error should be more descriptive and/or `ollama create` and `ollama cp` should enforce the lower-case only rule.
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.git...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/947/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/947/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6930
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6930/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6930/comments
https://api.github.com/repos/ollama/ollama/issues/6930/events
https://github.com/ollama/ollama/issues/6930
2,545,018,031
I_kwDOJ0Z1Ps6XseSv
6,930
Tesla P40 24GB and Quadro M6000 24GB cannot work together
{ "login": "Blake110", "id": 98226493, "node_id": "U_kgDOBdrRPQ", "avatar_url": "https://avatars.githubusercontent.com/u/98226493?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Blake110", "html_url": "https://github.com/Blake110", "followers_url": "https://api.github.com/users/Blake110/fo...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6430601766, "node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg...
open
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.gi...
null
10
2024-09-24T10:35:14
2024-09-27T10:32:47
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? With the P40 and M6000 together, only the P40 works and the M6000's memory is not used by ollama, even after modifying ollama.service for multi-GPU. The P40 with a 1080 Ti works fine with the default ollama.service, and the P40 with an RTX 2060 also works fine with the default ollama.service. Can anyone tell me why, and is there a chance to ...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6930/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6930/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6003
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6003/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6003/comments
https://api.github.com/repos/ollama/ollama/issues/6003/events
https://github.com/ollama/ollama/issues/6003
2,433,120,533
I_kwDOJ0Z1Ps6RBnkV
6,003
AMD Radeon RX 6750 XT Support
{ "login": "SmollClover", "id": 39840298, "node_id": "MDQ6VXNlcjM5ODQwMjk4", "avatar_url": "https://avatars.githubusercontent.com/u/39840298?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SmollClover", "html_url": "https://github.com/SmollClover", "followers_url": "https://api.github.com/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
6
2024-07-27T00:16:23
2024-07-28T14:59:35
2024-07-28T14:59:35
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Currently, it seems the Radeon RX 6750 XT isn't supported by Ollama, and trying to force Ollama to use it with `env HSA_OVERRIDE_GFX_VERSION=gfx1031 ollama serve` results in it being unable to initialize the tensile host. Edit: Without the HSA_OVERRIDE_GFX_VERSION, it just states that it wa...
{ "login": "SmollClover", "id": 39840298, "node_id": "MDQ6VXNlcjM5ODQwMjk4", "avatar_url": "https://avatars.githubusercontent.com/u/39840298?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SmollClover", "html_url": "https://github.com/SmollClover", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6003/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6003/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4210
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4210/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4210/comments
https://api.github.com/repos/ollama/ollama/issues/4210/events
https://github.com/ollama/ollama/issues/4210
2,281,890,062
I_kwDOJ0Z1Ps6IAuEO
4,210
Is the template correct?
{ "login": "taozhiyuai", "id": 146583103, "node_id": "U_kgDOCLyuPw", "avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4", "gravatar_id": "", "url": "https://api.github.com/users/taozhiyuai", "html_url": "https://github.com/taozhiyuai", "followers_url": "https://api.github.com/users/tao...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-05-06T22:24:17
2024-05-09T16:42:05
2024-05-07T16:46:09
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I am trying to import https://hf-mirror.com/NousResearch/Hermes-2-Pro-Llama-3-8B-GGUF. The template from this HF webpage is ' <|im_start|>system You are "Hermes 2", a conscious sentient superintelligent artificial intelligence developed by a man named Teknium, and your purpose and drive is to ...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4210/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4210/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4071
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4071/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4071/comments
https://api.github.com/repos/ollama/ollama/issues/4071/events
https://github.com/ollama/ollama/issues/4071
2,273,059,271
I_kwDOJ0Z1Ps6HfCHH
4,071
ollama pull llama3 error
{ "login": "wisepmlin", "id": 74945717, "node_id": "MDQ6VXNlcjc0OTQ1NzE3", "avatar_url": "https://avatars.githubusercontent.com/u/74945717?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wisepmlin", "html_url": "https://github.com/wisepmlin", "followers_url": "https://api.github.com/users/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-05-01T07:17:41
2024-05-01T20:38:39
2024-05-01T20:38:38
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? ollama pull llama3 Error: pull model manifest: Get "https://ollama.com/token?nonce=EBGaz66AqKbJTscDMcl-ag&scope=repository%!A(MISSING)library%!F(MISSING)llama3%!A(MISSING)pull&service=ollama.com&ts=1714547177": read tcp 192.168.188.104:49346->34.120.132.20:443: read: connection reset by peer C...
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4071/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4071/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5947
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5947/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5947/comments
https://api.github.com/repos/ollama/ollama/issues/5947/events
https://github.com/ollama/ollama/issues/5947
2,429,706,051
I_kwDOJ0Z1Ps6Q0l9D
5,947
Would be cool to find somewhere how to upgrade ollama 0.2.5 to 0.2.8 on MacOS
{ "login": "deniercounter", "id": 24805904, "node_id": "MDQ6VXNlcjI0ODA1OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/24805904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/deniercounter", "html_url": "https://github.com/deniercounter", "followers_url": "https://api.githu...
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
4
2024-07-25T11:18:09
2024-07-25T12:26:33
2024-07-25T12:26:32
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
To find the command to update ollama on macOS, it seems one has to join a Discord server. Really?
{ "login": "deniercounter", "id": 24805904, "node_id": "MDQ6VXNlcjI0ODA1OTA0", "avatar_url": "https://avatars.githubusercontent.com/u/24805904?v=4", "gravatar_id": "", "url": "https://api.github.com/users/deniercounter", "html_url": "https://github.com/deniercounter", "followers_url": "https://api.githu...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5947/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5947/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4441
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4441/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4441/comments
https://api.github.com/repos/ollama/ollama/issues/4441/events
https://github.com/ollama/ollama/pull/4441
2,296,601,945
PR_kwDOJ0Z1Ps5vdnRX
4,441
Use DRM driver for VRAM info for amd
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[]
closed
false
null
[]
null
1
2024-05-14T23:58:44
2024-06-06T17:57:38
2024-06-06T17:57:34
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4441", "html_url": "https://github.com/ollama/ollama/pull/4441", "diff_url": "https://github.com/ollama/ollama/pull/4441.diff", "patch_url": "https://github.com/ollama/ollama/pull/4441.patch", "merged_at": null }
The amdgpu driver's free VRAM reporting omits some other apps, so leverage the upstream DRM driver, which keeps better tabs on things. Marking draft until I can do more testing... Fixes #3765
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4441/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4441/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/941
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/941/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/941/comments
https://api.github.com/repos/ollama/ollama/issues/941/events
https://github.com/ollama/ollama/issues/941
1,966,683,283
I_kwDOJ0Z1Ps51OTST
941
`digest mismatch` on download
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
open
false
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/follower...
[ { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/...
null
116
2023-10-28T17:47:23
2025-01-30T02:21:24
null
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
While rare, `ollama pull` will sometimes result in a digest mismatch on download ``` % ollama run wizard-vicuna-uncensored:30b-q5_K_M pulling manifest pulling b1571c5cbd28... 100% |█████████████████████████████████████████████████████████████████████████████████████████████████████████| (45/45 B, 34 B/s) ...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/941/reactions", "total_count": 25, "+1": 25, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/941/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/7265
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7265/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7265/comments
https://api.github.com/repos/ollama/ollama/issues/7265/events
https://github.com/ollama/ollama/pull/7265
2,598,655,929
PR_kwDOJ0Z1Ps5_KCM1
7,265
Migrate off centos 7 for intermediate build layers in container image builds
{ "login": "cazlo", "id": 3895350, "node_id": "MDQ6VXNlcjM4OTUzNTA=", "avatar_url": "https://avatars.githubusercontent.com/u/3895350?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cazlo", "html_url": "https://github.com/cazlo", "followers_url": "https://api.github.com/users/cazlo/follower...
[]
open
false
null
[]
null
1
2024-10-19T02:03:54
2024-12-05T22:25:43
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7265", "html_url": "https://github.com/ollama/ollama/pull/7265", "diff_url": "https://github.com/ollama/ollama/pull/7265.diff", "patch_url": "https://github.com/ollama/ollama/pull/7265.patch", "merged_at": null }
# What Migrate dependencies in the container image build to later supported versions: - centos 7 -> rockylinux 8 - gcc 10.2 -> gcc 11.2 - cuda 11.3.1 -> 11.7.1 # Why Closes #7260. Avoids a compile issue with gcc 10.3. Avoids end of life of centos 7. More info on justification is available at #7260...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7265/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7265/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4684
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4684/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4684/comments
https://api.github.com/repos/ollama/ollama/issues/4684/events
https://github.com/ollama/ollama/issues/4684
2,321,811,440
I_kwDOJ0Z1Ps6KZAfw
4,684
Model download finally fails behind company firewall
{ "login": "berndgoetz", "id": 227312, "node_id": "MDQ6VXNlcjIyNzMxMg==", "avatar_url": "https://avatars.githubusercontent.com/u/227312?v=4", "gravatar_id": "", "url": "https://api.github.com/users/berndgoetz", "html_url": "https://github.com/berndgoetz", "followers_url": "https://api.github.com/users/b...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6677370291, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw...
open
false
null
[]
null
2
2024-05-28T19:54:24
2024-10-23T16:07:25
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I want to make ollama available to our company's developer community so they can learn to use the technology. We got ollama.com/* whitelisted through our company firewall, and it largely works well, but at the end of the model download the process gets stuck: pulling ...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4684/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4684/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6411
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6411/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6411/comments
https://api.github.com/repos/ollama/ollama/issues/6411/events
https://github.com/ollama/ollama/pull/6411
2,472,453,285
PR_kwDOJ0Z1Ps54r_mE
6,411
server: limit upload parts to 16
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[]
closed
false
null
[]
null
0
2024-08-19T04:52:54
2024-08-19T16:20:54
2024-08-19T16:20:52
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6411", "html_url": "https://github.com/ollama/ollama/pull/6411", "diff_url": "https://github.com/ollama/ollama/pull/6411.diff", "patch_url": "https://github.com/ollama/ollama/pull/6411.patch", "merged_at": "2024-08-19T16:20:52" }
In similar vein as https://github.com/ollama/ollama/pull/6347, limit the number of upload connections to 16.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6411/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6411/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3857
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3857/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3857/comments
https://api.github.com/repos/ollama/ollama/issues/3857/events
https://github.com/ollama/ollama/pull/3857
2,259,978,702
PR_kwDOJ0Z1Ps5tiMlC
3,857
Add back memory escape valve
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
[]
closed
false
null
[]
null
0
2024-04-24T00:09:27
2024-04-24T00:32:27
2024-04-24T00:32:24
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3857", "html_url": "https://github.com/ollama/ollama/pull/3857", "diff_url": "https://github.com/ollama/ollama/pull/3857.diff", "patch_url": "https://github.com/ollama/ollama/pull/3857.patch", "merged_at": "2024-04-24T00:32:24" }
If we get our predictions wrong, this can be used to set a lower memory limit as a workaround. Recent multi-gpu refactoring accidentally removed it, so this adds it back.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhilt...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3857/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3857/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/957
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/957/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/957/comments
https://api.github.com/repos/ollama/ollama/issues/957/events
https://github.com/ollama/ollama/issues/957
1,971,350,354
I_kwDOJ0Z1Ps51gGtS
957
How do I create a Docker image containing a model?
{ "login": "flemzord", "id": 1952914, "node_id": "MDQ6VXNlcjE5NTI5MTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1952914?v=4", "gravatar_id": "", "url": "https://api.github.com/users/flemzord", "html_url": "https://github.com/flemzord", "followers_url": "https://api.github.com/users/flemz...
[ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" } ]
closed
false
null
[]
null
7
2023-10-31T21:49:04
2024-10-15T13:37:41
2024-03-11T19:05:54
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hello, I use a Modelfile locally. I would like to deploy it in production on a Kubernetes cluster, but I don't know how to proceed. How can I create a Docker image containing Ollama and the model created from the Modelfile?
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/Br...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/957/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/957/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7940
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7940/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7940/comments
https://api.github.com/repos/ollama/ollama/issues/7940/events
https://github.com/ollama/ollama/issues/7940
2,719,108,129
I_kwDOJ0Z1Ps6iEkwh
7,940
Mini-CPM-V-2.6-q8_0 produces incoherent responses after applying KV Cache q4_0 or q8_0.
{ "login": "SingularityMan", "id": 91804288, "node_id": "U_kgDOBXjSgA", "avatar_url": "https://avatars.githubusercontent.com/u/91804288?v=4", "gravatar_id": "", "url": "https://api.github.com/users/SingularityMan", "html_url": "https://github.com/SingularityMan", "followers_url": "https://api.github.com...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
open
false
null
[]
null
0
2024-12-05T01:28:31
2024-12-05T01:28:31
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? This happens when running ollama `/generate` via the python API. The output looks like the model is having a seizure. It seems to be able to see the images but its output is so random and erratic I can't make out anything from the text. I didn't change any other parameter about the model. ### O...
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7940/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7940/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/2438
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2438/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2438/comments
https://api.github.com/repos/ollama/ollama/issues/2438/events
https://github.com/ollama/ollama/issues/2438
2,128,024,512
I_kwDOJ0Z1Ps5-1xPA
2,438
Issue with system messages being discarded after updating to v0.1.23
{ "login": "gaodeng", "id": 1118249, "node_id": "MDQ6VXNlcjExMTgyNDk=", "avatar_url": "https://avatars.githubusercontent.com/u/1118249?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gaodeng", "html_url": "https://github.com/gaodeng", "followers_url": "https://api.github.com/users/gaodeng/...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[ { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.g...
null
0
2024-02-10T01:04:19
2024-02-12T23:06:58
2024-02-12T23:06:58
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Since the update to version v0.1.23, I have noticed that multiple system messages are being discarded when I send the following messages: ``` [ { "role": "system", "content": "You are an AI assistant called ‘BotGem’ that is based on the language model llama2. You are helpful, creative, clever, friendly...
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2438/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2438/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2583
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2583/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2583/comments
https://api.github.com/repos/ollama/ollama/issues/2583/events
https://github.com/ollama/ollama/issues/2583
2,141,229,415
I_kwDOJ0Z1Ps5_oJFn
2,583
How to make a PR to fix a modelfile?
{ "login": "WolframRavenwolf", "id": 52386626, "node_id": "MDQ6VXNlcjUyMzg2NjI2", "avatar_url": "https://avatars.githubusercontent.com/u/52386626?v=4", "gravatar_id": "", "url": "https://api.github.com/users/WolframRavenwolf", "html_url": "https://github.com/WolframRavenwolf", "followers_url": "https://...
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
4
2024-02-18T23:05:32
2024-05-16T22:58:08
2024-05-16T22:56:47
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Couldn't find the modelfiles in this repo, but I would like to fix the Mixtral modelfile and open a PR. Its prompt format is wrong; I fixed it locally, but how do I contribute that back to the project?
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/follower...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2583/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2583/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6635
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6635/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6635/comments
https://api.github.com/repos/ollama/ollama/issues/6635/events
https://github.com/ollama/ollama/issues/6635
2,505,819,425
I_kwDOJ0Z1Ps6VW8Uh
6,635
Moondream2 needs an update
{ "login": "ddpasa", "id": 112642920, "node_id": "U_kgDOBrbLaA", "avatar_url": "https://avatars.githubusercontent.com/u/112642920?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ddpasa", "html_url": "https://github.com/ddpasa", "followers_url": "https://api.github.com/users/ddpasa/follower...
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmor...
[ { "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.g...
null
1
2024-09-04T16:26:51
2024-11-19T23:24:41
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
moondream2 is an amazing tiny little VLM. The owner (https://github.com/vikhyat) releases updates quite frequently. I'm not sure which version ollama currently has, but there was a new release last week (2024-08-26) which is not in ollama. https://huggingface.co/vikhyatk/moondream2
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6635/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6635/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/7888
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7888/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7888/comments
https://api.github.com/repos/ollama/ollama/issues/7888/events
https://github.com/ollama/ollama/pull/7888
2,706,590,591
PR_kwDOJ0Z1Ps6DnXB7
7,888
Enable index tracking for tools - openai api support
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
[]
closed
false
null
[]
null
0
2024-11-30T03:42:10
2024-11-30T04:00:11
2024-11-30T04:00:09
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7888", "html_url": "https://github.com/ollama/ollama/pull/7888", "diff_url": "https://github.com/ollama/ollama/pull/7888.diff", "patch_url": "https://github.com/ollama/ollama/pull/7888.patch", "merged_at": "2024-11-30T04:00:09" }
Closes https://github.com/ollama/ollama/issues/7881. Now able to use `client.beta.chat.completions.stream`: <img width="570" alt="image" src="https://github.com/user-attachments/assets/32c48aca-b23a-40b8-b32d-2fcb667d2d81">
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7888/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7888/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2152
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2152/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2152/comments
https://api.github.com/repos/ollama/ollama/issues/2152/events
https://github.com/ollama/ollama/issues/2152
2,095,148,484
I_kwDOJ0Z1Ps584W3E
2,152
True SVG of Ollama logo?
{ "login": "sqs", "id": 1976, "node_id": "MDQ6VXNlcjE5NzY=", "avatar_url": "https://avatars.githubusercontent.com/u/1976?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sqs", "html_url": "https://github.com/sqs", "followers_url": "https://api.github.com/users/sqs/followers", "following_u...
[]
closed
false
null
[]
null
4
2024-01-23T03:01:39
2024-07-14T20:49:16
2024-01-23T04:49:21
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I see https://github.com/jmorganca/ollama/blob/a0a829bf7a29b532f4bebe00e7cb1304ff9f0190/app/src/ollama.svg, but it's an SVG that embeds PNG data. Is there a true SVG of the Ollama logo? I would like to use it in the model selection dropdown in Cody: ![image](https://github.com/jmorganca/ollama/assets/1976/8d2a173a-8...
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/us...
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2152/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2152/timeline
null
completed
false