url stringlengths 51-54 | repository_url stringclasses 1 value | labels_url stringlengths 65-68 | comments_url stringlengths 60-63 | events_url stringlengths 58-61 | html_url stringlengths 39-44 | id int64 1.78B-2.82B | node_id stringlengths 18-19 | number int64 1-8.69k | title stringlengths 1-382 | user dict | labels listlengths 0-5 | state stringclasses 2 values | locked bool 1 class | assignee dict | assignees listlengths 0-2 | milestone null | comments int64 0-323 | created_at timestamp[s] | updated_at timestamp[s] | closed_at timestamp[s] | author_association stringclasses 4 values | sub_issues_summary dict | active_lock_reason null | draft bool 2 classes | pull_request dict | body stringlengths 2-118k ⌀ | closed_by dict | reactions dict | timeline_url stringlengths 60-63 | performed_via_github_app null | state_reason stringclasses 4 values | is_pull_request bool 2 classes |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/4508 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4508/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4508/comments | https://api.github.com/repos/ollama/ollama/issues/4508/events | https://github.com/ollama/ollama/pull/4508 | 2,303,689,972 | PR_kwDOJ0Z1Ps5v18ny | 4,508 | add OLLAMA_NOHISTORY to turn off history in interactive mode | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | [] | closed | false | null | [] | null | 4 | 2024-05-17T23:38:16 | 2025-01-18T05:03:26 | 2024-05-18T18:51:57 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/4508",
"html_url": "https://github.com/ollama/ollama/pull/4508",
"diff_url": "https://github.com/ollama/ollama/pull/4508.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4508.patch",
"merged_at": "2024-05-18T18:51:57"
} | fixes #3002 | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4508/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4508/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/5298 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5298/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5298/comments | https://api.github.com/repos/ollama/ollama/issues/5298/events | https://github.com/ollama/ollama/issues/5298 | 2,375,365,380 | I_kwDOJ0Z1Ps6NlTME | 5,298 | Internal error at url manifests/sha256: | {
"login": "alexeu1994",
"id": 20879475,
"node_id": "MDQ6VXNlcjIwODc5NDc1",
"avatar_url": "https://avatars.githubusercontent.com/u/20879475?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alexeu1994",
"html_url": "https://github.com/alexeu1994",
"followers_url": "https://api.github.com/use... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | open | false | null | [] | null | 12 | 2024-06-26T13:31:28 | 2025-01-09T17:55:25 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
A Sonatype Nexus proxy was configured and working, but two weeks ago it started returning an error when requesting a manifest.
```
2024-06-24 22:19:00,375+0300 DEBUG [nexus-httpclient-eviction-thread] *SYSTEM org.apache.http.impl.conn.CPool - Connection [id:3377][route:{s}->[https://registry.ol... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5298/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5298/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/2025 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2025/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2025/comments | https://api.github.com/repos/ollama/ollama/issues/2025/events | https://github.com/ollama/ollama/issues/2025 | 2,084,978,849 | I_kwDOJ0Z1Ps58RkCh | 2,025 | model stable-code is not stable | {
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/ipla... | [] | closed | false | null | [] | null | 4 | 2024-01-16T21:29:11 | 2024-03-11T18:36:43 | 2024-03-11T18:36:43 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | what languages do you know results in an endless display of ```
. This particular event has actually just been added to our entire project code base here above, which means that a new unique identifier for this particular event has also been generated automatically by my very special personal computer system right now ... | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2025/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2025/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8569 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8569/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8569/comments | https://api.github.com/repos/ollama/ollama/issues/8569/events | https://github.com/ollama/ollama/issues/8569 | 2,810,087,681 | I_kwDOJ0Z1Ps6nfokB | 8,569 | Linux: Compiling Ollama with AVX-512 and CUDA support | {
"login": "graynoir",
"id": 184021645,
"node_id": "U_kgDOCvfyjQ",
"avatar_url": "https://avatars.githubusercontent.com/u/184021645?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/graynoir",
"html_url": "https://github.com/graynoir",
"followers_url": "https://api.github.com/users/graynoir/... | [] | open | false | null | [] | null | 11 | 2025-01-24T18:13:32 | 2025-01-26T22:06:04 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Hi, I've been trying to compile Ollama with AVX-512 and CUDA support on Linux (Manjaro); however, despite multiple attempts with different custom CPU flags, I can't get it to work. Instead, `ollama serve` falls back to `level=INFO source=routes.go:1267 msg="Dynamic LLM libraries" runners=[cpu]`, when the binar... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8569/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8569/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/3402 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3402/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3402/comments | https://api.github.com/repos/ollama/ollama/issues/3402/events | https://github.com/ollama/ollama/issues/3402 | 2,214,509,688 | I_kwDOJ0Z1Ps6D_rx4 | 3,402 | "ollama run llama2" Fails with Connection Error and Runtime Panic on Windows 11 | {
"login": "cvecve147",
"id": 12343899,
"node_id": "MDQ6VXNlcjEyMzQzODk5",
"avatar_url": "https://avatars.githubusercontent.com/u/12343899?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cvecve147",
"html_url": "https://github.com/cvecve147",
"followers_url": "https://api.github.com/users/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 0 | 2024-03-29T02:44:57 | 2024-04-08T05:33:10 | 2024-04-08T05:33:10 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
When attempting to run the command **ollama run llama2**, I encountered a connection error followed by a runtime panic. Initially, the process attempts to pull a manifest but fails with a connection error indicating that the connection to **127.0.0.1:11434** was refused. Subsequently, inspecti... | {
"login": "cvecve147",
"id": 12343899,
"node_id": "MDQ6VXNlcjEyMzQzODk5",
"avatar_url": "https://avatars.githubusercontent.com/u/12343899?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cvecve147",
"html_url": "https://github.com/cvecve147",
"followers_url": "https://api.github.com/users/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3402/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3402/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8655 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8655/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8655/comments | https://api.github.com/repos/ollama/ollama/issues/8655/events | https://github.com/ollama/ollama/issues/8655 | 2,818,034,521 | I_kwDOJ0Z1Ps6n98tZ | 8,655 | GPU process at 1-3% when running Deepseek R1 32b | {
"login": "BananasMan",
"id": 112043755,
"node_id": "U_kgDOBq2m6w",
"avatar_url": "https://avatars.githubusercontent.com/u/112043755?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BananasMan",
"html_url": "https://github.com/BananasMan",
"followers_url": "https://api.github.com/users/Ban... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | open | false | null | [] | null | 3 | 2025-01-29T12:09:24 | 2025-01-30T08:53:18 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
I'm trying to run DeepSeek R1 32b locally. It runs, but the GPU is barely used.
When it processes a simple task like multiplying numbers, I saw in Task Manager that the GPU is used at only 1-3%, while the CPU is at 70%.
I should add, though, that both RAM and VRAM are well utilized and both are almost full.
... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8655/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8655/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/4680 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4680/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4680/comments | https://api.github.com/repos/ollama/ollama/issues/4680/events | https://github.com/ollama/ollama/issues/4680 | 2,321,201,470 | I_kwDOJ0Z1Ps6KWrk- | 4,680 | Json Mode significantly decrease GPU usage | {
"login": "LaetLanf",
"id": 131473617,
"node_id": "U_kgDOB9Yg0Q",
"avatar_url": "https://avatars.githubusercontent.com/u/131473617?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LaetLanf",
"html_url": "https://github.com/LaetLanf",
"followers_url": "https://api.github.com/users/LaetLanf/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 1 | 2024-05-28T14:17:19 | 2024-05-28T20:41:50 | 2024-05-28T20:41:50 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
I am running Ollama Llama3:70b-instruct on an Azure Linux A100 VM.
I did a test with and without json mode, with the exact same prompt and python code. The only thing I changed is format='json' in the chat call.
WITHOUT json mode, I reached:
22-25 TPS for 1 chat call
The monitoring of th... | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4680/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4680/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/3827 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3827/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3827/comments | https://api.github.com/repos/ollama/ollama/issues/3827/events | https://github.com/ollama/ollama/issues/3827 | 2,256,950,300 | I_kwDOJ0Z1Ps6GhlQc | 3,827 | Enable CORS for "app://obsidian.md" | {
"login": "pegasusthemis",
"id": 167796164,
"node_id": "U_kgDOCgBdxA",
"avatar_url": "https://avatars.githubusercontent.com/u/167796164?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pegasusthemis",
"html_url": "https://github.com/pegasusthemis",
"followers_url": "https://api.github.com/... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | closed | false | null | [] | null | 7 | 2024-04-22T16:13:48 | 2024-08-23T08:43:31 | 2024-04-23T22:56:16 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Hello,
I am a developer creating plugins for Obsidian, a popular knowledge management and note-taking software. I believe that enabling CORS for `app://obsidian.md` would significantly enhance the functionality and integration possibilities of Obsidian plugins with Ollama models.
Obsidian uses a custom protocol `ap... | {
"login": "pegasusthemis",
"id": 167796164,
"node_id": "U_kgDOCgBdxA",
"avatar_url": "https://avatars.githubusercontent.com/u/167796164?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pegasusthemis",
"html_url": "https://github.com/pegasusthemis",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3827/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3827/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2040 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2040/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2040/comments | https://api.github.com/repos/ollama/ollama/issues/2040/events | https://github.com/ollama/ollama/pull/2040 | 2,087,387,204 | PR_kwDOJ0Z1Ps5kYHGf | 2,040 | Add cuda to CI build | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [] | closed | false | null | [] | null | 2 | 2024-01-18T03:05:47 | 2024-01-27T15:14:58 | 2024-01-27T15:14:55 | COLLABORATOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | true | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/2040",
"html_url": "https://github.com/ollama/ollama/pull/2040",
"diff_url": "https://github.com/ollama/ollama/pull/2040.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2040.patch",
"merged_at": null
} | null | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2040/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2040/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/5730 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5730/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5730/comments | https://api.github.com/repos/ollama/ollama/issues/5730/events | https://github.com/ollama/ollama/pull/5730 | 2,412,022,138 | PR_kwDOJ0Z1Ps51j27m | 5,730 | remove unneeded tool calls | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | [] | closed | false | null | [] | null | 0 | 2024-07-16T20:50:20 | 2024-07-16T21:42:14 | 2024-07-16T21:42:13 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/5730",
"html_url": "https://github.com/ollama/ollama/pull/5730",
"diff_url": "https://github.com/ollama/ollama/pull/5730.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5730.patch",
"merged_at": "2024-07-16T21:42:13"
} | ID and Type are currently unused so leave them out for now | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5730/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5730/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/1035 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1035/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1035/comments | https://api.github.com/repos/ollama/ollama/issues/1035/events | https://github.com/ollama/ollama/pull/1035 | 1,982,028,245 | PR_kwDOJ0Z1Ps5e17Rq | 1,035 | add a complete /generate options example | {
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/Br... | [] | closed | false | null | [] | null | 0 | 2023-11-07T19:01:44 | 2023-11-09T00:44:37 | 2023-11-09T00:44:37 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/1035",
"html_url": "https://github.com/ollama/ollama/pull/1035",
"diff_url": "https://github.com/ollama/ollama/pull/1035.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1035.patch",
"merged_at": "2023-11-09T00:44:37"
} | - Add an example to the api docs that shows how all generate runtime options can be specified
- Move the `GenerateRequest` options closer to the struct declaration so it's easier for readers to find
resolves #1027 | {
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/Br... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1035/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1035/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/6583 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6583/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6583/comments | https://api.github.com/repos/ollama/ollama/issues/6583/events | https://github.com/ollama/ollama/pull/6583 | 2,499,201,738 | PR_kwDOJ0Z1Ps56EiQV | 6,583 | Update README.md | {
"login": "jonathanhecl",
"id": 1691623,
"node_id": "MDQ6VXNlcjE2OTE2MjM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1691623?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jonathanhecl",
"html_url": "https://github.com/jonathanhecl",
"followers_url": "https://api.github.com... | [] | closed | false | null | [] | null | 0 | 2024-09-01T03:53:54 | 2024-09-02T19:34:26 | 2024-09-02T19:34:26 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/6583",
"html_url": "https://github.com/ollama/ollama/pull/6583",
"diff_url": "https://github.com/ollama/ollama/pull/6583.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6583.patch",
"merged_at": "2024-09-02T19:34:26"
} | New links:
Go-CREW and Ollamaclient for Golang | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6583/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6583/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/3758 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3758/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3758/comments | https://api.github.com/repos/ollama/ollama/issues/3758/events | https://github.com/ollama/ollama/issues/3758 | 2,253,454,463 | I_kwDOJ0Z1Ps6GUPx_ | 3,758 | Ollama backend down? | {
"login": "piratos",
"id": 8265152,
"node_id": "MDQ6VXNlcjgyNjUxNTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/8265152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/piratos",
"html_url": "https://github.com/piratos",
"followers_url": "https://api.github.com/users/piratos/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 6 | 2024-04-19T16:52:04 | 2024-04-19T17:00:03 | 2024-04-19T17:00:03 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
ollama pull returns
```
no healthy upstream
```
Llama 3 release traffic killed your backend? :smile:
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.31 | {
"login": "piratos",
"id": 8265152,
"node_id": "MDQ6VXNlcjgyNjUxNTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/8265152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/piratos",
"html_url": "https://github.com/piratos",
"followers_url": "https://api.github.com/users/piratos/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3758/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3758/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/7332 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7332/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7332/comments | https://api.github.com/repos/ollama/ollama/issues/7332/events | https://github.com/ollama/ollama/issues/7332 | 2,608,448,576 | I_kwDOJ0Z1Ps6becRA | 7,332 | Support installations in non-systemd distros | {
"login": "Sachin-Bhat",
"id": 25080916,
"node_id": "MDQ6VXNlcjI1MDgwOTE2",
"avatar_url": "https://avatars.githubusercontent.com/u/25080916?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sachin-Bhat",
"html_url": "https://github.com/Sachin-Bhat",
"followers_url": "https://api.github.com/... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | closed | false | null | [] | null | 2 | 2024-10-23T12:35:05 | 2024-11-21T18:58:57 | 2024-11-21T18:58:56 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Hello folks,
Kudos on this project! I wanted to install this on my system, which is Artix Linux running the runit init system. I was wondering if support could be added for this.
Appreciate the assistance in advance.
Cheers,
Sachin | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7332/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7332/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/6237 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6237/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6237/comments | https://api.github.com/repos/ollama/ollama/issues/6237/events | https://github.com/ollama/ollama/issues/6237 | 2,453,913,814 | I_kwDOJ0Z1Ps6SQ8DW | 6,237 | Ollama Product Stance on Grammar Feature / Outstanding PRs | {
"login": "Kinglord",
"id": 597488,
"node_id": "MDQ6VXNlcjU5NzQ4OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/597488?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kinglord",
"html_url": "https://github.com/Kinglord",
"followers_url": "https://api.github.com/users/Kinglor... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | closed | false | null | [] | null | 18 | 2024-08-07T16:47:55 | 2024-12-05T00:52:24 | 2024-12-05T00:31:49 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Hello,
This isn't a feature request, but it's the best category I could pick. This is really a question around merging PRs for exposing an existing feature to users of Ollama that are being ignored or declined without good context. I'm asking this to get more public visibility from the Ollama team on grammar feature... | {
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6237/reactions",
"total_count": 62,
"+1": 29,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 25,
"rocket": 0,
"eyes": 8
} | https://api.github.com/repos/ollama/ollama/issues/6237/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/625 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/625/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/625/comments | https://api.github.com/repos/ollama/ollama/issues/625/events | https://github.com/ollama/ollama/issues/625 | 1,916,332,688 | I_kwDOJ0Z1Ps5yOOqQ | 625 | `ollama cp` followed by `ollama push` requires re-pushing layers | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 2 | 2023-09-27T21:12:33 | 2024-01-16T22:13:55 | 2024-01-16T22:13:55 | MEMBER | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | To reproduce:
```
ollama cp llama2 <username>/llama2
ollama push <username>/llama2
``` | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/625/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/625/timeline | null | not_planned | false |
https://api.github.com/repos/ollama/ollama/issues/5075 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5075/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5075/comments | https://api.github.com/repos/ollama/ollama/issues/5075/events | https://github.com/ollama/ollama/pull/5075 | 2,355,375,310 | PR_kwDOJ0Z1Ps5ylt6D | 5,075 | docs: add missing powershell package to windows development instructions | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | [] | closed | false | null | [] | null | 0 | 2024-06-16T01:52:50 | 2024-06-16T03:08:10 | 2024-06-16T03:08:09 | MEMBER | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/5075",
"html_url": "https://github.com/ollama/ollama/pull/5075",
"diff_url": "https://github.com/ollama/ollama/pull/5075.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5075.patch",
"merged_at": "2024-06-16T03:08:09"
} | The powershell script for building Ollama on Windows requires the `ThreadJob` module. Add this to the instructions and dependency list. | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5075/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5075/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/2217 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2217/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2217/comments | https://api.github.com/repos/ollama/ollama/issues/2217/events | https://github.com/ollama/ollama/issues/2217 | 2,102,921,772 | I_kwDOJ0Z1Ps59WAos | 2,217 | Message vs Template vs System | {
"login": "giannisak",
"id": 154079765,
"node_id": "U_kgDOCS8SFQ",
"avatar_url": "https://avatars.githubusercontent.com/u/154079765?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/giannisak",
"html_url": "https://github.com/giannisak",
"followers_url": "https://api.github.com/users/gianni... | [] | closed | false | null | [] | null | 2 | 2024-01-26T21:22:30 | 2024-01-27T00:57:37 | 2024-01-27T00:57:37 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | What is the difference between message, template and system if I want to do few-shot prompting?
I mean, I could pass the example from release (v0.1.21) to a model in three different ways:
1) Few-shot using Message:
SYSTEM You are a friendly assistant that only answers with 'yes' or 'no'
MESSAGE user Is Toronto in Ca... | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2217/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2217/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2956 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2956/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2956/comments | https://api.github.com/repos/ollama/ollama/issues/2956/events | https://github.com/ollama/ollama/issues/2956 | 2,172,035,695 | I_kwDOJ0Z1Ps6BdqJv | 2,956 | feat: Add an "official" indicator in the library | {
"login": "jimscard",
"id": 26580570,
"node_id": "MDQ6VXNlcjI2NTgwNTcw",
"avatar_url": "https://avatars.githubusercontent.com/u/26580570?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jimscard",
"html_url": "https://github.com/jimscard",
"followers_url": "https://api.github.com/users/jim... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6573197867,
"node_id": ... | open | false | {
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/Br... | [
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api... | null | 0 | 2024-03-06T17:12:35 | 2024-03-12T20:53:36 | null | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | In the model library on ollama.com, there is no indication whether a model is "official", e.g., provided by the ollama team, or uploaded by a user, other than the username prefix.
This can cause confusion for other users searching for a model by name.
Recommend implementing something similar to Docker Hub to indicate... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2956/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2956/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/1923 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1923/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1923/comments | https://api.github.com/repos/ollama/ollama/issues/1923/events | https://github.com/ollama/ollama/issues/1923 | 2,076,206,237 | I_kwDOJ0Z1Ps57wGSd | 1,923 | choosing the right model to interact | {
"login": "umtksa",
"id": 12473742,
"node_id": "MDQ6VXNlcjEyNDczNzQy",
"avatar_url": "https://avatars.githubusercontent.com/u/12473742?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/umtksa",
"html_url": "https://github.com/umtksa",
"followers_url": "https://api.github.com/users/umtksa/fo... | [
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] | closed | false | null | [] | null | 1 | 2024-01-11T10:04:42 | 2024-03-11T19:32:12 | 2024-03-11T19:32:12 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I can use a custom mistral modelfile to choose which model is the best choice based on the subject.
like in my model file "choose" all models have descriptions like copywriter model or weather model
based on the subject mistral can choose the best model and gives me the command to run
so I can run it through the mo... | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1923/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/ollama/ollama/issues/1923/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/4118 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4118/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4118/comments | https://api.github.com/repos/ollama/ollama/issues/4118/events | https://github.com/ollama/ollama/pull/4118 | 2,276,937,764 | PR_kwDOJ0Z1Ps5ubzx7 | 4,118 | Add ChatGPTBox and RWKV-Runner to community integrations | {
"login": "josStorer",
"id": 13366013,
"node_id": "MDQ6VXNlcjEzMzY2MDEz",
"avatar_url": "https://avatars.githubusercontent.com/u/13366013?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/josStorer",
"html_url": "https://github.com/josStorer",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | null | 3 | 2024-05-03T05:20:34 | 2024-11-23T21:31:27 | 2024-11-23T21:31:27 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/4118",
"html_url": "https://github.com/ollama/ollama/pull/4118",
"diff_url": "https://github.com/ollama/ollama/pull/4118.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4118.patch",
"merged_at": "2024-11-23T21:31:27"
} | Integration tutorials:
ChatGPTBox: https://github.com/josStorer/chatGPTBox/issues/616#issuecomment-1975186467
RWKV-Runner:

| {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4118/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4118/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/5877 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5877/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5877/comments | https://api.github.com/repos/ollama/ollama/issues/5877/events | https://github.com/ollama/ollama/issues/5877 | 2,425,363,032 | I_kwDOJ0Z1Ps6QkBpY | 5,877 | Ollama API not seeing messages provided in conversation_history | {
"login": "barclaybrown",
"id": 36378453,
"node_id": "MDQ6VXNlcjM2Mzc4NDUz",
"avatar_url": "https://avatars.githubusercontent.com/u/36378453?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/barclaybrown",
"html_url": "https://github.com/barclaybrown",
"followers_url": "https://api.github.c... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 4 | 2024-07-23T14:28:07 | 2024-09-12T22:02:25 | 2024-09-12T22:02:25 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
When I pass a list of dictionaries (messages) to `ollama.chat`, it seems that the model does not see anything other than the latest message. For example, I want the model to get a bunch of text, and then answer a question about it. I send something like:
role : system content: You are a helpf... | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5877/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5877/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/936 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/936/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/936/comments | https://api.github.com/repos/ollama/ollama/issues/936/events | https://github.com/ollama/ollama/pull/936 | 1,966,094,839 | PR_kwDOJ0Z1Ps5eAOhe | 936 | I've added the sample with Gradio and the scan of a folder | {
"login": "suoko",
"id": 3659980,
"node_id": "MDQ6VXNlcjM2NTk5ODA=",
"avatar_url": "https://avatars.githubusercontent.com/u/3659980?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/suoko",
"html_url": "https://github.com/suoko",
"followers_url": "https://api.github.com/users/suoko/follower... | [] | closed | false | null | [] | null | 1 | 2023-10-27T19:43:10 | 2023-10-30T21:55:35 | 2023-10-30T21:55:34 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/936",
"html_url": "https://github.com/ollama/ollama/pull/936",
"diff_url": "https://github.com/ollama/ollama/pull/936.diff",
"patch_url": "https://github.com/ollama/ollama/pull/936.patch",
"merged_at": null
} | The only value to change is `chunk_size`, which varies according to the documents to scan. | {
"login": "suoko",
"id": 3659980,
"node_id": "MDQ6VXNlcjM2NTk5ODA=",
"avatar_url": "https://avatars.githubusercontent.com/u/3659980?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/suoko",
"html_url": "https://github.com/suoko",
"followers_url": "https://api.github.com/users/suoko/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/936/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/936/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/6524 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6524/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6524/comments | https://api.github.com/repos/ollama/ollama/issues/6524/events | https://github.com/ollama/ollama/pull/6524 | 2,488,115,928 | PR_kwDOJ0Z1Ps55gdUY | 6,524 | server: clean up route names for consistency | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | [] | closed | false | null | [] | null | 0 | 2024-08-27T02:16:33 | 2024-08-27T02:36:13 | 2024-08-27T02:36:12 | MEMBER | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/6524",
"html_url": "https://github.com/ollama/ollama/pull/6524",
"diff_url": "https://github.com/ollama/ollama/pull/6524.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6524.patch",
"merged_at": "2024-08-27T02:36:12"
} | null | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6524/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6524/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/5672 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5672/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5672/comments | https://api.github.com/repos/ollama/ollama/issues/5672/events | https://github.com/ollama/ollama/issues/5672 | 2,406,879,451 | I_kwDOJ0Z1Ps6PdhDb | 5,672 | ollama._types.ResponseError | {
"login": "Lena-Van",
"id": 149133903,
"node_id": "U_kgDOCOOaTw",
"avatar_url": "https://avatars.githubusercontent.com/u/149133903?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Lena-Van",
"html_url": "https://github.com/Lena-Van",
"followers_url": "https://api.github.com/users/Lena-Van/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 2 | 2024-07-13T12:24:40 | 2024-07-14T13:27:29 | 2024-07-14T13:27:28 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
I set the llama3_70b_ollama_model_configuration = {
"config_name": "ollama_llama3_70b",
"model_type": "ollama_chat",
"model_name": "example",
"options": {
"temperature": 0.5,
"seed": 123
},
"keep_alive": "5m"
}
the "example" model was downloaded from t... | {
"login": "Lena-Van",
"id": 149133903,
"node_id": "U_kgDOCOOaTw",
"avatar_url": "https://avatars.githubusercontent.com/u/149133903?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Lena-Van",
"html_url": "https://github.com/Lena-Van",
"followers_url": "https://api.github.com/users/Lena-Van/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5672/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5672/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/7616 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7616/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7616/comments | https://api.github.com/repos/ollama/ollama/issues/7616/events | https://github.com/ollama/ollama/issues/7616 | 2,648,522,293 | I_kwDOJ0Z1Ps6d3T41 | 7,616 | Please add microsoft/OmniParser model | {
"login": "craftslab",
"id": 49358172,
"node_id": "MDQ6VXNlcjQ5MzU4MTcy",
"avatar_url": "https://avatars.githubusercontent.com/u/49358172?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/craftslab",
"html_url": "https://github.com/craftslab",
"followers_url": "https://api.github.com/users/... | [
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] | open | false | null | [] | null | 0 | 2024-11-11T08:10:34 | 2024-11-11T08:10:34 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | OmniParser is a general screen parsing tool, which interprets/converts UI screenshot to structured format, to improve existing LLM based UI agent. Training Datasets include: 1) an interactable icon detection dataset, which was curated from popular web pages and automatically annotated to highlight clickable and actiona... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7616/reactions",
"total_count": 12,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 12,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7616/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/7123 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7123/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7123/comments | https://api.github.com/repos/ollama/ollama/issues/7123/events | https://github.com/ollama/ollama/issues/7123 | 2,571,670,286 | I_kwDOJ0Z1Ps6ZSJMO | 7,123 | Long responses can corrupt the model until unloaded | {
"login": "ragibson",
"id": 14023456,
"node_id": "MDQ6VXNlcjE0MDIzNDU2",
"avatar_url": "https://avatars.githubusercontent.com/u/14023456?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ragibson",
"html_url": "https://github.com/ragibson",
"followers_url": "https://api.github.com/users/rag... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q... | open | false | null | [] | null | 5 | 2024-10-07T22:51:24 | 2024-11-06T00:20:07 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
In a relatively simple prompt, one of the Phi models went off track and ranted for several thousand words. Afterward, all future responses produced (mostly) garbage output, even in separate API calls or interactive sessions with cleared session context. This persisted until the model was completely ... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7123/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7123/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/7838 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7838/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7838/comments | https://api.github.com/repos/ollama/ollama/issues/7838/events | https://github.com/ollama/ollama/issues/7838 | 2,693,402,758 | I_kwDOJ0Z1Ps6gihCG | 7,838 | AMD ROCm 6.2.4 Ubuntu 24.04 `ggml-cuda.cu:132: CUDA error` | {
"login": "unclemusclez",
"id": 8789242,
"node_id": "MDQ6VXNlcjg3ODkyNDI=",
"avatar_url": "https://avatars.githubusercontent.com/u/8789242?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/unclemusclez",
"html_url": "https://github.com/unclemusclez",
"followers_url": "https://api.github.com... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 1 | 2024-11-26T06:49:31 | 2024-12-03T05:05:10 | 2024-12-03T05:05:10 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Not sure why `ggml-cuda.cu` is being called with AMD:
compiled from origin main
```
ggml-cuda.cu:132: CUDA error
Could not attach to process. If your uid matches the uid of the target
process, check the setting of /proc/sys/kernel/yama/ptrace_scope, or try
again as the root user. For... | {
"login": "unclemusclez",
"id": 8789242,
"node_id": "MDQ6VXNlcjg3ODkyNDI=",
"avatar_url": "https://avatars.githubusercontent.com/u/8789242?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/unclemusclez",
"html_url": "https://github.com/unclemusclez",
"followers_url": "https://api.github.com... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7838/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7838/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8389 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8389/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8389/comments | https://api.github.com/repos/ollama/ollama/issues/8389/events | https://github.com/ollama/ollama/issues/8389 | 2,782,229,709 | I_kwDOJ0Z1Ps6l1XTN | 8,389 | Ollama install script replaces the systemd profile | {
"login": "gerroon",
"id": 8519469,
"node_id": "MDQ6VXNlcjg1MTk0Njk=",
"avatar_url": "https://avatars.githubusercontent.com/u/8519469?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gerroon",
"html_url": "https://github.com/gerroon",
"followers_url": "https://api.github.com/users/gerroon/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 8 | 2025-01-12T01:55:55 | 2025-01-28T21:11:50 | 2025-01-28T21:11:49 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
The installation script doesn't pay attention to the existing systemd profile, so every new install replaces the existing systemd script. This is not the standard behavior in Debian or other distros; the script should at least ask for permission before replacing it.
This is the recommended script
... | {
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8389/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8389/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/3132 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3132/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3132/comments | https://api.github.com/repos/ollama/ollama/issues/3132/events | https://github.com/ollama/ollama/pull/3132 | 2,185,243,153 | PR_kwDOJ0Z1Ps5pko5_ | 3,132 | Fix Execution Error in /tmp with noexec for Issue #2436 | {
"login": "jshbmllr",
"id": 27757825,
"node_id": "MDQ6VXNlcjI3NzU3ODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/27757825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jshbmllr",
"html_url": "https://github.com/jshbmllr",
"followers_url": "https://api.github.com/users/jsh... | [] | closed | false | null | [] | null | 0 | 2024-03-14T02:15:44 | 2024-03-14T10:48:15 | 2024-03-14T10:48:15 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/3132",
"html_url": "https://github.com/ollama/ollama/pull/3132",
"diff_url": "https://github.com/ollama/ollama/pull/3132.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3132.patch",
"merged_at": null
} | In relation to [Issue #2436](https://github.com/ollama/ollama/issues/2436), which remains unresolved, this pull request introduces a fix similar to the one in [PR #2403](https://github.com/ollama/ollama/pull/2403). The issue arises on Linux systems where the /tmp directory is mounted with the noexec option, preventing ... | {
"login": "jshbmllr",
"id": 27757825,
"node_id": "MDQ6VXNlcjI3NzU3ODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/27757825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jshbmllr",
"html_url": "https://github.com/jshbmllr",
"followers_url": "https://api.github.com/users/jsh... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3132/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3132/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/4957 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4957/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4957/comments | https://api.github.com/repos/ollama/ollama/issues/4957/events | https://github.com/ollama/ollama/pull/4957 | 2,342,499,891 | PR_kwDOJ0Z1Ps5x5w7Z | 4,957 | Update README.md | {
"login": "ZeyoYT",
"id": 61089602,
"node_id": "MDQ6VXNlcjYxMDg5NjAy",
"avatar_url": "https://avatars.githubusercontent.com/u/61089602?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZeyoYT",
"html_url": "https://github.com/ZeyoYT",
"followers_url": "https://api.github.com/users/ZeyoYT/fo... | [] | closed | false | null | [] | null | 2 | 2024-06-09T21:29:54 | 2024-09-05T20:10:44 | 2024-09-05T20:10:44 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/4957",
"html_url": "https://github.com/ollama/ollama/pull/4957",
"diff_url": "https://github.com/ollama/ollama/pull/4957.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4957.patch",
"merged_at": "2024-09-05T20:10:44"
} | Add AiLama to the list of community apps in Extensions & Plugins
This is a duplicate pull request of #4481, but resolves conflicts | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4957/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4957/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/6116 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6116/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6116/comments | https://api.github.com/repos/ollama/ollama/issues/6116/events | https://github.com/ollama/ollama/issues/6116 | 2,441,916,883 | I_kwDOJ0Z1Ps6RjLHT | 6,116 | mistral nemo | {
"login": "Domi31tls",
"id": 124446863,
"node_id": "U_kgDOB2rojw",
"avatar_url": "https://avatars.githubusercontent.com/u/124446863?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Domi31tls",
"html_url": "https://github.com/Domi31tls",
"followers_url": "https://api.github.com/users/Domi31... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 2 | 2024-08-01T09:09:33 | 2024-08-01T20:32:55 | 2024-08-01T20:27:06 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
I use open-webui. When I want to use Mistral Nemo, I get an error 500:

### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.37 | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6116/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6116/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/4572 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4572/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4572/comments | https://api.github.com/repos/ollama/ollama/issues/4572/events | https://github.com/ollama/ollama/issues/4572 | 2,309,825,847 | I_kwDOJ0Z1Ps6JrSU3 | 4,572 | Error: llama runner process has terminated: exit status 0xc0000409 | {
"login": "NeoFii",
"id": 155638855,
"node_id": "U_kgDOCUbcRw",
"avatar_url": "https://avatars.githubusercontent.com/u/155638855?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NeoFii",
"html_url": "https://github.com/NeoFii",
"followers_url": "https://api.github.com/users/NeoFii/follower... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg... | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 35 | 2024-05-22T07:48:41 | 2024-12-03T01:35:05 | 2024-07-23T18:10:53 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
I encountered issues while deploying my fine-tuned model using Ollama. I have successfully created my own model locally.

When I used the command `ollama run legalassistant`, an error o... | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4572/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4572/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/1182 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1182/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1182/comments | https://api.github.com/repos/ollama/ollama/issues/1182/events | https://github.com/ollama/ollama/issues/1182 | 2,000,002,484 | I_kwDOJ0Z1Ps53NZ20 | 1,182 | Bug: --json mode going into a infinite loop? | {
"login": "hemanth",
"id": 18315,
"node_id": "MDQ6VXNlcjE4MzE1",
"avatar_url": "https://avatars.githubusercontent.com/u/18315?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hemanth",
"html_url": "https://github.com/hemanth",
"followers_url": "https://api.github.com/users/hemanth/follower... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | {
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/Br... | [
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api... | null | 5 | 2023-11-17T22:06:49 | 2024-03-12T18:40:35 | 2024-03-12T18:40:35 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ```sh
/tmp took 37s
❯ ollama run llama2 --format json
>>> give me 10 emojis with their meanings
{
.... # never ends for that input
```
https://github.com/jmorganca/ollama/assets/18315/6771fd1f-d0e3-4f1f-9e7b-e15bacf8acad
^ recording | {
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/Br... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1182/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1182/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/5396 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5396/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5396/comments | https://api.github.com/repos/ollama/ollama/issues/5396/events | https://github.com/ollama/ollama/issues/5396 | 2,382,594,935 | I_kwDOJ0Z1Ps6OA4N3 | 5,396 | deepseek-v2:236b Startup Issues | {
"login": "SongXiaoMao",
"id": 55074934,
"node_id": "MDQ6VXNlcjU1MDc0OTM0",
"avatar_url": "https://avatars.githubusercontent.com/u/55074934?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SongXiaoMao",
"html_url": "https://github.com/SongXiaoMao",
"followers_url": "https://api.github.com/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg... | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 2 | 2024-07-01T01:29:08 | 2024-08-01T22:36:49 | 2024-08-01T22:36:48 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?

Computer configuration: 4× RTX 3090, 128 GB memory
There are problems starting both models.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
ollama version is 0.1.48 | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5396/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5396/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/4099 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4099/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4099/comments | https://api.github.com/repos/ollama/ollama/issues/4099/events | https://github.com/ollama/ollama/issues/4099 | 2,275,430,545 | I_kwDOJ0Z1Ps6HoFCR | 4,099 | Please support gfx1103 in rocm docker image | {
"login": "LaurentBonnaud",
"id": 2168323,
"node_id": "MDQ6VXNlcjIxNjgzMjM=",
"avatar_url": "https://avatars.githubusercontent.com/u/2168323?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LaurentBonnaud",
"html_url": "https://github.com/LaurentBonnaud",
"followers_url": "https://api.gith... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 1 | 2024-05-02T12:34:14 | 2024-05-21T17:48:13 | 2024-05-21T17:48:13 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Hi,
my laptop has this SoC:
```
$ lscpu
[...]
Model name: AMD Ryzen 7 PRO 7840U w/ Radeon 780M Graphics
```
```
$ rocminfo
[...]
*******
Agent 2
*******
Name: gfx1103
Uuid: ... | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4099/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4099/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2166 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2166/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2166/comments | https://api.github.com/repos/ollama/ollama/issues/2166/events | https://github.com/ollama/ollama/issues/2166 | 2,097,591,985 | I_kwDOJ0Z1Ps59Brax | 2,166 | ROCm container CUDA error | {
"login": "Eelviny",
"id": 4560915,
"node_id": "MDQ6VXNlcjQ1NjA5MTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4560915?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Eelviny",
"html_url": "https://github.com/Eelviny",
"followers_url": "https://api.github.com/users/Eelviny/... | [] | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 4 | 2024-01-24T07:17:20 | 2024-01-30T17:23:36 | 2024-01-25T12:51:06 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I'm attempting to use an AMD Radeon RX 7900 XT on ollama v0.1.21 in a container that I built from the Dockerfile. I use podman to build and run containers, and my OS is Bluefin (Fedora Silverblue spin). I'm unsure whether this is an issue because I'm missing something on my host OS, or an issue with the container.
H... | {
"login": "Eelviny",
"id": 4560915,
"node_id": "MDQ6VXNlcjQ1NjA5MTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4560915?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Eelviny",
"html_url": "https://github.com/Eelviny",
"followers_url": "https://api.github.com/users/Eelviny/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2166/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2166/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2923 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2923/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2923/comments | https://api.github.com/repos/ollama/ollama/issues/2923/events | https://github.com/ollama/ollama/issues/2923 | 2,167,677,281 | I_kwDOJ0Z1Ps6BNCFh | 2,923 | how does memory work in cmd `ollama run openchat`? | {
"login": "TimmekHW",
"id": 94626112,
"node_id": "U_kgDOBaPhQA",
"avatar_url": "https://avatars.githubusercontent.com/u/94626112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TimmekHW",
"html_url": "https://github.com/TimmekHW",
"followers_url": "https://api.github.com/users/TimmekHW/fo... | [] | closed | false | null | [] | null | 1 | 2024-03-04T20:22:42 | 2024-03-06T22:32:11 | 2024-03-06T22:32:10 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | how does memory work in cmd `ollama run openchat`?
Could you share the code? Remembering chat history and context works well there, but my implementation of history is not working correctly.
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2923/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2923/timeline | null | not_planned | false |
https://api.github.com/repos/ollama/ollama/issues/6350 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6350/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6350/comments | https://api.github.com/repos/ollama/ollama/issues/6350/events | https://github.com/ollama/ollama/issues/6350 | 2,464,842,738 | I_kwDOJ0Z1Ps6S6oPy | 6,350 | Is this wrong in https://ollama.com/blog/gemma2 | {
"login": "wonpn",
"id": 14801003,
"node_id": "MDQ6VXNlcjE0ODAxMDAz",
"avatar_url": "https://avatars.githubusercontent.com/u/14801003?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wonpn",
"html_url": "https://github.com/wonpn",
"followers_url": "https://api.github.com/users/wonpn/follow... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | {
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/us... | [
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https... | null | 2 | 2024-08-14T03:54:35 | 2024-08-14T17:49:32 | 2024-08-14T17:49:32 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?

8B to 9B?
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_ | {
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/us... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6350/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6350/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2062 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2062/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2062/comments | https://api.github.com/repos/ollama/ollama/issues/2062/events | https://github.com/ollama/ollama/issues/2062 | 2,089,399,218 | I_kwDOJ0Z1Ps58ibOy | 2,062 | Add obsidian model | {
"login": "mak448a",
"id": 94062293,
"node_id": "U_kgDOBZtG1Q",
"avatar_url": "https://avatars.githubusercontent.com/u/94062293?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mak448a",
"html_url": "https://github.com/mak448a",
"followers_url": "https://api.github.com/users/mak448a/follow... | [
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] | open | false | null | [] | null | 2 | 2024-01-19T01:50:25 | 2024-03-11T17:59:40 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Could you add the obsidian model to the library? | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2062/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2062/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/6171 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6171/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6171/comments | https://api.github.com/repos/ollama/ollama/issues/6171/events | https://github.com/ollama/ollama/pull/6171 | 2,447,779,211 | PR_kwDOJ0Z1Ps53Z6Jq | 6,171 | removeall to remove non-empty temp dirs | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | [] | closed | false | null | [] | null | 1 | 2024-08-05T07:05:59 | 2024-08-09T22:47:15 | 2024-08-09T22:47:13 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/6171",
"html_url": "https://github.com/ollama/ollama/pull/6171",
"diff_url": "https://github.com/ollama/ollama/pull/6171.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6171.patch",
"merged_at": "2024-08-09T22:47:13"
} | `os.Remove()` does not remove non-empty directories so it'll error with `directory not empty`. instead, remove the expected content (`ollama.pid` and `runners`) individually, then remove the parent directory. remove the content explicitly so as to not accidentally remove things ollama does not own | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6171/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6171/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/7524 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7524/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7524/comments | https://api.github.com/repos/ollama/ollama/issues/7524/events | https://github.com/ollama/ollama/issues/7524 | 2,637,347,423 | I_kwDOJ0Z1Ps6dMrpf | 7,524 | Error: could not connect to ollama app, is it running? | {
"login": "BongozGoBOOM",
"id": 116317767,
"node_id": "U_kgDOBu7eRw",
"avatar_url": "https://avatars.githubusercontent.com/u/116317767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BongozGoBOOM",
"html_url": "https://github.com/BongozGoBOOM",
"followers_url": "https://api.github.com/use... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg... | open | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 20 | 2024-11-06T08:13:04 | 2025-01-13T00:57:17 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Tried versions v0.4.0, v0.3.14, and v0.3.13, all yielded the same exact results.

Attempted to start the app through start menu, file explorer, and the ollama serve command (in separ... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7524/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7524/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/7038 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7038/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7038/comments | https://api.github.com/repos/ollama/ollama/issues/7038/events | https://github.com/ollama/ollama/issues/7038 | 2,555,361,878 | I_kwDOJ0Z1Ps6YT7pW | 7,038 | Error: llama runner process has terminated: error loading modelvocabulary: cannot find tokenizer merges in model file | {
"login": "sparklyi",
"id": 64263737,
"node_id": "MDQ6VXNlcjY0MjYzNzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/64263737?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sparklyi",
"html_url": "https://github.com/sparklyi",
"followers_url": "https://api.github.com/users/spa... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | open | false | null | [] | null | 10 | 2024-09-30T01:59:29 | 2024-10-06T14:03:20 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
time : 09/30/2024
script:
```
FROM "./model-quant.gguf"
TEMPLATE """{{- if .System }}
<|im_start|>system {{ .System }}<|im_end|>
{{- end }}
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
SYSTEM """"""
PARAMETER stop <|im_start|>
PARAMETER stop <|im_end|>
```... | {
"login": "sparklyi",
"id": 64263737,
"node_id": "MDQ6VXNlcjY0MjYzNzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/64263737?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sparklyi",
"html_url": "https://github.com/sparklyi",
"followers_url": "https://api.github.com/users/spa... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7038/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7038/timeline | null | reopened | false |
https://api.github.com/repos/ollama/ollama/issues/3990 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3990/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3990/comments | https://api.github.com/repos/ollama/ollama/issues/3990/events | https://github.com/ollama/ollama/issues/3990 | 2,267,321,303 | I_kwDOJ0Z1Ps6HJJPX | 3,990 | how to upgrade ollama automatically on MAC PRO? | {
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/tao... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | closed | false | null | [] | null | 0 | 2024-04-28T03:43:02 | 2024-04-29T01:31:50 | 2024-04-29T01:31:50 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | how to upgrade ollama automatically on MAC PRO?
| {
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/tao... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3990/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3990/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/6395 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6395/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6395/comments | https://api.github.com/repos/ollama/ollama/issues/6395/events | https://github.com/ollama/ollama/pull/6395 | 2,470,948,574 | PR_kwDOJ0Z1Ps54nSwu | 6,395 | Make new tokenizer logic conditional | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [] | closed | false | null | [] | null | 4 | 2024-08-16T20:30:36 | 2024-08-25T00:25:40 | 2024-08-25T00:25:38 | COLLABORATOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/6395",
"html_url": "https://github.com/ollama/ollama/pull/6395",
"diff_url": "https://github.com/ollama/ollama/pull/6395.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6395.patch",
"merged_at": "2024-08-25T00:25:37"
} | Only use the new cgo tokenizer/detokenizer if we're using the new runners | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6395/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6395/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/4882 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4882/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4882/comments | https://api.github.com/repos/ollama/ollama/issues/4882/events | https://github.com/ollama/ollama/issues/4882 | 2,339,231,080 | I_kwDOJ0Z1Ps6LbdVo | 4,882 | mac app silently fails to install CLI link if /usr/local/bin/ missing | {
"login": "saimgulay",
"id": 120498676,
"node_id": "U_kgDOBy6p9A",
"avatar_url": "https://avatars.githubusercontent.com/u/120498676?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saimgulay",
"html_url": "https://github.com/saimgulay",
"followers_url": "https://api.github.com/users/saimgu... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677279472,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjf8y8A... | open | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 5 | 2024-06-06T22:12:58 | 2025-01-09T00:40:44 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Hi community,
I have macOS Sonoma 14.4.1 and my Ollama version is 0.1.41. I moved the app to the Applications folder, then ran the app, clicked the Next button, then clicked the Install button to install the command line, but nothing happens. I tried running as an admin but still faced the same ... | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4882/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4882/timeline | null | reopened | false |
https://api.github.com/repos/ollama/ollama/issues/8680 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8680/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8680/comments | https://api.github.com/repos/ollama/ollama/issues/8680/events | https://github.com/ollama/ollama/issues/8680 | 2,819,658,888 | I_kwDOJ0Z1Ps6oEJSI | 8,680 | api/chat not working in parallel with docker-compose | {
"login": "acclayer7",
"id": 178514264,
"node_id": "U_kgDOCqPpWA",
"avatar_url": "https://avatars.githubusercontent.com/u/178514264?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/acclayer7",
"html_url": "https://github.com/acclayer7",
"followers_url": "https://api.github.com/users/acclay... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | open | false | null | [] | null | 1 | 2025-01-30T00:54:32 | 2025-01-30T01:05:37 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Hello, my Ollama host has enough memory (16 GB VRAM). I set OLLAMA_NUM_PARALLEL=2 and OLLAMA_MAX_LOADED_MODELS=2, but I don't see any memory increase.
I run it with docker-compose; when I use the API, VRAM usage does not increase. It stays the same, and I still have 10 GB of VRAM... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8680/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8680/timeline | null | null | false |
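The report above sets OLLAMA_NUM_PARALLEL and OLLAMA_MAX_LOADED_MODELS under docker-compose. For reference, a minimal compose sketch that passes those variables into the container might look like this; the service name, port mapping, and GPU stanza are illustrative assumptions, not taken from the reporter's setup, and extra memory for parallel slots is generally only reserved once a model actually loads.

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    environment:
      # allow two concurrent requests per model, two resident models
      - OLLAMA_NUM_PARALLEL=2
      - OLLAMA_MAX_LOADED_MODELS=2
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```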
https://api.github.com/repos/ollama/ollama/issues/1652 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1652/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1652/comments | https://api.github.com/repos/ollama/ollama/issues/1652/events | https://github.com/ollama/ollama/issues/1652 | 2,051,887,042 | I_kwDOJ0Z1Ps56TU_C | 1,652 | In time | {
"login": "Xdcnft",
"id": 111935635,
"node_id": "U_kgDOBqwAkw",
"avatar_url": "https://avatars.githubusercontent.com/u/111935635?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Xdcnft",
"html_url": "https://github.com/Xdcnft",
"followers_url": "https://api.github.com/users/Xdcnft/follower... | [] | closed | false | null | [] | null | 0 | 2023-12-21T07:40:48 | 2023-12-21T07:40:57 | 2023-12-21T07:40:57 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | null | {
"login": "Xdcnft",
"id": 111935635,
"node_id": "U_kgDOBqwAkw",
"avatar_url": "https://avatars.githubusercontent.com/u/111935635?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Xdcnft",
"html_url": "https://github.com/Xdcnft",
"followers_url": "https://api.github.com/users/Xdcnft/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1652/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1652/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8591 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8591/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8591/comments | https://api.github.com/repos/ollama/ollama/issues/8591/events | https://github.com/ollama/ollama/issues/8591 | 2,811,472,497 | I_kwDOJ0Z1Ps6nk6px | 8,591 | High idle power consumption due to persistent CUDA initialization | {
"login": "SvenMeyer",
"id": 25609,
"node_id": "MDQ6VXNlcjI1NjA5",
"avatar_url": "https://avatars.githubusercontent.com/u/25609?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SvenMeyer",
"html_url": "https://github.com/SvenMeyer",
"followers_url": "https://api.github.com/users/SvenMeyer/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | open | false | null | [] | null | 4 | 2025-01-26T11:24:50 | 2025-01-27T12:31:37 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | # High idle power consumption due to PCIe bus unable to enter sleep state
## Issue Description
When running Ollama as a service with CUDA enabled, the system maintains unnecessarily high power consumption (~14W vs ~6W) even when idle. This is primarily caused by the PCIe bus being unable to enter sleep state (D3) due ... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8591/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8591/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/3764 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3764/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3764/comments | https://api.github.com/repos/ollama/ollama/issues/3764/events | https://github.com/ollama/ollama/issues/3764 | 2,253,875,609 | I_kwDOJ0Z1Ps6GV2mZ | 3,764 | Error: pull model manifest: 400 | {
"login": "zedmango",
"id": 33294054,
"node_id": "MDQ6VXNlcjMzMjk0MDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/33294054?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zedmango",
"html_url": "https://github.com/zedmango",
"followers_url": "https://api.github.com/users/zed... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 1 | 2024-04-19T20:06:49 | 2024-04-19T20:12:13 | 2024-04-19T20:12:12 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Got this strange error while trying to create a model.
```
$ ./createmodels.sh
transferring model data
creating model layer
creating template layer
creating parameters layer
creating config layer
using already created layer sha256:5499dfd64378623dc8ae420f29fc8e6a6e43f23198a290f5c4e96... | {
"login": "zedmango",
"id": 33294054,
"node_id": "MDQ6VXNlcjMzMjk0MDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/33294054?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zedmango",
"html_url": "https://github.com/zedmango",
"followers_url": "https://api.github.com/users/zed... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3764/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3764/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/118 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/118/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/118/comments | https://api.github.com/repos/ollama/ollama/issues/118/events | https://github.com/ollama/ollama/issues/118 | 1,811,251,138 | I_kwDOJ0Z1Ps5r9X_C | 118 | Crashed on M2 Air 8GB | {
"login": "chsasank",
"id": 9305875,
"node_id": "MDQ6VXNlcjkzMDU4NzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/9305875?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chsasank",
"html_url": "https://github.com/chsasank",
"followers_url": "https://api.github.com/users/chsas... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 6 | 2023-07-19T06:29:51 | 2023-08-23T17:41:55 | 2023-08-23T17:41:55 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ```[GIN] 2023/07/19 - 11:58:16 | 200 | 13m51s | 127.0.0.1 | POST "/api/pull"
llama.cpp: loading model from /Users/sasank/.ollama/models/blobs/sha256:8daa9615cce30c259a9555b1cc250d461d1bc69980a274b44d7eda0be78076d8
llama_model_load_internal: format = ggjt v3 (latest)
llama_model_load_internal: n_... | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/118/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/118/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2370 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2370/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2370/comments | https://api.github.com/repos/ollama/ollama/issues/2370/events | https://github.com/ollama/ollama/issues/2370 | 2,120,108,637 | I_kwDOJ0Z1Ps5-Xkpd | 2,370 | 36GB Macbook not using GPU for models that could fit | {
"login": "WinnieP",
"id": 497472,
"node_id": "MDQ6VXNlcjQ5NzQ3Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/497472?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/WinnieP",
"html_url": "https://github.com/WinnieP",
"followers_url": "https://api.github.com/users/WinnieP/fo... | [
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 6 | 2024-02-06T07:09:27 | 2024-04-09T05:04:37 | 2024-03-12T21:30:44 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | https://github.com/ollama/ollama/blob/27aa2d4a194c6daeafbd00391f475628deccce72/gpu/gpu_darwin.go#L24C1-L28C3 In older versions of Ollama, certain models would run on the GPU of a 36GB M3 MacBook Pro (specifically the q4_K_M quantization of mixtral). Now it runs on the CPU. I believe macOS allows closer to ~75% of the memory to be allocated to the GPU on this model, not 66%. ```ggml_metal_init: recommendedMaxWorkingSetSize = 28991.03 MB``` | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2370/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2370/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8287 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8287/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8287/comments | https://api.github.com/repos/ollama/ollama/issues/8287/events | https://github.com/ollama/ollama/issues/8287 | 2,765,927,156 | I_kwDOJ0Z1Ps6k3LL0 | 8,287 | The <toolcall> in nemotron-mini. Again. | {
"login": "tripolskypetr",
"id": 19227776,
"node_id": "MDQ6VXNlcjE5MjI3Nzc2",
"avatar_url": "https://avatars.githubusercontent.com/u/19227776?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tripolskypetr",
"html_url": "https://github.com/tripolskypetr",
"followers_url": "https://api.githu... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | {
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/... | [
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "htt... | null | 27 | 2025-01-02T12:11:25 | 2025-01-29T19:14:51 | 2025-01-29T19:14:51 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
# The problem
I am trying to implement an agent swarm for Ollama from scratch. I made the triage agent: the intent navigator. It should call `navigate_to_refund_agent_tool` or `navigate_to_sales_agent_tool` depending on the user's choice. Both tools take empty arguments, like
```tsx
function... | {
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8287/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8287/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/4244 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4244/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4244/comments | https://api.github.com/repos/ollama/ollama/issues/4244/events | https://github.com/ollama/ollama/pull/4244 | 2,284,455,893 | PR_kwDOJ0Z1Ps5u0iHB | 4,244 | skip if same quantization | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | [] | closed | false | null | [] | null | 0 | 2024-05-08T00:44:44 | 2024-05-08T02:03:38 | 2024-05-08T02:03:38 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/4244",
"html_url": "https://github.com/ollama/ollama/pull/4244",
"diff_url": "https://github.com/ollama/ollama/pull/4244.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4244.patch",
"merged_at": "2024-05-08T02:03:38"
} | this skips quantization if the input and output are the same file type. Most of the time, this means the input and output are both f16 | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4244/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4244/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/929 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/929/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/929/comments | https://api.github.com/repos/ollama/ollama/issues/929/events | https://github.com/ollama/ollama/issues/929 | 1,964,769,465 | I_kwDOJ0Z1Ps51HAC5 | 929 | FR: Increase prompt size limit on UI | {
"login": "hemanth",
"id": 18315,
"node_id": "MDQ6VXNlcjE4MzE1",
"avatar_url": "https://avatars.githubusercontent.com/u/18315?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hemanth",
"html_url": "https://github.com/hemanth",
"followers_url": "https://api.github.com/users/hemanth/follower... | [] | closed | false | null | [] | null | 5 | 2023-10-27T04:59:26 | 2023-10-30T22:28:49 | 2023-10-30T22:28:49 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | <img width="378" alt="image" src="https://github.com/jmorganca/ollama/assets/18315/2bee7056-b101-4430-a0c1-400cff494405">
This is limited to 255 chars only. | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/929/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/929/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/6852 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6852/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6852/comments | https://api.github.com/repos/ollama/ollama/issues/6852/events | https://github.com/ollama/ollama/issues/6852 | 2,532,873,826 | I_kwDOJ0Z1Ps6W-JZi | 6,852 | Fetch Failed Error on using OLLAMA locally with nomic-embed-text and llama3.1:8b | {
"login": "saisandeepbalbari",
"id": 25894087,
"node_id": "MDQ6VXNlcjI1ODk0MDg3",
"avatar_url": "https://avatars.githubusercontent.com/u/25894087?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saisandeepbalbari",
"html_url": "https://github.com/saisandeepbalbari",
"followers_url": "https... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 12 | 2024-09-18T06:48:26 | 2024-12-02T22:56:23 | 2024-12-02T22:56:23 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
I'm using Ollama with AnythingLLM. It takes a long time to respond to prompts.
The following error appears in the Docker logs of AnythingLLM:
[OllamaEmbedder] Embedding 1 chunks of text with nomic-embed-text:latest.
TypeError: fetch failed
at node:internal/deps/undici/... | {
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6852/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6852/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/771 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/771/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/771/comments | https://api.github.com/repos/ollama/ollama/issues/771/events | https://github.com/ollama/ollama/issues/771 | 1,940,762,088 | I_kwDOJ0Z1Ps5zra3o | 771 | Looking up environment variables while starting the server via Electron | {
"login": "ba1uev",
"id": 7990776,
"node_id": "MDQ6VXNlcjc5OTA3NzY=",
"avatar_url": "https://avatars.githubusercontent.com/u/7990776?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ba1uev",
"html_url": "https://github.com/ba1uev",
"followers_url": "https://api.github.com/users/ba1uev/foll... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5667396210,
"node_id": ... | closed | false | null | [] | null | 5 | 2023-10-12T20:41:16 | 2025-01-28T18:32:04 | 2025-01-28T18:32:02 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Constant users of web clients may configure some of the supported variables in profile files such as `~/.bash_profile`, `~/.zshrc`, etc.
```bash
echo "export OLLAMA_ORIGINS=https://example.com OLLAMA_HOST=0.0.0.0:1337" >> ~/.zshrc
```
The server launched manually using `ollama serve` will utilize these variables. I... | {
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/Br... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/771/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/771/timeline | null | completed | false |
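The feature request above notes that a server launched from the Electron app does not see shell-profile variables. On macOS, one documented workaround is to set variables for GUI applications with launchctl instead of the shell profile; the values below simply mirror the report's example.

```
launchctl setenv OLLAMA_ORIGINS "https://example.com"
launchctl setenv OLLAMA_HOST "0.0.0.0:1337"
# quit and reopen the Ollama menu-bar app so it inherits the new values
```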
https://api.github.com/repos/ollama/ollama/issues/4190 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4190/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4190/comments | https://api.github.com/repos/ollama/ollama/issues/4190/events | https://github.com/ollama/ollama/pull/4190 | 2,279,894,149 | PR_kwDOJ0Z1Ps5ulb_t | 4,190 | fix golangci workflow not enable gofmt and goimports | {
"login": "alwqx",
"id": 9915368,
"node_id": "MDQ6VXNlcjk5MTUzNjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/9915368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alwqx",
"html_url": "https://github.com/alwqx",
"followers_url": "https://api.github.com/users/alwqx/follower... | [] | closed | false | null | [] | null | 0 | 2024-05-06T02:18:23 | 2024-05-07T16:49:40 | 2024-05-07T16:49:40 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/4190",
"html_url": "https://github.com/ollama/ollama/pull/4190",
"diff_url": "https://github.com/ollama/ollama/pull/4190.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4190.patch",
"merged_at": "2024-05-07T16:49:40"
} | Hi, this PR fixes the golangci workflow not enabling `gofmt` and `goimports` in the GitHub workflows.
All workflows have passed. But I notice `* text eol=lf` was removed in commit [9164b0161](https://github.com/ollama/ollama/commit/9164b0161bcb24e543cba835a8863b80af2c0c21), in which v0.1.33 was released. So I still need help from m... | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4190/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4190/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/3643 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3643/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3643/comments | https://api.github.com/repos/ollama/ollama/issues/3643/events | https://github.com/ollama/ollama/issues/3643 | 2,242,721,513 | I_kwDOJ0Z1Ps6FrTbp | 3,643 | how to change the max input token length when I run ‘’ollama run gemma:7b-instruct-v1.1-fp16‘’ | {
"login": "dh12306",
"id": 20471681,
"node_id": "MDQ6VXNlcjIwNDcxNjgx",
"avatar_url": "https://avatars.githubusercontent.com/u/20471681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dh12306",
"html_url": "https://github.com/dh12306",
"followers_url": "https://api.github.com/users/dh1230... | [] | closed | false | null | [] | null | 7 | 2024-04-15T05:12:38 | 2024-12-02T09:55:40 | 2024-04-17T00:46:54 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | the default input token lens is 2048 ? how can I change it because the gemma can support more input tokens
| {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3643/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3643/timeline | null | completed | false |
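The question above asks how to raise the input token limit. Ollama exposes this as the `num_ctx` parameter, which can be baked into a derived model via a Modelfile; the 8192 value and the `gemma-8k` name below are examples, bounded by what the model and available memory support.

```
# Modelfile: derive a model with a larger context window
FROM gemma:7b-instruct-v1.1-fp16
PARAMETER num_ctx 8192
```

Build it with `ollama create gemma-8k -f Modelfile` and run the new name; the same value can also be set transiently in an interactive session with `/set parameter num_ctx 8192`, or per request through the API's `options.num_ctx` field.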
https://api.github.com/repos/ollama/ollama/issues/3285 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3285/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3285/comments | https://api.github.com/repos/ollama/ollama/issues/3285/events | https://github.com/ollama/ollama/issues/3285 | 2,200,109,066 | I_kwDOJ0Z1Ps6DIwAK | 3,285 | gemma accuracy down from 0.128 to 0.129 | {
"login": "RamiKassouf",
"id": 92019309,
"node_id": "U_kgDOBXwabQ",
"avatar_url": "https://avatars.githubusercontent.com/u/92019309?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RamiKassouf",
"html_url": "https://github.com/RamiKassouf",
"followers_url": "https://api.github.com/users/Ra... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA... | open | false | null | [] | null | 3 | 2024-03-21T12:48:18 | 2024-03-30T04:16:45 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Prompts that were producing the correct results are now producing different (false) outputs
### What did you expect to see?
Correctly formatted YAML with correct inputs based on a custom prompt
-> Got YAML with indentation issues, missing fields, and the wrong structure, even though ther... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3285/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3285/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/8601 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8601/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8601/comments | https://api.github.com/repos/ollama/ollama/issues/8601/events | https://github.com/ollama/ollama/pull/8601 | 2,812,057,230 | PR_kwDOJ0Z1Ps6JCJq5 | 8,601 | README: Add handy-ollama to tutorial | {
"login": "AXYZdong",
"id": 45477220,
"node_id": "MDQ6VXNlcjQ1NDc3MjIw",
"avatar_url": "https://avatars.githubusercontent.com/u/45477220?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AXYZdong",
"html_url": "https://github.com/AXYZdong",
"followers_url": "https://api.github.com/users/AXY... | [] | open | false | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | [
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.g... | null | 0 | 2025-01-27T04:29:41 | 2025-01-27T17:08:48 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/8601",
"html_url": "https://github.com/ollama/ollama/pull/8601",
"diff_url": "https://github.com/ollama/ollama/pull/8601.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8601.patch",
"merged_at": null
} | Chinese Tutorial for Ollama by [Datawhale](https://github.com/datawhalechina), China's largest open-source AI learning community.
We'd like to contribute to the Ollama community by announcing the release of our open-source Chinese tutorial.
This tutorial aims to be comprehensive and easy to understand, covering:... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8601/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8601/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/6717 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6717/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6717/comments | https://api.github.com/repos/ollama/ollama/issues/6717/events | https://github.com/ollama/ollama/pull/6717 | 2,515,025,890 | PR_kwDOJ0Z1Ps566JsO | 6,717 | Improve nvidia GPU discovery error handling | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [] | open | false | null | [] | null | 0 | 2024-09-09T22:19:59 | 2024-11-27T20:55:56 | null | COLLABORATOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/6717",
"html_url": "https://github.com/ollama/ollama/pull/6717",
"diff_url": "https://github.com/ollama/ollama/pull/6717.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6717.patch",
"merged_at": null
In some cases, the CUDA library may respond with a status code indicating we should retry later.
If we get an error, use the applicable CUDA library error-string function to get a human-readable explanation.
Improve logging during retries in the server subprocess logic as well.
Fixes #6637 | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6717/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6717/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/7804 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7804/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7804/comments | https://api.github.com/repos/ollama/ollama/issues/7804/events | https://github.com/ollama/ollama/issues/7804 | 2,684,833,085 | I_kwDOJ0Z1Ps6gB009 | 7,804 | Not reading image files with vision models | {
"login": "whatToUseThisFor",
"id": 130185104,
"node_id": "U_kgDOB8J3kA",
"avatar_url": "https://avatars.githubusercontent.com/u/130185104?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/whatToUseThisFor",
"html_url": "https://github.com/whatToUseThisFor",
"followers_url": "https://api.gi... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 6 | 2024-11-22T22:43:22 | 2024-11-23T17:51:24 | 2024-11-23T17:51:24 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?

When I try to give an image file to a model with the "vision" tag, it says that it can't access files on my computer.
I've tried with llava 7b, llava 13b, and llama3.2-vision (all times I tried were on a... | {
"login": "whatToUseThisFor",
"id": 130185104,
"node_id": "U_kgDOB8J3kA",
"avatar_url": "https://avatars.githubusercontent.com/u/130185104?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/whatToUseThisFor",
"html_url": "https://github.com/whatToUseThisFor",
"followers_url": "https://api.gi... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7804/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7804/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/6842 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6842/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6842/comments | https://api.github.com/repos/ollama/ollama/issues/6842/events | https://github.com/ollama/ollama/pull/6842 | 2,531,894,590 | PR_kwDOJ0Z1Ps57zWo0 | 6,842 | llama: Refine developer docs for Go server | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [] | closed | false | null | [] | null | 2 | 2024-09-17T19:09:06 | 2024-09-27T22:12:43 | 2024-09-27T22:12:40 | COLLABORATOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/6842",
"html_url": "https://github.com/ollama/ollama/pull/6842",
"diff_url": "https://github.com/ollama/ollama/pull/6842.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6842.patch",
"merged_at": "2024-09-27T22:12:40"
} | This enhances the documentation for development focusing on a minimal single known to work set of tools. | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6842/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6842/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/1778 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1778/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1778/comments | https://api.github.com/repos/ollama/ollama/issues/1778/events | https://github.com/ollama/ollama/pull/1778 | 2,064,770,937 | PR_kwDOJ0Z1Ps5jLXHL | 1,778 | Fail fast on WSL1 while allowing on WSL2 | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [] | closed | false | null | [] | null | 0 | 2024-01-03T23:16:12 | 2024-01-04T00:18:44 | 2024-01-04T00:18:41 | COLLABORATOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/1778",
"html_url": "https://github.com/ollama/ollama/pull/1778",
"diff_url": "https://github.com/ollama/ollama/pull/1778.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1778.patch",
"merged_at": "2024-01-04T00:18:41"
} | This prevents users from accidentally installing on WSL1 with instructions guiding how to upgrade their WSL instance to version 2. Once running WSL2 if you have an NVIDIA card, you can follow their instructions to set up GPU passthrough and run models on the GPU. This is not possible on WSL1.
Example output.
... | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1778/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1778/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/8332 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8332/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8332/comments | https://api.github.com/repos/ollama/ollama/issues/8332/events | https://github.com/ollama/ollama/issues/8332 | 2,772,129,016 | I_kwDOJ0Z1Ps6lO1T4 | 8,332 | Allow set the type of K/V cache separately | {
"login": "ag2s20150909",
"id": 19373730,
"node_id": "MDQ6VXNlcjE5MzczNzMw",
"avatar_url": "https://avatars.githubusercontent.com/u/19373730?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ag2s20150909",
"html_url": "https://github.com/ag2s20150909",
"followers_url": "https://api.github.c... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | open | false | null | [] | null | 1 | 2025-01-07T07:50:27 | 2025-01-21T14:03:20 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Allow set the type of K/V cache separately
On Qwen2-7B,
when K/V cache both `q4_0` produces weird results.
when k is `q4_0` and v is `q8_0` produces weird results.
when k is `q8_0` and v is `q4_0` produces normal results. | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8332/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8332/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/2491 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2491/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2491/comments | https://api.github.com/repos/ollama/ollama/issues/2491/events | https://github.com/ollama/ollama/issues/2491 | 2,134,227,302 | I_kwDOJ0Z1Ps5_Nblm | 2,491 | How to install ollama on ubuntu with specific version | {
"login": "MugdhaHardikar-GSLab",
"id": 5062147,
"node_id": "MDQ6VXNlcjUwNjIxNDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5062147?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MugdhaHardikar-GSLab",
"html_url": "https://github.com/MugdhaHardikar-GSLab",
"followers_url":... | [] | closed | false | null | [] | null | 8 | 2024-02-14T12:16:23 | 2025-01-21T18:10:51 | 2024-02-20T03:59:33 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I want to install the ollama on my ubuntu server but every few days new version of ollama gets installed. I want to fix the version of the ollama getting installed on my machine. Current install.sh doesn't seem to have that functionality. IS there any way? | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2491/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2491/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/114 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/114/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/114/comments | https://api.github.com/repos/ollama/ollama/issues/114/events | https://github.com/ollama/ollama/issues/114 | 1,811,136,432 | I_kwDOJ0Z1Ps5r87-w | 114 | pls Wizard Uncensored | {
"login": "nathanleclaire",
"id": 1476820,
"node_id": "MDQ6VXNlcjE0NzY4MjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1476820?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nathanleclaire",
"html_url": "https://github.com/nathanleclaire",
"followers_url": "https://api.gith... | [] | closed | false | null | [] | null | 3 | 2023-07-19T04:50:38 | 2023-07-19T15:22:46 | 2023-07-19T06:38:05 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | https://huggingface.co/TheBloke/WizardLM-13B-Uncensored-GGML good performance ime | {
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/us... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/114/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/114/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2411 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2411/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2411/comments | https://api.github.com/repos/ollama/ollama/issues/2411/events | https://github.com/ollama/ollama/issues/2411 | 2,125,248,757 | I_kwDOJ0Z1Ps5-rLj1 | 2,411 | Discrete AMD GPU not used, CPU used instead | {
"login": "haplo",
"id": 71658,
"node_id": "MDQ6VXNlcjcxNjU4",
"avatar_url": "https://avatars.githubusercontent.com/u/71658?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/haplo",
"html_url": "https://github.com/haplo",
"followers_url": "https://api.github.com/users/haplo/followers",
"f... | [
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 28 | 2024-02-08T13:57:34 | 2024-04-29T12:57:14 | 2024-03-12T23:30:18 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | My system has both an integrated and a dedicated GPU (an AMD Radeon 7900XTX). I see ollama ignores the integrated card, detects the 7900XTX but then it goes ahead and uses the CPU (Ryzen 7900).
I'm running ollama 0.1.23 from Arch Linux repository. This should include the fix at #2195, I see in the logs that `ROCR_VI... | {
"login": "haplo",
"id": 71658,
"node_id": "MDQ6VXNlcjcxNjU4",
"avatar_url": "https://avatars.githubusercontent.com/u/71658?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/haplo",
"html_url": "https://github.com/haplo",
"followers_url": "https://api.github.com/users/haplo/followers",
"f... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2411/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2411/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/5017 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5017/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5017/comments | https://api.github.com/repos/ollama/ollama/issues/5017/events | https://github.com/ollama/ollama/issues/5017 | 2,350,572,877 | I_kwDOJ0Z1Ps6MGuVN | 5,017 | Using Ollama in a Dockerfile | {
"login": "Deepansharora27",
"id": 43300955,
"node_id": "MDQ6VXNlcjQzMzAwOTU1",
"avatar_url": "https://avatars.githubusercontent.com/u/43300955?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Deepansharora27",
"html_url": "https://github.com/Deepansharora27",
"followers_url": "https://api... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 2 | 2024-06-13T08:51:50 | 2024-06-18T22:24:24 | 2024-06-18T22:24:08 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Hi,
I Have Been Trying to Use Ollama in My Dockerfile like this
```
FROM python:3.10 AS builder
WORKDIR /usr/src/app
ENV PATH="/venv/bin:$PATH"
RUN apt-get update && apt-get install -y git
RUN python -m venv /venv
COPY . /usr/src/app
RUN pip install --no-cache-dir -r requirements.... | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5017/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5017/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2224 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2224/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2224/comments | https://api.github.com/repos/ollama/ollama/issues/2224/events | https://github.com/ollama/ollama/pull/2224 | 2,103,258,044 | PR_kwDOJ0Z1Ps5lNr9u | 2,224 | ROCm: Correct the response string in rocm_get_version function | {
"login": "jaglinux",
"id": 1555686,
"node_id": "MDQ6VXNlcjE1NTU2ODY=",
"avatar_url": "https://avatars.githubusercontent.com/u/1555686?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jaglinux",
"html_url": "https://github.com/jaglinux",
"followers_url": "https://api.github.com/users/jagli... | [] | closed | false | null | [] | null | 1 | 2024-01-27T06:10:58 | 2024-01-27T18:42:22 | 2024-01-27T15:29:33 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/2224",
"html_url": "https://github.com/ollama/ollama/pull/2224",
"diff_url": "https://github.com/ollama/ollama/pull/2224.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2224.patch",
"merged_at": "2024-01-27T15:29:33"
} | null | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2224/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2224/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/1001 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1001/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1001/comments | https://api.github.com/repos/ollama/ollama/issues/1001/events | https://github.com/ollama/ollama/pull/1001 | 1,977,375,530 | PR_kwDOJ0Z1Ps5emRq8 | 1,001 | Add ModelFusion community integration | {
"login": "lgrammel",
"id": 205036,
"node_id": "MDQ6VXNlcjIwNTAzNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/205036?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lgrammel",
"html_url": "https://github.com/lgrammel",
"followers_url": "https://api.github.com/users/lgramme... | [] | closed | false | null | [] | null | 1 | 2023-11-04T14:43:21 | 2023-11-06T17:55:31 | 2023-11-06T17:53:00 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/1001",
"html_url": "https://github.com/ollama/ollama/pull/1001",
"diff_url": "https://github.com/ollama/ollama/pull/1001.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1001.patch",
"merged_at": null
} | null | {
"login": "lgrammel",
"id": 205036,
"node_id": "MDQ6VXNlcjIwNTAzNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/205036?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lgrammel",
"html_url": "https://github.com/lgrammel",
"followers_url": "https://api.github.com/users/lgramme... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1001/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1001/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/6290 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6290/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6290/comments | https://api.github.com/repos/ollama/ollama/issues/6290/events | https://github.com/ollama/ollama/pull/6290 | 2,458,468,816 | PR_kwDOJ0Z1Ps53-i9S | 6,290 | Harden intel boostrap for nil pointers | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [] | closed | false | null | [] | null | 0 | 2024-08-09T18:34:16 | 2024-08-09T19:14:46 | 2024-08-09T19:14:43 | COLLABORATOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/6290",
"html_url": "https://github.com/ollama/ollama/pull/6290",
"diff_url": "https://github.com/ollama/ollama/pull/6290.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6290.patch",
"merged_at": "2024-08-09T19:14:43"
} | If the user enables intel GPU discovery, but the library doesn't initialize, we'd crash over a nil pointer.
Fixes #6284 | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6290/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6290/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/1861 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1861/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1861/comments | https://api.github.com/repos/ollama/ollama/issues/1861/events | https://github.com/ollama/ollama/issues/1861 | 2,071,687,990 | I_kwDOJ0Z1Ps57e3M2 | 1,861 | [Bug] Phi-2 template incorrect | {
"login": "coder543",
"id": 726063,
"node_id": "MDQ6VXNlcjcyNjA2Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/726063?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coder543",
"html_url": "https://github.com/coder543",
"followers_url": "https://api.github.com/users/coder54... | [] | closed | false | null | [] | null | 1 | 2024-01-09T06:05:26 | 2024-01-10T03:43:35 | 2024-01-10T03:43:17 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I believe the template being used for Phi-2 is incorrect.
Here is an example conversation:
```
ollama run phi
>>> What is the LHC?
The Large Hadron Collider (LHC) is a circular particle
accelerator located at CERN, the European Organization for
Nuclear Research, near Geneva, Switzerland. It was construct... | {
"login": "coder543",
"id": 726063,
"node_id": "MDQ6VXNlcjcyNjA2Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/726063?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coder543",
"html_url": "https://github.com/coder543",
"followers_url": "https://api.github.com/users/coder54... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1861/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1861/timeline | null | not_planned | false |
https://api.github.com/repos/ollama/ollama/issues/8388 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8388/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8388/comments | https://api.github.com/repos/ollama/ollama/issues/8388/events | https://github.com/ollama/ollama/pull/8388 | 2,782,212,202 | PR_kwDOJ0Z1Ps6HcMug | 8,388 | add new create api doc | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | [] | closed | false | null | [] | null | 0 | 2025-01-12T00:45:51 | 2025-01-14T01:30:26 | 2025-01-14T01:30:24 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/8388",
"html_url": "https://github.com/ollama/ollama/pull/8388",
"diff_url": "https://github.com/ollama/ollama/pull/8388.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8388.patch",
"merged_at": "2025-01-14T01:30:24"
} | This replaces the existing `POST /api/create` documentation in the API docs. It covers the basics of how to create from an existing model, a GGUF file, or a safetensors file.
Note that I haven't *yet* included examples for each optional parameter. | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8388/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8388/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/8336 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8336/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8336/comments | https://api.github.com/repos/ollama/ollama/issues/8336/events | https://github.com/ollama/ollama/issues/8336 | 2,773,030,163 | I_kwDOJ0Z1Ps6lSRUT | 8,336 | Rerank models.... WHERE ARE THEY??????????? | {
"login": "Crimson-Hawk-1",
"id": 8478529,
"node_id": "MDQ6VXNlcjg0Nzg1Mjk=",
"avatar_url": "https://avatars.githubusercontent.com/u/8478529?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Crimson-Hawk-1",
"html_url": "https://github.com/Crimson-Hawk-1",
"followers_url": "https://api.gith... | [
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] | closed | false | null | [] | null | 3 | 2025-01-07T14:49:32 | 2025-01-07T21:05:22 | 2025-01-07T21:05:21 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | When will we have rerank models in Ollama? | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8336/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8336/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8099 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8099/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8099/comments | https://api.github.com/repos/ollama/ollama/issues/8099/events | https://github.com/ollama/ollama/issues/8099 | 2,740,094,415 | I_kwDOJ0Z1Ps6jUoXP | 8,099 | ollama run silently truncating prompt | {
"login": "daniel-j-h",
"id": 527241,
"node_id": "MDQ6VXNlcjUyNzI0MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/527241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/daniel-j-h",
"html_url": "https://github.com/daniel-j-h",
"followers_url": "https://api.github.com/users/d... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 3 | 2024-12-14T18:45:58 | 2024-12-17T19:35:32 | 2024-12-17T19:35:32 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
The documentation shows us how to use `ollama run` to summarize a file, see
$ ollama run llama3.2 "Summarize this file: $(cat README.md)"
https://github.com/ollama/ollama?tab=readme-ov-file#pass-the-prompt-as-an-argument
What's not obvious here is that by default the prompt (and the... | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8099/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8099/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2134 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2134/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2134/comments | https://api.github.com/repos/ollama/ollama/issues/2134/events | https://github.com/ollama/ollama/pull/2134 | 2,093,473,313 | PR_kwDOJ0Z1Ps5ksqk8 | 2,134 | readline: drop not use min function | {
"login": "mengzhuo",
"id": 885662,
"node_id": "MDQ6VXNlcjg4NTY2Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/885662?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mengzhuo",
"html_url": "https://github.com/mengzhuo",
"followers_url": "https://api.github.com/users/mengzhu... | [] | closed | false | null | [] | null | 1 | 2024-01-22T09:29:41 | 2024-01-22T16:15:08 | 2024-01-22T16:15:08 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/2134",
"html_url": "https://github.com/ollama/ollama/pull/2134",
"diff_url": "https://github.com/ollama/ollama/pull/2134.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2134.patch",
"merged_at": "2024-01-22T16:15:08"
} | Since [Go1.21 (go.mod)](https://go.dev/doc/go1.21), Go adds min builtin function. | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2134/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2134/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/1755 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1755/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1755/comments | https://api.github.com/repos/ollama/ollama/issues/1755/events | https://github.com/ollama/ollama/issues/1755 | 2,061,660,731 | I_kwDOJ0Z1Ps564nI7 | 1,755 | [enhancement] use bert.cpp for /api/embeddings | {
"login": "fakezeta",
"id": 25375389,
"node_id": "MDQ6VXNlcjI1Mzc1Mzg5",
"avatar_url": "https://avatars.githubusercontent.com/u/25375389?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fakezeta",
"html_url": "https://github.com/fakezeta",
"followers_url": "https://api.github.com/users/fak... | [] | closed | false | null | [] | null | 2 | 2024-01-01T16:51:14 | 2024-01-02T11:29:19 | 2024-01-02T11:29:19 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Llama2 and Mistral base models are quite poor at embeddings compared to sentence-transformer models like BERT.
Why not integrate [bert.cpp](https://github.com/skeskinen/bert.cpp) or [sentence-transformers](https://sbert.net/) for the `api/embeddings` endpoint so we can have the best of both architectures?
| {
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/Br... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1755/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1755/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/2937 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2937/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2937/comments | https://api.github.com/repos/ollama/ollama/issues/2937/events | https://github.com/ollama/ollama/issues/2937 | 2,169,495,179 | I_kwDOJ0Z1Ps6BT96L | 2,937 | Unable to pass embeddings to the api call | {
"login": "brobles82",
"id": 2970237,
"node_id": "MDQ6VXNlcjI5NzAyMzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/2970237?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/brobles82",
"html_url": "https://github.com/brobles82",
"followers_url": "https://api.github.com/users/br... | [] | closed | false | {
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers"... | [
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/... | null | 2 | 2024-03-05T15:15:22 | 2024-03-07T07:39:43 | 2024-03-07T07:39:43 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Im using this code for generate embeddings
```
EMBEDDINGS_RESPONSE=$(curl "http://localhost:11434/api/embeddings" -d '{
"model": "mistral",
"prompt": "Spiderman is color green"
}')
EMBEDDINGS=$(echo $EMBEDDINGS_RESPONSE | jq '.embedding')
```
And then when I try to use the embeddings as context for ne... | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2937/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2937/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8367 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8367/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8367/comments | https://api.github.com/repos/ollama/ollama/issues/8367/events | https://github.com/ollama/ollama/issues/8367 | 2,778,635,661 | I_kwDOJ0Z1Ps6lnp2N | 8,367 | Single json expected when streaming set to false | {
"login": "gklcbord",
"id": 176333143,
"node_id": "U_kgDOCoKhVw",
"avatar_url": "https://avatars.githubusercontent.com/u/176333143?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gklcbord",
"html_url": "https://github.com/gklcbord",
"followers_url": "https://api.github.com/users/gklcbord/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 2 | 2025-01-09T19:46:37 | 2025-01-10T14:50:45 | 2025-01-10T14:50:45 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Trying the API with this input json:
{
"model": "llama3.2",
"prompt": "Why is the sky blue?",
"streaming": false
}
and I am getting return similar to below:
{
"model": "llama3.2",
"created_at": "2025-01-09T19:31:13.2233009Z",
"response": "The",
"done": false... | {
"login": "gklcbord",
"id": 176333143,
"node_id": "U_kgDOCoKhVw",
"avatar_url": "https://avatars.githubusercontent.com/u/176333143?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gklcbord",
"html_url": "https://github.com/gklcbord",
"followers_url": "https://api.github.com/users/gklcbord/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8367/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8367/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/6213 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6213/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6213/comments | https://api.github.com/repos/ollama/ollama/issues/6213/events | https://github.com/ollama/ollama/issues/6213 | 2,451,901,156 | I_kwDOJ0Z1Ps6SJQrk | 6,213 | Different behavior for "tool" and "function" roles | {
"login": "matheusfvesco",
"id": 114014793,
"node_id": "U_kgDOBsu6SQ",
"avatar_url": "https://avatars.githubusercontent.com/u/114014793?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/matheusfvesco",
"html_url": "https://github.com/matheusfvesco",
"followers_url": "https://api.github.com/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 2 | 2024-08-07T00:36:58 | 2024-08-08T00:40:27 | 2024-08-07T17:17:26 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Ollama models only reply to/summarize the result of function calls if the message is created with the role "tool". If the role "function" is used, the model simply returns an empty string or says it can't do what I asked and keeps on doing it, no matter the chat history.
Example code:
``... | {
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjha... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6213/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6213/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8480 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8480/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8480/comments | https://api.github.com/repos/ollama/ollama/issues/8480/events | https://github.com/ollama/ollama/pull/8480 | 2,796,853,718 | PR_kwDOJ0Z1Ps6IOuRZ | 8,480 | check bounds for blob parts | {
"login": "bbSnavy",
"id": 46828965,
"node_id": "MDQ6VXNlcjQ2ODI4OTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/46828965?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bbSnavy",
"html_url": "https://github.com/bbSnavy",
"followers_url": "https://api.github.com/users/bbSnav... | [] | open | false | null | [] | null | 0 | 2025-01-18T08:26:59 | 2025-01-27T19:57:02 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/8480",
"html_url": "https://github.com/ollama/ollama/pull/8480",
"diff_url": "https://github.com/ollama/ollama/pull/8480.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8480.patch",
"merged_at": null
} | Resolves #8400 | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8480/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8480/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/448 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/448/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/448/comments | https://api.github.com/repos/ollama/ollama/issues/448/events | https://github.com/ollama/ollama/pull/448 | 1,875,777,717 | PR_kwDOJ0Z1Ps5ZQHe5 | 448 | fix spelling errors in example prompts | {
"login": "callmephilip",
"id": 492025,
"node_id": "MDQ6VXNlcjQ5MjAyNQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/492025?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/callmephilip",
"html_url": "https://github.com/callmephilip",
"followers_url": "https://api.github.com/u... | [] | closed | false | null | [] | null | 0 | 2023-08-31T15:35:07 | 2023-08-31T15:57:07 | 2023-08-31T15:57:07 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/448",
"html_url": "https://github.com/ollama/ollama/pull/448",
"diff_url": "https://github.com/ollama/ollama/pull/448.diff",
"patch_url": "https://github.com/ollama/ollama/pull/448.patch",
"merged_at": "2023-08-31T15:57:07"
} | null | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/448/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/448/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/4625 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4625/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4625/comments | https://api.github.com/repos/ollama/ollama/issues/4625/events | https://github.com/ollama/ollama/pull/4625 | 2,316,580,082 | PR_kwDOJ0Z1Ps5wh3KU | 4,625 | server/download.go: Fix downloading with too much EOF error | {
"login": "coolljt0725",
"id": 8232360,
"node_id": "MDQ6VXNlcjgyMzIzNjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8232360?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coolljt0725",
"html_url": "https://github.com/coolljt0725",
"followers_url": "https://api.github.com/us... | [] | open | false | null | [] | null | 6 | 2024-05-25T02:07:41 | 2024-12-14T06:30:42 | null | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/4625",
"html_url": "https://github.com/ollama/ollama/pull/4625",
"diff_url": "https://github.com/ollama/ollama/pull/4625.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4625.patch",
"merged_at": null
} | PR #4436 uses `io.CopyN` instead of `io.Copy`; `CopyN` will return `io.EOF` if src stops early, please refer to:
https://cs.opensource.google/go/go/+/refs/tags/go1.22.3:src/io/io.go;l=370
```
func CopyN(dst Writer, src Reader, n int64) (written int64, err error) {
written, err = Copy(dst, LimitReader(src, n))
i... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4625/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4625/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/4624 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4624/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4624/comments | https://api.github.com/repos/ollama/ollama/issues/4624/events | https://github.com/ollama/ollama/pull/4624 | 2,316,380,882 | PR_kwDOJ0Z1Ps5whJVh | 4,624 | fix q5_0, q5_1 | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | [] | closed | false | null | [] | null | 0 | 2024-05-24T23:01:58 | 2024-05-24T23:11:23 | 2024-05-24T23:11:22 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/4624",
"html_url": "https://github.com/ollama/ollama/pull/4624",
"diff_url": "https://github.com/ollama/ollama/pull/4624.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4624.patch",
"merged_at": "2024-05-24T23:11:22"
} | null | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4624/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4624/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/3184 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3184/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3184/comments | https://api.github.com/repos/ollama/ollama/issues/3184/events | https://github.com/ollama/ollama/issues/3184 | 2,190,190,897 | I_kwDOJ0Z1Ps6Ci6kx | 3,184 | Add Video-LLaVA | {
"login": "Anas20001",
"id": 64137962,
"node_id": "MDQ6VXNlcjY0MTM3OTYy",
"avatar_url": "https://avatars.githubusercontent.com/u/64137962?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Anas20001",
"html_url": "https://github.com/Anas20001",
"followers_url": "https://api.github.com/users/... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | open | false | null | [] | null | 27 | 2024-03-16T19:08:13 | 2025-01-30T02:08:14 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What model would you like?
Add Video-LLaVA so that it can then be used easily
https://github.com/PKU-YuanGroup/Video-LLaVA/tree/main | {
"login": "Anas20001",
"id": 64137962,
"node_id": "MDQ6VXNlcjY0MTM3OTYy",
"avatar_url": "https://avatars.githubusercontent.com/u/64137962?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Anas20001",
"html_url": "https://github.com/Anas20001",
"followers_url": "https://api.github.com/users/... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3184/reactions",
"total_count": 6,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 6
} | https://api.github.com/repos/ollama/ollama/issues/3184/timeline | null | reopened | false |
https://api.github.com/repos/ollama/ollama/issues/8580 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8580/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8580/comments | https://api.github.com/repos/ollama/ollama/issues/8580/events | https://github.com/ollama/ollama/issues/8580 | 2,810,987,233 | I_kwDOJ0Z1Ps6njELh | 8,580 | FHS Violation | {
"login": "rgammans",
"id": 512223,
"node_id": "MDQ6VXNlcjUxMjIyMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/512223?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rgammans",
"html_url": "https://github.com/rgammans",
"followers_url": "https://api.github.com/users/rgamman... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | open | false | null | [] | null | 2 | 2025-01-25T13:36:15 | 2025-01-29T14:53:18 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
The Linux FHS has this to say about /usr/
```
/usr is shareable, read-only data. That means that /usr should
be shareable between various FHS-compliant hosts and must not be written to.
Any information that is host-specific or varies with time is stored elsewhere.
```
However, Ollama puts the s... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8580/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8580/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/6437 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6437/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6437/comments | https://api.github.com/repos/ollama/ollama/issues/6437/events | https://github.com/ollama/ollama/issues/6437 | 2,474,886,732 | I_kwDOJ0Z1Ps6Tg8ZM | 6,437 | how to use batch when using llm | {
"login": "PassStory",
"id": 6964842,
"node_id": "MDQ6VXNlcjY5NjQ4NDI=",
"avatar_url": "https://avatars.githubusercontent.com/u/6964842?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PassStory",
"html_url": "https://github.com/PassStory",
"followers_url": "https://api.github.com/users/Pa... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | open | false | null | [] | null | 1 | 2024-08-20T07:10:38 | 2024-08-20T17:01:26 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I noticed the API does not support processing batched prompts. GPU utilization is low, and I want to use batch mode to improve GPU utilization and accelerate the inference process, so how can I do that? | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6437/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6437/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/5426 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5426/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5426/comments | https://api.github.com/repos/ollama/ollama/issues/5426/events | https://github.com/ollama/ollama/pull/5426 | 2,385,157,180 | PR_kwDOJ0Z1Ps50JA_Q | 5,426 | Enable AMD iGPU 780M in Linux, Create amd-igpu-780m.md | {
"login": "alexhegit",
"id": 31022192,
"node_id": "MDQ6VXNlcjMxMDIyMTky",
"avatar_url": "https://avatars.githubusercontent.com/u/31022192?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alexhegit",
"html_url": "https://github.com/alexhegit",
"followers_url": "https://api.github.com/users/... | [] | open | false | null | [] | null | 17 | 2024-07-02T04:04:52 | 2025-01-26T15:31:04 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/5426",
"html_url": "https://github.com/ollama/ollama/pull/5426",
"diff_url": "https://github.com/ollama/ollama/pull/5426.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5426.patch",
"merged_at": null
} | Add tutorial to run Ollama with AMD iGPU 780M (of Ryzen 7000s/8000s CPU) in Linux. | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5426/reactions",
"total_count": 34,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 2,
"confused": 0,
"heart": 16,
"rocket": 10,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5426/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/7872 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7872/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7872/comments | https://api.github.com/repos/ollama/ollama/issues/7872/events | https://github.com/ollama/ollama/pull/7872 | 2,702,182,986 | PR_kwDOJ0Z1Ps6DeWMU | 7,872 | Brucemacd/check key register | {
"login": "Kustom665",
"id": 179161305,
"node_id": "U_kgDOCq3I2Q",
"avatar_url": "https://avatars.githubusercontent.com/u/179161305?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kustom665",
"html_url": "https://github.com/Kustom665",
"followers_url": "https://api.github.com/users/Kustom... | [] | closed | false | null | [] | null | 0 | 2024-11-28T13:33:19 | 2024-11-28T13:33:49 | 2024-11-28T13:33:35 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/7872",
"html_url": "https://github.com/ollama/ollama/pull/7872",
"diff_url": "https://github.com/ollama/ollama/pull/7872.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7872.patch",
"merged_at": null
} | null | {
"login": "Kustom665",
"id": 179161305,
"node_id": "U_kgDOCq3I2Q",
"avatar_url": "https://avatars.githubusercontent.com/u/179161305?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kustom665",
"html_url": "https://github.com/Kustom665",
"followers_url": "https://api.github.com/users/Kustom... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7872/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7872/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/2701 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2701/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2701/comments | https://api.github.com/repos/ollama/ollama/issues/2701/events | https://github.com/ollama/ollama/issues/2701 | 2,150,590,614 | I_kwDOJ0Z1Ps6AL2iW | 2,701 | ollama.service cannot create folder defined by OLLAMA_MODELS or do not run when the folder is created manually | {
"login": "Crystal4276",
"id": 27446196,
"node_id": "MDQ6VXNlcjI3NDQ2MTk2",
"avatar_url": "https://avatars.githubusercontent.com/u/27446196?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Crystal4276",
"html_url": "https://github.com/Crystal4276",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | null | 10 | 2024-02-23T08:15:01 | 2024-11-22T18:08:13 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Hello
I'm facing an issue locating the models in my home folder, since my root partition is limited in size.
I followed the FAQ and information collected here and there to set up OLLAMA_MODELS in ollama.service.
When starting the service, the journal reports that the server could not create the folder in my home dir... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2701/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2701/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/3515 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3515/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3515/comments | https://api.github.com/repos/ollama/ollama/issues/3515/events | https://github.com/ollama/ollama/pull/3515 | 2,229,211,010 | PR_kwDOJ0Z1Ps5r6DXw | 3,515 | Docs: Remove wrong parameter for Chat Completion | {
"login": "ThomasVitale",
"id": 8523418,
"node_id": "MDQ6VXNlcjg1MjM0MTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8523418?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ThomasVitale",
"html_url": "https://github.com/ThomasVitale",
"followers_url": "https://api.github.com... | [] | closed | false | null | [] | null | 0 | 2024-04-06T11:56:35 | 2024-04-06T16:08:35 | 2024-04-06T16:08:35 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/3515",
"html_url": "https://github.com/ollama/ollama/pull/3515",
"diff_url": "https://github.com/ollama/ollama/pull/3515.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3515.patch",
"merged_at": "2024-04-06T16:08:35"
} | Fixes gh-3514 | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3515/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3515/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/3053 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3053/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3053/comments | https://api.github.com/repos/ollama/ollama/issues/3053/events | https://github.com/ollama/ollama/issues/3053 | 2,179,264,800 | I_kwDOJ0Z1Ps6B5PEg | 3,053 | something broke /embeddings in last update ( 0.1.28 and .29) docker | {
"login": "Hansson0728",
"id": 9604420,
"node_id": "MDQ6VXNlcjk2MDQ0MjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/9604420?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hansson0728",
"html_url": "https://github.com/Hansson0728",
"followers_url": "https://api.github.com/us... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 10 | 2024-03-11T14:22:37 | 2024-07-17T15:57:56 | 2024-06-04T06:46:49 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I don't even get a response when I curl /embeddings.
curl -X POST http://localhost:11434/api/embeddings -d '{"model":"nomic-embed-text", "prompt": "hello"}'
Nothing in the logs, no answer, no 404, nothing. I'm pretty sure it worked before 0.1.28.. | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3053/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3053/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/5676 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5676/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5676/comments | https://api.github.com/repos/ollama/ollama/issues/5676/events | https://github.com/ollama/ollama/pull/5676 | 2,407,039,142 | PR_kwDOJ0Z1Ps51THkM | 5,676 | server: fix `context`, `load_duration` and `total_duration` fields | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | [] | closed | false | null | [] | null | 0 | 2024-07-13T16:14:57 | 2024-07-13T16:25:33 | 2024-07-13T16:25:31 | MEMBER | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/5676",
"html_url": "https://github.com/ollama/ollama/pull/5676",
"diff_url": "https://github.com/ollama/ollama/pull/5676.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5676.patch",
"merged_at": "2024-07-13T16:25:31"
} | Fixes https://github.com/ollama/ollama/issues/5671 | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5676/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5676/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/4385 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4385/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4385/comments | https://api.github.com/repos/ollama/ollama/issues/4385/events | https://github.com/ollama/ollama/issues/4385 | 2,291,543,742 | I_kwDOJ0Z1Ps6Ili6- | 4,385 | Unable to access ollama from obsidian plugins app://obsidian.md | {
"login": "airtonix",
"id": 61225,
"node_id": "MDQ6VXNlcjYxMjI1",
"avatar_url": "https://avatars.githubusercontent.com/u/61225?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/airtonix",
"html_url": "https://github.com/airtonix",
"followers_url": "https://api.github.com/users/airtonix/foll... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 3 | 2024-05-12T23:16:48 | 2024-05-13T12:18:16 | 2024-05-13T03:29:31 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
```
index.html:1 Access to fetch at 'http://127.0.0.1:11434/api/chat' from origin 'app://obsidian.md' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque re... | {
"login": "airtonix",
"id": 61225,
"node_id": "MDQ6VXNlcjYxMjI1",
"avatar_url": "https://avatars.githubusercontent.com/u/61225?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/airtonix",
"html_url": "https://github.com/airtonix",
"followers_url": "https://api.github.com/users/airtonix/foll... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4385/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4385/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/1702 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1702/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1702/comments | https://api.github.com/repos/ollama/ollama/issues/1702/events | https://github.com/ollama/ollama/pull/1702 | 2,055,344,194 | PR_kwDOJ0Z1Ps5iuWqy | 1,702 | added uninstall script | {
"login": "vtrenton",
"id": 85969349,
"node_id": "MDQ6VXNlcjg1OTY5MzQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/85969349?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vtrenton",
"html_url": "https://github.com/vtrenton",
"followers_url": "https://api.github.com/users/vtr... | [] | closed | false | null | [] | null | 4 | 2023-12-25T03:49:01 | 2024-11-21T05:52:19 | 2024-11-21T05:52:19 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/1702",
"html_url": "https://github.com/ollama/ollama/pull/1702",
"diff_url": "https://github.com/ollama/ollama/pull/1702.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1702.patch",
"merged_at": null
} | A script for uninstalling ollama on Linux.
Fixes #1701 | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmor... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1702/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1702/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/3284 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3284/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3284/comments | https://api.github.com/repos/ollama/ollama/issues/3284/events | https://github.com/ollama/ollama/pull/3284 | 2,199,988,592 | PR_kwDOJ0Z1Ps5qWpnU | 3,284 | Add MarshalJSON to Duration | {
"login": "jackielii",
"id": 360983,
"node_id": "MDQ6VXNlcjM2MDk4Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/360983?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jackielii",
"html_url": "https://github.com/jackielii",
"followers_url": "https://api.github.com/users/jack... | [] | closed | false | null | [] | null | 1 | 2024-03-21T11:57:51 | 2024-05-06T22:59:18 | 2024-05-06T22:59:18 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/3284",
"html_url": "https://github.com/ollama/ollama/pull/3284",
"diff_url": "https://github.com/ollama/ollama/pull/3284.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3284.patch",
"merged_at": "2024-05-06T22:59:18"
} | fix #3283 | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/follower... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3284/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3284/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/1359 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1359/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1359/comments | https://api.github.com/repos/ollama/ollama/issues/1359/events | https://github.com/ollama/ollama/issues/1359 | 2,022,344,626 | I_kwDOJ0Z1Ps54ioey | 1,359 | 4 GPUs, each with 12.2MiB. The utility loads more into rank 0, but it only gets up to about 4 plus GiB never close to 12.2GiB | {
"login": "phalexo",
"id": 4603365,
"node_id": "MDQ6VXNlcjQ2MDMzNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4603365?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phalexo",
"html_url": "https://github.com/phalexo",
"followers_url": "https://api.github.com/users/phalexo/... | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.gi... | null | 4 | 2023-12-03T04:02:00 | 2024-02-01T23:14:08 | 2024-02-01T23:14:07 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | cuBLAS error 15 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/gguf/ggml-cuda.cu:7586
current device: 0
⠸ 2023/12/02 22:53:21 llama.go:436: exit status 1
2023/12/02 22:53:21 llama.go:510: llama runner stopped successfully
[GIN] 2023/12/02 - 22:53:21 | 200 | 1.311500885s | 127.0.0.1 | POST "/api/gen... | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhilt... | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1359/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1359/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/3492 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3492/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3492/comments | https://api.github.com/repos/ollama/ollama/issues/3492/events | https://github.com/ollama/ollama/issues/3492 | 2,225,874,326 | I_kwDOJ0Z1Ps6ErCWW | 3,492 | Add enhancement to allow RAG functionnality | {
"login": "g02200jeff",
"id": 159446878,
"node_id": "U_kgDOCYD3Xg",
"avatar_url": "https://avatars.githubusercontent.com/u/159446878?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/g02200jeff",
"html_url": "https://github.com/g02200jeff",
"followers_url": "https://api.github.com/users/g02... | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | open | false | null | [] | null | 6 | 2024-04-04T15:46:02 | 2024-11-06T17:45:00 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What are you trying to do?
I want use a custom script on ollama server (windows) to execute Retrieval Augmented Generation (RAG) process. How can I do ?
(I have an example which is working with a python script, langchain and ollama but I can't do it behing the ollama server using api restful).
### How should we ... | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3492/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3492/timeline | null | null | false |