url stringlengths 51-54 | repository_url stringclasses 1 value | labels_url stringlengths 65-68 | comments_url stringlengths 60-63 | events_url stringlengths 58-61 | html_url stringlengths 39-44 | id int64 1.78B-2.82B | node_id stringlengths 18-19 | number int64 1-8.69k | title stringlengths 1-382 | user dict | labels listlengths 0-5 | state stringclasses 2 values | locked bool 1 class | assignee dict | assignees listlengths 0-2 | milestone null | comments int64 0-323 | created_at timestamp[s] | updated_at timestamp[s] | closed_at timestamp[s] | author_association stringclasses 4 values | sub_issues_summary dict | active_lock_reason null | draft bool 2 classes | pull_request dict | body stringlengths 2-118k ⌀ | closed_by dict | reactions dict | timeline_url stringlengths 60-63 | performed_via_github_app null | state_reason stringclasses 4 values | is_pull_request bool 2 classes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/2979 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2979/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2979/comments | https://api.github.com/repos/ollama/ollama/issues/2979/events | https://github.com/ollama/ollama/issues/2979 | 2,173,683,721 | I_kwDOJ0Z1Ps6Bj8gJ | 2,979 | Starcoder2 crashing ollama docker version 0.1.28 | {
"login": "tilllt",
"id": 1854364,
"node_id": "MDQ6VXNlcjE4NTQzNjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1854364?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tilllt",
"html_url": "https://github.com/tilllt",
"followers_url": "https://api.github.com/users/tilllt/followers",
"following_url": "https://api.github.com/users/tilllt/following{/other_user}",
"gists_url": "https://api.github.com/users/tilllt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tilllt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tilllt/subscriptions",
"organizations_url": "https://api.github.com/users/tilllt/orgs",
"repos_url": "https://api.github.com/users/tilllt/repos",
"events_url": "https://api.github.com/users/tilllt/events{/privacy}",
"received_events_url": "https://api.github.com/users/tilllt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 2 | 2024-03-07T11:51:04 | 2024-03-07T12:49:55 | 2024-03-07T12:49:54 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I noticed that the ollama version shipped as a Docker container has been updated to 0.1.28 and thus should run the starcoder2 and gemma models. I am still not having any luck running them; ollama just crashes. Am I missing something?
https://pastebin.com/ALJRfZZ5 | {
"login": "tilllt",
"id": 1854364,
"node_id": "MDQ6VXNlcjE4NTQzNjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1854364?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tilllt",
"html_url": "https://github.com/tilllt",
"followers_url": "https://api.github.com/users/tilllt/followers",
"following_url": "https://api.github.com/users/tilllt/following{/other_user}",
"gists_url": "https://api.github.com/users/tilllt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tilllt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tilllt/subscriptions",
"organizations_url": "https://api.github.com/users/tilllt/orgs",
"repos_url": "https://api.github.com/users/tilllt/repos",
"events_url": "https://api.github.com/users/tilllt/events{/privacy}",
"received_events_url": "https://api.github.com/users/tilllt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2979/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2979/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8660 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8660/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8660/comments | https://api.github.com/repos/ollama/ollama/issues/8660/events | https://github.com/ollama/ollama/issues/8660 | 2,818,256,756 | I_kwDOJ0Z1Ps6n-y90 | 8,660 | GPU Memory Not Released After Exiting deepseek-r1:32b Model | {
"login": "Sebjac06",
"id": 172889704,
"node_id": "U_kgDOCk4WaA",
"avatar_url": "https://avatars.githubusercontent.com/u/172889704?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sebjac06",
"html_url": "https://github.com/Sebjac06",
"followers_url": "https://api.github.com/users/Sebjac06/followers",
"following_url": "https://api.github.com/users/Sebjac06/following{/other_user}",
"gists_url": "https://api.github.com/users/Sebjac06/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Sebjac06/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Sebjac06/subscriptions",
"organizations_url": "https://api.github.com/users/Sebjac06/orgs",
"repos_url": "https://api.github.com/users/Sebjac06/repos",
"events_url": "https://api.github.com/users/Sebjac06/events{/privacy}",
"received_events_url": "https://api.github.com/users/Sebjac06/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 2 | 2025-01-29T13:41:07 | 2025-01-29T13:51:19 | 2025-01-29T13:51:19 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
- Ollama Version: 0.5.7
- Model: deepseek-r1:32b
- GPU: NVIDIA RTX 3090 (24GB VRAM)
- OS: Windows 11 (include build version if known)
After running the `deepseek-r1:32b` model via `ollama run deepseek-r1:32b` and exiting with `/bye` in my terminal, the GPU's dedicated memory remains fully allocated at 24GB despite 0% GPU usage. This persists until I close the ollama application fully, and it occurs again the next time I use the model.
Is this a bug? It seems strange that the dedicated memory remains at the maximum even after closing the terminal.
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.5.7 | {
"login": "Sebjac06",
"id": 172889704,
"node_id": "U_kgDOCk4WaA",
"avatar_url": "https://avatars.githubusercontent.com/u/172889704?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sebjac06",
"html_url": "https://github.com/Sebjac06",
"followers_url": "https://api.github.com/users/Sebjac06/followers",
"following_url": "https://api.github.com/users/Sebjac06/following{/other_user}",
"gists_url": "https://api.github.com/users/Sebjac06/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Sebjac06/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Sebjac06/subscriptions",
"organizations_url": "https://api.github.com/users/Sebjac06/orgs",
"repos_url": "https://api.github.com/users/Sebjac06/repos",
"events_url": "https://api.github.com/users/Sebjac06/events{/privacy}",
"received_events_url": "https://api.github.com/users/Sebjac06/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8660/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8660/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/3362 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3362/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3362/comments | https://api.github.com/repos/ollama/ollama/issues/3362/events | https://github.com/ollama/ollama/issues/3362 | 2,208,545,720 | I_kwDOJ0Z1Ps6Do7u4 | 3,362 | Report better error on windows on port conflict with winnat | {
"login": "Canman1963",
"id": 133131797,
"node_id": "U_kgDOB-9uFQ",
"avatar_url": "https://avatars.githubusercontent.com/u/133131797?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Canman1963",
"html_url": "https://github.com/Canman1963",
"followers_url": "https://api.github.com/users/Canman1963/followers",
"following_url": "https://api.github.com/users/Canman1963/following{/other_user}",
"gists_url": "https://api.github.com/users/Canman1963/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Canman1963/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Canman1963/subscriptions",
"organizations_url": "https://api.github.com/users/Canman1963/orgs",
"repos_url": "https://api.github.com/users/Canman1963/repos",
"events_url": "https://api.github.com/users/Canman1963/events{/privacy}",
"received_events_url": "https://api.github.com/users/Canman1963/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] | open | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4 | 2024-03-26T15:15:16 | 2024-04-28T19:01:09 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Hi DevOps
My Ollama was working fine until I tried to use it today; I'm not sure what has happened. The logs show this repeated crash-and-respawn loop in app.log:
time=2024-03-25T12:09:31.329-05:00 level=INFO source=logging.go:45 msg="ollama app started"
time=2024-03-25T12:09:31.389-05:00 level=INFO source=server.go:135 msg="unable to connect to server"
time=2024-03-25T12:09:34.633-05:00 level=INFO source=server.go:91 msg="started ollama server with pid 33376"
time=2024-03-25T12:09:34.633-05:00 level=INFO source=server.go:93 msg="ollama server logs C:\\Users\\David\\AppData\\Local\\Ollama\\server.log"
time=2024-03-25T12:09:35.525-05:00 level=WARN source=server.go:113 msg="server crash 1 - exit code 1 - respawning"
time=2024-03-25T12:09:36.037-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:09:37.038-05:00 level=WARN source=server.go:113 msg="server crash 2 - exit code 1 - respawning"
time=2024-03-25T12:09:37.550-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:09:39.557-05:00 level=WARN source=server.go:113 msg="server crash 3 - exit code 1 - respawning"
time=2024-03-25T12:09:40.071-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:09:43.072-05:00 level=WARN source=server.go:113 msg="server crash 4 - exit code 1 - respawning"
time=2024-03-25T12:09:43.572-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:09:47.587-05:00 level=WARN source=server.go:113 msg="server crash 5 - exit code 1 - respawning"
time=2024-03-25T12:09:48.087-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:09:53.087-05:00 level=WARN source=server.go:113 msg="server crash 6 - exit code 1 - respawning"
time=2024-03-25T12:09:53.599-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:09:59.613-05:00 level=WARN source=server.go:113 msg="server crash 7 - exit code 1 - respawning"
time=2024-03-25T12:10:00.116-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:10:07.123-05:00 level=WARN source=server.go:113 msg="server crash 8 - exit code 1 - respawning"
time=2024-03-25T12:10:07.636-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:10:15.651-05:00 level=WARN source=server.go:113 msg="server crash 9 - exit code 1 - respawning"
time=2024-03-25T12:10:16.166-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:10:25.171-05:00 level=WARN source=server.go:113 msg="server crash 10 - exit code 1 - respawning"
time=2024-03-25T12:10:25.683-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:10:35.698-05:00 level=WARN source=server.go:113 msg="server crash 11 - exit code 1 - respawning"
time=2024-03-25T12:10:36.207-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:10:47.222-05:00 level=WARN source=server.go:113 msg="server crash 12 - exit code 1 - respawning"
time=2024-03-25T12:10:47.734-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:10:59.745-05:00 level=WARN source=server.go:113 msg="server crash 13 - exit code 1 - respawning"
time=2024-03-25T12:11:00.252-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:11:13.267-05:00 level=WARN source=server.go:113 msg="server crash 14 - exit code 1 - respawning"
time=2024-03-25T12:11:13.782-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:11:27.784-05:00 level=WARN source=server.go:113 msg="server crash 15 - exit code 1 - respawning"
time=2024-03-25T12:11:28.293-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:11:43.295-05:00 level=WARN source=server.go:113 msg="server crash 16 - exit code 1 - respawning"
time=2024-03-25T12:11:43.810-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:11:59.814-05:00 level=WARN source=server.go:113 msg="server crash 17 - exit code 1 - respawning"
time=2024-03-25T12:12:00.328-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:12:17.333-05:00 level=WARN source=server.go:113 msg="server crash 18 - exit code 1 - respawning"
time=2024-03-25T12:12:17.845-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:12:35.856-05:00 level=WARN source=server.go:113 msg="server crash 19 - exit code 1 - respawning"
time=2024-03-25T12:12:36.370-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:12:55.374-05:00 level=WARN source=server.go:113 msg="server crash 20 - exit code 1 - respawning"
time=2024-03-25T12:12:55.888-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:13:15.889-05:00 level=WARN source=server.go:113 msg="server crash 21 - exit code 1 - respawning"
time=2024-03-25T12:13:16.403-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:13:37.410-05:00 level=WARN source=server.go:113 msg="server crash 22 - exit code 1 - respawning"
time=2024-03-25T12:13:37.924-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:13:59.930-05:00 level=WARN source=server.go:113 msg="server crash 23 - exit code 1 - respawning"
time=2024-03-25T12:14:00.436-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:14:23.441-05:00 level=WARN source=server.go:113 msg="server crash 24 - exit code 1 - respawning"
time=2024-03-25T12:14:23.954-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:14:47.968-05:00 level=WARN source=server.go:113 msg="server crash 25 - exit code 1 - respawning"
time=2024-03-25T12:14:48.482-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:15:13.490-05:00 level=WARN source=server.go:113 msg="server crash 26 - exit code 1 - respawning"
time=2024-03-25T12:15:14.003-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:15:40.015-05:00 level=WARN source=server.go:113 msg="server crash 27 - exit code 1 - respawning"
time=2024-03-25T12:15:40.529-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:16:07.542-05:00 level=WARN source=server.go:113 msg="server crash 28 - exit code 1 - respawning"
time=2024-03-25T12:16:08.052-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:16:36.055-05:00 level=WARN source=server.go:113 msg="server crash 29 - exit code 1 - respawning"
time=2024-03-25T12:16:36.569-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:17:05.578-05:00 level=WARN source=server.go:113 msg="server crash 30 - exit code 1 - respawning"
time=2024-03-25T12:17:06.089-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:17:36.095-05:00 level=WARN source=server.go:113 msg="server crash 31 - exit code 1 - respawning"
time=2024-03-25T12:17:36.595-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:18:07.599-05:00 level=WARN source=server.go:113 msg="server crash 32 - exit code 1 - respawning"
time=2024-03-25T12:18:08.108-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:18:40.115-05:00 level=WARN source=server.go:113 msg="server crash 33 - exit code 1 - respawning"
time=2024-03-25T12:18:40.628-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:19:13.638-05:00 level=WARN source=server.go:113 msg="server crash 34 - exit code 1 - respawning"
time=2024-03-25T12:19:14.152-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:19:48.153-05:00 level=WARN source=server.go:113 msg="server crash 35 - exit code 1 - respawning"
time=2024-03-25T12:19:48.664-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:20:23.680-05:00 level=WARN source=server.go:113 msg="server crash 36 - exit code 1 - respawning"
time=2024-03-25T12:20:24.181-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:21:00.187-05:00 level=WARN source=server.go:113 msg="server crash 37 - exit code 1 - respawning"
time=2024-03-25T12:21:00.696-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:21:37.709-05:00 level=WARN source=server.go:113 msg="server crash 38 - exit code 1 - respawning"
time=2024-03-25T12:21:38.210-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:22:16.225-05:00 level=WARN source=server.go:113 msg="server crash 39 - exit code 1 - respawning"
time=2024-03-25T12:22:16.728-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:22:55.736-05:00 level=WARN source=server.go:113 msg="server crash 40 - exit code 1 - respawning"
time=2024-03-25T12:22:56.251-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:23:36.259-05:00 level=WARN source=server.go:113 msg="server crash 41 - exit code 1 - respawning"
time=2024-03-25T12:23:36.772-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:24:17.786-05:00 level=WARN source=server.go:113 msg="server crash 42 - exit code 1 - respawning"
time=2024-03-25T12:24:18.297-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:25:00.300-05:00 level=WARN source=server.go:113 msg="server crash 43 - exit code 1 - respawning"
time=2024-03-25T12:25:00.801-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:25:43.807-05:00 level=WARN source=server.go:113 msg="server crash 44 - exit code 1 - respawning"
time=2024-03-25T12:25:44.319-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:26:28.319-05:00 level=WARN source=server.go:113 msg="server crash 45 - exit code 1 - respawning"
time=2024-03-25T12:26:28.834-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:27:13.836-05:00 level=WARN source=server.go:113 msg="server crash 46 - exit code 1 - respawning"
time=2024-03-25T12:27:14.349-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:28:00.358-05:00 level=WARN source=server.go:113 msg="server crash 47 - exit code 1 - respawning"
time=2024-03-25T12:28:00.858-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:28:47.859-05:00 level=WARN source=server.go:113 msg="server crash 48 - exit code 1 - respawning"
time=2024-03-25T12:28:48.370-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:29:36.379-05:00 level=WARN source=server.go:113 msg="server crash 49 - exit code 1 - respawning"
time=2024-03-25T12:29:36.890-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:30:25.903-05:00 level=WARN source=server.go:113 msg="server crash 50 - exit code 1 - respawning"
time=2024-03-25T12:30:26.414-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:31:16.423-05:00 level=WARN source=server.go:113 msg="server crash 51 - exit code 1 - respawning"
time=2024-03-25T12:31:16.928-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:32:07.931-05:00 level=WARN source=server.go:113 msg="server crash 52 - exit code 1 - respawning"
time=2024-03-25T12:32:08.440-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:33:00.453-05:00 level=WARN source=server.go:113 msg="server crash 53 - exit code 1 - respawning"
time=2024-03-25T12:33:00.967-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:33:53.982-05:00 level=WARN source=server.go:113 msg="server crash 54 - exit code 1 - respawning"
time=2024-03-25T12:33:54.492-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:34:48.500-05:00 level=WARN source=server.go:113 msg="server crash 55 - exit code 1 - respawning"
time=2024-03-25T12:34:49.001-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:35:44.010-05:00 level=WARN source=server.go:113 msg="server crash 56 - exit code 1 - respawning"
time=2024-03-25T12:35:44.519-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:36:40.524-05:00 level=WARN source=server.go:113 msg="server crash 57 - exit code 1 - respawning"
time=2024-03-25T12:36:41.036-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:37:38.048-05:00 level=WARN source=server.go:113 msg="server crash 58 - exit code 1 - respawning"
time=2024-03-25T12:37:38.556-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:38:36.560-05:00 level=WARN source=server.go:113 msg="server crash 59 - exit code 1 - respawning"
time=2024-03-25T12:38:37.076-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:39:36.085-05:00 level=WARN source=server.go:113 msg="server crash 60 - exit code 1 - respawning"
time=2024-03-25T12:39:36.597-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:40:36.611-05:00 level=WARN source=server.go:113 msg="server crash 61 - exit code 1 - respawning"
time=2024-03-25T12:40:37.120-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:41:38.134-05:00 level=WARN source=server.go:113 msg="server crash 62 - exit code 1 - respawning"
time=2024-03-25T12:41:38.642-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:42:40.653-05:00 level=WARN source=server.go:113 msg="server crash 63 - exit code 1 - respawning"
time=2024-03-25T12:42:41.167-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:43:44.169-05:00 level=WARN source=server.go:113 msg="server crash 64 - exit code 1 - respawning"
time=2024-03-25T12:43:44.681-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:44:48.683-05:00 level=WARN source=server.go:113 msg="server crash 65 - exit code 1 - respawning"
time=2024-03-25T12:44:49.190-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:45:54.201-05:00 level=WARN source=server.go:113 msg="server crash 66 - exit code 1 - respawning"
time=2024-03-25T12:45:54.714-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:47:00.715-05:00 level=WARN source=server.go:113 msg="server crash 67 - exit code 1 - respawning"
time=2024-03-25T12:47:01.226-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:48:08.233-05:00 level=WARN source=server.go:113 msg="server crash 68 - exit code 1 - respawning"
time=2024-03-25T12:48:08.747-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:49:16.758-05:00 level=WARN source=server.go:113 msg="server crash 69 - exit code 1 - respawning"
time=2024-03-25T12:49:17.269-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:50:26.284-05:00 level=WARN source=server.go:113 msg="server crash 70 - exit code 1 - respawning"
time=2024-03-25T12:50:26.796-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:51:36.803-05:00 level=WARN source=server.go:113 msg="server crash 71 - exit code 1 - respawning"
time=2024-03-25T12:51:37.313-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:52:48.320-05:00 level=WARN source=server.go:113 msg="server crash 72 - exit code 1 - respawning"
time=2024-03-25T12:52:48.830-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:54:00.841-05:00 level=WARN source=server.go:113 msg="server crash 73 - exit code 1 - respawning"
time=2024-03-25T12:54:01.353-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:55:14.366-05:00 level=WARN source=server.go:113 msg="server crash 74 - exit code 1 - respawning"
time=2024-03-25T12:55:14.880-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:56:28.893-05:00 level=WARN source=server.go:113 msg="server crash 75 - exit code 1 - respawning"
time=2024-03-25T12:56:29.394-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:57:44.407-05:00 level=WARN source=server.go:113 msg="server crash 76 - exit code 1 - respawning"
time=2024-03-25T12:57:44.917-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T12:59:00.925-05:00 level=WARN source=server.go:113 msg="server crash 77 - exit code 1 - respawning"
time=2024-03-25T12:59:01.436-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T13:00:18.439-05:00 level=WARN source=server.go:113 msg="server crash 78 - exit code 1 - respawning"
time=2024-03-25T13:00:18.951-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T13:01:36.954-05:00 level=WARN source=server.go:113 msg="server crash 79 - exit code 1 - respawning"
time=2024-03-25T13:01:37.466-05:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-03-25T13:02:24.837-05:00 level=INFO source=logging.go:45 msg="ollama app started"
time=2024-03-25T13:02:24.898-05:00 level=INFO source=server.go:135 msg="unable to connect to server"
The server log shows:
Error: listen tcp 127.0.0.1:11434: bind: An attempt was made to access a socket in a way forbidden by its access permissions.
| null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3362/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3362/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/5953 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5953/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5953/comments | https://api.github.com/repos/ollama/ollama/issues/5953/events | https://github.com/ollama/ollama/issues/5953 | 2,430,295,149 | I_kwDOJ0Z1Ps6Q21xt | 5,953 | Who are you? | {
"login": "t7aliang",
"id": 11693120,
"node_id": "MDQ6VXNlcjExNjkzMTIw",
"avatar_url": "https://avatars.githubusercontent.com/u/11693120?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/t7aliang",
"html_url": "https://github.com/t7aliang",
"followers_url": "https://api.github.com/users/t7aliang/followers",
"following_url": "https://api.github.com/users/t7aliang/following{/other_user}",
"gists_url": "https://api.github.com/users/t7aliang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/t7aliang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/t7aliang/subscriptions",
"organizations_url": "https://api.github.com/users/t7aliang/orgs",
"repos_url": "https://api.github.com/users/t7aliang/repos",
"events_url": "https://api.github.com/users/t7aliang/events{/privacy}",
"received_events_url": "https://api.github.com/users/t7aliang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 1 | 2024-07-25T15:23:15 | 2024-07-26T14:01:03 | 2024-07-26T14:01:03 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
![Screenshot 2024-07-25 at 11 18 10 PM](https://github.com/user-attachments/assets/5a06a6fd-3e8c-4db9-84f7-5f076acdd4be)
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.2.8 | {
"login": "t7aliang",
"id": 11693120,
"node_id": "MDQ6VXNlcjExNjkzMTIw",
"avatar_url": "https://avatars.githubusercontent.com/u/11693120?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/t7aliang",
"html_url": "https://github.com/t7aliang",
"followers_url": "https://api.github.com/users/t7aliang/followers",
"following_url": "https://api.github.com/users/t7aliang/following{/other_user}",
"gists_url": "https://api.github.com/users/t7aliang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/t7aliang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/t7aliang/subscriptions",
"organizations_url": "https://api.github.com/users/t7aliang/orgs",
"repos_url": "https://api.github.com/users/t7aliang/repos",
"events_url": "https://api.github.com/users/t7aliang/events{/privacy}",
"received_events_url": "https://api.github.com/users/t7aliang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5953/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/ollama/ollama/issues/5953/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/5602 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5602/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5602/comments | https://api.github.com/repos/ollama/ollama/issues/5602/events | https://github.com/ollama/ollama/issues/5602 | 2,400,911,034 | I_kwDOJ0Z1Ps6PGv66 | 5,602 | Running latest version 0.2.1 running slowly and not returning output for long text input | {
"login": "jillvillany",
"id": 42828003,
"node_id": "MDQ6VXNlcjQyODI4MDAz",
"avatar_url": "https://avatars.githubusercontent.com/u/42828003?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jillvillany",
"html_url": "https://github.com/jillvillany",
"followers_url": "https://api.github.com/users/jillvillany/followers",
"following_url": "https://api.github.com/users/jillvillany/following{/other_user}",
"gists_url": "https://api.github.com/users/jillvillany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jillvillany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jillvillany/subscriptions",
"organizations_url": "https://api.github.com/users/jillvillany/orgs",
"repos_url": "https://api.github.com/users/jillvillany/repos",
"events_url": "https://api.github.com/users/jillvillany/events{/privacy}",
"received_events_url": "https://api.github.com/users/jillvillany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5808482718,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng",
"url": "https://api.github.com/repos/ollama/ollama/labels/performance",
"name": "performance",
"color": "A5B5C6",
"default": false,
"description": ""
}
] | open | false | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3 | 2024-07-10T14:19:02 | 2024-10-16T16:18:22 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
I am running ollama on an AWS ml.p3.2xlarge SageMaker notebook instance.
When I install the latest version, 0.2.1, the response time on a langchain chain running an extract names prompt on a page of text using llama3:latest is about 8 seconds and doesn't return any names.
However, when I install version 0.1.37, the response time goes down to under a second and I get an accurate response with people's names found in the text.
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.2.1 | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5602/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5602/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/1876 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1876/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1876/comments | https://api.github.com/repos/ollama/ollama/issues/1876/events | https://github.com/ollama/ollama/issues/1876 | 2,073,129,789 | I_kwDOJ0Z1Ps57kXM9 | 1,876 | ollama list flags help | {
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6960960225,
"node_id": "LA_kwDOJ0Z1Ps8AAAABnufS4Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/cli",
"name": "cli",
"color": "5319e7",
"default": false,
"description": "Issues related to the Ollama CLI"
}
] | open | false | null | [] | null | 5 | 2024-01-09T20:35:48 | 2024-10-26T21:58:36 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | There is no obvious way of seeing what flags are available for ollama list
```
ollama list --help
List models
Usage:
ollama list [flags]
Aliases:
list, ls
Flags:
-h, --help help for list
```
| null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1876/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1876/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/2787 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/2787/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/2787/comments | https://api.github.com/repos/ollama/ollama/issues/2787/events | https://github.com/ollama/ollama/issues/2787 | 2,157,567,148 | I_kwDOJ0Z1Ps6Amdys | 2,787 | bug? - session save does not save latest messages of the chat | {
"login": "FotisK",
"id": 7896645,
"node_id": "MDQ6VXNlcjc4OTY2NDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/7896645?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FotisK",
"html_url": "https://github.com/FotisK",
"followers_url": "https://api.github.com/users/FotisK/followers",
"following_url": "https://api.github.com/users/FotisK/following{/other_user}",
"gists_url": "https://api.github.com/users/FotisK/gists{/gist_id}",
"starred_url": "https://api.github.com/users/FotisK/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FotisK/subscriptions",
"organizations_url": "https://api.github.com/users/FotisK/orgs",
"repos_url": "https://api.github.com/users/FotisK/repos",
"events_url": "https://api.github.com/users/FotisK/events{/privacy}",
"received_events_url": "https://api.github.com/users/FotisK/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 1 | 2024-02-27T20:43:59 | 2024-05-17T01:50:42 | 2024-05-17T01:50:42 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I was having a very long conversation with nollama/mythomax-l2-13b:Q5_K_S, saved the session and restored it and found that the latest 100-200 lines of the discussion were missing. I haven't tried to reproduce it (I don't have lengthy chats often), but I thought I'd report it. When I get another chance, I'll test it again | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/2787/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/2787/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/5307 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5307/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5307/comments | https://api.github.com/repos/ollama/ollama/issues/5307/events | https://github.com/ollama/ollama/pull/5307 | 2,376,001,387 | PR_kwDOJ0Z1Ps5zq6Yg | 5,307 | Ollama Show: Check for Projector Type | {
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 1 | 2024-06-26T18:22:07 | 2024-06-28T18:30:19 | 2024-06-28T18:30:17 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/5307",
"html_url": "https://github.com/ollama/ollama/pull/5307",
"diff_url": "https://github.com/ollama/ollama/pull/5307.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5307.patch",
"merged_at": "2024-06-28T18:30:17"
} | Fixes #5289
<img width="410" alt="Screenshot 2024-06-26 at 11 21 57 AM" src="https://github.com/ollama/ollama/assets/65097070/4ae18164-e5c2-453b-91d4-de54569b8e11">
| {
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5307/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5307/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/7499 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/7499/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/7499/comments | https://api.github.com/repos/ollama/ollama/issues/7499/events | https://github.com/ollama/ollama/pull/7499 | 2,634,169,544 | PR_kwDOJ0Z1Ps6A3i20 | 7,499 | build: Make target improvements | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 59 | 2024-11-05T00:47:49 | 2025-01-18T02:06:48 | 2024-12-10T17:47:19 | COLLABORATOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/7499",
"html_url": "https://github.com/ollama/ollama/pull/7499",
"diff_url": "https://github.com/ollama/ollama/pull/7499.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7499.patch",
"merged_at": "2024-12-10T17:47:19"
} | Add a few new targets and help for building locally. This also adjusts the runner lookup to favor local builds, then runners relative to the executable.
Fixes #7491
Fixes #7483
Fixes #7452
Fixes #2187
Fixes #2205
Fixes #2281
Fixes #7457
Fixes #7622
Fixes #7577
Fixes #1756
Fixes #7817
Fixes #6857
Carries #7199 | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/7499/reactions",
"total_count": 17,
"+1": 7,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 10,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/7499/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/3528 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3528/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3528/comments | https://api.github.com/repos/ollama/ollama/issues/3528/events | https://github.com/ollama/ollama/pull/3528 | 2,229,959,636 | PR_kwDOJ0Z1Ps5r8bOC | 3,528 | Update generate scripts with new `LLAMA_CUDA` variable, set `HIP_PLATFORM` on Windows to avoid compiler errors | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 0 | 2024-04-07T21:05:46 | 2024-04-07T23:29:52 | 2024-04-07T23:29:51 | MEMBER | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/3528",
"html_url": "https://github.com/ollama/ollama/pull/3528",
"diff_url": "https://github.com/ollama/ollama/pull/3528.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3528.patch",
"merged_at": "2024-04-07T23:29:51"
} | null | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3528/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3528/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/6853 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6853/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6853/comments | https://api.github.com/repos/ollama/ollama/issues/6853/events | https://github.com/ollama/ollama/issues/6853 | 2,533,058,737 | I_kwDOJ0Z1Ps6W-2ix | 6,853 | Setting temperature on any llava model makes the Ollama server hangs on REST calls | {
"login": "jluisreymejias",
"id": 16193562,
"node_id": "MDQ6VXNlcjE2MTkzNTYy",
"avatar_url": "https://avatars.githubusercontent.com/u/16193562?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jluisreymejias",
"html_url": "https://github.com/jluisreymejias",
"followers_url": "https://api.github.com/users/jluisreymejias/followers",
"following_url": "https://api.github.com/users/jluisreymejias/following{/other_user}",
"gists_url": "https://api.github.com/users/jluisreymejias/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jluisreymejias/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jluisreymejias/subscriptions",
"organizations_url": "https://api.github.com/users/jluisreymejias/orgs",
"repos_url": "https://api.github.com/users/jluisreymejias/repos",
"events_url": "https://api.github.com/users/jluisreymejias/events{/privacy}",
"received_events_url": "https://api.github.com/users/jluisreymejias/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] | closed | false | null | [] | null | 4 | 2024-09-18T08:23:25 | 2025-01-06T07:33:52 | 2025-01-06T07:33:52 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
When calling llava models from a REST client, setting temperature causes the Ollama server to hang until the process is killed.
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.3.10 | {
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6853/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6853/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/5160 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5160/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5160/comments | https://api.github.com/repos/ollama/ollama/issues/5160/events | https://github.com/ollama/ollama/issues/5160 | 2,363,800,007 | I_kwDOJ0Z1Ps6M5LnH | 5,160 | Add HelpingAI-9B in it | {
"login": "OE-LUCIFER",
"id": 158988478,
"node_id": "U_kgDOCXn4vg",
"avatar_url": "https://avatars.githubusercontent.com/u/158988478?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/OE-LUCIFER",
"html_url": "https://github.com/OE-LUCIFER",
"followers_url": "https://api.github.com/users/OE-LUCIFER/followers",
"following_url": "https://api.github.com/users/OE-LUCIFER/following{/other_user}",
"gists_url": "https://api.github.com/users/OE-LUCIFER/gists{/gist_id}",
"starred_url": "https://api.github.com/users/OE-LUCIFER/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OE-LUCIFER/subscriptions",
"organizations_url": "https://api.github.com/users/OE-LUCIFER/orgs",
"repos_url": "https://api.github.com/users/OE-LUCIFER/repos",
"events_url": "https://api.github.com/users/OE-LUCIFER/events{/privacy}",
"received_events_url": "https://api.github.com/users/OE-LUCIFER/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] | open | false | null | [] | null | 1 | 2024-06-20T08:05:57 | 2024-06-20T21:14:20 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | HelpingAI-9B is an advanced language model designed for emotionally intelligent conversational interactions. This model excels in empathetic engagement, understanding user emotions, and providing supportive dialogue. | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5160/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5160/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/3438 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3438/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3438/comments | https://api.github.com/repos/ollama/ollama/issues/3438/events | https://github.com/ollama/ollama/issues/3438 | 2,218,222,188 | I_kwDOJ0Z1Ps6EN2Js | 3,438 | Bug in MODEL download directory and launching ollama service in Linux | {
"login": "ejgutierrez74",
"id": 11474846,
"node_id": "MDQ6VXNlcjExNDc0ODQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/11474846?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ejgutierrez74",
"html_url": "https://github.com/ejgutierrez74",
"followers_url": "https://api.github.com/users/ejgutierrez74/followers",
"following_url": "https://api.github.com/users/ejgutierrez74/following{/other_user}",
"gists_url": "https://api.github.com/users/ejgutierrez74/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ejgutierrez74/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ejgutierrez74/subscriptions",
"organizations_url": "https://api.github.com/users/ejgutierrez74/orgs",
"repos_url": "https://api.github.com/users/ejgutierrez74/repos",
"events_url": "https://api.github.com/users/ejgutierrez74/events{/privacy}",
"received_events_url": "https://api.github.com/users/ejgutierrez74/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
}
] | open | false | null | [] | null | 15 | 2024-04-01T13:06:01 | 2024-07-18T09:58:57 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | I write this post to add more information:
1 - As you mentioned: I edited `sudo systemctl edit ollama.service`
![imagen](https://github.com/ollama/ollama/assets/11474846/d82ca623-5b89-4e8c-8b25-81a82de0b7b3)
And the /media/Samsung/ollama_models is empty....
![imagen](https://github.com/ollama/ollama/assets/11474846/63001767-af41-4f47-823a-5c6506f3599d)
So seems here a bug ( as said before the doc says you have to change the ollama.service file)
2 - ollama serve vs systemd
I ran `systemctl start ollama` (I booted my computer today), and it fails:
![imagen](https://github.com/ollama/ollama/assets/11474846/9449fd23-8a4f-4a06-abd1-f3339778ce91)
But if I run `ollama serve` it seems to work (again, just to be sure, I started ollama, checked the status... and then executed `ollama serve`):
![imagen](https://github.com/ollama/ollama/assets/11474846/a4c14ca7-4994-4497-a634-1ebad8cd1e77)
And in another tab Ollama seems to work:
![imagen](https://github.com/ollama/ollama/assets/11474846/352524e4-ce54-4b9d-8ec1-e719f4a16b1d)
3 - Where are the models downloaded?
As posted before, /media/Samsung/ollama_models -> as you can see it is empty
/home/ollama -> doesn't exist
![imagen](https://github.com/ollama/ollama/assets/11474846/9dbb5c4e-27ce-4503-b756-eab30b9efd72)
and /usr/share/ollama ->
![imagen](https://github.com/ollama/ollama/assets/11474846/6b2e23b5-f245-4393-8b34-0ffde5705197)
I'm going mad ;)
Thanks for your help
Editing post with an update: I finally found the ollama models at /home/eduardo/.ollama, but they shouldn't be there, since the default directory is /usr/share/ollama/.ollama, and I set the environment variable OLLAMA_MODEL to point to /media/Samsung/ollama_models
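For reference, the documented systemd drop-in override would look like the fragment below. Note that the variable the Ollama server reads is `OLLAMA_MODELS` (plural); a name like `OLLAMA_MODEL` is silently ignored, which would explain models ending up in the default location.

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# Created by `sudo systemctl edit ollama.service`; afterwards run
# `sudo systemctl daemon-reload && sudo systemctl restart ollama`.
[Service]
Environment="OLLAMA_MODELS=/media/Samsung/ollama_models"
```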
_Originally posted by @ejgutierrez74 in https://github.com/ollama/ollama/issues/3045#issuecomment-1991349181_
| null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3438/reactions",
"total_count": 6,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
} | https://api.github.com/repos/ollama/ollama/issues/3438/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/6948 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6948/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6948/comments | https://api.github.com/repos/ollama/ollama/issues/6948/events | https://github.com/ollama/ollama/pull/6948 | 2,547,117,379 | PR_kwDOJ0Z1Ps58nEbk | 6,948 | Fix Ollama silently failing on extra, unsupported openai parameters. | {
"login": "MadcowD",
"id": 719535,
"node_id": "MDQ6VXNlcjcxOTUzNQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/719535?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MadcowD",
"html_url": "https://github.com/MadcowD",
"followers_url": "https://api.github.com/users/MadcowD/followers",
"following_url": "https://api.github.com/users/MadcowD/following{/other_user}",
"gists_url": "https://api.github.com/users/MadcowD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MadcowD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MadcowD/subscriptions",
"organizations_url": "https://api.github.com/users/MadcowD/orgs",
"repos_url": "https://api.github.com/users/MadcowD/repos",
"events_url": "https://api.github.com/users/MadcowD/events{/privacy}",
"received_events_url": "https://api.github.com/users/MadcowD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | open | false | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3 | 2024-09-25T07:00:07 | 2024-12-29T19:29:39 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/6948",
"html_url": "https://github.com/ollama/ollama/pull/6948",
"diff_url": "https://github.com/ollama/ollama/pull/6948.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6948.patch",
"merged_at": null
} | Currently Ollama lets you send unsupported API params to the OpenAI-compatible endpoint and just silently fails. This wreaks havoc on downstream uses, causing unexpected behaviour.
```python
import os
from openai import OpenAI

# Retrieve the API key and ensure it's set
ollama_api_key = os.getenv("OLLAMA_API_KEY")
if not ollama_api_key:
    raise ValueError("API key not found. Please ensure that OLLAMA_API_KEY is set in your .env.local file.")

# Initialize the OpenAI client against the local Ollama server
ollama_client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key=ollama_api_key,
)

model = "llama3.1:latest"

# Construct the completion request with n=2
response = ollama_client.completions.create(
    model=model,
    prompt="Once upon a time",
    max_tokens=50,
    n=2,  # Requesting two completions
    temperature=0.7,
)

print("Number of completions requested: 2")
print(f"Number of completions received: {len(response.choices)}\n")
assert len(response.choices) == 2, "Shouldn't ever get here"

# Iterate and print each completion
for idx, choice in enumerate(response.choices, 1):
    print(f"Completion {idx}: {choice.text.strip()}")
```
This leads to
```
Number of completions requested: 2
Number of completions received: 1
Completion 1: Once upon a time, in a land where the sun always shone brightly, there lived a young adventurer eager to explore uncharted territories and uncover hidden treasures.
```
My change causes the API to return an error, because that parameter is simply not supported by Ollama.
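To make the intended behaviour concrete, here is a minimal sketch of the validation this PR argues for. It is an illustration only, not Ollama's actual code, and the allow-list below is hypothetical: unknown parameters raise an error instead of being dropped.

```python
# Hypothetical allow-list; Ollama's real set of supported parameters differs.
SUPPORTED_COMPLETION_PARAMS = {
    "model", "prompt", "max_tokens", "temperature", "stream", "stop", "seed",
}

def validate_request(body: dict) -> None:
    """Raise if the request carries parameters outside the supported set."""
    unsupported = sorted(set(body) - SUPPORTED_COMPLETION_PARAMS)
    if unsupported:
        # Fail loudly instead of silently ignoring the parameter.
        raise ValueError(f"unsupported parameter(s): {', '.join(unsupported)}")

validate_request({"model": "llama3.1", "prompt": "hi", "max_tokens": 5})  # ok
try:
    validate_request({"model": "llama3.1", "prompt": "hi", "n": 2})
except ValueError as e:
    print(e)  # -> unsupported parameter(s): n
```

With a check like this in front of the handler, the `n=2` request above would error immediately rather than return one completion.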
| null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6948/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6948/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/3427 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3427/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3427/comments | https://api.github.com/repos/ollama/ollama/issues/3427/events | https://github.com/ollama/ollama/issues/3427 | 2,217,052,006 | I_kwDOJ0Z1Ps6EJYdm | 3,427 | prompt_eval_count in api is broken | {
"login": "drazdra",
"id": 133811709,
"node_id": "U_kgDOB_nN_Q",
"avatar_url": "https://avatars.githubusercontent.com/u/133811709?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/drazdra",
"html_url": "https://github.com/drazdra",
"followers_url": "https://api.github.com/users/drazdra/followers",
"following_url": "https://api.github.com/users/drazdra/following{/other_user}",
"gists_url": "https://api.github.com/users/drazdra/gists{/gist_id}",
"starred_url": "https://api.github.com/users/drazdra/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/drazdra/subscriptions",
"organizations_url": "https://api.github.com/users/drazdra/orgs",
"repos_url": "https://api.github.com/users/drazdra/repos",
"events_url": "https://api.github.com/users/drazdra/events{/privacy}",
"received_events_url": "https://api.github.com/users/drazdra/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 5 | 2024-03-31T15:39:03 | 2024-06-04T06:58:20 | 2024-06-04T06:58:20 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
The prompt_eval_count parameter is absent on some calls; on other calls it returns wrong information.
1. I tried /api/chat with "stablelm2", no system prompt, prompt="hi".
In the result there is no "prompt_eval_count" field most of the time; sometimes it's there, randomly, but rarely.
2. When you have a small num_ctx and the supplied prompt (the content of all messages in /api/chat) exceeds the num_ctx size, prompt_eval_count may either be absent or provide wrong information.
I believe it returns the number of tokens that fit the context window, instead of the count for the whole prompt that was sent.
Thanks :).
### What did you expect to see?
Expected behavior:
1. There should always be a prompt_eval_count.
2. It should report the count of submitted tokens, not the processed ones.
3. Optionally, one more parameter could be returned, showing the number of processed tokens.
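The expected contract above can be expressed as a small client-side checker (a hypothetical helper, not part of Ollama) that flags the buggy final replies described in this report:

```python
def check_final_chat_response(resp: dict) -> list:
    """Return a list of problems with a final (done == true) /api/chat reply."""
    problems = []
    if not resp.get("done"):
        problems.append("not the final reply (done != true)")
    if "prompt_eval_count" not in resp:
        problems.append("prompt_eval_count is missing")
    return problems

# A reply shaped like the buggy case: done, but no prompt token count.
buggy = {"model": "stablelm2", "done": True, "eval_count": 12}
print(check_final_chat_response(buggy))  # -> ['prompt_eval_count is missing']

# A reply with the field present passes.
ok = {"model": "stablelm2", "done": True, "eval_count": 12, "prompt_eval_count": 31}
print(check_final_chat_response(ok))  # -> []
```

Running a checker like this against the final streamed object makes the intermittent absence of the field easy to spot.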
### Steps to reproduce
Use /api/chat to send an array of messages, limit num_ctx, send more content than fits into the context window defined by num_ctx, and check the values returned in the final JSON reply with done==true.
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
amd64
### Platform
_No response_
### Ollama version
0.1.30
### GPU
Other
### GPU info
cpu only
### CPU
AMD
### Other software
_No response_ | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3427/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3427/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/8366 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/8366/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/8366/comments | https://api.github.com/repos/ollama/ollama/issues/8366/events | https://github.com/ollama/ollama/issues/8366 | 2,778,471,158 | I_kwDOJ0Z1Ps6lnBr2 | 8,366 | deepseek v3 | {
"login": "Morrigan-Ship",
"id": 138357319,
"node_id": "U_kgDOCD8qRw",
"avatar_url": "https://avatars.githubusercontent.com/u/138357319?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Morrigan-Ship",
"html_url": "https://github.com/Morrigan-Ship",
"followers_url": "https://api.github.com/users/Morrigan-Ship/followers",
"following_url": "https://api.github.com/users/Morrigan-Ship/following{/other_user}",
"gists_url": "https://api.github.com/users/Morrigan-Ship/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Morrigan-Ship/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Morrigan-Ship/subscriptions",
"organizations_url": "https://api.github.com/users/Morrigan-Ship/orgs",
"repos_url": "https://api.github.com/users/Morrigan-Ship/repos",
"events_url": "https://api.github.com/users/Morrigan-Ship/events{/privacy}",
"received_events_url": "https://api.github.com/users/Morrigan-Ship/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] | closed | false | null | [] | null | 3 | 2025-01-09T18:07:28 | 2025-01-10T22:28:46 | 2025-01-10T22:28:42 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | null | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/8366/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/8366/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/4260 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4260/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4260/comments | https://api.github.com/repos/ollama/ollama/issues/4260/events | https://github.com/ollama/ollama/issues/4260 | 2,285,566,329 | I_kwDOJ0Z1Ps6IOvl5 | 4,260 | Error: could not connect to ollama app, is it running? | {
"login": "starMagic",
"id": 4728358,
"node_id": "MDQ6VXNlcjQ3MjgzNTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/4728358?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/starMagic",
"html_url": "https://github.com/starMagic",
"followers_url": "https://api.github.com/users/starMagic/followers",
"following_url": "https://api.github.com/users/starMagic/following{/other_user}",
"gists_url": "https://api.github.com/users/starMagic/gists{/gist_id}",
"starred_url": "https://api.github.com/users/starMagic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/starMagic/subscriptions",
"organizations_url": "https://api.github.com/users/starMagic/orgs",
"repos_url": "https://api.github.com/users/starMagic/repos",
"events_url": "https://api.github.com/users/starMagic/events{/privacy}",
"received_events_url": "https://api.github.com/users/starMagic/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1 | 2024-05-08T13:11:05 | 2024-05-21T18:34:09 | 2024-05-21T18:34:09 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
When trying to run the command `ollama list`, the following error occurs:
server.log
2024/05/08 20:50:26 routes.go:989: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[* http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR:C:\\Users\\Qiang.Liu\\AppData\\Local\\Programs\\Ollama\\ollama_runners OLLAMA_TMPDIR:]"
time=2024-05-08T20:50:26.065+08:00 level=INFO source=images.go:897 msg="total blobs: 16"
time=2024-05-08T20:50:26.068+08:00 level=INFO source=images.go:904 msg="total unused blobs removed: 0"
time=2024-05-08T20:50:26.071+08:00 level=INFO source=routes.go:1034 msg="Listening on 127.0.0.1:11434 (version 0.1.34)"
time=2024-05-08T20:50:26.071+08:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11.3 rocm_v5.7]"
time=2024-05-08T20:50:26.071+08:00 level=INFO source=gpu.go:122 msg="Detecting GPUs"
time=2024-05-08T20:50:26.091+08:00 level=INFO source=cpu_common.go:15 msg="CPU has AVX"
Exception 0xc0000005 0x8 0x228ed7065e0 0x228ed7065e0
PC=0x228ed7065e0
signal arrived during external code execution
runtime.cgocall(0xf73c20, 0xc000680408)
runtime/cgocall.go:157 +0x3e fp=0xc0003850d0 sp=0xc000385098 pc=0xf092fe
syscall.SyscallN(0x7ffdc82ddcd0?, {0xc0000ff758?, 0x1?, 0x7ffdc7fd0000?})
runtime/syscall_windows.go:544 +0x107 fp=0xc000385148 sp=0xc0003850d0 pc=0xf6f147
github.com/ollama/ollama/gpu.(*HipLib).AMDDriverVersion(0xc0000be750)
github.com/ollama/ollama/gpu/amd_hip_windows.go:82 +0x69 fp=0xc0003851b8 sp=0xc000385148 pc=0x139fc09
github.com/ollama/ollama/gpu.AMDGetGPUInfo()
github.com/ollama/ollama/gpu/amd_windows.go:37 +0x91 fp=0xc000385840 sp=0xc0003851b8 pc=0x13a0331
github.com/ollama/ollama/gpu.GetGPUInfo()
github.com/ollama/ollama/gpu/gpu.go:214 +0x625 fp=0xc000385ae8 sp=0xc000385840 pc=0x13a4845
github.com/ollama/ollama/server.Serve({0x2009900, 0xc000276bc0})
github.com/ollama/ollama/server/routes.go:1059 +0x771 fp=0xc000385c70 sp=0xc000385ae8 pc=0x19ef231
github.com/ollama/ollama/cmd.RunServer(0xc0000a8b00?, {0x27b2ae0?, 0x4?, 0x1e6cc3e?})
github.com/ollama/ollama/cmd/cmd.go:901 +0x17c fp=0xc000385d58 sp=0xc000385c70 pc=0x1a092dc
github.com/spf13/cobra.(*Command).execute(0xc0004b4908, {0x27b2ae0, 0x0, 0x0})
github.com/spf13/[email protected]/command.go:940 +0x882 fp=0xc000385e78 sp=0xc000385d58 pc=0x12aaa22
github.com/spf13/cobra.(*Command).ExecuteC(0xc00002d808)
github.com/spf13/[email protected]/command.go:1068 +0x3a5 fp=0xc000385f30 sp=0xc000385e78 pc=0x12ab265
github.com/spf13/cobra.(*Command).Execute(...)
github.com/spf13/[email protected]/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(...)
github.com/spf13/[email protected]/command.go:985
main.main()
github.com/ollama/ollama/main.go:11 +0x4d fp=0xc000385f50 sp=0xc000385f30 pc=0x1a1208d
runtime.main()
runtime/proc.go:271 +0x28b fp=0xc000385fe0 sp=0xc000385f50 pc=0xf4134b
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000385fe8 sp=0xc000385fe0 pc=0xf72461
goroutine 2 gp=0xc000084700 m=nil [force gc (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000087fa8 sp=0xc000087f88 pc=0xf4174e
runtime.goparkunlock(...)
runtime/proc.go:408
runtime.forcegchelper()
runtime/proc.go:326 +0xb8 fp=0xc000087fe0 sp=0xc000087fa8 pc=0xf415d8
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000087fe8 sp=0xc000087fe0 pc=0xf72461
created by runtime.init.6 in goroutine 1
runtime/proc.go:314 +0x1a
goroutine 3 gp=0xc000084a80 m=nil [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000089f80 sp=0xc000089f60 pc=0xf4174e
runtime.goparkunlock(...)
runtime/proc.go:408
runtime.bgsweep(0xc00003a070)
runtime/mgcsweep.go:318 +0xdf fp=0xc000089fc8 sp=0xc000089f80 pc=0xf2b7ff
runtime.gcenable.gowrap1()
runtime/mgc.go:203 +0x25 fp=0xc000089fe0 sp=0xc000089fc8 pc=0xf200a5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000089fe8 sp=0xc000089fe0 pc=0xf72461
created by runtime.gcenable in goroutine 1
runtime/mgc.go:203 +0x66
goroutine 4 gp=0xc000084c40 m=nil [GC scavenge wait]:
runtime.gopark(0x10000?, 0x1ffb9f0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000099f78 sp=0xc000099f58 pc=0xf4174e
runtime.goparkunlock(...)
runtime/proc.go:408
runtime.(*scavengerState).park(0x2726fc0)
runtime/mgcscavenge.go:425 +0x49 fp=0xc000099fa8 sp=0xc000099f78 pc=0xf29189
runtime.bgscavenge(0xc00003a070)
runtime/mgcscavenge.go:658 +0x59 fp=0xc000099fc8 sp=0xc000099fa8 pc=0xf29739
runtime.gcenable.gowrap2()
runtime/mgc.go:204 +0x25 fp=0xc000099fe0 sp=0xc000099fc8 pc=0xf20045
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000099fe8 sp=0xc000099fe0 pc=0xf72461
created by runtime.gcenable in goroutine 1
runtime/mgc.go:204 +0xa5
goroutine 18 gp=0xc000104380 m=nil [finalizer wait]:
runtime.gopark(0xc00008be48?, 0xf13445?, 0xa8?, 0x1?, 0xc000084000?)
runtime/proc.go:402 +0xce fp=0xc00008be20 sp=0xc00008be00 pc=0xf4174e
runtime.runfinq()
runtime/mfinal.go:194 +0x107 fp=0xc00008bfe0 sp=0xc00008be20 pc=0xf1f127
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00008bfe8 sp=0xc00008bfe0 pc=0xf72461
created by runtime.createfing in goroutine 1
runtime/mfinal.go:164 +0x3d
goroutine 5 gp=0xc000085340 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00009bf50 sp=0xc00009bf30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00009bfe0 sp=0xc00009bf50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00009bfe8 sp=0xc00009bfe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 19 gp=0xc000104fc0 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000095f50 sp=0xc000095f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000095fe0 sp=0xc000095f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000095fe8 sp=0xc000095fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 6 gp=0xc000085500 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00041df50 sp=0xc00041df30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00041dfe0 sp=0xc00041df50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00041dfe8 sp=0xc00041dfe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 20 gp=0xc000105180 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000097f50 sp=0xc000097f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000097fe0 sp=0xc000097f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000097fe8 sp=0xc000097fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 7 gp=0xc0000856c0 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00041ff50 sp=0xc00041ff30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00041ffe0 sp=0xc00041ff50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00041ffe8 sp=0xc00041ffe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 21 gp=0xc000105340 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000419f50 sp=0xc000419f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000419fe0 sp=0xc000419f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000419fe8 sp=0xc000419fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 8 gp=0xc000085880 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000425f50 sp=0xc000425f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000425fe0 sp=0xc000425f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000425fe8 sp=0xc000425fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 22 gp=0xc000105500 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00041bf50 sp=0xc00041bf30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00041bfe0 sp=0xc00041bf50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00041bfe8 sp=0xc00041bfe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 9 gp=0xc000085a40 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000427f50 sp=0xc000427f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000427fe0 sp=0xc000427f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000427fe8 sp=0xc000427fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 23 gp=0xc0001056c0 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000421f50 sp=0xc000421f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000421fe0 sp=0xc000421f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000421fe8 sp=0xc000421fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 10 gp=0xc000085c00 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00042df50 sp=0xc00042df30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00042dfe0 sp=0xc00042df50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00042dfe8 sp=0xc00042dfe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 24 gp=0xc000105880 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000423f50 sp=0xc000423f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000423fe0 sp=0xc000423f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000423fe8 sp=0xc000423fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 11 gp=0xc000085dc0 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00042ff50 sp=0xc00042ff30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00042ffe0 sp=0xc00042ff50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00042ffe8 sp=0xc00042ffe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 34 gp=0xc000482000 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000429f50 sp=0xc000429f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000429fe0 sp=0xc000429f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000429fe8 sp=0xc000429fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 25 gp=0xc000105a40 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0002f5f50 sp=0xc0002f5f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0002f5fe0 sp=0xc0002f5f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002f5fe8 sp=0xc0002f5fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 35 gp=0xc0004821c0 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00042bf50 sp=0xc00042bf30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00042bfe0 sp=0xc00042bf50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00042bfe8 sp=0xc00042bfe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 26 gp=0xc000105c00 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0002f7f50 sp=0xc0002f7f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0002f7fe0 sp=0xc0002f7f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002f7fe8 sp=0xc0002f7fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 36 gp=0xc000482380 m=nil [GC worker (idle)]:
runtime.gopark(0xa5c45b8f8c?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0002f1f50 sp=0xc0002f1f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0002f1fe0 sp=0xc0002f1f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002f1fe8 sp=0xc0002f1fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 27 gp=0xc000105dc0 m=nil [GC worker (idle)]:
runtime.gopark(0xa5c45b8f8c?, 0x3?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0002fdf50 sp=0xc0002fdf30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0002fdfe0 sp=0xc0002fdf50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002fdfe8 sp=0xc0002fdfe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 37 gp=0xc000482540 m=nil [GC worker (idle)]:
runtime.gopark(0x27b4a60?, 0x1?, 0x4c?, 0xcc?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0002f3f50 sp=0xc0002f3f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0002f3fe0 sp=0xc0002f3f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002f3fe8 sp=0xc0002f3fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 28 gp=0xc000500000 m=nil [GC worker (idle)]:
runtime.gopark(0x27b4a60?, 0x1?, 0x9c?, 0x61?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0002fff50 sp=0xc0002fff30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0002fffe0 sp=0xc0002fff50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002fffe8 sp=0xc0002fffe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 38 gp=0xc000482700 m=nil [GC worker (idle)]:
runtime.gopark(0x27b4a60?, 0x1?, 0x7c?, 0xd9?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0002f9f50 sp=0xc0002f9f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0002f9fe0 sp=0xc0002f9f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002f9fe8 sp=0xc0002f9fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 29 gp=0xc0005001c0 m=nil [GC worker (idle)]:
runtime.gopark(0x27b4a60?, 0x1?, 0xd4?, 0xaf?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000507f50 sp=0xc000507f30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000507fe0 sp=0xc000507f50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000507fe8 sp=0xc000507fe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 39 gp=0xc0004828c0 m=nil [GC worker (idle)]:
runtime.gopark(0xa5c45b8f8c?, 0x1?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0002fbf50 sp=0xc0002fbf30 pc=0xf4174e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0002fbfe0 sp=0xc0002fbf50 pc=0xf221e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002fbfe8 sp=0xc0002fbfe0 pc=0xf72461
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 12 gp=0xc0005841c0 m=5 mp=0xc000100008 [syscall]:
runtime.notetsleepg(0x27b36a0, 0xffffffffffffffff)
runtime/lock_sema.go:296 +0x31 fp=0xc000505fa0 sp=0xc000505f68 pc=0xf11a11
os/signal.signal_recv()
runtime/sigqueue.go:152 +0x29 fp=0xc000505fc0 sp=0xc000505fa0 pc=0xf6e169
os/signal.loop()
os/signal/signal_unix.go:23 +0x13 fp=0xc000505fe0 sp=0xc000505fc0 pc=0x12335d3
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000505fe8 sp=0xc000505fe0 pc=0xf72461
created by os/signal.Notify.func1.1 in goroutine 1
os/signal/signal.go:151 +0x1f
goroutine 13 gp=0xc000584380 m=nil [chan receive]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000509f08 sp=0xc000509ee8 pc=0xf4174e
runtime.chanrecv(0xc0004ce4e0, 0x0, 0x1)
runtime/chan.go:583 +0x3cd fp=0xc000509f80 sp=0xc000509f08 pc=0xf0b98d
runtime.chanrecv1(0x0?, 0x0?)
runtime/chan.go:442 +0x12 fp=0xc000509fa8 sp=0xc000509f80 pc=0xf0b592
github.com/ollama/ollama/server.Serve.func2()
github.com/ollama/ollama/server/routes.go:1043 +0x34 fp=0xc000509fe0 sp=0xc000509fa8 pc=0x19ef2f4
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000509fe8 sp=0xc000509fe0 pc=0xf72461
created by github.com/ollama/ollama/server.Serve in goroutine 1
github.com/ollama/ollama/server/routes.go:1042 +0x6f7
goroutine 14 gp=0xc000584540 m=nil [select]:
runtime.gopark(0xc00047df58?, 0x3?, 0x60?, 0x0?, 0xc00047de32?)
runtime/proc.go:402 +0xce fp=0xc00047dcb8 sp=0xc00047dc98 pc=0xf4174e
runtime.selectgo(0xc00047df58, 0xc00047de2c, 0x0?, 0x0, 0x0?, 0x1)
runtime/select.go:327 +0x725 fp=0xc00047ddd8 sp=0xc00047dcb8 pc=0xf51ba5
github.com/ollama/ollama/server.(*Scheduler).processPending(0xc000174af0, {0x200c0c0, 0xc000174aa0})
github.com/ollama/ollama/server/sched.go:97 +0xcf fp=0xc00047dfb8 sp=0xc00047ddd8 pc=0x19f27af
github.com/ollama/ollama/server.(*Scheduler).Run.func1()
github.com/ollama/ollama/server/sched.go:87 +0x1f fp=0xc00047dfe0 sp=0xc00047dfb8 pc=0x19f26bf
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00047dfe8 sp=0xc00047dfe0 pc=0xf72461
created by github.com/ollama/ollama/server.(*Scheduler).Run in goroutine 1
github.com/ollama/ollama/server/sched.go:86 +0xb4
goroutine 15 gp=0xc000584700 m=nil [select]:
runtime.gopark(0xc000503f50?, 0x3?, 0x0?, 0x0?, 0xc000503d5a?)
runtime/proc.go:402 +0xce fp=0xc000503be8 sp=0xc000503bc8 pc=0xf4174e
runtime.selectgo(0xc000503f50, 0xc000503d54, 0x0?, 0x0, 0x0?, 0x1)
runtime/select.go:327 +0x725 fp=0xc000503d08 sp=0xc000503be8 pc=0xf51ba5
github.com/ollama/ollama/server.(*Scheduler).processCompleted(0xc000174af0, {0x200c0c0, 0xc000174aa0})
github.com/ollama/ollama/server/sched.go:209 +0xec fp=0xc000503fb8 sp=0xc000503d08 pc=0x19f332c
github.com/ollama/ollama/server.(*Scheduler).Run.func2()
github.com/ollama/ollama/server/sched.go:91 +0x1f fp=0xc000503fe0 sp=0xc000503fb8 pc=0x19f267f
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000503fe8 sp=0xc000503fe0 pc=0xf72461
created by github.com/ollama/ollama/server.(*Scheduler).Run in goroutine 1
github.com/ollama/ollama/server/sched.go:90 +0x110
rax 0x228f4cf83f0
rbx 0x228ed7065e0
rcx 0xd3d
rdx 0x6
rdi 0x228ed7117f0
rsi 0x228f4cf83f0
rbp 0xd3c
rsp 0x45e97fe108
r8 0x2
r9 0x40
r10 0x80
r11 0x22000000007f4100
r12 0x1
r13 0x0
r14 0x228f4b98340
r15 0x228ed7065e0
rip 0x228ed7065e0
rflags 0x10202
cs 0x33
fs 0x53
gs 0x2b
### OS
Windows
### GPU
AMD
### CPU
Intel
### Ollama version
0.1.34 | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4260/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4260/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/551 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/551/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/551/comments | https://api.github.com/repos/ollama/ollama/issues/551/events | https://github.com/ollama/ollama/issues/551 | 1,901,647,151 | I_kwDOJ0Z1Ps5xWNUv | 551 | Dockerfile.cuda fails to build server | {
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 7 | 2023-09-18T19:58:22 | 2023-09-26T22:29:49 | 2023-09-26T22:29:49 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | On an AWS EC2 `g4dn.2xlarge` instance with Ollama https://github.com/jmorganca/ollama/tree/c345053a8bf47d5ef8f1fe15d385108059209fba:
```none
> sudo docker buildx build . --file Dockerfile.cuda
[+] Building 57.2s (7/16) docker:default
=> => transferring context: 97B 0.0s
[+] Building 57.3s (7/16) docker:default
=> => transferring dockerfile: 939B 0.0s
[+] Building 113.7s (15/16) docker:default
=> [internal] load .dockerignore 0.0s
=> => transferring context: 97B 0.0s
=> [internal] load build definition from Dockerfile.cuda 0.0s
=> => transferring dockerfile: 939B 0.0s
=> [internal] load metadata for docker.io/nvidia/cuda:12.2.0-runtime-ubuntu22.04 1.0s
=> [internal] load metadata for docker.io/nvidia/cuda:12.2.0-devel-ubuntu22.04 0.9s
=> [stage-0 1/7] FROM docker.io/nvidia/cuda:12.2.0-devel-ubuntu22.04@sha256:0e2d7e252847c334b056937e533683556926f5343a472b6b92f858a7af8ab880 71.6s
...
=> [stage-1 2/3] RUN groupadd ollama && useradd -m -g ollama ollama 20.6s
=> [stage-0 2/7] WORKDIR /go/src/github.com/jmorganca/ollama 21.0s
=> [stage-0 3/7] RUN apt-get update && apt-get install -y git build-essential cmake 13.5s
=> [stage-0 4/7] ADD https://dl.google.com/go/go1.21.1.linux-amd64.tar.gz /tmp/go1.21.1.tar.gz 0.4s
=> [stage-0 5/7] RUN mkdir -p /usr/local && tar xz -C /usr/local </tmp/go1.21.1.tar.gz 3.2s
=> [stage-0 6/7] COPY . . 0.0s
=> ERROR [stage-0 7/7] RUN /usr/local/go/bin/go generate ./... && /usr/local/go/bin/go build -ldflags "-linkmode=external -extldflags='-static' -X=github.com/jmorganca/ollama/version.Version=0.0.0 -X= 2.9s
------
> [stage-0 7/7] RUN /usr/local/go/bin/go generate ./... && /usr/local/go/bin/go build -ldflags "-linkmode=external -extldflags='-static' -X=github.com/jmorganca/ollama/version.Version=0.0.0 -X=github.com/jmorganca/ollama/server.mode=release" .:
0.377 go: downloading github.com/mattn/go-runewidth v0.0.14
0.377 go: downloading github.com/mitchellh/colorstring v0.0.0-20190213212951-d06e56a500db
0.378 go: downloading golang.org/x/term v0.10.0
0.385 go: downloading gonum.org/v1/gonum v0.13.0
0.388 go: downloading github.com/chzyer/readline v1.5.1
0.388 go: downloading golang.org/x/crypto v0.10.0
0.391 go: downloading github.com/pbnjay/memory v0.0.0-20210728143218-7b4eea64cf58
0.391 go: downloading github.com/dustin/go-humanize v1.0.1
0.478 go: downloading github.com/gin-contrib/cors v1.4.0
0.480 go: downloading github.com/olekukonko/tablewriter v0.0.5
0.483 go: downloading github.com/gin-gonic/gin v1.9.1
0.486 go: downloading github.com/spf13/cobra v1.7.0
0.487 go: downloading golang.org/x/exp v0.0.0-20230817173708-d852ddb80c63
0.565 go: downloading golang.org/x/sys v0.11.0
0.566 go: downloading github.com/rivo/uniseg v0.2.0
0.573 go: downloading github.com/spf13/pflag v1.0.5
0.580 go: downloading github.com/gin-contrib/sse v0.1.0
0.580 go: downloading github.com/mattn/go-isatty v0.0.19
0.597 go: downloading golang.org/x/net v0.10.0
0.624 go: downloading github.com/go-playground/validator/v10 v10.14.0
0.624 go: downloading github.com/pelletier/go-toml/v2 v2.0.8
0.651 go: downloading github.com/ugorji/go/codec v1.2.11
0.700 go: downloading google.golang.org/protobuf v1.30.0
0.707 go: downloading gopkg.in/yaml.v3 v3.0.1
0.741 go: downloading github.com/gabriel-vasile/mimetype v1.4.2
0.742 go: downloading github.com/go-playground/universal-translator v0.18.1
0.745 go: downloading github.com/leodido/go-urn v1.2.4
0.778 go: downloading golang.org/x/text v0.10.0
0.841 go: downloading github.com/go-playground/locales v0.14.1
1.894 fatal: not a git repository: /go/src/github.com/jmorganca/ollama/../.git/modules/ollama
1.895 llm/llama.cpp/generate.go:6: running "git": exit status 128
------
Dockerfile.cuda:13
--------------------
12 | ENV GOARCH=$TARGETARCH
13 | >>> RUN /usr/local/go/bin/go generate ./... \
14 | >>> && /usr/local/go/bin/go build -ldflags "-linkmode=external -extldflags='-static' -X=github.com/jmorganca/ollama/version.Version=$VERSION -X=github.com/jmorganca/ollama/server.mode=release" .
15 |
--------------------
ERROR: failed to solve: process "/bin/sh -c /usr/local/go/bin/go generate ./... && /usr/local/go/bin/go build -ldflags \"-linkmode=external -extldflags='-static' -X=github.com/jmorganca/ollama/version.Version=$VERSION -X=github.com/jmorganca/ollama/server.mode=release\" ." did not complete successfully: exit code: 1
```
[Line 13](https://github.com/jmorganca/ollama/blob/c345053a8bf47d5ef8f1fe15d385108059209fba/Dockerfile.cuda#L13) fails with:
```
ERROR: failed to solve: process "/bin/sh -c /usr/local/go/bin/go generate ./... && /usr/local/go/bin/go build -ldflags \"-linkmode=external -extldflags='-static' -X=github.com/jmorganca/ollama/version.Version=$VERSION -X=github.com/jmorganca/ollama/server.mode=release\" ." did not complete successfully: exit code: 1
```
Any idea what's going on here? | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/551/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/551/timeline | null | not_planned | false |
https://api.github.com/repos/ollama/ollama/issues/5990 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5990/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5990/comments | https://api.github.com/repos/ollama/ollama/issues/5990/events | https://github.com/ollama/ollama/issues/5990 | 2,432,539,183 | I_kwDOJ0Z1Ps6Q_Zov | 5,990 | Tools and properties.type Not Supporting Arrays | {
"login": "xonlly",
"id": 4999786,
"node_id": "MDQ6VXNlcjQ5OTk3ODY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4999786?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xonlly",
"html_url": "https://github.com/xonlly",
"followers_url": "https://api.github.com/users/xonlly/followers",
"following_url": "https://api.github.com/users/xonlly/following{/other_user}",
"gists_url": "https://api.github.com/users/xonlly/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xonlly/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xonlly/subscriptions",
"organizations_url": "https://api.github.com/users/xonlly/orgs",
"repos_url": "https://api.github.com/users/xonlly/repos",
"events_url": "https://api.github.com/users/xonlly/events{/privacy}",
"received_events_url": "https://api.github.com/users/xonlly/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | open | false | null | [] | null | 2 | 2024-07-26T16:06:52 | 2024-10-18T14:29:13 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
**Title:** Issue with `DynamicStructuredTool` and `properties.type` Not Supporting Arrays in LangchainJS
**Description:**
I am encountering an issue when using `DynamicStructuredTool` in LangchainJS. Specifically, the `type` property within `properties` does not currently support arrays. This results in an error when I try to use a nullable argument specified as `["string", "null"]`. The error message is as follows:
```
LLM run errored with error: "HTTP error! status: 400 Response:
{\"error\":{\"message\":\"json: cannot unmarshal array into Go struct field .tools.function.parameters.properties.type of type string\",\"type\":\"invalid_request_error\",\"param\":null,\"code\":null}}\n
\nMistralAPIError: HTTP error! status: 400 Response:
{\"error\":{\"message\":\"json: cannot unmarshal array into Go struct field .tools.function.parameters.properties.type of type string\",\"type\":\"invalid_request_error\",\"param\":null,\"code\":null}}\n
at MistralClient._request (file:///Users/xonlly/projects/devana.ai/serveur/node_modules/@mistralai/mistralai/src/client.js:162:17)\n
at processTicksAndRejections (node:internal/process/task_queues:95:5)\n
at MistralClient.chat (file:///Users/xonlly/projects/devana.ai/serveur/node_modules/@mistralai/mistralai/src/client.js:338:22)\n
at /Users/xonlly/projects/devana.ai/serveur/node_modules/@langchain/mistralai/dist/chat_models.cjs:366:27\n
at RetryOperation._fn (/Users/xonlly/projects/devana.ai/serveur/node_modules/p-retry/index.js:50:12)"
```
**JSON Configuration:**
```json
{
"model": "mistral-nemo",
"messages": [
{
"role": "user",
"content": "Hello"
}
],
"tools": [
{
"type": "function",
"function": {
"name": "search-files",
"description": "Search all files in the collection",
"parameters": {
"type": "object",
"properties": {
"query": {
"type": ["string", "null"]
}
},
"required": ["query"],
"additionalProperties": false,
"$schema": "http://json-schema.org/draft-07/schema#"
}
}
}
],
"temperature": 0.7,
"top_p": 1,
"stream": false,
"tool_choice": "required"
}
```
**Steps to Reproduce:**
1. Use LangchainJS with `DynamicStructuredTool`.
2. Define a nullable argument with `["string", "null"]` in the `type` property.
3. Observe the error upon execution.
**Expected Behavior:**
The `type` property should accept arrays to support nullable arguments without causing an error.
**Actual Behavior:**
An error occurs because the `type` property does not support arrays, resulting in a failure to unmarshal the array into a Go struct field.
**Proposed Solution:**
Update the `type` property within `properties` to support arrays, allowing for nullable arguments to be correctly processed.
**Environment:**
- Ollama version: 0.3.0
- Operating System: Ubuntu 22.04
**Additional Context:**
Any guidance or updates to address this limitation would be greatly appreciated. Thank you!
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.0 | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5990/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5990/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/6953 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6953/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6953/comments | https://api.github.com/repos/ollama/ollama/issues/6953/events | https://github.com/ollama/ollama/issues/6953 | 2,547,721,251 | I_kwDOJ0Z1Ps6X2yQj | 6,953 | AMD ROCm Card can not use flash attention | {
"login": "superligen",
"id": 4199207,
"node_id": "MDQ6VXNlcjQxOTkyMDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/4199207?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/superligen",
"html_url": "https://github.com/superligen",
"followers_url": "https://api.github.com/users/superligen/followers",
"following_url": "https://api.github.com/users/superligen/following{/other_user}",
"gists_url": "https://api.github.com/users/superligen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/superligen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/superligen/subscriptions",
"organizations_url": "https://api.github.com/users/superligen/orgs",
"repos_url": "https://api.github.com/users/superligen/repos",
"events_url": "https://api.github.com/users/superligen/events{/privacy}",
"received_events_url": "https://api.github.com/users/superligen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] | open | false | null | [] | null | 4 | 2024-09-25T11:26:27 | 2024-12-19T19:36:01 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
My card is a W7900 and the ROCm driver version is 6.3. I found that the llama.cpp server started by Ollama is always launched without the -fa flag.
I checked the code and found:
```go
// only cuda (compute capability 7+) and metal support flash attention
if g.Library != "metal" && (g.Library != "cuda" || g.DriverMajor < 7) {
	flashAttnEnabled = false
}
```
This code seems wrong.
Ref: https://github.com/Dao-AILab/flash-attention/pull/1010, where support for ROCm was added to flash attention 2.
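As an illustration only, the gating condition quoted above could be restructured to also allow a ROCm library, along these lines. The `GPU` struct and field names here are simplified stand-ins for ollama's internal types, not the actual implementation.

```go
package main

import "fmt"

// GPU is a simplified stand-in for ollama's internal GPU info struct.
type GPU struct {
	Library     string
	DriverMajor int
}

// supportsFlashAttention sketches the gate with "rocm" added alongside
// "metal", while keeping the compute-capability check for CUDA.
func supportsFlashAttention(g GPU) bool {
	switch g.Library {
	case "metal", "rocm":
		return true
	case "cuda":
		return g.DriverMajor >= 7
	default:
		return false
	}
}

func main() {
	fmt.Println(supportsFlashAttention(GPU{Library: "rocm"}))                 // true
	fmt.Println(supportsFlashAttention(GPU{Library: "cuda", DriverMajor: 6})) // false
}
```

Whether ROCm should be gated on a minimum driver or GPU generation (as CUDA is gated on compute capability) is a separate question this sketch does not answer.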
### OS
Linux
### GPU
AMD
### CPU
Intel
### Ollama version
_No response_ | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6953/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6953/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/6670 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6670/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6670/comments | https://api.github.com/repos/ollama/ollama/issues/6670/events | https://github.com/ollama/ollama/issues/6670 | 2,509,778,404 | I_kwDOJ0Z1Ps6VmC3k | 6,670 | expose slots data through API | {
"login": "aiseei",
"id": 30615541,
"node_id": "MDQ6VXNlcjMwNjE1NTQx",
"avatar_url": "https://avatars.githubusercontent.com/u/30615541?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aiseei",
"html_url": "https://github.com/aiseei",
"followers_url": "https://api.github.com/users/aiseei/followers",
"following_url": "https://api.github.com/users/aiseei/following{/other_user}",
"gists_url": "https://api.github.com/users/aiseei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aiseei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aiseei/subscriptions",
"organizations_url": "https://api.github.com/users/aiseei/orgs",
"repos_url": "https://api.github.com/users/aiseei/repos",
"events_url": "https://api.github.com/users/aiseei/events{/privacy}",
"received_events_url": "https://api.github.com/users/aiseei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | closed | false | null | [] | null | 1 | 2024-09-06T07:58:13 | 2024-09-06T15:38:13 | 2024-09-06T15:38:13 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Hi,
Can the information that can be seen in the logs be exposed through a /slots API per server/port? We need this to manage queuing in our load balancer. llama.cpp already exposes this: https://github.com/ggerganov/llama.cpp/tree/master/examples/server#get-slots-returns-the-current-slots-processing-state
DEBUG [update_slots] all slots are idle and system prompt is empty, clear the KV cache | tid="140031137009664" timestamp=1725608401
DEBUG [process_single_task] slot data | n_idle_slots=3 n_processing_slots=0 task_id=0 tid="140031137009664" timestamp=1725608401
Many thanks! | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6670/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6670/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/4248 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4248/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4248/comments | https://api.github.com/repos/ollama/ollama/issues/4248/events | https://github.com/ollama/ollama/issues/4248 | 2,284,620,426 | I_kwDOJ0Z1Ps6ILIqK | 4,248 | error loading model architecture: unknown model architecture: 'qwen2moe' | {
"login": "li904775857",
"id": 43633294,
"node_id": "MDQ6VXNlcjQzNjMzMjk0",
"avatar_url": "https://avatars.githubusercontent.com/u/43633294?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/li904775857",
"html_url": "https://github.com/li904775857",
"followers_url": "https://api.github.com/users/li904775857/followers",
"following_url": "https://api.github.com/users/li904775857/following{/other_user}",
"gists_url": "https://api.github.com/users/li904775857/gists{/gist_id}",
"starred_url": "https://api.github.com/users/li904775857/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/li904775857/subscriptions",
"organizations_url": "https://api.github.com/users/li904775857/orgs",
"repos_url": "https://api.github.com/users/li904775857/repos",
"events_url": "https://api.github.com/users/li904775857/events{/privacy}",
"received_events_url": "https://api.github.com/users/li904775857/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] | open | false | null | [] | null | 1 | 2024-05-08T03:50:18 | 2024-07-25T17:43:34 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
Qwen1.5-MoE-A2.7B-Chat was converted with convert-hf-to-gguf.py following the documented process. After 4-bit quantization, an Ollama Modelfile was created, but the model architecture is not supported when loading. What is the cause of this?
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.1.32 | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4248/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4248/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/5188 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5188/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5188/comments | https://api.github.com/repos/ollama/ollama/issues/5188/events | https://github.com/ollama/ollama/pull/5188 | 2,364,778,595 | PR_kwDOJ0Z1Ps5zF3gJ | 5,188 | fix: skip os.removeAll() if PID does not exist | {
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 0 | 2024-06-20T15:54:26 | 2024-06-20T17:40:59 | 2024-06-20T17:40:59 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/5188",
"html_url": "https://github.com/ollama/ollama/pull/5188",
"diff_url": "https://github.com/ollama/ollama/pull/5188.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5188.patch",
"merged_at": "2024-06-20T17:40:59"
} | Previously, this deleted all directories in $TMPDIR starting with "ollama". Added a "continue" to skip the directory removal if a PID doesn't exist. We do this to prevent accidentally deleting directories in tmpdir that share the ollama name but weren't created by our processes.
resolves: https://github.com/ollama/ollama/issues/5129 | {
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5188/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5188/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/4204 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/4204/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/4204/comments | https://api.github.com/repos/ollama/ollama/issues/4204/events | https://github.com/ollama/ollama/issues/4204 | 2,281,198,575 | I_kwDOJ0Z1Ps6H-FPv | 4,204 | Support pull from harbor registry in proxy mode and push to harbor | {
"login": "ptempier",
"id": 6312537,
"node_id": "MDQ6VXNlcjYzMTI1Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/6312537?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ptempier",
"html_url": "https://github.com/ptempier",
"followers_url": "https://api.github.com/users/ptempier/followers",
"following_url": "https://api.github.com/users/ptempier/following{/other_user}",
"gists_url": "https://api.github.com/users/ptempier/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ptempier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ptempier/subscriptions",
"organizations_url": "https://api.github.com/users/ptempier/orgs",
"repos_url": "https://api.github.com/users/ptempier/repos",
"events_url": "https://api.github.com/users/ptempier/events{/privacy}",
"received_events_url": "https://api.github.com/users/ptempier/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] | open | false | null | [] | null | 5 | 2024-05-06T15:49:57 | 2024-12-19T06:27:32 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Not sure why it's not working; maybe I'm doing something wrong.
From another ticket I understand it is supposed to work with OCI registries.
What I tried:
ollama pull habor-server//ollama.com/library/llama3:text
Error: pull model manifest: 400
ollama pull habor-server/ollama.com/llama3:text
Error: pull model manifest: 400
ollama cp llama2:7b habor-server/aistuff/llama2:7b
ollama push habor-server/aistuff/llama2:7b
retrieving manifest
Error: Get "harbor?nonce=zzzz&scope=&service=&ts=xxxx": unsupported protocol scheme "" | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/4204/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/4204/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/1223 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/1223/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/1223/comments | https://api.github.com/repos/ollama/ollama/issues/1223/events | https://github.com/ollama/ollama/pull/1223 | 2,004,804,614 | PR_kwDOJ0Z1Ps5gDJXg | 1,223 | Make alt+backspace delete word | {
"login": "kejcao",
"id": 106453563,
"node_id": "U_kgDOBlhaOw",
"avatar_url": "https://avatars.githubusercontent.com/u/106453563?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kejcao",
"html_url": "https://github.com/kejcao",
"followers_url": "https://api.github.com/users/kejcao/followers",
"following_url": "https://api.github.com/users/kejcao/following{/other_user}",
"gists_url": "https://api.github.com/users/kejcao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kejcao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kejcao/subscriptions",
"organizations_url": "https://api.github.com/users/kejcao/orgs",
"repos_url": "https://api.github.com/users/kejcao/repos",
"events_url": "https://api.github.com/users/kejcao/events{/privacy}",
"received_events_url": "https://api.github.com/users/kejcao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 0 | 2023-11-21T17:29:44 | 2023-11-21T20:26:47 | 2023-11-21T20:26:47 | CONTRIBUTOR | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/1223",
"html_url": "https://github.com/ollama/ollama/pull/1223",
"diff_url": "https://github.com/ollama/ollama/pull/1223.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1223.patch",
"merged_at": "2023-11-21T20:26:47"
} | In GNU Readline you can press alt+backspace to delete word. I'm used to this behavior and so it's jarring not to be able to do it. This commit adds the feature. | {
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/1223/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/1223/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/3007 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3007/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3007/comments | https://api.github.com/repos/ollama/ollama/issues/3007/events | https://github.com/ollama/ollama/issues/3007 | 2,176,498,516 | I_kwDOJ0Z1Ps6BurtU | 3,007 | Search on ollama.com/library is missing lots of models | {
"login": "maxtheman",
"id": 2172753,
"node_id": "MDQ6VXNlcjIxNzI3NTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/2172753?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/maxtheman",
"html_url": "https://github.com/maxtheman",
"followers_url": "https://api.github.com/users/maxtheman/followers",
"following_url": "https://api.github.com/users/maxtheman/following{/other_user}",
"gists_url": "https://api.github.com/users/maxtheman/gists{/gist_id}",
"starred_url": "https://api.github.com/users/maxtheman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/maxtheman/subscriptions",
"organizations_url": "https://api.github.com/users/maxtheman/orgs",
"repos_url": "https://api.github.com/users/maxtheman/repos",
"events_url": "https://api.github.com/users/maxtheman/events{/privacy}",
"received_events_url": "https://api.github.com/users/maxtheman/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6573197867,
"node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw",
"url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com",
"name": "ollama.com",
"color": "ffffff",
"default": false,
"description": ""
}
] | open | false | {
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0 | 2024-03-08T17:46:15 | 2024-03-11T22:18:50 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | Current behavior:
Using @ehartford as an example since he's a prolific ollama model contributor:
https://ollama.com/search?q=ehartford&p=1
Shows his models:
<img width="1276" alt="Screenshot 2024-03-08 at 9 45 33 AM" src="https://github.com/ollama/ollama/assets/2172753/08b9dc80-5d94-4b86-82dd-37c0dddac326">
https://ollama.com/library?q=ehartford
Does not:
<img width="1257" alt="Screenshot 2024-03-08 at 9 45 52 AM" src="https://github.com/ollama/ollama/assets/2172753/6f1c248b-a93c-468a-a78c-24811a857731">
Expected behavior is that they would be the same.
| null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3007/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3007/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/3227 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3227/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3227/comments | https://api.github.com/repos/ollama/ollama/issues/3227/events | https://github.com/ollama/ollama/issues/3227 | 2,192,660,065 | I_kwDOJ0Z1Ps6CsVZh | 3,227 | ollama/ollama Docker image: committed modifications aren't saved | {
"login": "nicolasduminil",
"id": 1037978,
"node_id": "MDQ6VXNlcjEwMzc5Nzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/1037978?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nicolasduminil",
"html_url": "https://github.com/nicolasduminil",
"followers_url": "https://api.github.com/users/nicolasduminil/followers",
"following_url": "https://api.github.com/users/nicolasduminil/following{/other_user}",
"gists_url": "https://api.github.com/users/nicolasduminil/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nicolasduminil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nicolasduminil/subscriptions",
"organizations_url": "https://api.github.com/users/nicolasduminil/orgs",
"repos_url": "https://api.github.com/users/nicolasduminil/repos",
"events_url": "https://api.github.com/users/nicolasduminil/events{/privacy}",
"received_events_url": "https://api.github.com/users/nicolasduminil/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 2 | 2024-03-18T16:14:49 | 2024-03-19T13:46:15 | 2024-03-19T08:48:59 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
I'm using the Docker image `ollama/ollama:latest`. I'm running the image and, in the newly created container, I'm pulling `llama2`. Once the pull operation finishes, I check that it succeeded using the `ollama list` command.
Now, I commit the modification, tag the newly modified image, and push it to DockerHub.
Pulling it again, running it, and checking for the presence of `llama2` fails, as the result of the `ollama list` command is empty this time.
### What did you expect to see?
I expect the new, augmented image to contain `llama2`.
### Steps to reproduce
$ docker pull ollama/ollama:latest
$ docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
$ docker exec -ti ollama ollama pull llama2
... waiting 10 minutes ...
$ docker exec -ti ollama ollama list
NAME ID SIZE MODIFIED
llama2:latest 78e26419b446 3.8 GB About a minute ago
$ docker commit ollama
$ docker tag <image_id> nicolasduminil/ollama:llama2
$ docker push nicolasduminil/ollama:llama2
$ docker pull nicolasduminil/ollama:llama2
$ docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama nicolasduminil/ollama:llama2
$ docker exec -ti ollama ollama list
NAME ID SIZE MODIFIED
### Are there any recent changes that introduced the issue?
Negative
### OS
Linux
### Architecture
x86
### Platform
_No response_
### Ollama version
_No response_
### GPU
_No response_
### GPU info
_No response_
### CPU
Intel
### Other software
None | {
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3227/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3227/timeline | null | not_planned | false |
https://api.github.com/repos/ollama/ollama/issues/3293 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3293/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3293/comments | https://api.github.com/repos/ollama/ollama/issues/3293/events | https://github.com/ollama/ollama/issues/3293 | 2,202,706,972 | I_kwDOJ0Z1Ps6DSqQc | 3,293 | ollama run in national user name | {
"login": "hgabor47",
"id": 1212585,
"node_id": "MDQ6VXNlcjEyMTI1ODU=",
"avatar_url": "https://avatars.githubusercontent.com/u/1212585?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hgabor47",
"html_url": "https://github.com/hgabor47",
"followers_url": "https://api.github.com/users/hgabor47/followers",
"following_url": "https://api.github.com/users/hgabor47/following{/other_user}",
"gists_url": "https://api.github.com/users/hgabor47/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hgabor47/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hgabor47/subscriptions",
"organizations_url": "https://api.github.com/users/hgabor47/orgs",
"repos_url": "https://api.github.com/users/hgabor47/repos",
"events_url": "https://api.github.com/users/hgabor47/events{/privacy}",
"received_events_url": "https://api.github.com/users/hgabor47/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4 | 2024-03-22T15:01:10 | 2024-05-04T22:03:44 | 2024-05-04T22:03:43 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
![image](https://github.com/ollama/ollama/assets/1212585/5a14f850-213b-471c-90c9-f1f0f8752c31)
My username has international characters like á and ollama does not handle it.
### What did you expect to see?
RUN
### Steps to reproduce
1. Create a Windows user with international characters like: Gábor
2. Start ollama with: ollama run llama2
### Are there any recent changes that introduced the issue?
_No response_
### OS
Windows
### Architecture
amd64
### Platform
_No response_
### Ollama version
_No response_
### GPU
Nvidia
### GPU info
_No response_
### CPU
AMD
### Other software
_No response_ | {
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3293/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3293/timeline | null | completed | false |
https://api.github.com/repos/ollama/ollama/issues/6946 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6946/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6946/comments | https://api.github.com/repos/ollama/ollama/issues/6946/events | https://github.com/ollama/ollama/issues/6946 | 2,546,759,749 | I_kwDOJ0Z1Ps6XzHhF | 6,946 | llama runner process has terminated: exit status 0xc0000005 | {
"login": "viosay",
"id": 16093380,
"node_id": "MDQ6VXNlcjE2MDkzMzgw",
"avatar_url": "https://avatars.githubusercontent.com/u/16093380?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/viosay",
"html_url": "https://github.com/viosay",
"followers_url": "https://api.github.com/users/viosay/followers",
"following_url": "https://api.github.com/users/viosay/following{/other_user}",
"gists_url": "https://api.github.com/users/viosay/gists{/gist_id}",
"starred_url": "https://api.github.com/users/viosay/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/viosay/subscriptions",
"organizations_url": "https://api.github.com/users/viosay/orgs",
"repos_url": "https://api.github.com/users/viosay/repos",
"events_url": "https://api.github.com/users/viosay/events{/privacy}",
"received_events_url": "https://api.github.com/users/viosay/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] | open | false | null | [] | null | 5 | 2024-09-25T02:30:59 | 2024-11-02T17:12:45 | null | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
It's again the https://github.com/ollama/ollama/issues/6011 issue.
**The issue is with embedding call with the model converted using convert_hf_to_gguf.py.**
litellm.llms.ollama.OllamaError: {"error":"llama runner process has terminated: exit status 0xc0000005"}
```
INFO [wmain] system info | n_threads=6 n_threads_batch=6 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 1 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | " tid="18380" timestamp=1727231008 total_threads=12
INFO [wmain] HTTP server listening | hostname="127.0.0.1" n_threads_http="11" port="13505" tid="18380" timestamp=1727231008
llama_model_loader: loaded meta data with 26 key-value pairs and 389 tensors from C:\Users\Administrator\.ollama\models\blobs\sha256-aad91e93e9ec705a527cfa8701698055cf473223437acd029762bb77be6fc92d (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = bert
llama_model_loader: - kv 1: general.type str = model
llama_model_loader: - kv 2: general.name str = Conan_Embedding_V1
llama_model_loader: - kv 3: general.size_label str = 324M
llama_model_loader: - kv 4: general.license str = cc-by-nc-4.0
llama_model_loader: - kv 5: general.tags arr[str,1] = ["mteb"]
llama_model_loader: - kv 6: bert.block_count u32 = 24
llama_model_loader: - kv 7: bert.context_length u32 = 512
llama_model_loader: - kv 8: bert.embedding_length u32 = 1024
llama_model_loader: - kv 9: bert.feed_forward_length u32 = 4096
llama_model_loader: - kv 10: bert.attention.head_count u32 = 16
llama_model_loader: - kv 11: bert.attention.layer_norm_epsilon f32 = 0.000000
llama_model_loader: - kv 12: general.file_type u32 = 1
llama_model_loader: - kv 13: bert.attention.causal bool = false
llama_model_loader: - kv 14: bert.pooling_type u32 = 1
llama_model_loader: - kv 15: tokenizer.ggml.token_type_count u32 = 2
llama_model_loader: - kv 16: tokenizer.ggml.model str = bert
llama_model_loader: - kv 17: tokenizer.ggml.pre str = Conan-embedding-v1
llama_model_loader: - kv 18: tokenizer.ggml.tokens arr[str,21128] = ["[PAD]", "[unused1]", "[unused2]", "...
llama_model_loader: - kv 19: tokenizer.ggml.token_type arr[i32,21128] = [3, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
llama_model_loader: - kv 20: tokenizer.ggml.unknown_token_id u32 = 100
llama_model_loader: - kv 21: tokenizer.ggml.seperator_token_id u32 = 102
llama_model_loader: - kv 22: tokenizer.ggml.padding_token_id u32 = 0
llama_model_loader: - kv 23: tokenizer.ggml.cls_token_id u32 = 101
llama_model_loader: - kv 24: tokenizer.ggml.mask_token_id u32 = 103
llama_model_loader: - kv 25: general.quantization_version u32 = 2
llama_model_loader: - type f32: 244 tensors
llama_model_loader: - type f16: 145 tensors
llm_load_vocab: special tokens cache size = 5
llm_load_vocab: token to piece cache size = 0.0769 MB
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = bert
llm_load_print_meta: vocab type = WPM
llm_load_print_meta: n_vocab = 21128
llm_load_print_meta: n_merges = 0
llm_load_print_meta: vocab_only = 0
llm_load_print_meta: n_ctx_train = 512
llm_load_print_meta: n_embd = 1024
llm_load_print_meta: n_layer = 24
llm_load_print_meta: n_head = 16
llm_load_print_meta: n_head_kv = 16
llm_load_print_meta: n_rot = 64
llm_load_print_meta: n_swa = 0
llm_load_print_meta: n_embd_head_k = 64
llm_load_print_meta: n_embd_head_v = 64
llm_load_print_meta: n_gqa = 1
llm_load_print_meta: n_embd_k_gqa = 1024
llm_load_print_meta: n_embd_v_gqa = 1024
llm_load_print_meta: f_norm_eps = 1.0e-12
llm_load_print_meta: f_norm_rms_eps = 0.0e+00
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 4096
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: causal attn = 0
llm_load_print_meta: pooling type = 1
llm_load_print_meta: rope type = 2
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 10000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_ctx_orig_yarn = 512
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: ssm_dt_b_c_rms = 0
llm_load_print_meta: model type = 335M
llm_load_print_meta: model ftype = F16
llm_load_print_meta: model params = 324.47 M
llm_load_print_meta: model size = 620.50 MiB (16.04 BPW)
llm_load_print_meta: general.name = Conan_Embedding_V1
llm_load_print_meta: UNK token = 100 '[UNK]'
llm_load_print_meta: SEP token = 102 '[SEP]'
llm_load_print_meta: PAD token = 0 '[PAD]'
llm_load_print_meta: CLS token = 101 '[CLS]'
llm_load_print_meta: MASK token = 103 '[MASK]'
llm_load_print_meta: LF token = 0 '[PAD]'
llm_load_print_meta: max token length = 48
llm_load_tensors: ggml ctx size = 0.16 MiB
llm_load_tensors: CPU buffer size = 620.50 MiB
time=2024-09-25T10:23:28.796+08:00 level=INFO source=server.go:621 msg="waiting for server to become available" status="llm server loading model"
llama_new_context_with_model: n_ctx = 2048
llama_new_context_with_model: n_batch = 512
llama_new_context_with_model: n_ubatch = 512
llama_new_context_with_model: flash_attn = 0
llama_new_context_with_model: freq_base = 10000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CPU KV buffer size = 192.00 MiB
llama_new_context_with_model: KV self size = 192.00 MiB, K (f16): 96.00 MiB, V (f16): 96.00 MiB
llama_new_context_with_model: CPU output buffer size = 0.00 MiB
llama_new_context_with_model: CPU compute buffer size = 26.00 MiB
llama_new_context_with_model: graph nodes = 851
llama_new_context_with_model: graph splits = 1
time=2024-09-25T10:23:30.338+08:00 level=INFO source=server.go:621 msg="waiting for server to become available" status="llm server not responding"
time=2024-09-25T10:23:31.963+08:00 level=INFO source=server.go:621 msg="waiting for server to become available" status="llm server error"
time=2024-09-25T10:23:32.226+08:00 level=ERROR source=sched.go:455 msg="error loading llama server" error="llama runner process has terminated: exit status 0xc0000005"
[GIN] 2024/09/25 - 10:23:32 | 500 | 3.7323168s | 127.0.0.1 | POST "/api/embed"
```
### OS
Windows
### GPU
_No response_
### CPU
Intel
### Ollama version
0.3.11 0.3.12 | null | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6946/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6946/timeline | null | null | false |
https://api.github.com/repos/ollama/ollama/issues/3955 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/3955/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/3955/comments | https://api.github.com/repos/ollama/ollama/issues/3955/events | https://github.com/ollama/ollama/pull/3955 | 2,266,430,367 | PR_kwDOJ0Z1Ps5t4K3P | 3,955 | return code `499` when user cancels request while a model is loading | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 0 | 2024-04-26T20:03:32 | 2024-04-26T21:38:30 | 2024-04-26T21:38:29 | MEMBER | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/3955",
"html_url": "https://github.com/ollama/ollama/pull/3955",
"diff_url": "https://github.com/ollama/ollama/pull/3955.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3955.patch",
"merged_at": "2024-04-26T21:38:29"
} | null | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/3955/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/3955/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/5963 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/5963/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/5963/comments | https://api.github.com/repos/ollama/ollama/issues/5963/events | https://github.com/ollama/ollama/pull/5963 | 2,431,037,605 | PR_kwDOJ0Z1Ps52hIpP | 5,963 | Revert "llm(llama): pass rope factors" | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [] | closed | false | null | [] | null | 0 | 2024-07-25T21:53:31 | 2024-07-25T22:24:57 | 2024-07-25T22:24:55 | MEMBER | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | false | {
"url": "https://api.github.com/repos/ollama/ollama/pulls/5963",
"html_url": "https://github.com/ollama/ollama/pull/5963",
"diff_url": "https://github.com/ollama/ollama/pull/5963.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5963.patch",
"merged_at": "2024-07-25T22:24:55"
} | Reverts ollama/ollama#5924 | {
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/5963/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/5963/timeline | null | null | true |
https://api.github.com/repos/ollama/ollama/issues/6172 | https://api.github.com/repos/ollama/ollama | https://api.github.com/repos/ollama/ollama/issues/6172/labels{/name} | https://api.github.com/repos/ollama/ollama/issues/6172/comments | https://api.github.com/repos/ollama/ollama/issues/6172/events | https://github.com/ollama/ollama/issues/6172 | 2,447,790,062 | I_kwDOJ0Z1Ps6R5k_u | 6,172 | .git file is missing | {
"login": "Haritha-Maturi",
"id": 100990846,
"node_id": "U_kgDOBgT_fg",
"avatar_url": "https://avatars.githubusercontent.com/u/100990846?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Haritha-Maturi",
"html_url": "https://github.com/Haritha-Maturi",
"followers_url": "https://api.github.com/users/Haritha-Maturi/followers",
"following_url": "https://api.github.com/users/Haritha-Maturi/following{/other_user}",
"gists_url": "https://api.github.com/users/Haritha-Maturi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Haritha-Maturi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Haritha-Maturi/subscriptions",
"organizations_url": "https://api.github.com/users/Haritha-Maturi/orgs",
"repos_url": "https://api.github.com/users/Haritha-Maturi/repos",
"events_url": "https://api.github.com/users/Haritha-Maturi/events{/privacy}",
"received_events_url": "https://api.github.com/users/Haritha-Maturi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | [
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] | closed | false | null | [] | null | 0 | 2024-08-05T07:12:11 | 2024-08-05T08:14:46 | 2024-08-05T08:14:45 | NONE | {
"total": 0,
"completed": 0,
"percent_completed": 0
} | null | null | null | ### What is the issue?
As per the Dockerfile present in the repo, there needs to be a file named .git in the repo, but it is missing.
![image](https://github.com/user-attachments/assets/731a02ea-df73-4b4c-873f-f40a110ac7f3)
### OS
Linux
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_ | {
"login": "Haritha-Maturi",
"id": 100990846,
"node_id": "U_kgDOBgT_fg",
"avatar_url": "https://avatars.githubusercontent.com/u/100990846?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Haritha-Maturi",
"html_url": "https://github.com/Haritha-Maturi",
"followers_url": "https://api.github.com/users/Haritha-Maturi/followers",
"following_url": "https://api.github.com/users/Haritha-Maturi/following{/other_user}",
"gists_url": "https://api.github.com/users/Haritha-Maturi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Haritha-Maturi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Haritha-Maturi/subscriptions",
"organizations_url": "https://api.github.com/users/Haritha-Maturi/orgs",
"repos_url": "https://api.github.com/users/Haritha-Maturi/repos",
"events_url": "https://api.github.com/users/Haritha-Maturi/events{/privacy}",
"received_events_url": "https://api.github.com/users/Haritha-Maturi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
} | {
"url": "https://api.github.com/repos/ollama/ollama/issues/6172/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/ollama/ollama/issues/6172/timeline | null | completed | false |