| url stringlengths 51-54 | repository_url stringclasses 1 value | labels_url stringlengths 65-68 | comments_url stringlengths 60-63 | events_url stringlengths 58-61 | html_url stringlengths 39-44 | id int64 1.78B-2.82B | node_id stringlengths 18-19 | number int64 1-8.69k | title stringlengths 1-382 | user dict | labels listlengths 0-5 | state stringclasses 2 values | locked bool 1 class | assignee dict | assignees listlengths 0-2 | milestone null | comments int64 0-323 | created_at timestamp[s] | updated_at timestamp[s] | closed_at timestamp[s] | author_association stringclasses 4 values | sub_issues_summary dict | active_lock_reason null | draft bool 2 classes | pull_request dict | body stringlengths 2-118k ⌀ | closed_by dict | reactions dict | timeline_url stringlengths 60-63 | performed_via_github_app null | state_reason stringclasses 4 values | is_pull_request bool 2 classes |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/2285
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2285/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2285/comments
|
https://api.github.com/repos/ollama/ollama/issues/2285/events
|
https://github.com/ollama/ollama/issues/2285
| 2,109,201,868
|
I_kwDOJ0Z1Ps59t93M
| 2,285
|
EOF Error When Running A Model
|
{
"login": "meminens",
"id": 42714627,
"node_id": "MDQ6VXNlcjQyNzE0NjI3",
"avatar_url": "https://avatars.githubusercontent.com/u/42714627?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/meminens",
"html_url": "https://github.com/meminens",
"followers_url": "https://api.github.com/users/meminens/followers",
"following_url": "https://api.github.com/users/meminens/following{/other_user}",
"gists_url": "https://api.github.com/users/meminens/gists{/gist_id}",
"starred_url": "https://api.github.com/users/meminens/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/meminens/subscriptions",
"organizations_url": "https://api.github.com/users/meminens/orgs",
"repos_url": "https://api.github.com/users/meminens/repos",
"events_url": "https://api.github.com/users/meminens/events{/privacy}",
"received_events_url": "https://api.github.com/users/meminens/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2024-01-31T03:18:09
| 2024-03-06T21:45:34
| 2024-01-31T18:47:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Running the command `ollama run mistral` results in the error `Error: Post "http://127.0.0.1:11434/api/chat": EOF`
Output of `journalctl -u ollama`:
```
Jan 30 22:13:35 arch ollama[14727]: 2024/01/30 22:13:35 cpu_common.go:11: INFO CPU has AVX2
Jan 30 22:13:35 arch ollama[14727]: 2024/01/30 22:13:35 dyn_ext_server.go:90: INFO Loading Dynamic llm server: /tmp/ollama519289987/rocm_v5/libext_server.so
Jan 30 22:13:35 arch ollama[14727]: 2024/01/30 22:13:35 dyn_ext_server.go:145: INFO Initializing llama server
Jan 30 22:13:35 arch ollama[14727]: free(): invalid pointer
Jan 30 22:13:35 arch systemd[1]: ollama.service: Main process exited, code=dumped, status=6/ABRT
Jan 30 22:13:35 arch systemd[1]: ollama.service: Failed with result 'core-dump'.
Jan 30 22:13:35 arch systemd[1]: ollama.service: Consumed 17.709s CPU time.
Jan 30 22:13:38 arch systemd[1]: ollama.service: Scheduled restart job, restart counter is at 1.
Jan 30 22:13:38 arch systemd[1]: Started Ollama Service.
Jan 30 22:13:38 arch ollama[14973]: 2024/01/30 22:13:38 images.go:857: INFO total blobs: 5
Jan 30 22:13:38 arch ollama[14973]: 2024/01/30 22:13:38 images.go:864: INFO total unused blobs removed: 0
Jan 30 22:13:38 arch ollama[14973]: 2024/01/30 22:13:38 routes.go:950: INFO Listening on 127.0.0.1:11434 (version 0.1.22)
Jan 30 22:13:38 arch ollama[14973]: 2024/01/30 22:13:38 payload_common.go:106: INFO Extracting dynamic libraries...
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 payload_common.go:145: INFO Dynamic LLM libraries [cpu_avx rocm_v6 cpu cuda_v11 cpu_avx2 rocm_v5]
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:94: INFO Detecting GPU type
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:236: INFO Searching for GPU management library libnvidia-ml.so
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:282: INFO Discovered GPU libraries: []
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:236: INFO Searching for GPU management library librocm_smi64.so
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:282: INFO Discovered GPU libraries: [/opt/rocm/lib/librocm_smi64.so.5.0]
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:109: INFO Radeon GPU detected
```
System info:
```
misaligar@arch
---------
OS: Arch Linux x86_64
Host: B650 AORUS ELITE AX
Kernel: 6.7.2-arch1-1
Uptime: 28 mins
Packages: 1073 (pacman), 7 (flatpak)
Shell: bash 5.2.26
Resolution: 2560x1440
DE: Plasma 5.27.10
WM: kwin
Theme: [Plasma], Breeze [GTK2/3]
Icons: kora [Plasma], kora [GTK2/3]
Terminal: konsole
Terminal Font: Hack Nerd Font Mono 10
CPU: AMD Ryzen 9 7900X (24) @ 5.733GHz
GPU: AMD ATI Radeon RX 7900 XT/7900 XTX
Memory: 8687MiB / 63942MiB
```
I have installed ollama manually as per the instructions here: https://github.com/ollama/ollama/blob/main/docs/linux.md
This error started after I disabled the integrated GPU in the BIOS. If I keep it enabled, there are no error messages; however, ollama then does not use the discrete GPU (the 7900 XTX), even though all the required ROCm packages are installed.
Thanks!
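For context, a workaround commonly suggested for RDNA3 cards at the time was to force the ROCm compile target via a systemd drop-in. The drop-in path, the version value, and the device index below are illustrative assumptions, not taken from this issue:

```
# /etc/systemd/system/ollama.service.d/override.conf (hypothetical drop-in path)
[Service]
# Force ROCm to treat the RDNA3 card as gfx11.0.0 (assumed value for a 7900 XT/XTX)
Environment="HSA_OVERRIDE_GFX_VERSION=11.0.0"
# Optionally pin ollama to the discrete GPU if an iGPU is enumerated first (assumed index)
Environment="ROCR_VISIBLE_DEVICES=0"
```

Followed by `systemctl daemon-reload` and `systemctl restart ollama`.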
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2285/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/2285/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2497
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2497/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2497/comments
|
https://api.github.com/repos/ollama/ollama/issues/2497/events
|
https://github.com/ollama/ollama/issues/2497
| 2,134,930,064
|
I_kwDOJ0Z1Ps5_QHKQ
| 2,497
|
[Linux] Ran out of space while installing llama2 model, can't delete or find
|
{
"login": "saamerm",
"id": 8262287,
"node_id": "MDQ6VXNlcjgyNjIyODc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8262287?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saamerm",
"html_url": "https://github.com/saamerm",
"followers_url": "https://api.github.com/users/saamerm/followers",
"following_url": "https://api.github.com/users/saamerm/following{/other_user}",
"gists_url": "https://api.github.com/users/saamerm/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saamerm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saamerm/subscriptions",
"organizations_url": "https://api.github.com/users/saamerm/orgs",
"repos_url": "https://api.github.com/users/saamerm/repos",
"events_url": "https://api.github.com/users/saamerm/events{/privacy}",
"received_events_url": "https://api.github.com/users/saamerm/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 15
| 2024-02-14T18:28:19
| 2024-05-09T00:44:52
| 2024-05-09T00:44:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I installed ollama on my Linux EC2 machine, which has 8GB of disk space with 4GB free.
I ran `ollama run llama2` by mistake before checking the space; the download started too quickly for me to react, and it failed with the error "no space left on device".
Now I can't delete or find the model, and `ollama rm llama2` doesn't help. Where can I find the partially downloaded model so I can delete it?
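For anyone hitting this, a minimal sketch of locating leftover partial downloads. The models path is the assumed default for a Linux service install (yours may differ), and matching on "partial" in the filename is an assumption about how interrupted pulls are stored:

```go
package main

import (
	"fmt"
	"io/fs"
	"path/filepath"
	"strings"
)

// listPartialBlobs walks a blobs directory and returns the paths of
// partially downloaded blob files (names containing "partial").
func listPartialBlobs(dir string) []string {
	var found []string
	filepath.WalkDir(dir, func(path string, d fs.DirEntry, err error) error {
		if err != nil {
			return nil // skip missing or unreadable directories
		}
		if !d.IsDir() && strings.Contains(d.Name(), "partial") {
			found = append(found, path)
		}
		return nil
	})
	return found
}

func main() {
	// Assumed default for a Linux service install; adjust to your setup.
	dir := "/usr/share/ollama/.ollama/models/blobs"
	for _, p := range listPartialBlobs(dir) {
		fmt.Println(p) // candidates to delete by hand to reclaim space
	}
}
```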
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2497/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2497/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/879
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/879/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/879/comments
|
https://api.github.com/repos/ollama/ollama/issues/879/events
|
https://github.com/ollama/ollama/issues/879
| 1,957,430,737
|
I_kwDOJ0Z1Ps50rAXR
| 879
|
Support image inputs
|
{
"login": "tmc",
"id": 3977,
"node_id": "MDQ6VXNlcjM5Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3977?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tmc",
"html_url": "https://github.com/tmc",
"followers_url": "https://api.github.com/users/tmc/followers",
"following_url": "https://api.github.com/users/tmc/following{/other_user}",
"gists_url": "https://api.github.com/users/tmc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tmc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tmc/subscriptions",
"organizations_url": "https://api.github.com/users/tmc/orgs",
"repos_url": "https://api.github.com/users/tmc/repos",
"events_url": "https://api.github.com/users/tmc/events{/privacy}",
"received_events_url": "https://api.github.com/users/tmc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-10-23T15:22:05
| 2023-10-24T23:10:42
| 2023-10-24T23:10:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
With llama.cpp gaining multi-modality and llava support, it'd be nice to enable image inputs to compatible models in Ollama.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/879/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/879/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/69
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/69/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/69/comments
|
https://api.github.com/repos/ollama/ollama/issues/69/events
|
https://github.com/ollama/ollama/pull/69
| 1,799,737,310
|
PR_kwDOJ0Z1Ps5VPrN-
| 69
|
Use embeddings to give the chat client session memory
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-07-11T20:53:44
| 2023-09-08T15:13:20
| 2023-07-20T18:04:37
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/69",
"html_url": "https://github.com/ollama/ollama/pull/69",
"diff_url": "https://github.com/ollama/ollama/pull/69.diff",
"patch_url": "https://github.com/ollama/ollama/pull/69.patch",
"merged_at": null
}
|
Store previous questions and answers in the client during a chat session, and use embeddings to look up what is relevant to the current context.
This is an initial implementation; we will need to iterate on this experience through more dynamic prompts and possibly by weighting recency in the conversation.
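The lookup described above can be sketched roughly like this; the stored turns and embedding vectors are toy values for illustration, not the PR's actual code:

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// turn is a hypothetical record of one prior Q&A exchange with its embedding.
type turn struct {
	embedding []float64
	text      string
}

// cosine computes cosine similarity between two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	if na == 0 || nb == 0 {
		return 0
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// recall returns the k stored turns most similar to the query embedding.
func recall(memory []turn, query []float64, k int) []string {
	sort.Slice(memory, func(i, j int) bool {
		return cosine(memory[i].embedding, query) > cosine(memory[j].embedding, query)
	})
	var out []string
	for i := 0; i < k && i < len(memory); i++ {
		out = append(out, memory[i].text)
	}
	return out
}

func main() {
	memory := []turn{
		{[]float64{1, 0}, "Q: capital of France? A: Paris"},
		{[]float64{0, 1}, "Q: 2+2? A: 4"},
	}
	fmt.Println(recall(memory, []float64{0.9, 0.1}, 1)) // most relevant prior turn first
}
```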
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/69/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/69/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4715
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4715/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4715/comments
|
https://api.github.com/repos/ollama/ollama/issues/4715/events
|
https://github.com/ollama/ollama/pull/4715
| 2,324,695,303
|
PR_kwDOJ0Z1Ps5w9Wm4
| 4,715
|
proper utf16 support
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-30T04:50:54
| 2024-06-10T18:41:29
| 2024-06-10T18:41:29
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4715",
"html_url": "https://github.com/ollama/ollama/pull/4715",
"diff_url": "https://github.com/ollama/ollama/pull/4715.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4715.patch",
"merged_at": "2024-06-10T18:41:29"
}
|
Instead of relying on unreadable runes, which can appear for other reasons, check the header and adjust the scanner and decoder accordingly.
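A rough sketch of header-based detection, under the assumption that "header" means the UTF-16 byte-order mark; this is illustrative, not the PR's actual implementation:

```go
package main

import (
	"bytes"
	"fmt"
	"unicode/utf16"
)

// decodeMaybeUTF16 checks for a UTF-16 BOM at the start of the input and
// decodes accordingly, rather than guessing from unreadable runes.
func decodeMaybeUTF16(b []byte) string {
	switch {
	case bytes.HasPrefix(b, []byte{0xFF, 0xFE}): // UTF-16 little-endian BOM
		return decodeUTF16(b[2:], false)
	case bytes.HasPrefix(b, []byte{0xFE, 0xFF}): // UTF-16 big-endian BOM
		return decodeUTF16(b[2:], true)
	default: // no BOM: assume UTF-8
		return string(b)
	}
}

// decodeUTF16 converts raw bytes to a string using the given endianness.
func decodeUTF16(b []byte, bigEndian bool) string {
	u := make([]uint16, 0, len(b)/2)
	for i := 0; i+1 < len(b); i += 2 {
		if bigEndian {
			u = append(u, uint16(b[i])<<8|uint16(b[i+1]))
		} else {
			u = append(u, uint16(b[i+1])<<8|uint16(b[i]))
		}
	}
	return string(utf16.Decode(u))
}

func main() {
	le := []byte{0xFF, 0xFE, 'h', 0, 'i', 0} // "hi" in UTF-16LE with BOM
	fmt.Println(decodeMaybeUTF16(le))        // prints "hi"
}
```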
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4715/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4715/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1006
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1006/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1006/comments
|
https://api.github.com/repos/ollama/ollama/issues/1006/events
|
https://github.com/ollama/ollama/issues/1006
| 1,977,742,083
|
I_kwDOJ0Z1Ps514fMD
| 1,006
|
Mobile support
|
{
"login": "mikestaub",
"id": 1254558,
"node_id": "MDQ6VXNlcjEyNTQ1NTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/1254558?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mikestaub",
"html_url": "https://github.com/mikestaub",
"followers_url": "https://api.github.com/users/mikestaub/followers",
"following_url": "https://api.github.com/users/mikestaub/following{/other_user}",
"gists_url": "https://api.github.com/users/mikestaub/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mikestaub/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mikestaub/subscriptions",
"organizations_url": "https://api.github.com/users/mikestaub/orgs",
"repos_url": "https://api.github.com/users/mikestaub/repos",
"events_url": "https://api.github.com/users/mikestaub/events{/privacy}",
"received_events_url": "https://api.github.com/users/mikestaub/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 18
| 2023-11-05T11:17:35
| 2024-10-25T04:49:51
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is there a plan to deploy this to iOS or Android so users can run models locally on their mobile devices?
What would it take to achieve this?
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1006/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1006/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/984
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/984/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/984/comments
|
https://api.github.com/repos/ollama/ollama/issues/984/events
|
https://github.com/ollama/ollama/pull/984
| 1,975,328,306
|
PR_kwDOJ0Z1Ps5efar6
| 984
|
Remove grammar mistake: duplicate "install" in GPU support warning message
|
{
"login": "noahgitsham",
"id": 73707948,
"node_id": "MDQ6VXNlcjczNzA3OTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/73707948?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/noahgitsham",
"html_url": "https://github.com/noahgitsham",
"followers_url": "https://api.github.com/users/noahgitsham/followers",
"following_url": "https://api.github.com/users/noahgitsham/following{/other_user}",
"gists_url": "https://api.github.com/users/noahgitsham/gists{/gist_id}",
"starred_url": "https://api.github.com/users/noahgitsham/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/noahgitsham/subscriptions",
"organizations_url": "https://api.github.com/users/noahgitsham/orgs",
"repos_url": "https://api.github.com/users/noahgitsham/repos",
"events_url": "https://api.github.com/users/noahgitsham/events{/privacy}",
"received_events_url": "https://api.github.com/users/noahgitsham/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-03T01:50:32
| 2023-11-03T07:45:14
| 2023-11-03T07:45:14
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/984",
"html_url": "https://github.com/ollama/ollama/pull/984",
"diff_url": "https://github.com/ollama/ollama/pull/984.diff",
"patch_url": "https://github.com/ollama/ollama/pull/984.patch",
"merged_at": "2023-11-03T07:45:14"
}
|
Just realised another grammar mistake in the exact same error I just "fixed" 😸
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/984/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/984/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/401
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/401/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/401/comments
|
https://api.github.com/repos/ollama/ollama/issues/401/events
|
https://github.com/ollama/ollama/pull/401
| 1,862,370,219
|
PR_kwDOJ0Z1Ps5YjCK8
| 401
|
subprocess llama.cpp server
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-08-23T00:26:33
| 2023-08-30T20:35:05
| 2023-08-30T20:35:03
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/401",
"html_url": "https://github.com/ollama/ollama/pull/401",
"diff_url": "https://github.com/ollama/ollama/pull/401.diff",
"patch_url": "https://github.com/ollama/ollama/pull/401.patch",
"merged_at": "2023-08-30T20:35:03"
}
|
This is a pretty big change that moves llama.cpp from a library within cgo to an external process that we manage.
Why?
- This makes building for multiple platforms easier (no more Windows cgo incompatibilities)
- We can fallback to non-gpu runners when needed
- ~200ms faster on average in my tests
- Way less code in our repo
- Maybe easier to manage our build matrix
Minor Breaking Changes
- The generate response no longer includes sample count or sample duration; these metrics are not returned by the llama.cpp server.
- Only one LoRA adapter is supported at a time. The llama.cpp server isn't built for this at the moment. Allowing multiple seems like it would be a pretty simple PR to open with llama.cpp.
Features
- Use the existing loading logic to manage a llama.cpp server
- Package in llama.cpp CPU and GPU runtimes in the Go binary
- Removes vendored llama.cpp code
- No more cgo
There are a lot of changes in this PR; here are the files to look at:
- llm/llama.go
- llm/llama_generate.go
- llm/llama_generate_darwin.go
- api/types.go
- app/src/index.ts
- server/routes.go
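The runner-management approach described above can be sketched like this; the binary name, `--port` flag, and `/health` endpoint are assumptions for illustration, not the PR's actual code:

```go
package main

import (
	"fmt"
	"net/http"
	"os/exec"
	"time"
)

// startRunner launches a llama.cpp server binary as a managed subprocess
// and polls it over HTTP until it is ready to serve requests.
func startRunner(bin string, port int) (*exec.Cmd, error) {
	cmd := exec.Command(bin, "--port", fmt.Sprint(port))
	if err := cmd.Start(); err != nil {
		return nil, err
	}
	url := fmt.Sprintf("http://127.0.0.1:%d/health", port)
	for i := 0; i < 50; i++ { // wait up to ~5s for readiness
		if resp, err := http.Get(url); err == nil {
			resp.Body.Close()
			return cmd, nil // server is answering; hand back the managed process
		}
		time.Sleep(100 * time.Millisecond)
	}
	cmd.Process.Kill() // never became ready; clean up the subprocess
	return nil, fmt.Errorf("runner did not become ready")
}
```

Managing the server as a subprocess like this also makes the fallback path simple: if a GPU runner fails to start, the same function can be retried with a CPU build of the binary.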
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/401/reactions",
"total_count": 3,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/401/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8558
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8558/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8558/comments
|
https://api.github.com/repos/ollama/ollama/issues/8558/events
|
https://github.com/ollama/ollama/pull/8558
| 2,808,419,398
|
PR_kwDOJ0Z1Ps6I1_OP
| 8,558
|
Fix build for loongarch64, go arch is not same with uname -m
|
{
"login": "ideal",
"id": 261698,
"node_id": "MDQ6VXNlcjI2MTY5OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/261698?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ideal",
"html_url": "https://github.com/ideal",
"followers_url": "https://api.github.com/users/ideal/followers",
"following_url": "https://api.github.com/users/ideal/following{/other_user}",
"gists_url": "https://api.github.com/users/ideal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ideal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ideal/subscriptions",
"organizations_url": "https://api.github.com/users/ideal/orgs",
"repos_url": "https://api.github.com/users/ideal/repos",
"events_url": "https://api.github.com/users/ideal/events{/privacy}",
"received_events_url": "https://api.github.com/users/ideal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2025-01-24T03:36:14
| 2025-01-27T17:32:29
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8558",
"html_url": "https://github.com/ollama/ollama/pull/8558",
"diff_url": "https://github.com/ollama/ollama/pull/8558.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8558.patch",
"merged_at": null
}
|
`go tool dist list` reports linux/loong64, but the result of `uname -m` is loongarch64.
More info: https://areweloongyet.com/docs/loong-or-loongarch/
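The mismatch can be handled with a small mapping from `uname -m` machine names to Go's GOARCH values, which do not always coincide (e.g. loongarch64 vs loong64). A hedged sketch; the function name and the entry set are illustrative, not the repository's actual build code:

```go
package main

import "fmt"

// goarchFromUname maps `uname -m` machine names to Go's GOARCH values.
// Only a small illustrative subset of mappings is listed; unknown names
// pass through unchanged.
func goarchFromUname(machine string) string {
	switch machine {
	case "x86_64":
		return "amd64"
	case "aarch64":
		return "arm64"
	case "loongarch64":
		return "loong64"
	default:
		return machine
	}
}

func main() {
	fmt.Println(goarchFromUname("loongarch64")) // loong64
}
```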
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8558/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8558/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6451
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6451/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6451/comments
|
https://api.github.com/repos/ollama/ollama/issues/6451/events
|
https://github.com/ollama/ollama/issues/6451
| 2,476,655,008
|
I_kwDOJ0Z1Ps6TnsGg
| 6,451
|
cannot unmarshal array into Go struct field ChatRequest.messages of type string
|
{
"login": "McCannDahl",
"id": 19883817,
"node_id": "MDQ6VXNlcjE5ODgzODE3",
"avatar_url": "https://avatars.githubusercontent.com/u/19883817?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/McCannDahl",
"html_url": "https://github.com/McCannDahl",
"followers_url": "https://api.github.com/users/McCannDahl/followers",
"following_url": "https://api.github.com/users/McCannDahl/following{/other_user}",
"gists_url": "https://api.github.com/users/McCannDahl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/McCannDahl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/McCannDahl/subscriptions",
"organizations_url": "https://api.github.com/users/McCannDahl/orgs",
"repos_url": "https://api.github.com/users/McCannDahl/repos",
"events_url": "https://api.github.com/users/McCannDahl/events{/privacy}",
"received_events_url": "https://api.github.com/users/McCannDahl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-20T22:43:20
| 2024-08-21T13:26:25
| 2024-08-21T13:26:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I believe support for images via the OpenAI API format has already been added in https://github.com/ollama/ollama/pull/5208, but when I make this REST call, I get the following error.
Curl Command
```
curl -X POST -H "Content-Type: application/json" "http://localhost:11434/api/chat" -d '{"model": "llava", "messages": [{"role": "system", "content": "You are a helpful AI assistant"}, {"role": "user", "content": "what up?"}, {"role": "assistant", "content": " Hello! How can I assist you today? "}, {"role": "user", "content": [{"type": "text", "text": "what is this?"}, {"type": "image_url", "image_url": {"url": "data:image/jpeg;base64,iVBORw0KGgoAAAANSUhEUgAAAUIAAACfCAYAAACMetDZAAAABGdBTUEAALGPC/xhBQAAAAFzUkdCAK7OHOkAAAAgY0hSTQAAeiYAAICEAAD6AAAAgOgAAHUwAADqYAAAOpgAABdwnLpRPAAAAAZiS0dEAP8A/wD/oL2nkwAAAAlvRkZzAAACkAAABlgAJjwaSgAAAAlwSFlzAAAhOAAAITgBRZYxYAAAAAl2cEFnAAAGYQAACLQAOxBuyAAAandJREFUeNrtvV2TIzmOpQ3XtyIis6rLumemzeb//7J996LNqqsyI/Qtub8XEYd6eBz0yJrdq03RLC2klOROJ0EQODgAu2EYhoiIvu+DbRiGuN1u8fFxzGaz8lrvu64r39Xv9R19pubv2WazWagPwzBU3+X1vA96zf/zZ+Bvs/6x713XlX+8Bu/D8YiIuN1u5bvz+Xx0n6w/U2PAfrVaNgZ930ff9+X38/m8/L/fQ3PLfvOZ+76PxWJRvedvdc/5fF7JiMuF35/9nc/n5bP5fD66hs+vrtl1XZETPqeeh3Otvuk7Pr7s5+12q/ru48axVR9acur95Vj4e5fp1vz7vTgGemb1mdeUPLuM87oaL39uPYteXy6XakwoC95Xf6bP3vN1S3dwrfq4u+xxrIdhiNlsNlqzlPHF+XwuH14ul9D7w+EQh8Mhrtfr/csfP3RBvVwuZZDUASqM+XxehJGLzpVK1lrKju/fB2cI/Zfr3Nlsni4G/ZZKob7P+zWp5LkAXaEsFoumIPvE65kXi0UsFotKEVIYvK/n87nMEe9/vV6rvuj76vv1eq361lLgERGbzSaWy2W5P5UJrzmbzUb39IXFxu+u1+tYr9cREfH09FT1oeu6uN1ulVD3Q70AZt2s9L3v+zidTpCFe+PYuiLqZhCUISK6j78RlYLXPfRbKh2/pxacb0gadyoljYfeU7np/hwzzh9l0cd8vrjfc9bV/dF48P1qtSp953W5Ni6XS5xOp6IP+r6P6/Va3g/DUOmK2+2W9v0u7/exXy5XRbd0XVfpGZ/Py+VSrsvn4O/KOHyMfTfrYjFfVPfgvMxms2hroEd7tEd7tJ+kLaQlz+dz/PuPf8f//v/+d0RE/K//9b/i999/j8PhEBHvO8Dz83P5IXfh19fXeHt7i4iI0+lUWRCr1Sq2221st9uIeLc2qPFp/bhlol0n4m7xcPfUzsD/j7hbVervfD6vdgP9nteNeN9taH5rV8sswmxXprXL3V3WGMdks9lExLs19PT0VKyjL1++VGNCq+BwOMSff/4Z379/j4iI3W5X5uBwOMTxeIzT+f39+XSOy+VSxm+1Wo0sCo31crmM5XJZPl8ul8VKWCwWMZ/PRxaiXrtr7NYsX99utzIGf//73+M///M/IyLiv/7rv+Ll5WXkcdCSoyXifb9cLvH6+lrGQRbzfD6Pl5eX0vfKavqQNXdpdZ/b7VZeZ+4r38utpqzpnhx3joPWyuVyKWvscDiU+Twej3G9XkeQE2XC4QDJ0Hq9LmO5Wq1iuVyW99vttpK/zWYTT09PZSy5Ni7XSx
wPx4iIeHt7i/1+X8b527dv8e3bt6rv+/0+IiL2+32cz+fiJXK+NptNPD8/l77++uuv8eXLl/J+sViU/qzX62qNHY/H2O12lUyX51yvYr1aV94qx302mxUd9PLyEr/99lv89ttv5bkXdEeul2t8+/YtIiJ+//33+PbtW+W+StnJjdak7Xa7Mgin8yn6W18GWq6cvsvOa1GogxIiugd66OPxGOfzuRJUdy10XS1eujZUhFQudOs1gL4YeJ8pzNQVIRXqbDarBM0xPF3rcDiMrqOF/fb2Fv/+97/j999/L989Ho+p8Om1rr/f70c4k7tLkgUK5mazifV6XeaT7r/DABLaDJd07Ozp6SkcllHfb7dbcaHUCL3QTT2dTnE+n8sCvV6vZUz0fV2HCn61Wo3gC441lZDcWW68fH5dm+NDPMoxcD2L+q7+ns/nkaxRhviabqk2HG4Wmj/NK+Wf7znOkunyXEOMXGH1dbfbxevra1GEx+NxtBn5+ox41w98Lj0/FeGXL18iIuL5+blyj2+3W+z3+8rl5ubGcef8ae7oUr+8vFRrecEdse/76kGPx2M1wRSY0+lUlJ92r4h3TGK+HCseKhuC5LPZrAif7sXv0SJ0LJILMMOlWqA9FaHjMvo8G1z+//V6HX2POBwH3oMRVMz6zPuqa3MTkcWQKXFZfxRcLl7ewwMaPnYUNi1AKrSCvXy8JqZEDMcxOd5L1jfnhFa6g/9UJroP70nrjQGU4/E46q/u55ZapsB1nev1WuRU95a8cn6Fh+s+3DAlz47rEv+k16ANQv11K5SKkHitB1Iob9oM3PrmPYjnz+e1dc4152sqCyq6vA39UFnBy+UyDodDuZa8k4i7ktZvteYox2Wcu7t8+Lirr5w/x7YfGOGjPdqj/fStmHnSstqBjsdj3G634hLJpYiIESUhIipMRP90XWpujw66hRFR71K6ru4p93y5XJbfKfLFHdqtHt+VK7PYrLMMo9Jfd9sz/FDfpTW2Wq0KZvn169d4eXmJiHfzf7vdVq7K7XYrVqBwJM0Nx365XFa73mKxKC6GrEP1QfPI8XAslFYEXUju0pvNpsIP6Sk488Cx2/P5XLnjuuZ6vS64kJ5Z8IbmwSPVBcf6cP+Jial/5/O5giFoqUnuHPvUb92FpRzoWsTkuE44LqQJ6flpXbEP/bYfWb6y+hz2ID6sdcQ5Ih725cuXMr7Pz88jeIowDeXgfT7vUWF6BsvlsmB4+pwRZFpk7gE5S8LlhB4PvYHFYhEvLy/lPWMPq9Wquietabn0kr3ValXpgL7vY0EX6HQ6FUXz559/xvV6rZSNHkaTlIXvJeD8HcFa7yyDLrpOxmvSX+KJVL68hwSDOI1jTO5Cqm+kEyyXy9TE1uBSOJ0uwAntuq4CpTmB6/W62hzUV0EUx+OxuBH7/T6Ox2PlLtElc4qHlKr68KPNKR9U6s6no1uYcTUdV2Pjdajs5Ma0ZMbx4dvtVrlTev309FS5zX3fl7E8nU6VwpALlmG8cqWcckJFKJnRXGZBPc0dNyAPormBoecWnqjrOK1E8qq/6o/kTs+5XC0rReQyfb1eY4j7PbU2FaTb7XYREQWvJ55Oxcfn4NrUZ1T43CApM8IO9SwKfK3WH8+23pQNSEYBKX2cAylVf+ayVmj9ZAJHJeB4FKOMVITk9CyXyypSzEWR3WOKh+ZRvxbu4SC+A81ZoIX3p8XFCfUIsitsVxIcH1pVtF5n89pylcBLAB1z48IiXqhxpGKk9chIpxNy/XN/rtvtFufLB457XlYWhG9cbBS47H4c59VqWawP3pvzyPnLWAL6nNY1PyOAr+dkRJlKnePhkWpZg1Q2rc2Az+Hzp2uyv46TZ4rmer1WFrOvm/l8XvomvmZR1LP5qG8uF+ofAzlSirT6iLX7nDjHz9eNW4SuE/js9FaXy2XBAzN5Iy7Pvvpzcry6rntghI/2aI/2aAv6/NSoXdfF6XQqnDVPR4qIdCeTZUYznVkEs9
msWJbaGbTrZGx0XpdcKeJhstw8+yHbZbTLMYpNLIrW7Ha7jfV6XUWtdJ39fp/SG5w7qOuv1+sKwyyuwmxeue7H47GiIry+vhZ3ZL/fV7syI4fCD50m4dFPPietqMvlMnLL9Bx0EyPu1s56va6sKo29W6V6LvIIieu98yg30XXncl3HqmglcGw3m82It0p3KSKKO8zXgmT0XJJRj4BTRhwTpExzLOneET4RLkUohtelu71cLj/m91LkXetkGIYRlWw2m5Xn/vr1a8UTzDy2jKcqFoK8ke/fv8cff/wREXfXmHgp++B0oyySzrFya42yRUuN8iTLm7qFrrBzNDXXlDvdg889DMM9WKIPuCB3u125kZviNK+5sLXgaaZ74CILf3OBtfKJnfLAwfQ832EYyqSR+iBlRjfQicSO/WT5lJrcFvbmZGLikgwoSGFLGHa7XRwOh6L8Ko7mR+DkehvjmxxXvXbqRqt/nhbm405gueu6iqrgoLiu522xWMTpfIpFf8f9PJVK8yWcmn0hjpWl0TmHVP0ZhqEKyvA6fE4FVujGVq4T+ivMizLOReWpj6RteB4ysW7CJ1xD7891HRkecnelfDmenrTAsaTCIAVLRoIU7uvr62hD1hzpt1TGbJ5m6HQ6bxw/XzeenKH1mekOnwe/pn+m9w/X+NEe7dF++lZlKfvOS8uFboIXBuCupt2KQRgH6j3ix53VU7gywmfEeAfKdhma7a1UPe4Usiyc0OmR6+w6rSoc+h2tDU+m5658PB4rs14ZNXrmYRiKRXaL2wikr6hJsy66uLts3jd3V9RoMWcZDhrL+XxeXBI9VyvKrn5npG4l4NNKYZTWo7B0n+RVeGaHy0vEnaTckicPdhHCoXt3l+/cg/HxcnIxLWgFinRdRledPOxryscyi3h7KqjGjxDO5SMQdjqdKouQbr3mxOWc0BqzOvQMmpPMelN/6HG4a+w0mL7v49aPGQy+nr2fzm5weKBShB6RPBwO5ctMW9IEEr8TJiHTlW40m08KXRApHRcc/tb7QeH3yff0IAmm4wweYaMZrw1BYyDXwPE6fd85krq/Il4aW/6G1Ii3t7cKI6Q7kgmhT3zlVtzyCineV40lF6jjkD5nEkyyCYQPcz7p1tDNYZ/P50vcbn1xib99+xbX67WCKLyvdHc5h15liIuF11GOMrN2NAf6vHWP90WfQyLa5HQtKhM9P13sd/jljhEyip1trnouYotaQ+R3uivorif5iaRnEZN+e3srMYIs99kVLnl6HmXP8GfJEKGFqQ36crkUmCgi4ni4p+apaWxdoVLeFVPgxlIUofPrhG3owgwUeP4iSazC1ZjmlOVJ8pqOmWS4lvdP36fwO5Wk1Wj1EfQWWdipPp7iptcuGM6P8mAEgXpa2MSR3t7eys4sQW3REpxqMQxD2S1v13HBCCqTisIwu6c/SeCoYDnuFHhtOMyhdjK9j6Xv9rrf6XQqyv+PP/6oPAO38pwET6XlGyIXlsuWpzIej8fKUuL9ncLT9/f7zOc15eR6vRY5OZ/PIwvQFSyxbY5zi6cqxcdn5ngxmHQ8Hj/Gq1ZE5/OdLE5FSK7gfr8fFU7gPHhqKOXQqVxZsIJjm6V8ar5ote/3+zJGi8WiGGC6PvFDyXUXtSKUjFbUwXi0R3u0R/vJWxU1nkqcZvO0ObpAXplE12LEdMoi9Ggrr+HkazXHPbKkb++ru8MRd9fOXVq6OSQ6c+d1HCnDjWgBeUqRVyTxqKZfS6+5q1UW4cc1dR23Hmkp6f0t8uwPx1daBVQ1HszAcEsui0xLDgg7kBKz3W4rq8lTNp2C4YUmOJ9TskYs0uWLVrGzBegZZKXbPK2QntRisbhbLlZ4g4UCmGrpUeIMHuHYvnsVd8uc69FT95hN4plKvK5jqk6WbxX5cJdan3naIWWajWuH8+BjwTWu8XUdUMUHeBOneBAj3G63NbgIHp+7H1SMU2C7Pssqh/jguqL2fnMQNGEUXLq/wmYi6v
QomemOkwqXII5UAheWe0k3kQqCrhZdYY0HcSSNRcR75klrDM7nc6XghmEouKAUIeEMChgzWm797b3k0mWsfDOFycXggR9mb7hr7Ncm1kjX+M8//6xcyuVyWVLE5BaTrtUKokkW3QVXP7nJzefzwkn0JmyK/fVqTM7npGJm3jarf6/X61gs79fx2pjku/pcEocXHJBBSlIcfkwD0/WY1857em41/89dY5cZT8Hl/HstRYey+JnLu2e0sTYhK/AQO5SyzY6vUFu0OqhJIWguLKHruoqTxYf1VD0vm8QIqe7B6JxjUJ4f2EoZY9PgEbSXMG6326pwADEttwD3+30pSKmJoDDq2hH3RHdeyxVfxmvUzurjXniO3SziQw/1tzr6nE1oK52Lc0QuYMQ7NujcN08Dc04d50D31PNxjFgGnpgzF5fuKfn6/v17HA6HojC2222lFEmk1WLMLDkpTSpJWli05BTkKTIVQ1W2P8NNGSV1a419YLR5sVgUpa6/ZY31t1EQo6WUuJnLkmUfnP1B+eB3qQglo1mpO8chs5RJxwszEr6nu00R9j0C33VdFdByJoQfCTLigX5sOp648eARPtqjPdqjhbnGEVG5sEy7YiWaqRQ2WZFutmfJ7G5RyEz2FBs1zx7h9/wZvPCDcwMz3pnuR8uILgnNdFFwPNuAEULHppwKpM8yHlVlxpdDqdq7LuEGzQNpQrxn13XRD33Mu/vuyeh0VnQ2KybhfZHbwqot9BRYEIHNCxXIKmH0kmlW2+120kVz3DSzjuQae9WkjIsX3XsUnlWSWlFQ53WShZAVB2GZLnpHU5kuWSkrwlH0ajTuzgbxcdDvMu8qe+/PSyjKseOM2jZ1zAbni7it5CnLPNN39Szr9bqMgcZH0I/DdcMw3CtU+4W9dA0xG104K/ftfrhXe/Eq0yyh5BPsVVBawRtPEfPwPd1fT4/ygc04YAz1k69GTJCpPz6WrsBc2TpPz8nqZXF2tQvhFTXUf71vXSfi3c2+Xe90DD8dz8c3O4VQv/XAjuaXpyAuFosirOwn78FG11gl4XRtpVppbr12IZUZy8ufTqfYH95hDi0Kx3j52yrvvr9zRnU9r3/J+xPDJMd2s9mMgk18NpZfo+xxI9U68bRC9f18Ppfx0HUI9zAg4puw7uUyLMXWcnGn8PyMO0w5baWKal0zAOk8TE/r1H12u111HW5y2mQqGCUe7dEe7dF+8rZoFQ3gThXxvpN4JIbNGeDaBbXLMfLKKHHXddXBQCRje4SSFIvMPHZX2Um3eu30AjVFiHkWC3dlJ8N6TTqm43H8RNYlUK+d3ylETjNxygmf14mo2bPzvY+ru4x+gLf+ZlW8vS/eLz0368OdTqfK4qL7eD6fK2ubRRheX18rq8kt85eXl8pi1XMJ/JeLfTqdCmncmxdA8Ofg2TpeTDSiLhhMK4ZZV5JtnTvc3/pRgJIprU5fIZzDwICsQ8IynGuPEjNAQhnyCLCnt/m8uyXnMtiyCPnMHqR1y5Bye71eq6i7W/+k1jiEw/4LeuG9CmAjgc+Y//oxG90BRrAUxuYAkmnPg1rosqoPXj7di5JmeaRaVO4GZhVJPDeRCkGusEdMqWi98rZPSiZEmmgu5mp8btcSxXWMiW3K5XBX2CPw7p7zuadgh0yo2BzPdMXIZ/Ycb1e2as6hY5n12WxWbVZyNxlB5SbHDajClY1b5gqfucaSyYJhfpzExg2dCr+Wk/kIs+RzUk68motnZ/iYO1zhBoHukR3qlSmwKc6q96clD63WYnj42vBjNJwy10rdE7bIOaH8MvbgkFeEnVmSJXZzt2K6HYXTzyMmRqHvUym0AgNeXogKVVaT1//jZLhQOy7B+7ulGXEnM3vZJC8DFDE+C1hCTTxKr8Vf43XLc/W3d+7fx3DNu/lo8kvf+8/PEHZFWgVIrOiD05hIJeE9fNPJNiPd01MgPfWMlBH/LWXPKR6OVXGuKX/kDR6Px5KyqP6wUrPzEfncwz
BUcsh5OJ/Pcb1dUyUgDGu9FmF/U1lyfd/H6XjH506nYxwO93RKWnW8d1bTz1PqMpnONvdMCfKzH9kgpS/csND9KXt87RhhUcxDfn5ztvFn1qM/G+VQn6mv3HDKs8SjPdqjPdpP3hatXVk7EM1UWoctnKjv+yr6xkhNRJ147xaFWzXehmFoHsBNk1n9YCUKTxb3kk/qIy0j4TJyu+SGZeMjV45YKAsnkJ7iZvlsPiuucVa2rFhyfe36uPnP/3Oaih8alFUXzqKGHE8fS/3/lHvklpzPkf5maWm0wGQxON1CFoLG+unpqXKBVBlbc6bm0IYf1EWLdOSeRTv9VOM3m81H9+z7Pi7XS4lYXy7nOBzuh3MRo1V/PGNLr7suou9r5oHa9Xotcqjiqn72SEY9k6y1ipfQAhTzomQnYV6I0/r4ab6cTsN7lHGcjeEcMlncA/IotveBsINbu5Vr7KlyKp2kickuqk7I/RBGSMXjAujHNDqVhIMxqkUGs7jFR/T0Ny4qLVznPeoenAzx8LLyRnLHiL20sA4qHV23jMEwxvKc4+SuQIs/2XVdSclzd47X9MPgdf9W9ojGyVtLcVJRM8vE3Z5WzT6NT4XVAkO9Xq8lAKJgBBW/ZPV8OY8wJ45zhrNlbqEv1sW8DkSRNyv5zXizt9stbldyBS+jvHIPgGSK0OdW0BODATytj66gr8cM1/YUWDXi947rujxkedO6R2sdcx6y63qGT0YPm8LEp/DNKmrMh3OCMF9TACLupX4oqBxo8u1Yc02Do91Lg+Qlqjih2U7s6T/CSPzZeM9M4P1c42EY0jNrOcG0Jk+nU2URkk/ntfh8oXj6Vik+eatrvrF5jbdu1sVycRdU7ojEaYVRcj6Js/lpbhyvbKflM7nF55YqP3fiOjc9XpsYkoP/+p4snq9fv5ZjUyVLVCBehMIXEp+H+PQoiIYURfaFpGa9p0Ilh0/YMT0bL23H/lH2PC+ZZd7e3t4qMvp+vx/V++N6pZdFLJtrSNYzg1KcI641eT+thIsqSjy0j5CgHBU5aHAOtallRwLomh7/YHtghI/2aI/207eq6AKtFrmExFe4W3rUj5YbdytFymjie0RZTYVPszI8vpN4NRXSLdQ8Chkxrg5Ci/B2u41O3GOVHVrFskxo9TFCSb6kU48YbdbYV8cXKJIcYxeNz+WWm6eaDcNQRePoSvkBVsTLsshdq3oQ0/jcrfGDr9ySdOvcv0s3UU2Wn7h4f/75Z8VT3e128fz8HBHveCHdZpZmkpWnvkue3TrSXHLcVRCUFq2e4Xg8xnw+r05Q0/1luTldxosy6B6qMqO+e6RVfRX/lYcusdiq80MJWWQZT17Bh89Mi8tZHOyfH5hGvLVyabv5yApsYc5d917N/XK9ew6aI1Gopvi3HANPrR0lfrbSbarKHBaYoJIUKE9z388McbPaFWpGiswGhcpVE6BJ8cXrSpEKzYmonvKX8RqFk7KOIDeAagySsu6OP5EiQ3fYn7ll4me4yvV2LWl0jnPxN37dDHRu4XnuKvu8cGyziiX66xxWp+z4uPEYAkIzJB6zdJz/XovYAyLkPTINLaIm6Hs5fD4TlQAVurt2jlOSzqNgSTa/jkdLuXpFI39m/pbX87RMbjyezvYjvNSsJiPHJ+PI/ggfUWOXwWOak1bgpRW8Kd/99O6P9miP9mj/j7dRhWrX4tkO4OZtRF39hK6o3pNUmlmb7EN2Poa+36ITuKul8w0iYkTS9mvq/Xq9/jhs/O4Kc8emNSR6jO6hZPasJtx8Vme1+A5NK0G1Adk/Wkqe4M/Gne56u8blfI8AMt3Nx1XP2YISvC5lqyKQ5pmAv9NeMohEJGhPk8vIuhmJdpgN8fr2Wt57wCFzL2V1smIKg1S3620UBGHtyc1mU8mbB0B4yJgH7UjZIbTAwh1Kx6Nlrv4cDofq7Ov9YR/HwzGllmXkfIdFqiosVkAlq+/n19I9vaJNFn12CMcDVK3XfBZCPOfubg
VzTTw/PzcrBGWe6MJdoCkeTuvwHw4IFYH+eiYAMxgoCD4A2QBSqHlN4hoZlYVYp0dImabjebWuFIi9VMK431dcQedVZbib+nq9XQtHMHMfGfXkuPvkMsXI8SgKv57TqSMt/qBjPOwbfycYQXO0XC7vyuJWH93qpdCoCKegk8zVvd1ucT19KOrrbbRxENpYru6wB11jyQjllsV7WSX76empws+rDagRAddzMqVUm0aVv775GLvFspI9bmTiq6qSzvFwrDZhL6zLv461j0rWG2XHD0bjuHP83DX3fGKnMLUgiyrrq79VueGuo6rNcuhHhhIxb8p/lmK4mOqQ/3UsqFU3MFOoDBz4wLfOmGDzlJ4MP2n1lffX577o/F76y0CGCLoR98q+DnyzDy0A2BUhrUDnQH7WWsEbCWZGhPaCDF4aixQor8nIeXe6kaoAE/D3Ta81ztlnTo3g2DndiO81R29vb/H09FToNNVGvp41LRqXYQUKWPrL8ScuQI6X12TU99VGVldXX3eq5uedmH2ZtAIpJ35/l4PW0ZtO0G+t0c+ar3GNc0qVuvWpMZQqTTu1kZatB+Uc5454YISP9miP9mh1ZknEtI/ufjhbxnbXdUlrYQRZZ4ewhJHjB7y+Yyatfvr/e5HNLOVOr2kBujXL8kWtXbjFrmd/qt15yM9+4OFNnuWRQQe3flzhxt0IL0rRslSczjPaPS0DhdeZdbPU0iv3nLWjjn4PWoR0cypaUP9eHEHY3tDfLUL1Xc/y8vISv/76ayUHPteVdYuCDEwE8OwfL1TglibH4Hq7VoeeOWzkXpaek+62yPss4kFrKCuc0MLoHGKiqzyFqyndjeshK7aQrUmOD/8vIkbP4X2PiBTDlOUv6z8rIJxRfdRGPMJsMP3htACmQE5SUDRoui6FnXXe5HIQI3IqTabcnBKj/mUYRRYMclfPBywrnyVBJC7TEgC5blMKm9jZMAwlVW4xX1QbT+UOGHfLgxqcbGKCAukp8BRA5wpSCL1KDKsEZxCFKzOmp/lc8v66vsbEc3Znw4fy7t7nXQGpW9yiv94Pk3p9fa3SwljHMOOvURZW63tqJauaO2jvgQH23Y9C7WK8OCuX9/Zx8uIwr7A+VkXKTpujUqeC8rxaf25P3WvJWgkmoa9UxtxEPK88U2ZVtZr+VlVLb+Gb7rozkyqrVN4a50y3NRWhrB0KSlYOXQ/OxdlSoGqO13GQXOhbOFILk1T/vBz/VHoNn4PFCYQb0SJUsMRP/JIidKCfrQX6V69nXXRhp8w5p0+R2Ott/Bn6TkXkZF2OYQk4NCLMrqi5yXmwqfXsbj3yex5gIK7m8+fzLytrMfuwEmbzGGb3676+vo5wVN2D9TAlL9WRDvO7gnCrxRUPAwX6Da8bEbFar6ogi8puVST0611Rc0xImD6ejun54I7rqj9cx+4F+Pz5hs10Wi8cS2VMjq2umVmW3g+tt+wMkxaWSUyVx/R6fVPKSos7rPbACB/t0R7tp28VRugHwojdrvdZ+ZuIcYI/KR/aSTIumFxmxy+IxbTc1gxXc8uSVoNbh1ma32KxqPBM0QOYykSL0IvDuitPay2jArDvws7kCrdKlfVDX6KFTgmgK5xFhjPXNeJu0TgFimOZZT/I8q6oUUPt2tBdYn/pVjm9watrc1wzq3LWgbc61G4P5eDbt28VF+/l5aXg01++fKlKePE5ZaE6ruRsB/2u67rqfGvPwuKYsHKTH2xG11h0rYgoVbfFYfXCuqxoI2uIOLCvBzbKLa8pa5AWYVZVqZLphCvoPEb2v9VaBR68LNh2u630RbX+zONxy3dUodoJlBn/70epHbwuBZcD0HVdMfndffJTqjzFzbEoms5eW9FJyNkkeZ1FJzs7N8qf50eb46mZq55RBPq+r9LvPE86Sy/K5oq/yZ6T4+NjmZG42Z+u6+IW43p26usUoO8k/BaZf0TPsgomykOex7wK3pD47NinKgyJZ8jgiONs3t/M3Xdupf
6P60iL09NROT4ZRii32Ok0joNPyV8Gc7kcOtTCQKfGpCUzfO8y7mu3tYYcs5zCdPVd0qxa4+ywUcTDNX60R3u0RxtbhO4aq01pZo8OsnliOZtM1re3t4ioXQ3dk1Wm+ZfAtywIP6yJqXKMHGbWSERemcaB3B/ZydjP8jzDxGf2O7cCWZuwlbrUchuciMxxJ8WD7532klnmuk6WCtf1H1XJu/tJb7Jo1Ad331xmuu5eZDazJiq44NZXFhhd92FAIYXrpfI+CAGs1+t4fn4uQSo/j8YLh/gcEjqYz+d3CxPpnRovubiCXjjXDolkmRte8TwrRuBQkHtHrQrVraCZoAwPyGQFUvy8kEy+acVPpdw5jYpWnwfuKEuUZ6XC8j3HL8IUIRWWyjIRt+F32TLXt4UXqJP6ywmWa7pYIlfV3HG6da4IOblMZ2KJKeWbejqQruOHGjkjf+q5ptqsmxWXcYpK4zgeqQV65izdTc/m6YK6HpW4FF+WF63rOF3GK/1oPJwOxWfg+OnALy6yVoXssli7HMfKFgpPpSPPrJrPy7WaW2J5KsDLKDJxbsqbnoNuGXPZWcz3/Zoq9XUbjQn7R8WTRTp97LKsE8pmJqeuCHVt3oMKl4rQN+EM1836kM2d5nbohtHmyr5XinAxL4duRdTwhTDBLJIvRahx3u12VUHciKjLcGU7G/lHrZZxDFu+vOcli4+kwbxcLnE53wXDyZ4ZaTPDN50o6laCc+j4mpZSpmwoPBTUFvVAVmgrb9PngNcd+mGE02RjkKW7eePC8UXHZ8vwuQxH8rMp+H31jzs0LcKM4+ibcEZ58vlgcM3lseS89vfnpEIYhnv9wd1uV51Js16vR9hnVl07G2ufh/lcXs34jJcRBmzE4szCkVJ2pdmii7mR0CIaa5P1pAG+9pTNKXw04xm7Ne9YnQdhaZln69zP2/YAYMT96AI/WpbtgRE+2qM92k/fJqPGpB7QGovIK5+weSI38RYmRvOvcBthKIwS6+wJ7jZu8bGayVSkk43PLHeJffcdjBYFd6Ap18Cjhd2si+FmSfE9KAtDP3KHI8Yukfqhe3ipJPbdz7hwTHCUAdFIv3Nrkee4ZBFlupP7/b6qEM1nYLms7XZbqCG6zxQ2y9JuxNUKyfx2l2E+T9d1VfVwP/Pi69evETEuspBZ9qxq7hYX5y8rh+aeA58zO2Qsi/Jn1YT4/1x/XK+UC6fI+MHwtN6yuXYPwzFpPUemO1KPYzEf0fa8cDLvzzHx84P2+32VCdRMsdON3KTmQDtTPKNuZOB9ht9pwPw+npcsQVW+JyuJsHqup3pxEbqids5dy2V1RZjlbHK8Mga9v46IlE/IZ25VRfHKv+RV+n08bc7dGn7fc4+90ojPNQVwuVwWysmsm40yAxSc2O12FWDt408XV4etT1GUfJNRo4JXIIIYE+egm90VoTCwjDP6yy+/xGazKcEbz16Yz+cF1/YcVyoa3YMBkCqd0kpgEdDn/LWOrcjWrgcqJRfEAUmJ8dJtbvzwumz8nh++leUPu3xl62XW1fi0XGWW4qNO8hRYyZ4OsGKgznXWwq0Yx2WmggFTu3QrNU6D4YLEicgWgK7ROlqTguC7KV97fzNcrRWJzZ7xR6zH7LdTOMlURJr9ze7VIq56iTA+r8911pdM+boV7H13y+NyuVQY6xS47jUm/XtumbhSz8Z2PsM53cuP86NRB/J6vRblt9vtSl9lEZaAyLzGrmbzWWzWm0ouvR6nxmAqMNVFFwPoBY4R0tKlhegJApl34jhkq7xXVqJKv8vmoYUDRoxPqHNZdRnO3mfslMzy1ZjTgiUmSAMrO0rggRE+2qM92k/fJguzZuzsiDwtxxutFi+y4DQJaWthJM6Xirgnyeu3z8/PlZXimOBUWSDuKu62EyfS87dKjusZ+D5LAfR7dl1XqowoU6Tl9rib7NQR4nMeNWtV5XVsyqlTXrnHrWnSkubzeSlOoGdS4xm+Klght8YtCWI/3p
zbmVEs1Cg/ETW+5xYn50/sBbqxPCT9t99+i19++SUi3ks+kZLlPEs/cY6uuqebKjWMzyrZ1/0j7mdj8zqULWd8uLVPOfAK2nzt/DofLx/DVpzAqwm1vCzJJRkDBcf9SJck9r9er9PouBeFOBwOo0IYzlioUuzYQU+hyUBOvwAngwPkijDjm3nFD4XuHdTndURvYJ0+AfbuHgojcFeYys6Js06tcQXWcpflArVSAN3dnQ8fROe+HsvMbC+/+aCrZPiPlzuTsLeEelRHsBFQ8uolnL8RlNAPlQvHaj065S/bVDSeVG6tknDZ7yjcfd9XAQ/KRbYxtPKHRbpV36l8hmGIr798TU/L88Xm5bOIsQoTzA6gZ/8l4zyc3pMY/DpOIWIesionSU645rnZZjECd1Uz+XLqj29kTqfjPLqRkp2bknGSpTuYf+21Oadw+4dr/GiP9mg/fVu4hm9FRampI8bpXZ6BwcjwVCqaW1HefAcv5vY1r8gbMX14jN5n1V181/OajL5DcqfTNadc46zJKmklyvu48i8J7x6NdYswq0TcqkCSBX2y4Jee2QM0nvGj+09lGzgEQC/CP+NceGAsIkb1EumijdIBPzJSZsNsFGBTE+1HB8dfr9dyRoa+SzjFLU8/QEu/W61WIytlqoq4rym36KcCda3gnM+Jv24R6z35wD1GX5Muw37dVsqfB+r4PpM9WuJukbpnV7nGLpAujBI8D8FnDHUKkJv4WaRT7u5UYU8u7L7vR3nJ+p27RI4VqXkVG14nyw6JqHmOHiX26GDmhjl3KiKqbAevLN2KGnv2iAubc7m4CfnxmVmUOHM/veKvR3PpjsvtIq6rayrn1ueackDqA9kDLGHmm4G7Z35qnY/l+XKXCRa/9fEktp1l3jjUw3XiRVI5J1Sa+j8qd4dwdM/NZlPmYL1ej2Ajl4UW5qyq6uQKev8yyCRjfjgOqCixr+Msyq82otfN7vItqgwxQt1LfeehcMfjsegH3xzT7Bo8Uk0Yi1qRsQIzO+KTxM55CL5FPdFgZkLFvE3foVka3ImgTHSfz+cFhM52wxbw7LsMn4F4k/Kg3VLhBLcW6zAMheTrGI5z3zwIRMvA75kFFTKeoQcY9Fsq7qkULc5513VVTiextUwYs1LqujeJxyQ6e9/Zf1nFGSUjUxAK7BQ5jg+sdj7E9XaN2fVeK5Byczqd4vv379V74rEKpCg1zzc2yjwpRU7ydhyeipA1DhkY8AIITlvyWpNZBWvOQ2a1+2YpZa8AGU9inPq9H22beQmMITCfWAke7LvkxI/XVQAuu4/GnMdGPDDCR3u0R/vpW9MiVKVhd7Ui7u6lV0HW96bwiiw66Yn83O0znCgiatcpIYJ6ulnrnm4t+q7suCDD/MQP9d0psij/362WKQK1R9T0ntWBsqyYqWiw98fH11O06NI6yVe7stwup0upsaLLZ6eM+elvU/2hXNFi8LH2eXesbD6blwpBnHuNI6tFd11XqECsNuMZF86g8Aj47XarsL8W7u3ZPhwfWd6ObfMzd9VbSQtTMux0OrdCXaZcjvV3CoPmXJIGlP22gvJu12odnc/nMj9+3WztL+gGVi5qN6sG3xULaRKe6jVVjYMd0ITxPlSwVKjCA15fXyMiqiwF8QZ5feXB6nNno2fPndF8lNYTUZf9WS6XlQuZgcctoNldOx5yPxU4cbfQ8R2HJLyCSybQ6dxj8QpmyHDJ2+1Wcoh53Qw71li18KcWSO/NF73Go+JoAtujXHA8fE70f5QZjuUw3I8JZfqdN+XEZ5VzSg78vF5H3Eg8AEnZ82d0DimVnWf0qO/arBz+cRnxecjodFOpc57t0sou03qn+6tNZbvdVqmWjkFfr9fYH95lb7/bV5CKB3b4fJ4VMwzDPWrsArVer+Pp6amKcOlCAmqZ8uPBEWpj5+Z5852Pi9C1Pycw2800SeRDrVarasd2nIID5VHkVt6mn6HiOymvK2yjtbgz/mE2Ti6MbkFkip
wWmFsJHMsWQ4D8NP2WnDSe9ezWmmQl65/joBonzTsXgMuKY7ObzaaKFLtlSbyNgY8MpyQG7DhdOUUOOKj6oPe//PJLPD09xcvLS0S8E/+lXKXMqtzj6yUux/Hpfev1ugr8bLfb6hlZD1H9p0IdYXnYHJwPm6VWckz0O65zTxd0sr/672vB76WYAAuweNCT2LGfLc458YIubvVSzhiUiXjwCB/t0R7t0cYYoVd08ShpxD2am1WpEFved/vMklPE2F0QNbcIaOF4hNRTmXhKGl0e35noAmm3bu0k7to5jcTTnNQyqpHjoK2Im5co4nO6e+uuiWfbZO4Q+5dZvll0mVYdo7s+XozcuXXm1Y09zWq1WlVpiC4/nHu3His2Q38rEXq6k5kb38K2W5iW5P/t7a26rmCJiPsRAHouypDWkNzs2+1WlSmjVeVWOmEZv65z5gglZGPfsggdA3c5dVaHr1195umT2VEHtNboJju+ybRNrgX1j1azZ4hV94wazqsUIfGwzWYT2+22fHmz2RTBnB1nozMUsuMeNUk6rFpCo9e6nw8mzW22rutKGS6a/8JWSNvgbzebTbVwyE0iHJBRV1gaybFFbgaCC5j25xOq7/o12TTmWZqhhIK4l/fNAwxcHF4Hj/POPnk+J4XxdDoVisLxeKzco6myZeJ1tgIyVBi//PJL7Ha7KgeWPDh36w+Hw4gsLtmjHPV9f6+I0uflnzKlUNzm2R3r43judrvCT1SpMbq4qmv49PQU8/m8OhKWwSY3EiS7kifHozl/q/WqjBfXFHOi9duMUsXPMm6gN7mWmXxJDqjceHbObF7jpNlB7eobMUFhs3KHdV99lxt/FohrGTgP+syjPdqjPVokFaqdzCgrsGLDL/vqMOzT6ZQCkmq0KPgduXpZJom+R9eMQQ7RVyLGQLd25G/fvkXEOBrHexKoJSCte7CyiFuAfiB3dn6yWkZ+5l9GZT0Qxd9kOy/Hy3f4rKYfx5NznbnDTt711KUs9ZLzQDfeaS8cA7pHy+Uynp6eRmOtlh38xMpIes1AUTY+lANZi8w2ydIwIyIWy0UM/VCRpi+7j+ya07l6Flm76ofTYMiSyPrK8XMaDi38+axOIKCLyApPCjZkBUlcNvvhXt3b09t8HXkgcbFYlEK4HtSZz+YjPUNogRY8gyMq3MHK6hxH/qXb7lBQBkONMMKRS9CPcaXZfFadEidh0O95DbkyjNaRN+X4BQciy73MIsp6aLq/ojFE1KWGzudzk0oit4oVPzx7htVJ6Eaor1nZeo9QkjLkz98STo5ti7nvlAXPaGhxt7LqJXTHSUuQO6fGedA9MtdY4+PjwDGgS0vl4PQYPpvmtlqQi/okv4y6pety/mZwkKiopZQq/l/c0mtq8SrV6/n5ubzebDbF1WUfGCX1jdTpR+X+SVomU/A4Xl3XjdzJ7GhSl7tZN4s+7vfns3o6HqlJoyrdt2vRI1LSzhDgczPL6nQ6xel8jxI7dYotwyn5t9IzUa+vUal+vvb3xLyWi2VlUTj2o4XzWc6fBo4T3CJxc8C44BeLRWW1ZFQSNREttXtut9ui0H/99ddYLpcV1YaKmbvT8Xgc0S1YRpxKVNZQtoM5/+96vb5jKLfx0QYSnGzOnGztgpERw3VdYabcOJwio4XEYxCHYYj1el09CzcWB8E9r5sUGOeiEtelDMl64EKXpRDxoTA+0uT6ZV/l6NIidApRhFVUHt4rWrfWg88B+7rb7ap5Ud8Ph0N8/fq1yJ4CA+W7s65YpHpGWkfEEt1K5phst9tKno7HY8F1NX4aE+flUaG6guf/aW26ZadxJh7L9LvZfFbR2Zw0nR3DqefmWUb8y35l/5dihGZnPDDCR3u0R/vpW/PwpuKGze7a190+WT/ZgTSyIDx1ywm3zuD35qZuRm8gkZrfU8YDiZeqcvvly5fRdWS5kdzp7q++q+dkNMzdTceDMha+W4Tq+212px64i5ZleThe6FVsHMOp3JPVsr
J+HEqgO+yWP/uj+7F/3O1bVVdkidCqo8v9GXF9sVgUi4dWkmSUFmGWWqZ7cAwi7hWCOL+UN17D6SGSPeJzu90u/va3v5UCDS8vL7HZbKoT8LzKDy07r1RO72i5XBYSt+Pzl8ulWKGeCMA5dZnxCDOf2fuwWC6qqj+32y26/sP67oZiHa5X6woT1Dyz8hBPnzsej5WctPBoZ01k2Hr1LKZqqmBJllHAhZQJlAvJfDGP/tZXVafdBanC1ihhpPeZSytXJsud9d8wpcknW4NXFnZf5y87GNtSvq74vH6cQwJZ+J7X5NgS53KAunW6W4Y1eiCD+CqDQJwnCRyV0kiIbCxGfU/KVWWY12gcQE9RPi/ngc9EXqhoFuyT7sngieSYnzteXeRk6KtKNf7clWEAKsj1Uit1np7mG9BisYjlKi/5r1PsqKTkGiq9lAqU1BE/2Gw2mxX3UvS2jG/Xml+NTxVc+LhHGYNuVgXxqs0eY6l7e+CnMgz6uxxSTrIK1b4Je5Wn7HXWRjxCAtbEL3yS2AniIFo4rLFGYWREVItGk+JRnqmzDjLOlyslXtdry0moNptNmSThfA76MsVHfX96ehpZCXxuJ6Mq6KAJppVHa0J4pwTssriMsDQKKq1hr6fH767X66pIgAvj6XbH/khGd2HkvDNljXLhKXn6ezgcqvLpnpdNQJ2Ncy1MiQEtpnMysKNc6Kw4iCslfT87a8THUmPgnkNExG6/qzbT79+/V7xLP0tjsVhUJ+B52X6W+ZelpJJzeu7lchnb7bZwbOv01j4ul3t/hHOT7UBZ9GK/em6xBfg7Tz/1Aglqy/l9nNVPftePMzjs39cm5YVylaUH+pxlxlE1h4MFgOPRHu3RHu0nb9W5xjS35X5w5/UoUoZ5aZdQVIjZHhFR+fzCb5zy4bQY3oO7lUef6S6RXc9IlLA9RqZ4fq0yICLuBzvRqvGCoF71mdFyj6ZmNAVZhKTTtEqRKQLIMVBbLpeVlaX7MorN5H+62MfTe1RYu/1+vx/xBlupg5xjPQuj5fr+4XCoLC53b5TJFPFupbvF6PCEnmW9XleZTaKqqF8t3tlms6kwJ821vuOsA0Io89l8FLkuWTmrdfS3vpIDP1CIqWek1HiZshYdSta/912fkzrWf0Rsab0yausW/WKxqPBhWmGn02l0Up7G63g83nmD3aximShlUvNFjPJ2u5WiqpITpm1+dqQE+aQtupgsW48hVPh1q/qMBkwXk8vmk5I1xxoJmtNklU/vrkuLJ9fCLPRQ/p7fbeGbns7G+2gAs1QhvfY6fY6tqT/uymdUGo2XPzeVm9NQOB+tHGI9t7As58VdL9fKHabrJHfcYQddP+OheQ40xyDLZxZu5CWnMrzV5c7TNFerVeV6+m91TyUK0JXiX7qIhRY01PiTy3GRvXld9YcbIhW8jgyVEliv16PKKxkVSRCOBw6qVEKmyeEzKSim0roRw/4S4vK8ZE85HS4fumK5iOV8OYLZKN+EqJg2l528mCnArHkg8a+0h2v8aI/2aD99K35ORmHxaitZ5Rf+Vo2uiyg4vkPpmv777EwD3tPT6vQ6M6FpybkV6Lt0xDu9gZaRn7usQpER7xkDtGoUDKELTrPdi8NmB8yr73KP9V3u5rfbrbhpi8uimjdP1eI8LBaLEhSSVU7Kgtd5o2XrUchWkwXhNBn+5XPSYmCVZ9XCZOFfkuWdlsOA23q9rlzh4+kY8VE6sAqG3ProFuMoexYUKjKKR2jJ/2w2i/VqXag4XnhCrmDEuxu/Xq+rDC0Vb9isx647D4J3l5pMiMzjYcBtNpsV+hiv4xYqyc6n0ymlujHAVuZjNq8i+4R+JMMk6J/OtQfic5JRYrxlnoJXh5r6/SImmi9YZ2pnLO7MxVj0iwqjcHqKR3ey9DMpO052y8X2dDx3SzlAjGLr2Uq4fjGvMmiIe2ghtw6I4Zgora+VU+pZHw5BOI9R2QeZi+1VWIqLtqhdNCpCKtqIsXvC6z
pmyubZB543mnFKJRPEkTbbTVUBOsOD/a/jvBy762V8qLw2ZLrGmdvPRlnks/F7cutLf9arSll4JR9FkjUOpe/9++YpF77ruqIItTlzE+bm6QVwvXxWVXR2MU83aF2Hz5hR2zLlIriCuCBlnzjp6XSK8+lcRaN9vFv3aqWhtn4/1SqLkBwecnciasKohIggOW9OZZalJnFxeMl/ToTjRGxOrXFysd+XCsIr+BKvoHWhBcqSRpzcTIllubT8XNflbpmdBkhr0qlJ3Eiq8kZWNIPg+3JRE35ZTmu328Vutyu7dEaAz/BNjbs/MzE5Kgk/SsBr0BXLe7ON4dc7yXa9XhcrSlY7A0guMwoQuQx4bvHlcinyfrlcKsuFzyxvw6k13Eic61YFPbqaKiLr6e3trdqY397eirIT2Vrvn5+fS5ky5RIzDZJ1DZUHz3VDAjMDZ5Q1pVNSppzK5oGnrKyaMEHKOAM5+/2+HLnx/fv3KgDI62R85R9tGae2hdFHPDDCR3u0R3u0u0VYIlrX2vrwiFLE3UX0c1bVuNtnzdPx6JqRdpJdJ6Oy+G7lqWie6iYqAJ8z4m5dsIAqXZCnp6cKA/QoKBn7LBMmt724fgm+w+YY5qjc2GyM1/X9ezYPrRi6xqKv6JnP53OxIFS0IHNLstPmst2WfznOGX6cyROfxw+DZ7Rev/MivBlW6vDAZrOpnnkYhuI2X4b3NNEM18osE4dm6HK3Ds1SPylvb29vlessa/Z4PMbLy0u57tPTU3V+iTJPuI4u1zvmS6vcqzZ5Wh/dVGLSr6+vzbl2fJ/xBK0DWsUsrvr6+lrOiH57e6twZa9EM5VUkbUs68T/Zr8fucbMKeVJcU5JYaUV0hIcHGf5rYhxpV26NlOAplM+ODDuGkvgmD7lFUimXFg2ZoQcDocijE9PT6Pr+Al0pDtUeF0S2PGF5qeF6fUwDMXV8k3DU/U0FnpGcgMZ2PEca/5eitrr/+meGU2JbhOVhWNQrDrt+cyUN8IyklG6cy4Lnu3ABUqcm1iVxpxZT2ypO5WU9+pmNQbtoH22sRAz9N9qjfmJhd6/YRhK7UDfPCln2hxYKYbjnp1Wqb/cLB0yoQLzzZ16xTNbhEczS4b6IDsZMlNoLYXNuZtSoAsqLE+2Z8FVWjvCpog7UPidMOy4kgdLnLfUwgt80fHBOEnEgTIhbvHidE1XWCzDRYXP8vLb7bayHJj6JaCZ40WAmuOlOciKkmox0IJ2pVSE5MNqlIUvXDDiTpgmR4x/1S/OEbEiLx7q85VZUq4kKQd+OpmUopN3KSPEg3n6G/tOS1BjSQXAQIXkTv2lRe+cOclsFvzy8coCRD5e2TGcSi+jPHGNsdSWy7vz/ch88Pp/bFI6rU3YgzB+T2LrxD4vl0shun/79i2+f/9eHVfgc8cgkFve3j5TgGqj9D/TiQ+M8NEe7dF++rbw6g00YXkoDs1baW3ttEwtU4qOWy3cMd0CU/ME/yzS0zpIyXmFfp1WGh+bu/dq6u/hcKgOoVqtV/Hl8M7HUsFN7oocW1JveLatxsAjd0w5IhaUuZvqI3dl4aC0Zt01JqbqFapb1reP4/V6HRXkzKrPuNtFbPh8Psd+v68OIT+dThXeyqKxtIZaVB71VamH3nfharRaHOJxviutHxbA8FLzdLH9aAP2U666p0VSDhTZ/9e//lVZ0Dwv2Z/fmQxebYZWqmdW7ff7Yr2pD5If0mk8+ydjeLDQAzNHGDm/Xq/x8vIyqqjN52oVv3A3mTqBr2UhV0wWMwmrMlx8CHeV3R3JAgX63VRKFO/l6XdSWFnIXCYy++nubUvBZv3JXGMqEf3lcxPUVc4kc5bd3WiZ6QxCqTmZmS6Ju8bMVWUbhqE6AtNr2Lnwc64d7/GW5ZVrzJxf2gqUtfAdr86z3+9HtBtuiK4Adrtd83yKlgsr5VDgi9s1btc7X9HP2aFc6pqZK5o1Hx/nnvI170Fcd7/fVxuO+sj3XJ8Felm803ccz8
zy8JXzS6XF7/FIBCl1d8/1Gd8T//WYQab4nLIzxSecosy14gIP+syjPdqjPVrSRhYhtbHSaiLqGmciZGbVN+T+8DoeTfVEfJ4R4pFI3y2yQIoXbaB7EVFnSjh43Pf3Q6EdnM2iyaynxywGWTTZWShuMbg7pKoyEVHOaKBLwqhedo4wx5XWK6OQdJNltfE6LSvOLWi32D3Z38fNwXbOHceEhUe/ffsW5/O51NfzOfYk/r7vC+G6VQxBv6VF79aGn59cMj6W7+mJTk2ilTqiOIFcrGIX/W2ckUXrjRk0wzBUbjWDSYfDIfb7fQnUbTabUQCOwRFPj2XFdh6Y/u3bt4ra4rUUaZlnxXrpdfFzP/yLpHdmhPnakKxnXmCrZd/1SjmT9Bmnp0gR6v16vS6CrIN1qBy9qoje6yQqXyzqEEt26Xc0o7m4PFXIXRNOEhcrC1wqZ5N9p7Lwarl+T9JS6F7Khc3oPXKj6MJQaITNSMi5COlWZNVd1FwJCd+h6865ytx4TyHjtSlEjsX6OPL5nZrBqicsu0XhPJ/P8ccff1Spj/qulKCuo80gw3GdhUBFQ/ea92XOLGX2OrvTvDabzSgVrYUrz2azWEQNGxG+0EHpER+HR+n8+b6v5iziXmJst9vF29tbOTj+l19+iefn5zJGpKhlhyNdLpei4Fhy7fX1tfzje45FwVCHfrTplJTSyzku57vL7bAMDRGXGc5Xi0GS0bemmCEZXOPKcNH6wImqPAZToGmWLytl1uLXccCc++ZctuvtWkr7MFij33oanwcnslQhLWo/HlJC4xhcK3yv67BEPMnYnHxZXxktQcrC6TJZJecsMEBL18m63O0J7mfl2Vt4Zpbf2eJvStGQc8j+eACCdCLeS/IlRcBcVQWamNLGYIBzA6msPUGA+KYHC2fzWSU/bq15sM75f0485nypIEN/69/PR46PAhfXW2qB+xqSp0bFvNlsyng5d5EGjsaV1iVf+5nRpVL65VI/VzdrEv8v50vlybj16JzaLDdaz5zJZquGgG/mPu4tfDPigRE+2qM92qOND2/ySItHOyPubiB3LbL/PfrFHYBEy3JWRSOFZj6bxzC/uzWknbA/+q1bbl4lW8/Vcum0azgLn/drZYBoV80K0ArrctxU1yDTXtSWViZHNh96Xe3KH5akH4zF77ur4VSqH2me7TPVPHLp0IYTzjVGu91uZFmqibjuRQ80J/y+V7HxNMyqWOx8UU4S9NQ4zV+GOWXriFaoj93tdgudFc+IMpMUXNZkxTEzSJQanz9hqiX6e30vziAc8Nu3b0V2X19fKy+CUeKsYjbnwcfZ5SorNadxd5mmR+aWd5aa6a/1npCIn0fj8l2l2HnNPC/Vz85SEbrZ64rQS9Fz0ZOi4yY0eXEaNMfLOHjO/M+ESgJN5egK1Lly7n5SwJkV4PXPjqc7DYHunffdwWTPnSV04FiaTzZdauf0sW/8q/v8iEJz98PpTxx3B8LdJSNvkDKig6ay1DONDeeIudtUhKvVqsJfCa9kKXZ016lcXcaZi065z8ZI73kPpuPFLS89JePBUwB1f3L6NIY8EoCBQipC1UMU9vfnn39W1XDOl3PJRuJmqaAPK+nc+ls5cMs3d8pcK9+81VzePcNnKujBxs1zpAijDryMzjWmFtcZwBIq5+xkEygF6QuZ13VLxrluWSI38Zxs4LOIMgfQibKZxdNSEBnvq4UTEevTdw/doYpOZymHGf/N7+8pbb5QvaCqhInCxTHgvTJQ+rOmvnlZsJagViTqyz3XWYUKmBLH84i5MUiBOTZLzIl57cS53KrjJuxpc24tem0+yhDvmRG9W+Rvne2RWfxTi173djxYjcEcBR/dA9H3GVCTZ5Ktm9l8Vp37fOtv7xinBRZdLjOZmeLU8nM3qn7US9F1WkHPvu8j7FIPjPDRHu3Rfvq2oPU1m9Xnx7I0fXZuqafSRNzTvtyypPvk1appCusa+q5bNIWWc73E+XSvPDLF0/
[... base64-encoded image data omitted ...]"}}]}], "options": {"temperature": 0.0, "num_predict": 1024}, "stream": true}'
```
Error
```
{"error":"json: cannot unmarshal array into Go struct field ChatRequest.messages of type string"}
```
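The unmarshal error above is the array-versus-string mismatch: Ollama's native `/api/chat` expects each message's `content` to be a plain string, with any images in a separate list of base64 strings, whereas OpenAI-style clients send `content` as an array of typed parts. A minimal sketch of the shape the native endpoint accepts (model name and image bytes are placeholders):

```python
import base64
import json

# Hedged sketch: the request shape Ollama's native /api/chat expects.
# `content` must be a plain string; images go in a separate list of
# base64-encoded strings, not nested inside `content` as an array.
def build_chat_request(model: str, text: str, image_bytes: bytes) -> str:
    payload = {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": text,  # a string, NOT a list of content parts
                "images": [base64.b64encode(image_bytes).decode("ascii")],
            }
        ],
        "stream": True,
    }
    return json.dumps(payload)
```

Sending OpenAI-style `content: [{"type": "text", ...}]` parts to the native endpoint is what produces the `cannot unmarshal array ... of type string` error; the `/v1/chat/completions` compatibility endpoint is the one meant to accept that shape.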
Version info:
- Docker version 0.3.6 https://hub.docker.com/layers/ollama/ollama/0.3.6/images/sha256-2e30dbf12a7e8b78aed96bcb6b3b039ae424512f6def5abbb0c4e3676b70c083?context=explore
### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.6
|
{
"login": "McCannDahl",
"id": 19883817,
"node_id": "MDQ6VXNlcjE5ODgzODE3",
"avatar_url": "https://avatars.githubusercontent.com/u/19883817?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/McCannDahl",
"html_url": "https://github.com/McCannDahl",
"followers_url": "https://api.github.com/users/McCannDahl/followers",
"following_url": "https://api.github.com/users/McCannDahl/following{/other_user}",
"gists_url": "https://api.github.com/users/McCannDahl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/McCannDahl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/McCannDahl/subscriptions",
"organizations_url": "https://api.github.com/users/McCannDahl/orgs",
"repos_url": "https://api.github.com/users/McCannDahl/repos",
"events_url": "https://api.github.com/users/McCannDahl/events{/privacy}",
"received_events_url": "https://api.github.com/users/McCannDahl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6451/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6451/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/280
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/280/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/280/comments
|
https://api.github.com/repos/ollama/ollama/issues/280/events
|
https://github.com/ollama/ollama/issues/280
| 1,836,763,417
|
I_kwDOJ0Z1Ps5teskZ
| 280
|
Non-interactive mode for batching inputs
|
{
"login": "jmthackett",
"id": 844469,
"node_id": "MDQ6VXNlcjg0NDQ2OQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/844469?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmthackett",
"html_url": "https://github.com/jmthackett",
"followers_url": "https://api.github.com/users/jmthackett/followers",
"following_url": "https://api.github.com/users/jmthackett/following{/other_user}",
"gists_url": "https://api.github.com/users/jmthackett/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmthackett/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmthackett/subscriptions",
"organizations_url": "https://api.github.com/users/jmthackett/orgs",
"repos_url": "https://api.github.com/users/jmthackett/repos",
"events_url": "https://api.github.com/users/jmthackett/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmthackett/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2023-08-04T13:38:13
| 2023-12-04T19:09:14
| 2023-12-04T19:09:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Just something along these lines:
```
ollama run <my model> -f input.txt -n <number of runs> -o output.txt
```
Not essential by any stretch of the imagination but it'd be handy. My use case is being able to batch process prompts by just iterating over a list of text files. At the moment I'm just looking at how to wrap it all up in bash - probably by piping to stdin - but it isn't the easiest thing to know when it has returned.
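Until a flag set like the hypothetical `-f`/`-n`/`-o` above exists, the same batching can be approximated against the HTTP API, one non-streaming request per prompt. A minimal sketch (the model name, file paths, and default port are assumptions):

```python
import json
import urllib.request

# Hedged sketch: one /api/generate payload per prompt, non-streaming
# so each request returns a single JSON object when it is done.
def build_payload(model: str, prompt: str) -> bytes:
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()

# Batch prompts from input.txt, one per line, appending answers to
# output.txt. Assumes an Ollama server on the default port 11434.
def run_batch(in_path: str, out_path: str, model: str = "llama2") -> None:
    with open(in_path) as fin, open(out_path, "w") as fout:
        for line in fin:
            req = urllib.request.Request(
                "http://localhost:11434/api/generate",
                data=build_payload(model, line.strip()),
                headers={"Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                fout.write(json.loads(resp.read())["response"] + "\n")
```

Using `"stream": False` sidesteps the "hard to know when it has returned" problem from the original bash-piping approach, since each call blocks until the full response is available.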
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/280/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/280/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7030
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7030/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7030/comments
|
https://api.github.com/repos/ollama/ollama/issues/7030/events
|
https://github.com/ollama/ollama/pull/7030
| 2,554,801,732
|
PR_kwDOJ0Z1Ps59BUP9
| 7,030
|
server: add "Cache-Control: max-age=0" response header
|
{
"login": "justincranford",
"id": 2488888,
"node_id": "MDQ6VXNlcjI0ODg4ODg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2488888?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/justincranford",
"html_url": "https://github.com/justincranford",
"followers_url": "https://api.github.com/users/justincranford/followers",
"following_url": "https://api.github.com/users/justincranford/following{/other_user}",
"gists_url": "https://api.github.com/users/justincranford/gists{/gist_id}",
"starred_url": "https://api.github.com/users/justincranford/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/justincranford/subscriptions",
"organizations_url": "https://api.github.com/users/justincranford/orgs",
"repos_url": "https://api.github.com/users/justincranford/repos",
"events_url": "https://api.github.com/users/justincranford/events{/privacy}",
"received_events_url": "https://api.github.com/users/justincranford/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2024-09-29T08:13:41
| 2024-09-29T08:13:41
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7030",
"html_url": "https://github.com/ollama/ollama/pull/7030",
"diff_url": "https://github.com/ollama/ollama/pull/7030.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7030.patch",
"merged_at": null
}
|
The `Cache-Control` response header is missing from APIs such as /api/tags and /api/ps.
Adding a `Cache-Control: max-age=0` directive to the HTTP response headers tells clients the response is stale immediately after it is received.
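The effect of `max-age=0` can be shown with a small client-side freshness check (a sketch of standard HTTP caching semantics, not Ollama code):

```python
import re

# Hedged sketch of an HTTP client's freshness check. With max-age=0 the
# response's age (always 0 or more seconds) is never strictly below the
# limit, so a cached copy is always treated as stale and revalidated.
def is_fresh(cache_control: str, age_seconds: int) -> bool:
    m = re.search(r"max-age=(\d+)", cache_control)
    if m is None:
        return False  # no explicit lifetime given: treat as stale
    return age_seconds < int(m.group(1))
```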
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7030/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7030/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8342
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8342/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8342/comments
|
https://api.github.com/repos/ollama/ollama/issues/8342/events
|
https://github.com/ollama/ollama/issues/8342
| 2,773,989,214
|
I_kwDOJ0Z1Ps6lV7de
| 8,342
|
CORS error x-stainless-helper-method
|
{
"login": "isamu",
"id": 231763,
"node_id": "MDQ6VXNlcjIzMTc2Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/231763?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/isamu",
"html_url": "https://github.com/isamu",
"followers_url": "https://api.github.com/users/isamu/followers",
"following_url": "https://api.github.com/users/isamu/following{/other_user}",
"gists_url": "https://api.github.com/users/isamu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/isamu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/isamu/subscriptions",
"organizations_url": "https://api.github.com/users/isamu/orgs",
"repos_url": "https://api.github.com/users/isamu/repos",
"events_url": "https://api.github.com/users/isamu/events{/privacy}",
"received_events_url": "https://api.github.com/users/isamu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2025-01-08T00:33:11
| 2025-01-17T01:58:44
| 2025-01-17T01:58:44
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am making requests to a local instance of Ollama using the OpenAI npm package from the browser. When I add `stream: true` to the OpenAI npm client, I encounter the following CORS error:
```
Access to fetch at 'http://127.0.0.1:11434/v1/chat/completions' from origin 'http://localhost:5174' has been blocked by CORS policy: Request header field x-stainless-helper-method is not allowed by Access-Control-Allow-Headers in preflight response.
```
When adding `stream: true` to OpenAI, the following HTTP header is included:
```
x-stainless-helper-method: stream
```
This header causes a CORS error. As with issue #6910, the header needs to be allowed in the preflight response.
Additionally, by inspecting OpenAI's npm package, the following headers may also be added:
- `X-Stainless-Poll-Helper`
- `X-Stainless-Custom-Poll-Interval`
These headers might require handling in your setup to avoid similar CORS-related issues.
Thanks.
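The failure mode is mechanical: the browser's preflight request lists the headers it wants to send in `Access-Control-Request-Headers`, and the request is blocked if any of them is missing from the server's `Access-Control-Allow-Headers`. A small sketch of that check (the allow-list shown is illustrative, not Ollama's actual configuration):

```python
# Hedged sketch of the browser-side CORS preflight check: every header
# the request wants to send must appear, case-insensitively, in the
# server's Access-Control-Allow-Headers list.
def preflight_ok(requested: list, allowed: list) -> bool:
    allowed_lower = {h.lower() for h in allowed}
    return all(h.lower() in allowed_lower for h in requested)
```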
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.5.4
|
{
"login": "isamu",
"id": 231763,
"node_id": "MDQ6VXNlcjIzMTc2Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/231763?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/isamu",
"html_url": "https://github.com/isamu",
"followers_url": "https://api.github.com/users/isamu/followers",
"following_url": "https://api.github.com/users/isamu/following{/other_user}",
"gists_url": "https://api.github.com/users/isamu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/isamu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/isamu/subscriptions",
"organizations_url": "https://api.github.com/users/isamu/orgs",
"repos_url": "https://api.github.com/users/isamu/repos",
"events_url": "https://api.github.com/users/isamu/events{/privacy}",
"received_events_url": "https://api.github.com/users/isamu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8342/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8342/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6188
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6188/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6188/comments
|
https://api.github.com/repos/ollama/ollama/issues/6188/events
|
https://github.com/ollama/ollama/pull/6188
| 2,449,535,600
|
PR_kwDOJ0Z1Ps53f-rI
| 6,188
|
Allow singular array for CompletionRequest prompt field
|
{
"login": "igor-drozdov",
"id": 3660805,
"node_id": "MDQ6VXNlcjM2NjA4MDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/3660805?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/igor-drozdov",
"html_url": "https://github.com/igor-drozdov",
"followers_url": "https://api.github.com/users/igor-drozdov/followers",
"following_url": "https://api.github.com/users/igor-drozdov/following{/other_user}",
"gists_url": "https://api.github.com/users/igor-drozdov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/igor-drozdov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/igor-drozdov/subscriptions",
"organizations_url": "https://api.github.com/users/igor-drozdov/orgs",
"repos_url": "https://api.github.com/users/igor-drozdov/repos",
"events_url": "https://api.github.com/users/igor-drozdov/events{/privacy}",
"received_events_url": "https://api.github.com/users/igor-drozdov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2024-08-05T21:47:12
| 2024-12-24T03:57:37
| 2024-12-24T03:57:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6188",
"html_url": "https://github.com/ollama/ollama/pull/6188",
"diff_url": "https://github.com/ollama/ollama/pull/6188.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6188.patch",
"merged_at": null
}
|
## Overview
Allow the OpenAI `v1/completions` endpoint to handle `[]string`, `[]int` and `[][]int`, in addition to just a `string`, per https://platform.openai.com/docs/api-reference/completions/create#completions-create-prompt
Some aggregators (like litellm) also send a list of prompts as the prompt: https://github.com/ollama/ollama/issues/5259#issuecomment-2242611375
This PR allows sending singular arrays as the `prompt` field.
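The decoding rule is roughly: accept a string as-is, and accept a singular array by unwrapping its one element. A sketch in Python for illustration (the PR itself is Go, where this would be a custom `UnmarshalJSON`):

```python
import json

# Hedged sketch (Python for illustration; the actual change is Go):
# normalize an OpenAI-style `prompt` field that may arrive either as
# a JSON string or as a singular array, rejecting anything else.
def normalize_prompt(raw: str) -> str:
    value = json.loads(raw)
    if isinstance(value, str):
        return value
    if isinstance(value, list) and len(value) == 1 and isinstance(value[0], str):
        return value[0]
    raise ValueError("unsupported prompt type")
```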
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6188/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6188/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/387
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/387/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/387/comments
|
https://api.github.com/repos/ollama/ollama/issues/387/events
|
https://github.com/ollama/ollama/issues/387
| 1,857,908,587
|
I_kwDOJ0Z1Ps5uvW9r
| 387
|
Client can't connect to server
|
{
"login": "freeqaz",
"id": 4573221,
"node_id": "MDQ6VXNlcjQ1NzMyMjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/4573221?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/freeqaz",
"html_url": "https://github.com/freeqaz",
"followers_url": "https://api.github.com/users/freeqaz/followers",
"following_url": "https://api.github.com/users/freeqaz/following{/other_user}",
"gists_url": "https://api.github.com/users/freeqaz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/freeqaz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/freeqaz/subscriptions",
"organizations_url": "https://api.github.com/users/freeqaz/orgs",
"repos_url": "https://api.github.com/users/freeqaz/repos",
"events_url": "https://api.github.com/users/freeqaz/events{/privacy}",
"received_events_url": "https://api.github.com/users/freeqaz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 8
| 2023-08-20T00:25:51
| 2024-01-08T17:44:35
| 2023-08-22T21:41:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Following the readme on my Arch linux setup yields the following error:
```sh
$ ./ollama run llama2
Error: could not connect to ollama server, run 'ollama serve' to start it
```
Steps to reproduce:
```sh
git clone git@github.com:jmorganca/ollama.git
cd ollama
go build .
./ollama serve &
./ollama run llama2
```
The output from the serve command is the following:
```sh
$ ./ollama serve
Couldn't find '/home/<USER>/.ollama/id_ed25519'. Generating new private key.
Your new public key is:
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMQh86qSVLsOKQASDF123/FpS123/ASDF123ADg0uHka
[GIN-debug] [WARNING] Creating an Engine instance with the Logger and Recovery middleware already attached.
[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
- using env: export GIN_MODE=release
- using code: gin.SetMode(gin.ReleaseMode)
[GIN-debug] GET / --> github.com/jmorganca/ollama/server.Serve.func1 (4 handlers)
[GIN-debug] HEAD / --> github.com/jmorganca/ollama/server.Serve.func2 (4 handlers)
[GIN-debug] POST /api/pull --> github.com/jmorganca/ollama/server.PullModelHandler (4 handlers)
[GIN-debug] POST /api/generate --> github.com/jmorganca/ollama/server.GenerateHandler (4 handlers)
[GIN-debug] POST /api/embeddings --> github.com/jmorganca/ollama/server.EmbeddingHandler (4 handlers)
[GIN-debug] POST /api/create --> github.com/jmorganca/ollama/server.CreateModelHandler (4 handlers)
[GIN-debug] POST /api/push --> github.com/jmorganca/ollama/server.PushModelHandler (4 handlers)
[GIN-debug] POST /api/copy --> github.com/jmorganca/ollama/server.CopyModelHandler (4 handlers)
[GIN-debug] GET /api/tags --> github.com/jmorganca/ollama/server.ListModelsHandler (4 handlers)
[GIN-debug] DELETE /api/delete --> github.com/jmorganca/ollama/server.DeleteModelHandler (4 handlers)
2023/08/19 17:12:15 routes.go:437: Listening on 127.0.0.1:11434
[GIN] 2023/08/19 - 17:12:51 | 400 | 389.591µs | 127.0.0.1 | POST "/api/generate"
```
Searching for the error in the code doesn't show me which line of the Go source is the issue. I have tried passing `--verbose` and other flags while reading the source code, as well as setting `OLLAMA_HOST="localhost:11434"`, since I figured that listening only on `127.0.0.1:11434` might be the issue. Alas, no success.
I also checked my firewall and I don't think that's the issue because `curl` works.
```sh
$ curl -X POST http://localhost:11434/api/generate -d '{
"model": "llama2",
"prompt":"Why is the sky blue?"
}'
{"error":"stat /home/<USER>/.ollama/models/manifests/registry.ollama.ai/library/llama2/latest: no such file or directory"}
```
I wish that I could make this report more actionable for y'all. I just know that I'm not the only user that's going to get hit by this and want to make an issue to track this. Thanks for the help -- hopefully we can find a solution that's easy and be able to document for users how to debug this in the future.
Cheers!
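One concrete thing to check is how the client resolves the server address: it should honor `OLLAMA_HOST` when set and otherwise fall back to the `127.0.0.1:11434` default the server log above shows. A sketch of that resolution (illustrative only, not Ollama's actual logic):

```python
import os

# Hedged sketch of client-side host resolution: prefer OLLAMA_HOST,
# fall back to the default bind address, and add an http:// scheme
# when the value is a bare host:port.
def resolve_host(env=None) -> str:
    env = os.environ if env is None else env
    host = env.get("OLLAMA_HOST", "127.0.0.1:11434")
    if "://" not in host:
        host = "http://" + host
    return host
```

Comparing what the client resolves against the address in the `Listening on ...` log line is a quick way to rule out a host/port mismatch like the one suspected here.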
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/387/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/387/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1222
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1222/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1222/comments
|
https://api.github.com/repos/ollama/ollama/issues/1222/events
|
https://github.com/ollama/ollama/pull/1222
| 2,004,767,209
|
PR_kwDOJ0Z1Ps5gDBMM
| 1,222
|
fix relative path on create
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-21T17:06:12
| 2023-11-21T20:43:18
| 2023-11-21T20:43:18
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1222",
"html_url": "https://github.com/ollama/ollama/pull/1222",
"diff_url": "https://github.com/ollama/ollama/pull/1222.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1222.patch",
"merged_at": "2023-11-21T20:43:18"
}
|
This fixes a regression in the API. Previously calling the API directly with a modelfile that has a relative file would work.
Ex:
```
FROM nous-capybara-34b.Q4_0.gguf
TEMPLATE "USER: { .Prompt } ASSISTANT: "
```
```
curl -X POST http://localhost:11434/api/create -d '{
"name": "bruce/nous-capybara",
"path": "/Users/bruce/models/nous-capybara/Modelfile"
}'
```
part of #1217
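The fix can be sketched as resolving a relative `FROM` against the Modelfile's own directory; the function name below is illustrative, not ollama's actual code:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// resolveFrom interprets a relative FROM path relative to the directory
// containing the Modelfile, which is the behavior the regression broke.
func resolveFrom(modelfilePath, from string) string {
	if filepath.IsAbs(from) {
		return from
	}
	return filepath.Join(filepath.Dir(modelfilePath), from)
}

func main() {
	fmt.Println(resolveFrom("/Users/bruce/models/nous-capybara/Modelfile",
		"nous-capybara-34b.Q4_0.gguf"))
}
```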
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1222/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1222/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3916
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3916/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3916/comments
|
https://api.github.com/repos/ollama/ollama/issues/3916/events
|
https://github.com/ollama/ollama/issues/3916
| 2,264,122,788
|
I_kwDOJ0Z1Ps6G88Wk
| 3,916
|
Error: The parameter is incorrect.
|
{
"login": "aaamoon",
"id": 25700476,
"node_id": "MDQ6VXNlcjI1NzAwNDc2",
"avatar_url": "https://avatars.githubusercontent.com/u/25700476?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aaamoon",
"html_url": "https://github.com/aaamoon",
"followers_url": "https://api.github.com/users/aaamoon/followers",
"following_url": "https://api.github.com/users/aaamoon/following{/other_user}",
"gists_url": "https://api.github.com/users/aaamoon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aaamoon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aaamoon/subscriptions",
"organizations_url": "https://api.github.com/users/aaamoon/orgs",
"repos_url": "https://api.github.com/users/aaamoon/repos",
"events_url": "https://api.github.com/users/aaamoon/events{/privacy}",
"received_events_url": "https://api.github.com/users/aaamoon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 21
| 2024-04-25T17:27:29
| 2025-01-24T01:50:07
| 2024-05-13T08:43:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
[GIN] 2024/04/26 - 01:24:28 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/04/26 - 01:24:28 | 200 | 1.1779ms | 127.0.0.1 | POST "/api/show"
[GIN] 2024/04/26 - 01:24:28 | 200 | 1.4496ms | 127.0.0.1 | POST "/api/show"
{"function":"process_single_task","level":"INFO","line":1510,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":7,"tid":"28416","timestamp":1714065868}
{"function":"log_server_request","level":"INFO","line":2741,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":53568,"status":200,"tid":"32280","timestamp":1714065868}
[GIN] 2024/04/26 - 01:24:28 | 200 | 2.2784ms | 127.0.0.1 | POST "/api/chat"
<img width="691" alt="image" src="https://github.com/ollama/ollama/assets/25700476/0a4c78aa-6a17-4a04-be93-d0cfec776682">
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.32
|
{
"login": "aaamoon",
"id": 25700476,
"node_id": "MDQ6VXNlcjI1NzAwNDc2",
"avatar_url": "https://avatars.githubusercontent.com/u/25700476?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aaamoon",
"html_url": "https://github.com/aaamoon",
"followers_url": "https://api.github.com/users/aaamoon/followers",
"following_url": "https://api.github.com/users/aaamoon/following{/other_user}",
"gists_url": "https://api.github.com/users/aaamoon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aaamoon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aaamoon/subscriptions",
"organizations_url": "https://api.github.com/users/aaamoon/orgs",
"repos_url": "https://api.github.com/users/aaamoon/repos",
"events_url": "https://api.github.com/users/aaamoon/events{/privacy}",
"received_events_url": "https://api.github.com/users/aaamoon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3916/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3916/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7939
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7939/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7939/comments
|
https://api.github.com/repos/ollama/ollama/issues/7939/events
|
https://github.com/ollama/ollama/pull/7939
| 2,719,102,345
|
PR_kwDOJ0Z1Ps6EHPp1
| 7,939
|
Add generate endpoint for structured outputs
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-05T01:22:12
| 2024-12-05T01:37:14
| 2024-12-05T01:37:12
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7939",
"html_url": "https://github.com/ollama/ollama/pull/7939",
"diff_url": "https://github.com/ollama/ollama/pull/7939.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7939.patch",
"merged_at": "2024-12-05T01:37:12"
}
|
Follow up to #7900
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7939/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7939/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6430
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6430/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6430/comments
|
https://api.github.com/repos/ollama/ollama/issues/6430/events
|
https://github.com/ollama/ollama/pull/6430
| 2,474,353,323
|
PR_kwDOJ0Z1Ps54ygVE
| 6,430
|
Linux Doc cosmetic fixes.
|
{
"login": "fujitatomoya",
"id": 43395114,
"node_id": "MDQ6VXNlcjQzMzk1MTE0",
"avatar_url": "https://avatars.githubusercontent.com/u/43395114?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fujitatomoya",
"html_url": "https://github.com/fujitatomoya",
"followers_url": "https://api.github.com/users/fujitatomoya/followers",
"following_url": "https://api.github.com/users/fujitatomoya/following{/other_user}",
"gists_url": "https://api.github.com/users/fujitatomoya/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fujitatomoya/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fujitatomoya/subscriptions",
"organizations_url": "https://api.github.com/users/fujitatomoya/orgs",
"repos_url": "https://api.github.com/users/fujitatomoya/repos",
"events_url": "https://api.github.com/users/fujitatomoya/events{/privacy}",
"received_events_url": "https://api.github.com/users/fujitatomoya/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-08-19T22:31:20
| 2024-09-04T18:45:09
| 2024-09-04T18:45:09
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6430",
"html_url": "https://github.com/ollama/ollama/pull/6430",
"diff_url": "https://github.com/ollama/ollama/pull/6430.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6430.patch",
"merged_at": "2024-09-04T18:45:09"
}
|
Minor doc update for Linux users.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6430/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6430/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/399
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/399/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/399/comments
|
https://api.github.com/repos/ollama/ollama/issues/399/events
|
https://github.com/ollama/ollama/issues/399
| 1,862,248,922
|
I_kwDOJ0Z1Ps5u_6na
| 399
|
Images for Readmes
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2023-08-22T21:44:08
| 2023-08-29T18:42:50
| 2023-08-22T21:44:12
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |

|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/399/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/399/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5108
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5108/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5108/comments
|
https://api.github.com/repos/ollama/ollama/issues/5108/events
|
https://github.com/ollama/ollama/issues/5108
| 2,358,749,805
|
I_kwDOJ0Z1Ps6Ml6pt
| 5,108
|
ollama run loading a long time
|
{
"login": "wangzi2124",
"id": 13045190,
"node_id": "MDQ6VXNlcjEzMDQ1MTkw",
"avatar_url": "https://avatars.githubusercontent.com/u/13045190?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wangzi2124",
"html_url": "https://github.com/wangzi2124",
"followers_url": "https://api.github.com/users/wangzi2124/followers",
"following_url": "https://api.github.com/users/wangzi2124/following{/other_user}",
"gists_url": "https://api.github.com/users/wangzi2124/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wangzi2124/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wangzi2124/subscriptions",
"organizations_url": "https://api.github.com/users/wangzi2124/orgs",
"repos_url": "https://api.github.com/users/wangzi2124/repos",
"events_url": "https://api.github.com/users/wangzi2124/events{/privacy}",
"received_events_url": "https://api.github.com/users/wangzi2124/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 11
| 2024-06-18T02:52:50
| 2024-06-19T19:47:54
| 2024-06-19T08:15:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
<img width="1178" alt="20240618105223" src="https://github.com/ollama/ollama/assets/13045190/24009f4a-46ee-4ec4-a5c5-582a80208aeb">
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "wangzi2124",
"id": 13045190,
"node_id": "MDQ6VXNlcjEzMDQ1MTkw",
"avatar_url": "https://avatars.githubusercontent.com/u/13045190?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wangzi2124",
"html_url": "https://github.com/wangzi2124",
"followers_url": "https://api.github.com/users/wangzi2124/followers",
"following_url": "https://api.github.com/users/wangzi2124/following{/other_user}",
"gists_url": "https://api.github.com/users/wangzi2124/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wangzi2124/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wangzi2124/subscriptions",
"organizations_url": "https://api.github.com/users/wangzi2124/orgs",
"repos_url": "https://api.github.com/users/wangzi2124/repos",
"events_url": "https://api.github.com/users/wangzi2124/events{/privacy}",
"received_events_url": "https://api.github.com/users/wangzi2124/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5108/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5108/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6566
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6566/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6566/comments
|
https://api.github.com/repos/ollama/ollama/issues/6566/events
|
https://github.com/ollama/ollama/issues/6566
| 2,496,508,417
|
I_kwDOJ0Z1Ps6UzbIB
| 6,566
|
Ollama can't import safetensor of mistral 7B v0.1
|
{
"login": "ZhoraZhang",
"id": 48072946,
"node_id": "MDQ6VXNlcjQ4MDcyOTQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/48072946?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZhoraZhang",
"html_url": "https://github.com/ZhoraZhang",
"followers_url": "https://api.github.com/users/ZhoraZhang/followers",
"following_url": "https://api.github.com/users/ZhoraZhang/following{/other_user}",
"gists_url": "https://api.github.com/users/ZhoraZhang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZhoraZhang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZhoraZhang/subscriptions",
"organizations_url": "https://api.github.com/users/ZhoraZhang/orgs",
"repos_url": "https://api.github.com/users/ZhoraZhang/repos",
"events_url": "https://api.github.com/users/ZhoraZhang/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZhoraZhang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-08-30T07:53:43
| 2024-09-02T07:46:28
| 2024-09-01T23:11:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have written a Modelfile:
`FROM /data/models/Mistral-7B-v0.1` (the path to Mistral's safetensor files)
I tried to import the model using `/api/create` but it failed:
`{"error":"read /data/models/Mistral-7B-v0.1: is a directory"}`
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6566/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6566/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2642
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2642/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2642/comments
|
https://api.github.com/repos/ollama/ollama/issues/2642/events
|
https://github.com/ollama/ollama/issues/2642
| 2,147,190,008
|
I_kwDOJ0Z1Ps5_-4T4
| 2,642
|
🚀🔍 GPU Mystery: Unleashing the Power on Small Models but Stuck on Idle with Giants like MiXtral8x7B & Llama 70B on Ubuntu 22 🧩💡
|
{
"login": "jaifar530",
"id": 31308766,
"node_id": "MDQ6VXNlcjMxMzA4NzY2",
"avatar_url": "https://avatars.githubusercontent.com/u/31308766?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jaifar530",
"html_url": "https://github.com/jaifar530",
"followers_url": "https://api.github.com/users/jaifar530/followers",
"following_url": "https://api.github.com/users/jaifar530/following{/other_user}",
"gists_url": "https://api.github.com/users/jaifar530/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jaifar530/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jaifar530/subscriptions",
"organizations_url": "https://api.github.com/users/jaifar530/orgs",
"repos_url": "https://api.github.com/users/jaifar530/repos",
"events_url": "https://api.github.com/users/jaifar530/events{/privacy}",
"received_events_url": "https://api.github.com/users/jaifar530/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 4
| 2024-02-21T16:28:40
| 2024-03-12T02:00:28
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
I'm using Ubuntu 22. Both `nvcc --version` and `nvidia-smi` show valid output.
I've noticed that the GPU is not utilized when running larger models (e.g., MiXtral8x7B, Llama 70B), yet it works well with smaller models like Mistral and Llama 7B. Is this issue known to others, or is it just me experiencing it? By the way, I tested this on both an RTX 3090 and an RTX 2080, and both exhibited the same issue with the larger models.
Additionally, with the larger models (MiXtral8x7B and Llama 70B), the GPU RAM is almost fully utilized, but not the GPU itself (which is very strange), while the CPU is fully utilized.
Here is the summary:
Larger models (MiXtral8x7B, Llama 70B):
GPU: not utilized
GPU RAM: utilized
CPU: utilized
RAM: not utilized
Smaller models (Mistral and Llama 7B):
GPU: utilized
GPU RAM: utilized
CPU: not utilized
RAM: not utilized
In summary, I can unfortunately use the power of the GPU on small models only.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2642/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/2642/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5856
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5856/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5856/comments
|
https://api.github.com/repos/ollama/ollama/issues/5856/events
|
https://github.com/ollama/ollama/pull/5856
| 2,423,402,616
|
PR_kwDOJ0Z1Ps52Hrb1
| 5,856
|
template: disable func checking
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 1
| 2024-07-22T17:36:09
| 2024-12-17T19:51:20
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5856",
"html_url": "https://github.com/ollama/ollama/pull/5856",
"diff_url": "https://github.com/ollama/ollama/pull/5856.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5856.patch",
"merged_at": null
}
|
Func checking returns an error during parsing if a function is undefined, even if it never ends up being used. Disabling it allows additional functions to be defined in the future without breaking older versions, assuming the function usage is properly guarded.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5856/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5856/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6831
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6831/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6831/comments
|
https://api.github.com/repos/ollama/ollama/issues/6831/events
|
https://github.com/ollama/ollama/pull/6831
| 2,529,500,473
|
PR_kwDOJ0Z1Ps57rRWD
| 6,831
|
cache: Clear old KV cache entries when evicting a slot
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-16T21:06:24
| 2024-09-16T21:15:57
| 2024-09-16T21:15:56
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6831",
"html_url": "https://github.com/ollama/ollama/pull/6831",
"diff_url": "https://github.com/ollama/ollama/pull/6831.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6831.patch",
"merged_at": "2024-09-16T21:15:56"
}
|
When forking a cache entry, if no empty slots are available we evict the least recently used one and copy over the KV entries from the closest match. However, this copy does not overwrite existing values but only adds new ones. Therefore, we need to clear the old slot first.
This change fixes two issues:
- The KV cache fills up and runs out of space even though we think we are managing it correctly
- Performance gets worse over time as we use new cache entries that are not hot in the processor caches
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6831/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6831/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4806
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4806/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4806/comments
|
https://api.github.com/repos/ollama/ollama/issues/4806/events
|
https://github.com/ollama/ollama/issues/4806
| 2,332,560,619
|
I_kwDOJ0Z1Ps6LCAzr
| 4,806
|
codegemma broken on releases after v0.1.39
|
{
"login": "evertjr",
"id": 13040196,
"node_id": "MDQ6VXNlcjEzMDQwMTk2",
"avatar_url": "https://avatars.githubusercontent.com/u/13040196?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/evertjr",
"html_url": "https://github.com/evertjr",
"followers_url": "https://api.github.com/users/evertjr/followers",
"following_url": "https://api.github.com/users/evertjr/following{/other_user}",
"gists_url": "https://api.github.com/users/evertjr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/evertjr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/evertjr/subscriptions",
"organizations_url": "https://api.github.com/users/evertjr/orgs",
"repos_url": "https://api.github.com/users/evertjr/repos",
"events_url": "https://api.github.com/users/evertjr/events{/privacy}",
"received_events_url": "https://api.github.com/users/evertjr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 21
| 2024-06-04T05:05:04
| 2024-12-13T23:51:59
| 2024-11-12T01:42:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I use codegemma with the continue.dev extension in VS Code. It works fine on version 0.1.39, but on the last two releases it doesn't generate completions and behaves very strangely in the terminal.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4806/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4806/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2358
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2358/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2358/comments
|
https://api.github.com/repos/ollama/ollama/issues/2358/events
|
https://github.com/ollama/ollama/issues/2358
| 2,117,754,882
|
I_kwDOJ0Z1Ps5-OmAC
| 2,358
|
Models autodelete?
|
{
"login": "SinanAkkoyun",
"id": 43215895,
"node_id": "MDQ6VXNlcjQzMjE1ODk1",
"avatar_url": "https://avatars.githubusercontent.com/u/43215895?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SinanAkkoyun",
"html_url": "https://github.com/SinanAkkoyun",
"followers_url": "https://api.github.com/users/SinanAkkoyun/followers",
"following_url": "https://api.github.com/users/SinanAkkoyun/following{/other_user}",
"gists_url": "https://api.github.com/users/SinanAkkoyun/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SinanAkkoyun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SinanAkkoyun/subscriptions",
"organizations_url": "https://api.github.com/users/SinanAkkoyun/orgs",
"repos_url": "https://api.github.com/users/SinanAkkoyun/repos",
"events_url": "https://api.github.com/users/SinanAkkoyun/events{/privacy}",
"received_events_url": "https://api.github.com/users/SinanAkkoyun/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 15
| 2024-02-05T06:09:15
| 2024-09-30T19:12:24
| 2024-09-30T19:12:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi! I noticed that as soon as I kill Ollama (because one cannot unload models from VRAM manually) and start `ollama serve` on my own, all models delete themselves.
Is that a bug or a feature (perhaps ensuring non-corrupted files)?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2358/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2358/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6647
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6647/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6647/comments
|
https://api.github.com/repos/ollama/ollama/issues/6647/events
|
https://github.com/ollama/ollama/issues/6647
| 2,506,588,739
|
I_kwDOJ0Z1Ps6VZ4JD
| 6,647
|
can't use nvidia GPU only after sleep
|
{
"login": "brookate",
"id": 171191880,
"node_id": "U_kgDOCjQuSA",
"avatar_url": "https://avatars.githubusercontent.com/u/171191880?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/brookate",
"html_url": "https://github.com/brookate",
"followers_url": "https://api.github.com/users/brookate/followers",
"following_url": "https://api.github.com/users/brookate/following{/other_user}",
"gists_url": "https://api.github.com/users/brookate/gists{/gist_id}",
"starred_url": "https://api.github.com/users/brookate/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/brookate/subscriptions",
"organizations_url": "https://api.github.com/users/brookate/orgs",
"repos_url": "https://api.github.com/users/brookate/repos",
"events_url": "https://api.github.com/users/brookate/events{/privacy}",
"received_events_url": "https://api.github.com/users/brookate/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-09-05T01:35:17
| 2024-09-05T16:21:35
| 2024-09-05T16:21:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After my Gentoo Linux machine resumes from sleep, Ollama only uses the CPU.
With OLLAMA_DEBUG turned on, I find these lines:
```
time=2024-09-05T09:20:35.622+08:00 level=DEBUG source=gpu.go:521 msg="discovered GPU libraries" paths="[/tmp/ollama786265597/runners/cuda_v11/libcudart.so.11.0 /opt/cuda/lib64/libcudart.so.12.6.37]"
cudaSetDevice err: 999
time=2024-09-05T09:20:35.624+08:00 level=DEBUG source=gpu.go:533 msg="Unable to load cudart" library=/tmp/ollama786265597/runners/cuda_v11/libcudart.so.11.0 error="cudart init failure: 999"
cudaSetDevice err: 999
time=2024-09-05T09:20:35.627+08:00 level=DEBUG source=gpu.go:533 msg="Unable to load cudart" library=/opt/cuda/lib64/libcudart.so.12.6.37 error="cudart init failure: 999"
```
My CUDA installation is:
```
dell➜ ~ cd /opt/cuda
dell➜ cuda l
total 88K
drwxr-xr-x 6 root root 4.0K Aug 28 10:01 .
drwxr-xr-x 14 root root 4.0K Aug 26 18:22 ..
-rw-r--r-- 1 root root 62K Aug 28 10:00 EULA.txt
drwxr-xr-x 3 root root 4.0K Aug 28 10:01 bin
lrwxrwxrwx 1 root root 28 Aug 28 10:00 include -> targets/x86_64-linux/include
lrwxrwxrwx 1 root root 24 Aug 28 10:00 lib64 -> targets/x86_64-linux/lib
drwxr-xr-x 3 root root 4.0K Jul 31 18:53 nvml
drwxr-xr-x 6 root root 4.0K Jul 31 18:53 nvvm
drwxr-xr-x 3 root root 4.0K Jul 31 18:53 targets
dell➜ cuda cd targets/x86_64-linux/lib
dell➜ lib ls libcudart.so
libcudart.so@ libcudart.so.12@ libcudart.so.12.6.37*
```
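For what it's worth, cudart init failure 999 (`cudaErrorUnknown`) right after suspend/resume usually indicates stale NVIDIA driver state rather than anything Ollama-specific. A commonly suggested workaround (assumptions: a systemd-managed `ollama` service and no other process holding the GPU) is to reload the UVM kernel module:

```
sudo systemctl stop ollama      # free the GPU
sudo rmmod nvidia_uvm           # unload the stale UVM module
sudo modprobe nvidia_uvm        # reload it
sudo systemctl start ollama
```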
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.0
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6647/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6647/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7434
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7434/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7434/comments
|
https://api.github.com/repos/ollama/ollama/issues/7434/events
|
https://github.com/ollama/ollama/pull/7434
| 2,625,652,718
|
PR_kwDOJ0Z1Ps6AdqfH
| 7,434
|
Remove server.cpp compatibility code
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-31T00:26:44
| 2024-11-06T21:32:20
| 2024-11-06T21:32:19
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7434",
"html_url": "https://github.com/ollama/ollama/pull/7434",
"diff_url": "https://github.com/ollama/ollama/pull/7434.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7434.patch",
"merged_at": "2024-11-06T21:32:19"
}
|
Some interfaces in the Go runner were kept the same as server.cpp for compatibility, we can now start to make things more natural.
The one user-facing impact of this change is that multimodal models other than mllama can now support parallel requests.
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7434/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7434/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2174
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2174/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2174/comments
|
https://api.github.com/repos/ollama/ollama/issues/2174/events
|
https://github.com/ollama/ollama/pull/2174
| 2,098,860,070
|
PR_kwDOJ0Z1Ps5k-_kP
| 2,174
|
More logging for gpu management
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-24T18:35:14
| 2024-01-24T19:09:20
| 2024-01-24T19:09:18
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2174",
"html_url": "https://github.com/ollama/ollama/pull/2174",
"diff_url": "https://github.com/ollama/ollama/pull/2174.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2174.patch",
"merged_at": "2024-01-24T19:09:17"
}
|
Fix an ordering glitch of dlerror/dlclose and add more logging to help root-cause some crashes users are hitting. This also renames the function pointers to use the underlying function names instead of simplified names, for readability.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2174/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2174/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/505
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/505/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/505/comments
|
https://api.github.com/repos/ollama/ollama/issues/505/events
|
https://github.com/ollama/ollama/issues/505
| 1,889,361,502
|
I_kwDOJ0Z1Ps5wnV5e
| 505
|
Go library fails to compile
|
{
"login": "JayNakrani",
"id": 6269279,
"node_id": "MDQ6VXNlcjYyNjkyNzk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6269279?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JayNakrani",
"html_url": "https://github.com/JayNakrani",
"followers_url": "https://api.github.com/users/JayNakrani/followers",
"following_url": "https://api.github.com/users/JayNakrani/following{/other_user}",
"gists_url": "https://api.github.com/users/JayNakrani/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JayNakrani/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JayNakrani/subscriptions",
"organizations_url": "https://api.github.com/users/JayNakrani/orgs",
"repos_url": "https://api.github.com/users/JayNakrani/repos",
"events_url": "https://api.github.com/users/JayNakrani/events{/privacy}",
"received_events_url": "https://api.github.com/users/JayNakrani/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2023-09-10T22:27:08
| 2023-10-03T18:21:02
| 2023-09-11T17:51:42
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am trying to use the [Ollama Go library](https://pkg.go.dev/github.com/jmorganca/ollama/server) in my own project, and am running into the following error:
```shell
% go build .
../../go/pkg/mod/github.com/jmorganca/ollama@v0.0.18/llm/ggml_llama.go:31:12: pattern llama.cpp/ggml/build/*/bin/*: no matching files found
```
[JayNakrani/ollama-lib-issue](https://github.com/JayNakrani/ollama-lib-issue) repo has the minimal repro code. Filing this issue based on [discussion on discord](https://discord.com/channels/1128867683291627614/1128867684130508875/1150556983246721107)
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/505/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/505/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/2991
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2991/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2991/comments
|
https://api.github.com/repos/ollama/ollama/issues/2991/events
|
https://github.com/ollama/ollama/pull/2991
| 2,174,627,374
|
PR_kwDOJ0Z1Ps5pAdn5
| 2,991
|
fix ci
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-07T19:34:00
| 2024-03-07T19:35:06
| 2024-03-07T19:35:06
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2991",
"html_url": "https://github.com/ollama/ollama/pull/2991",
"diff_url": "https://github.com/ollama/ollama/pull/2991.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2991.patch",
"merged_at": "2024-03-07T19:35:06"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2991/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2991/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2264
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2264/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2264/comments
|
https://api.github.com/repos/ollama/ollama/issues/2264/events
|
https://github.com/ollama/ollama/pull/2264
| 2,106,894,561
|
PR_kwDOJ0Z1Ps5lZpJ2
| 2,264
|
Add support for MIG mode detection and use
|
{
"login": "waTeim",
"id": 5779395,
"node_id": "MDQ6VXNlcjU3NzkzOTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/5779395?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/waTeim",
"html_url": "https://github.com/waTeim",
"followers_url": "https://api.github.com/users/waTeim/followers",
"following_url": "https://api.github.com/users/waTeim/following{/other_user}",
"gists_url": "https://api.github.com/users/waTeim/gists{/gist_id}",
"starred_url": "https://api.github.com/users/waTeim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/waTeim/subscriptions",
"organizations_url": "https://api.github.com/users/waTeim/orgs",
"repos_url": "https://api.github.com/users/waTeim/repos",
"events_url": "https://api.github.com/users/waTeim/events{/privacy}",
"received_events_url": "https://api.github.com/users/waTeim/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 18
| 2024-01-30T03:47:30
| 2024-05-25T15:37:06
| 2024-05-25T15:37:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2264",
"html_url": "https://github.com/ollama/ollama/pull/2264",
"diff_url": "https://github.com/ollama/ollama/pull/2264.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2264.patch",
"merged_at": null
}
|
The issue here is that when the startup code checks the capabilities of the GPU so it can allocate resources (in particular memory), it mistakenly uses the host GPU for its check rather than the MIG instance. This PR modifies the CUDA GPU detection algorithm: for each host GPU, check whether that GPU supports MIG and whether MIG is enabled; if so, iterate over all of its MIG instances. This results in a deviceMap_t
typedef struct {
unsigned numDevices;
nvmlDevice_t **layout;
} deviceMap_t;
Later, that map can be iterated over. `layout[i][0]` is a pointer to the ith host GPU. `layout[i][j + 1]` is the jth MIG instance of host GPU **i**. A value of `(void*)0` marks the end of the MIG instance list. There can only be 7 total MIG instances per host GPU, so the size of the pointer array for each host is set to 9. Both `cuda_check_vram` and `cuda_compute_capability` were updated to use this new data structure.
MIG-related API calls were added to enable this; see [multi GPU management](https://docs.nvidia.com/deploy/archive/R520/nvml-api/group__nvmlMultiInstanceGPU.html) for details.
Addresses #1500
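The layout convention above can be exercised with a small, self-contained sketch. This is an illustration only: `nvmlDevice_t` is mocked as `void*` and the helper names (`count_mig`, `build_example`) are hypothetical, not the PR's actual functions.

```c
#include <assert.h>
#include <stddef.h>
#include <stdlib.h>

/* Mock of the PR's deviceMap_t with nvmlDevice_t replaced by void*,
 * so the layout can be walked without the NVML headers. */
typedef struct {
    unsigned numDevices;   /* number of host GPUs */
    void ***layout;        /* layout[i][0] = host GPU i;
                              layout[i][j + 1] = jth MIG instance;
                              (void*)0 terminates the MIG list */
} deviceMap_t;

/* Count the MIG instances of host GPU i by walking to the NULL terminator. */
static unsigned count_mig(const deviceMap_t *m, unsigned i) {
    unsigned n = 0;
    while (m->layout[i][n + 1] != NULL)
        n++;
    return n;
}

/* Build a one-host map with two fake MIG instances, using the 9-slot
 * row described above (1 host pointer + up to 7 MIG + terminator). */
static deviceMap_t build_example(void) {
    static int host, mig0, mig1;             /* stand-ins for device handles */
    deviceMap_t m = { 1, NULL };
    m.layout = malloc(sizeof(void **) * m.numDevices);
    m.layout[0] = calloc(9, sizeof(void *)); /* zeroed, so NULL-terminated */
    m.layout[0][0] = &host;
    m.layout[0][1] = &mig0;
    m.layout[0][2] = &mig1;
    return m;
}
```

Iterating the map then becomes a nested loop: the outer loop over `numDevices`, the inner loop over each row until the terminator.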
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2264/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2264/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7216
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7216/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7216/comments
|
https://api.github.com/repos/ollama/ollama/issues/7216/events
|
https://github.com/ollama/ollama/pull/7216
| 2,589,972,279
|
PR_kwDOJ0Z1Ps5-vGyM
| 7,216
|
Update README.md
|
{
"login": "tcsenpai",
"id": 153772003,
"node_id": "U_kgDOCSpf4w",
"avatar_url": "https://avatars.githubusercontent.com/u/153772003?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tcsenpai",
"html_url": "https://github.com/tcsenpai",
"followers_url": "https://api.github.com/users/tcsenpai/followers",
"following_url": "https://api.github.com/users/tcsenpai/following{/other_user}",
"gists_url": "https://api.github.com/users/tcsenpai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tcsenpai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tcsenpai/subscriptions",
"organizations_url": "https://api.github.com/users/tcsenpai/orgs",
"repos_url": "https://api.github.com/users/tcsenpai/repos",
"events_url": "https://api.github.com/users/tcsenpai/events{/privacy}",
"received_events_url": "https://api.github.com/users/tcsenpai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-10-15T21:29:27
| 2024-11-29T20:45:25
| 2024-11-28T23:16:28
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7216",
"html_url": "https://github.com/ollama/ollama/pull/7216",
"diff_url": "https://github.com/ollama/ollama/pull/7216.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7216.patch",
"merged_at": "2024-11-28T23:16:28"
}
|
added three projects
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7216/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7216/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6385
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6385/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6385/comments
|
https://api.github.com/repos/ollama/ollama/issues/6385/events
|
https://github.com/ollama/ollama/issues/6385
| 2,469,368,937
|
I_kwDOJ0Z1Ps6TL5Rp
| 6,385
|
Significant Drop in Prompt Adherence in Updated Gemma2 Model
|
{
"login": "shzhou12",
"id": 26590783,
"node_id": "MDQ6VXNlcjI2NTkwNzgz",
"avatar_url": "https://avatars.githubusercontent.com/u/26590783?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shzhou12",
"html_url": "https://github.com/shzhou12",
"followers_url": "https://api.github.com/users/shzhou12/followers",
"following_url": "https://api.github.com/users/shzhou12/following{/other_user}",
"gists_url": "https://api.github.com/users/shzhou12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shzhou12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shzhou12/subscriptions",
"organizations_url": "https://api.github.com/users/shzhou12/orgs",
"repos_url": "https://api.github.com/users/shzhou12/repos",
"events_url": "https://api.github.com/users/shzhou12/events{/privacy}",
"received_events_url": "https://api.github.com/users/shzhou12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2024-08-16T03:27:57
| 2024-08-16T05:49:15
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I recently noticed that the Gemma2 model was updated 5 weeks ago, resulting in a new version of gemma2:9b-instruct-fp16:
- Older Version (6 weeks ago): gemma2:9b-instruct-fp16 - **9de55d4bf6ae** - 18 GB
- Updated Version (5 weeks ago): gemma2:9b-instruct-fp16 - **28e6684b0850** - 18 GB
After switching to the updated version 28e6684b0850, I've observed a significant decrease in the model's ability to adhere to prompts in my specific downstream tasks.
Could you please clarify why there are two different versions of gemma2:9b-instruct-fp16 and what changes were made between these versions? Should I revert to the older version 9de55d4bf6ae to maintain the previous performance?
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.2.8
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6385/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6385/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6656
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6656/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6656/comments
|
https://api.github.com/repos/ollama/ollama/issues/6656/events
|
https://github.com/ollama/ollama/pull/6656
| 2,507,674,135
|
PR_kwDOJ0Z1Ps56hQ7W
| 6,656
|
Fixed redirect check if direct URL is already Present
|
{
"login": "Tobix99",
"id": 22603015,
"node_id": "MDQ6VXNlcjIyNjAzMDE1",
"avatar_url": "https://avatars.githubusercontent.com/u/22603015?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Tobix99",
"html_url": "https://github.com/Tobix99",
"followers_url": "https://api.github.com/users/Tobix99/followers",
"following_url": "https://api.github.com/users/Tobix99/following{/other_user}",
"gists_url": "https://api.github.com/users/Tobix99/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Tobix99/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Tobix99/subscriptions",
"organizations_url": "https://api.github.com/users/Tobix99/orgs",
"repos_url": "https://api.github.com/users/Tobix99/repos",
"events_url": "https://api.github.com/users/Tobix99/events{/privacy}",
"received_events_url": "https://api.github.com/users/Tobix99/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-09-05T12:36:50
| 2024-09-05T17:48:27
| 2024-09-05T17:48:27
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6656",
"html_url": "https://github.com/ollama/ollama/pull/6656",
"diff_url": "https://github.com/ollama/ollama/pull/6656.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6656.patch",
"merged_at": "2024-09-05T17:48:27"
}
|
This is a fix regarding #6308 where the redirect check would fail with
`unexpected status code 200`.
The problem is that if you try to pull a model from an internal registry, there is no redirect, but the current logic expects at least one redirect. So I've added the status code 200 (OK) to the check and return the Location of the redirect.
The bug was introduced with [Pull#5962](https://github.com/ollama/ollama/pull/5962)
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6656/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6656/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5678
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5678/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5678/comments
|
https://api.github.com/repos/ollama/ollama/issues/5678/events
|
https://github.com/ollama/ollama/pull/5678
| 2,407,096,074
|
PR_kwDOJ0Z1Ps51TTQ1
| 5,678
|
Add API integration tests
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-07-13T18:46:23
| 2025-01-16T17:36:31
| 2025-01-16T17:36:31
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5678",
"html_url": "https://github.com/ollama/ollama/pull/5678",
"diff_url": "https://github.com/ollama/ollama/pull/5678.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5678.patch",
"merged_at": null
}
|
These tests try to validate fields in the response payloads to catch regressions if we drop any.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5678/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5678/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4056
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4056/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4056/comments
|
https://api.github.com/repos/ollama/ollama/issues/4056/events
|
https://github.com/ollama/ollama/issues/4056
| 2,271,980,393
|
I_kwDOJ0Z1Ps6Ha6tp
| 4,056
|
How to change model store path on disk?
|
{
"login": "Leonard-Li777",
"id": 16662626,
"node_id": "MDQ6VXNlcjE2NjYyNjI2",
"avatar_url": "https://avatars.githubusercontent.com/u/16662626?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Leonard-Li777",
"html_url": "https://github.com/Leonard-Li777",
"followers_url": "https://api.github.com/users/Leonard-Li777/followers",
"following_url": "https://api.github.com/users/Leonard-Li777/following{/other_user}",
"gists_url": "https://api.github.com/users/Leonard-Li777/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Leonard-Li777/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Leonard-Li777/subscriptions",
"organizations_url": "https://api.github.com/users/Leonard-Li777/orgs",
"repos_url": "https://api.github.com/users/Leonard-Li777/repos",
"events_url": "https://api.github.com/users/Leonard-Li777/events{/privacy}",
"received_events_url": "https://api.github.com/users/Leonard-Li777/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-04-30T16:28:47
| 2024-04-30T18:06:23
| 2024-04-30T18:06:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
There is not enough space on the disk.
|
{
"login": "Leonard-Li777",
"id": 16662626,
"node_id": "MDQ6VXNlcjE2NjYyNjI2",
"avatar_url": "https://avatars.githubusercontent.com/u/16662626?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Leonard-Li777",
"html_url": "https://github.com/Leonard-Li777",
"followers_url": "https://api.github.com/users/Leonard-Li777/followers",
"following_url": "https://api.github.com/users/Leonard-Li777/following{/other_user}",
"gists_url": "https://api.github.com/users/Leonard-Li777/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Leonard-Li777/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Leonard-Li777/subscriptions",
"organizations_url": "https://api.github.com/users/Leonard-Li777/orgs",
"repos_url": "https://api.github.com/users/Leonard-Li777/repos",
"events_url": "https://api.github.com/users/Leonard-Li777/events{/privacy}",
"received_events_url": "https://api.github.com/users/Leonard-Li777/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4056/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4056/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5534
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5534/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5534/comments
|
https://api.github.com/repos/ollama/ollama/issues/5534/events
|
https://github.com/ollama/ollama/pull/5534
| 2,394,139,803
|
PR_kwDOJ0Z1Ps50nfzP
| 5,534
|
llm: allow gemma 2 to context shift
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-07T17:15:01
| 2024-07-07T17:41:53
| 2024-07-07T17:41:51
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5534",
"html_url": "https://github.com/ollama/ollama/pull/5534",
"diff_url": "https://github.com/ollama/ollama/pull/5534.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5534.patch",
"merged_at": "2024-07-07T17:41:51"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5534/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5534/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4468
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4468/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4468/comments
|
https://api.github.com/repos/ollama/ollama/issues/4468/events
|
https://github.com/ollama/ollama/issues/4468
| 2,299,552,120
|
I_kwDOJ0Z1Ps6JEGF4
| 4,468
|
Ollama speed dropped with setting OLLAMA_NUM_PARALLEL
|
{
"login": "hugefrog",
"id": 83398604,
"node_id": "MDQ6VXNlcjgzMzk4NjA0",
"avatar_url": "https://avatars.githubusercontent.com/u/83398604?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hugefrog",
"html_url": "https://github.com/hugefrog",
"followers_url": "https://api.github.com/users/hugefrog/followers",
"following_url": "https://api.github.com/users/hugefrog/following{/other_user}",
"gists_url": "https://api.github.com/users/hugefrog/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hugefrog/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hugefrog/subscriptions",
"organizations_url": "https://api.github.com/users/hugefrog/orgs",
"repos_url": "https://api.github.com/users/hugefrog/repos",
"events_url": "https://api.github.com/users/hugefrog/events{/privacy}",
"received_events_url": "https://api.github.com/users/hugefrog/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-05-16T07:17:02
| 2024-06-24T15:13:57
| 2024-06-21T23:27:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After setting OLLAMA_NUM_PARALLEL in Ollama 0.1.38, the speed of single-user access has dropped by half, and the GPU utilization rate is only about 50%.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.38
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4468/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4468/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7782
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7782/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7782/comments
|
https://api.github.com/repos/ollama/ollama/issues/7782/events
|
https://github.com/ollama/ollama/pull/7782
| 2,680,294,447
|
PR_kwDOJ0Z1Ps6CtBbB
| 7,782
|
tests: fix max queue integration test
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-21T17:32:27
| 2024-11-22T16:05:49
| 2024-11-22T16:05:46
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7782",
"html_url": "https://github.com/ollama/ollama/pull/7782",
"diff_url": "https://github.com/ollama/ollama/pull/7782.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7782.patch",
"merged_at": "2024-11-22T16:05:46"
}
|
This had fallen out of sync with the envconfig behavior, where the max queue default was not zero.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7782/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7782/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4543
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4543/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4543/comments
|
https://api.github.com/repos/ollama/ollama/issues/4543/events
|
https://github.com/ollama/ollama/pull/4543
| 2,306,518,363
|
PR_kwDOJ0Z1Ps5v_N-V
| 4,543
|
simplify safetensors reading
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-20T18:18:01
| 2024-05-21T21:43:56
| 2024-05-21T21:43:55
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4543",
"html_url": "https://github.com/ollama/ollama/pull/4543",
"diff_url": "https://github.com/ollama/ollama/pull/4543.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4543.patch",
"merged_at": "2024-05-21T21:43:55"
}
|
mapstructure is unnecessary and the safetensors header can be read directly into a struct.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4543/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4543/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6366
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6366/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6366/comments
|
https://api.github.com/repos/ollama/ollama/issues/6366/events
|
https://github.com/ollama/ollama/issues/6366
| 2,467,123,490
|
I_kwDOJ0Z1Ps6TDVEi
| 6,366
|
Unable to Pull Model Manifest - "Get https://registry.ollama.ai/v2/library/llama3/manifests/latest: EOF"
|
{
"login": "uestcxt",
"id": 58102372,
"node_id": "MDQ6VXNlcjU4MTAyMzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/58102372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/uestcxt",
"html_url": "https://github.com/uestcxt",
"followers_url": "https://api.github.com/users/uestcxt/followers",
"following_url": "https://api.github.com/users/uestcxt/following{/other_user}",
"gists_url": "https://api.github.com/users/uestcxt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/uestcxt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/uestcxt/subscriptions",
"organizations_url": "https://api.github.com/users/uestcxt/orgs",
"repos_url": "https://api.github.com/users/uestcxt/repos",
"events_url": "https://api.github.com/users/uestcxt/events{/privacy}",
"received_events_url": "https://api.github.com/users/uestcxt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-08-15T01:29:38
| 2024-09-17T15:34:01
| 2024-09-17T15:34:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
# Description
I am experiencing an issue when trying to pull the llama3 model using the ollama CLI. The process fails with an "EOF" error. I have also tried pulling other models, but the same error occurs.
I have verified that DNS resolution works correctly, as I can resolve the registry.ollama.ai domain:
```
PS C:\Windows\System32> nslookup registry.ollama.ai
Server: www.huaweimobilewifi.com
Address: fe80::1a9e:2cff:fe47:b669
Non-authoritative answer:
Name: registry.ollama.ai
Addresses: 2606:4700:3036::6815:4be3
2606:4700:3034::ac43:b6e5
104.21.75.227
```
# Environment:
- Ollama Version: 0.3.6
- Operating System: Windows 11 (Version 10.0.22631, Build 22631)
- GPU: NVIDIA RTX 4090 Laptop
- Memory: 64 GB
- CPU: Intel(R) Core(TM) i9-14900HX, 2200 MHz, 24 cores, 32 logical processors
```
PS C:\Windows\System32> ollama run llama3
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama3/manifests/latest": EOF
PS C:\Windows\System32> ollama -v
ollama version is 0.3.6
```
# server.log
```
time=2024-08-15T09:15:43.929+08:00 level=INFO source=images.go:1059 msg="request failed: Get \"https://registry.ollama.ai/v2/library/llama3/manifests/latest\": EOF"
[GIN] 2024/08/15 - 09:15:43 | 200 | 11.3628421s | 127.0.0.1 | POST "/api/pull"
[GIN] 2024/08/15 - 09:16:02 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/08/15 - 09:16:02 | 404 | 0s | 127.0.0.1 | POST "/api/show"
time=2024-08-15T09:16:02.628+08:00 level=INFO source=images.go:1059 msg="request failed: Get \"https://registry.ollama.ai/v2/library/llama3/manifests/latest\": EOF"
```
It appears that the process is attempting to fetch the manifest but is encountering an EOF error. I have confirmed that my network connection is stable.
Could you please advise on how to resolve this issue?
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.6
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6366/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6366/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1578
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1578/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1578/comments
|
https://api.github.com/repos/ollama/ollama/issues/1578/events
|
https://github.com/ollama/ollama/issues/1578
| 2,046,155,280
|
I_kwDOJ0Z1Ps559doQ
| 1,578
|
Ollama order of magnitude slower on Apple M1 vs Llama.cpp
|
{
"login": "svilupp",
"id": 49557684,
"node_id": "MDQ6VXNlcjQ5NTU3Njg0",
"avatar_url": "https://avatars.githubusercontent.com/u/49557684?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/svilupp",
"html_url": "https://github.com/svilupp",
"followers_url": "https://api.github.com/users/svilupp/followers",
"following_url": "https://api.github.com/users/svilupp/following{/other_user}",
"gists_url": "https://api.github.com/users/svilupp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/svilupp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/svilupp/subscriptions",
"organizations_url": "https://api.github.com/users/svilupp/orgs",
"repos_url": "https://api.github.com/users/svilupp/repos",
"events_url": "https://api.github.com/users/svilupp/events{/privacy}",
"received_events_url": "https://api.github.com/users/svilupp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2023-12-18T09:27:56
| 2024-12-06T22:08:09
| 2023-12-18T14:46:44
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
First of all, thank you for the amazing app!
**Observation**: When I run the same prompt via the latest Ollama vs Llama.cpp, I get an order of magnitude slower generation on Ollama.
- With Ollama in generation, GPU usage is 0% and from time to time it jumps to 40%
- With llama.cpp in generation, GPU usage constantly sits at ~99%
**Setup**:
- Device: Apple M1 Pro, 32GB ram, shifted memory limit for mixtral to work
- System: Ventura 13.6
- Model: dolphin-mixtral:8x7b-v2.5-q4_K_M
**Prompt**: "Count to 5 and say hi"
**Ollama**: `ollama run dolphin-mixtral:8x7b-v2.5-q4_K_M "Count to 5 then say hi." --verbose`
> First, I will start by counting from 1 to 5.
>
> 1. One
> 2. Two
> 3. Three
> 4. Four
> 5. Five
>
> Now that I have counted to 5, let me say hi! Hi there!
>
> total duration: 5m3.16583525s
> load duration: 33.760953875s
> prompt eval count: 35 token(s)
> prompt eval duration: 24.710485s
> prompt eval rate: 1.42 tokens/s
> eval count: 54 token(s)
> eval duration: 4m4.681389s
> eval rate: 0.22 tokens/s
**Llama.cpp**: `./main -m .ollama/models/blobs/sha256:34855d29fd5901f6ed6fe8112a80dc137bafdeb135d89bf75f9b171e62980ac2 --prompt "[INST] Count to 5 and then say hi. [INST]"`
> 1
> 2
> 3
> 4
> 5
> Hi!
> <...it goes on about something else for a bit...it has some stopping issues>
>
> llama_print_timings: load time = 5242.30 ms
> llama_print_timings: sample time = 38.25 ms / 425 runs ( 0.09 ms per token, 11109.95 tokens per second)
> llama_print_timings: prompt eval time = 800.60 ms / 17 tokens ( 47.09 ms per token, 21.23 tokens per second)
> llama_print_timings: eval time = 25695.06 ms / 424 runs ( 60.60 ms per token, 16.50 tokens per second)
> llama_print_timings: total time = 26599.97 ms
> ggml_metal_free: deallocating
> Log end
Any idea what I could be doing wrong?
|
{
"login": "svilupp",
"id": 49557684,
"node_id": "MDQ6VXNlcjQ5NTU3Njg0",
"avatar_url": "https://avatars.githubusercontent.com/u/49557684?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/svilupp",
"html_url": "https://github.com/svilupp",
"followers_url": "https://api.github.com/users/svilupp/followers",
"following_url": "https://api.github.com/users/svilupp/following{/other_user}",
"gists_url": "https://api.github.com/users/svilupp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/svilupp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/svilupp/subscriptions",
"organizations_url": "https://api.github.com/users/svilupp/orgs",
"repos_url": "https://api.github.com/users/svilupp/repos",
"events_url": "https://api.github.com/users/svilupp/events{/privacy}",
"received_events_url": "https://api.github.com/users/svilupp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1578/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1578/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1907
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1907/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1907/comments
|
https://api.github.com/repos/ollama/ollama/issues/1907/events
|
https://github.com/ollama/ollama/issues/1907
| 2,075,150,578
|
I_kwDOJ0Z1Ps57sEjy
| 1,907
|
Mixtral OOM
|
{
"login": "coder543",
"id": 726063,
"node_id": "MDQ6VXNlcjcyNjA2Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/726063?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coder543",
"html_url": "https://github.com/coder543",
"followers_url": "https://api.github.com/users/coder543/followers",
"following_url": "https://api.github.com/users/coder543/following{/other_user}",
"gists_url": "https://api.github.com/users/coder543/gists{/gist_id}",
"starred_url": "https://api.github.com/users/coder543/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coder543/subscriptions",
"organizations_url": "https://api.github.com/users/coder543/orgs",
"repos_url": "https://api.github.com/users/coder543/repos",
"events_url": "https://api.github.com/users/coder543/events{/privacy}",
"received_events_url": "https://api.github.com/users/coder543/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 12
| 2024-01-10T20:29:40
| 2024-01-14T22:14:18
| 2024-01-14T22:14:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I’ve been enjoying the new auto-VRAM implementation for the most part, but when trying to use Mixtral at very large context sizes (~30000) to process a 25k token document, I’m still getting OOMs, repeatedly. (So, not when changing context sizes, which I see is an existing ticket.)
I tried different context sizes between 27k and 31k to see if I could nudge the auto-VRAM calculation into the happy path, but I couldn’t.
I’m using an RTX 3090 w/24GB VRAM, and this is the Mixtral Instruct q3_K_M model.
Relevant log snippet:
```
23852]: llm_load_tensors: using CUDA for GPU acceleration
23852]: llm_load_tensors: mem required = 3166.49 MiB
23852]: llm_load_tensors: offloading 27 repeating layers to GPU
23852]: llm_load_tensors: offloaded 27/33 layers to GPU
23852]: llm_load_tensors: VRAM used: 16253.16 MiB
23852]: ....................................................................................................
23852]: llama_new_context_with_model: n_ctx = 27000
23852]: llama_new_context_with_model: freq_base = 1000000.0
23852]: llama_new_context_with_model: freq_scale = 1
23852]: llama_kv_cache_init: VRAM kv self = 2847.66 MB
23852]: llama_new_context_with_model: KV self size = 3375.00 MiB, K (f16): 1687.50 MiB, V (f16): 1687.50 MiB
23852]: llama_build_graph: non-view tensors processed: 1124/1124
23852]: llama_new_context_with_model: compute buffer total size = 1795.46 MiB
23852]: llama_new_context_with_model: VRAM scratch buffer: 1792.27 MiB
23852]: llama_new_context_with_model: total VRAM used: 20893.08 MiB (model: 16253.16 MiB, context: 4639.93 MiB)
23852]: 2024/01/10 20:19:36 ext_server_common.go:144: Starting internal llama main loop
23852]: 2024/01/10 20:19:36 ext_server_common.go:158: loaded 0 images
23852]: CUDA error 2 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml-cuda.cu:6600: out of memory
23852]: current device: 0
23852]: Lazy loading /tmp/ollama3998269130/cuda/libext_server.so library
23852]: GGML_ASSERT: /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml-cuda.cu:6600: !"CUDA error"
]: ollama.service: Main process exited, code=dumped, status=6/ABRT
]: ollama.service: Failed with result 'core-dump'.
]: ollama.service: Consumed 18min 9.528s CPU time.
]: ollama.service: Scheduled restart job, restart counter is at 3.
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1907/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1907/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3933
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3933/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3933/comments
|
https://api.github.com/repos/ollama/ollama/issues/3933/events
|
https://github.com/ollama/ollama/pull/3933
| 2,265,000,754
|
PR_kwDOJ0Z1Ps5tzRL2
| 3,933
|
Move cuda/rocm dependency gathering into generate script
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-26T05:38:55
| 2024-04-26T14:01:30
| 2024-04-26T14:01:24
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3933",
"html_url": "https://github.com/ollama/ollama/pull/3933",
"diff_url": "https://github.com/ollama/ollama/pull/3933.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3933.patch",
"merged_at": "2024-04-26T14:01:24"
}
|
This will make it simpler for CI to accumulate artifacts from prior steps.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3933/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3933/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1542
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1542/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1542/comments
|
https://api.github.com/repos/ollama/ollama/issues/1542/events
|
https://github.com/ollama/ollama/issues/1542
| 2,043,545,495
|
I_kwDOJ0Z1Ps55zgeX
| 1,542
|
API endpoint to query models supported by ollama
|
{
"login": "gmaijoe",
"id": 7184919,
"node_id": "MDQ6VXNlcjcxODQ5MTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/7184919?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gmaijoe",
"html_url": "https://github.com/gmaijoe",
"followers_url": "https://api.github.com/users/gmaijoe/followers",
"following_url": "https://api.github.com/users/gmaijoe/following{/other_user}",
"gists_url": "https://api.github.com/users/gmaijoe/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gmaijoe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gmaijoe/subscriptions",
"organizations_url": "https://api.github.com/users/gmaijoe/orgs",
"repos_url": "https://api.github.com/users/gmaijoe/repos",
"events_url": "https://api.github.com/users/gmaijoe/events{/privacy}",
"received_events_url": "https://api.github.com/users/gmaijoe/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2023-12-15T11:20:23
| 2023-12-24T21:59:52
| 2023-12-24T21:59:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://github.com/ollama-webui/ollama-webui is looking to support a deeper integration with ollama. Is there any way to expose an API of all model names supported by ollama? We could manually scrape, but an API would be easier for ingestion.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1542/reactions",
"total_count": 3,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/1542/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6495
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6495/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6495/comments
|
https://api.github.com/repos/ollama/ollama/issues/6495/events
|
https://github.com/ollama/ollama/pull/6495
| 2,484,938,971
|
PR_kwDOJ0Z1Ps55Vyyt
| 6,495
|
Detect running in a container
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-25T00:01:39
| 2024-09-05T20:37:09
| 2024-09-05T20:24:51
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6495",
"html_url": "https://github.com/ollama/ollama/pull/6495",
"diff_url": "https://github.com/ollama/ollama/pull/6495.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6495.patch",
"merged_at": "2024-09-05T20:24:51"
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6495/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6495/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5142
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5142/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5142/comments
|
https://api.github.com/repos/ollama/ollama/issues/5142/events
|
https://github.com/ollama/ollama/issues/5142
| 2,362,524,800
|
I_kwDOJ0Z1Ps6M0USA
| 5,142
|
`Segmentation fault` on Ubuntu 24.04 LXC container
|
{
"login": "MmDawN",
"id": 40926229,
"node_id": "MDQ6VXNlcjQwOTI2MjI5",
"avatar_url": "https://avatars.githubusercontent.com/u/40926229?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MmDawN",
"html_url": "https://github.com/MmDawN",
"followers_url": "https://api.github.com/users/MmDawN/followers",
"following_url": "https://api.github.com/users/MmDawN/following{/other_user}",
"gists_url": "https://api.github.com/users/MmDawN/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MmDawN/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MmDawN/subscriptions",
"organizations_url": "https://api.github.com/users/MmDawN/orgs",
"repos_url": "https://api.github.com/users/MmDawN/repos",
"events_url": "https://api.github.com/users/MmDawN/events{/privacy}",
"received_events_url": "https://api.github.com/users/MmDawN/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-06-19T14:25:57
| 2024-06-27T03:13:22
| 2024-06-27T03:13:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
My runtime environment is based on an LXC container running Ubuntu 24.04 LTS.
After the installation of ollama v0.1.44, running `ollama` in bash returns a `Segmentation fault` error.
The `journalctl -u ollama` command reveals the following recurring error and indicates constant restarting:
```
ollama.service: Main process exited, code=killed, status=11/SEGV
ollama.service: Failed with result 'signal'.
```
See the attached image for reference:
> 
I'm hoping someone can assist me in resolving this issue.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.44
|
{
"login": "MmDawN",
"id": 40926229,
"node_id": "MDQ6VXNlcjQwOTI2MjI5",
"avatar_url": "https://avatars.githubusercontent.com/u/40926229?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MmDawN",
"html_url": "https://github.com/MmDawN",
"followers_url": "https://api.github.com/users/MmDawN/followers",
"following_url": "https://api.github.com/users/MmDawN/following{/other_user}",
"gists_url": "https://api.github.com/users/MmDawN/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MmDawN/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MmDawN/subscriptions",
"organizations_url": "https://api.github.com/users/MmDawN/orgs",
"repos_url": "https://api.github.com/users/MmDawN/repos",
"events_url": "https://api.github.com/users/MmDawN/events{/privacy}",
"received_events_url": "https://api.github.com/users/MmDawN/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5142/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5142/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1327
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1327/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1327/comments
|
https://api.github.com/repos/ollama/ollama/issues/1327/events
|
https://github.com/ollama/ollama/issues/1327
| 2,018,043,425
|
I_kwDOJ0Z1Ps54SOYh
| 1,327
|
Modelfile prompt should support chat / multiturn.
|
{
"login": "ehartford",
"id": 1117701,
"node_id": "MDQ6VXNlcjExMTc3MDE=",
"avatar_url": "https://avatars.githubusercontent.com/u/1117701?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ehartford",
"html_url": "https://github.com/ehartford",
"followers_url": "https://api.github.com/users/ehartford/followers",
"following_url": "https://api.github.com/users/ehartford/following{/other_user}",
"gists_url": "https://api.github.com/users/ehartford/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ehartford/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ehartford/subscriptions",
"organizations_url": "https://api.github.com/users/ehartford/orgs",
"repos_url": "https://api.github.com/users/ehartford/repos",
"events_url": "https://api.github.com/users/ehartford/events{/privacy}",
"received_events_url": "https://api.github.com/users/ehartford/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2023-11-30T07:28:38
| 2023-12-04T23:23:03
| 2023-12-04T23:23:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | ERROR: type should be string, got "https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#template\r\n\r\nSo basically all that's coming in is .Prompt which is just a string.\r\n\r\nBut that can't handle chat and multi turn.\r\n\r\nWhat's coming in should look a messages array. then this template should format that into a prompt.\r\n```\r\n[\r\n { \"role\": \"system\", \"content\": \"You are a helpful AI assistant\" },\r\n { \"role\": \"user\", \"content\": \"Hello AI, How are you today?\" },\r\n { \"role\": \"assistant\", \"content\": \"I have no notion of time. State your question?\" },\r\n { \"role\": \"user\", \"content\": \"Oh ok then, tell me the 38th state\" }\r\n]\r\n```\r\n\r\nthen the template in the modelfile would look something like\r\n\r\n```\r\n{% for message in messages %}{{'<|im_start|>' + message['role'] + '\\n' + message['content'] + '<|im_end|>' + '\\n'}}{% endfor %}<|im_start|>assistant\r\n\r\n```\r\n\r\nBasically the idea that a prompt consists of a single system message and a single user message, isn't how most models actually work."
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1327/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1327/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5819
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5819/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5819/comments
|
https://api.github.com/repos/ollama/ollama/issues/5819/events
|
https://github.com/ollama/ollama/pull/5819
| 2,421,080,552
|
PR_kwDOJ0Z1Ps51_zto
| 5,819
|
Track and Expose GPU discovery failure information
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-20T22:24:45
| 2024-07-20T22:25:07
| 2024-07-20T22:25:07
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5819",
"html_url": "https://github.com/ollama/ollama/pull/5819",
"diff_url": "https://github.com/ollama/ollama/pull/5819.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5819.patch",
"merged_at": null
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5819/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5819/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8346
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8346/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8346/comments
|
https://api.github.com/repos/ollama/ollama/issues/8346/events
|
https://github.com/ollama/ollama/issues/8346
| 2,775,653,112
|
I_kwDOJ0Z1Ps6lcRr4
| 8,346
|
Unable to run llama on IPv6 Single Stack env
|
{
"login": "chaturvedi-kna",
"id": 63336082,
"node_id": "MDQ6VXNlcjYzMzM2MDgy",
"avatar_url": "https://avatars.githubusercontent.com/u/63336082?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chaturvedi-kna",
"html_url": "https://github.com/chaturvedi-kna",
"followers_url": "https://api.github.com/users/chaturvedi-kna/followers",
"following_url": "https://api.github.com/users/chaturvedi-kna/following{/other_user}",
"gists_url": "https://api.github.com/users/chaturvedi-kna/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chaturvedi-kna/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chaturvedi-kna/subscriptions",
"organizations_url": "https://api.github.com/users/chaturvedi-kna/orgs",
"repos_url": "https://api.github.com/users/chaturvedi-kna/repos",
"events_url": "https://api.github.com/users/chaturvedi-kna/events{/privacy}",
"received_events_url": "https://api.github.com/users/chaturvedi-kna/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2025-01-08T15:15:22
| 2025-01-11T08:37:24
| 2025-01-11T08:37:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi guys,
I am using Ollama on OpenShift (v4.16) with Open Data Hub. I followed the guide below and used the same image mentioned there:
https://github.com/rh-aiservices-bu/llm-on-openshift/tree/main/serving-runtimes/ollama_runtime
In [ollama-runtime.yaml](https://github.com/rh-aiservices-bu/llm-on-openshift/blob/main/serving-runtimes/ollama_runtime/ollama-runtime.yaml) I changed the OLLAMA_HOST value to '::'.
I then used /api/pull for the model llama3.2-vision:11b, which succeeded. After that I tried to check whether it runs correctly with /api/generate and got a core-dump error. Looking into the logs, I found that the hostname is shown as 127.0.0.1, and I suspect this might be the reason llama is not starting properly.
Kindly guide me to resolve this.
API calls and outputs:
{"status":"verifying sha256 digest"}
{"status":"writing manifest"}
{"status":"removing any unused layers"}
{"status":"success"}
(app-root) sh-5.1$ curl https://ollma-doc.chatur.svc.cluster.local/api/tags -k {"models":[{"name":"llama3.2-vision:11b","model":"llama3.2-vision:11b","modified_at":"2025-01-08T15:00:32.627854408Z","size":7901829417,"digest":"085a1fdae525a3804ac95416b38498099c241defd0f1efc71dcca7f63190ba3d","details":{"parent_model":"","format":"gguf","family":"mllama","families":["mllama","mllama"],"parameter_size":"9.8B","quantization_level":"Q4_K_M"}}]}(app-root) sh-5.1$ curl https://ollma-doc.chatur.svc.cluster.local/api/generate -k -H "Content-Type: application/json" -d '{"model": "llama3.2-vision:11b", "prompt": "why is the sky blue?"}'
curl: (6) Could not resolve host: ollma-doc.chatur.svc.cluster.locagenerate
(app-root) sh-5.1$ curl https://ollma-doc.chatur.svc.cluster.local/api/generate -k -H "Content-Type: application/json" -d '{"model": "llama3.2-vision:11b", "prompt": "why is the sky blue?"}'
{"error":"llama runner process has terminated: signal: aborted (core dumped)"}(app-root) sh-5.1$
(app-root) sh-5.1$
Logs from Ollama pod:
Couldn't find '/.ollama/id_ed25519'. Generating new private key.
Your new public key is:
ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFgZvE8cJ0k0eZH/4I6S9r/EKNzEuKNGh/aC3AWTsf+n
2025/01/08 14:09:34 routes.go:1100: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://:::11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:2562047h47m16.854775807s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2025-01-08T14:09:34.834Z level=INFO source=images.go:784 msg="total blobs: 0"
time=2025-01-08T14:09:34.834Z level=INFO source=images.go:791 msg="total unused blobs removed: 0"
[GIN-debug] [WARNING] Creating an Engine instance with the Logger and Recovery middleware already attached.
[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
- using env: export GIN_MODE=release
- using code: gin.SetMode(gin.ReleaseMode)
[GIN-debug] POST /api/pull --> github.com/ollama/ollama/server.(*Server).PullModelHandler-fm (5 handlers)
[GIN-debug] POST /api/generate --> github.com/ollama/ollama/server.(*Server).GenerateHandler-fm (5 handlers)
[GIN-debug] POST /api/chat --> github.com/ollama/ollama/server.(*Server).ChatHandler-fm (5 handlers)
[GIN-debug] POST /api/embed --> github.com/ollama/ollama/server.(*Server).EmbedHandler-fm (5 handlers)
[GIN-debug] POST /api/embeddings --> github.com/ollama/ollama/server.(*Server).EmbeddingsHandler-fm (5 handlers)
[GIN-debug] POST /api/create --> github.com/ollama/ollama/server.(*Server).CreateModelHandler-fm (5 handlers)
[GIN-debug] POST /api/push --> github.com/ollama/ollama/server.(*Server).PushModelHandler-fm (5 handlers)
[GIN-debug] POST /api/copy --> github.com/ollama/ollama/server.(*Server).CopyModelHandler-fm (5 handlers)
[GIN-debug] DELETE /api/delete --> github.com/ollama/ollama/server.(*Server).DeleteModelHandler-fm (5 handlers)
[GIN-debug] POST /api/show --> github.com/ollama/ollama/server.(*Server).ShowModelHandler-fm (5 handlers)
[GIN-debug] POST /api/blobs/:digest --> github.com/ollama/ollama/server.(*Server).CreateBlobHandler-fm (5 handlers)
[GIN-debug] HEAD /api/blobs/:digest --> github.com/ollama/ollama/server.(*Server).HeadBlobHandler-fm (5 handlers)
[GIN-debug] GET /api/ps --> github.com/ollama/ollama/server.(*Server).ProcessHandler-fm (5 handlers)
[GIN-debug] POST /v1/chat/completions --> github.com/ollama/ollama/server.(*Server).ChatHandler-fm (6 handlers)
[GIN-debug] POST /v1/completions --> github.com/ollama/ollama/server.(*Server).GenerateHandler-fm (6 handlers)
[GIN-debug] POST /v1/embeddings --> github.com/ollama/ollama/server.(*Server).EmbedHandler-fm (6 handlers)
[GIN-debug] GET /v1/models --> github.com/ollama/ollama/server.(*Server).ListModelsHandler-fm (6 handlers)
[GIN-debug] GET /v1/models/:model --> github.com/ollama/ollama/server.(*Server).ShowModelHandler-fm (6 handlers)
[GIN-debug] GET / --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func1 (5 handlers)
[GIN-debug] GET /api/tags --> github.com/ollama/ollama/server.(*Server).ListModelsHandler-fm (5 handlers)
[GIN-debug] GET /api/version --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
[GIN-debug] HEAD / --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func1 (5 handlers)
[GIN-debug] HEAD /api/tags --> github.com/ollama/ollama/server.(*Server).ListModelsHandler-fm (5 handlers)
[GIN-debug] HEAD /api/version --> github.com/ollama/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
time=2025-01-08T14:09:34.834Z level=INFO source=routes.go:1147 msg="Listening on [::]:11434 (version 0.0.0)"
time=2025-01-08T14:09:34.834Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama3519681156/runners
time=2025-01-08T14:09:34.883Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu_avx2]"
time=2025-01-08T14:09:34.883Z level=INFO source=gpu.go:205 msg="looking for compatible GPUs"
time=2025-01-08T14:09:34.884Z level=INFO source=gpu.go:346 msg="no compatible GPUs were discovered"
time=2025-01-08T14:09:34.884Z level=INFO source=types.go:105 msg="inference compute" id=0 library=cpu compute="" driver=0.0 name="" total="377.5 GiB" available="343.4 GiB"
[GIN] 2025/01/08 - 14:34:56 | 200 | 377.045µs | fd01:0:0:1::af9 | GET "/api/tags"
[GIN] 2025/01/08 - 14:37:21 | 200 | 128.615µs | fd01:0:0:1::af9 | GET "/api/tags"
time=2025-01-08T14:40:51.547Z level=INFO source=download.go:136 msg="downloading 11f274007f09 in 60 100 MB part(s)"
time=2025-01-08T14:41:06.549Z level=INFO source=download.go:251 msg="11f274007f09 part 57 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:08.550Z level=INFO source=download.go:251 msg="11f274007f09 part 25 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:09.550Z level=INFO source=download.go:251 msg="11f274007f09 part 52 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:09.550Z level=INFO source=download.go:251 msg="11f274007f09 part 46 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:12.547Z level=INFO source=download.go:251 msg="11f274007f09 part 18 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:12.550Z level=INFO source=download.go:251 msg="11f274007f09 part 40 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:15.550Z level=INFO source=download.go:251 msg="11f274007f09 part 27 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:16.547Z level=INFO source=download.go:251 msg="11f274007f09 part 3 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:19.551Z level=INFO source=download.go:251 msg="11f274007f09 part 36 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:20.551Z level=INFO source=download.go:251 msg="11f274007f09 part 47 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:21.548Z level=INFO source=download.go:251 msg="11f274007f09 part 10 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:28.551Z level=INFO source=download.go:251 msg="11f274007f09 part 9 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:33.553Z level=INFO source=download.go:251 msg="11f274007f09 part 47 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:41:55.551Z level=INFO source=download.go:251 msg="11f274007f09 part 37 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:42:15.551Z level=INFO source=download.go:251 msg="11f274007f09 part 42 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:42:22.548Z level=INFO source=download.go:251 msg="11f274007f09 part 4 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:53:51.667Z level=INFO source=download.go:136 msg="downloading ece5e659647a in 20 100 MB part(s)"
time=2025-01-08T14:54:19.668Z level=INFO source=download.go:251 msg="ece5e659647a part 5 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:54:22.668Z level=INFO source=download.go:251 msg="ece5e659647a part 3 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:54:33.668Z level=INFO source=download.go:251 msg="ece5e659647a part 4 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:54:33.669Z level=INFO source=download.go:251 msg="ece5e659647a part 5 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:54:48.668Z level=INFO source=download.go:251 msg="ece5e659647a part 15 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:59:32.668Z level=INFO source=download.go:251 msg="ece5e659647a part 6 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2025-01-08T14:59:52.181Z level=INFO source=download.go:136 msg="downloading 715415638c9c in 1 269 B part(s)"
time=2025-01-08T14:59:54.665Z level=INFO source=download.go:136 msg="downloading 0b4284c1f870 in 1 7.7 KB part(s)"
time=2025-01-08T14:59:56.042Z level=INFO source=download.go:136 msg="downloading fefc914e46e6 in 1 32 B part(s)"
time=2025-01-08T14:59:58.651Z level=INFO source=download.go:136 msg="downloading fbd313562bb7 in 1 572 B part(s)"
[GIN] 2025/01/08 - 15:00:32 | 200 | 19m43s | fd01:0:0:1::af9 | POST "/api/pull"
[GIN] 2025/01/08 - 15:00:55 | 200 | 459.312µs | fd01:0:0:1::af9 | GET "/api/tags"
time=2025-01-08T15:03:07.761Z level=WARN source=sched.go:134 msg="multimodal models don't support parallel requests yet"
time=2025-01-08T15:03:07.794Z level=INFO source=memory.go:309 msg="offload to cpu" layers.requested=-1 layers.model=41 layers.offload=0 layers.split="" memory.available="[343.2 GiB]" memory.required.full="7.7 GiB" memory.required.partial="0 B" memory.required.kv="320.0 MiB" memory.required.allocations="[7.7 GiB]" memory.weights.total="5.2 GiB" memory.weights.repeating="4.8 GiB" memory.weights.nonrepeating="411.0 MiB" memory.graph.full="213.3 MiB" memory.graph.partial="213.3 MiB"
time=2025-01-08T15:03:07.795Z level=INFO source=server.go:383 msg="starting llama server" cmd="/tmp/ollama3519681156/runners/cpu_avx2/ollama_llama_server --model /.ollama/models/blobs/sha256-11f274007f093fefeec994a5dbbb33d0733a4feb87f7ab66dcd7c1069fef0068 --ctx-size 2048 --batch-size 512 --embedding --log-disable --mmproj /.ollama/models/blobs/sha256-ece5e659647a20a5c28ab9eea1c12a1ad430bc0f2a27021d00ad103b3bf5206f --no-mmap --parallel 1 --port 36453"
time=2025-01-08T15:03:07.795Z level=INFO source=sched.go:437 msg="loaded runners" count=1
time=2025-01-08T15:03:07.795Z level=INFO source=server.go:583 msg="waiting for llama runner to start responding"
time=2025-01-08T15:03:07.795Z level=INFO source=server.go:617 msg="waiting for server to become available" status="llm server error"
INFO [main] build info | build=3440 commit="d94c6e0c" tid="139814772467584" timestamp=1736348587
INFO [main] system info | n_threads=40 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 1 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 0 | " tid="139814772467584" timestamp=1736348587 total_threads=80
INFO [main] HTTP server listening | hostname="127.0.0.1" n_threads_http="79" port="36453" tid="139814772467584" timestamp=1736348587
terminate called after throwing an instance of 'std::runtime_error'
what(): Missing required key: clip.has_text_encoder
time=2025-01-08T15:03:08.046Z level=ERROR source=sched.go:443 msg="error loading llama server" error="llama runner process has terminated: signal: aborted (core dumped)"
[GIN] 2025/01/08 - 15:03:08 | 500 | 314.868828ms | fd01:0:0:1::af9 | POST "/api/generate"
### OS
_No response_
### GPU
_No response_
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "chaturvedi-kna",
"id": 63336082,
"node_id": "MDQ6VXNlcjYzMzM2MDgy",
"avatar_url": "https://avatars.githubusercontent.com/u/63336082?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chaturvedi-kna",
"html_url": "https://github.com/chaturvedi-kna",
"followers_url": "https://api.github.com/users/chaturvedi-kna/followers",
"following_url": "https://api.github.com/users/chaturvedi-kna/following{/other_user}",
"gists_url": "https://api.github.com/users/chaturvedi-kna/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chaturvedi-kna/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chaturvedi-kna/subscriptions",
"organizations_url": "https://api.github.com/users/chaturvedi-kna/orgs",
"repos_url": "https://api.github.com/users/chaturvedi-kna/repos",
"events_url": "https://api.github.com/users/chaturvedi-kna/events{/privacy}",
"received_events_url": "https://api.github.com/users/chaturvedi-kna/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8346/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8346/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2149
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2149/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2149/comments
|
https://api.github.com/repos/ollama/ollama/issues/2149/events
|
https://github.com/ollama/ollama/pull/2149
| 2,095,052,458
|
PR_kwDOJ0Z1Ps5kyE7x
| 2,149
|
Use all layers for metal on macOS if model is small enough
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-23T01:05:05
| 2024-01-23T01:40:07
| 2024-01-23T01:40:07
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2149",
"html_url": "https://github.com/ollama/ollama/pull/2149",
"diff_url": "https://github.com/ollama/ollama/pull/2149.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2149.patch",
"merged_at": "2024-01-23T01:40:07"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2149/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2149/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5895
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5895/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5895/comments
|
https://api.github.com/repos/ollama/ollama/issues/5895/events
|
https://github.com/ollama/ollama/pull/5895
| 2,426,214,454
|
PR_kwDOJ0Z1Ps52RRo9
| 5,895
|
Better explain multi-gpu behavior
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-23T22:16:45
| 2024-07-29T21:25:44
| 2024-07-29T21:25:42
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5895",
"html_url": "https://github.com/ollama/ollama/pull/5895",
"diff_url": "https://github.com/ollama/ollama/pull/5895.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5895.patch",
"merged_at": "2024-07-29T21:25:41"
}
|
Fixes #5635 #5455
This topic seems to come up ~weekly, so let's explain it more clearly in the docs, and expose the existing env var to force spreading over all GPUs.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5895/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5895/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4639
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4639/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4639/comments
|
https://api.github.com/repos/ollama/ollama/issues/4639/events
|
https://github.com/ollama/ollama/issues/4639
| 2,317,162,519
|
I_kwDOJ0Z1Ps6KHRgX
| 4,639
|
Prompt caching causes reproducible outputs to be inconsistent
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-05-25T18:10:35
| 2024-06-11T21:29:47
| 2024-06-11T21:29:46
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When trying to generate [reproducible outputs](https://github.com/ollama/ollama/blob/main/docs/api.md#request-reproducible-outputs), changing prompts causes results to be inconsistent
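The reproducible-outputs docs linked above pin a `seed` and set `temperature` to 0. A minimal sketch of such a request payload (the model name is illustrative; `seed` and `temperature` are documented Ollama request options):

```python
def reproducible_payload(model: str, prompt: str, seed: int = 42) -> dict:
    """Build an /api/generate request body aimed at deterministic output.

    A fixed seed plus temperature 0 should make sampling deterministic
    for the same prompt; the bug reported here is that a *changed*
    prompt perturbs results via the prompt cache.
    """
    return {
        "model": model,
        "prompt": prompt,
        "options": {"seed": seed, "temperature": 0},
    }
```

Two identical calls produce identical payloads, which is the precondition for comparing outputs across runs.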
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4639/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4639/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5443
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5443/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5443/comments
|
https://api.github.com/repos/ollama/ollama/issues/5443/events
|
https://github.com/ollama/ollama/pull/5443
| 2,387,121,395
|
PR_kwDOJ0Z1Ps50PwQF
| 5,443
|
add conversion for microsoft phi 3 mini/medium 4k, 128k
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-02T20:35:53
| 2024-08-12T22:48:00
| 2024-08-12T22:47:58
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5443",
"html_url": "https://github.com/ollama/ollama/pull/5443",
"diff_url": "https://github.com/ollama/ollama/pull/5443.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5443.patch",
"merged_at": "2024-08-12T22:47:58"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5443/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5443/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4213
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4213/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4213/comments
|
https://api.github.com/repos/ollama/ollama/issues/4213/events
|
https://github.com/ollama/ollama/pull/4213
| 2,281,920,211
|
PR_kwDOJ0Z1Ps5usSyJ
| 4,213
|
Close server on receiving signal
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-06T22:51:37
| 2024-05-06T23:01:38
| 2024-05-06T23:01:37
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4213",
"html_url": "https://github.com/ollama/ollama/pull/4213",
"diff_url": "https://github.com/ollama/ollama/pull/4213.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4213.patch",
"merged_at": "2024-05-06T23:01:37"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4213/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4213/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2562
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2562/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2562/comments
|
https://api.github.com/repos/ollama/ollama/issues/2562/events
|
https://github.com/ollama/ollama/issues/2562
| 2,140,125,367
|
I_kwDOJ0Z1Ps5_j7i3
| 2,562
|
Inconsistent OCR Results with LLaVA 1.6 and Ollama vs. LLaVA Online Demo
|
{
"login": "arcaweb-ch",
"id": 43749906,
"node_id": "MDQ6VXNlcjQzNzQ5OTA2",
"avatar_url": "https://avatars.githubusercontent.com/u/43749906?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/arcaweb-ch",
"html_url": "https://github.com/arcaweb-ch",
"followers_url": "https://api.github.com/users/arcaweb-ch/followers",
"following_url": "https://api.github.com/users/arcaweb-ch/following{/other_user}",
"gists_url": "https://api.github.com/users/arcaweb-ch/gists{/gist_id}",
"starred_url": "https://api.github.com/users/arcaweb-ch/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/arcaweb-ch/subscriptions",
"organizations_url": "https://api.github.com/users/arcaweb-ch/orgs",
"repos_url": "https://api.github.com/users/arcaweb-ch/repos",
"events_url": "https://api.github.com/users/arcaweb-ch/events{/privacy}",
"received_events_url": "https://api.github.com/users/arcaweb-ch/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
open
| false
| null |
[] | null | 8
| 2024-02-17T14:11:32
| 2024-05-17T11:33:21
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hey there, I've posted this issue on the [LLaVA repo](https://github.com/haotian-liu/LLaVA/issues/1116) already; I'm not sure whether this problem stems from an implementation issue in Ollama. Any ideas?
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2562/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2562/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3195
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3195/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3195/comments
|
https://api.github.com/repos/ollama/ollama/issues/3195/events
|
https://github.com/ollama/ollama/issues/3195
| 2,190,712,706
|
I_kwDOJ0Z1Ps6Ck5-C
| 3,195
|
Modified /systemd/system/ollama.service but it didn't take effect
|
{
"login": "michelle-chou25",
"id": 71402902,
"node_id": "MDQ6VXNlcjcxNDAyOTAy",
"avatar_url": "https://avatars.githubusercontent.com/u/71402902?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/michelle-chou25",
"html_url": "https://github.com/michelle-chou25",
"followers_url": "https://api.github.com/users/michelle-chou25/followers",
"following_url": "https://api.github.com/users/michelle-chou25/following{/other_user}",
"gists_url": "https://api.github.com/users/michelle-chou25/gists{/gist_id}",
"starred_url": "https://api.github.com/users/michelle-chou25/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/michelle-chou25/subscriptions",
"organizations_url": "https://api.github.com/users/michelle-chou25/orgs",
"repos_url": "https://api.github.com/users/michelle-chou25/repos",
"events_url": "https://api.github.com/users/michelle-chou25/events{/privacy}",
"received_events_url": "https://api.github.com/users/michelle-chou25/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-03-17T15:04:00
| 2024-05-10T15:57:22
| 2024-03-18T07:49:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Modifications to the Ollama service file didn't take effect
### What did you expect to see?
I expected the modification of the service file to take effect.
### Steps to reproduce
1. I tried to start the ollama service but it failed. I used "sudo journalctl -u ollama --reverse --lines=100" to check the log and it showed:
Failed at step EXEC spawning /usr/bin/ollama: No such file or directory
Started ollama.service.
Stopped ollama.service.
ollama.service holdoff time over, scheduling restart.
ollama.service failed.
Unit ollama.service entered failed state.
ollama.service: main process exited, code=exited, status=203/EXEC
2. Then I found my ollama file is actually here: /usr/local/bin/ollama
I modified my ollama service file to use the other path as above, and didn't change anything else except modifying "ExecStart=/usr/bin/ollama serve" to "ExecStart=/usr/local/bin/ollama serve"
cat /etc/systemd/system/ollama.service and it's like the following:
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_MODELS=/data/ollama/.ollama/models"
ExecStart=/usr/local/bin/ollama serve
User=root
Group=root
Restart=always
RestartSec=3
[Install]
WantedBy=default.target
3. Then I reloaded and enabled the service:
sudo systemctl daemon-reload
sudo systemctl enable ollama
but I still get the same error.
Looks like the service is still looking for /usr/bin/ollama, so it fails.
I tried to run this command directly and it succeeded: /usr/local/bin/ollama serve
The output of systemctl status ollama is:
● ollama.service
Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: disabled)
Active: active (running) since Sun 2024-03-17 22:38:42 HKT; 2min 38s ago
Main PID: 75147 (ollama)
Tasks: 23
Memory: 454.5M
CGroup: /system.slice/ollama.service
└─75147 /usr/local/bin/ollama serve
My env:
CentOS 7
I tried to reinstall Ollama, but it also fails.
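One likely gap in the steps above: `daemon-reload` only re-reads unit files, and `enable` only wires the unit for boot; an already-loaded (restart-looping) unit keeps its old ExecStart until `sudo systemctl restart ollama` is actually run. For reference, a corrected unit file (note the capitalized [Unit] header; systemd section names are case-sensitive):

```ini
# /etc/systemd/system/ollama.service
# After editing, run: sudo systemctl daemon-reload && sudo systemctl restart ollama
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_MODELS=/data/ollama/.ollama/models"
ExecStart=/usr/local/bin/ollama serve
User=root
Group=root
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
```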
### Are there any recent changes that introduced the issue?
No.
### OS
Linux
### Architecture
x86
### Platform
WSL
### Ollama version
0.1.29
### GPU
Nvidia
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3195/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3195/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/7958
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7958/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7958/comments
|
https://api.github.com/repos/ollama/ollama/issues/7958/events
|
https://github.com/ollama/ollama/issues/7958
| 2,721,506,915
|
I_kwDOJ0Z1Ps6iNuZj
| 7,958
|
Model request: HunyuanVideo text-to-video
|
{
"login": "artem-zinnatullin",
"id": 967132,
"node_id": "MDQ6VXNlcjk2NzEzMg==",
"avatar_url": "https://avatars.githubusercontent.com/u/967132?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/artem-zinnatullin",
"html_url": "https://github.com/artem-zinnatullin",
"followers_url": "https://api.github.com/users/artem-zinnatullin/followers",
"following_url": "https://api.github.com/users/artem-zinnatullin/following{/other_user}",
"gists_url": "https://api.github.com/users/artem-zinnatullin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/artem-zinnatullin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/artem-zinnatullin/subscriptions",
"organizations_url": "https://api.github.com/users/artem-zinnatullin/orgs",
"repos_url": "https://api.github.com/users/artem-zinnatullin/repos",
"events_url": "https://api.github.com/users/artem-zinnatullin/events{/privacy}",
"received_events_url": "https://api.github.com/users/artem-zinnatullin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-12-05T21:51:24
| 2024-12-14T15:42:32
| 2024-12-14T15:42:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It's an "open-source" rich text-to-video model by Tencent:
>HunyuanVideo represents the most parameter-rich and high-performce text-to-video model currently available in the open-source domain. With 13 billion parameters, it is capable of generating videos that exhibit high physical accuracy and scene consistency, thereby actualizing conceptual visions and fostering creative expression.
Links:
- Model website: https://aivideo.hunyuan.tencent.com/
- GitHub: https://github.com/Tencent/HunyuanVideo
- HuggingFace: https://huggingface.co/tencent/HunyuanVideo
- Paper: https://github.com/Tencent/HunyuanVideo/blob/main/assets/hunyuanvideo.pdf
Abstract:
>We present HunyuanVideo, a novel open-source video foundation model that exhibits performance in video generation that is comparable to, if not superior to, leading closed-source models. HunyuanVideo features a comprehensive framework that integrates several key contributions, including data curation, image-video joint model training, and an efficient infrastructure designed to facilitate large-scale model training and inference. Additionally, through an effective strategy for scaling model architecture and dataset, we successfully trained a video generative model with over 13 billion parameters, making it the largest among all open-source models.
>
>We conducted extensive experiments and implemented a series of targeted designs to ensure high visual quality, motion diversity, text-video alignment, and generation stability. According to professional human evaluation results, HunyuanVideo outperforms previous state-of-the-art models, including Runway Gen-3, Luma 1.6, and 3 top performing Chinese video generative models. By releasing the code and weights of the foundation model and its applications, we aim to bridge the gap between closed-source and open-source video foundation models. This initiative will empower everyone in the community to experiment with their ideas, fostering a more dynamic and vibrant video generation ecosystem.
This is a tracking issue for HunyuanVideo model support ⌛
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7958/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7958/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/5151
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5151/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5151/comments
|
https://api.github.com/repos/ollama/ollama/issues/5151/events
|
https://github.com/ollama/ollama/pull/5151
| 2,363,189,410
|
PR_kwDOJ0Z1Ps5zAdxa
| 5,151
|
Update OpenAI Compatibility Docs with /v1/models
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-19T22:09:40
| 2024-08-01T22:48:45
| 2024-08-01T22:48:44
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5151",
"html_url": "https://github.com/ollama/ollama/pull/5151",
"diff_url": "https://github.com/ollama/ollama/pull/5151.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5151.patch",
"merged_at": "2024-08-01T22:48:44"
}
| null |
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5151/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5151/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2874
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2874/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2874/comments
|
https://api.github.com/repos/ollama/ollama/issues/2874/events
|
https://github.com/ollama/ollama/issues/2874
| 2,164,662,099
|
I_kwDOJ0Z1Ps6BBh9T
| 2,874
|
Support Qwen VL
|
{
"login": "thesby",
"id": 10773886,
"node_id": "MDQ6VXNlcjEwNzczODg2",
"avatar_url": "https://avatars.githubusercontent.com/u/10773886?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thesby",
"html_url": "https://github.com/thesby",
"followers_url": "https://api.github.com/users/thesby/followers",
"following_url": "https://api.github.com/users/thesby/following{/other_user}",
"gists_url": "https://api.github.com/users/thesby/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thesby/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thesby/subscriptions",
"organizations_url": "https://api.github.com/users/thesby/orgs",
"repos_url": "https://api.github.com/users/thesby/repos",
"events_url": "https://api.github.com/users/thesby/events{/privacy}",
"received_events_url": "https://api.github.com/users/thesby/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 49
| 2024-03-02T06:53:40
| 2025-01-28T15:07:16
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Could you please support the Qwen VL model?
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2874/reactions",
"total_count": 77,
"+1": 76,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/2874/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4493
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4493/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4493/comments
|
https://api.github.com/repos/ollama/ollama/issues/4493/events
|
https://github.com/ollama/ollama/issues/4493
| 2,302,119,638
|
I_kwDOJ0Z1Ps6JN47W
| 4,493
|
How can we make model calls faster
|
{
"login": "userandpass",
"id": 26294920,
"node_id": "MDQ6VXNlcjI2Mjk0OTIw",
"avatar_url": "https://avatars.githubusercontent.com/u/26294920?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/userandpass",
"html_url": "https://github.com/userandpass",
"followers_url": "https://api.github.com/users/userandpass/followers",
"following_url": "https://api.github.com/users/userandpass/following{/other_user}",
"gists_url": "https://api.github.com/users/userandpass/gists{/gist_id}",
"starred_url": "https://api.github.com/users/userandpass/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/userandpass/subscriptions",
"organizations_url": "https://api.github.com/users/userandpass/orgs",
"repos_url": "https://api.github.com/users/userandpass/repos",
"events_url": "https://api.github.com/users/userandpass/events{/privacy}",
"received_events_url": "https://api.github.com/users/userandpass/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-05-17T08:25:22
| 2025-01-12T00:53:21
| 2025-01-12T00:53:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I used Docker to run multiple Ollama containers and distributed requests across them with nginx, but this was much slower than calling a deployed model directly.
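Not part of the original report, but a minimal nginx sketch of the setup being described may help pin down where the latency comes from. The backend addresses and ports below are hypothetical placeholders; the keepalive and buffering settings address two common sources of added latency when proxying streamed Ollama responses:

```nginx
upstream ollama_backends {
    least_conn;                      # route each request to the least-busy backend
    server 127.0.0.1:11434;          # hypothetical container ports
    server 127.0.0.1:11435;
    keepalive 16;                    # reuse upstream TCP connections
}

server {
    listen 8080;
    location / {
        proxy_pass http://ollama_backends;
        proxy_http_version 1.1;      # required for upstream keepalive
        proxy_set_header Connection "";
        proxy_buffering off;         # do not hold streamed tokens in a buffer
        proxy_read_timeout 300s;     # first token can be slow while a model loads
    }
}
```

`proxy_buffering off` matters most for streamed generation: with buffering on, nginx holds tokens until a buffer fills, which looks like a much slower model even though the backend is generating at full speed.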
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.1.34
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4493/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4493/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7766
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7766/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7766/comments
|
https://api.github.com/repos/ollama/ollama/issues/7766/events
|
https://github.com/ollama/ollama/issues/7766
| 2,676,975,457
|
I_kwDOJ0Z1Ps6fj2dh
| 7,766
|
ollama hangs randomly and sometimes responds with G's
|
{
"login": "Pho3niX90",
"id": 7858187,
"node_id": "MDQ6VXNlcjc4NTgxODc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7858187?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Pho3niX90",
"html_url": "https://github.com/Pho3niX90",
"followers_url": "https://api.github.com/users/Pho3niX90/followers",
"following_url": "https://api.github.com/users/Pho3niX90/following{/other_user}",
"gists_url": "https://api.github.com/users/Pho3niX90/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Pho3niX90/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Pho3niX90/subscriptions",
"organizations_url": "https://api.github.com/users/Pho3niX90/orgs",
"repos_url": "https://api.github.com/users/Pho3niX90/repos",
"events_url": "https://api.github.com/users/Pho3niX90/events{/privacy}",
"received_events_url": "https://api.github.com/users/Pho3niX90/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 11
| 2024-11-20T19:32:34
| 2024-11-23T19:47:55
| 2024-11-23T19:47:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am starting my journey into Ollama, so my info below might not align 100% with what you need, but I can provide more as needed.
After the prompts "hang", I need to restart the service to get it going again.
Short generations seem relatively OK.
Asking for longer responses typically hangs it mid-sentence.
Asking again has it replying with "GGGGGGGGGGGGGGGG".
On the graph below, the response stalled on the 6th prompt:

Here is an example where it gave me the "G's" straight off the bat; I restarted the service, and all was well:

Until I asked it to write an extra-long essay multiple times:

Notice the last "G" output before it hangs.
**More Info**
**Model**: llama3.2
**Params** --ctx-size 8192 --batch-size 512 --embedding --log-disable --n-gpu-layers 29 --parallel 4 --port 32807
1. VRAM never seems to exceed 3700 MB
2. Only a single CPU thread seems to be utilized, always at 100%
CPU-only gives no issues, at around 10 tokens/s.
GPU: around 90 tokens/s.
**System Specs:**
CPU: AMD EPYC 32-core
GPU: 3060 12 GB (gen4 8x riser)
Mem: 256GB DDR4
OS: Ubuntu 22.04
Disk: nvme
Cuda 12.6
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.4.2
|
{
"login": "Pho3niX90",
"id": 7858187,
"node_id": "MDQ6VXNlcjc4NTgxODc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7858187?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Pho3niX90",
"html_url": "https://github.com/Pho3niX90",
"followers_url": "https://api.github.com/users/Pho3niX90/followers",
"following_url": "https://api.github.com/users/Pho3niX90/following{/other_user}",
"gists_url": "https://api.github.com/users/Pho3niX90/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Pho3niX90/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Pho3niX90/subscriptions",
"organizations_url": "https://api.github.com/users/Pho3niX90/orgs",
"repos_url": "https://api.github.com/users/Pho3niX90/repos",
"events_url": "https://api.github.com/users/Pho3niX90/events{/privacy}",
"received_events_url": "https://api.github.com/users/Pho3niX90/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7766/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7766/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1859
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1859/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1859/comments
|
https://api.github.com/repos/ollama/ollama/issues/1859/events
|
https://github.com/ollama/ollama/issues/1859
| 2,071,016,279
|
I_kwDOJ0Z1Ps57cTNX
| 1,859
|
Pull model manifest connect timed out
|
{
"login": "shivrajjadhav733",
"id": 35407279,
"node_id": "MDQ6VXNlcjM1NDA3Mjc5",
"avatar_url": "https://avatars.githubusercontent.com/u/35407279?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shivrajjadhav733",
"html_url": "https://github.com/shivrajjadhav733",
"followers_url": "https://api.github.com/users/shivrajjadhav733/followers",
"following_url": "https://api.github.com/users/shivrajjadhav733/following{/other_user}",
"gists_url": "https://api.github.com/users/shivrajjadhav733/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shivrajjadhav733/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shivrajjadhav733/subscriptions",
"organizations_url": "https://api.github.com/users/shivrajjadhav733/orgs",
"repos_url": "https://api.github.com/users/shivrajjadhav733/repos",
"events_url": "https://api.github.com/users/shivrajjadhav733/events{/privacy}",
"received_events_url": "https://api.github.com/users/shivrajjadhav733/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 24
| 2024-01-08T18:36:59
| 2024-07-25T03:35:49
| 2024-03-11T20:24:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
OS - Apple M1 Pro chip
I tried to install Ollama on my machine. The installation was successful, and I can see the Ollama icon in the menu bar at the top.
When I try to run a model with the command
`ollama run laama2`
or
`ollama run mistral`
I get the attached "operation timed out" error.

I tried to run `brew services restart ollama` and got the error "Error: Formula 'ollama' is not installed."
How do I fix these errors and run models with Ollama?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1859/reactions",
"total_count": 4,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 4
}
|
https://api.github.com/repos/ollama/ollama/issues/1859/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2119
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2119/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2119/comments
|
https://api.github.com/repos/ollama/ollama/issues/2119/events
|
https://github.com/ollama/ollama/issues/2119
| 2,092,504,174
|
I_kwDOJ0Z1Ps58uRRu
| 2,119
|
Can ollama access internet?
|
{
"login": "zinwelzl",
"id": 113045180,
"node_id": "U_kgDOBrzuvA",
"avatar_url": "https://avatars.githubusercontent.com/u/113045180?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zinwelzl",
"html_url": "https://github.com/zinwelzl",
"followers_url": "https://api.github.com/users/zinwelzl/followers",
"following_url": "https://api.github.com/users/zinwelzl/following{/other_user}",
"gists_url": "https://api.github.com/users/zinwelzl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zinwelzl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zinwelzl/subscriptions",
"organizations_url": "https://api.github.com/users/zinwelzl/orgs",
"repos_url": "https://api.github.com/users/zinwelzl/repos",
"events_url": "https://api.github.com/users/zinwelzl/events{/privacy}",
"received_events_url": "https://api.github.com/users/zinwelzl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-01-21T09:40:12
| 2024-01-27T00:30:51
| 2024-01-27T00:30:50
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Can Ollama access the internet and, for example, summarize text from it?
I tried it, but it didn't work.
Maybe my installation doesn't work correctly?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2119/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2119/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8533
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8533/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8533/comments
|
https://api.github.com/repos/ollama/ollama/issues/8533/events
|
https://github.com/ollama/ollama/issues/8533
| 2,804,037,092
|
I_kwDOJ0Z1Ps6nIjXk
| 8,533
|
pulling model: stuck at 0% and Error: max retries exceeded: EOF
|
{
"login": "yifan0011",
"id": 173574832,
"node_id": "U_kgDOCliKsA",
"avatar_url": "https://avatars.githubusercontent.com/u/173574832?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yifan0011",
"html_url": "https://github.com/yifan0011",
"followers_url": "https://api.github.com/users/yifan0011/followers",
"following_url": "https://api.github.com/users/yifan0011/following{/other_user}",
"gists_url": "https://api.github.com/users/yifan0011/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yifan0011/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yifan0011/subscriptions",
"organizations_url": "https://api.github.com/users/yifan0011/orgs",
"repos_url": "https://api.github.com/users/yifan0011/repos",
"events_url": "https://api.github.com/users/yifan0011/events{/privacy}",
"received_events_url": "https://api.github.com/users/yifan0011/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 10
| 2025-01-22T10:44:48
| 2025-01-24T12:36:51
| 2025-01-24T12:34:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After successfully installing Ollama, I wanted to pull the mistral model and encountered an error:
```
U:\>ollama pull mistral
pulling manifest
pulling ff82381e2bea... 0% ▕ ▏ 0 B/4.1 GB
Error: max retries exceeded: EOF
```
This was after I had set the system variable `HTTPS_Proxy`. Without this variable, I get:
```
U:\>ollama pull mistral
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/mistral/manifests/latest": dial tcp 172.67.182.229:443: i/o timeout
```
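Not part of the original report, but for anyone hitting the same pair of errors: a timeout without the proxy variable and an EOF with it often means the variable is visible to the `ollama` client shell but not to the server process that performs the actual download. A sketch of setting it for both (the proxy address is a hypothetical placeholder):

```shell
# Hypothetical proxy address -- replace with your own.
export HTTPS_PROXY="http://proxy.example.com:8080"

# On Windows, the equivalent persistent setting is:
#   setx HTTPS_PROXY http://proxy.example.com:8080
# then restart the Ollama service/tray app so the server
# process inherits the variable, not just the current shell.

echo "HTTPS_PROXY is set to: $HTTPS_PROXY"
```

After restarting the server with the variable in place, rerun `ollama pull mistral` and check whether the download progresses past 0%.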
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.7
|
{
"login": "yifan0011",
"id": 173574832,
"node_id": "U_kgDOCliKsA",
"avatar_url": "https://avatars.githubusercontent.com/u/173574832?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yifan0011",
"html_url": "https://github.com/yifan0011",
"followers_url": "https://api.github.com/users/yifan0011/followers",
"following_url": "https://api.github.com/users/yifan0011/following{/other_user}",
"gists_url": "https://api.github.com/users/yifan0011/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yifan0011/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yifan0011/subscriptions",
"organizations_url": "https://api.github.com/users/yifan0011/orgs",
"repos_url": "https://api.github.com/users/yifan0011/repos",
"events_url": "https://api.github.com/users/yifan0011/events{/privacy}",
"received_events_url": "https://api.github.com/users/yifan0011/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8533/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8533/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8598
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8598/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8598/comments
|
https://api.github.com/repos/ollama/ollama/issues/8598/events
|
https://github.com/ollama/ollama/issues/8598
| 2,811,838,719
|
I_kwDOJ0Z1Ps6nmUD_
| 8,598
|
Error Running Mistral Nemo Imported from .safetensors
|
{
"login": "aallgeier",
"id": 121313302,
"node_id": "U_kgDOBzsYFg",
"avatar_url": "https://avatars.githubusercontent.com/u/121313302?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aallgeier",
"html_url": "https://github.com/aallgeier",
"followers_url": "https://api.github.com/users/aallgeier/followers",
"following_url": "https://api.github.com/users/aallgeier/following{/other_user}",
"gists_url": "https://api.github.com/users/aallgeier/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aallgeier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aallgeier/subscriptions",
"organizations_url": "https://api.github.com/users/aallgeier/orgs",
"repos_url": "https://api.github.com/users/aallgeier/repos",
"events_url": "https://api.github.com/users/aallgeier/events{/privacy}",
"received_events_url": "https://api.github.com/users/aallgeier/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2025-01-26T23:15:22
| 2025-01-26T23:27:58
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I encountered an error when attempting to run the Mistral Nemo model imported from `.safetensors`. I intend to run the model on CPU only, even though I have a GPU (see the Modelfile below).
- I am able to run the model converted to `.gguf`.
- However, I would like to import and run directly from `.safetensors` if possible.
### Steps to Reproduce
1. Download model files from [mistralai/Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407/tree/main).
2. Create a `Modelfile` with the following content:
```
FROM <PATH TO .SAFETENSOR FILES>
PARAMETER num_gpu 0
```
3. Start the ollama server: `ollama serve`
4. Create the model: `ollama create nemo -f Modelfile`
5. Run the model: `ollama run nemo`
- **Error message**: `Error: llama runner process has terminated: error loading model: error loading model hyperparameters: invalid n_rot: 160, expected 128 llama_load_model_from_file: failed to load model`
### OS, GPU, CPU
- OS: Linux fedora 6.12.6
- GPU: Radeon RX 7600 XT
- CPU: AMD Ryzen 7 7700X
- RAM: 64GB
Thank you in advance for the help!
### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.5.4
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8598/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8598/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3610
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3610/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3610/comments
|
https://api.github.com/repos/ollama/ollama/issues/3610/events
|
https://github.com/ollama/ollama/pull/3610
| 2,238,984,165
|
PR_kwDOJ0Z1Ps5sbbAo
| 3,610
|
Added Solar example at README.md
|
{
"login": "hunkim",
"id": 901975,
"node_id": "MDQ6VXNlcjkwMTk3NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/901975?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hunkim",
"html_url": "https://github.com/hunkim",
"followers_url": "https://api.github.com/users/hunkim/followers",
"following_url": "https://api.github.com/users/hunkim/following{/other_user}",
"gists_url": "https://api.github.com/users/hunkim/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hunkim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hunkim/subscriptions",
"organizations_url": "https://api.github.com/users/hunkim/orgs",
"repos_url": "https://api.github.com/users/hunkim/repos",
"events_url": "https://api.github.com/users/hunkim/events{/privacy}",
"received_events_url": "https://api.github.com/users/hunkim/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-04-12T03:37:07
| 2024-04-15T23:54:23
| 2024-04-15T23:54:23
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3610",
"html_url": "https://github.com/ollama/ollama/pull/3610",
"diff_url": "https://github.com/ollama/ollama/pull/3610.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3610.patch",
"merged_at": "2024-04-15T23:54:23"
}
|
Added just one line:
| Solar | 10.7B | 6.1GB | `ollama run solar` |
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3610/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3610/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5007
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5007/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5007/comments
|
https://api.github.com/repos/ollama/ollama/issues/5007/events
|
https://github.com/ollama/ollama/pull/5007
| 2,349,673,042
|
PR_kwDOJ0Z1Ps5ySS7K
| 5,007
|
OpenAI: /v1/models and /v1/models/{model} compatibility
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-06-12T20:59:37
| 2024-07-02T18:51:00
| 2024-07-02T18:50:56
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5007",
"html_url": "https://github.com/ollama/ollama/pull/5007",
"diff_url": "https://github.com/ollama/ollama/pull/5007.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5007.patch",
"merged_at": "2024-07-02T18:50:56"
}
|
This PR adds compatibility with the `/v1/models` and `/v1/models/{model}` endpoints for listing models and retrieving a single model.
E.g.
`curl http://localhost:11434/v1/models`
```
{
"object": "list",
"data": [
{
"id": "mario:latest",
"object": "model",
"created": 1718141294,
"owned_by": "ollama"
},
{
"id": "nomic-embed-text:latest",
"object": "model",
"created": 1718054969,
"owned_by": "ollama"
},
{
"id": "llava:latest",
"object": "model",
"created": 1718049682,
"owned_by": "ollama"
},
{
"id": "mistral:latest",
"object": "model",
"created": 1717609491,
"owned_by": "ollama"
},
{
"id": "llama3:latest",
"object": "model",
"created": 1717451603,
"owned_by": "ollama"
}
]
}
```
Resolves #2430
Resolves #2476
Includes #5028
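For clients consuming this endpoint, the `data` array above maps directly onto model ids. A minimal Python sketch (parsing a response in the shape shown above, not contacting a live server):

```python
import json

# Sample /v1/models response in the shape shown above (illustrative, not live output).
response_text = """
{
  "object": "list",
  "data": [
    {"id": "mistral:latest", "object": "model", "created": 1717609491, "owned_by": "ollama"},
    {"id": "llama3:latest", "object": "model", "created": 1717451603, "owned_by": "ollama"}
  ]
}
"""

payload = json.loads(response_text)
model_ids = [m["id"] for m in payload["data"]]
print(model_ids)  # → ['mistral:latest', 'llama3:latest']
```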
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5007/reactions",
"total_count": 5,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 5,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5007/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6139
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6139/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6139/comments
|
https://api.github.com/repos/ollama/ollama/issues/6139/events
|
https://github.com/ollama/ollama/issues/6139
| 2,444,415,836
|
I_kwDOJ0Z1Ps6RstNc
| 6,139
|
error: llama runner process has terminated: CUDA error: CUBLAS_STATUS_ALLOC_FAILED
|
{
"login": "trixtipsfix",
"id": 69011613,
"node_id": "MDQ6VXNlcjY5MDExNjEz",
"avatar_url": "https://avatars.githubusercontent.com/u/69011613?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/trixtipsfix",
"html_url": "https://github.com/trixtipsfix",
"followers_url": "https://api.github.com/users/trixtipsfix/followers",
"following_url": "https://api.github.com/users/trixtipsfix/following{/other_user}",
"gists_url": "https://api.github.com/users/trixtipsfix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/trixtipsfix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/trixtipsfix/subscriptions",
"organizations_url": "https://api.github.com/users/trixtipsfix/orgs",
"repos_url": "https://api.github.com/users/trixtipsfix/repos",
"events_url": "https://api.github.com/users/trixtipsfix/events{/privacy}",
"received_events_url": "https://api.github.com/users/trixtipsfix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 22
| 2024-08-02T08:41:15
| 2024-10-31T18:18:28
| 2024-10-31T18:18:28
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm getting the following error when I try to run the Ollama model:

Additionally, sometimes it behaves like this and disappears upon system restart:

### Error Details:
```
error: llama runner process has terminated: CUDA error: CUBLAS_STATUS_ALLOC_FAILED
current device: 0, in function cublas_handle at /go/src/github.com/ollama/ollama/llm/llama.cpp/ggml/src/ggml-cuda/common.cuh:826
cublasCreate_v2(&cublas_handles[device]) GGML_ASSERT: /go/src/github.com/ollama/ollama/llm/llama.cpp/ggml/src/ggml-cuda.cu:101: !"CUDA error"
```
**LLM Model:** deepseek-coder-v2:latest
**Note:** It is working with _deepseek-coder_ model.
### System Information:
- **CPU** - AMD Ryzen 5600
- **OS** - Ubuntu 24.04 LTS (GNU/Linux 6.8.0-39-generic x86_64)
- **GPU** - Nvidia 3060 Ti 8 GB VRAM
- **CUDA** - 12.5
- **RAM** - 32 GB
- **Ollama version** - 0.3.0
### Steps to Reproduce:
- Start the Ollama model.
`ollama run deepseek-coder-v2`
- Observe the error messages in the logs or terminal.
### Expected Behavior:
- The model should run without any CUDA-related errors.
### Actual Behavior:
The model fails to run, showing the above-mentioned CUDA error. When I restart the Ollama service, it works for a while.
```
~$ systemctl stop ollama
~$ systemctl start ollama
```
### Additional Context:
I've tried restarting the system and reinstalling Ollama, but the issue persists. Any insights or suggestions would be greatly appreciated.
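One common mitigation for `CUBLAS_STATUS_ALLOC_FAILED` on an 8 GB card is to offload fewer layers to the GPU via the `num_gpu` option. A hedged sketch of such a request body follows; the layer count of 20 is an illustrative starting point to tune downward, and no live server is contacted here:

```python
import json

# Hypothetical /api/generate request asking Ollama to keep only 20 model
# layers on the GPU; lower num_gpu until the model fits in available VRAM.
request = {
    "model": "deepseek-coder-v2",
    "prompt": "write a quicksort in python",
    "options": {"num_gpu": 20},
}
body = json.dumps(request)
# e.g. urllib.request.urlopen("http://localhost:11434/api/generate", body.encode())
print(body)
```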
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6139/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6139/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6720
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6720/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6720/comments
|
https://api.github.com/repos/ollama/ollama/issues/6720/events
|
https://github.com/ollama/ollama/issues/6720
| 2,515,506,580
|
I_kwDOJ0Z1Ps6V75WU
| 6,720
|
Can you specify a graphics card in the ollama deployment model?
|
{
"login": "LIUKAI0815",
"id": 48339931,
"node_id": "MDQ6VXNlcjQ4MzM5OTMx",
"avatar_url": "https://avatars.githubusercontent.com/u/48339931?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LIUKAI0815",
"html_url": "https://github.com/LIUKAI0815",
"followers_url": "https://api.github.com/users/LIUKAI0815/followers",
"following_url": "https://api.github.com/users/LIUKAI0815/following{/other_user}",
"gists_url": "https://api.github.com/users/LIUKAI0815/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LIUKAI0815/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LIUKAI0815/subscriptions",
"organizations_url": "https://api.github.com/users/LIUKAI0815/orgs",
"repos_url": "https://api.github.com/users/LIUKAI0815/repos",
"events_url": "https://api.github.com/users/LIUKAI0815/events{/privacy}",
"received_events_url": "https://api.github.com/users/LIUKAI0815/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 1
| 2024-09-10T06:17:18
| 2024-09-12T01:24:12
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
For example, run qwen2 on the GPU selected by `CUDA_VISIBLE_DEVICES=2` and glm4 on the GPU selected by `CUDA_VISIBLE_DEVICES=4`.
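As a workaround until per-model GPU assignment exists, separate `ollama serve` instances can be pinned to different GPUs via `CUDA_VISIBLE_DEVICES` and given distinct ports via `OLLAMA_HOST`. A sketch of the environment setup (the device ids and ports are illustrative, and the launch line is left commented so nothing is actually started):

```python
import os
import subprocess  # used only in the commented-out launch line below

def serve_env(gpu: str, port: int) -> dict:
    """Build an environment that pins one ollama serve instance to one GPU."""
    env = dict(os.environ)
    env["CUDA_VISIBLE_DEVICES"] = gpu         # which GPU this instance may see
    env["OLLAMA_HOST"] = f"127.0.0.1:{port}"  # unique port per instance
    return env

qwen2_env = serve_env("2", 11435)  # then point qwen2 clients at port 11435
glm4_env = serve_env("4", 11436)   # and glm4 clients at port 11436
# subprocess.Popen(["ollama", "serve"], env=qwen2_env)
print(qwen2_env["CUDA_VISIBLE_DEVICES"], glm4_env["OLLAMA_HOST"])
```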
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6720/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6720/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1464
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1464/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1464/comments
|
https://api.github.com/repos/ollama/ollama/issues/1464/events
|
https://github.com/ollama/ollama/issues/1464
| 2,035,222,853
|
I_kwDOJ0Z1Ps55TwlF
| 1,464
|
Ollama API Documentation: Error responses are not defined.
|
{
"login": "therohitdas",
"id": 43847374,
"node_id": "MDQ6VXNlcjQzODQ3Mzc0",
"avatar_url": "https://avatars.githubusercontent.com/u/43847374?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/therohitdas",
"html_url": "https://github.com/therohitdas",
"followers_url": "https://api.github.com/users/therohitdas/followers",
"following_url": "https://api.github.com/users/therohitdas/following{/other_user}",
"gists_url": "https://api.github.com/users/therohitdas/gists{/gist_id}",
"starred_url": "https://api.github.com/users/therohitdas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/therohitdas/subscriptions",
"organizations_url": "https://api.github.com/users/therohitdas/orgs",
"repos_url": "https://api.github.com/users/therohitdas/repos",
"events_url": "https://api.github.com/users/therohitdas/events{/privacy}",
"received_events_url": "https://api.github.com/users/therohitdas/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 1
| 2023-12-11T09:29:45
| 2024-11-06T19:04:33
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I received a 500 Internal Server Error as a response when using Ollama on macOS with Ollama-UI.
This made me realize that error responses are not mentioned in [Ollama's documentation](https://github.com/jmorganca/ollama/blob/main/docs/api.md), so other projects cannot handle them properly in their proxies.
Here is the Ollama log dump for my issue: https://github.com/ollama-webui/ollama-webui/issues/193#issuecomment-1849616910
Can the errors returned by each API endpoint be added to the documentation?
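In the absence of documented error shapes, clients currently have to parse responses defensively. A sketch of how one might normalize an Ollama error payload (the `{"error": "..."}` body shape matches what the server emits on failures, but treating every 4xx/5xx this way is an assumption):

```python
import json

def parse_ollama_error(status: int, body: str):
    """Return a human-readable error message, or None if the response is OK."""
    if status < 400:
        return None
    try:
        return json.loads(body).get("error", body)
    except (ValueError, AttributeError):
        return body  # non-JSON error body: fall back to the raw text

msg = parse_ollama_error(500, '{"error": "llama runner process has terminated"}')
print(msg)  # → llama runner process has terminated
```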
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1464/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1464/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2159
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2159/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2159/comments
|
https://api.github.com/repos/ollama/ollama/issues/2159/events
|
https://github.com/ollama/ollama/issues/2159
| 2,096,434,733
|
I_kwDOJ0Z1Ps589Q4t
| 2,159
|
Do we have a Go client
|
{
"login": "liliang-cn",
"id": 20553741,
"node_id": "MDQ6VXNlcjIwNTUzNzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/20553741?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/liliang-cn",
"html_url": "https://github.com/liliang-cn",
"followers_url": "https://api.github.com/users/liliang-cn/followers",
"following_url": "https://api.github.com/users/liliang-cn/following{/other_user}",
"gists_url": "https://api.github.com/users/liliang-cn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/liliang-cn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/liliang-cn/subscriptions",
"organizations_url": "https://api.github.com/users/liliang-cn/orgs",
"repos_url": "https://api.github.com/users/liliang-cn/repos",
"events_url": "https://api.github.com/users/liliang-cn/events{/privacy}",
"received_events_url": "https://api.github.com/users/liliang-cn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-01-23T16:09:04
| 2024-03-31T02:44:33
| 2024-03-11T19:16:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I know there is an HTTP API, but can I use this API in a manner similar to [ollama-python](https://github.com/jmorganca/ollama-python)?
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2159/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2159/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/3784
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3784/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3784/comments
|
https://api.github.com/repos/ollama/ollama/issues/3784/events
|
https://github.com/ollama/ollama/pull/3784
| 2,254,686,971
|
PR_kwDOJ0Z1Ps5tQT79
| 3,784
|
Allow whitespace within objects and arrays, but remove trailing possibly infinite whitespace
|
{
"login": "hughescr",
"id": 46348,
"node_id": "MDQ6VXNlcjQ2MzQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/46348?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hughescr",
"html_url": "https://github.com/hughescr",
"followers_url": "https://api.github.com/users/hughescr/followers",
"following_url": "https://api.github.com/users/hughescr/following{/other_user}",
"gists_url": "https://api.github.com/users/hughescr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hughescr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hughescr/subscriptions",
"organizations_url": "https://api.github.com/users/hughescr/orgs",
"repos_url": "https://api.github.com/users/hughescr/repos",
"events_url": "https://api.github.com/users/hughescr/events{/privacy}",
"received_events_url": "https://api.github.com/users/hughescr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2024-04-20T19:28:13
| 2024-12-05T00:51:35
| 2024-12-05T00:51:35
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3784",
"html_url": "https://github.com/ollama/ollama/pull/3784",
"diff_url": "https://github.com/ollama/ollama/pull/3784.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3784.patch",
"merged_at": null
}
|
This PR tweaks the JSON grammar to improve its use of whitespace (though it doesn't remove whitespace entirely): it prevents trailing whitespace on grammatical elements but still allows whitespace inside `{}` or `[]`. This reduces the likelihood that a model emits a complete JSON object/array/primitive/literal and then appends a long run of useless whitespace, consuming time and money and churning more CO2 into the atmosphere.
There's still a chance, given the definition of `ws`, that the model could do this *within* objects/arrays, but from anecdotal testing on my machine with various models, this seems far less common than trailing whitespace at the end of the entire response.
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3784/reactions",
"total_count": 8,
"+1": 8,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3784/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6841
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6841/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6841/comments
|
https://api.github.com/repos/ollama/ollama/issues/6841/events
|
https://github.com/ollama/ollama/pull/6841
| 2,531,641,164
|
PR_kwDOJ0Z1Ps57yfoV
| 6,841
|
Add python examples for `bespoke-minicheck`
|
{
"login": "RyanMarten",
"id": 18333503,
"node_id": "MDQ6VXNlcjE4MzMzNTAz",
"avatar_url": "https://avatars.githubusercontent.com/u/18333503?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RyanMarten",
"html_url": "https://github.com/RyanMarten",
"followers_url": "https://api.github.com/users/RyanMarten/followers",
"following_url": "https://api.github.com/users/RyanMarten/following{/other_user}",
"gists_url": "https://api.github.com/users/RyanMarten/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RyanMarten/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RyanMarten/subscriptions",
"organizations_url": "https://api.github.com/users/RyanMarten/orgs",
"repos_url": "https://api.github.com/users/RyanMarten/repos",
"events_url": "https://api.github.com/users/RyanMarten/events{/privacy}",
"received_events_url": "https://api.github.com/users/RyanMarten/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-17T16:57:14
| 2024-09-18T16:35:25
| 2024-09-18T16:35:25
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6841",
"html_url": "https://github.com/ollama/ollama/pull/6841",
"diff_url": "https://github.com/ollama/ollama/pull/6841.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6841.patch",
"merged_at": "2024-09-18T16:35:25"
}
|
Adds two examples `python-grounded-factuality-rag-check` and `python-grounded-factuality-simple-check` which showcase the `bespoke-minicheck` model.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6841/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6841/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6882
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6882/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6882/comments
|
https://api.github.com/repos/ollama/ollama/issues/6882/events
|
https://github.com/ollama/ollama/issues/6882
| 2,537,360,886
|
I_kwDOJ0Z1Ps6XPQ32
| 6,882
|
Core Dump | applicationError
|
{
"login": "nPHYN1T3",
"id": 38122105,
"node_id": "MDQ6VXNlcjM4MTIyMTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/38122105?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nPHYN1T3",
"html_url": "https://github.com/nPHYN1T3",
"followers_url": "https://api.github.com/users/nPHYN1T3/followers",
"following_url": "https://api.github.com/users/nPHYN1T3/following{/other_user}",
"gists_url": "https://api.github.com/users/nPHYN1T3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nPHYN1T3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nPHYN1T3/subscriptions",
"organizations_url": "https://api.github.com/users/nPHYN1T3/orgs",
"repos_url": "https://api.github.com/users/nPHYN1T3/repos",
"events_url": "https://api.github.com/users/nPHYN1T3/events{/privacy}",
"received_events_url": "https://api.github.com/users/nPHYN1T3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-09-19T21:01:14
| 2024-09-20T22:17:09
| 2024-09-19T22:14:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have been trying to do something with deepseek-coder-v2, but it takes constant prompt revisions, and then it core dumps and I have to start all over trying to figure out how to get it back to the understanding it had before it died.
None of my GPUs are overly taxed, plenty of VRAM is available when it dies, and system memory is also nowhere near capacity.
The system/kernel log shows `(ollama_llama_se) of user XXX terminated abnormally with signal 6/ABRT`, while in the terminal I get `applicationError: an unknown error was encountered while running the model`.
As an additional side note, I see people here trying basic word problems with models to test logic, which touches on why this crash is extra frustrating. I don't know how they expect this to work at all, for the same reason that starting over after crashes is so frustrating prompt-wise. A lengthy, detailed prompt seems to make things worse, whereas too little of a prompt yields the same useless answers.
For example, with all the models I've tested, if I (oversimplifying here) say "I have three apples, I like cloudy days, James Hetfield's song writing is as bad as Donald Trump's hair style. How many apples do I have?", the model will respond with something like "It's sad you don't like Donald Trump's personal style or the musical stylings of Metallica... perhaps I could help you find some music you do like." Face, meet palm; ask again, CRASH... start over, CRASH, start over... never mind asking anything technical. Given this, the "A train leaves moving east at 30 kph" style of word question seems insane, as the model is at the back of the class eating paste and will reply "I like trains, they go choo choo." So perhaps this is a "featureRANTquest": there needs to be some kind of session persistence across crashes. At present there is essentially no way to move forward without potentially infinite prompt testing, which still has R&D failure baked in.
As an additional tidbit, I also read that for some people pushing the questions into llama.cpp directly seems to work, while outside of that you end up with two-thirds of the details ignored and then an irrelevant reply to the last bit of info... perhaps a related issue (that report is on llama3.1) showing it's not the model but something in Ollama?
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.3.8
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6882/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6882/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6101
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6101/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6101/comments
|
https://api.github.com/repos/ollama/ollama/issues/6101/events
|
https://github.com/ollama/ollama/issues/6101
| 2,440,541,009
|
I_kwDOJ0Z1Ps6Rd7NR
| 6,101
|
Ollama is unable to resume interrupted pulls
|
{
"login": "nviraj",
"id": 8409854,
"node_id": "MDQ6VXNlcjg0MDk4NTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/8409854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nviraj",
"html_url": "https://github.com/nviraj",
"followers_url": "https://api.github.com/users/nviraj/followers",
"following_url": "https://api.github.com/users/nviraj/following{/other_user}",
"gists_url": "https://api.github.com/users/nviraj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nviraj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nviraj/subscriptions",
"organizations_url": "https://api.github.com/users/nviraj/orgs",
"repos_url": "https://api.github.com/users/nviraj/repos",
"events_url": "https://api.github.com/users/nviraj/events{/privacy}",
"received_events_url": "https://api.github.com/users/nviraj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-07-31T17:25:10
| 2024-07-31T23:57:33
| 2024-07-31T23:57:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Previously, if I interrupted a pull and started it again (usually after the speed dropped, or when it wouldn't go to completion after 95%), it was able to resume.
However, today when I pulled the gemma2:2b model, it was unable to do so and started from scratch. This happened multiple times.
I'm not sure why this could be the case. I am on Windows 11, with the latest version.
### OS
Windows
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.3.1
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6101/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6101/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/435
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/435/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/435/comments
|
https://api.github.com/repos/ollama/ollama/issues/435/events
|
https://github.com/ollama/ollama/issues/435
| 1,869,531,192
|
I_kwDOJ0Z1Ps5vbsg4
| 435
|
Incorrect size displayed for codellama:34b-code-q4_0 on ollama.ai
|
{
"login": "spqw",
"id": 101190846,
"node_id": "U_kgDOBggMvg",
"avatar_url": "https://avatars.githubusercontent.com/u/101190846?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/spqw",
"html_url": "https://github.com/spqw",
"followers_url": "https://api.github.com/users/spqw/followers",
"following_url": "https://api.github.com/users/spqw/following{/other_user}",
"gists_url": "https://api.github.com/users/spqw/gists{/gist_id}",
"starred_url": "https://api.github.com/users/spqw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/spqw/subscriptions",
"organizations_url": "https://api.github.com/users/spqw/orgs",
"repos_url": "https://api.github.com/users/spqw/repos",
"events_url": "https://api.github.com/users/spqw/events{/privacy}",
"received_events_url": "https://api.github.com/users/spqw/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-08-28T11:02:05
| 2023-08-28T14:00:43
| 2023-08-28T13:58:48
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It seems there is a typo on https://ollama.ai/library/codellama/tags, where the displayed model size for `34b-code-q4_0` is 6.7 GB. When downloading it, it turns out to be 19 GB instead.
<img width="996" alt="image" src="https://github.com/jmorganca/ollama/assets/101190846/0d32d491-6498-4b1f-9a65-06bf7a8458e4">
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/435/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/435/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4062
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4062/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4062/comments
|
https://api.github.com/repos/ollama/ollama/issues/4062/events
|
https://github.com/ollama/ollama/issues/4062
| 2,272,582,725
|
I_kwDOJ0Z1Ps6HdNxF
| 4,062
|
llama3-gradient going crazy
|
{
"login": "DuckyBlender",
"id": 42645784,
"node_id": "MDQ6VXNlcjQyNjQ1Nzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/42645784?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DuckyBlender",
"html_url": "https://github.com/DuckyBlender",
"followers_url": "https://api.github.com/users/DuckyBlender/followers",
"following_url": "https://api.github.com/users/DuckyBlender/following{/other_user}",
"gists_url": "https://api.github.com/users/DuckyBlender/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DuckyBlender/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DuckyBlender/subscriptions",
"organizations_url": "https://api.github.com/users/DuckyBlender/orgs",
"repos_url": "https://api.github.com/users/DuckyBlender/repos",
"events_url": "https://api.github.com/users/DuckyBlender/events{/privacy}",
"received_events_url": "https://api.github.com/users/DuckyBlender/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 8
| 2024-04-30T22:26:07
| 2024-11-10T22:33:23
| 2024-04-30T22:32:40
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I just ran `ollama run llama3-gradient` and this happened, using the default (2k) context size.

### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.32
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4062/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4062/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1087
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1087/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1087/comments
|
https://api.github.com/repos/ollama/ollama/issues/1087/events
|
https://github.com/ollama/ollama/issues/1087
| 1,989,020,115
|
I_kwDOJ0Z1Ps52jgnT
| 1,087
|
System Performance Benchmarking
|
{
"login": "K1ngjulien",
"id": 16562333,
"node_id": "MDQ6VXNlcjE2NTYyMzMz",
"avatar_url": "https://avatars.githubusercontent.com/u/16562333?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/K1ngjulien",
"html_url": "https://github.com/K1ngjulien",
"followers_url": "https://api.github.com/users/K1ngjulien/followers",
"following_url": "https://api.github.com/users/K1ngjulien/following{/other_user}",
"gists_url": "https://api.github.com/users/K1ngjulien/gists{/gist_id}",
"starred_url": "https://api.github.com/users/K1ngjulien/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/K1ngjulien/subscriptions",
"organizations_url": "https://api.github.com/users/K1ngjulien/orgs",
"repos_url": "https://api.github.com/users/K1ngjulien/repos",
"events_url": "https://api.github.com/users/K1ngjulien/events{/privacy}",
"received_events_url": "https://api.github.com/users/K1ngjulien/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
},
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 3
| 2023-11-11T16:09:44
| 2024-04-01T09:31:39
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi!
In threads like #738, I see a lot of people trying different hardware and software setups, followed by checking the logs for the `llama_print_timings` output to see performance results.
From my (admittedly short) time playing around with my own hardware, I've noticed a lot of inconsistency between runs, making it difficult to evaluate changes.
I would suggest an enhancement like an `ollama bench <model>` command that runs a suite of example prompts, sent sequentially or randomly to the LLM, and records the resulting data.
This way, we would all have a consistent way of comparing benchmark runs, which would also help development.
Introspecting a running session and keeping a performance log, separate from stdout, would also be useful.
Is there a way to do this already, maybe through `llama.cpp`?
I would be happy to try and implement this with some help 👍
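To make the idea concrete, here is a minimal offline sketch of what such a bench harness could measure. The prompt suite and the `generate` callable are placeholders of my own (a real `ollama bench` would ship its own prompts and call the model); only the timing/statistics shape is the point.

```python
import statistics
import time

# Hypothetical fixed prompt suite; a real `ollama bench` would ship its own.
PROMPTS = [
    "Explain the difference between a process and a thread.",
    "Write a haiku about the sea.",
    "Summarize the plot of Hamlet in two sentences.",
]

def bench(generate, prompts=PROMPTS):
    """Run each prompt through `generate` and record wall-clock latency."""
    timings = []
    for prompt in prompts:
        start = time.perf_counter()
        generate(prompt)  # response content is ignored; we only time the call
        timings.append(time.perf_counter() - start)
    return {
        "runs": len(timings),
        "mean_s": statistics.mean(timings),
        "stdev_s": statistics.stdev(timings) if len(timings) > 1 else 0.0,
    }

if __name__ == "__main__":
    # Stubbed "model" so the sketch runs without a server.
    report = bench(lambda p: p.upper())
    print(report["runs"])  # 3
```

Reporting mean plus standard deviation over a fixed suite is exactly what would make the run-to-run inconsistency visible and comparable across setups.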
Cheers,
Julian
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1087/reactions",
"total_count": 18,
"+1": 18,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1087/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5449
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5449/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5449/comments
|
https://api.github.com/repos/ollama/ollama/issues/5449/events
|
https://github.com/ollama/ollama/issues/5449
| 2,387,350,854
|
I_kwDOJ0Z1Ps6OTBVG
| 5,449
|
Validate templates on `ollama create`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-07-02T23:45:14
| 2024-07-19T22:24:30
| 2024-07-19T22:24:30
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
`ollama create` should validate the template by parsing it, if one is provided
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5449/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5449/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2347
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2347/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2347/comments
|
https://api.github.com/repos/ollama/ollama/issues/2347/events
|
https://github.com/ollama/ollama/issues/2347
| 2,117,130,379
|
I_kwDOJ0Z1Ps5-MNiL
| 2,347
|
parser/parser.go:9:2: package log/slog is not in GOROOT
|
{
"login": "kenorb",
"id": 266306,
"node_id": "MDQ6VXNlcjI2NjMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/266306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kenorb",
"html_url": "https://github.com/kenorb",
"followers_url": "https://api.github.com/users/kenorb/followers",
"following_url": "https://api.github.com/users/kenorb/following{/other_user}",
"gists_url": "https://api.github.com/users/kenorb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kenorb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kenorb/subscriptions",
"organizations_url": "https://api.github.com/users/kenorb/orgs",
"repos_url": "https://api.github.com/users/kenorb/repos",
"events_url": "https://api.github.com/users/kenorb/events{/privacy}",
"received_events_url": "https://api.github.com/users/kenorb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-02-04T14:39:12
| 2024-02-20T04:04:07
| 2024-02-20T04:04:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I tried to build the project on Ubuntu 22.04 according to the instructions, but I got the following error (`master` branch):
> parser/parser.go:9:2: package log/slog is not in GOROOT
```
$ go generate ./...
...
Finished compression
+ '[' -z '' ']'
+ ROCM_PATH=/opt/rocm
+ '[' -z '' ']'
+ '[' -d /usr/lib/cmake/CLBlast ']'
+ '[' -d /opt/rocm ']'
+ cleanup
+ cd ../llama.cpp/examples/server/
+ git checkout CMakeLists.txt server.cpp
Updated 2 paths from the index
++ ls -A ../patches/01-cache.diff ../patches/02-shutdown.diff
+ '[' -n '../patches/01-cache.diff
../patches/02-shutdown.diff' ']'
+ for patch in ../patches/*.diff
++ grep '^+++ ' ../patches/01-cache.diff
++ cut -f2 '-d '
++ cut -f2- -d/
+ for file in $(grep "^+++ " ${patch} | cut -f2 -d' ' | cut -f2- -d/)
+ cd ../llama.cpp
+ git checkout examples/server/server.cpp
Updated 0 paths from the index
+ for patch in ../patches/*.diff
++ grep '^+++ ' ../patches/02-shutdown.diff
++ cut -f2 '-d '
++ cut -f2- -d/
+ for file in $(grep "^+++ " ${patch} | cut -f2 -d' ' | cut -f2- -d/)
+ cd ../llama.cpp
+ git checkout examples/server/server.cpp
Updated 0 paths from the index
+ for file in $(grep "^+++ " ${patch} | cut -f2 -d' ' | cut -f2- -d/)
+ cd ../llama.cpp
+ git checkout examples/server/utils.hpp
Updated 1 path from the index
$ go build .
parser/parser.go:9:2: package log/slog is not in GOROOT (/usr/lib/go-1.18/src/log/slog)
parser/parser.go:10:2: package slices is not in GOROOT (/usr/lib/go-1.18/src/slices)
```
What is the reason, and how can I resolve it?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2347/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2347/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5580
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5580/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5580/comments
|
https://api.github.com/repos/ollama/ollama/issues/5580/events
|
https://github.com/ollama/ollama/pull/5580
| 2,398,906,842
|
PR_kwDOJ0Z1Ps503weR
| 5,580
|
Detect CUDA OS overhead
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-09T18:44:26
| 2024-07-10T19:47:34
| 2024-07-10T19:47:31
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5580",
"html_url": "https://github.com/ollama/ollama/pull/5580",
"diff_url": "https://github.com/ollama/ollama/pull/5580.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5580.patch",
"merged_at": "2024-07-10T19:47:31"
}
|
This adds logic to detect skew between the driver and the management library that can be attributed to OS overhead, and records it so we can adjust subsequent management-library free-VRAM updates and avoid OOM scenarios.
Fixes #5504
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5580/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5580/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6217
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6217/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6217/comments
|
https://api.github.com/repos/ollama/ollama/issues/6217/events
|
https://github.com/ollama/ollama/issues/6217
| 2,451,988,607
|
I_kwDOJ0Z1Ps6SJmB_
| 6,217
|
batch embed 500 error: no slots available after 10 retries
|
{
"login": "Schumpeterx",
"id": 29852284,
"node_id": "MDQ6VXNlcjI5ODUyMjg0",
"avatar_url": "https://avatars.githubusercontent.com/u/29852284?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Schumpeterx",
"html_url": "https://github.com/Schumpeterx",
"followers_url": "https://api.github.com/users/Schumpeterx/followers",
"following_url": "https://api.github.com/users/Schumpeterx/following{/other_user}",
"gists_url": "https://api.github.com/users/Schumpeterx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Schumpeterx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Schumpeterx/subscriptions",
"organizations_url": "https://api.github.com/users/Schumpeterx/orgs",
"repos_url": "https://api.github.com/users/Schumpeterx/repos",
"events_url": "https://api.github.com/users/Schumpeterx/events{/privacy}",
"received_events_url": "https://api.github.com/users/Schumpeterx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-08-07T01:39:05
| 2024-08-07T03:20:50
| 2024-08-07T03:20:50
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I tried using 10 threads to send batch embedding requests to `/api/embed`, and sometimes got the error below:
```
Aug 07 09:19:18 *-gpu ollama[345349]: time=2024-08-07T09:19:18.764+08:00 level=ERROR source=routes.go:368 msg="embedding generation failed" error="no slots available after 10 retries"
```
Is this a bug? If not, how can I increase the number of available slots?
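For reference, the setup I described can be sketched client-side with a semaphore capping how many requests are ever in flight at once, so the client never exceeds some assumed server slot count. The `embed` stub stands in for a real POST to `/api/embed` (the cap of 4 is an arbitrary assumption, not a known server default):

```python
import threading
from concurrent.futures import ThreadPoolExecutor

# Stub for an embedding request; a real client would POST to /api/embed.
def embed(texts):
    return [[float(len(t))] for t in texts]

# Cap in-flight requests so we never exceed an assumed server slot count.
MAX_IN_FLIGHT = 4
slots = threading.Semaphore(MAX_IN_FLIGHT)

def embed_limited(texts):
    with slots:  # blocks until a slot is free instead of failing server-side
        return embed(texts)

batches = [[f"doc {i}"] for i in range(10)]
with ThreadPoolExecutor(max_workers=10) as pool:
    # 10 worker threads, but at most MAX_IN_FLIGHT concurrent embed calls.
    results = list(pool.map(embed_limited, batches))

print(len(results))  # 10
```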
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.0
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6217/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6217/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2338
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2338/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2338/comments
|
https://api.github.com/repos/ollama/ollama/issues/2338/events
|
https://github.com/ollama/ollama/issues/2338
| 2,116,472,733
|
I_kwDOJ0Z1Ps5-Js-d
| 2,338
|
Very nice to have: capabilities info for multimodal models
|
{
"login": "da-z",
"id": 3681019,
"node_id": "MDQ6VXNlcjM2ODEwMTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/3681019?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/da-z",
"html_url": "https://github.com/da-z",
"followers_url": "https://api.github.com/users/da-z/followers",
"following_url": "https://api.github.com/users/da-z/following{/other_user}",
"gists_url": "https://api.github.com/users/da-z/gists{/gist_id}",
"starred_url": "https://api.github.com/users/da-z/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/da-z/subscriptions",
"organizations_url": "https://api.github.com/users/da-z/orgs",
"repos_url": "https://api.github.com/users/da-z/repos",
"events_url": "https://api.github.com/users/da-z/events{/privacy}",
"received_events_url": "https://api.github.com/users/da-z/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-02-03T13:00:29
| 2024-02-03T19:33:21
| 2024-02-03T19:33:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm not sure if this exists already; I checked the llava model info and it does not mention capabilities anywhere. It would be nice to detect, via `ollama show` or the API model info, that a model supports `vision`.
API Example
`GET /api/tags`
```js
{
//...
"details": {
"parent_model": "",
"format": "gguf",
"family": "llama",
"families": [
"llama",
"clip"
],
"capabilities": ["vision"]
//...
}
}
```
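As a client-side illustration of how the proposed field could be consumed (the `capabilities` key is this proposal's assumption, not a current API field), filtering a `/api/tags`-style response for vision-capable models might look like:

```python
def vision_models(tags_response):
    """Return model names whose details advertise the proposed 'vision' capability."""
    return [
        m["name"]
        for m in tags_response.get("models", [])
        if "vision" in m.get("details", {}).get("capabilities", [])
    ]

# Sample payload shaped like the proposal above.
sample = {
    "models": [
        {"name": "llava:latest",
         "details": {"families": ["llama", "clip"], "capabilities": ["vision"]}},
        {"name": "llama3:latest",
         "details": {"families": ["llama"], "capabilities": []}},
    ]
}
print(vision_models(sample))  # ['llava:latest']
```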
|
{
"login": "da-z",
"id": 3681019,
"node_id": "MDQ6VXNlcjM2ODEwMTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/3681019?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/da-z",
"html_url": "https://github.com/da-z",
"followers_url": "https://api.github.com/users/da-z/followers",
"following_url": "https://api.github.com/users/da-z/following{/other_user}",
"gists_url": "https://api.github.com/users/da-z/gists{/gist_id}",
"starred_url": "https://api.github.com/users/da-z/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/da-z/subscriptions",
"organizations_url": "https://api.github.com/users/da-z/orgs",
"repos_url": "https://api.github.com/users/da-z/repos",
"events_url": "https://api.github.com/users/da-z/events{/privacy}",
"received_events_url": "https://api.github.com/users/da-z/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2338/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2338/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7920
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7920/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7920/comments
|
https://api.github.com/repos/ollama/ollama/issues/7920/events
|
https://github.com/ollama/ollama/issues/7920
| 2,715,963,182
|
I_kwDOJ0Z1Ps6h4k8u
| 7,920
|
Custom context size not being respected.
|
{
"login": "luisbrandao",
"id": 25795753,
"node_id": "MDQ6VXNlcjI1Nzk1NzUz",
"avatar_url": "https://avatars.githubusercontent.com/u/25795753?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/luisbrandao",
"html_url": "https://github.com/luisbrandao",
"followers_url": "https://api.github.com/users/luisbrandao/followers",
"following_url": "https://api.github.com/users/luisbrandao/following{/other_user}",
"gists_url": "https://api.github.com/users/luisbrandao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/luisbrandao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luisbrandao/subscriptions",
"organizations_url": "https://api.github.com/users/luisbrandao/orgs",
"repos_url": "https://api.github.com/users/luisbrandao/repos",
"events_url": "https://api.github.com/users/luisbrandao/events{/privacy}",
"received_events_url": "https://api.github.com/users/luisbrandao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-03T20:57:31
| 2024-12-03T22:59:58
| 2024-12-03T22:59:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hello,
I noticed some inconsistent behavior.
I edited my model parameters in the admin view and set its context size.
However, I should be able to change it again in the chat window:

In this image, the value "4096" does not take effect if a context size is already set under admin -> models.
Is this intentional?
I'm using:
ghcr.io/open-webui/open-webui:v0.4.7-cuda
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.4.7
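For context, Ollama's REST API does support a per-request context size: the `/api/generate` endpoint accepts an `"options"` object, and a `"num_ctx"` value there overrides the model's default for that single request. A minimal sketch of such a payload (the model name `llama3` and prompt are placeholders, not taken from this report):

```python
import json

# Per-request override: "num_ctx" inside "options" takes precedence
# over the context size baked into the model, for this request only.
payload = {
    "model": "llama3",          # hypothetical model name
    "prompt": "Hello",
    "options": {"num_ctx": 4096},
}

# Serialize as the request body that would be POSTed to /api/generate.
body = json.dumps(payload)
print(json.loads(body)["options"]["num_ctx"])  # → 4096
```

Whether the Open WebUI chat-window setting actually forwards this override when an admin-level value is already set is exactly what this issue questions.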
|
{
"login": "luisbrandao",
"id": 25795753,
"node_id": "MDQ6VXNlcjI1Nzk1NzUz",
"avatar_url": "https://avatars.githubusercontent.com/u/25795753?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/luisbrandao",
"html_url": "https://github.com/luisbrandao",
"followers_url": "https://api.github.com/users/luisbrandao/followers",
"following_url": "https://api.github.com/users/luisbrandao/following{/other_user}",
"gists_url": "https://api.github.com/users/luisbrandao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/luisbrandao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luisbrandao/subscriptions",
"organizations_url": "https://api.github.com/users/luisbrandao/orgs",
"repos_url": "https://api.github.com/users/luisbrandao/repos",
"events_url": "https://api.github.com/users/luisbrandao/events{/privacy}",
"received_events_url": "https://api.github.com/users/luisbrandao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7920/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7920/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2668
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2668/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2668/comments
|
https://api.github.com/repos/ollama/ollama/issues/2668/events
|
https://github.com/ollama/ollama/issues/2668
| 2,148,432,888
|
I_kwDOJ0Z1Ps6ADnv4
| 2,668
|
Error: Unable to load dynamic library: Unable to load dynamic server library:
|
{
"login": "123124-1",
"id": 88172698,
"node_id": "MDQ6VXNlcjg4MTcyNjk4",
"avatar_url": "https://avatars.githubusercontent.com/u/88172698?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/123124-1",
"html_url": "https://github.com/123124-1",
"followers_url": "https://api.github.com/users/123124-1/followers",
"following_url": "https://api.github.com/users/123124-1/following{/other_user}",
"gists_url": "https://api.github.com/users/123124-1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/123124-1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/123124-1/subscriptions",
"organizations_url": "https://api.github.com/users/123124-1/orgs",
"repos_url": "https://api.github.com/users/123124-1/repos",
"events_url": "https://api.github.com/users/123124-1/events{/privacy}",
"received_events_url": "https://api.github.com/users/123124-1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-02-22T07:47:21
| 2024-05-02T22:10:53
| 2024-05-02T22:10:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
time=2024-02-22T15:43:18.086+08:00 level=INFO source=images.go:710 msg="total blobs: 11"
time=2024-02-22T15:43:18.117+08:00 level=INFO source=images.go:717 msg="total unused blobs removed: 0"
time=2024-02-22T15:43:18.120+08:00 level=INFO source=routes.go:1019 msg="Listening on 127.0.0.1:11434 (version 0.1.26)"
time=2024-02-22T15:43:18.120+08:00 level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
time=2024-02-22T15:43:18.274+08:00 level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [cpu_avx cpu_avx2 cpu cuda_v11.3]"
[GIN] 2024/02/22 - 15:43:18 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/02/22 - 15:43:18 | 200 | 2.1738ms | 127.0.0.1 | POST "/api/show"
[GIN] 2024/02/22 - 15:43:18 | 200 | 1.6371ms | 127.0.0.1 | POST "/api/show"
time=2024-02-22T15:43:18.996+08:00 level=INFO source=gpu.go:94 msg="Detecting GPU type"
time=2024-02-22T15:43:18.996+08:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library nvml.dll"
time=2024-02-22T15:43:19.003+08:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: [c:\\Windows\\System32\\nvml.dll C:\\Windows\\system32\\nvml.dll]"
time=2024-02-22T15:43:19.019+08:00 level=INFO source=gpu.go:99 msg="Nvidia GPU detected"
time=2024-02-22T15:43:19.019+08:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-22T15:43:19.034+08:00 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 6.1"
time=2024-02-22T15:43:19.034+08:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-22T15:43:19.034+08:00 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 6.1"
time=2024-02-22T15:43:19.034+08:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-22T15:43:19.034+08:00 level=INFO source=dyn_ext_server.go:385 msg="Updating PATH to C:\\Users\\葫芦娃\\AppData\\Local\\Temp\\ollama3325263888\\cuda_v11.3;C:\\Windows\\system32;C:\\Windows;C:\\Windows\\System32\\Wbem;C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\;C:\\Windows\\System32\\OpenSSH\\;C:\\Program Files (x86)\\NVIDIA Corporation\\PhysX\\Common;C:\\Program Files\\NVIDIA Corporation\\NVIDIA NvDLISR;C:\\Program Files (x86)\\NetSarang\\Xshell 7\\;C:\\Program Files (x86)\\NetSarang\\Xftp 7\\;C:\\Program Files\\dotnet\\;C:\\Program Files\\Git\\cmd;C:\\Users\\葫芦娃\\AppData\\Local\\Microsoft\\WindowsApps;C:\\Program Files\\Bandizip\\;F:\\Microsoft VS Code\\bin;C:\\ffmpeg-2024-02-15-git-a2cfd6062c-full_build\\bin;;C:\\Users\\葫芦娃\\AppData\\Local\\Programs\\Ollama"
time=2024-02-22T15:43:19.035+08:00 level=WARN source=llm.go:162 msg="Failed to load dynamic library C:\\Users\\葫芦娃\\AppData\\Local\\Temp\\ollama3325263888\\cuda_v11.3\\ext_server.dll Unable to load dynamic library: Unable to load dynamic server library: \xd5Ҳ\xbb\xb5\xbdָ\xb6\xa8\xb5\xc4ģ\xbf顣\r\n"
time=2024-02-22T15:43:19.035+08:00 level=INFO source=dyn_ext_server.go:385 msg="Updating PATH to C:\\Users\\葫芦娃\\AppData\\Local\\Temp\\ollama3325263888\\cpu_avx2;C:\\Windows\\system32;C:\\Windows;C:\\Windows\\System32\\Wbem;C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\;C:\\Windows\\System32\\OpenSSH\\;C:\\Program Files (x86)\\NVIDIA Corporation\\PhysX\\Common;C:\\Program Files\\NVIDIA Corporation\\NVIDIA NvDLISR;C:\\Program Files (x86)\\NetSarang\\Xshell 7\\;C:\\Program Files (x86)\\NetSarang\\Xftp 7\\;C:\\Program Files\\dotnet\\;C:\\Program Files\\Git\\cmd;C:\\Users\\葫芦娃\\AppData\\Local\\Microsoft\\WindowsApps;C:\\Program Files\\Bandizip\\;F:\\Microsoft VS Code\\bin;C:\\ffmpeg-2024-02-15-git-a2cfd6062c-full_build\\bin;;C:\\Users\\葫芦娃\\AppData\\Local\\Programs\\Ollama;C:\\Users\\葫芦娃\\AppData\\Local\\Programs\\Ollama"
time=2024-02-22T15:43:19.035+08:00 level=WARN source=llm.go:162 msg="Failed to load dynamic library C:\\Users\\葫芦娃\\AppData\\Local\\Temp\\ollama3325263888\\cpu_avx2\\ext_server.dll Unable to load dynamic library: Unable to load dynamic server library: \xd5Ҳ\xbb\xb5\xbdָ\xb6\xa8\xb5\xc4ģ\xbf顣\r\n"
[GIN] 2024/02/22 - 15:43:19 | 500 | 671.9922ms | 127.0.0.1 | POST "/api/chat"
Hardware in use when the error occurred: GPU: GTX 1070 Ti, CPU: i7-8700K, RAM: 32 GB.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2668/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2668/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6154
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6154/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6154/comments
|
https://api.github.com/repos/ollama/ollama/issues/6154/events
|
https://github.com/ollama/ollama/pull/6154
| 2,446,518,858
|
PR_kwDOJ0Z1Ps53VuaY
| 6,154
|
Disable paging for journalctl
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2024-08-03T17:33:43
| 2024-08-05T04:10:54
| 2024-08-05T04:10:53
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6154",
"html_url": "https://github.com/ollama/ollama/pull/6154",
"diff_url": "https://github.com/ollama/ollama/pull/6154.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6154.patch",
"merged_at": "2024-08-05T04:10:53"
}
|
Users who run `journalctl` to collect logs for issue reports sometimes don't realize that paging causes information to be missed.
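As a sketch of the behavior this PR addresses: passing `--no-pager` makes `journalctl` write the full log to stdout instead of piping it through `less`, so nothing is truncated when users copy output into an issue. The helper below is illustrative only (the service name `ollama.service` is an assumption), and it degrades gracefully on systems without `journalctl`:

```python
import shutil
import subprocess

# "--no-pager" prevents journalctl from invoking a pager such as `less`,
# which otherwise hides everything past the first screenful.
LOG_CMD = ["journalctl", "-u", "ollama.service", "--no-pager"]

def collect_logs() -> str:
    """Return the complete service log, or "" if journalctl is unavailable."""
    if shutil.which("journalctl") is None:
        return ""
    result = subprocess.run(LOG_CMD, capture_output=True, text=True)
    return result.stdout
```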
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6154/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6154/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4404
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4404/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4404/comments
|
https://api.github.com/repos/ollama/ollama/issues/4404/events
|
https://github.com/ollama/ollama/issues/4404
| 2,292,915,977
|
I_kwDOJ0Z1Ps6Iqx8J
| 4,404
|
error loading model vocabulary: unknown pre-tokenizer type: 'qwen2'
|
{
"login": "HouseYeung",
"id": 70836781,
"node_id": "MDQ6VXNlcjcwODM2Nzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/70836781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HouseYeung",
"html_url": "https://github.com/HouseYeung",
"followers_url": "https://api.github.com/users/HouseYeung/followers",
"following_url": "https://api.github.com/users/HouseYeung/following{/other_user}",
"gists_url": "https://api.github.com/users/HouseYeung/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HouseYeung/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HouseYeung/subscriptions",
"organizations_url": "https://api.github.com/users/HouseYeung/orgs",
"repos_url": "https://api.github.com/users/HouseYeung/repos",
"events_url": "https://api.github.com/users/HouseYeung/events{/privacy}",
"received_events_url": "https://api.github.com/users/HouseYeung/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 13
| 2024-05-13T14:00:24
| 2025-01-29T14:09:09
| 2024-06-04T06:54:11
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
llama runner process has terminated: signal: abort trap error:error loading model vocabulary: unknown pre-tokenizer type: 'qwen2'
I was running qwen1.5-8B-chat.
The old version of Ollama could run this model properly.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.37
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4404/reactions",
"total_count": 4,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 4
}
|
https://api.github.com/repos/ollama/ollama/issues/4404/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1751
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1751/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1751/comments
|
https://api.github.com/repos/ollama/ollama/issues/1751/events
|
https://github.com/ollama/ollama/issues/1751
| 2,061,030,874
|
I_kwDOJ0Z1Ps562NXa
| 1,751
|
[FEATURE] add more options while chatting like `/bye` (e.g `/clear_context` or `/new_chat`)
|
{
"login": "tikendraw",
"id": 68785366,
"node_id": "MDQ6VXNlcjY4Nzg1MzY2",
"avatar_url": "https://avatars.githubusercontent.com/u/68785366?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tikendraw",
"html_url": "https://github.com/tikendraw",
"followers_url": "https://api.github.com/users/tikendraw/followers",
"following_url": "https://api.github.com/users/tikendraw/following{/other_user}",
"gists_url": "https://api.github.com/users/tikendraw/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tikendraw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tikendraw/subscriptions",
"organizations_url": "https://api.github.com/users/tikendraw/orgs",
"repos_url": "https://api.github.com/users/tikendraw/repos",
"events_url": "https://api.github.com/users/tikendraw/events{/privacy}",
"received_events_url": "https://api.github.com/users/tikendraw/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2023-12-31T11:33:59
| 2024-01-26T16:25:56
| 2024-01-25T22:56:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
While chatting with the model, you do not necessarily need the existing context, or you may simply want a new chat. Currently there is no option for this other than cancelling the chat and restarting it.
So, similar to the `/bye` command, there could be other commands for ease of use:
* `/clear_context` or `/no_context`: ignore the context above
* `/new_chat`: start a new chat
or any other option that may be useful to the user.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1751/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1751/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/312
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/312/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/312/comments
|
https://api.github.com/repos/ollama/ollama/issues/312/events
|
https://github.com/ollama/ollama/pull/312
| 1,843,943,830
|
PR_kwDOJ0Z1Ps5Xks3E
| 312
|
add embed docs for modelfile
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-08-09T20:15:56
| 2023-08-17T17:37:43
| 2023-08-17T17:37:43
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/312",
"html_url": "https://github.com/ollama/ollama/pull/312",
"diff_url": "https://github.com/ollama/ollama/pull/312.diff",
"patch_url": "https://github.com/ollama/ollama/pull/312.patch",
"merged_at": "2023-08-17T17:37:43"
}
|
I removed the embed instruction from our model documentation since it's not in a release yet. Staging it here for the release.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/312/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/312/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4520
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4520/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4520/comments
|
https://api.github.com/repos/ollama/ollama/issues/4520/events
|
https://github.com/ollama/ollama/issues/4520
| 2,304,663,711
|
I_kwDOJ0Z1Ps6JXmCf
| 4,520
|
llama3:70B pull error
|
{
"login": "DimIsaev",
"id": 11172642,
"node_id": "MDQ6VXNlcjExMTcyNjQy",
"avatar_url": "https://avatars.githubusercontent.com/u/11172642?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DimIsaev",
"html_url": "https://github.com/DimIsaev",
"followers_url": "https://api.github.com/users/DimIsaev/followers",
"following_url": "https://api.github.com/users/DimIsaev/following{/other_user}",
"gists_url": "https://api.github.com/users/DimIsaev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DimIsaev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DimIsaev/subscriptions",
"organizations_url": "https://api.github.com/users/DimIsaev/orgs",
"repos_url": "https://api.github.com/users/DimIsaev/repos",
"events_url": "https://api.github.com/users/DimIsaev/events{/privacy}",
"received_events_url": "https://api.github.com/users/DimIsaev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677370291,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw",
"url": "https://api.github.com/repos/ollama/ollama/labels/networking",
"name": "networking",
"color": "0B5368",
"default": false,
"description": "Issues relating to ollama pull and push"
}
] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 25
| 2024-05-19T15:31:58
| 2025-01-28T14:01:55
| 2024-05-25T00:21:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

`Error: max retries exceeded: unexpected EOF`
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.33
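Since `ollama pull` resumes partially downloaded layers, simply retrying the pull often gets past transient `unexpected EOF` errors. A generic retry sketch (the `pull` callable stands in for an `ollama pull` invocation; nothing here is Ollama-specific):

```python
import time

def pull_with_retries(pull, attempts=5, delay=1.0):
    """Call `pull()` until it succeeds or `attempts` is exhausted.

    Ollama resumes partially downloaded layers, so each retry makes
    forward progress rather than starting over."""
    last_err = None
    for i in range(attempts):
        try:
            return pull()
        except IOError as err:  # e.g. "unexpected EOF" mid-download
            last_err = err
            time.sleep(delay * (i + 1))  # simple linear backoff
    raise last_err

# Simulated flaky transfer: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("unexpected EOF")
    return "success"

print(pull_with_retries(flaky, delay=0))  # → success
```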
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4520/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4520/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5832
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5832/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5832/comments
|
https://api.github.com/repos/ollama/ollama/issues/5832/events
|
https://github.com/ollama/ollama/issues/5832
| 2,421,490,146
|
I_kwDOJ0Z1Ps6QVQHi
| 5,832
|
RX6800XT is 4x faster than my new RTX4070Ti Super
|
{
"login": "konian71",
"id": 176228734,
"node_id": "U_kgDOCoEJfg",
"avatar_url": "https://avatars.githubusercontent.com/u/176228734?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/konian71",
"html_url": "https://github.com/konian71",
"followers_url": "https://api.github.com/users/konian71/followers",
"following_url": "https://api.github.com/users/konian71/following{/other_user}",
"gists_url": "https://api.github.com/users/konian71/gists{/gist_id}",
"starred_url": "https://api.github.com/users/konian71/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/konian71/subscriptions",
"organizations_url": "https://api.github.com/users/konian71/orgs",
"repos_url": "https://api.github.com/users/konian71/repos",
"events_url": "https://api.github.com/users/konian71/events{/privacy}",
"received_events_url": "https://api.github.com/users/konian71/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-07-21T15:39:51
| 2024-09-05T20:00:23
| 2024-09-05T20:00:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have a setup with the following specifications:
CPU: AMD Ryzen 5700X
RAM: 128GB DDR4-3200, CL16
Old GPU: AMD RX6800XT
New GPU: Nvidia RTX4070Ti Super
I am running large language models, specifically Gemma2:27b-fp16 (gemma-2-27b-it in the log below) and LLaMA3:70b. All drivers are up to date, and the system was cleaned with DDU before installing the new GPU.
I am very confused because the RTX4070Ti Super takes 21 minutes to complete tasks, whereas the RX6800XT only takes 6 minutes for the same task. The VRAM of the RTX4070Ti Super fills up to 16GB, but the GPU load never exceeds 60% and is mostly near zero. The CPU load with the Nvidia GPU never goes above 80%. In contrast, with the AMD GPU, both the CPU and GPU load approach 100%.
Can you help me understand why this is happening? What is going wrong?
```
2024/07/21 17:05:23 routes.go:1096: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS:C:\\Users\\xxx\\.ollama\\models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR:C:\\Users\\xxx\\AppData\\Local\\Programs\\Ollama\\ollama_runners OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-07-21T17:05:23.370+02:00 level=INFO source=images.go:778 msg="total blobs: 10"
time=2024-07-21T17:05:23.380+02:00 level=INFO source=images.go:785 msg="total unused blobs removed: 0"
time=2024-07-21T17:05:23.383+02:00 level=INFO source=routes.go:1143 msg="Listening on [::]:11434 (version 0.2.7)"
time=2024-07-21T17:05:23.389+02:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11.3 rocm_v6.1]"
time=2024-07-21T17:05:23.389+02:00 level=INFO source=gpu.go:205 msg="looking for compatible GPUs"
time=2024-07-21T17:05:23.732+02:00 level=INFO source=gpu.go:287 msg="detected OS VRAM overhead" id=GPU-a40844ea-c474-77a6-f1f5-c10603b67d2e library=cuda compute=8.9 driver=12.6 name="NVIDIA GeForce RTX 4070 Ti SUPER" overhead="599.7 MiB"
time=2024-07-21T17:05:23.734+02:00 level=INFO source=types.go:105 msg="inference compute" id=GPU-a40844ea-c474-77a6-f1f5-c10603b67d2e library=cuda compute=8.9 driver=12.6 name="NVIDIA GeForce RTX 4070 Ti SUPER" total="16.0 GiB" available="14.7 GiB"
[GIN] 2024/07/21 - 17:13:47 | 200 | 27.7292ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/21 - 17:13:47 | 200 | 2.1085ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/21 - 17:13:47 | 200 | 1.0491ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/21 - 17:13:47 | 200 | 1.0606ms | 127.0.0.1 | GET "/api/version"
time=2024-07-21T17:14:05.372+02:00 level=INFO source=memory.go:309 msg="offload to cuda" layers.requested=-1 layers.model=47 layers.offload=10 layers.split="" memory.available="[14.4 GiB]" memory.required.full="54.3 GiB" memory.required.partial="13.6 GiB" memory.required.kv="736.0 MiB" memory.required.allocations="[13.6 GiB]" memory.weights.total="49.2 GiB" memory.weights.repeating="47.0 GiB" memory.weights.nonrepeating="2.2 GiB" memory.graph.full="509.0 MiB" memory.graph.partial="1.4 GiB"
time=2024-07-21T17:14:05.383+02:00 level=INFO source=server.go:383 msg="starting llama server" cmd="C:\\Users\\xxx\\AppData\\Local\\Programs\\Ollama\\ollama_runners\\cuda_v11.3\\ollama_llama_server.exe --model C:\\Users\\xxx\\.ollama\\models\\blobs\\sha256-ed43fde1ebff5c14455b8fd478d7ceed87826031a0f0f2b81361b1030c7560c3 --ctx-size 2048 --batch-size 512 --embedding --log-disable --n-gpu-layers 10 --no-mmap --parallel 1 --port 17987"
time=2024-07-21T17:14:05.418+02:00 level=INFO source=sched.go:437 msg="loaded runners" count=1
time=2024-07-21T17:14:05.420+02:00 level=INFO source=server.go:571 msg="waiting for llama runner to start responding"
time=2024-07-21T17:14:05.467+02:00 level=INFO source=server.go:612 msg="waiting for server to become available" status="llm server error"
INFO [wmain] build info | build=3337 commit="a8db2a9c" tid="16936" timestamp=1721574846
INFO [wmain] system info | n_threads=8 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 0 | " tid="16936" timestamp=1721574846 total_threads=16
INFO [wmain] HTTP server listening | hostname="127.0.0.1" n_threads_http="15" port="17987" tid="16936" timestamp=1721574846
llama_model_loader: loaded meta data with 29 key-value pairs and 508 tensors from C:\Users\xxx\.ollama\models\blobs\sha256-ed43fde1ebff5c14455b8fd478d7ceed87826031a0f0f2b81361b1030c7560c3 (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = gemma2
llama_model_loader: - kv 1: general.name str = gemma-2-27b-it
llama_model_loader: - kv 2: gemma2.context_length u32 = 8192
llama_model_loader: - kv 3: gemma2.embedding_length u32 = 4608
llama_model_loader: - kv 4: gemma2.block_count u32 = 46
llama_model_loader: - kv 5: gemma2.feed_forward_length u32 = 36864
llama_model_loader: - kv 6: gemma2.attention.head_count u32 = 32
llama_model_loader: - kv 7: gemma2.attention.head_count_kv u32 = 16
llama_model_loader: - kv 8: gemma2.attention.layer_norm_rms_epsilon f32 = 0.000001
llama_model_loader: - kv 9: gemma2.attention.key_length u32 = 128
llama_model_loader: - kv 10: gemma2.attention.value_length u32 = 128
llama_model_loader: - kv 11: general.file_type u32 = 1
llama_model_loader: - kv 12: gemma2.attn_logit_softcapping f32 = 50.000000
llama_model_loader: - kv 13: gemma2.final_logit_softcapping f32 = 30.000000
llama_model_loader: - kv 14: gemma2.attention.sliding_window u32 = 4096
llama_model_loader: - kv 15: tokenizer.ggml.model str = llama
llama_model_loader: - kv 16: tokenizer.ggml.pre str = default
llama_model_loader: - kv 17: tokenizer.ggml.tokens arr[str,256000] = ["<pad>", "<eos>", "<bos>", "<unk>", ...
time=2024-07-21T17:14:06.220+02:00 level=INFO source=server.go:612 msg="waiting for server to become available" status="llm server loading model"
llama_model_loader: - kv 18: tokenizer.ggml.scores arr[f32,256000] = [0.000000, 0.000000, 0.000000, 0.0000...
llama_model_loader: - kv 19: tokenizer.ggml.token_type arr[i32,256000] = [3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, ...
llama_model_loader: - kv 20: tokenizer.ggml.bos_token_id u32 = 2
llama_model_loader: - kv 21: tokenizer.ggml.eos_token_id u32 = 1
llama_model_loader: - kv 22: tokenizer.ggml.unknown_token_id u32 = 3
llama_model_loader: - kv 23: tokenizer.ggml.padding_token_id u32 = 0
llama_model_loader: - kv 24: tokenizer.ggml.add_bos_token bool = true
llama_model_loader: - kv 25: tokenizer.ggml.add_eos_token bool = false
llama_model_loader: - kv 26: tokenizer.chat_template str = {{ bos_token }}{% if messages[0]['rol...
llama_model_loader: - kv 27: tokenizer.ggml.add_space_prefix bool = false
llama_model_loader: - kv 28: general.quantization_version u32 = 2
llama_model_loader: - type f32: 185 tensors
llama_model_loader: - type f16: 323 tensors
llm_load_vocab: special tokens cache size = 364
llm_load_vocab: token to piece cache size = 1.6014 MB
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = gemma2
llm_load_print_meta: vocab type = SPM
llm_load_print_meta: n_vocab = 256000
llm_load_print_meta: n_merges = 0
llm_load_print_meta: vocab_only = 0
llm_load_print_meta: n_ctx_train = 8192
llm_load_print_meta: n_embd = 4608
llm_load_print_meta: n_layer = 46
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 16
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_swa = 4096
llm_load_print_meta: n_embd_head_k = 128
llm_load_print_meta: n_embd_head_v = 128
llm_load_print_meta: n_gqa = 2
llm_load_print_meta: n_embd_k_gqa = 2048
llm_load_print_meta: n_embd_v_gqa = 2048
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-06
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 36864
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: causal attn = 1
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 2
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 10000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_ctx_orig_yarn = 8192
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: model type = 27B
llm_load_print_meta: model ftype = F16
llm_load_print_meta: model params = 27.23 B
llm_load_print_meta: model size = 50.72 GiB (16.00 BPW)
llm_load_print_meta: general.name = gemma-2-27b-it
llm_load_print_meta: BOS token = 2 '<bos>'
llm_load_print_meta: EOS token = 1 '<eos>'
llm_load_print_meta: UNK token = 3 '<unk>'
llm_load_print_meta: PAD token = 0 '<pad>'
llm_load_print_meta: LF token = 227 '<0x0A>'
llm_load_print_meta: EOT token = 107 '<end_of_turn>'
llm_load_print_meta: max token length = 93
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 4070 Ti SUPER, compute capability 8.9, VMM: yes
llm_load_tensors: ggml ctx size = 0.45 MiB
llm_load_tensors: offloading 10 repeating layers to GPU
llm_load_tensors: offloaded 10/47 layers to GPU
llm_load_tensors: CUDA_Host buffer size = 43382.55 MiB
llm_load_tensors: CUDA0 buffer size = 10800.70 MiB
llama_new_context_with_model: n_ctx = 2048
llama_new_context_with_model: n_batch = 512
llama_new_context_with_model: n_ubatch = 512
llama_new_context_with_model: flash_attn = 0
llama_new_context_with_model: freq_base = 10000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CUDA_Host KV buffer size = 576.00 MiB
llama_kv_cache_init: CUDA0 KV buffer size = 160.00 MiB
llama_new_context_with_model: KV self size = 736.00 MiB, K (f16): 368.00 MiB, V (f16): 368.00 MiB
llama_new_context_with_model: CUDA_Host output buffer size = 0.99 MiB
llama_new_context_with_model: CUDA0 compute buffer size = 2759.00 MiB
llama_new_context_with_model: CUDA_Host compute buffer size = 17.01 MiB
llama_new_context_with_model: graph nodes = 1850
llama_new_context_with_model: graph splits = 472
INFO [wmain] model loaded | tid="16936" timestamp=1721574907
time=2024-07-21T17:15:07.935+02:00 level=INFO source=server.go:617 msg="llama runner started in 62.52 seconds"
[GIN] 2024/07/21 - 17:16:00 | 200 | 1m55s | 127.0.0.1 | POST "/api/chat"
```
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.2.7
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5832/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5832/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/437
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/437/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/437/comments
|
https://api.github.com/repos/ollama/ollama/issues/437/events
|
https://github.com/ollama/ollama/issues/437
| 1,870,626,242
|
I_kwDOJ0Z1Ps5vf33C
| 437
|
Error downloading manifest with `llama2-uncensored:70b`
|
{
"login": "satvikpendem",
"id": 42670561,
"node_id": "MDQ6VXNlcjQyNjcwNTYx",
"avatar_url": "https://avatars.githubusercontent.com/u/42670561?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/satvikpendem",
"html_url": "https://github.com/satvikpendem",
"followers_url": "https://api.github.com/users/satvikpendem/followers",
"following_url": "https://api.github.com/users/satvikpendem/following{/other_user}",
"gists_url": "https://api.github.com/users/satvikpendem/gists{/gist_id}",
"starred_url": "https://api.github.com/users/satvikpendem/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/satvikpendem/subscriptions",
"organizations_url": "https://api.github.com/users/satvikpendem/orgs",
"repos_url": "https://api.github.com/users/satvikpendem/repos",
"events_url": "https://api.github.com/users/satvikpendem/events{/privacy}",
"received_events_url": "https://api.github.com/users/satvikpendem/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-08-28T23:05:37
| 2023-08-29T15:04:32
| 2023-08-29T15:04:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am on macOS. I run `ollama run llama2-uncensored:70b` and get the following:
```sh
pulling manifest
pulling 47f73cb430c8... 100% |██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| (39/39 GB, 60 MB/s)
pulling 750599e5d655... 100% |██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| (40/40 B, 124 kB/s)
Error: download failed: Get "https://registry.ollama.ai/v2/library/llama2-uncensored/blobs/sha256:c3916a776ed02180603497f012bbdb04375d9597b3e660d7fb4051c4d4011c9c": dial tcp: lookup registry.ollama.ai: no such host
```
I try running `ollama run llama2-uncensored:70b` again and I get:
```sh
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2-uncensored/manifests/70b": dial tcp: lookup registry.ollama.ai: no such host
```
When I go to the [manifests URL](https://registry.ollama.ai/v2/library/llama2-uncensored/manifests/70b), I get the following JSON response:
```json
{
"errors": [
{
"code": "MANIFEST_INVALID",
"message": "manifest invalid",
"detail": {}
}
]
}
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/437/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/437/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/645
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/645/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/645/comments
|
https://api.github.com/repos/ollama/ollama/issues/645/events
|
https://github.com/ollama/ollama/issues/645
| 1,919,383,449
|
I_kwDOJ0Z1Ps5yZ3eZ
| 645
|
Allow global Ollama settings configuration
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 1
| 2023-09-29T14:33:32
| 2024-03-06T01:56:16
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
In some cases (usually hardware-related) it makes sense to have some global Ollama configuration rather than binding the setting to the Modelfile.
For example, if I am running many different servers with different hardware capabilities, I don't want to create and load a Modelfile for each machine just to set `num_thread`; I want to set it once.
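Until a global settings mechanism exists, one workaround is to supply runtime parameters such as `num_thread` per request through the API's `options` field, so each machine's launcher can provide its own value. A minimal sketch (the helper name is mine, not Ollama's):

```python
import json

def build_generate_payload(model: str, prompt: str, num_thread: int) -> dict:
    """Build a /api/generate request body that overrides num_thread
    at request time instead of baking it into a per-machine Modelfile."""
    return {
        "model": model,
        "prompt": prompt,
        # Modelfile PARAMETER values can also be supplied per request
        # through the "options" object.
        "options": {"num_thread": num_thread},
    }

# Each host can read its own thread count from local config or an env var:
payload = build_generate_payload("llama2", "Hello", num_thread=8)
print(json.dumps(payload))
```

This keeps a single shared Modelfile while letting each server tune its own thread count.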
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/645/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/645/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/789
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/789/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/789/comments
|
https://api.github.com/repos/ollama/ollama/issues/789/events
|
https://github.com/ollama/ollama/issues/789
| 1,943,035,684
|
I_kwDOJ0Z1Ps5z0F8k
| 789
|
How to disable streaming output in REST API
|
{
"login": "ajasingh",
"id": 15189049,
"node_id": "MDQ6VXNlcjE1MTg5MDQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/15189049?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ajasingh",
"html_url": "https://github.com/ajasingh",
"followers_url": "https://api.github.com/users/ajasingh/followers",
"following_url": "https://api.github.com/users/ajasingh/following{/other_user}",
"gists_url": "https://api.github.com/users/ajasingh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ajasingh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ajasingh/subscriptions",
"organizations_url": "https://api.github.com/users/ajasingh/orgs",
"repos_url": "https://api.github.com/users/ajasingh/repos",
"events_url": "https://api.github.com/users/ajasingh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ajasingh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-10-14T07:07:37
| 2023-10-16T18:06:40
| 2023-10-16T18:06:39
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am trying to get structured information like JSON back from the model, so I am not looking for streamed output. I have tried setting `content-type: application/json` as mentioned in one of the issues, but I still get back streamed output. Can somebody help me disable streamed output?
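For context, streaming is controlled by the top-level `stream` field in the request body, not by a Content-Type header; with `"stream": false`, the server returns one complete JSON object. A minimal sketch (helper names are mine):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint

def non_streaming_body(model: str, prompt: str) -> bytes:
    """Encode a /api/generate request body with streaming disabled."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON response instead of a chunked stream
    }).encode()

def generate(model: str, prompt: str) -> dict:
    """Send the request; requires a running Ollama server."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=non_streaming_body(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The same `"stream": false` field works for `/api/chat`.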
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/789/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 3,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/789/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6494
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6494/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6494/comments
|
https://api.github.com/repos/ollama/ollama/issues/6494/events
|
https://github.com/ollama/ollama/issues/6494
| 2,484,900,335
|
I_kwDOJ0Z1Ps6UHJHv
| 6,494
|
igpu
|
{
"login": "ayttop",
"id": 178673810,
"node_id": "U_kgDOCqZYkg",
"avatar_url": "https://avatars.githubusercontent.com/u/178673810?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ayttop",
"html_url": "https://github.com/ayttop",
"followers_url": "https://api.github.com/users/ayttop/followers",
"following_url": "https://api.github.com/users/ayttop/following{/other_user}",
"gists_url": "https://api.github.com/users/ayttop/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ayttop/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ayttop/subscriptions",
"organizations_url": "https://api.github.com/users/ayttop/orgs",
"repos_url": "https://api.github.com/users/ayttop/repos",
"events_url": "https://api.github.com/users/ayttop/events{/privacy}",
"received_events_url": "https://api.github.com/users/ayttop/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-08-24T22:25:26
| 2024-08-28T02:53:23
| 2024-08-27T21:21:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ollama with an Intel iGPU:
how do I run Ollama on an Intel integrated GPU?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6494/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6494/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7611
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7611/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7611/comments
|
https://api.github.com/repos/ollama/ollama/issues/7611/events
|
https://github.com/ollama/ollama/issues/7611
| 2,648,163,131
|
I_kwDOJ0Z1Ps6d18M7
| 7,611
|
with_structured_output support for ollama.chat()
|
{
"login": "ChmHsm",
"id": 12183061,
"node_id": "MDQ6VXNlcjEyMTgzMDYx",
"avatar_url": "https://avatars.githubusercontent.com/u/12183061?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ChmHsm",
"html_url": "https://github.com/ChmHsm",
"followers_url": "https://api.github.com/users/ChmHsm/followers",
"following_url": "https://api.github.com/users/ChmHsm/following{/other_user}",
"gists_url": "https://api.github.com/users/ChmHsm/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ChmHsm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ChmHsm/subscriptions",
"organizations_url": "https://api.github.com/users/ChmHsm/orgs",
"repos_url": "https://api.github.com/users/ChmHsm/repos",
"events_url": "https://api.github.com/users/ChmHsm/events{/privacy}",
"received_events_url": "https://api.github.com/users/ChmHsm/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-11-11T04:59:38
| 2025-01-11T02:19:50
| 2024-12-02T15:23:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Thank you for the llama 3.2 vision integration!
I was using llama3.2-3b with ChatOllama(model="llama3.2:latest").with_structured_output() to get a structured response from the model, and I was hoping to do the same with llama3.2-vision.
But it turns out, at least to my knowledge, I can't for now, since I can't use with_structured_output directly with ollama.chat(model='llama3.2-vision'). (When I do that, I get a "llama vision doesn't support tools" error.)
Is this still a pending feature, or am I missing something?
It's very important to be able to get a structured response from llama vision. As a workaround, I'm now having to use a chain containing llama3.2-vision AND llama3.2 to process an image and get a structured response, which is overkill...
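For what it's worth, Ollama later added structured outputs to its REST API via a `format` field that accepts a JSON schema, which sidesteps the tool-calling path entirely. A minimal sketch of how such a `/api/chat` request body could be assembled (no network call is made here; treat the field layout as an illustration based on the documented API, and the model/prompt values as placeholders):

```python
import json

def build_structured_chat_request(model: str, prompt: str, schema: dict) -> dict:
    """Assemble the JSON body for POST /api/chat with a schema constraint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "format": schema,   # JSON schema the reply must conform to
        "stream": False,
    }

# Hypothetical schema: force the model to answer with a single caption string.
schema = {
    "type": "object",
    "properties": {"caption": {"type": "string"}},
    "required": ["caption"],
}

req = build_structured_chat_request("llama3.2-vision", "Describe the image.", schema)
print(json.dumps(req, indent=2))
```

The same `format` value can be passed through the official Python client's `chat()` call, which would remove the need for the two-model chain described above.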
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7611/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7611/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1037
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1037/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1037/comments
|
https://api.github.com/repos/ollama/ollama/issues/1037/events
|
https://github.com/ollama/ollama/issues/1037
| 1,982,465,027
|
I_kwDOJ0Z1Ps52KgQD
| 1,037
|
run a multi-file model
|
{
"login": "eramax",
"id": 542413,
"node_id": "MDQ6VXNlcjU0MjQxMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/542413?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eramax",
"html_url": "https://github.com/eramax",
"followers_url": "https://api.github.com/users/eramax/followers",
"following_url": "https://api.github.com/users/eramax/following{/other_user}",
"gists_url": "https://api.github.com/users/eramax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eramax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eramax/subscriptions",
"organizations_url": "https://api.github.com/users/eramax/orgs",
"repos_url": "https://api.github.com/users/eramax/repos",
"events_url": "https://api.github.com/users/eramax/events{/privacy}",
"received_events_url": "https://api.github.com/users/eramax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-11-08T00:06:10
| 2023-11-13T17:55:15
| 2023-11-13T17:55:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
How can I import a model that is split across multiple files, like in the image below?

I tried, but it gave me this error:
```
parsing modelfile
looking for model
⠋ creating model layer Error: invalid file magic
```
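The "invalid file magic" error indicates that `ollama create` read the first bytes of the file and did not find a GGUF header — it expects a single, self-contained GGUF file, not a shard of a split model. (If the shards really are GGUF splits, llama.cpp ships a `gguf-split` tool whose `--merge` mode can reassemble them into one file first.) A quick way to check whether a given file is a standalone GGUF is to inspect its first four bytes; a small sketch, with throwaway file names standing in for real model files:

```python
import os
import tempfile

GGUF_MAGIC = b"GGUF"  # first four bytes of every valid GGUF file

def has_gguf_magic(path: str) -> bool:
    """Return True if the file starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == GGUF_MAGIC

# Demo with temporary files standing in for real downloads.
with tempfile.TemporaryDirectory() as d:
    good = os.path.join(d, "model.gguf")
    bad = os.path.join(d, "model-00001-of-00005.bin")
    with open(good, "wb") as f:
        f.write(GGUF_MAGIC + b"\x03\x00\x00\x00")  # magic + version stub
    with open(bad, "wb") as f:
        f.write(b"PK\x03\x04")  # a different format's header
    ok, not_ok = has_gguf_magic(good), has_gguf_magic(bad)

print(ok, not_ok)
```

Running the check against each shard shows immediately which files Ollama would reject with this error.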
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1037/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1037/timeline
| null |
completed
| false
|