Schema (one row per GitHub issue/PR record; column type and value statistics):

- url: string (length 51–54)
- repository_url: string (1 distinct value)
- labels_url: string (length 65–68)
- comments_url: string (length 60–63)
- events_url: string (length 58–61)
- html_url: string (length 39–44)
- id: int64 (1.78B–2.82B)
- node_id: string (length 18–19)
- number: int64 (1–8.69k)
- title: string (length 1–382)
- user: dict
- labels: list (length 0–5)
- state: string (2 distinct values)
- locked: bool (1 class)
- assignee: dict
- assignees: list (length 0–2)
- milestone: null
- comments: int64 (0–323)
- created_at: timestamp[s]
- updated_at: timestamp[s]
- closed_at: timestamp[s]
- author_association: string (4 distinct values)
- sub_issues_summary: dict
- active_lock_reason: null
- draft: bool (2 classes)
- pull_request: dict
- body: string (length 2–118k, nullable ⌀)
- closed_by: dict
- reactions: dict
- timeline_url: string (length 60–63)
- performed_via_github_app: null
- state_reason: string (4 distinct values)
- is_pull_request: bool (2 classes)
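As a quick illustration of the schema above, each record can be modeled as a plain dict and split by the `is_pull_request` flag. This is a minimal sketch using only a few fields from the first two records shown below; it does not call any real API.

```python
# Two records from this dataset, reduced to a few schema fields.
records = [
    {
        "number": 255,
        "title": "Update llama cpp",
        "state": "closed",
        "author_association": "CONTRIBUTOR",
        "is_pull_request": True,
    },
    {
        "number": 3360,
        "title": "OpenAI interface compatible with vector interfaces",
        "state": "closed",
        "author_association": "NONE",
        "is_pull_request": False,
    },
]

# Separate pull requests from plain issues.
pull_requests = [r for r in records if r["is_pull_request"]]
issues = [r for r in records if not r["is_pull_request"]]

print([r["number"] for r in pull_requests])  # -> [255]
print([r["number"] for r in issues])         # -> [3360]
```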
url: https://api.github.com/repos/ollama/ollama/issues/255
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/255/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/255/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/255/events
html_url: https://github.com/ollama/ollama/pull/255
id: 1,832,163,734
node_id: PR_kwDOJ0Z1Ps5W9NDo
number: 255
title: Update llama cpp
user:
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 1
created_at: 2023-08-01T23:22:47
updated_at: 2023-08-02T00:18:34
closed_at: 2023-08-02T00:18:33
author_association: CONTRIBUTOR
sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
active_lock_reason: null
draft: false
pull_request:
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/255",
"html_url": "https://github.com/ollama/ollama/pull/255",
"diff_url": "https://github.com/ollama/ollama/pull/255.diff",
"patch_url": "https://github.com/ollama/ollama/pull/255.patch",
"merged_at": "2023-08-02T00:18:33"
}
body: null
closed_by:
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/255/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
timeline_url: https://api.github.com/repos/ollama/ollama/issues/255/timeline
performed_via_github_app: null
state_reason: null
is_pull_request: true

---
url: https://api.github.com/repos/ollama/ollama/issues/3360
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/3360/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/3360/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/3360/events
html_url: https://github.com/ollama/ollama/issues/3360
id: 2,207,914,992
node_id: I_kwDOJ0Z1Ps6Dmhvw
number: 3,360
title: OpenAI interface compatible with vector interfaces
user:
{
"login": "yuanjie-ai",
"id": 20265321,
"node_id": "MDQ6VXNlcjIwMjY1MzIx",
"avatar_url": "https://avatars.githubusercontent.com/u/20265321?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuanjie-ai",
"html_url": "https://github.com/yuanjie-ai",
"followers_url": "https://api.github.com/users/yuanjie-ai/followers",
"following_url": "https://api.github.com/users/yuanjie-ai/following{/other_user}",
"gists_url": "https://api.github.com/users/yuanjie-ai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuanjie-ai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuanjie-ai/subscriptions",
"organizations_url": "https://api.github.com/users/yuanjie-ai/orgs",
"repos_url": "https://api.github.com/users/yuanjie-ai/repos",
"events_url": "https://api.github.com/users/yuanjie-ai/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuanjie-ai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 2
created_at: 2024-03-26T10:56:44
updated_at: 2024-04-29T10:59:14
closed_at: 2024-03-27T02:15:30
author_association: NONE
sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
active_lock_reason: null
draft: null
pull_request: null
body:
### What are you trying to do?
OpenAI interface compatible with vector interfaces
### How should we solve this?
OpenAI interface compatible with vector interfaces
### What is the impact of not solving this?
_No response_
### Anything else?
OpenAI interface compatible with vector interfaces
closed_by:
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3360/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
timeline_url: https://api.github.com/repos/ollama/ollama/issues/3360/timeline
performed_via_github_app: null
state_reason: completed
is_pull_request: false

---
url: https://api.github.com/repos/ollama/ollama/issues/5689
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/5689/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/5689/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/5689/events
html_url: https://github.com/ollama/ollama/issues/5689
id: 2,407,319,063
node_id: I_kwDOJ0Z1Ps6PfMYX
number: 5,689
title: System wide old version of cuda v11 used instead of bundled version - runner fails to start due to missing symbols
user:
{
"login": "hljhyb",
"id": 42955249,
"node_id": "MDQ6VXNlcjQyOTU1MjQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/42955249?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hljhyb",
"html_url": "https://github.com/hljhyb",
"followers_url": "https://api.github.com/users/hljhyb/followers",
"following_url": "https://api.github.com/users/hljhyb/following{/other_user}",
"gists_url": "https://api.github.com/users/hljhyb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hljhyb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hljhyb/subscriptions",
"organizations_url": "https://api.github.com/users/hljhyb/orgs",
"repos_url": "https://api.github.com/users/hljhyb/repos",
"events_url": "https://api.github.com/users/hljhyb/events{/privacy}",
"received_events_url": "https://api.github.com/users/hljhyb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
]
state: closed
locked: false
assignee:
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
assignees:
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
]
milestone: null
comments: 23
created_at: 2024-07-14T08:08:32
updated_at: 2024-08-13T07:14:24
closed_at: 2024-08-13T07:14:24
author_association: NONE
sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
active_lock_reason: null
draft: null
pull_request: null
body:
### What is the issue?
Previously, all models could run very well, but after a recent upgrade, errors are occurring. I have enabled Debug mode. What could be the issue?
[GIN] 2024/07/14 - 16:05:11 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/14 - 16:05:20 | 200 | 69.6µs | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/14 - 16:05:20 | 200 | 23.4842ms | 127.0.0.1 | POST "/api/show"
time=2024-07-14T16:05:21.026+08:00 level=INFO source=sched.go:701 msg="new model will fit in available VRAM in single GPU, loading" model=C:\Users\X170\.ollama\models\blobs\sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa gpu=GPU-06932cd6-8249-ad0c-67b5-f2bcc68be311 parallel=4 available=14287372288 required="6.2 GiB"
time=2024-07-14T16:05:21.026+08:00 level=INFO source=memory.go:309 msg="offload to cuda" layers.requested=-1 layers.model=33 layers.offload=33 layers.split="" memory.available="[13.3 GiB]" memory.required.full="6.2 GiB" memory.required.partial="6.2 GiB" memory.required.kv="1.0 GiB" memory.required.allocations="[6.2 GiB]" memory.weights.total="4.7 GiB" memory.weights.repeating="4.3 GiB" memory.weights.nonrepeating="411.0 MiB" memory.graph.full="560.0 MiB" memory.graph.partial="677.5 MiB"
time=2024-07-14T16:05:21.046+08:00 level=INFO source=server.go:383 msg="starting llama server" cmd="C:\\Users\\X170\\AppData\\Local\\Programs\\Ollama\\ollama_runners\\cuda_v11.3\\ollama_llama_server.exe --model C:\\Users\\X170\\.ollama\\models\\blobs\\sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa --ctx-size 8192 --batch-size 512 --embedding --log-disable --n-gpu-layers 33 --no-mmap --parallel 4 --port 14534"
time=2024-07-14T16:05:21.050+08:00 level=INFO source=sched.go:437 msg="loaded runners" count=1
time=2024-07-14T16:05:21.050+08:00 level=INFO source=server.go:571 msg="waiting for llama runner to start responding"
time=2024-07-14T16:05:21.051+08:00 level=INFO source=server.go:612 msg="waiting for server to become available" status="llm server error"
time=2024-07-14T16:05:21.314+08:00 level=ERROR source=sched.go:443 msg="error loading llama server" error="llama runner process has terminated: exit status 0xc0000139 "
[GIN] 2024/07/14 - 16:05:21 | 500 | 354.9161ms | 127.0.0.1 | POST "/api/chat"
[server.log](https://github.com/user-attachments/files/16216040/server.log)
[app.log](https://github.com/user-attachments/files/16216041/app.log)
### OS
Windows
### GPU
Intel
### CPU
Intel
### Ollama version
0.2.5
closed_by:
{
"login": "hljhyb",
"id": 42955249,
"node_id": "MDQ6VXNlcjQyOTU1MjQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/42955249?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hljhyb",
"html_url": "https://github.com/hljhyb",
"followers_url": "https://api.github.com/users/hljhyb/followers",
"following_url": "https://api.github.com/users/hljhyb/following{/other_user}",
"gists_url": "https://api.github.com/users/hljhyb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hljhyb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hljhyb/subscriptions",
"organizations_url": "https://api.github.com/users/hljhyb/orgs",
"repos_url": "https://api.github.com/users/hljhyb/repos",
"events_url": "https://api.github.com/users/hljhyb/events{/privacy}",
"received_events_url": "https://api.github.com/users/hljhyb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5689/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
timeline_url: https://api.github.com/repos/ollama/ollama/issues/5689/timeline
performed_via_github_app: null
state_reason: completed
is_pull_request: false

---
url: https://api.github.com/repos/ollama/ollama/issues/3444
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/3444/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/3444/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/3444/events
html_url: https://github.com/ollama/ollama/issues/3444
id: 2,219,423,368
node_id: I_kwDOJ0Z1Ps6ESbaI
number: 3,444
title: Can Someone help me with setting up the openai endpoint I cant figure it out!
user:
{
"login": "alfi4000",
"id": 149228038,
"node_id": "U_kgDOCOUKBg",
"avatar_url": "https://avatars.githubusercontent.com/u/149228038?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alfi4000",
"html_url": "https://github.com/alfi4000",
"followers_url": "https://api.github.com/users/alfi4000/followers",
"following_url": "https://api.github.com/users/alfi4000/following{/other_user}",
"gists_url": "https://api.github.com/users/alfi4000/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alfi4000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alfi4000/subscriptions",
"organizations_url": "https://api.github.com/users/alfi4000/orgs",
"repos_url": "https://api.github.com/users/alfi4000/repos",
"events_url": "https://api.github.com/users/alfi4000/events{/privacy}",
"received_events_url": "https://api.github.com/users/alfi4000/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
]
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 3
created_at: 2024-04-02T02:10:33
updated_at: 2024-04-15T19:22:05
closed_at: 2024-04-15T19:22:05
author_association: NONE
sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
active_lock_reason: null
draft: null
pull_request: null
body:
### What is the issue?
Can Someone help me with setting up the openai endpoint I cant figure it out!
### What did you expect to see?
Running openai api endpoint
### Steps to reproduce
there is nothing to reproduce
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
x86
### Platform
_No response_
### Ollama version
latest
### GPU
Nvidia
### GPU info
Tesla P40
### CPU
Intel
### Other software
Ubuntu Server 22.04
How can I setup the endpoint it looks for me unclear in the doc!:
https://ollama.com/blog/openai-compatibility
closed_by:
{
"login": "alfi4000",
"id": 149228038,
"node_id": "U_kgDOCOUKBg",
"avatar_url": "https://avatars.githubusercontent.com/u/149228038?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alfi4000",
"html_url": "https://github.com/alfi4000",
"followers_url": "https://api.github.com/users/alfi4000/followers",
"following_url": "https://api.github.com/users/alfi4000/following{/other_user}",
"gists_url": "https://api.github.com/users/alfi4000/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alfi4000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alfi4000/subscriptions",
"organizations_url": "https://api.github.com/users/alfi4000/orgs",
"repos_url": "https://api.github.com/users/alfi4000/repos",
"events_url": "https://api.github.com/users/alfi4000/events{/privacy}",
"received_events_url": "https://api.github.com/users/alfi4000/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3444/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
timeline_url: https://api.github.com/repos/ollama/ollama/issues/3444/timeline
performed_via_github_app: null
state_reason: completed
is_pull_request: false

---
url: https://api.github.com/repos/ollama/ollama/issues/1091
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/1091/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/1091/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/1091/events
html_url: https://github.com/ollama/ollama/issues/1091
id: 1,989,049,541
node_id: I_kwDOJ0Z1Ps52jnzF
number: 1,091
title: ollama install messed the CUDA setup, ollama unable to use CUDA
user:
{
"login": "ArsBinarii",
"id": 6293391,
"node_id": "MDQ6VXNlcjYyOTMzOTE=",
"avatar_url": "https://avatars.githubusercontent.com/u/6293391?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArsBinarii",
"html_url": "https://github.com/ArsBinarii",
"followers_url": "https://api.github.com/users/ArsBinarii/followers",
"following_url": "https://api.github.com/users/ArsBinarii/following{/other_user}",
"gists_url": "https://api.github.com/users/ArsBinarii/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ArsBinarii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArsBinarii/subscriptions",
"organizations_url": "https://api.github.com/users/ArsBinarii/orgs",
"repos_url": "https://api.github.com/users/ArsBinarii/repos",
"events_url": "https://api.github.com/users/ArsBinarii/events{/privacy}",
"received_events_url": "https://api.github.com/users/ArsBinarii/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
]
state: closed
locked: false
assignee:
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
assignees:
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
]
milestone: null
comments: 13
created_at: 2023-11-11T17:27:24
updated_at: 2024-04-12T21:55:16
closed_at: 2024-04-12T21:55:16
author_association: NONE
sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
active_lock_reason: null
draft: null
pull_request: null
body:
Distributor ID: Ubuntu
Description: Ubuntu 22.04.3 LTS
Release: 22.04
Codename: jammy
downloaded and installed:
https://developer.nvidia.com/cuda-12-0-0-download-archive
GPU: 1080
after cuda install, nvidia-smi reports:
NVIDIA-SMI 525.60.13 Driver Version: 525.60.13 CUDA Version: 12.0
installed ollama: curl https://ollama.ai/install.sh | sh
strange message appears: >>> NVIDIA GPU installed.
run dophin2.2: ollama run dolphin2.2-mistral
very very low performance
check nvidia-smi: No devices were found
reinstall CUDA from: https://developer.nvidia.com/cuda-12-0-0-download-archive
check nvidia-smi: No devices were found
closed_by:
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1091/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
timeline_url: https://api.github.com/repos/ollama/ollama/issues/1091/timeline
performed_via_github_app: null
state_reason: completed
is_pull_request: false

---
url: https://api.github.com/repos/ollama/ollama/issues/8566
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/8566/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/8566/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/8566/events
html_url: https://github.com/ollama/ollama/issues/8566
id: 2,809,271,562
node_id: I_kwDOJ0Z1Ps6nchUK
number: 8,566
title: Model loaded each time
user:
{
"login": "PrinceSajjadHussain",
"id": 66195602,
"node_id": "MDQ6VXNlcjY2MTk1NjAy",
"avatar_url": "https://avatars.githubusercontent.com/u/66195602?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PrinceSajjadHussain",
"html_url": "https://github.com/PrinceSajjadHussain",
"followers_url": "https://api.github.com/users/PrinceSajjadHussain/followers",
"following_url": "https://api.github.com/users/PrinceSajjadHussain/following{/other_user}",
"gists_url": "https://api.github.com/users/PrinceSajjadHussain/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PrinceSajjadHussain/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PrinceSajjadHussain/subscriptions",
"organizations_url": "https://api.github.com/users/PrinceSajjadHussain/orgs",
"repos_url": "https://api.github.com/users/PrinceSajjadHussain/repos",
"events_url": "https://api.github.com/users/PrinceSajjadHussain/events{/privacy}",
"received_events_url": "https://api.github.com/users/PrinceSajjadHussain/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
]
state: open
locked: false
assignee: null
assignees: []
milestone: null
comments: 4
created_at: 2025-01-24T11:58:42
updated_at: 2025-01-24T17:14:40
closed_at: null
author_association: NONE
sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
active_lock_reason: null
draft: null
pull_request: null
body:
### What is the issue?
Model reload each time it's need the internet to load when i terminate the cmd and run again

### OS
_No response_
### GPU
_No response_
### CPU
Intel
### Ollama version
0.5.4
closed_by: null
reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8566/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
timeline_url: https://api.github.com/repos/ollama/ollama/issues/8566/timeline
performed_via_github_app: null
state_reason: null
is_pull_request: false

---
url: https://api.github.com/repos/ollama/ollama/issues/2530
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/2530/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/2530/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/2530/events
html_url: https://github.com/ollama/ollama/pull/2530
id: 2,137,653,760
node_id: PR_kwDOJ0Z1Ps5nCWQq
number: 2,530
title: use http.DefaultClient
user:
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 2
created_at: 2024-02-16T00:18:55
updated_at: 2024-02-20T23:34:48
closed_at: 2024-02-20T23:34:47
author_association: CONTRIBUTOR
sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
active_lock_reason: null
draft: false
pull_request:
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2530",
"html_url": "https://github.com/ollama/ollama/pull/2530",
"diff_url": "https://github.com/ollama/ollama/pull/2530.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2530.patch",
"merged_at": "2024-02-20T23:34:47"
}
body:
default client already handles proxy: https://pkg.go.dev/net/http#RoundTripper
closed_by:
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2530/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
timeline_url: https://api.github.com/repos/ollama/ollama/issues/2530/timeline
performed_via_github_app: null
state_reason: null
is_pull_request: true

---
url: https://api.github.com/repos/ollama/ollama/issues/1790
repository_url: https://api.github.com/repos/ollama/ollama
labels_url: https://api.github.com/repos/ollama/ollama/issues/1790/labels{/name}
comments_url: https://api.github.com/repos/ollama/ollama/issues/1790/comments
events_url: https://api.github.com/repos/ollama/ollama/issues/1790/events
html_url: https://github.com/ollama/ollama/pull/1790
id: 2,066,377,309
node_id: PR_kwDOJ0Z1Ps5jQwI1
number: 1,790
title: Cleaup stale submodule
user:
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
labels: []
state: closed
locked: false
assignee: null
assignees: []
milestone: null
comments: 1
created_at: 2024-01-04T21:44:08
updated_at: 2024-01-04T21:47:28
closed_at: 2024-01-04T21:47:25
author_association: COLLABORATOR
sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1790",
"html_url": "https://github.com/ollama/ollama/pull/1790",
"diff_url": "https://github.com/ollama/ollama/pull/1790.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1790.patch",
"merged_at": "2024-01-04T21:47:25"
}
|
If the tree has a stale submodule, make sure we clean it up first
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1790/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1790/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/547
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/547/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/547/comments
|
https://api.github.com/repos/ollama/ollama/issues/547/events
|
https://github.com/ollama/ollama/issues/547
| 1,899,638,939
|
I_kwDOJ0Z1Ps5xOjCb
| 547
|
ollama is version 0.0.0
|
{
"login": "Dialga",
"id": 5157928,
"node_id": "MDQ6VXNlcjUxNTc5Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5157928?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Dialga",
"html_url": "https://github.com/Dialga",
"followers_url": "https://api.github.com/users/Dialga/followers",
"following_url": "https://api.github.com/users/Dialga/following{/other_user}",
"gists_url": "https://api.github.com/users/Dialga/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Dialga/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Dialga/subscriptions",
"organizations_url": "https://api.github.com/users/Dialga/orgs",
"repos_url": "https://api.github.com/users/Dialga/repos",
"events_url": "https://api.github.com/users/Dialga/events{/privacy}",
"received_events_url": "https://api.github.com/users/Dialga/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2023-09-17T03:13:26
| 2023-10-26T11:12:30
| 2023-10-26T11:12:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I've installed version 0.0.19 but the output of `-v` is 0.0.0.
```
$ ollama -v
ollama version 0.0.0
```
|
{
"login": "Dialga",
"id": 5157928,
"node_id": "MDQ6VXNlcjUxNTc5Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5157928?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Dialga",
"html_url": "https://github.com/Dialga",
"followers_url": "https://api.github.com/users/Dialga/followers",
"following_url": "https://api.github.com/users/Dialga/following{/other_user}",
"gists_url": "https://api.github.com/users/Dialga/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Dialga/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Dialga/subscriptions",
"organizations_url": "https://api.github.com/users/Dialga/orgs",
"repos_url": "https://api.github.com/users/Dialga/repos",
"events_url": "https://api.github.com/users/Dialga/events{/privacy}",
"received_events_url": "https://api.github.com/users/Dialga/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/547/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/547/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6534
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6534/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6534/comments
|
https://api.github.com/repos/ollama/ollama/issues/6534/events
|
https://github.com/ollama/ollama/pull/6534
| 2,490,483,947
|
PR_kwDOJ0Z1Ps55oaDP
| 6,534
|
update templates to use messages
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-27T22:35:15
| 2024-08-30T16:40:01
| 2024-08-30T16:39:59
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6534",
"html_url": "https://github.com/ollama/ollama/pull/6534",
"diff_url": "https://github.com/ollama/ollama/pull/6534.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6534.patch",
"merged_at": "2024-08-30T16:39:59"
}
|
Messages support has been out in the wild for over a month at this point. It's time to update the built-in templates to use the messages format instead of the previous prompt/response format.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6534/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6534/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2108
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2108/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2108/comments
|
https://api.github.com/repos/ollama/ollama/issues/2108/events
|
https://github.com/ollama/ollama/issues/2108
| 2,091,950,649
|
I_kwDOJ0Z1Ps58sKI5
| 2,108
|
:dog2: Please publish `mlabonne/NeuralBeagle14-7B` :pray:
|
{
"login": "adriens",
"id": 5235127,
"node_id": "MDQ6VXNlcjUyMzUxMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5235127?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adriens",
"html_url": "https://github.com/adriens",
"followers_url": "https://api.github.com/users/adriens/followers",
"following_url": "https://api.github.com/users/adriens/following{/other_user}",
"gists_url": "https://api.github.com/users/adriens/gists{/gist_id}",
"starred_url": "https://api.github.com/users/adriens/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adriens/subscriptions",
"organizations_url": "https://api.github.com/users/adriens/orgs",
"repos_url": "https://api.github.com/users/adriens/repos",
"events_url": "https://api.github.com/users/adriens/events{/privacy}",
"received_events_url": "https://api.github.com/users/adriens/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-01-20T07:42:51
| 2024-01-21T04:59:59
| 2024-01-20T07:46:27
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Please publish [`mlabonne/NeuralBeagle14-7B`](https://huggingface.co/mlabonne/NeuralBeagle14-7B) so we can enjoy it from `ollama` :pray:
|
{
"login": "adriens",
"id": 5235127,
"node_id": "MDQ6VXNlcjUyMzUxMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5235127?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adriens",
"html_url": "https://github.com/adriens",
"followers_url": "https://api.github.com/users/adriens/followers",
"following_url": "https://api.github.com/users/adriens/following{/other_user}",
"gists_url": "https://api.github.com/users/adriens/gists{/gist_id}",
"starred_url": "https://api.github.com/users/adriens/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adriens/subscriptions",
"organizations_url": "https://api.github.com/users/adriens/orgs",
"repos_url": "https://api.github.com/users/adriens/repos",
"events_url": "https://api.github.com/users/adriens/events{/privacy}",
"received_events_url": "https://api.github.com/users/adriens/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2108/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2108/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7733
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7733/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7733/comments
|
https://api.github.com/repos/ollama/ollama/issues/7733/events
|
https://github.com/ollama/ollama/issues/7733
| 2,670,648,560
|
I_kwDOJ0Z1Ps6fLtzw
| 7,733
|
Error: POST predict: Post "http://127.0.0.1:35943/completion": EOF
|
{
"login": "DominicTWHV",
"id": 182333671,
"node_id": "U_kgDOCt4w5w",
"avatar_url": "https://avatars.githubusercontent.com/u/182333671?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DominicTWHV",
"html_url": "https://github.com/DominicTWHV",
"followers_url": "https://api.github.com/users/DominicTWHV/followers",
"following_url": "https://api.github.com/users/DominicTWHV/following{/other_user}",
"gists_url": "https://api.github.com/users/DominicTWHV/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DominicTWHV/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DominicTWHV/subscriptions",
"organizations_url": "https://api.github.com/users/DominicTWHV/orgs",
"repos_url": "https://api.github.com/users/DominicTWHV/repos",
"events_url": "https://api.github.com/users/DominicTWHV/events{/privacy}",
"received_events_url": "https://api.github.com/users/DominicTWHV/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6849881759,
"node_id": "LA_kwDOJ0Z1Ps8AAAABmEjmnw",
"url": "https://api.github.com/repos/ollama/ollama/labels/memory",
"name": "memory",
"color": "5017EA",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 11
| 2024-11-19T03:56:41
| 2024-11-23T15:20:51
| 2024-11-22T20:30:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Tried to run a model:
`ollama run aya-expanse:32b --verbose`
`>>> Hello`
gets the error: `Error: POST predict: Post "http://127.0.0.1:35943/completion": EOF`
llama3.1 works; however, bigger models like aya-expanse 32b and command r seem to have issues. Nemotron 70b also works for some reason.
For reference, the models that aren't working have worked before on the exact same system.
Ubuntu Server 22.04 LTS
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.4.2
|
{
"login": "DominicTWHV",
"id": 182333671,
"node_id": "U_kgDOCt4w5w",
"avatar_url": "https://avatars.githubusercontent.com/u/182333671?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DominicTWHV",
"html_url": "https://github.com/DominicTWHV",
"followers_url": "https://api.github.com/users/DominicTWHV/followers",
"following_url": "https://api.github.com/users/DominicTWHV/following{/other_user}",
"gists_url": "https://api.github.com/users/DominicTWHV/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DominicTWHV/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DominicTWHV/subscriptions",
"organizations_url": "https://api.github.com/users/DominicTWHV/orgs",
"repos_url": "https://api.github.com/users/DominicTWHV/repos",
"events_url": "https://api.github.com/users/DominicTWHV/events{/privacy}",
"received_events_url": "https://api.github.com/users/DominicTWHV/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7733/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7733/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1486
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1486/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1486/comments
|
https://api.github.com/repos/ollama/ollama/issues/1486/events
|
https://github.com/ollama/ollama/pull/1486
| 2,038,415,179
|
PR_kwDOJ0Z1Ps5h1BaH
| 1,486
|
Fix issues with `/set template` and `/set system`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-12T19:23:05
| 2023-12-12T19:43:21
| 2023-12-12T19:43:20
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1486",
"html_url": "https://github.com/ollama/ollama/pull/1486",
"diff_url": "https://github.com/ollama/ollama/pull/1486.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1486.patch",
"merged_at": "2023-12-12T19:43:20"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1486/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1486/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4883
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4883/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4883/comments
|
https://api.github.com/repos/ollama/ollama/issues/4883/events
|
https://github.com/ollama/ollama/issues/4883
| 2,339,239,720
|
I_kwDOJ0Z1Ps6Lbfco
| 4,883
|
Add Ollama config file
|
{
"login": "t18n",
"id": 14198542,
"node_id": "MDQ6VXNlcjE0MTk4NTQy",
"avatar_url": "https://avatars.githubusercontent.com/u/14198542?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/t18n",
"html_url": "https://github.com/t18n",
"followers_url": "https://api.github.com/users/t18n/followers",
"following_url": "https://api.github.com/users/t18n/following{/other_user}",
"gists_url": "https://api.github.com/users/t18n/gists{/gist_id}",
"starred_url": "https://api.github.com/users/t18n/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/t18n/subscriptions",
"organizations_url": "https://api.github.com/users/t18n/orgs",
"repos_url": "https://api.github.com/users/t18n/repos",
"events_url": "https://api.github.com/users/t18n/events{/privacy}",
"received_events_url": "https://api.github.com/users/t18n/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2024-06-06T22:21:47
| 2024-06-06T22:21:47
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I want to request a config file that allows users to manage models in text form. This would let us sync models in dotfiles and manage them very easily.
For example:
```yaml
# ~/.config/ollama/config.yaml
models:
- name: llama3
- name: llama3:70b
- name: mistral:7b
config:
- name: mistral
max_input_tokens: 32000
```
Then, in the terminal, you can just run `ollama pull` to automatically pull all listed models, or `ollama purge` to clean up and keep only the models in the config file.
## Inspirations
https://asdf-vm.com/manage/configuration.html#tool-versions
https://mise.jdx.dev/configuration.html#mise-toml
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4883/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4883/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7019
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7019/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7019/comments
|
https://api.github.com/repos/ollama/ollama/issues/7019/events
|
https://github.com/ollama/ollama/issues/7019
| 2,554,197,361
|
I_kwDOJ0Z1Ps6YPfVx
| 7,019
|
Why are all pixtral models removed from the ollama model library?
|
{
"login": "CRCODE22",
"id": 88407346,
"node_id": "MDQ6VXNlcjg4NDA3MzQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/88407346?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/CRCODE22",
"html_url": "https://github.com/CRCODE22",
"followers_url": "https://api.github.com/users/CRCODE22/followers",
"following_url": "https://api.github.com/users/CRCODE22/following{/other_user}",
"gists_url": "https://api.github.com/users/CRCODE22/gists{/gist_id}",
"starred_url": "https://api.github.com/users/CRCODE22/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CRCODE22/subscriptions",
"organizations_url": "https://api.github.com/users/CRCODE22/orgs",
"repos_url": "https://api.github.com/users/CRCODE22/repos",
"events_url": "https://api.github.com/users/CRCODE22/events{/privacy}",
"received_events_url": "https://api.github.com/users/CRCODE22/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-09-28T12:32:59
| 2024-09-29T12:25:23
| 2024-09-29T12:25:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://ollama.com/library?q=pixtral&sort=newest
|
{
"login": "CRCODE22",
"id": 88407346,
"node_id": "MDQ6VXNlcjg4NDA3MzQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/88407346?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/CRCODE22",
"html_url": "https://github.com/CRCODE22",
"followers_url": "https://api.github.com/users/CRCODE22/followers",
"following_url": "https://api.github.com/users/CRCODE22/following{/other_user}",
"gists_url": "https://api.github.com/users/CRCODE22/gists{/gist_id}",
"starred_url": "https://api.github.com/users/CRCODE22/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CRCODE22/subscriptions",
"organizations_url": "https://api.github.com/users/CRCODE22/orgs",
"repos_url": "https://api.github.com/users/CRCODE22/repos",
"events_url": "https://api.github.com/users/CRCODE22/events{/privacy}",
"received_events_url": "https://api.github.com/users/CRCODE22/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7019/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7019/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7950
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7950/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7950/comments
|
https://api.github.com/repos/ollama/ollama/issues/7950/events
|
https://github.com/ollama/ollama/pull/7950
| 2,720,521,621
|
PR_kwDOJ0Z1Ps6EMHT7
| 7,950
|
Add IntelliBar to list of community integrations
|
{
"login": "erusev",
"id": 184170,
"node_id": "MDQ6VXNlcjE4NDE3MA==",
"avatar_url": "https://avatars.githubusercontent.com/u/184170?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/erusev",
"html_url": "https://github.com/erusev",
"followers_url": "https://api.github.com/users/erusev/followers",
"following_url": "https://api.github.com/users/erusev/following{/other_user}",
"gists_url": "https://api.github.com/users/erusev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/erusev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/erusev/subscriptions",
"organizations_url": "https://api.github.com/users/erusev/orgs",
"repos_url": "https://api.github.com/users/erusev/repos",
"events_url": "https://api.github.com/users/erusev/events{/privacy}",
"received_events_url": "https://api.github.com/users/erusev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-12-05T14:09:15
| 2024-12-23T17:04:18
| 2024-12-23T17:04:18
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7950",
"html_url": "https://github.com/ollama/ollama/pull/7950",
"diff_url": "https://github.com/ollama/ollama/pull/7950.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7950.patch",
"merged_at": "2024-12-23T17:04:18"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7950/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7950/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6498
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6498/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6498/comments
|
https://api.github.com/repos/ollama/ollama/issues/6498/events
|
https://github.com/ollama/ollama/issues/6498
| 2,485,116,664
|
I_kwDOJ0Z1Ps6UH974
| 6,498
|
Custom model generating response not based on the prompt given
|
{
"login": "Shehjad-Ishan",
"id": 38762649,
"node_id": "MDQ6VXNlcjM4NzYyNjQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/38762649?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Shehjad-Ishan",
"html_url": "https://github.com/Shehjad-Ishan",
"followers_url": "https://api.github.com/users/Shehjad-Ishan/followers",
"following_url": "https://api.github.com/users/Shehjad-Ishan/following{/other_user}",
"gists_url": "https://api.github.com/users/Shehjad-Ishan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Shehjad-Ishan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Shehjad-Ishan/subscriptions",
"organizations_url": "https://api.github.com/users/Shehjad-Ishan/orgs",
"repos_url": "https://api.github.com/users/Shehjad-Ishan/repos",
"events_url": "https://api.github.com/users/Shehjad-Ishan/events{/privacy}",
"received_events_url": "https://api.github.com/users/Shehjad-Ishan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-08-25T08:09:04
| 2024-08-25T08:09:44
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ollama run mapler/gpt2:latest
>>> Hello
"What do you mean?" I asked. "Well, what about the first two or three hundred people who came in a week after your show and didn't speak up for themselves? Or how did they
react to that announcement of yours as well—that was just an excuse," he said, pointing at my face with his finger on mine.
"What's going through our heads now?" I asked, looking down over the audience in dismay. The only one who seemed visibly shaken up and angry right back then had been me (or
maybe a friend) while we were making this interview—and he hadn't bothered to tell us about it yet at all since… well, that was too much for him, anyway.
He wasn'a good guy," I said flatly, "but so why are you saying something like 'you did nothing wrong?' If they thought the right thing would happen with those numbers and
then told me we were going to get it done by now… well... that was just what happened in Boston for them. They didn't care—the media treated us very badly."
/randy_goddard
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.6
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6498/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6498/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6954
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6954/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6954/comments
|
https://api.github.com/repos/ollama/ollama/issues/6954/events
|
https://github.com/ollama/ollama/pull/6954
| 2,547,883,319
|
PR_kwDOJ0Z1Ps58pt_m
| 6,954
|
example: add example notebook on llm tracing
|
{
"login": "jannikmaierhoefer",
"id": 48529566,
"node_id": "MDQ6VXNlcjQ4NTI5NTY2",
"avatar_url": "https://avatars.githubusercontent.com/u/48529566?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jannikmaierhoefer",
"html_url": "https://github.com/jannikmaierhoefer",
"followers_url": "https://api.github.com/users/jannikmaierhoefer/followers",
"following_url": "https://api.github.com/users/jannikmaierhoefer/following{/other_user}",
"gists_url": "https://api.github.com/users/jannikmaierhoefer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jannikmaierhoefer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jannikmaierhoefer/subscriptions",
"organizations_url": "https://api.github.com/users/jannikmaierhoefer/orgs",
"repos_url": "https://api.github.com/users/jannikmaierhoefer/repos",
"events_url": "https://api.github.com/users/jannikmaierhoefer/events{/privacy}",
"received_events_url": "https://api.github.com/users/jannikmaierhoefer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 2
| 2024-09-25T12:40:01
| 2024-12-05T10:00:51
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6954",
"html_url": "https://github.com/ollama/ollama/pull/6954",
"diff_url": "https://github.com/ollama/ollama/pull/6954.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6954.patch",
"merged_at": null
}
|
- Added example notebook on tracing Ollama models with [Langfuse](https://github.com/langfuse/langfuse)
- Added link to Langfuse in Community Integrations List
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6954/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6954/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/139
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/139/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/139/comments
|
https://api.github.com/repos/ollama/ollama/issues/139/events
|
https://github.com/ollama/ollama/pull/139
| 1,814,276,626
|
PR_kwDOJ0Z1Ps5WBHee
| 139
|
Update icon
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-20T15:31:55
| 2023-12-05T23:52:45
| 2023-07-20T15:55:20
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/139",
"html_url": "https://github.com/ollama/ollama/pull/139",
"diff_url": "https://github.com/ollama/ollama/pull/139.diff",
"patch_url": "https://github.com/ollama/ollama/pull/139.patch",
"merged_at": "2023-07-20T15:55:20"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/139/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/139/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5587
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5587/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5587/comments
|
https://api.github.com/repos/ollama/ollama/issues/5587/events
|
https://github.com/ollama/ollama/pull/5587
| 2,399,645,579
|
PR_kwDOJ0Z1Ps506GEp
| 5,587
|
Update README.md
|
{
"login": "emrgnt-cmplxty",
"id": 68796651,
"node_id": "MDQ6VXNlcjY4Nzk2NjUx",
"avatar_url": "https://avatars.githubusercontent.com/u/68796651?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/emrgnt-cmplxty",
"html_url": "https://github.com/emrgnt-cmplxty",
"followers_url": "https://api.github.com/users/emrgnt-cmplxty/followers",
"following_url": "https://api.github.com/users/emrgnt-cmplxty/following{/other_user}",
"gists_url": "https://api.github.com/users/emrgnt-cmplxty/gists{/gist_id}",
"starred_url": "https://api.github.com/users/emrgnt-cmplxty/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/emrgnt-cmplxty/subscriptions",
"organizations_url": "https://api.github.com/users/emrgnt-cmplxty/orgs",
"repos_url": "https://api.github.com/users/emrgnt-cmplxty/repos",
"events_url": "https://api.github.com/users/emrgnt-cmplxty/events{/privacy}",
"received_events_url": "https://api.github.com/users/emrgnt-cmplxty/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-07-10T03:03:48
| 2024-11-21T10:09:37
| 2024-11-21T10:09:37
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5587",
"html_url": "https://github.com/ollama/ollama/pull/5587",
"diff_url": "https://github.com/ollama/ollama/pull/5587.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5587.patch",
"merged_at": "2024-11-21T10:09:37"
}
| null |
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5587/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5587/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8283
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8283/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8283/comments
|
https://api.github.com/repos/ollama/ollama/issues/8283/events
|
https://github.com/ollama/ollama/issues/8283
| 2,765,281,420
|
I_kwDOJ0Z1Ps6k0tiM
| 8,283
|
Memory Leak! Locking app.
|
{
"login": "raymondbernard",
"id": 31748457,
"node_id": "MDQ6VXNlcjMxNzQ4NDU3",
"avatar_url": "https://avatars.githubusercontent.com/u/31748457?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/raymondbernard",
"html_url": "https://github.com/raymondbernard",
"followers_url": "https://api.github.com/users/raymondbernard/followers",
"following_url": "https://api.github.com/users/raymondbernard/following{/other_user}",
"gists_url": "https://api.github.com/users/raymondbernard/gists{/gist_id}",
"starred_url": "https://api.github.com/users/raymondbernard/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/raymondbernard/subscriptions",
"organizations_url": "https://api.github.com/users/raymondbernard/orgs",
"repos_url": "https://api.github.com/users/raymondbernard/repos",
"events_url": "https://api.github.com/users/raymondbernard/events{/privacy}",
"received_events_url": "https://api.github.com/users/raymondbernard/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2025-01-01T22:08:56
| 2025-01-03T00:57:41
| 2025-01-03T00:57:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

I am running the OpenAI API and have developed an LLM evaluator to assess logic and reasoning. However, the Windows version is locking up and becoming unresponsive despite unloading or restarting all related Ollama processes (`ollama.exe`, `ollama_llama_server.exe`, and the app itself). Initially, I suspected issues with NVIDIA, Windows, or code configurations.
As a workaround, I ran Ollama in Docker without relying on Windows or GPU. Within my scripts, I implemented a Python routine to gracefully unload models based on the `keep_alive` value being set to zero:
```bash
curl http://localhost:11434/api/generate -d "{\"model\": \"cosmic-reasoner:latest\", \"keep_alive\": 0}"
```
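The same unload request can be sketched as a small Python helper. This is only an illustration of the routine described above; the endpoint and model name are taken from the curl example, and the default port is assumed to be Ollama's standard 11434.

```python
import json
import urllib.request

# Assumed default Ollama endpoint, matching the curl example above
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_unload_payload(model: str) -> bytes:
    # keep_alive: 0 asks the server to free the model right after this call
    return json.dumps({"model": model, "keep_alive": 0}).encode("utf-8")

def unload_model(model: str, url: str = OLLAMA_URL) -> int:
    # POST the payload; equivalent to the curl command shown above
    req = urllib.request.Request(
        url,
        data=build_unload_payload(model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Calling `unload_model("cosmic-reasoner:latest")` after each evaluation pass is how I trigger the memory drops visible in the Docker stats.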
Monitoring the Docker memory usage, I noticed distinct drops when unloading the model after each API call. However, after each unload, memory consumption continues to increase progressively. This issue is not specific to LLama or any particular model, as I’ve tested with different setups.
It seems others are encountering similar problems, and the issue remains unresolved. This is a critical severity 1 issue that urgently needs to be addressed.
Please advise.
Best regards,
Ray
### OS
Docker
### GPU
_No response_
### CPU
AMD
### Ollama version
`ollama -v` reports version 0.5.4-0-g2ddc32d-dirty (latest Docker image)
|
{
"login": "raymondbernard",
"id": 31748457,
"node_id": "MDQ6VXNlcjMxNzQ4NDU3",
"avatar_url": "https://avatars.githubusercontent.com/u/31748457?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/raymondbernard",
"html_url": "https://github.com/raymondbernard",
"followers_url": "https://api.github.com/users/raymondbernard/followers",
"following_url": "https://api.github.com/users/raymondbernard/following{/other_user}",
"gists_url": "https://api.github.com/users/raymondbernard/gists{/gist_id}",
"starred_url": "https://api.github.com/users/raymondbernard/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/raymondbernard/subscriptions",
"organizations_url": "https://api.github.com/users/raymondbernard/orgs",
"repos_url": "https://api.github.com/users/raymondbernard/repos",
"events_url": "https://api.github.com/users/raymondbernard/events{/privacy}",
"received_events_url": "https://api.github.com/users/raymondbernard/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8283/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8283/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3985
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3985/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3985/comments
|
https://api.github.com/repos/ollama/ollama/issues/3985/events
|
https://github.com/ollama/ollama/pull/3985
| 2,267,307,622
|
PR_kwDOJ0Z1Ps5t7B0f
| 3,985
|
Restart server on failure when running Windows app
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-28T02:49:29
| 2024-04-29T14:07:52
| 2024-04-29T14:07:52
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3985",
"html_url": "https://github.com/ollama/ollama/pull/3985",
"diff_url": "https://github.com/ollama/ollama/pull/3985.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3985.patch",
"merged_at": "2024-04-29T14:07:52"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3985/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3985/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2531
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2531/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2531/comments
|
https://api.github.com/repos/ollama/ollama/issues/2531/events
|
https://github.com/ollama/ollama/pull/2531
| 2,137,712,676
|
PR_kwDOJ0Z1Ps5nCh4G
| 2,531
|
Move LLM library extraction to stable location
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-02-16T01:20:47
| 2024-02-20T01:24:06
| 2024-02-20T01:24:05
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2531",
"html_url": "https://github.com/ollama/ollama/pull/2531",
"diff_url": "https://github.com/ollama/ollama/pull/2531.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2531.patch",
"merged_at": null
}
|
This refines where we extract the LLM libraries by adding a new OLLAMA_HOME env var, which defaults to `~/.ollama`.
The logic was already idempotent, so this should speed up startups after the first time a new release is deployed. It also cleans up after itself.
I thought there was an issue tracking this, but maybe it was only discussed on Discord (users seeing lots of orphaned ollamaXXX temp dirs).
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2531/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2531/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/384
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/384/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/384/comments
|
https://api.github.com/repos/ollama/ollama/issues/384/events
|
https://github.com/ollama/ollama/issues/384
| 1,857,316,655
|
I_kwDOJ0Z1Ps5utGcv
| 384
|
Can we stop the model response?
|
{
"login": "technoplato",
"id": 6922904,
"node_id": "MDQ6VXNlcjY5MjI5MDQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/6922904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technoplato",
"html_url": "https://github.com/technoplato",
"followers_url": "https://api.github.com/users/technoplato/followers",
"following_url": "https://api.github.com/users/technoplato/following{/other_user}",
"gists_url": "https://api.github.com/users/technoplato/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technoplato/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technoplato/subscriptions",
"organizations_url": "https://api.github.com/users/technoplato/orgs",
"repos_url": "https://api.github.com/users/technoplato/repos",
"events_url": "https://api.github.com/users/technoplato/events{/privacy}",
"received_events_url": "https://api.github.com/users/technoplato/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 5
| 2023-08-18T21:35:42
| 2023-09-02T06:12:54
| 2023-08-22T01:29:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I must be missing this in the docs.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/384/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/384/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5118
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5118/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5118/comments
|
https://api.github.com/repos/ollama/ollama/issues/5118/events
|
https://github.com/ollama/ollama/pull/5118
| 2,360,447,202
|
PR_kwDOJ0Z1Ps5y3A1S
| 5,118
|
Update README.md
|
{
"login": "perpendicularai",
"id": 146530480,
"node_id": "U_kgDOCLvgsA",
"avatar_url": "https://avatars.githubusercontent.com/u/146530480?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/perpendicularai",
"html_url": "https://github.com/perpendicularai",
"followers_url": "https://api.github.com/users/perpendicularai/followers",
"following_url": "https://api.github.com/users/perpendicularai/following{/other_user}",
"gists_url": "https://api.github.com/users/perpendicularai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/perpendicularai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/perpendicularai/subscriptions",
"organizations_url": "https://api.github.com/users/perpendicularai/orgs",
"repos_url": "https://api.github.com/users/perpendicularai/repos",
"events_url": "https://api.github.com/users/perpendicularai/events{/privacy}",
"received_events_url": "https://api.github.com/users/perpendicularai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-18T18:23:55
| 2024-07-02T01:20:31
| 2024-07-02T01:20:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5118",
"html_url": "https://github.com/ollama/ollama/pull/5118",
"diff_url": "https://github.com/ollama/ollama/pull/5118.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5118.patch",
"merged_at": null
}
|
Hi @jmorganca,
As suggested, I have updated the forked repo and have added OllaMail to the Community Integrations for Web and Desktop.
--
Best regards,
Perpendicular AI
|
{
"login": "perpendicularai",
"id": 146530480,
"node_id": "U_kgDOCLvgsA",
"avatar_url": "https://avatars.githubusercontent.com/u/146530480?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/perpendicularai",
"html_url": "https://github.com/perpendicularai",
"followers_url": "https://api.github.com/users/perpendicularai/followers",
"following_url": "https://api.github.com/users/perpendicularai/following{/other_user}",
"gists_url": "https://api.github.com/users/perpendicularai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/perpendicularai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/perpendicularai/subscriptions",
"organizations_url": "https://api.github.com/users/perpendicularai/orgs",
"repos_url": "https://api.github.com/users/perpendicularai/repos",
"events_url": "https://api.github.com/users/perpendicularai/events{/privacy}",
"received_events_url": "https://api.github.com/users/perpendicularai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5118/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5118/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6486
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6486/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6486/comments
|
https://api.github.com/repos/ollama/ollama/issues/6486/events
|
https://github.com/ollama/ollama/issues/6486
| 2,484,205,255
|
I_kwDOJ0Z1Ps6UEfbH
| 6,486
|
add LongWriter Llama3.1 8b and LongWriter GLM4 9b
|
{
"login": "Willian7004",
"id": 128359604,
"node_id": "U_kgDOB6actA",
"avatar_url": "https://avatars.githubusercontent.com/u/128359604?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Willian7004",
"html_url": "https://github.com/Willian7004",
"followers_url": "https://api.github.com/users/Willian7004/followers",
"following_url": "https://api.github.com/users/Willian7004/following{/other_user}",
"gists_url": "https://api.github.com/users/Willian7004/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Willian7004/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Willian7004/subscriptions",
"organizations_url": "https://api.github.com/users/Willian7004/orgs",
"repos_url": "https://api.github.com/users/Willian7004/repos",
"events_url": "https://api.github.com/users/Willian7004/events{/privacy}",
"received_events_url": "https://api.github.com/users/Willian7004/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 5
| 2024-08-24T04:42:03
| 2024-08-31T09:07:35
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The LongWriter models are good at writing long content in a single reply. I have successfully imported QuantFactory/LongWriter-llama3.1-8b-GGUF, so it can be uploaded directly. I tried to quantize the F32 version in QuantPanda/LongWriter-glm4-9B-GGUF to a Q4_0 version so that I can load all layers on my GPU, but the quantization failed with "Error: quantization is only supported for F16 and F32 models", so please create and upload a Q4_0 version.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6486/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6486/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1131
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1131/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1131/comments
|
https://api.github.com/repos/ollama/ollama/issues/1131/events
|
https://github.com/ollama/ollama/pull/1131
| 1,993,516,246
|
PR_kwDOJ0Z1Ps5fc7of
| 1,131
|
return failure details when unauthorized to push
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-14T20:28:49
| 2023-11-16T21:44:19
| 2023-11-16T21:44:18
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1131",
"html_url": "https://github.com/ollama/ollama/pull/1131",
"diff_url": "https://github.com/ollama/ollama/pull/1131.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1131.patch",
"merged_at": "2023-11-16T21:44:18"
}
|
Previous behavior:
Pushing to the default namespace or a namespace you don't have access to results in a vague error.
```
$ ollama push mario
retrieving manifest
Error: max retries exceeded
```
New behavior:
Pushing to the default namespace or a namespace you don't have access to reports the reason for the error.
```
$ ollama push mario
retrieving manifest
Error: unable to push library/mario, make sure this namespace exists and you are authorized to push to it
$ ollama push bruxe/mario
retrieving manifest
Error: unable to push bruxe/mario, make sure this namespace exists and you are authorized to push to it
```
Resolves #1140
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1131/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1131/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5823
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5823/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5823/comments
|
https://api.github.com/repos/ollama/ollama/issues/5823/events
|
https://github.com/ollama/ollama/issues/5823
| 2,421,180,309
|
I_kwDOJ0Z1Ps6QUEeV
| 5,823
|
Is there any plan to release an IOS version
|
{
"login": "aibangjuxin",
"id": 8045208,
"node_id": "MDQ6VXNlcjgwNDUyMDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8045208?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aibangjuxin",
"html_url": "https://github.com/aibangjuxin",
"followers_url": "https://api.github.com/users/aibangjuxin/followers",
"following_url": "https://api.github.com/users/aibangjuxin/following{/other_user}",
"gists_url": "https://api.github.com/users/aibangjuxin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aibangjuxin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aibangjuxin/subscriptions",
"organizations_url": "https://api.github.com/users/aibangjuxin/orgs",
"repos_url": "https://api.github.com/users/aibangjuxin/repos",
"events_url": "https://api.github.com/users/aibangjuxin/events{/privacy}",
"received_events_url": "https://api.github.com/users/aibangjuxin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-21T02:33:14
| 2024-07-26T21:12:26
| 2024-07-26T21:12:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is there any plan to release an iOS version? The M4 iPad with 16 GB of memory should have a certain amount of local computing power.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5823/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5823/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8536
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8536/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8536/comments
|
https://api.github.com/repos/ollama/ollama/issues/8536/events
|
https://github.com/ollama/ollama/issues/8536
| 2,804,469,384
|
I_kwDOJ0Z1Ps6nKM6I
| 8,536
|
Support for API_KEY based authentication
|
{
"login": "matthiasgeihs",
"id": 62935430,
"node_id": "MDQ6VXNlcjYyOTM1NDMw",
"avatar_url": "https://avatars.githubusercontent.com/u/62935430?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/matthiasgeihs",
"html_url": "https://github.com/matthiasgeihs",
"followers_url": "https://api.github.com/users/matthiasgeihs/followers",
"following_url": "https://api.github.com/users/matthiasgeihs/following{/other_user}",
"gists_url": "https://api.github.com/users/matthiasgeihs/gists{/gist_id}",
"starred_url": "https://api.github.com/users/matthiasgeihs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matthiasgeihs/subscriptions",
"organizations_url": "https://api.github.com/users/matthiasgeihs/orgs",
"repos_url": "https://api.github.com/users/matthiasgeihs/repos",
"events_url": "https://api.github.com/users/matthiasgeihs/events{/privacy}",
"received_events_url": "https://api.github.com/users/matthiasgeihs/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 6
| 2025-01-22T13:58:27
| 2025-01-22T18:49:06
| 2025-01-22T17:23:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It would be great if the Ollama server supported some basic API_KEY-based authentication.
Use case: Chrome browser extensions cannot use Ollama out of the box because of CORS restrictions. Ollama will reject requests from these origins (see also https://github.com/ollama/ollama/issues/3571). It would be great if Ollama had API_KEY-based authentication to solve this issue without requiring the user to manually start Ollama with `OLLAMA_ORIGINS`.
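For reference, the current workaround this request is trying to avoid looks roughly like the following sketch. It assumes the documented `OLLAMA_ORIGINS` environment variable; the exact origin pattern your extension needs depends on its extension ID.

```shell
# Workaround today: allow extension origins explicitly before starting the server.
# "chrome-extension://*" is an illustrative pattern, not a recommendation.
OLLAMA_ORIGINS="chrome-extension://*" ollama serve
```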
|
{
"login": "matthiasgeihs",
"id": 62935430,
"node_id": "MDQ6VXNlcjYyOTM1NDMw",
"avatar_url": "https://avatars.githubusercontent.com/u/62935430?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/matthiasgeihs",
"html_url": "https://github.com/matthiasgeihs",
"followers_url": "https://api.github.com/users/matthiasgeihs/followers",
"following_url": "https://api.github.com/users/matthiasgeihs/following{/other_user}",
"gists_url": "https://api.github.com/users/matthiasgeihs/gists{/gist_id}",
"starred_url": "https://api.github.com/users/matthiasgeihs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matthiasgeihs/subscriptions",
"organizations_url": "https://api.github.com/users/matthiasgeihs/orgs",
"repos_url": "https://api.github.com/users/matthiasgeihs/repos",
"events_url": "https://api.github.com/users/matthiasgeihs/events{/privacy}",
"received_events_url": "https://api.github.com/users/matthiasgeihs/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8536/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8536/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8210
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8210/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8210/comments
|
https://api.github.com/repos/ollama/ollama/issues/8210/events
|
https://github.com/ollama/ollama/pull/8210
| 2,754,540,520
|
PR_kwDOJ0Z1Ps6GAh-3
| 8,210
|
feature: add logger json enabling by env var
|
{
"login": "didlawowo",
"id": 12622760,
"node_id": "MDQ6VXNlcjEyNjIyNzYw",
"avatar_url": "https://avatars.githubusercontent.com/u/12622760?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/didlawowo",
"html_url": "https://github.com/didlawowo",
"followers_url": "https://api.github.com/users/didlawowo/followers",
"following_url": "https://api.github.com/users/didlawowo/following{/other_user}",
"gists_url": "https://api.github.com/users/didlawowo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/didlawowo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/didlawowo/subscriptions",
"organizations_url": "https://api.github.com/users/didlawowo/orgs",
"repos_url": "https://api.github.com/users/didlawowo/repos",
"events_url": "https://api.github.com/users/didlawowo/events{/privacy}",
"received_events_url": "https://api.github.com/users/didlawowo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-12-22T08:03:47
| 2024-12-26T17:49:23
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8210",
"html_url": "https://github.com/ollama/ollama/pull/8210",
"diff_url": "https://github.com/ollama/ollama/pull/8210.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8210.patch",
"merged_at": null
}
|
By adding an environment variable to the server, we can enable JSON log formatting.
This is better for observability tools.
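A minimal sketch of the intended usage, assuming a hypothetical variable name (the PR diff defines the actual one):

```shell
# Hypothetical env var name for illustration only -- see the PR for the real flag.
# With it set, server logs would be emitted as JSON lines instead of plain text.
OLLAMA_LOG_FORMAT=json ollama serve
```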
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8210/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8210/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8618
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8618/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8618/comments
|
https://api.github.com/repos/ollama/ollama/issues/8618/events
|
https://github.com/ollama/ollama/issues/8618
| 2,814,039,333
|
I_kwDOJ0Z1Ps6nutUl
| 8,618
|
Support Janus-Pro-7b for vision models
|
{
"login": "franz101",
"id": 18228395,
"node_id": "MDQ6VXNlcjE4MjI4Mzk1",
"avatar_url": "https://avatars.githubusercontent.com/u/18228395?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/franz101",
"html_url": "https://github.com/franz101",
"followers_url": "https://api.github.com/users/franz101/followers",
"following_url": "https://api.github.com/users/franz101/following{/other_user}",
"gists_url": "https://api.github.com/users/franz101/gists{/gist_id}",
"starred_url": "https://api.github.com/users/franz101/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/franz101/subscriptions",
"organizations_url": "https://api.github.com/users/franz101/orgs",
"repos_url": "https://api.github.com/users/franz101/repos",
"events_url": "https://api.github.com/users/franz101/events{/privacy}",
"received_events_url": "https://api.github.com/users/franz101/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 26
| 2025-01-27T20:54:56
| 2025-01-30T07:44:12
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Just announced and performing great with OCR
https://huggingface.co/deepseek-ai/Janus-Pro-7B
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8618/reactions",
"total_count": 147,
"+1": 147,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8618/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5925
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5925/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5925/comments
|
https://api.github.com/repos/ollama/ollama/issues/5925/events
|
https://github.com/ollama/ollama/issues/5925
| 2,428,375,566
|
I_kwDOJ0Z1Ps6QvhIO
| 5,925
|
All models disappeared - Error with logs
|
{
"login": "nicholhai",
"id": 96297412,
"node_id": "U_kgDOBb1hxA",
"avatar_url": "https://avatars.githubusercontent.com/u/96297412?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nicholhai",
"html_url": "https://github.com/nicholhai",
"followers_url": "https://api.github.com/users/nicholhai/followers",
"following_url": "https://api.github.com/users/nicholhai/following{/other_user}",
"gists_url": "https://api.github.com/users/nicholhai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nicholhai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nicholhai/subscriptions",
"organizations_url": "https://api.github.com/users/nicholhai/orgs",
"repos_url": "https://api.github.com/users/nicholhai/repos",
"events_url": "https://api.github.com/users/nicholhai/events{/privacy}",
"received_events_url": "https://api.github.com/users/nicholhai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 13
| 2024-07-24T20:02:56
| 2024-07-29T22:24:31
| 2024-07-29T22:24:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I just rebooted my server and was able to login on the web portal but all my models disappeared. Cannot download new ones either. Tried to view logs (I am learning) and got the following. Any ideas?
user@zephyr:~$ sudo docker logs -f 8c941502f633
Loading WEBUI_SECRET_KEY from file, not provided as an environment variable.
Generating WEBUI_SECRET_KEY
Loading WEBUI_SECRET_KEY from .webui_secret_key
USER_AGENT environment variable not set, consider setting it to identify your requests.
INFO: Started server process [1]
INFO: Waiting for application startup.
/app
___ __ __ _ _ _ ___
/ _ \ _ __ ___ _ __ \ \ / /__| |__ | | | |_ _|
| | | | '_ \ / _ \ '_ \ \ \ /\ / / _ \ '_ \| | | || |
| |_| | |_) | __/ | | | \ V V / __/ |_) | |_| || |
\___/| .__/ \___|_| |_| \_/\_/ \___|_.__/ \___/|___|
|_|
v0.3.10 - building the best open-source AI user interface.
https://github.com/open-webui/open-webui
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
INFO [alembic.runtime.migration] Running upgrade -> 7e5b5dc7342b, init
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO [apps.openai.main] get_all_models()
INFO [apps.ollama.main] get_all_models()
INFO: 192.168.3.17:54463 - "GET /admin/settings/ HTTP/1.1" 304 Not Modified
INFO: 192.168.3.17:54463 - "GET /static/splash.png HTTP/1.1" 200 OK
INFO: 192.168.3.17:54463 - "GET /api/config HTTP/1.1" 200 OK
INFO: 192.168.3.17:54464 - "GET /static/favicon.png HTTP/1.1" 200 OK
INFO: 192.168.3.17:54463 - "GET /ws/socket.io/?EIO=4&transport=polling&t=P3an7c7 HTTP/1.1" 200 OK
INFO: 192.168.3.17:54465 - "GET /api/v1/auths/ HTTP/1.1" 401 Unauthorized
INFO: 192.168.3.17:54464 - "POST /ws/socket.io/?EIO=4&transport=polling&t=P3an7cN&sid=-Vh24Ep6pUJoV1MdAAAA HTTP/1.1" 200 OK
INFO: ('192.168.3.17', 54466) - "WebSocket /ws/socket.io/?EIO=4&transport=websocket&sid=-Vh24Ep6pUJoV1MdAAAA" [accepted]
INFO: 192.168.3.17:54465 - "GET /ws/socket.io/?EIO=4&transport=polling&t=P3an7cN.0&sid=-Vh24Ep6pUJoV1MdAAAA HTTP/1.1" 200 OK
INFO: connection open
INFO: 192.168.3.17:54465 - "GET /ws/socket.io/?EIO=4&transport=polling&t=P3an7ck&sid=-Vh24Ep6pUJoV1MdAAAA HTTP/1.1" 200 OK
INFO: 192.168.3.17:54465 - "GET /static/favicon.png HTTP/1.1" 304 Not Modified
INFO [apps.webui.models.auths] authenticate_user: vbagwalla@gmail.com
INFO: 192.168.3.17:54465 - "POST /api/v1/auths/signin HTTP/1.1" 400 Bad Request
INFO [apps.webui.models.auths] insert_new_auth
INFO: 192.168.3.17:54468 - "POST /api/v1/auths/signup HTTP/1.1" 200 OK
user-join tRHiRqANjHsVDMLdAAAB {'auth': {'token': 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpZCI6IjI4NDY1MmU1LTBkN2ItNGIyMC05ZGM3LTAxMTA4ODNhNDRlNCJ9.vSxWpMlFW8jJ_fBmQ3gqwO6vmqij2mVnkMXQQVo4Ryc'}}
user Vicky Bagwalla(284652e5-0d7b-4b20-9dc7-0110883a44e4) connected with session ID tRHiRqANjHsVDMLdAAAB
INFO: 192.168.3.17:54468 - "GET /api/changelog HTTP/1.1" 200 OK
INFO: 192.168.3.17:54468 - "GET /api/v1/users/user/settings HTTP/1.1" 200 OK
INFO [apps.openai.main] get_all_models()
INFO [apps.ollama.main] get_all_models()
INFO: 192.168.3.17:54469 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
INFO: 192.168.3.17:54470 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
INFO: 192.168.3.17:54468 - "GET /api/models HTTP/1.1" 200 OK
INFO: 192.168.3.17:54471 - "GET /api/v1/tools/ HTTP/1.1" 200 OK
INFO: 192.168.3.17:54473 - "GET /api/v1/configs/banners HTTP/1.1" 200 OK
INFO: 192.168.3.17:54472 - "GET /api/v1/functions/ HTTP/1.1" 200 OK
INFO: 192.168.3.17:54469 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
INFO: 192.168.3.17:54472 - "GET /ollama/api/version HTTP/1.1" 200 OK
INFO: 192.168.3.17:54469 - "POST /api/v1/chats/tags HTTP/1.1" 200 OK
INFO: 192.168.3.17:54472 - "GET /static/favicon.png HTTP/1.1" 304 Not Modified
INFO: 192.168.3.17:54469 - "GET /api/v1/users/user/settings HTTP/1.1" 200 OK
INFO: 192.168.3.17:54473 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO: 192.168.3.17:54480 - "GET /api/v1/users/ HTTP/1.1" 200 OK
INFO: 192.168.3.17:54480 - "GET /api/v1/auths/admin/config HTTP/1.1" 200 OK
INFO: 192.168.3.17:54482 - "GET /api/webhook HTTP/1.1" 200 OK
INFO: 127.0.0.1:54270 - "GET /health HTTP/1.1" 200 OK
INFO: 192.168.3.17:54482 - "GET /ollama/config HTTP/1.1" 200 OK
INFO: 192.168.3.17:54480 - "GET /ollama/api/version HTTP/1.1" 200 OK
INFO: 192.168.3.17:54482 - "GET /ollama/urls HTTP/1.1" 200 OK
INFO [apps.ollama.main] url: http://host.docker.internal:11434
INFO: 192.168.3.17:54487 - "POST /ollama/api/pull/0 HTTP/1.1" 200 OK
INFO [apps.openai.main] get_all_models()
INFO [apps.ollama.main] get_all_models()
INFO: 192.168.3.17:54487 - "GET /api/models HTTP/1.1" 200 OK
INFO: 192.168.3.17:54488 - "GET /ollama/api/version HTTP/1.1" 200 OK
INFO: 192.168.3.17:54488 - "GET /api/v1/users/user/settings HTTP/1.1" 200 OK
INFO: 192.168.3.17:54488 - "GET /api/v1/users/user/settings HTTP/1.1" 200 OK
INFO: 192.168.3.17:54492 - "POST /api/v1/chats/new HTTP/1.1" 200 OK
INFO: 192.168.3.17:54492 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO [apps.ollama.main] url: http://host.docker.internal:11434
INFO: 127.0.0.1:52894 - "GET /health HTTP/1.1" 200 OK
INFO: 192.168.3.17:54492 - "POST /ollama/api/chat HTTP/1.1" 200 OK
INFO: 192.168.3.17:54492 - "POST /api/chat/completed HTTP/1.1" 200 OK
INFO: 192.168.3.17:54492 - "POST /api/v1/chats/441f42b9-18b5-4239-8fe3-b0cd46513858 HTTP/1.1" 200 OK
INFO: 192.168.3.17:54492 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO: 192.168.3.17:54492 - "POST /api/v1/chats/441f42b9-18b5-4239-8fe3-b0cd46513858 HTTP/1.1" 200 OK
error from daemon in stream: Error grabbing logs: invalid character 'l' after object key
### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5925/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5925/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/5518
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5518/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5518/comments
|
https://api.github.com/repos/ollama/ollama/issues/5518/events
|
https://github.com/ollama/ollama/issues/5518
| 2,393,732,259
|
I_kwDOJ0Z1Ps6OrXSj
| 5,518
|
Ollama-kis new model
|
{
"login": "elearningshow",
"id": 766298,
"node_id": "MDQ6VXNlcjc2NjI5OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/766298?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/elearningshow",
"html_url": "https://github.com/elearningshow",
"followers_url": "https://api.github.com/users/elearningshow/followers",
"following_url": "https://api.github.com/users/elearningshow/following{/other_user}",
"gists_url": "https://api.github.com/users/elearningshow/gists{/gist_id}",
"starred_url": "https://api.github.com/users/elearningshow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/elearningshow/subscriptions",
"organizations_url": "https://api.github.com/users/elearningshow/orgs",
"repos_url": "https://api.github.com/users/elearningshow/repos",
"events_url": "https://api.github.com/users/elearningshow/events{/privacy}",
"received_events_url": "https://api.github.com/users/elearningshow/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-07-06T21:08:14
| 2024-07-09T15:14:28
| 2024-07-09T04:41:53
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
How do you get your GUI listed in Community Integrations, Web & Desktop? https://github.com/elearningshow/ollama-kis
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5518/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5518/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2383
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2383/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2383/comments
|
https://api.github.com/repos/ollama/ollama/issues/2383/events
|
https://github.com/ollama/ollama/issues/2383
| 2,122,403,956
|
I_kwDOJ0Z1Ps5-gVB0
| 2,383
|
Add support to MiniCPM-2B model
|
{
"login": "ShengdingHu",
"id": 32740627,
"node_id": "MDQ6VXNlcjMyNzQwNjI3",
"avatar_url": "https://avatars.githubusercontent.com/u/32740627?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ShengdingHu",
"html_url": "https://github.com/ShengdingHu",
"followers_url": "https://api.github.com/users/ShengdingHu/followers",
"following_url": "https://api.github.com/users/ShengdingHu/following{/other_user}",
"gists_url": "https://api.github.com/users/ShengdingHu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ShengdingHu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ShengdingHu/subscriptions",
"organizations_url": "https://api.github.com/users/ShengdingHu/orgs",
"repos_url": "https://api.github.com/users/ShengdingHu/repos",
"events_url": "https://api.github.com/users/ShengdingHu/events{/privacy}",
"received_events_url": "https://api.github.com/users/ShengdingHu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 15
| 2024-02-07T07:53:14
| 2024-06-05T13:17:24
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Thank you for your exceptional framework. We have developed an end-side Large Language Model, MiniCPM, and would like to integrate it with the models supported by Ollama.
Here's our repository: [MiniCPM on GitHub](https://github.com/OpenBMB/MiniCPM)
Here's our blog: [How to Build MiniCPM](https://shengdinghu.notion.site/MiniCPM-Unveiling-the-Potential-of-End-side-Large-Language-Models-d4d3a8c426424654a4e80e42a711cb20)
Following the discussions in the Llama.cpp issue tracker (see https://github.com/ggerganov/llama.cpp/issues/5276), we have successfully converted our model into the GGML format. I have also personally managed to run it successfully on my Mac.
My question is: How can we get official support in Ollama, so that users can easily use the command `ollama run minicpm` to try out our model?
Thank you in advance for your assistance!
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2383/reactions",
"total_count": 19,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 13,
"rocket": 2,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/2383/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/8086
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8086/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8086/comments
|
https://api.github.com/repos/ollama/ollama/issues/8086/events
|
https://github.com/ollama/ollama/issues/8086
| 2,738,235,263
|
I_kwDOJ0Z1Ps6jNid_
| 8,086
|
OLLAMA_TMPDIR ignored when ollama create
|
{
"login": "luisgg98",
"id": 45603226,
"node_id": "MDQ6VXNlcjQ1NjAzMjI2",
"avatar_url": "https://avatars.githubusercontent.com/u/45603226?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/luisgg98",
"html_url": "https://github.com/luisgg98",
"followers_url": "https://api.github.com/users/luisgg98/followers",
"following_url": "https://api.github.com/users/luisgg98/following{/other_user}",
"gists_url": "https://api.github.com/users/luisgg98/gists{/gist_id}",
"starred_url": "https://api.github.com/users/luisgg98/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luisgg98/subscriptions",
"organizations_url": "https://api.github.com/users/luisgg98/orgs",
"repos_url": "https://api.github.com/users/luisgg98/repos",
"events_url": "https://api.github.com/users/luisgg98/events{/privacy}",
"received_events_url": "https://api.github.com/users/luisgg98/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-13T12:01:40
| 2024-12-16T14:26:19
| 2024-12-16T14:26:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am running Ollama as a systemd service on a server with Ubuntu 22.04.4 LTS (GNU/Linux 5.15.0-117-generic x86_64) as its operating system.
```bash
user@vm:~/testplaza$ export OLLAMA_TMPDIR=/datassd/proyectos/ollama_tmp
user@vm:~/testplaza$ OLLAMA_TMPDIR=/datassd/proyectos/ollama_tmp ollama create functionary-medimum-v3.1:latest -f functionary.modelfile
```
When I try to create an Ollama model out of a Hugging Face model, I always end up with my _/tmp_ folder running out of free space.
According to the documentation, I have updated the OLLAMA_TMPDIR environment variable.
OLLAMA_TMPDIR actually exists:
```bash
user@vm:/tmp# ls -al /datassd/proyectos/ollama_tmp
total 12
drwxrwxr-x 3 ollama ollama 4096 dic 13 11:29 .
drwxrwxr-x 10 root root 4096 nov 20 15:37 ..
drwx------ 3 ollama ollama 4096 dic 13 11:29 ollama1349108744
```
Service ollama configuration:
```
user@vm:~/testplaza$ cat /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/usr/local/cuda/bin:/opt/anaconda3/bin:/opt/anaconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
Environment="OLLAMA_MODELS=/datassd/proyectos/modelos"
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_MAX_LOADED_MODELS=8"
Environment="OLLAMA_NUM_PARALLEL=8"
Environment="OLLAMA_DEBUG=1"
Environment="OLLAMA_TMPDIR=/datassd/proyectos/ollama_tmp"
[Install]
WantedBy=default.target
```
However, it doesn't look like it's working, because files are still being written to the _/tmp_ folder.
```bash
user@vm:/tmp# ls -al | grep ollama
-rw------- 1 user groups 15659278324 dic 13 12:39 ollama-tf1844365840
```
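As an aside, the override resolution itself is simple: the env var has to be visible to the server process (the systemd unit), not just the client shell. A purely illustrative Python sketch of that kind of lookup (not Ollama's actual code, which is written in Go):

```python
import os
import tempfile

def resolve_tmpdir():
    # Prefer a service-specific override, fall back to the system default.
    return os.environ.get("OLLAMA_TMPDIR") or tempfile.gettempdir()

# Simulate the environment the systemd unit would provide.
os.environ["OLLAMA_TMPDIR"] = "/datassd/proyectos/ollama_tmp"
print(resolve_tmpdir())  # /datassd/proyectos/ollama_tmp
```

If the variable is only exported in an interactive shell, the daemon never sees it and falls back to `/tmp`.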
Huggingface model: https://huggingface.co/meetkai/functionary-medium-v3.1
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.5.1
|
{
"login": "luisgg98",
"id": 45603226,
"node_id": "MDQ6VXNlcjQ1NjAzMjI2",
"avatar_url": "https://avatars.githubusercontent.com/u/45603226?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/luisgg98",
"html_url": "https://github.com/luisgg98",
"followers_url": "https://api.github.com/users/luisgg98/followers",
"following_url": "https://api.github.com/users/luisgg98/following{/other_user}",
"gists_url": "https://api.github.com/users/luisgg98/gists{/gist_id}",
"starred_url": "https://api.github.com/users/luisgg98/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luisgg98/subscriptions",
"organizations_url": "https://api.github.com/users/luisgg98/orgs",
"repos_url": "https://api.github.com/users/luisgg98/repos",
"events_url": "https://api.github.com/users/luisgg98/events{/privacy}",
"received_events_url": "https://api.github.com/users/luisgg98/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8086/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8086/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1249
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1249/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1249/comments
|
https://api.github.com/repos/ollama/ollama/issues/1249/events
|
https://github.com/ollama/ollama/issues/1249
| 2,007,204,900
|
I_kwDOJ0Z1Ps53o4Qk
| 1,249
|
An easy way to get model information?
|
{
"login": "jorge-menjivar",
"id": 16660534,
"node_id": "MDQ6VXNlcjE2NjYwNTM0",
"avatar_url": "https://avatars.githubusercontent.com/u/16660534?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jorge-menjivar",
"html_url": "https://github.com/jorge-menjivar",
"followers_url": "https://api.github.com/users/jorge-menjivar/followers",
"following_url": "https://api.github.com/users/jorge-menjivar/following{/other_user}",
"gists_url": "https://api.github.com/users/jorge-menjivar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jorge-menjivar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jorge-menjivar/subscriptions",
"organizations_url": "https://api.github.com/users/jorge-menjivar/orgs",
"repos_url": "https://api.github.com/users/jorge-menjivar/repos",
"events_url": "https://api.github.com/users/jorge-menjivar/events{/privacy}",
"received_events_url": "https://api.github.com/users/jorge-menjivar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 6
| 2023-11-22T22:37:45
| 2025-01-13T09:42:12
| 2024-09-04T03:28:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is there an easy way to get model information? I would like to know the context size window for any model, preferably from the endpoints API.
If there is no way to get this from the endpoints API, I would like to contribute this feature, however, I am not sure where this information is found for each model. Advice on this would be appreciated. Thank you!
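For context, newer Ollama releases expose model metadata (including context length) via a `POST /api/show` endpoint. Below is a hedged Python sketch that extracts the context window from a response of that shape; the exact field names vary by model family and Ollama version, so treat the `model_info` keys as an assumption:

```python
import json

# A canned response shaped like newer /api/show output (field names assumed).
sample = json.loads('{"model_info": {"llama.context_length": 4096}}')

def context_length(show_response):
    # Keys are prefixed per model family, e.g. "llama.context_length".
    info = show_response.get("model_info", {})
    for key, value in info.items():
        if key.endswith("context_length"):
            return value
    return None

print(context_length(sample))  # 4096
```

Against a running server, the same parsing would apply to the JSON returned by `POST /api/show` with `{"model": "llama2"}`.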
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1249/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1249/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8499
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8499/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8499/comments
|
https://api.github.com/repos/ollama/ollama/issues/8499/events
|
https://github.com/ollama/ollama/issues/8499
| 2,798,767,628
|
I_kwDOJ0Z1Ps6m0c4M
| 8,499
|
Ollama is working fine with CLI / powershell but goes in loop on API request.
|
{
"login": "guytechw",
"id": 39879485,
"node_id": "MDQ6VXNlcjM5ODc5NDg1",
"avatar_url": "https://avatars.githubusercontent.com/u/39879485?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/guytechw",
"html_url": "https://github.com/guytechw",
"followers_url": "https://api.github.com/users/guytechw/followers",
"following_url": "https://api.github.com/users/guytechw/following{/other_user}",
"gists_url": "https://api.github.com/users/guytechw/gists{/gist_id}",
"starred_url": "https://api.github.com/users/guytechw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guytechw/subscriptions",
"organizations_url": "https://api.github.com/users/guytechw/orgs",
"repos_url": "https://api.github.com/users/guytechw/repos",
"events_url": "https://api.github.com/users/guytechw/events{/privacy}",
"received_events_url": "https://api.github.com/users/guytechw/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 5
| 2025-01-20T09:54:54
| 2025-01-21T18:21:01
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Ollama is able to run a model when a message is sent through the CLI / PowerShell. However, when Cline or Roo Cline is used, or when a prompt is sent through the API, Ollama gets stuck in a loop repeating its previous response.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.7
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8499/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8499/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/635
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/635/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/635/comments
|
https://api.github.com/repos/ollama/ollama/issues/635/events
|
https://github.com/ollama/ollama/pull/635
| 1,918,088,659
|
PR_kwDOJ0Z1Ps5beZW_
| 635
|
do not reload model when only prompt template changes
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-09-28T18:51:14
| 2023-10-20T16:43:57
| 2023-10-18T18:11:15
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/635",
"html_url": "https://github.com/ollama/ollama/pull/635",
"diff_url": "https://github.com/ollama/ollama/pull/635.diff",
"patch_url": "https://github.com/ollama/ollama/pull/635.patch",
"merged_at": null
}
|
Say I have 2 models, both are based on llama2, but they have different prompts.
```
FROM llama2
TEMPLATE """
you are a dog
"""
```
and
```
FROM llama2
TEMPLATE """
you are a cat
"""
```
If I am building something that swaps requests between these models a lot, our current logic will reload the models every time, even though the only thing changing is the prompt template.
This change adds a `runner digest` which uses only fields relevant to running the model to determine if a running model should be swapped out.
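The idea can be sketched as follows. This is a hypothetical illustration in Python (the real implementation is in Go, and the actual field set differs): only fields that affect the loaded runner feed the digest, and the template is deliberately excluded.

```python
import hashlib

def runner_digest(base_model, num_ctx, template):
    # The template is intentionally NOT hashed: changing it alone
    # should not force a model reload.
    h = hashlib.sha256()
    h.update(f"{base_model}|{num_ctx}".encode())
    return h.hexdigest()

dog = runner_digest("llama2", 2048, "you are a dog")
cat = runner_digest("llama2", 2048, "you are a cat")
print(dog == cat)  # True: same runner, no reload needed
```

Two configs that differ only in their prompt template map to the same digest, so the running model can be reused.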
As a side-effect, this also fixes the `/show modelfile` to actually show the library model name, rather than the file name when using a base model.
```
ollama run mistral
>>> /show modelfile
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this one, replace the FROM line with:
# FROM mistral:latest
FROM registry.ollama.ai/library/mistral:latest
TEMPLATE """[INST] {{ .Prompt }} [/INST]
"""
SYSTEM """"""
```
- Also removes the calculation of the system prompt from the template that kept the first system command via `num_keep`. This isn't needed with our new prompt templates.
Resolves #337
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/635/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/635/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3924
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3924/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3924/comments
|
https://api.github.com/repos/ollama/ollama/issues/3924/events
|
https://github.com/ollama/ollama/pull/3924
| 2,264,647,312
|
PR_kwDOJ0Z1Ps5tyFqx
| 3,924
|
types/model: overhaul Name and Digest types
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-04-25T23:20:10
| 2024-04-26T20:08:33
| 2024-04-26T20:08:32
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3924",
"html_url": "https://github.com/ollama/ollama/pull/3924",
"diff_url": "https://github.com/ollama/ollama/pull/3924.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3924.patch",
"merged_at": "2024-04-26T20:08:32"
}
| null |
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3924/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3924/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6758
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6758/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6758/comments
|
https://api.github.com/repos/ollama/ollama/issues/6758/events
|
https://github.com/ollama/ollama/issues/6758
| 2,520,452,718
|
I_kwDOJ0Z1Ps6WOw5u
| 6,758
|
Model Request: Reader-LM
|
{
"login": "Xe",
"id": 529003,
"node_id": "MDQ6VXNlcjUyOTAwMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/529003?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Xe",
"html_url": "https://github.com/Xe",
"followers_url": "https://api.github.com/users/Xe/followers",
"following_url": "https://api.github.com/users/Xe/following{/other_user}",
"gists_url": "https://api.github.com/users/Xe/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Xe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Xe/subscriptions",
"organizations_url": "https://api.github.com/users/Xe/orgs",
"repos_url": "https://api.github.com/users/Xe/repos",
"events_url": "https://api.github.com/users/Xe/events{/privacy}",
"received_events_url": "https://api.github.com/users/Xe/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-11T18:29:17
| 2024-09-11T18:47:48
| 2024-09-11T18:47:48
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
0.5B: https://huggingface.co/jinaai/reader-lm-0.5b
1.5B: https://huggingface.co/jinaai/reader-lm-1.5b
Upstream blogpost: https://jina.ai/news/reader-lm-small-language-models-for-cleaning-and-converting-html-to-markdown/
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6758/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6758/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3749
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3749/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3749/comments
|
https://api.github.com/repos/ollama/ollama/issues/3749/events
|
https://github.com/ollama/ollama/issues/3749
| 2,252,266,631
|
I_kwDOJ0Z1Ps6GPtyH
| 3,749
|
Rerankers and Embeddings
|
{
"login": "BradKML",
"id": 58927531,
"node_id": "MDQ6VXNlcjU4OTI3NTMx",
"avatar_url": "https://avatars.githubusercontent.com/u/58927531?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BradKML",
"html_url": "https://github.com/BradKML",
"followers_url": "https://api.github.com/users/BradKML/followers",
"following_url": "https://api.github.com/users/BradKML/following{/other_user}",
"gists_url": "https://api.github.com/users/BradKML/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BradKML/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BradKML/subscriptions",
"organizations_url": "https://api.github.com/users/BradKML/orgs",
"repos_url": "https://api.github.com/users/BradKML/repos",
"events_url": "https://api.github.com/users/BradKML/events{/privacy}",
"received_events_url": "https://api.github.com/users/BradKML/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 17
| 2024-04-19T07:23:34
| 2025-01-11T15:44:54
| 2024-09-02T20:58:10
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello from the agentic AI crowd. Setting aside that Ollama does not have concurrent model abilities, is it possible to get Ollama to run rerankers (like Cohere and Jina) or text embeddings (SentenceTransformers, e.g. SBERT) for applications that do not directly use LLMs?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3749/reactions",
"total_count": 96,
"+1": 89,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 7
}
|
https://api.github.com/repos/ollama/ollama/issues/3749/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3139
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3139/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3139/comments
|
https://api.github.com/repos/ollama/ollama/issues/3139/events
|
https://github.com/ollama/ollama/issues/3139
| 2,186,478,361
|
I_kwDOJ0Z1Ps6CUwMZ
| 3,139
|
low ram usage
|
{
"login": "parzzd",
"id": 103915075,
"node_id": "U_kgDOBjGeQw",
"avatar_url": "https://avatars.githubusercontent.com/u/103915075?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/parzzd",
"html_url": "https://github.com/parzzd",
"followers_url": "https://api.github.com/users/parzzd/followers",
"following_url": "https://api.github.com/users/parzzd/following{/other_user}",
"gists_url": "https://api.github.com/users/parzzd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/parzzd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/parzzd/subscriptions",
"organizations_url": "https://api.github.com/users/parzzd/orgs",
"repos_url": "https://api.github.com/users/parzzd/repos",
"events_url": "https://api.github.com/users/parzzd/events{/privacy}",
"received_events_url": "https://api.github.com/users/parzzd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-03-14T14:07:25
| 2024-03-14T19:33:13
| 2024-03-14T19:33:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have a problem with the RAM usage for qwen_1_5_4b.
I have 16 GB of RAM, but the software only uses 1.6 GB.
Is there a way to dedicate more RAM to the ollama process?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3139/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3139/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2936
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2936/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2936/comments
|
https://api.github.com/repos/ollama/ollama/issues/2936/events
|
https://github.com/ollama/ollama/issues/2936
| 2,169,243,157
|
I_kwDOJ0Z1Ps6BTAYV
| 2,936
|
Does not using all threads on NUMA configuration (server motherboards 2, 4, 6 multisocket CPU) on Windows
|
{
"login": "GermanAizek",
"id": 21138600,
"node_id": "MDQ6VXNlcjIxMTM4NjAw",
"avatar_url": "https://avatars.githubusercontent.com/u/21138600?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/GermanAizek",
"html_url": "https://github.com/GermanAizek",
"followers_url": "https://api.github.com/users/GermanAizek/followers",
"following_url": "https://api.github.com/users/GermanAizek/following{/other_user}",
"gists_url": "https://api.github.com/users/GermanAizek/gists{/gist_id}",
"starred_url": "https://api.github.com/users/GermanAizek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/GermanAizek/subscriptions",
"organizations_url": "https://api.github.com/users/GermanAizek/orgs",
"repos_url": "https://api.github.com/users/GermanAizek/repos",
"events_url": "https://api.github.com/users/GermanAizek/events{/privacy}",
"received_events_url": "https://api.github.com/users/GermanAizek/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-03-05T13:29:05
| 2024-10-15T22:54:14
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
This is a very old problem dating back to the early 2000s. It concerns Microsoft's `setThreadAffinity`, which by default does not cover all logical processors in the system. This was fixed 20 years later, only in Windows 11 and newer Windows Server, but the thread count is most likely still calculated incorrectly. I do not know Go, but if this were in C/C++ I would be able to help. I have already fixed this issue elsewhere before:
https://github.com/ggerganov/llama.cpp/issues/5524
https://github.com/x64dbg/x64dbg/pull/3272
https://github.com/giampaolo/psutil/issues/771
https://github.com/GermanAizek/llvm-project/commit/d1fa25f37631b8b33a71fbe9eb4ea89e3a47b723

| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2936/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2936/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/659
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/659/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/659/comments
|
https://api.github.com/repos/ollama/ollama/issues/659/events
|
https://github.com/ollama/ollama/pull/659
| 1,920,380,429
|
PR_kwDOJ0Z1Ps5bmBil
| 659
|
fix docker build
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-30T19:50:42
| 2023-09-30T20:34:02
| 2023-09-30T20:34:01
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/659",
"html_url": "https://github.com/ollama/ollama/pull/659",
"diff_url": "https://github.com/ollama/ollama/pull/659.diff",
"patch_url": "https://github.com/ollama/ollama/pull/659.patch",
"merged_at": "2023-09-30T20:34:01"
}
|
Resolves #652
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/659/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/659/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8657
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8657/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8657/comments
|
https://api.github.com/repos/ollama/ollama/issues/8657/events
|
https://github.com/ollama/ollama/issues/8657
| 2,818,103,966
|
I_kwDOJ0Z1Ps6n-Nqe
| 8,657
|
running ollama deepseek-r1:1.5b on windows stuck for whole day
|
{
"login": "aadltya",
"id": 142524039,
"node_id": "U_kgDOCH6-hw",
"avatar_url": "https://avatars.githubusercontent.com/u/142524039?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aadltya",
"html_url": "https://github.com/aadltya",
"followers_url": "https://api.github.com/users/aadltya/followers",
"following_url": "https://api.github.com/users/aadltya/following{/other_user}",
"gists_url": "https://api.github.com/users/aadltya/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aadltya/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aadltya/subscriptions",
"organizations_url": "https://api.github.com/users/aadltya/orgs",
"repos_url": "https://api.github.com/users/aadltya/repos",
"events_url": "https://api.github.com/users/aadltya/events{/privacy}",
"received_events_url": "https://api.github.com/users/aadltya/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677370291,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw",
"url": "https://api.github.com/repos/ollama/ollama/labels/networking",
"name": "networking",
"color": "0B5368",
"default": false,
"description": "Issues relating to ollama pull and push"
}
] |
closed
| false
| null |
[] | null | 3
| 2025-01-29T12:40:29
| 2025-01-29T13:44:51
| 2025-01-29T13:44:50
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Please optimize for low-end devices. I'm using Windows with 8 GB RAM and a 4 GB NVIDIA GTX 1650 graphics card, and I'm unable to pull deepseek-r1:1.5b.
On the command line it has been stuck at 0% for the whole day:
```bash
C:\Users\ADITYA> ollama run deepseek-r1:1.5b
pulling manifest
pulling aabd4debf0c8... 0% ▕ ▏ 0 B/1.1 GB
```

|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8657/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8657/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2121
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2121/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2121/comments
|
https://api.github.com/repos/ollama/ollama/issues/2121/events
|
https://github.com/ollama/ollama/issues/2121
| 2,092,523,207
|
I_kwDOJ0Z1Ps58uV7H
| 2,121
|
Feature request: control session duration of loaded models
|
{
"login": "nperez",
"id": 75055,
"node_id": "MDQ6VXNlcjc1MDU1",
"avatar_url": "https://avatars.githubusercontent.com/u/75055?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nperez",
"html_url": "https://github.com/nperez",
"followers_url": "https://api.github.com/users/nperez/followers",
"following_url": "https://api.github.com/users/nperez/following{/other_user}",
"gists_url": "https://api.github.com/users/nperez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nperez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nperez/subscriptions",
"organizations_url": "https://api.github.com/users/nperez/orgs",
"repos_url": "https://api.github.com/users/nperez/repos",
"events_url": "https://api.github.com/users/nperez/events{/privacy}",
"received_events_url": "https://api.github.com/users/nperez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-01-21T10:28:38
| 2024-01-27T00:19:45
| 2024-01-27T00:19:33
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have a use case where multiple processes (stable diffusion, whisper, ollama, etc.) are competing for limited GPU resources, and I need to share the GPU. Unfortunately, there doesn't appear to be a way to manage the session lifetime of loaded models in ollama. It would be cool to have the ability, via model options, to control the session lifetime (i.e. unload after each request), or to have a new endpoint that unconditionally unloads whatever model is loaded. Without this feature, I need to manage (kill, then restart) the ollama process or wait the five minutes that is the current `defaultSessionDuration` in routes.go. Before v0.1.18, I probably would have just killed the separate runner process, which would leave the API server intact, but now that it is integrated, that isn't really an option any more.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2121/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2121/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1316
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1316/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1316/comments
|
https://api.github.com/repos/ollama/ollama/issues/1316/events
|
https://github.com/ollama/ollama/issues/1316
| 2,016,892,473
|
I_kwDOJ0Z1Ps54N1Y5
| 1,316
|
Refactoring API Spec to Match OpenAI Standards
|
{
"login": "bernardo-sb",
"id": 115413406,
"node_id": "U_kgDOBuERng",
"avatar_url": "https://avatars.githubusercontent.com/u/115413406?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bernardo-sb",
"html_url": "https://github.com/bernardo-sb",
"followers_url": "https://api.github.com/users/bernardo-sb/followers",
"following_url": "https://api.github.com/users/bernardo-sb/following{/other_user}",
"gists_url": "https://api.github.com/users/bernardo-sb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bernardo-sb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bernardo-sb/subscriptions",
"organizations_url": "https://api.github.com/users/bernardo-sb/orgs",
"repos_url": "https://api.github.com/users/bernardo-sb/repos",
"events_url": "https://api.github.com/users/bernardo-sb/events{/privacy}",
"received_events_url": "https://api.github.com/users/bernardo-sb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-29T15:48:41
| 2024-01-25T21:50:35
| 2024-01-25T21:50:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### Description
**Current Behavior:**
The current API spec in Ollama does not align with OpenAI's standards.
**Desired Behavior:**
Refactor the API spec to match OpenAI's standards for consistency and compatibility.
### Rationale
OpenAI has established itself as a standard in the field of large language models, and aligning Ollama's API spec with OpenAI's standards can unlock new use cases and facilitate seamless integrations. Developers familiar with OpenAI's API will find it easier to work with Ollama, leading to a more user-friendly and accessible experience.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1316/reactions",
"total_count": 10,
"+1": 10,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1316/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2732
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2732/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2732/comments
|
https://api.github.com/repos/ollama/ollama/issues/2732/events
|
https://github.com/ollama/ollama/issues/2732
| 2,152,320,200
|
I_kwDOJ0Z1Ps6ASczI
| 2,732
|
Provide model context length in API
|
{
"login": "gluonfield",
"id": 5672094,
"node_id": "MDQ6VXNlcjU2NzIwOTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/5672094?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gluonfield",
"html_url": "https://github.com/gluonfield",
"followers_url": "https://api.github.com/users/gluonfield/followers",
"following_url": "https://api.github.com/users/gluonfield/following{/other_user}",
"gists_url": "https://api.github.com/users/gluonfield/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gluonfield/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gluonfield/subscriptions",
"organizations_url": "https://api.github.com/users/gluonfield/orgs",
"repos_url": "https://api.github.com/users/gluonfield/repos",
"events_url": "https://api.github.com/users/gluonfield/events{/privacy}",
"received_events_url": "https://api.github.com/users/gluonfield/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-02-24T13:54:53
| 2024-06-29T22:49:39
| 2024-06-29T22:49:39
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi guys, I love Ollama and am contributing to the ecosystem by building [Enchanted](https://github.com/AugustDev/enchanted).
One important thing currently missing from the `/api/show` API is the context length that the model supports. For all RAG applications this is essential to know, and it seems that future models will support greatly varied context lengths.
It would be great to include this metadata in the API as well as in the [Ollama library](https://ollama.com/library).
As an example, the `/api/show` response for Gemma contains no clues about context length:
```
{
"license": "...",
"modelfile": "# Modelfile generated by \"ollama show\"\n# To build a new Modelfile based on this one, replace the FROM line with:\n# FROM gemma:7b\n\nFROM /Users/wpc/.ollama/models/blobs/sha256:456402914e838a953e0cf80caa6adbe75383d9e63584a964f504a7bbb8f7aad9\nTEMPLATE \"\"\"<start_of_turn>user\n{{ if .System }}{{ .System }} {{ end }}{{ .Prompt }}<end_of_turn>\n<start_of_turn>model\n{{ .Response }}<end_of_turn>\n\"\"\"\nPARAMETER repeat_penalty 1\nPARAMETER stop \"<start_of_turn>\"\nPARAMETER stop \"<end_of_turn>\"",
"parameters": "repeat_penalty 1\nstop \"<start_of_turn>\"\nstop \"<end_of_turn>\"",
"template": "<start_of_turn>user\n{{ if .System }}{{ .System }} {{ end }}{{ .Prompt }}<end_of_turn>\n<start_of_turn>model\n{{ .Response }}<end_of_turn>\n",
"details": {
"parent_model": "",
"format": "gguf",
"family": "gemma",
"families": [
"gemma"
],
"parameter_size": "9B",
"quantization_level": "Q4_0"
}
}
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2732/reactions",
"total_count": 30,
"+1": 24,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 6,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2732/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1504
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1504/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1504/comments
|
https://api.github.com/repos/ollama/ollama/issues/1504/events
|
https://github.com/ollama/ollama/issues/1504
| 2,039,992,494
|
I_kwDOJ0Z1Ps55l9Cu
| 1,504
|
Upgrade to XCode 14.2 breaks build which is looking for 14.0
|
{
"login": "jkleckner",
"id": 1223485,
"node_id": "MDQ6VXNlcjEyMjM0ODU=",
"avatar_url": "https://avatars.githubusercontent.com/u/1223485?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jkleckner",
"html_url": "https://github.com/jkleckner",
"followers_url": "https://api.github.com/users/jkleckner/followers",
"following_url": "https://api.github.com/users/jkleckner/following{/other_user}",
"gists_url": "https://api.github.com/users/jkleckner/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jkleckner/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jkleckner/subscriptions",
"organizations_url": "https://api.github.com/users/jkleckner/orgs",
"repos_url": "https://api.github.com/users/jkleckner/repos",
"events_url": "https://api.github.com/users/jkleckner/events{/privacy}",
"received_events_url": "https://api.github.com/users/jkleckner/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2023-12-13T15:48:17
| 2024-04-04T13:29:34
| 2024-02-20T01:22:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Upgrading to the current Xcode breaks the build, which looks explicitly for SDK 14.0, now replaced by 14.2:
```
$ go generate ./...
Submodule path 'ggml': checked out '9e232f0234073358e7031c1b8d7aa45020469a3b'
CMake Warning at /opt/homebrew/Cellar/cmake/3.28.0/share/cmake/Modules/Platform/Darwin-Initialize.cmake:308 (message):
Ignoring CMAKE_OSX_SYSROOT value:
/Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX14.0.sdk
because the directory does not exist.
Call Stack (most recent call first):
/opt/homebrew/Cellar/cmake/3.28.0/share/cmake/Modules/CMakeSystemSpecificInitialize.cmake:34 (include)
CMakeLists.txt:2 (project)
```
SDK links:
```
$ ls -l /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs
total 0
drwxr-xr-x 7 root wheel 224B Nov 13 10:25 MacOSX.sdk
lrwxr-xr-x 1 root wheel 10B Dec 13 06:00 MacOSX14.2.sdk -> MacOSX.sdk
lrwxr-xr-x 1 root wheel 10B Dec 13 06:00 MacOSX14.sdk -> MacOSX.sdk
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1504/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1504/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1947
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1947/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1947/comments
|
https://api.github.com/repos/ollama/ollama/issues/1947/events
|
https://github.com/ollama/ollama/issues/1947
| 2,078,222,506
|
I_kwDOJ0Z1Ps573yiq
| 1,947
|
`CUDA out of memory` error with multi-GPU of different sizes
|
{
"login": "m0wer",
"id": 25278081,
"node_id": "MDQ6VXNlcjI1Mjc4MDgx",
"avatar_url": "https://avatars.githubusercontent.com/u/25278081?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/m0wer",
"html_url": "https://github.com/m0wer",
"followers_url": "https://api.github.com/users/m0wer/followers",
"following_url": "https://api.github.com/users/m0wer/following{/other_user}",
"gists_url": "https://api.github.com/users/m0wer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/m0wer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/m0wer/subscriptions",
"organizations_url": "https://api.github.com/users/m0wer/orgs",
"repos_url": "https://api.github.com/users/m0wer/repos",
"events_url": "https://api.github.com/users/m0wer/events{/privacy}",
"received_events_url": "https://api.github.com/users/m0wer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-01-12T07:19:56
| 2024-05-20T20:12:24
| 2024-05-20T20:12:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
With two GPUs (RTX 2060 6GB + RTX 3090 24GB) and ollama 1.2.0, I get an OOM error and an ollama crash. In previous versions it would have tried to fit only 28/33 layers in VRAM, and that worked. This could be related to https://github.com/jmorganca/ollama/issues/1385
```
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = SPM
llm_load_print_meta: n_vocab = 32002
llm_load_print_meta: n_merges = 0
llm_load_print_meta: n_ctx_train = 32768
llm_load_print_meta: n_embd = 4096
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 8
llm_load_print_meta: n_layer = 32
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_gqa = 4
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: n_expert = 8
llm_load_print_meta: n_expert_used = 2
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 1000000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_yarn_orig_ctx = 32768
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: model type = 7B
llm_load_print_meta: model ftype = Q4_K - Medium
llm_load_print_meta: model params = 46.70 B
llm_load_print_meta: model size = 24.62 GiB (4.53 BPW)
llm_load_print_meta: general.name = cognitivecomputations
llm_load_print_meta: BOS token = 1 '<s>'
llm_load_print_meta: EOS token = 32000 '<|im_end|>'
llm_load_print_meta: UNK token = 0 '<unk>'
llm_load_print_meta: LF token = 13 '<0x0A>'
llm_load_tensors: ggml ctx size = 0.38 MiB
llm_load_tensors: using CUDA for GPU acceleration
llm_load_tensors: mem required = 955.85 MiB
llm_load_tensors: offloading 31 repeating layers to GPU
llm_load_tensors: offloaded 31/33 layers to GPU
llm_load_tensors: VRAM used: 24260.41 MiB
.............................................................................................
CUDA error 2 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml-cuda.cu:9007: out of memory
current device: 1
GGML_ASSERT: /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml-cuda.cu:9007: !"CUDA error"
SIGABRT: abort
PC=0x7f59828cb9fc m=7 sigcode=18446744073709551610
signal arrived during cgo execution
goroutine 11 [syscall]:
runtime.cgocall(0x9c0710, 0xc0004de608)
/usr/local/go/src/runtime/cgocall.go:157 +0x4b fp=0xc0004de5e0 sp=0xc0004de5a8 pc=0x4266ab
github.com/jmorganca/ollama/llm._Cfunc_dynamic_shim_llama_server_init({0x7f591c001280, 0x7f58c7d4b7b0, 0x7f58c7d3ed90, 0x7f58c7d41150, 0x7f58c7d58680, 0x7f58c7d48ca0, 0x7f58c7d40ff0, 0x7f58c7d3ee30, 0x7f58c7d587b0, 0x7f58c7d58b50, ...}, ...)
_cgo_gotypes.go:291 +0x45 fp=0xc0004de608 sp=0xc0004de5e0 pc=0x7cce45
github.com/jmorganca/ollama/llm.(*shimExtServer).llama_server_init.func1(0x456c1b?, 0x80?, 0x80?)
/go/src/github.com/jmorganca/ollama/llm/shim_ext_server.go:40 +0xec fp=0xc0004de6f8 sp=0xc0004de608 pc=0x7d220c
github.com/jmorganca/ollama/llm.(*shimExtServer).llama_server_init(0xc0000942d0?, 0x0?, 0x4377c8?)
/go/src/github.com/jmorganca/ollama/llm/shim_ext_server.go:40 +0x13 fp=0xc0004de720 sp=0xc0004de6f8 pc=0x7d20f3
github.com/jmorganca/ollama/llm.newExtServer({0x2b39d1d8, 0xc0004d4120}, {0xc0004ce150, _}, {_, _, _}, {0x0, 0x0, 0x0}, ...)
/go/src/github.com/jmorganca/ollama/llm/ext_server_common.go:139 +0x70e fp=0xc0004de8e0 sp=0xc0004de720 pc=0x7ce38e
```
|
{
"login": "m0wer",
"id": 25278081,
"node_id": "MDQ6VXNlcjI1Mjc4MDgx",
"avatar_url": "https://avatars.githubusercontent.com/u/25278081?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/m0wer",
"html_url": "https://github.com/m0wer",
"followers_url": "https://api.github.com/users/m0wer/followers",
"following_url": "https://api.github.com/users/m0wer/following{/other_user}",
"gists_url": "https://api.github.com/users/m0wer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/m0wer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/m0wer/subscriptions",
"organizations_url": "https://api.github.com/users/m0wer/orgs",
"repos_url": "https://api.github.com/users/m0wer/repos",
"events_url": "https://api.github.com/users/m0wer/events{/privacy}",
"received_events_url": "https://api.github.com/users/m0wer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1947/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1947/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/983
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/983/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/983/comments
|
https://api.github.com/repos/ollama/ollama/issues/983/events
|
https://github.com/ollama/ollama/pull/983
| 1,975,305,210
|
PR_kwDOJ0Z1Ps5efVYu
| 983
|
Add missing "be" to GPU support warning message
|
{
"login": "noahgitsham",
"id": 73707948,
"node_id": "MDQ6VXNlcjczNzA3OTQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/73707948?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/noahgitsham",
"html_url": "https://github.com/noahgitsham",
"followers_url": "https://api.github.com/users/noahgitsham/followers",
"following_url": "https://api.github.com/users/noahgitsham/following{/other_user}",
"gists_url": "https://api.github.com/users/noahgitsham/gists{/gist_id}",
"starred_url": "https://api.github.com/users/noahgitsham/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/noahgitsham/subscriptions",
"organizations_url": "https://api.github.com/users/noahgitsham/orgs",
"repos_url": "https://api.github.com/users/noahgitsham/repos",
"events_url": "https://api.github.com/users/noahgitsham/events{/privacy}",
"received_events_url": "https://api.github.com/users/noahgitsham/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-03T01:36:29
| 2023-11-03T01:37:13
| 2023-11-03T01:37:13
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/983",
"html_url": "https://github.com/ollama/ollama/pull/983",
"diff_url": "https://github.com/ollama/ollama/pull/983.diff",
"patch_url": "https://github.com/ollama/ollama/pull/983.patch",
"merged_at": "2023-11-03T01:37:12"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/983/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/983/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2954
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2954/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2954/comments
|
https://api.github.com/repos/ollama/ollama/issues/2954/events
|
https://github.com/ollama/ollama/issues/2954
| 2,171,518,159
|
I_kwDOJ0Z1Ps6BbrzP
| 2,954
|
server fails to init GPU when accessing a model (GPU was detected during startup)
|
{
"login": "stephen2001",
"id": 23563824,
"node_id": "MDQ6VXNlcjIzNTYzODI0",
"avatar_url": "https://avatars.githubusercontent.com/u/23563824?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stephen2001",
"html_url": "https://github.com/stephen2001",
"followers_url": "https://api.github.com/users/stephen2001/followers",
"following_url": "https://api.github.com/users/stephen2001/following{/other_user}",
"gists_url": "https://api.github.com/users/stephen2001/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stephen2001/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stephen2001/subscriptions",
"organizations_url": "https://api.github.com/users/stephen2001/orgs",
"repos_url": "https://api.github.com/users/stephen2001/repos",
"events_url": "https://api.github.com/users/stephen2001/events{/privacy}",
"received_events_url": "https://api.github.com/users/stephen2001/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2024-03-06T13:30:03
| 2024-04-15T22:37:23
| 2024-04-15T22:37:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am running ollama "serve" in a Docker container; this is my current Dockerfile:
```
FROM nvidia/cuda:11.8.0-cudnn8-devel-ubuntu22.04
WORKDIR /opt/ollama
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
wget curl \
&& apt-get autoremove -y \
&& apt-get clean \
&& rm -rf /var/lib/{apt,dpkg,cache,log}/
# Download and install Ollama
RUN curl -L https://ollama.com/download/ollama-linux-amd64 -o /usr/bin/ollama && \
chmod +x /usr/bin/ollama
ENV OLLAMA_DEBUG=1
ENV OLLAMA_HOST 0.0.0.0
EXPOSE 11434
# Set the entrypoint
ENTRYPOINT [ "/usr/bin/ollama" ]
# Default command
CMD ["serve"]
```
GPU is detected as expected
```
time=2024-03-06T14:07:32.512+01:00 level=INFO source=images.go:710 msg="total blobs: 6"
2024-03-06T13:07:32.514600912Z time=2024-03-06T14:07:32.514+01:00 level=INFO source=images.go:717 msg="total unused blobs removed: 0"
2024-03-06T13:07:32.516025518Z time=2024-03-06T14:07:32.515+01:00 level=INFO source=routes.go:1021 msg="Listening on [::]:11434 (version 0.1.28)"
2024-03-06T13:07:32.516175124Z time=2024-03-06T14:07:32.516+01:00 level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
2024-03-06T13:07:37.612422723Z time=2024-03-06T14:07:37.611+01:00 level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [cpu_avx2 cpu_avx cpu cuda_v11 rocm_v6 rocm_v5]"
2024-03-06T13:07:37.612486963Z time=2024-03-06T14:07:37.612+01:00 level=DEBUG source=payload_common.go:147 msg="Override detection logic by setting OLLAMA_LLM_LIBRARY"
2024-03-06T13:07:37.612499836Z time=2024-03-06T14:07:37.612+01:00 level=INFO source=gpu.go:94 msg="Detecting GPU type"
2024-03-06T13:07:37.612510087Z time=2024-03-06T14:07:37.612+01:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library libnvidia-ml.so"
2024-03-06T13:07:37.612522996Z time=2024-03-06T14:07:37.612+01:00 level=DEBUG source=gpu.go:283 msg="gpu management search paths: [/usr/local/cuda/lib64/libnvidia-ml.so* /usr/lib/x86_64-linux-gnu/nvidia/current/libnvidia-ml.so* /usr/lib/x86_64-linux-gnu/libnvidia-ml.so* /usr/lib/wsl/lib/libnvidia-ml.so* /usr/lib/wsl/drivers/*/libnvidia-ml.so* /opt/cuda/lib64/libnvidia-ml.so* /usr/lib*/libnvidia-ml.so* /usr/local/lib*/libnvidia-ml.so* /usr/lib/aarch64-linux-gnu/nvidia/current/libnvidia-ml.so* /usr/lib/aarch64-linux-gnu/libnvidia-ml.so* /opt/cuda/targets/x86_64-linux/lib/stubs/libnvidia-ml.so* /usr/local/nvidia/lib/libnvidia-ml.so* /usr/local/nvidia/lib64/libnvidia-ml.so*]"
2024-03-06T13:07:37.614431607Z time=2024-03-06T14:07:37.614+01:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: [/usr/lib/x86_64-linux-gnu/libnvidia-ml.so.525.116.04]"
2024-03-06T13:07:37.614536353Z wiring nvidia management library functions in /usr/lib/x86_64-linux-gnu/libnvidia-ml.so.525.116.04
2024-03-06T13:07:37.614565012Z dlsym: nvmlInit_v2
2024-03-06T13:07:37.614573240Z dlsym: nvmlShutdown
2024-03-06T13:07:37.614580264Z dlsym: nvmlDeviceGetHandleByIndex
2024-03-06T13:07:37.614587226Z dlsym: nvmlDeviceGetMemoryInfo
2024-03-06T13:07:37.614593900Z dlsym: nvmlDeviceGetCount_v2
2024-03-06T13:07:37.614600540Z dlsym: nvmlDeviceGetCudaComputeCapability
2024-03-06T13:07:37.614607160Z dlsym: nvmlSystemGetDriverVersion
2024-03-06T13:07:37.614613946Z dlsym: nvmlDeviceGetName
2024-03-06T13:07:37.614620480Z dlsym: nvmlDeviceGetSerial
2024-03-06T13:07:37.614627222Z dlsym: nvmlDeviceGetVbiosVersion
2024-03-06T13:07:37.614651044Z dlsym: nvmlDeviceGetBoardPartNumber
2024-03-06T13:07:37.614658210Z dlsym: nvmlDeviceGetBrand
2024-03-06T13:07:37.626543025Z CUDA driver version: 525.116.04
2024-03-06T13:07:37.626591950Z time=2024-03-06T14:07:37.626+01:00 level=INFO source=gpu.go:99 msg="Nvidia GPU detected"
2024-03-06T13:07:37.626604324Z time=2024-03-06T14:07:37.626+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
2024-03-06T13:07:37.632326226Z [0] CUDA device name: NVIDIA GeForce GTX 1080 Ti
2024-03-06T13:07:37.632367566Z [0] CUDA part number:
2024-03-06T13:07:37.632380051Z nvmlDeviceGetSerial failed: 3
2024-03-06T13:07:37.632389911Z [0] CUDA vbios version: 86.02.39.00.22
2024-03-06T13:07:37.632399445Z [0] CUDA brand: 5
2024-03-06T13:07:37.632408716Z [0] CUDA totalMem 11811160064
2024-03-06T13:07:37.632418781Z [0] CUDA usedMem 96272384
2024-03-06T13:07:37.632428941Z time=2024-03-06T14:07:37.632+01:00 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 6.1"
2024-03-06T13:07:37.632439074Z time=2024-03-06T14:07:37.632+01:00 level=DEBUG source=gpu.go:254 msg="**cuda detected 1 devices with 10054M available memory**"
2024-03-06T13:07:45.128463945Z
```
After pulling a model and accessing the server via the API on port 11434 (via AnythingLLM), I get the following error message; it is not able to initialize the GPU and continues with the CPU.
```
time=2024-03-06T14:12:15.108+01:00 level=DEBUG source=payload_common.go:93 msg="ordered list of LLM libraries to try [/tmp/ollama3053854002/cuda_v11/libext_server.so /tmp/ollama3053854002/cpu_avx2/libext_server.so]"
2024-03-06T13:12:15.108672805Z loading library /tmp/ollama3053854002/cuda_v11/libext_server.so
2024-03-06T13:12:15.124892811Z time=2024-03-06T14:12:15.124+01:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: /tmp/ollama3053854002/cuda_v11/libext_server.so"
2024-03-06T13:12:15.124934506Z time=2024-03-06T14:12:15.124+01:00 level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server"
2024-03-06T13:12:15.126816285Z [1709730735] system info: AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 0 | NEON = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 |
2024-03-06T13:12:15.126842734Z [1709730735] Performing pre-initialization of GPU
2024-03-06T13:12:15.152439571Z time=2024-03-06T14:12:15.152+01:00 level=DEBUG source=dyn_ext_server.go:157 msg="**failure during initialization: Unable to init GPU: no CUDA-capable device is detected**"
2024-03-06T13:12:15.152469986Z time=2024-03-06T14:12:15.152+01:00 level=WARN source=llm.go:162 msg="Failed to load dynamic library /tmp/ollama3053854002/cuda_v11/libext_server.so Unable to init GPU: no CUDA-capable device is detected"
2024-03-06T13:12:15.152476725Z loading library /tmp/ollama3053854002/cpu_avx2/libext_server.so
2024-03-06T13:12:15.154618107Z
```
Any suggestions welcome. Thanks in advance.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2954/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2954/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6769
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6769/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6769/comments
|
https://api.github.com/repos/ollama/ollama/issues/6769/events
|
https://github.com/ollama/ollama/issues/6769
| 2,521,169,936
|
I_kwDOJ0Z1Ps6WRgAQ
| 6,769
|
OLLAMA_FLASH_ATTENTION regression on 0.3.10?
|
{
"login": "coodoo",
"id": 325936,
"node_id": "MDQ6VXNlcjMyNTkzNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/325936?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coodoo",
"html_url": "https://github.com/coodoo",
"followers_url": "https://api.github.com/users/coodoo/followers",
"following_url": "https://api.github.com/users/coodoo/following{/other_user}",
"gists_url": "https://api.github.com/users/coodoo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/coodoo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coodoo/subscriptions",
"organizations_url": "https://api.github.com/users/coodoo/orgs",
"repos_url": "https://api.github.com/users/coodoo/repos",
"events_url": "https://api.github.com/users/coodoo/events{/privacy}",
"received_events_url": "https://api.github.com/users/coodoo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-09-12T02:12:20
| 2024-09-12T05:44:44
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After upgrading to the latest version `0.3.10`, with `OLLAMA_FLASH_ATTENTION=1` set in the environment, the tokens per second seem to have halved: in my experiment, the same code used to get around 23 tps and now gets only 11.
Is there any known regression with regard to FLASH_ATTENTION?
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.10
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6769/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6769/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1496
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1496/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1496/comments
|
https://api.github.com/repos/ollama/ollama/issues/1496/events
|
https://github.com/ollama/ollama/issues/1496
| 2,039,052,784
|
I_kwDOJ0Z1Ps55iXnw
| 1,496
|
Add Phi-2 model
|
{
"login": "hunnble",
"id": 14886515,
"node_id": "MDQ6VXNlcjE0ODg2NTE1",
"avatar_url": "https://avatars.githubusercontent.com/u/14886515?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hunnble",
"html_url": "https://github.com/hunnble",
"followers_url": "https://api.github.com/users/hunnble/followers",
"following_url": "https://api.github.com/users/hunnble/following{/other_user}",
"gists_url": "https://api.github.com/users/hunnble/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hunnble/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hunnble/subscriptions",
"organizations_url": "https://api.github.com/users/hunnble/orgs",
"repos_url": "https://api.github.com/users/hunnble/repos",
"events_url": "https://api.github.com/users/hunnble/events{/privacy}",
"received_events_url": "https://api.github.com/users/hunnble/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 19
| 2023-12-13T06:36:24
| 2023-12-24T22:04:57
| 2023-12-24T22:04:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The Phi-2 model performs well. Should we consider adding it to Ollama?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1496/reactions",
"total_count": 32,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 32,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1496/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7938
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7938/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7938/comments
|
https://api.github.com/repos/ollama/ollama/issues/7938/events
|
https://github.com/ollama/ollama/issues/7938
| 2,719,072,990
|
I_kwDOJ0Z1Ps6iEcLe
| 7,938
|
KV Cache quants run into issues every couple of messages.
|
{
"login": "SingularityMan",
"id": 91804288,
"node_id": "U_kgDOBXjSgA",
"avatar_url": "https://avatars.githubusercontent.com/u/91804288?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SingularityMan",
"html_url": "https://github.com/SingularityMan",
"followers_url": "https://api.github.com/users/SingularityMan/followers",
"following_url": "https://api.github.com/users/SingularityMan/following{/other_user}",
"gists_url": "https://api.github.com/users/SingularityMan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SingularityMan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SingularityMan/subscriptions",
"organizations_url": "https://api.github.com/users/SingularityMan/orgs",
"repos_url": "https://api.github.com/users/SingularityMan/repos",
"events_url": "https://api.github.com/users/SingularityMan/events{/privacy}",
"received_events_url": "https://api.github.com/users/SingularityMan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 7
| 2024-12-05T00:48:32
| 2024-12-12T17:13:48
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
This is the error message I run into when running my scripts with KV cache q4_0 or q8_0. These are the models:
```
gemma2:27b-instruct-q4_0 (will be switching to q4_K_S)
minicpm-v-2.6-8b-q8_0
```
```
Traceback (most recent call last):
File "C:\Users\user\PycharmProjects\vector_companion\vector_companion\main.py", line 564, in <module>
config.asyncio.run(main())
File "C:\Users\user\.conda\envs\vector_companion\lib\asyncio\runners.py", line 44, in run
return loop.run_until_complete(main)
File "C:\Users\user\.conda\envs\vector_companion\lib\asyncio\base_events.py", line 647, in run_until_complete
return future.result()
File "C:\Users\user\PycharmProjects\vector_companion\vector_companion\main.py", line 520, in main
await queue_agent_responses(
File "C:\Users\user\PycharmProjects\vector_companion\vector_companion\main.py", line 178, in queue_agent_responses
await config.asyncio.gather(process_sentences(), play_audio_queue())
File "C:\Users\user\PycharmProjects\vector_companion\vector_companion\main.py", line 157, in process_sentences
async for sentence in sentence_generator:
File "C:\Users\user\PycharmProjects\vector_companion\vector_companion\config\config.py", line 109, in fetch_stream
for chunk in stream:
File "C:\Users\user\.conda\envs\vector_companion\lib\site-packages\ollama\_client.py", line 90, in _stream
raise ResponseError(e)
ollama._types.ResponseError: an error was encountered while running the model: read tcp 127.0.0.1:34105->127.0.0.1:34102: wsarecv: An existing connection was forcibly closed by the remote host.
```
So when I look at the server log I see this:
```
C:\a\ollama\ollama\llama\ggml-cuda\cpy.cu:531: ggml_cuda_cpy: unsupported type combination (q4_0 to f32)
time=2024-12-04T19:38:14.673-05:00 level=DEBUG source=server.go:1092 msg="stopping llama server"
[GIN] 2024/12/04 - 19:38:14 | 200 | 5.073219s | 127.0.0.1 | POST "/api/chat"
time=2024-12-04T19:38:14.674-05:00 level=DEBUG source=sched.go:407 msg="context for request finished"
time=2024-12-04T19:38:14.674-05:00 level=DEBUG source=sched.go:339 msg="runner with non-zero duration has gone idle, adding timer" modelPath=C:\Users\carlo\.ollama\models\blobs\sha256-d7e4b00a7d7a8d03d4eed9b0f3f61a427e9f0fc5dea6aeb414e41dee23dc8ecc duration=2562047h47m16.854775807s
time=2024-12-04T19:38:14.674-05:00 level=DEBUG source=sched.go:357 msg="after processing request finished event" modelPath=C:\Users\carlo\.ollama\models\blobs\sha256-d7e4b00a7d7a8d03d4eed9b0f3f61a427e9f0fc5dea6aeb414e41dee23dc8ecc refCount=0
```
This is per the RC uploaded today.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.4.8 RC
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7938/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7938/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6815
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6815/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6815/comments
|
https://api.github.com/repos/ollama/ollama/issues/6815/events
|
https://github.com/ollama/ollama/issues/6815
| 2,526,960,064
|
I_kwDOJ0Z1Ps6WnlnA
| 6,815
|
Idea: Model Pre-Pulling on Startup
|
{
"login": "adrianliechti",
"id": 343775,
"node_id": "MDQ6VXNlcjM0Mzc3NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/343775?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adrianliechti",
"html_url": "https://github.com/adrianliechti",
"followers_url": "https://api.github.com/users/adrianliechti/followers",
"following_url": "https://api.github.com/users/adrianliechti/following{/other_user}",
"gists_url": "https://api.github.com/users/adrianliechti/gists{/gist_id}",
"starred_url": "https://api.github.com/users/adrianliechti/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adrianliechti/subscriptions",
"organizations_url": "https://api.github.com/users/adrianliechti/orgs",
"repos_url": "https://api.github.com/users/adrianliechti/repos",
"events_url": "https://api.github.com/users/adrianliechti/events{/privacy}",
"received_events_url": "https://api.github.com/users/adrianliechti/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-15T15:17:01
| 2024-12-11T20:41:35
| 2024-12-11T20:41:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
To ease Docker/Kubernetes based deployments, I would suggest a feature to automatically download/update models on startup if they do not yet exist. See here a poor man's implementation of that idea: https://github.com/adrianliechti/ollama/blob/main/main.go.
I could imagine this would also help in samples/documentation.
If you think this is a good idea but lack the resources, I could try a proposal.
Probably the biggest challenge is naming the env variable :)
Cheers, Adrian
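A minimal sketch of the idea, assuming a running server reachable via the `/api/pull` endpoint; the env variable name `OLLAMA_PRELOAD` and the helper functions are purely illustrative (naming is the open question), not an existing API:

```python
import json
import os
import urllib.request


def pull_models(names, pull):
    """Pull each named model, skipping blank entries; returns the list pulled."""
    pulled = []
    for name in names:
        name = name.strip()
        if not name:
            continue
        pull(name)  # delegate the actual pull so the logic is testable
        pulled.append(name)
    return pulled


def ollama_pull(name, host="http://localhost:11434"):
    """POST to the server's /api/pull endpoint (requires a running server)."""
    body = json.dumps({"name": name, "stream": False}).encode()
    req = urllib.request.Request(
        f"{host}/api/pull", data=body,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    # Hypothetical variable: comma-separated model names to pre-pull on startup.
    wanted = os.environ.get("OLLAMA_PRELOAD", "").split(",")
    pull_models(wanted, ollama_pull)
```

In a container entrypoint, this would run after the server starts and before the deployment is marked ready.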
|
{
"login": "adrianliechti",
"id": 343775,
"node_id": "MDQ6VXNlcjM0Mzc3NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/343775?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adrianliechti",
"html_url": "https://github.com/adrianliechti",
"followers_url": "https://api.github.com/users/adrianliechti/followers",
"following_url": "https://api.github.com/users/adrianliechti/following{/other_user}",
"gists_url": "https://api.github.com/users/adrianliechti/gists{/gist_id}",
"starred_url": "https://api.github.com/users/adrianliechti/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adrianliechti/subscriptions",
"organizations_url": "https://api.github.com/users/adrianliechti/orgs",
"repos_url": "https://api.github.com/users/adrianliechti/repos",
"events_url": "https://api.github.com/users/adrianliechti/events{/privacy}",
"received_events_url": "https://api.github.com/users/adrianliechti/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6815/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6815/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2362
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2362/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2362/comments
|
https://api.github.com/repos/ollama/ollama/issues/2362/events
|
https://github.com/ollama/ollama/issues/2362
| 2,118,751,236
|
I_kwDOJ0Z1Ps5-SZQE
| 2,362
|
Ollama Mixtral uses only 7% of the Nvidia RTX A4000 GPU.
|
{
"login": "nejib1",
"id": 10485460,
"node_id": "MDQ6VXNlcjEwNDg1NDYw",
"avatar_url": "https://avatars.githubusercontent.com/u/10485460?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nejib1",
"html_url": "https://github.com/nejib1",
"followers_url": "https://api.github.com/users/nejib1/followers",
"following_url": "https://api.github.com/users/nejib1/following{/other_user}",
"gists_url": "https://api.github.com/users/nejib1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nejib1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nejib1/subscriptions",
"organizations_url": "https://api.github.com/users/nejib1/orgs",
"repos_url": "https://api.github.com/users/nejib1/repos",
"events_url": "https://api.github.com/users/nejib1/events{/privacy}",
"received_events_url": "https://api.github.com/users/nejib1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-02-05T14:53:09
| 2024-03-12T21:34:18
| 2024-03-12T21:34:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
When I execute Ollama Mixtral with the Nvidia A4000 (16GB), I observe that only 7% of the GPU is utilized. Do you know why this might be happening? Additionally, the process seems somewhat slow. It appears that Ollama Mixtral is using 40% of the CPU but only 7% of the GPU.

Do you have any suggestions on how to increase GPU utilization instead of relying on the CPU?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2362/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2362/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4844
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4844/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4844/comments
|
https://api.github.com/repos/ollama/ollama/issues/4844/events
|
https://github.com/ollama/ollama/issues/4844
| 2,336,832,215
|
I_kwDOJ0Z1Ps6LSTrX
| 4,844
|
List models through openai-conform API?
|
{
"login": "arthurGrigo",
"id": 35745065,
"node_id": "MDQ6VXNlcjM1NzQ1MDY1",
"avatar_url": "https://avatars.githubusercontent.com/u/35745065?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/arthurGrigo",
"html_url": "https://github.com/arthurGrigo",
"followers_url": "https://api.github.com/users/arthurGrigo/followers",
"following_url": "https://api.github.com/users/arthurGrigo/following{/other_user}",
"gists_url": "https://api.github.com/users/arthurGrigo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/arthurGrigo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/arthurGrigo/subscriptions",
"organizations_url": "https://api.github.com/users/arthurGrigo/orgs",
"repos_url": "https://api.github.com/users/arthurGrigo/repos",
"events_url": "https://api.github.com/users/arthurGrigo/events{/privacy}",
"received_events_url": "https://api.github.com/users/arthurGrigo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-05T21:04:14
| 2024-06-05T21:10:39
| 2024-06-05T21:10:38
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Does the OpenAI-compatible API offer a way to get a list of deployed models?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4844/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4844/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7296
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7296/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7296/comments
|
https://api.github.com/repos/ollama/ollama/issues/7296/events
|
https://github.com/ollama/ollama/issues/7296
| 2,602,667,400
|
I_kwDOJ0Z1Ps6bIY2I
| 7,296
|
Ollama cannot detect RTX GPU card
|
{
"login": "wangqiang-kubota",
"id": 128361488,
"node_id": "U_kgDOB6akEA",
"avatar_url": "https://avatars.githubusercontent.com/u/128361488?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wangqiang-kubota",
"html_url": "https://github.com/wangqiang-kubota",
"followers_url": "https://api.github.com/users/wangqiang-kubota/followers",
"following_url": "https://api.github.com/users/wangqiang-kubota/following{/other_user}",
"gists_url": "https://api.github.com/users/wangqiang-kubota/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wangqiang-kubota/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wangqiang-kubota/subscriptions",
"organizations_url": "https://api.github.com/users/wangqiang-kubota/orgs",
"repos_url": "https://api.github.com/users/wangqiang-kubota/repos",
"events_url": "https://api.github.com/users/wangqiang-kubota/events{/privacy}",
"received_events_url": "https://api.github.com/users/wangqiang-kubota/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677677816,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgVG-A",
"url": "https://api.github.com/repos/ollama/ollama/labels/docker",
"name": "docker",
"color": "0052CC",
"default": false,
"description": "Issues relating to using ollama in containers"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-10-21T14:03:35
| 2024-10-22T04:36:50
| 2024-10-22T04:36:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have one RTX 4080 installed and am using the Ollama 0.3.14 Docker image. Ollama is running in CPU mode and the GPU cannot be detected. I entered the Ollama container and ran the command 'nvidia-smi'.
**GPU information:**
<img width="763" alt="image" src="https://github.com/user-attachments/assets/44c1b77c-d27d-4bbf-a50e-7d961e6a5633">
**Ollama start command :**
`docker run -d -e CUDA_VISIBLE_DEVICES=0 -e OLLAMA_DEBUG=1 --gpus=all -v /data/models:/root/.ollama -p 11434:11434 --restart unless-stopped --name ollama ollama/ollama`
**Ollama start logs:**
`2024/10/21 13:35:53 routes.go:1158: INFO server config env="map[CUDA_VISIBLE_DEVICES:0 GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:true OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434/ OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/root/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost/ https://localhost/ http://localhost/:* https://localhost/:* http://127.0.0.1/ https://127.0.0.1/ http://127.0.0.1/:* https://127.0.0.1/:* http://0.0.0.0/ https://0.0.0.0/ http://0.0.0.0/:* https://0.0.0.0/:* app://* file://* tauri://*] OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2024-10-21T13:35:53.445Z level=INFO source=images.go:754 msg="total blobs: 10"
time=2024-10-21T13:35:53.445Z level=INFO source=images.go:761 msg="total unused blobs removed: 0"
time=2024-10-21T13:35:53.445Z level=INFO source=routes.go:1205 msg="Listening on [::]:11434 (version 0.3.14)"
time=2024-10-21T13:35:53.445Z level=DEBUG source=common.go:294 msg="availableServers : found" file=/usr/lib/ollama/runners/cpu/ollama_llama_server
time=2024-10-21T13:35:53.445Z level=DEBUG source=common.go:294 msg="availableServers : found" file=/usr/lib/ollama/runners/cpu_avx/ollama_llama_server
time=2024-10-21T13:35:53.445Z level=DEBUG source=common.go:294 msg="availableServers : found" file=/usr/lib/ollama/runners/cpu_avx2/ollama_llama_server
time=2024-10-21T13:35:53.445Z level=DEBUG source=common.go:294 msg="availableServers : found" file=/usr/lib/ollama/runners/cuda_v11/ollama_llama_server
time=2024-10-21T13:35:53.445Z level=DEBUG source=common.go:294 msg="availableServers : found" file=/usr/lib/ollama/runners/cuda_v12/ollama_llama_server
time=2024-10-21T13:35:53.445Z level=INFO source=common.go:49 msg="Dynamic LLM libraries" runners="[cuda_v11 cuda_v12 cpu cpu_avx cpu_avx2]"
time=2024-10-21T13:35:53.445Z level=DEBUG source=common.go:50 msg="Override detection logic by setting OLLAMA_LLM_LIBRARY"
time=2024-10-21T13:35:53.445Z level=DEBUG source=sched.go:105 msg="starting llm scheduler"
time=2024-10-21T13:35:53.445Z level=INFO source=gpu.go:221 msg="looking for compatible GPUs"
time=2024-10-21T13:35:53.446Z level=DEBUG source=gpu.go:94 msg="searching for GPU discovery libraries for NVIDIA"
time=2024-10-21T13:35:53.446Z level=DEBUG source=gpu.go:505 msg="Searching for GPU library" name=libcuda.so*
time=2024-10-21T13:35:53.446Z level=DEBUG source=gpu.go:528 msg="gpu library search" globs="[/usr/lib/ollama/libcuda.so* /usr/local/nvidia/lib/libcuda.so* /usr/local/nvidia/lib64/libcuda.so* /usr/local/cuda*/targets/*/lib/libcuda.so* /usr/lib/*-linux-gnu/nvidia/current/libcuda.so* /usr/lib/*-linux-gnu/libcuda.so* /usr/lib/wsl/lib/libcuda.so* /usr/lib/wsl/drivers/*/libcuda.so* /opt/cuda/lib*/libcuda.so* /usr/local/cuda/lib*/libcuda.so* /usr/lib*/libcuda.so* /usr/local/lib*/libcuda.so*]"
time=2024-10-21T13:35:53.446Z level=DEBUG source=gpu.go:562 msg="discovered GPU libraries" paths=[/usr/lib/x86_64-linux-gnu/libcuda.so.535.171.04]
cuInit err: 3
time=2024-10-21T13:35:53.452Z level=INFO source=gpu.go:616 msg="Unable to load cudart library /usr/lib/x86_64-linux-gnu/libcuda.so.535.171.04: cuda driver library init failure: 3"
time=2024-10-21T13:35:53.452Z level=DEBUG source=gpu.go:505 msg="Searching for GPU library" name=libcudart.so*
time=2024-10-21T13:35:53.452Z level=DEBUG source=gpu.go:528 msg="gpu library search" globs="[/usr/lib/ollama/libcudart.so* /usr/local/nvidia/lib/libcudart.so* /usr/local/nvidia/lib64/libcudart.so* /usr/lib/ollama/libcudart.so* /usr/local/cuda/lib64/libcudart.so* /usr/lib/x86_64-linux-gnu/nvidia/current/libcudart.so* /usr/lib/x86_64-linux-gnu/libcudart.so* /usr/lib/wsl/lib/libcudart.so* /usr/lib/wsl/drivers/*/libcudart.so* /opt/cuda/lib64/libcudart.so* /usr/local/cuda*/targets/aarch64-linux/lib/libcudart.so* /usr/lib/aarch64-linux-gnu/nvidia/current/libcudart.so* /usr/lib/aarch64-linux-gnu/libcudart.so* /usr/local/cuda/lib*/libcudart.so* /usr/lib*/libcudart.so* /usr/local/lib*/libcudart.so*]"
time=2024-10-21T13:35:53.452Z level=DEBUG source=gpu.go:562 msg="discovered GPU libraries" paths="[/usr/lib/ollama/libcudart.so.12.4.99 /usr/lib/ollama/libcudart.so.11.3.109]"
cudaSetDevice err: 3
time=2024-10-21T13:35:53.457Z level=DEBUG source=gpu.go:578 msg="Unable to load cudart library /usr/lib/ollama/libcudart.so.12.4.99: cudart init failure: 3"
cudaSetDevice err: 3
time=2024-10-21T13:35:53.462Z level=DEBUG source=gpu.go:578 msg="Unable to load cudart library /usr/lib/ollama/libcudart.so.11.3.109: cudart init failure: 3"
time=2024-10-21T13:35:53.462Z level=DEBUG source=amd_linux.go:416 msg="amdgpu driver not detected /sys/module/amdgpu"
time=2024-10-21T13:35:53.462Z level=INFO source=gpu.go:384 msg="no compatible GPUs were discovered"
time=2024-10-21T13:35:53.462Z level=INFO source=types.go:123 msg="inference compute" id=0 library=cpu variant=avx2 compute="" driver=0.0 name="" total="15.6 GiB" available="9.9 GiB"`
When I run an LLM request, it always shows the information below (100% CPU in the processor field, and the model was loaded on the CPU):
<img width="949" alt="image" src="https://github.com/user-attachments/assets/fd73106d-48ab-4048-b94f-76c8ca80d858">
**Operation System:**
<img width="523" alt="image" src="https://github.com/user-attachments/assets/60de1337-cdbb-47bb-965b-1f415ab98e63">
If any more information is needed, please let me know. Thank you in advance.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14
|
{
"login": "wangqiang-kubota",
"id": 128361488,
"node_id": "U_kgDOB6akEA",
"avatar_url": "https://avatars.githubusercontent.com/u/128361488?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wangqiang-kubota",
"html_url": "https://github.com/wangqiang-kubota",
"followers_url": "https://api.github.com/users/wangqiang-kubota/followers",
"following_url": "https://api.github.com/users/wangqiang-kubota/following{/other_user}",
"gists_url": "https://api.github.com/users/wangqiang-kubota/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wangqiang-kubota/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wangqiang-kubota/subscriptions",
"organizations_url": "https://api.github.com/users/wangqiang-kubota/orgs",
"repos_url": "https://api.github.com/users/wangqiang-kubota/repos",
"events_url": "https://api.github.com/users/wangqiang-kubota/events{/privacy}",
"received_events_url": "https://api.github.com/users/wangqiang-kubota/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7296/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7296/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6558
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6558/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6558/comments
|
https://api.github.com/repos/ollama/ollama/issues/6558/events
|
https://github.com/ollama/ollama/issues/6558
| 2,494,975,471
|
I_kwDOJ0Z1Ps6Utk3v
| 6,558
|
Multiple Nvidia GPUs, 56GB VRAM, gemma2:27b
|
{
"login": "paulopais",
"id": 28353191,
"node_id": "MDQ6VXNlcjI4MzUzMTkx",
"avatar_url": "https://avatars.githubusercontent.com/u/28353191?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/paulopais",
"html_url": "https://github.com/paulopais",
"followers_url": "https://api.github.com/users/paulopais/followers",
"following_url": "https://api.github.com/users/paulopais/following{/other_user}",
"gists_url": "https://api.github.com/users/paulopais/gists{/gist_id}",
"starred_url": "https://api.github.com/users/paulopais/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/paulopais/subscriptions",
"organizations_url": "https://api.github.com/users/paulopais/orgs",
"repos_url": "https://api.github.com/users/paulopais/repos",
"events_url": "https://api.github.com/users/paulopais/events{/privacy}",
"received_events_url": "https://api.github.com/users/paulopais/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 15
| 2024-08-29T16:08:03
| 2024-09-03T21:55:21
| 2024-09-03T21:55:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
Error: cudaMalloc failed: out of memory
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.8
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6558/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6558/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8527
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8527/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8527/comments
|
https://api.github.com/repos/ollama/ollama/issues/8527/events
|
https://github.com/ollama/ollama/issues/8527
| 2,803,267,929
|
I_kwDOJ0Z1Ps6nFnlZ
| 8,527
|
Log tracking
|
{
"login": "poo0054",
"id": 85894494,
"node_id": "MDQ6VXNlcjg1ODk0NDk0",
"avatar_url": "https://avatars.githubusercontent.com/u/85894494?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/poo0054",
"html_url": "https://github.com/poo0054",
"followers_url": "https://api.github.com/users/poo0054/followers",
"following_url": "https://api.github.com/users/poo0054/following{/other_user}",
"gists_url": "https://api.github.com/users/poo0054/gists{/gist_id}",
"starred_url": "https://api.github.com/users/poo0054/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/poo0054/subscriptions",
"organizations_url": "https://api.github.com/users/poo0054/orgs",
"repos_url": "https://api.github.com/users/poo0054/repos",
"events_url": "https://api.github.com/users/poo0054/events{/privacy}",
"received_events_url": "https://api.github.com/users/poo0054/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2025-01-22T03:18:02
| 2025-01-22T03:18:02
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have an idea: could a traceId be added when printing logs? This would make it easier to trace the details of subsequent requests.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8527/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8527/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1847
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1847/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1847/comments
|
https://api.github.com/repos/ollama/ollama/issues/1847/events
|
https://github.com/ollama/ollama/issues/1847
| 2,069,341,492
|
I_kwDOJ0Z1Ps57V6U0
| 1,847
|
API equivalent of Ctrl+C? (stopping response stream before completion)
|
{
"login": "houstonhaynes",
"id": 8174976,
"node_id": "MDQ6VXNlcjgxNzQ5NzY=",
"avatar_url": "https://avatars.githubusercontent.com/u/8174976?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/houstonhaynes",
"html_url": "https://github.com/houstonhaynes",
"followers_url": "https://api.github.com/users/houstonhaynes/followers",
"following_url": "https://api.github.com/users/houstonhaynes/following{/other_user}",
"gists_url": "https://api.github.com/users/houstonhaynes/gists{/gist_id}",
"starred_url": "https://api.github.com/users/houstonhaynes/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/houstonhaynes/subscriptions",
"organizations_url": "https://api.github.com/users/houstonhaynes/orgs",
"repos_url": "https://api.github.com/users/houstonhaynes/repos",
"events_url": "https://api.github.com/users/houstonhaynes/events{/privacy}",
"received_events_url": "https://api.github.com/users/houstonhaynes/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-07T23:27:52
| 2024-01-08T13:40:01
| 2024-01-08T13:40:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is there an API equivalent of the console's Ctrl+C to stop a streamed response before completion? What's the recommended practice?
Thanks!
|
{
"login": "houstonhaynes",
"id": 8174976,
"node_id": "MDQ6VXNlcjgxNzQ5NzY=",
"avatar_url": "https://avatars.githubusercontent.com/u/8174976?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/houstonhaynes",
"html_url": "https://github.com/houstonhaynes",
"followers_url": "https://api.github.com/users/houstonhaynes/followers",
"following_url": "https://api.github.com/users/houstonhaynes/following{/other_user}",
"gists_url": "https://api.github.com/users/houstonhaynes/gists{/gist_id}",
"starred_url": "https://api.github.com/users/houstonhaynes/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/houstonhaynes/subscriptions",
"organizations_url": "https://api.github.com/users/houstonhaynes/orgs",
"repos_url": "https://api.github.com/users/houstonhaynes/repos",
"events_url": "https://api.github.com/users/houstonhaynes/events{/privacy}",
"received_events_url": "https://api.github.com/users/houstonhaynes/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1847/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1847/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/240
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/240/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/240/comments
|
https://api.github.com/repos/ollama/ollama/issues/240/events
|
https://github.com/ollama/ollama/issues/240
| 1,827,620,862
|
I_kwDOJ0Z1Ps5s70f-
| 240
|
Why does the app add itself to the "Open at Login" items on Mac?
|
{
"login": "1234igor",
"id": 25590175,
"node_id": "MDQ6VXNlcjI1NTkwMTc1",
"avatar_url": "https://avatars.githubusercontent.com/u/25590175?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/1234igor",
"html_url": "https://github.com/1234igor",
"followers_url": "https://api.github.com/users/1234igor/followers",
"following_url": "https://api.github.com/users/1234igor/following{/other_user}",
"gists_url": "https://api.github.com/users/1234igor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/1234igor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/1234igor/subscriptions",
"organizations_url": "https://api.github.com/users/1234igor/orgs",
"repos_url": "https://api.github.com/users/1234igor/repos",
"events_url": "https://api.github.com/users/1234igor/events{/privacy}",
"received_events_url": "https://api.github.com/users/1234igor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-07-29T19:40:26
| 2023-07-30T02:27:53
| 2023-07-30T02:27:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/240/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/240/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6476
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6476/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6476/comments
|
https://api.github.com/repos/ollama/ollama/issues/6476/events
|
https://github.com/ollama/ollama/pull/6476
| 2,483,325,301
|
PR_kwDOJ0Z1Ps55QbxX
| 6,476
|
fix: remove duplicated func call
|
{
"login": "alwqx",
"id": 9915368,
"node_id": "MDQ6VXNlcjk5MTUzNjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/9915368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alwqx",
"html_url": "https://github.com/alwqx",
"followers_url": "https://api.github.com/users/alwqx/followers",
"following_url": "https://api.github.com/users/alwqx/following{/other_user}",
"gists_url": "https://api.github.com/users/alwqx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alwqx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alwqx/subscriptions",
"organizations_url": "https://api.github.com/users/alwqx/orgs",
"repos_url": "https://api.github.com/users/alwqx/repos",
"events_url": "https://api.github.com/users/alwqx/events{/privacy}",
"received_events_url": "https://api.github.com/users/alwqx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-08-23T14:54:45
| 2024-08-26T02:58:02
| 2024-08-25T16:06:59
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6476",
"html_url": "https://github.com/ollama/ollama/pull/6476",
"diff_url": "https://github.com/ollama/ollama/pull/6476.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6476.patch",
"merged_at": null
}
|
Hi, `fn(api.ProgressResponse{Status: "converting model"})` is called twice, at line 101 and line 111. This change removes the duplicate call.
|
{
"login": "alwqx",
"id": 9915368,
"node_id": "MDQ6VXNlcjk5MTUzNjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/9915368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alwqx",
"html_url": "https://github.com/alwqx",
"followers_url": "https://api.github.com/users/alwqx/followers",
"following_url": "https://api.github.com/users/alwqx/following{/other_user}",
"gists_url": "https://api.github.com/users/alwqx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alwqx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alwqx/subscriptions",
"organizations_url": "https://api.github.com/users/alwqx/orgs",
"repos_url": "https://api.github.com/users/alwqx/repos",
"events_url": "https://api.github.com/users/alwqx/events{/privacy}",
"received_events_url": "https://api.github.com/users/alwqx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6476/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6476/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7737
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7737/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7737/comments
|
https://api.github.com/repos/ollama/ollama/issues/7737/events
|
https://github.com/ollama/ollama/issues/7737
| 2,671,288,649
|
I_kwDOJ0Z1Ps6fOKFJ
| 7,737
|
Installation script breaks in devcontainer with terminal color issue
|
{
"login": "loujaybee",
"id": 5528307,
"node_id": "MDQ6VXNlcjU1MjgzMDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5528307?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/loujaybee",
"html_url": "https://github.com/loujaybee",
"followers_url": "https://api.github.com/users/loujaybee/followers",
"following_url": "https://api.github.com/users/loujaybee/following{/other_user}",
"gists_url": "https://api.github.com/users/loujaybee/gists{/gist_id}",
"starred_url": "https://api.github.com/users/loujaybee/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/loujaybee/subscriptions",
"organizations_url": "https://api.github.com/users/loujaybee/orgs",
"repos_url": "https://api.github.com/users/loujaybee/repos",
"events_url": "https://api.github.com/users/loujaybee/events{/privacy}",
"received_events_url": "https://api.github.com/users/loujaybee/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-19T08:43:48
| 2024-11-20T07:56:07
| 2024-11-19T16:33:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
This PR https://github.com/ollama/ollama/pull/6693 appears to break installation of Ollama in a devcontainer.
I can reproduce it consistently with a minimal configuration:
devcontainer.json
```json
{
"dockerFile": "Dockerfile"
}
```
Dockerfile
```docker
FROM mcr.microsoft.com/devcontainers/javascript-node:0-20
# Install Ollama
RUN curl -fsSL https://ollama.com/install.sh | sh
```
I believe this is related to how `tput` behaves when running in a devcontainer build.
I've found that the following fixes the issue:
`RUN curl -fsSL https://ollama.com/install.sh | TERM=xterm sh`
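A defensive pattern for such scripts (a sketch only, not the actual `install.sh` code; `supports_color` and `status` are hypothetical helper names) is to probe `tput` before relying on it, so the script degrades to plain output when the terminal database is unusable:

```shell
# Sketch: fall back to uncolored output when tput cannot be used, e.g. in a
# devcontainer build where stdout is not a TTY or TERM is unset or "dumb".
supports_color() {
  [ -t 1 ] && command -v tput >/dev/null 2>&1 && tput setaf 1 >/dev/null 2>&1
}

if supports_color; then
  red="$(tput setaf 1)" plain="$(tput sgr0)"
else
  red="" plain=""
fi

# Print a status line, colored only when the terminal supports it.
status() { printf '%s%s%s\n' "$red" "$1" "$plain"; }
status ">>> Installing ollama"
```

With a guard like this, the `TERM=xterm` workaround would become unnecessary, because a failing `tput` no longer aborts the pipeline.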
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.4.2
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7737/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7737/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5914
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5914/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5914/comments
|
https://api.github.com/repos/ollama/ollama/issues/5914/events
|
https://github.com/ollama/ollama/issues/5914
| 2,427,719,753
|
I_kwDOJ0Z1Ps6QtBBJ
| 5,914
|
Alias names for models
|
{
"login": "jpummill",
"id": 9271237,
"node_id": "MDQ6VXNlcjkyNzEyMzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/9271237?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jpummill",
"html_url": "https://github.com/jpummill",
"followers_url": "https://api.github.com/users/jpummill/followers",
"following_url": "https://api.github.com/users/jpummill/following{/other_user}",
"gists_url": "https://api.github.com/users/jpummill/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jpummill/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jpummill/subscriptions",
"organizations_url": "https://api.github.com/users/jpummill/orgs",
"repos_url": "https://api.github.com/users/jpummill/repos",
"events_url": "https://api.github.com/users/jpummill/events{/privacy}",
"received_events_url": "https://api.github.com/users/jpummill/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-07-24T14:16:44
| 2024-09-04T03:38:41
| 2024-09-04T03:38:40
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Model names are hard to remember. They can be very long and somewhat cryptic.
For example, I may have the following models on my system for testing:
mistral-nemo:12b-instruct-2407-q3_K_S
mistral-nemo:12b-instruct-2407-q4_K_S
mistral-nemo:12b-instruct-2407-q5_K_M
I think it would be helpful to be able to create alias names for models; for the models above I might use mn12-3, mn12-4, and mn12-5.
The true name would remain the canonical reference, but the list output could show an additional column for the alias, and Ollama would accept either name when loading a model.
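One possible shape for this (a hypothetical sketch, not Ollama's implementation; the `aliases` map and `resolve` helper are illustrative names) is a user-maintained alias table consulted before the normal name lookup:

```go
package main

import "fmt"

// aliases maps short user-defined names to full model names.
// In practice this table could live in a config file alongside the models.
var aliases = map[string]string{
	"mn12-3": "mistral-nemo:12b-instruct-2407-q3_K_S",
	"mn12-4": "mistral-nemo:12b-instruct-2407-q4_K_S",
	"mn12-5": "mistral-nemo:12b-instruct-2407-q5_K_M",
}

// resolve returns the full model name for an alias, and passes
// anything that is not an alias through unchanged.
func resolve(name string) string {
	if full, ok := aliases[name]; ok {
		return full
	}
	return name
}

func main() {
	fmt.Println(resolve("mn12-4")) // resolves to the full quantized model name
	fmt.Println(resolve("llama3")) // not an alias: returned as-is
}
```

Because non-alias names pass through untouched, existing workflows that use full model names would keep working.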
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5914/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5914/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2643
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2643/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2643/comments
|
https://api.github.com/repos/ollama/ollama/issues/2643/events
|
https://github.com/ollama/ollama/issues/2643
| 2,147,196,144
|
I_kwDOJ0Z1Ps5_-5zw
| 2,643
|
SIGSEGV: when running new Gemma 7B instruct model on Mac - Apple M2 Pro
|
{
"login": "fooinha",
"id": 8560528,
"node_id": "MDQ6VXNlcjg1NjA1Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8560528?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fooinha",
"html_url": "https://github.com/fooinha",
"followers_url": "https://api.github.com/users/fooinha/followers",
"following_url": "https://api.github.com/users/fooinha/following{/other_user}",
"gists_url": "https://api.github.com/users/fooinha/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fooinha/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fooinha/subscriptions",
"organizations_url": "https://api.github.com/users/fooinha/orgs",
"repos_url": "https://api.github.com/users/fooinha/repos",
"events_url": "https://api.github.com/users/fooinha/events{/privacy}",
"received_events_url": "https://api.github.com/users/fooinha/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-02-21T16:31:53
| 2024-02-21T23:19:23
| 2024-02-21T23:19:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
OLLAMA_HOST=127.0.0.1:11434 ./Ollama serve
2024-02-21 16:30:07.726 Ollama[57354:11721047] WARNING: Secure coding is not enabled for restorable state! Enable secure coding by implementing NSApplicationDelegate.applicationSupportsSecureRestorableState: and returning YES.
time=2024-02-21T16:30:13.732Z level=INFO source=images.go:706 msg="total blobs: 28"
time=2024-02-21T16:30:13.741Z level=INFO source=images.go:713 msg="total unused blobs removed: 0"
time=2024-02-21T16:30:13.744Z level=INFO source=routes.go:1014 msg="Listening on 127.0.0.1:11434 (version 0.1.25)"
time=2024-02-21T16:30:13.744Z level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
time=2024-02-21T16:30:13.759Z level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [metal]"
[GIN] 2024/02/21 - 16:30:31 | 200 | 1.418541ms | 127.0.0.1 | HEAD "/"
[GIN] 2024/02/21 - 16:30:31 | 200 | 5.206709ms | 127.0.0.1 | POST "/api/show"
[GIN] 2024/02/21 - 16:30:31 | 200 | 1.440625ms | 127.0.0.1 | POST "/api/show"
time=2024-02-21T16:30:32.690Z level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: /var/folders/8x/8wzccvjd5q91rs8g53hj8gtr0000gp/T/ollama3945128941/metal/libext_server.dylib"
time=2024-02-21T16:30:32.690Z level=INFO source=dyn_ext_server.go:145 msg="Initializing llama server"
SIGSEGV: segmentation violation
PC=0x181adcdc4 m=12 sigcode=2
signal arrived during cgo execution
goroutine 43 [syscall]:
runtime.cgocall(0x1045988fc, 0x140004aa708)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/cgocall.go:157 +0x44 fp=0x140004aa6d0 sp=0x140004aa690 pc=0x104055fb4
github.com/jmorganca/ollama/llm._Cfunc_dyn_llama_server_init(
{0x84550370, 0x10c186534, 0x10c186fe8, 0x10c1870c8, 0x10c1872bc, 0x10c187df8, 0x10c188e58, 0x10c188e44, 0x10c188f08, 0x10c18991c, ...}, ...)
_cgo_gotypes.go:288 +0x30 fp=0x140004aa700 sp=0x140004aa6d0 pc=0x1043c9fd0
github.com/jmorganca/ollama/llm.newDynExtServer.func7(0x1045b2d9b?, 0xc?)
/Users/jmorgan/workspace/ollama/llm/dyn_ext_server.go:148 +0xe0 fp=0x140004aa7f0 sp=0x140004aa700 pc=0x1043cb220
github.com/jmorganca/ollama/llm.newDynExtServer({0x14000030240, 0x5b}, {0x14000294280, 0x71}, {0x0, 0x0, _}, {_, _, _}, ...)
/Users/jmorgan/workspace/ollama/llm/dyn_ext_server.go:148 +0x8f8 fp=0x140004aaa90 sp=0x140004aa7f0 pc=0x1043caf28
github.com/jmorganca/ollama/llm.newLlmServer({{0x0, 0x0, 0x0}, {_, _}, {_, _}}, {_, _}, {_, ...}, ...)
/Users/jmorgan/workspace/ollama/llm/llm.go:158 +0x308 fp=0x140004aac50 sp=0x140004aaa90 pc=0x1043c7918
github.com/jmorganca/ollama/llm.New({0x14000444640, 0x41}, {0x14000294280, 0x71}, {0x0, 0x0, _}, {_, _, _}, ...)
/Users/jmorgan/workspace/ollama/llm/llm.go:123 +0x3d8 fp=0x140004aae90 sp=0x140004aac50 pc=0x1043c7468
github.com/jmorganca/ollama/server.load(0x14000316000?, 0x14000316000, {{0x0, 0x1770, 0x200, 0x1, 0xffffffffffffffff, 0x0, 0x0, 0x1, ...}, ...}, ...)
/Users/jmorgan/workspace/ollama/server/routes.go:85 +0x308 fp=0x140004ab010 sp=0x140004aae90 pc=0x104577ed8
github.com/jmorganca/ollama/server.ChatHandler(0x140002d4100)
/Users/jmorgan/workspace/ollama/server/routes.go:1163 +0x528 fp=0x140004ab720 sp=0x140004ab010 pc=0x1045814b8
github.com/gin-gonic/gin.(*Context).Next(...)
/Users/jmorgan/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func1(0x140002d4100)
/Users/jmorgan/workspace/ollama/server/routes.go:938 +0x78 fp=0x140004ab760 sp=0x140004ab720 pc=0x104580308
github.com/gin-gonic/gin.(*Context).Next(...)
/Users/jmorgan/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.CustomRecoveryWithWriter.func1(0x140002d4100)
/Users/jmorgan/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/recovery.go:102 +0x80 fp=0x140004ab7b0 sp=0x140004ab760 pc=0x10455c5b0
github.com/gin-gonic/gin.(*Context).Next(...)
/Users/jmorgan/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.LoggerWithConfig.func1(0x140002d4100)
/Users/jmorgan/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/logger.go:240 +0xb0 fp=0x140004ab960 sp=0x140004ab7b0 pc=0x10455b950
github.com/gin-gonic/gin.(*Context).Next(...)
/Users/jmorgan/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.(*Engine).handleHTTPRequest(0x14000480820, 0x140002d4100)
/Users/jmorgan/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/gin.go:620 +0x524 fp=0x140004abaf0 sp=0x140004ab960 pc=0x10455aa84
github.com/gin-gonic/gin.(*Engine).ServeHTTP(0x14000480820, {0x1048fd390?, 0x140001322a0}, 0x140002d4000)
/Users/jmorgan/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/gin.go:576 +0x1a0 fp=0x140004abb30 sp=0x140004abaf0 pc=0x10455a3d0
net/http.serverHandler.ServeHTTP({0x1048fb660?}, {0x1048fd390?, 0x140001322a0?}, 0x6?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:2938 +0xbc fp=0x140004abb60 sp=0x140004abb30 pc=0x1042e389c
net/http.(*conn).serve(0x14000496cf0, {0x1048fe968, 0x1400032c7e0})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:2009 +0x518 fp=0x140004abfa0 sp=0x140004abb60 pc=0x1042dfc98
net/http.(*Server).Serve.func3()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:3086 +0x30 fp=0x140004abfd0 sp=0x140004abfa0 pc=0x1042e3fb0
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140004abfd0 sp=0x140004abfd0 pc=0x1040bb954
created by net/http.(*Server).Serve in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:3086 +0x4cc
goroutine 1 [IO wait]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x140002d17b0 sp=0x140002d1790 pc=0x10408a498
runtime.netpollblock(0x14000465848?, 0x413fe44?, 0x1?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/netpoll.go:564 +0x158 fp=0x140002d17f0 sp=0x140002d17b0 pc=0x104083b78
internal/poll.runtime_pollWait(0x10bb70aa0, 0x72)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/netpoll.go:343 +0xa0 fp=0x140002d1820 sp=0x140002d17f0 pc=0x1040b5380
internal/poll.(*pollDesc).wait(0x14000422000?, 0x0?, 0x0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x140002d1850 sp=0x140002d1820 pc=0x10413b4a8
internal/poll.(*pollDesc).waitRead(...)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x14000422000)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_unix.go:611 +0x250 fp=0x140002d1900 sp=0x140002d1850 pc=0x10413ff30
net.(*netFD).accept(0x14000422000)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/fd_unix.go:172 +0x28 fp=0x140002d19c0 sp=0x140002d1900 pc=0x1041ab998
net.(*TCPListener).accept(0x140003f5520)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/tcpsock_posix.go:152 +0x28 fp=0x140002d19f0 sp=0x140002d19c0 pc=0x1041bfa18
net.(*TCPListener).Accept(0x140003f5520)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/tcpsock.go:315 +0x2c fp=0x140002d1a30 sp=0x140002d19f0 pc=0x1041bebfc
net/http.(*onceCloseListener).Accept(0x14000496cf0?)
<autogenerated>:1 +0x30 fp=0x140002d1a50 sp=0x140002d1a30 pc=0x104305910
net/http.(*Server).Serve(0x1400007a000, {0x1048fd180, 0x140003f5520})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:3056 +0x2b8 fp=0x140002d1b80 sp=0x140002d1a50 pc=0x1042e3c58
github.com/jmorganca/ollama/server.Serve({0x1048fd180, 0x140003f5520})
/Users/jmorgan/workspace/ollama/server/routes.go:1041 +0x394 fp=0x140002d1c70 sp=0x140002d1b80 pc=0x104580704
github.com/jmorganca/ollama/cmd.RunServer(0x14000420300?, {0x104d2c780?, 0x4?, 0x10459aa68?})
/Users/jmorgan/workspace/ollama/cmd/cmd.go:705 +0x178 fp=0x140002d1d10 sp=0x140002d1c70 pc=0x104590e78
github.com/spf13/cobra.(*Command).execute(0x140003b7800, {0x104d2c780, 0x0, 0x0})
/Users/jmorgan/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:940 +0x658 fp=0x140002d1e50 sp=0x140002d1d10 pc=0x104375258
github.com/spf13/cobra.(*Command).ExecuteC(0x140003b6c00)
/Users/jmorgan/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:1068 +0x320 fp=0x140002d1f10 sp=0x140002d1e50 pc=0x104375980
github.com/spf13/cobra.(*Command).Execute(...)
/Users/jmorgan/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(...)
/Users/jmorgan/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:985
main.main()
/Users/jmorgan/workspace/ollama/main.go:11 +0x54 fp=0x140002d1f30 sp=0x140002d1f10 pc=0x1045981a4
runtime.main()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:267 +0x2bc fp=0x140002d1fd0 sp=0x140002d1f30 pc=0x10408a06c
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140002d1fd0 sp=0x140002d1fd0 pc=0x1040bb954
goroutine 2 [force gc (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x1400005af90 sp=0x1400005af70 pc=0x10408a498
runtime.goparkunlock(...)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:404
runtime.forcegchelper()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:322 +0xb8 fp=0x1400005afd0 sp=0x1400005af90 pc=0x10408a328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x1400005afd0 sp=0x1400005afd0 pc=0x1040bb954
created by runtime.init.6
in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:310 +0x24
goroutine 18 [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000056760 sp=0x14000056740 pc=0x10408a498
runtime.goparkunlock(...)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:404
runtime.bgsweep(0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgcsweep.go:321 +0x108 fp=0x140000567b0 sp=0x14000056760 pc=0x104076c78
runtime.gcenable.func1()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:200 +0x28 fp=0x140000567d0 sp=0x140000567b0 pc=0x10406b6d8
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140000567d0 sp=0x140000567d0 pc=0x1040bb954
created by runtime.gcenable in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:200 +0x6c
goroutine 19 [GC scavenge wait]:
runtime.gopark(0x44b6fe4?, 0x6553f100?, 0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000056f50 sp=0x14000056f30 pc=0x10408a498
runtime.goparkunlock(...)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:404
runtime.(*scavengerState).park(0x104cf69c0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgcscavenge.go:425 +0x5c fp=0x14000056f80 sp=0x14000056f50 pc=0x10407447c
runtime.bgscavenge(0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgcscavenge.go:658 +0xac fp=0x14000056fb0 sp=0x14000056f80 pc=0x104074a3c
runtime.gcenable.func2()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:201 +0x28 fp=0x14000056fd0 sp=0x14000056fb0 pc=0x10406b678
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x14000056fd0 sp=0x14000056fd0 pc=0x1040bb954
created by runtime.gcenable in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:201 +0xac
goroutine 20 [finalizer wait]:
runtime.gopark(0x14000092820?, 0x1a0?, 0xe8?, 0xa5?, 0x10432a82c?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x1400005a580 sp=0x1400005a560 pc=0x10408a498
runtime.runfinq()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mfinal.go:193 +0x108 fp=0x1400005a7d0 sp=0x1400005a580 pc=0x10406a7c8
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x1400005a7d0 sp=0x1400005a7d0 pc=0x1040bb954
created by runtime.createfing in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mfinal.go:163 +0x80
goroutine 21 [GC worker (idle)]:
runtime.gopark(0x1b941735489e9?, 0x3?, 0xb4?, 0x2d?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000057730 sp=0x14000057710 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x140000577d0 sp=0x14000057730 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140000577d0 sp=0x140000577d0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 34 [GC worker (idle)]:
runtime.gopark(0x1b9417354a6e1?, 0x3?, 0xd9?, 0x40?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000486730 sp=0x14000486710 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x140004867d0 sp=0x14000486730 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140004867d0 sp=0x140004867d0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 22 [GC worker (idle)]:
runtime.gopark(0x1b94173558683?, 0x1?, 0x36?, 0x5?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000057f30 sp=0x14000057f10 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x14000057fd0 sp=0x14000057f30 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x14000057fd0 sp=0x14000057fd0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 23 [GC worker (idle)]:
runtime.gopark(0x1b941734e3a06?, 0x3?, 0xf2?, 0xa2?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000058730 sp=0x14000058710 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x140000587d0 sp=0x14000058730 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140000587d0 sp=0x140000587d0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 35 [GC worker (idle)]:
runtime.gopark(0x1b94173531bdc?, 0x3?, 0x5d?, 0xa5?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000486f30 sp=0x14000486f10 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x14000486fd0 sp=0x14000486f30 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x14000486fd0 sp=0x14000486fd0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 3 [GC worker (idle)]:
runtime.gopark(0x1b94173531e77?, 0x3?, 0xce?, 0x88?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x1400005b730 sp=0x1400005b710 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x1400005b7d0 sp=0x1400005b730 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x1400005b7d0 sp=0x1400005b7d0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 36 [GC worker (idle)]:
runtime.gopark(0x104d2e6a0?, 0x1?, 0xd4?, 0x71?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000487730 sp=0x14000487710 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x140004877d0 sp=0x14000487730 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140004877d0 sp=0x140004877d0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 4 [GC worker (idle)]:
runtime.gopark(0x1b94173531c83?, 0x3?, 0x2c?, 0xbc?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x1400005bf30 sp=0x1400005bf10 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x1400005bfd0 sp=0x1400005bf30 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x1400005bfd0 sp=0x1400005bfd0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 37 [GC worker (idle)]:
runtime.gopark(0x1b9417359dea8?, 0x3?, 0x66?, 0x42?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000487f30 sp=0x14000487f10 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x14000487fd0 sp=0x14000487f30 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x14000487fd0 sp=0x14000487fd0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 38 [GC worker (idle)]:
runtime.gopark(0x104d2e6a0?, 0x3?, 0x56?, 0xa3?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000488730 sp=0x14000488710 pc=0x10408a498
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1293 +0xd8 fp=0x140004887d0 sp=0x14000488730 pc=0x10406d328
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140004887d0 sp=0x140004887d0 pc=0x1040bb954
created by runtime.gcBgMarkStartWorkers in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/mgc.go:1217 +0x28
goroutine 5 [select, locked to thread]:
runtime.gopark(0x14000489fa0?, 0x2?, 0xa8?, 0x9e?, 0x14000489f9c?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x14000489e40 sp=0x14000489e20 pc=0x10408a498
runtime.selectgo(0x14000489fa0, 0x14000489f98, 0x0?, 0x0, 0x0?, 0x1)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/select.go:327 +0x608 fp=0x14000489f50 sp=0x14000489e40 pc=0x10409acc8
runtime.ensureSigM.func1()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/signal_unix.go:1014 +0x168 fp=0x14000489fd0 sp=0x14000489f50 pc=0x1040b1b08
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x14000489fd0 sp=0x14000489fd0 pc=0x1040bb954
created by runtime.ensureSigM in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/signal_unix.go:997 +0xd8
goroutine 39 [syscall]:
runtime.sigNoteSleep(0x0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/os_darwin.go:124 +0x20 fp=0x14000483790 sp=0x14000483750 pc=0x104084b40
os/signal.signal_recv()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/sigqueue.go:149 +0x2c fp=0x140004837b0 sp=0x14000483790 pc=0x1040b78ec
os/signal.loop()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/os/signal/signal_unix.go:23 +0x1c fp=0x140004837d0 sp=0x140004837b0 pc=0x10430d75c
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140004837d0 sp=0x140004837d0 pc=0x1040bb954
created by os/signal.Notify.func1.1 in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/os/signal/signal.go:151 +0x28
goroutine 24 [chan receive]:
runtime.gopark(0xd4?, 0x104cf78a0?, 0x28?, 0x97?, 0x104066da0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x140000f96f0 sp=0x140000f96d0 pc=0x10408a498
runtime.chanrecv(0x1400044aa80, 0x0, 0x1)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/chan.go:583 +0x414 fp=0x140000f9770 sp=0x140000f96f0 pc=0x1040582e4
runtime.chanrecv1(0x104896a20?, 0x2e047c7fd0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/chan.go:442 +0x14 fp=0x140000f97a0 sp=0x140000f9770 pc=0x104057e94
github.com/jmorganca/ollama/server.Serve.func2()
/Users/jmorgan/workspace/ollama/server/routes.go:1023 +0x2c fp=0x140000f97d0 sp=0x140000f97a0 pc=0x10458079c
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140000f97d0 sp=0x140000f97d0 pc=0x1040bb954
created by github.com/jmorganca/ollama/server.Serve in goroutine 1
/Users/jmorgan/workspace/ollama/server/routes.go:1022 +0x334
goroutine 26 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x1040d0a20?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x140002cb890 sp=0x140002cb870 pc=0x10408a498
runtime.netpollblock(0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/netpoll.go:564 +0x158 fp=0x140002cb8d0 sp=0x140002cb890 pc=0x104083b78
internal/poll.runtime_pollWait(0x10bb709a8, 0x72)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/netpoll.go:343 +0xa0 fp=0x140002cb900 sp=0x140002cb8d0 pc=0x1040b5380
internal/poll.(*pollDesc).wait(0x1400013a080?, 0x14000149000?, 0x0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x140002cb930 sp=0x140002cb900 pc=0x10413b4a8
internal/poll.(*pollDesc).waitRead(...)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x1400013a080, {0x14000149000, 0x1000, 0x1000})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_unix.go:164 +0x200 fp=0x140002cb9d0 sp=0x140002cb930 pc=0x10413c7f0
net.(*netFD).Read(0x1400013a080, {0x14000149000?, 0x140002cba58?, 0x10413b8bc?})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/fd_posix.go:55 +0x28 fp=0x140002cba20 sp=0x140002cb9d0 pc=0x1041a9d88
net.(*conn).Read(0x14000604028, {0x14000149000?, 0x1401219076b?, 0x14000388098?})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/net.go:179 +0x34 fp=0x140002cba70 sp=0x140002cba20 pc=0x1041b7264
net.(*TCPConn).Read(0x14000388090?, {0x14000149000?, 0x0?, 0x0?})
<autogenerated>:1 +0x2c fp=0x140002cbaa0 sp=0x140002cba70 pc=0x1041c88fc
net/http.(*connReader).Read(0x14000388090, {0x14000149000, 0x1000, 0x1000})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:791 +0x224 fp=0x140002cbb00 sp=0x140002cbaa0 pc=0x1042da7b4
bufio.(*Reader).fill(0x1400013e7e0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/bufio/bufio.go:113 +0xf8 fp=0x140002cbb40 sp=0x140002cbb00 pc=0x10426e088
bufio.(*Reader).Peek(0x1400013e7e0, 0x4)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/bufio/bufio.go:151 +0x60 fp=0x140002cbb60 sp=0x140002cbb40 pc=0x10426e1f0
net/http.(*conn).serve(0x140002a0000, {0x1048fe968, 0x1400032c7e0})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:2044 +0x64c fp=0x140002cbfa0 sp=0x140002cbb60 pc=0x1042dfdcc
net/http.(*Server).Serve.func3()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:3086 +0x30 fp=0x140002cbfd0 sp=0x140002cbfa0 pc=0x1042e3fb0
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140002cbfd0 sp=0x140002cbfd0 pc=0x1040bb954
created by net/http.(*Server).Serve in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:3086 +0x4cc
goroutine 41 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x1040d0a20?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x140002cf890 sp=0x140002cf870 pc=0x10408a498
runtime.netpollblock(0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/netpoll.go:564 +0x158 fp=0x140002cf8d0 sp=0x140002cf890 pc=0x104083b78
internal/poll.runtime_pollWait(0x10bb708b0, 0x72)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/netpoll.go:343 +0xa0 fp=0x140002cf900 sp=0x140002cf8d0 pc=0x1040b5380
internal/poll.(*pollDesc).wait(0x140003dc080?, 0x14000495000?, 0x0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x140002cf930 sp=0x140002cf900 pc=0x10413b4a8
internal/poll.(*pollDesc).waitRead(...)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x140003dc080, {0x14000495000, 0x1000, 0x1000})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_unix.go:164 +0x200 fp=0x140002cf9d0 sp=0x140002cf930 pc=0x10413c7f0
net.(*netFD).Read(0x140003dc080, {0x14000495000?, 0x140004afa58?, 0x10413b8bc?})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/fd_posix.go:55 +0x28 fp=0x140002cfa20 sp=0x140002cf9d0 pc=0x1041a9d88
net.(*conn).Read(0x140004a2000, {0x14000495000?, 0x140121b9c91?, 0x14000388608?})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/net.go:179 +0x34 fp=0x140002cfa70 sp=0x140002cfa20 pc=0x1041b7264
net.(*TCPConn).Read(0x14000388600?, {0x14000495000?, 0x0?, 0x0?})
<autogenerated>:1 +0x2c fp=0x140002cfaa0 sp=0x140002cfa70 pc=0x1041c88fc
net/http.(*connReader).Read(0x14000388600, {0x14000495000, 0x1000, 0x1000})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:791 +0x224 fp=0x140002cfb00 sp=0x140002cfaa0 pc=0x1042da7b4
bufio.(*Reader).fill(0x1400013ede0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/bufio/bufio.go:113 +0xf8 fp=0x140002cfb40 sp=0x140002cfb00 pc=0x10426e088
bufio.(*Reader).Peek(0x1400013ede0, 0x4)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/bufio/bufio.go:151 +0x60 fp=0x140002cfb60 sp=0x140002cfb40 pc=0x10426e1f0
net/http.(*conn).serve(0x14000496120, {0x1048fe968, 0x1400032c7e0})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:2044 +0x64c fp=0x140002cffa0 sp=0x140002cfb60 pc=0x1042dfdcc
net/http.(*Server).Serve.func3()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:3086 +0x30 fp=0x140002cffd0 sp=0x140002cffa0 pc=0x1042e3fb0
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140002cffd0 sp=0x140002cffd0 pc=0x1040bb954
created by net/http.(*Server).Serve in goroutine 1
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:3086 +0x4cc
goroutine 27 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x1040d0a20?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/proc.go:398 +0xc8 fp=0x140000fa540 sp=0x140000fa520 pc=0x10408a498
runtime.netpollblock(0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/netpoll.go:564 +0x158 fp=0x140000fa580 sp=0x140000fa540 pc=0x104083b78
internal/poll.runtime_pollWait(0x10bb707b8, 0x72)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/netpoll.go:343 +0xa0 fp=0x140000fa5b0 sp=0x140000fa580 pc=0x1040b5380
internal/poll.(*pollDesc).wait(0x140003dc200?, 0x1400040fab1?, 0x0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x140000fa5e0 sp=0x140000fa5b0 pc=0x10413b4a8
internal/poll.(*pollDesc).waitRead(...)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x140003dc200, {0x1400040fab1, 0x1, 0x1})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/internal/poll/fd_unix.go:164 +0x200 fp=0x140000fa680 sp=0x140000fa5e0 pc=0x10413c7f0
net.(*netFD).Read(0x140003dc200, {0x1400040fab1?, 0x0?, 0x0?})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/fd_posix.go:55 +0x28 fp=0x140000fa6d0 sp=0x140000fa680 pc=0x1041a9d88
net.(*conn).Read(0x140004a21c8, {0x1400040fab1?, 0x0?, 0x0?})
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/net.go:179 +0x34 fp=0x140000fa720 sp=0x140000fa6d0 pc=0x1041b7264
net.(*TCPConn).Read(0x0?, {0x1400040fab1?, 0x0?, 0x0?})
<autogenerated>:1 +0x2c fp=0x140000fa750 sp=0x140000fa720 pc=0x1041c88fc
net/http.(*connReader).backgroundRead(0x1400040faa0)
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:683 +0x40 fp=0x140000fa7b0 sp=0x140000fa750 pc=0x1042da230
net/http.(*connReader).startBackgroundRead.func2()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:679 +0x28 fp=0x140000fa7d0 sp=0x140000fa7b0 pc=0x1042da158
runtime.goexit()
/opt/homebrew/Cellar/go/1.21.3/libexec/src/runtime/asm_arm64.s:1197 +0x4 fp=0x140000fa7d0 sp=0x140000fa7d0 pc=0x1040bb954
created by net/http.(*connReader).startBackgroundRead in goroutine 43
/opt/homebrew/Cellar/go/1.21.3/libexec/src/net/http/server.go:679 +0xc8
r0 0x0
r1 0x0
r2 0x2
r3 0x181adcd05
r4 0x67
r5 0x6f
r6 0x61
r7 0xed0
r8 0x0
r9 0x5
r10 0x10527f370
r11 0xc0452f67cb792c67
r12 0x2de4b19f0114bd9f
r13 0xc949d7c7509e6557
r14 0xa8566ef1c8cc9fa
r15 0xbe41013c00000000
r16 0x181adcdc0
r17 0x1e0c4c3c8
r18 0x0
r19 0x0
r20 0x1716463a0
r21 0x10c351080
r22 0x600002cc4030
r23 0x171646580
r24 0x171646438
r25 0x1400082d8a8
r26 0x171646418
r27 0x1716468c0
r28 0x171646410
r29 0x171646310
lr 0x10c21ea8c
sp 0x1716460b0
pc 0x181adcdc4
fault 0x0
time=2024-02-21T16:30:33.746Z level=INFO source=images.go:706 msg="total blobs: 28"
time=2024-02-21T16:30:33.752Z level=INFO source=images.go:713 msg="total unused blobs removed: 0"
time=2024-02-21T16:30:33.756Z level=INFO source=routes.go:1014 msg="Listening on 127.0.0.1:11434 (version 0.1.25)"
time=2024-02-21T16:30:33.756Z level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
time=2024-02-21T16:30:33.771Z level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [metal]"
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2643/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2643/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4146
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4146/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4146/comments
|
https://api.github.com/repos/ollama/ollama/issues/4146/events
|
https://github.com/ollama/ollama/issues/4146
| 2,278,655,538
|
I_kwDOJ0Z1Ps6H0YYy
| 4,146
|
starting the docker container stucks at "CPU has AVX2"
|
{
"login": "valiantrex3rei",
"id": 158743299,
"node_id": "U_kgDOCXY7Aw",
"avatar_url": "https://avatars.githubusercontent.com/u/158743299?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/valiantrex3rei",
"html_url": "https://github.com/valiantrex3rei",
"followers_url": "https://api.github.com/users/valiantrex3rei/followers",
"following_url": "https://api.github.com/users/valiantrex3rei/following{/other_user}",
"gists_url": "https://api.github.com/users/valiantrex3rei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/valiantrex3rei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/valiantrex3rei/subscriptions",
"organizations_url": "https://api.github.com/users/valiantrex3rei/orgs",
"repos_url": "https://api.github.com/users/valiantrex3rei/repos",
"events_url": "https://api.github.com/users/valiantrex3rei/events{/privacy}",
"received_events_url": "https://api.github.com/users/valiantrex3rei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677677816,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgVG-A",
"url": "https://api.github.com/repos/ollama/ollama/labels/docker",
"name": "docker",
"color": "0052CC",
"default": false,
"description": "Issues relating to using ollama in containers"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-05-04T01:48:45
| 2024-05-04T22:42:08
| 2024-05-04T22:42:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hello everyone.
I was following the tutorial at [Ollama Docker image](https://hub.docker.com/r/ollama/ollama).
After installing the NVIDIA Container Toolkit, configuring Docker to use the Nvidia driver, and starting the container,
I tried to attach to the container, but it took forever.
I removed the `-d` flag to see the output,
```
docker run --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```
and got the following:
```
time=2024-05-04T01:07:55.992Z level=INFO source=images.go:828 msg="total blobs: 0"
time=2024-05-04T01:07:55.993Z level=INFO source=images.go:835 msg="total unused blobs removed: 0"
time=2024-05-04T01:07:55.993Z level=INFO source=routes.go:1071 msg="Listening on [::]:11434 (version 0.1.33)"
time=2024-05-04T01:07:55.993Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama244156069/runners
time=2024-05-04T01:07:57.767Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11 rocm_v60002]"
time=2024-05-04T01:07:57.767Z level=INFO source=gpu.go:96 msg="Detecting GPUs"
time=2024-05-04T01:07:57.801Z level=INFO source=gpu.go:101 msg="detected GPUs" library=/tmp/ollama244156069/runners/cuda_v11/libcudart.so.11.0 count=1
time=2024-05-04T01:07:57.801Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
```
It seems like the Nvidia driver and CUDA are recognized.
I also tried removing `--gpus=all` for CPU-only mode, but it still got stuck at the same spot, only without the `detected GPUs` line in the output.
Any suggestions and/or corrections would be appreciated.
system information:
```
_,met$$$$$gg. bwang@deb11bwang
,g$$$$$$$$$$$$$$$P. OS: Debian 11 bullseye
,g$$P"" """Y$$.". Kernel: x86_64 Linux 5.10.0-28-amd64
,$$P' `$$$. Uptime: 6d 24m
',$$P ,ggs. `$$b: Packages: 1726
`d$$' ,$P"' . $$$ Shell: bash 5.1.4
$$P d$' , $$P Disk: 83G / 931G (10%)
$$: $$. - ,d$$' CPU: 13th Gen Intel Core i9-13900K @ 32x 7.5GHz [25.0°C]
$$\; Y$b._ _,d$P' GPU: NVIDIA GeForce RTX 3060
Y$$. `.`"Y$$$$P"' RAM: 2138MiB / 31866MiB
`$$b "-.__
`Y$$
`Y$$.
`$$b.
`Y$$b.
`"Y$b._
`""""
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 470.223.02 Driver Version: 470.223.02 CUDA Version: 11.4 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 NVIDIA GeForce ... On | 00000000:01:00.0 Off | N/A |
| 0% 33C P8 13W / 170W | 14MiB / 12053MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| 0 N/A N/A 882 G /usr/lib/xorg/Xorg 12MiB |
+-----------------------------------------------------------------------------+
```
### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
latest
|
{
"login": "valiantrex3rei",
"id": 158743299,
"node_id": "U_kgDOCXY7Aw",
"avatar_url": "https://avatars.githubusercontent.com/u/158743299?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/valiantrex3rei",
"html_url": "https://github.com/valiantrex3rei",
"followers_url": "https://api.github.com/users/valiantrex3rei/followers",
"following_url": "https://api.github.com/users/valiantrex3rei/following{/other_user}",
"gists_url": "https://api.github.com/users/valiantrex3rei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/valiantrex3rei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/valiantrex3rei/subscriptions",
"organizations_url": "https://api.github.com/users/valiantrex3rei/orgs",
"repos_url": "https://api.github.com/users/valiantrex3rei/repos",
"events_url": "https://api.github.com/users/valiantrex3rei/events{/privacy}",
"received_events_url": "https://api.github.com/users/valiantrex3rei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4146/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4146/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8411
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8411/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8411/comments
|
https://api.github.com/repos/ollama/ollama/issues/8411/events
|
https://github.com/ollama/ollama/pull/8411
| 2,786,083,266
|
PR_kwDOJ0Z1Ps6Hpf8j
| 8,411
|
llama: move grammar tests to llama_test.go
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2025-01-14T02:49:33
| 2025-01-14T20:55:47
| 2025-01-14T20:55:46
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8411",
"html_url": "https://github.com/ollama/ollama/pull/8411",
"diff_url": "https://github.com/ollama/ollama/pull/8411.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8411.patch",
"merged_at": "2025-01-14T20:55:46"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8411/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8411/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5025
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5025/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5025/comments
|
https://api.github.com/repos/ollama/ollama/issues/5025/events
|
https://github.com/ollama/ollama/pull/5025
| 2,351,694,257
|
PR_kwDOJ0Z1Ps5yZKnq
| 5,025
|
Revert "proper utf16 support"
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-06-13T17:24:20
| 2024-06-14T02:17:11
| 2024-06-13T17:31:25
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5025",
"html_url": "https://github.com/ollama/ollama/pull/5025",
"diff_url": "https://github.com/ollama/ollama/pull/5025.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5025.patch",
"merged_at": "2024-06-13T17:31:25"
}
|
This reverts commit 66ab48772f4f41f3f27fb93e15ef0cf756bda3d0.
This change broke UTF-8 scanning of multi-byte runes.
Resolves #5055 by reverting the offending change #4715.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5025/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5025/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7800
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7800/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7800/comments
|
https://api.github.com/repos/ollama/ollama/issues/7800/events
|
https://github.com/ollama/ollama/issues/7800
| 2,683,665,190
|
I_kwDOJ0Z1Ps6f9Xsm
| 7,800
|
Losing user agent after HTTP redirect while pulling models
|
{
"login": "ZeGuigui",
"id": 13727111,
"node_id": "MDQ6VXNlcjEzNzI3MTEx",
"avatar_url": "https://avatars.githubusercontent.com/u/13727111?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZeGuigui",
"html_url": "https://github.com/ZeGuigui",
"followers_url": "https://api.github.com/users/ZeGuigui/followers",
"following_url": "https://api.github.com/users/ZeGuigui/following{/other_user}",
"gists_url": "https://api.github.com/users/ZeGuigui/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZeGuigui/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZeGuigui/subscriptions",
"organizations_url": "https://api.github.com/users/ZeGuigui/orgs",
"repos_url": "https://api.github.com/users/ZeGuigui/repos",
"events_url": "https://api.github.com/users/ZeGuigui/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZeGuigui/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-11-22T15:49:17
| 2024-11-22T15:49:17
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When pulling a model, the first HTTP GET request is issued with a specific Ollama user agent (such as `ollama/0.4.2`). The subsequent GET requests to Cloudflare use a different user agent (`Go-http-client/1.1`).
My company uses firewall rules based on domains and user agents. Ollama should use a consistent user agent for all of its HTTP requests.
Logs from my IT department (pulling nomic-embed-text):
```
[22/Nov/2024:15:47:19 +0100] "" int.ern.ali.pv4 307 "GET https://registry.ollama.ai/v2/library/nomic-embed-text/blobs/sha256:970aa74c0a90ef7482477cf803618e776e173c007bf957f635f1015bfcfef0e6 HTTP/2.0" "Business, Software/Hardware" "Minimal Risk" "text/html" 1304 "ollama/0.4.2 (arm64 darwin) Go/go1.23.3" "" "0" ipv6:here:removed
[22/Nov/2024:15:47:50 +0100] "" int.ern.ali.pv4 206 "GET https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/97/970aa74c0a90ef7482477cf803618e776e173c007bf957f635f1015bfcfef0e6/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=xxx&X-Amz-Date=20241122T144719Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&X-Amz-Signature=xxx HTTP/1.1" "Content Server" "Minimal Risk" "application/octet-stream" 74291068 "Go-http-client/1.1" "" "0" ipv6:here:removed
[22/Nov/2024:15:47:51 +0100] "" int.ern.ali.pv4 206 "GET https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/97/970aa74c0a90ef7482477cf803618e776e173c007bf957f635f1015bfcfef0e6/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=xxx&X-Amz-Date=20241122T144719Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&X-Amz-Signature=xxx HTTP/1.1" "Content Server" "Minimal Risk" "application/octet-stream" 100000413 "Go-http-client/1.1" "" "0" ipv6:here:removed
[22/Nov/2024:15:47:55 +0100] "" int.ern.ali.pv4 206 "GET https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/97/970aa74c0a90ef7482477cf803618e776e173c007bf957f635f1015bfcfef0e6/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=xxx&X-Amz-Date=20241122T144719Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&X-Amz-Signature=xxx HTTP/1.1" "Content Server" "Minimal Risk" "application/octet-stream" 100000404 "Go-http-client/1.1" "" "0" ipv6:here:removed
```
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.4.2
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7800/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7800/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2479
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2479/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2479/comments
|
https://api.github.com/repos/ollama/ollama/issues/2479/events
|
https://github.com/ollama/ollama/pull/2479
| 2,133,146,409
|
PR_kwDOJ0Z1Ps5my8lv
| 2,479
|
Bump llama.cpp submodule to `6c00a06`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-02-13T21:21:27
| 2024-02-14T01:12:43
| 2024-02-14T01:12:42
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2479",
"html_url": "https://github.com/ollama/ollama/pull/2479",
"diff_url": "https://github.com/ollama/ollama/pull/2479.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2479.patch",
"merged_at": "2024-02-14T01:12:42"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2479/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2479/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7138
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7138/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7138/comments
|
https://api.github.com/repos/ollama/ollama/issues/7138/events
|
https://github.com/ollama/ollama/pull/7138
| 2,573,662,194
|
PR_kwDOJ0Z1Ps59-SVb
| 7,138
|
llama: wire up builtin runner
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-10-08T16:16:52
| 2024-11-20T16:33:15
| 2024-11-20T16:33:15
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7138",
"html_url": "https://github.com/ollama/ollama/pull/7138",
"diff_url": "https://github.com/ollama/ollama/pull/7138.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7138.patch",
"merged_at": null
}
|
This adds a new entrypoint into the ollama CLI to run the cgo-built runner. On Mac arm64, this will have GPU support, but on all other platforms it will be the lowest-common-denominator CPU build. After we fully transition to the new Go runners, more tech debt can be removed and we can stop building the "default" runner via make, relying on the builtin one always.
With this change, we achieve a clean build on macOS ARM, being able to simply say
```
go build .
./ollama serve
```
...and run on the GPU. (No make or generate required.)
Replaces #6991 on main
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7138/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7138/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2750
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2750/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2750/comments
|
https://api.github.com/repos/ollama/ollama/issues/2750/events
|
https://github.com/ollama/ollama/pull/2750
| 2,152,926,020
|
PR_kwDOJ0Z1Ps5n2V7p
| 2,750
|
style: status badge for `test` in `README.md`
|
{
"login": "hamirmahal",
"id": 43425812,
"node_id": "MDQ6VXNlcjQzNDI1ODEy",
"avatar_url": "https://avatars.githubusercontent.com/u/43425812?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hamirmahal",
"html_url": "https://github.com/hamirmahal",
"followers_url": "https://api.github.com/users/hamirmahal/followers",
"following_url": "https://api.github.com/users/hamirmahal/following{/other_user}",
"gists_url": "https://api.github.com/users/hamirmahal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hamirmahal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hamirmahal/subscriptions",
"organizations_url": "https://api.github.com/users/hamirmahal/orgs",
"repos_url": "https://api.github.com/users/hamirmahal/repos",
"events_url": "https://api.github.com/users/hamirmahal/events{/privacy}",
"received_events_url": "https://api.github.com/users/hamirmahal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-02-25T20:23:29
| 2024-05-09T19:28:25
| 2024-05-09T16:08:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2750",
"html_url": "https://github.com/ollama/ollama/pull/2750",
"diff_url": "https://github.com/ollama/ollama/pull/2750.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2750.patch",
"merged_at": null
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2750/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2750/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3852
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3852/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3852/comments
|
https://api.github.com/repos/ollama/ollama/issues/3852/events
|
https://github.com/ollama/ollama/issues/3852
| 2,259,692,103
|
I_kwDOJ0Z1Ps6GsCpH
| 3,852
|
Save command in repl allows invalid model names (such as "-h")
|
{
"login": "Chiuuu0209",
"id": 69319144,
"node_id": "MDQ6VXNlcjY5MzE5MTQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/69319144?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Chiuuu0209",
"html_url": "https://github.com/Chiuuu0209",
"followers_url": "https://api.github.com/users/Chiuuu0209/followers",
"following_url": "https://api.github.com/users/Chiuuu0209/following{/other_user}",
"gists_url": "https://api.github.com/users/Chiuuu0209/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Chiuuu0209/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Chiuuu0209/subscriptions",
"organizations_url": "https://api.github.com/users/Chiuuu0209/orgs",
"repos_url": "https://api.github.com/users/Chiuuu0209/repos",
"events_url": "https://api.github.com/users/Chiuuu0209/events{/privacy}",
"received_events_url": "https://api.github.com/users/Chiuuu0209/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-04-23T20:12:36
| 2024-05-14T01:48:29
| 2024-05-14T01:48:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I misused the `-h` flag in a llama3 session.
I wanted to see the usage of `/save` in a session, so I typed `/save -h`. As a result, I saved a session named `-h`:
```
$ollama list
NAME ID SIZE MODIFIED
-h:latest c36f2d34d11b 4.7 GB 18 minutes ago
llama3:latest a6990ed6be41 4.7 GB 53 minutes ago
```
Now I cannot remove it with
```
ollama rm -h
```
I tried to remove it by its ID, but that doesn't seem to be supported yet.
What else can I do about this problem?
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.32
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3852/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3852/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/61
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/61/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/61/comments
|
https://api.github.com/repos/ollama/ollama/issues/61/events
|
https://github.com/ollama/ollama/issues/61
| 1,795,167,226
|
I_kwDOJ0Z1Ps5rABP6
| 61
|
server crashes if connection closes
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2023-07-08T23:44:07
| 2023-07-12T02:56:15
| 2023-07-12T02:56:15
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
If the incoming TCP connection closes before generation is complete, the server shuts down and prints an error.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/61/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/61/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/801
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/801/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/801/comments
|
https://api.github.com/repos/ollama/ollama/issues/801/events
|
https://github.com/ollama/ollama/pull/801
| 1,944,805,415
|
PR_kwDOJ0Z1Ps5c4LcA
| 801
|
Add ellama community integration
|
{
"login": "s-kostyaev",
"id": 8576745,
"node_id": "MDQ6VXNlcjg1NzY3NDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/8576745?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/s-kostyaev",
"html_url": "https://github.com/s-kostyaev",
"followers_url": "https://api.github.com/users/s-kostyaev/followers",
"following_url": "https://api.github.com/users/s-kostyaev/following{/other_user}",
"gists_url": "https://api.github.com/users/s-kostyaev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/s-kostyaev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/s-kostyaev/subscriptions",
"organizations_url": "https://api.github.com/users/s-kostyaev/orgs",
"repos_url": "https://api.github.com/users/s-kostyaev/repos",
"events_url": "https://api.github.com/users/s-kostyaev/events{/privacy}",
"received_events_url": "https://api.github.com/users/s-kostyaev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-16T09:44:09
| 2023-10-16T22:51:25
| 2023-10-16T22:51:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/801",
"html_url": "https://github.com/ollama/ollama/pull/801",
"diff_url": "https://github.com/ollama/ollama/pull/801.diff",
"patch_url": "https://github.com/ollama/ollama/pull/801.patch",
"merged_at": "2023-10-16T22:51:25"
}
|
Hi.
Thank you for this cool project.
This PR adds a new Emacs package to the community integrations. [ellama](https://github.com/s-kostyaev/ellama) supports streaming output and can be installed from [MELPA](https://melpa.org/#/getting-started).
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/801/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/801/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7245
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7245/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7245/comments
|
https://api.github.com/repos/ollama/ollama/issues/7245/events
|
https://github.com/ollama/ollama/issues/7245
| 2,595,583,978
|
I_kwDOJ0Z1Ps6atXfq
| 7,245
|
model in 3 parts
|
{
"login": "werruww",
"id": 157249411,
"node_id": "U_kgDOCV9vgw",
"avatar_url": "https://avatars.githubusercontent.com/u/157249411?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/werruww",
"html_url": "https://github.com/werruww",
"followers_url": "https://api.github.com/users/werruww/followers",
"following_url": "https://api.github.com/users/werruww/following{/other_user}",
"gists_url": "https://api.github.com/users/werruww/gists{/gist_id}",
"starred_url": "https://api.github.com/users/werruww/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/werruww/subscriptions",
"organizations_url": "https://api.github.com/users/werruww/orgs",
"repos_url": "https://api.github.com/users/werruww/repos",
"events_url": "https://api.github.com/users/werruww/events{/privacy}",
"received_events_url": "https://api.github.com/users/werruww/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-10-17T19:40:31
| 2024-10-23T01:45:30
| 2024-10-23T01:45:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
How do I run a model that is split into multiple parts in Ollama?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7245/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7245/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4559
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4559/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4559/comments
|
https://api.github.com/repos/ollama/ollama/issues/4559/events
|
https://github.com/ollama/ollama/issues/4559
| 2,308,455,375
|
I_kwDOJ0Z1Ps6JmDvP
| 4,559
|
rm command doesn't delete files
|
{
"login": "milenamilka755",
"id": 149798060,
"node_id": "U_kgDOCO28rA",
"avatar_url": "https://avatars.githubusercontent.com/u/149798060?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/milenamilka755",
"html_url": "https://github.com/milenamilka755",
"followers_url": "https://api.github.com/users/milenamilka755/followers",
"following_url": "https://api.github.com/users/milenamilka755/following{/other_user}",
"gists_url": "https://api.github.com/users/milenamilka755/gists{/gist_id}",
"starred_url": "https://api.github.com/users/milenamilka755/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/milenamilka755/subscriptions",
"organizations_url": "https://api.github.com/users/milenamilka755/orgs",
"repos_url": "https://api.github.com/users/milenamilka755/repos",
"events_url": "https://api.github.com/users/milenamilka755/events{/privacy}",
"received_events_url": "https://api.github.com/users/milenamilka755/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-05-21T14:32:36
| 2024-10-24T16:01:32
| 2024-10-24T16:01:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ollama version is 0.1.38.
I used the `rm` command, and the model was deleted from the list of available models, but the associated sha256 files in the
ollama\models\blobs directory were not deleted.
I had to delete the files manually.
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.1.38
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4559/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/4559/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3349
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3349/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3349/comments
|
https://api.github.com/repos/ollama/ollama/issues/3349/events
|
https://github.com/ollama/ollama/issues/3349
| 2,206,683,854
|
I_kwDOJ0Z1Ps6Dh1LO
| 3,349
|
mixtral:8x7b-instruct-v0.1-fp16 served on Ollama performs worse than the same model served on vllm with same configuration
|
{
"login": "yilei-ding",
"id": 131700591,
"node_id": "U_kgDOB9mXbw",
"avatar_url": "https://avatars.githubusercontent.com/u/131700591?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yilei-ding",
"html_url": "https://github.com/yilei-ding",
"followers_url": "https://api.github.com/users/yilei-ding/followers",
"following_url": "https://api.github.com/users/yilei-ding/following{/other_user}",
"gists_url": "https://api.github.com/users/yilei-ding/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yilei-ding/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yilei-ding/subscriptions",
"organizations_url": "https://api.github.com/users/yilei-ding/orgs",
"repos_url": "https://api.github.com/users/yilei-ding/repos",
"events_url": "https://api.github.com/users/yilei-ding/events{/privacy}",
"received_events_url": "https://api.github.com/users/yilei-ding/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 6
| 2024-03-25T20:50:45
| 2024-07-25T00:17:33
| 2024-07-25T00:17:33
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi, I've been comparing the inference speeds of serving the **unquantized** `mixtral:8x7b-instruct-v0.1-fp16` model on Ollama and on vLLM. With the temperature set to 0 and the same number of generated tokens, the model served on Ollama performs very badly. I also checked that `[INST]` and `[/INST]` were added to the prompt on Ollama, the same as in vLLM, but the model still performs very badly. Notably, Ollama manages to run the model using just 2 A6000 GPUs (each with 48 GB of memory), whereas both vLLM and Hugging Face require 4 GPUs to handle the **unquantized** Mixtral 8x7B model. This has led me to wonder whether Ollama applies any form of on-the-fly quantization?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3349/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/3349/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4941
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4941/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4941/comments
|
https://api.github.com/repos/ollama/ollama/issues/4941/events
|
https://github.com/ollama/ollama/pull/4941
| 2,341,963,893
|
PR_kwDOJ0Z1Ps5x4Gcn
| 4,941
|
llm: always add bos token to prompt
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-09T01:13:51
| 2024-06-09T01:47:11
| 2024-06-09T01:47:10
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4941",
"html_url": "https://github.com/ollama/ollama/pull/4941",
"diff_url": "https://github.com/ollama/ollama/pull/4941.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4941.patch",
"merged_at": "2024-06-09T01:47:10"
}
|
Carries https://github.com/ollama/ollama/pull/4399
Fixes issues with embeddings not working because `bos` tokens were omitted from the prompt
Thank you @deadbeef84 for finding and fixing this.
Fixes https://github.com/ollama/ollama/issues/4207
Fixes https://github.com/ollama/ollama/issues/3777
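The fix described above — always adding the `bos` token to the prompt — amounts to prepending the token id when it isn't already there. A minimal Python sketch (the token ids and the `bos_id` value are hypothetical stand-ins, not ollama's actual tokenizer API):

```python
def ensure_bos(token_ids, bos_id):
    """Prepend the BOS token id unless it is already the first token."""
    if not token_ids or token_ids[0] != bos_id:
        return [bos_id] + token_ids
    return token_ids

# Example with a hypothetical bos_id of 1: prompts missing a leading
# BOS get one added; prompts that already start with it are unchanged.
print(ensure_bos([15043, 3186], 1))  # → [1, 15043, 3186]
print(ensure_bos([1, 15043], 1))     # → [1, 15043]
```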
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4941/reactions",
"total_count": 7,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 7,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4941/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8428
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8428/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8428/comments
|
https://api.github.com/repos/ollama/ollama/issues/8428/events
|
https://github.com/ollama/ollama/pull/8428
| 2,788,366,012
|
PR_kwDOJ0Z1Ps6HxXLh
| 8,428
|
Fix absolute path names + gguf detection
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2025-01-14T22:05:17
| 2025-01-15T03:01:26
| 2025-01-15T03:01:24
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8428",
"html_url": "https://github.com/ollama/ollama/pull/8428",
"diff_url": "https://github.com/ollama/ollama/pull/8428.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8428.patch",
"merged_at": "2025-01-15T03:01:24"
}
|
This change fixes two issues with the model creation:
1. Absolute pathnames for GGUF files were being treated as relative pathnames; and
2. The GGUF image detection wasn't able to correctly determine if a GGUF file without the correct extension was actually a GGUF file.
This change fixes both issues and adds additional unit tests.
Fixes #8427 and #8423
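Content-based GGUF detection comes down to inspecting the file's magic bytes rather than trusting its extension — GGUF files begin with the 4-byte magic `GGUF`. A minimal sketch of the idea (an illustration, not ollama's actual Go implementation):

```python
GGUF_MAGIC = b"GGUF"

def is_gguf(path):
    """Return True if the file starts with the GGUF magic bytes,
    regardless of its file extension."""
    with open(path, "rb") as f:
        return f.read(4) == GGUF_MAGIC
```

With this check, a file named `model.bin` that is really a GGUF file is still detected, because only the header bytes are inspected.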
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8428/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8428/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/950
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/950/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/950/comments
|
https://api.github.com/repos/ollama/ollama/issues/950/events
|
https://github.com/ollama/ollama/pull/950
| 1,968,986,048
|
PR_kwDOJ0Z1Ps5eJ13P
| 950
|
readline windows terminal support
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-30T18:33:27
| 2023-10-30T20:18:13
| 2023-10-30T20:18:12
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/950",
"html_url": "https://github.com/ollama/ollama/pull/950",
"diff_url": "https://github.com/ollama/ollama/pull/950.diff",
"patch_url": "https://github.com/ollama/ollama/pull/950.patch",
"merged_at": "2023-10-30T20:18:12"
}
|
- update the readline package to have basic support on windows; this is not full feature parity with the unix cli yet
This change adds a readline implementation that works on Windows. I don't expect this will work for all features (keyboard shortcuts probably need work). This change also fixes `go build .` on windows.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/950/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/950/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3043
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3043/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3043/comments
|
https://api.github.com/repos/ollama/ollama/issues/3043/events
|
https://github.com/ollama/ollama/pull/3043
| 2,177,778,889
|
PR_kwDOJ0Z1Ps5pLAPU
| 3,043
|
Fix paste of text with line feed characters
|
{
"login": "glumia",
"id": 20153481,
"node_id": "MDQ6VXNlcjIwMTUzNDgx",
"avatar_url": "https://avatars.githubusercontent.com/u/20153481?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/glumia",
"html_url": "https://github.com/glumia",
"followers_url": "https://api.github.com/users/glumia/followers",
"following_url": "https://api.github.com/users/glumia/following{/other_user}",
"gists_url": "https://api.github.com/users/glumia/gists{/gist_id}",
"starred_url": "https://api.github.com/users/glumia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/glumia/subscriptions",
"organizations_url": "https://api.github.com/users/glumia/orgs",
"repos_url": "https://api.github.com/users/glumia/repos",
"events_url": "https://api.github.com/users/glumia/events{/privacy}",
"received_events_url": "https://api.github.com/users/glumia/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-10T15:48:33
| 2024-05-07T22:26:07
| 2024-05-07T22:26:07
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3043",
"html_url": "https://github.com/ollama/ollama/pull/3043",
"diff_url": "https://github.com/ollama/ollama/pull/3043.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3043.patch",
"merged_at": "2024-05-07T22:26:07"
}
|
Hey there, thanks for this awesome tool 🙌
This is a small patch to fix copy-paste with some terminals that send line feed characters when pasting text with newlines.
To test these changes, run ollama in interactive mode on [Kitty](https://github.com/kovidgoyal/kitty) and paste:
```txt
text
on multiple
lines
```
Curiously, this issue doesn't occur on macOS's default terminal, because it sends CR characters for the newlines. I'm not sure which terminal behavior is the correct one; in any case, I figured it would make sense for ollama to support CR and LF characters in the same way.
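Supporting CR and LF in the same way amounts to normalizing both to a single newline representation before the line editor processes the paste. A simplified Python illustration of that idea (not ollama's Go readline code):

```python
def normalize_newlines(pasted: str) -> str:
    """Map CRLF and bare CR to LF, so multi-line pastes behave the
    same whether the terminal sends CR or LF for newlines."""
    return pasted.replace("\r\n", "\n").replace("\r", "\n")

# A Kitty-style paste (LF) and a Terminal.app-style paste (CR) both
# normalize to the same three-line string:
print(normalize_newlines("text\non multiple\nlines"))
print(normalize_newlines("text\ron multiple\rlines"))
```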
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3043/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3043/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1719
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1719/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1719/comments
|
https://api.github.com/repos/ollama/ollama/issues/1719/events
|
https://github.com/ollama/ollama/issues/1719
| 2,056,322,561
|
I_kwDOJ0Z1Ps56kP4B
| 1,719
|
can ollama support qwen72b ?
|
{
"login": "goldenquant",
"id": 108568777,
"node_id": "U_kgDOBnigyQ",
"avatar_url": "https://avatars.githubusercontent.com/u/108568777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/goldenquant",
"html_url": "https://github.com/goldenquant",
"followers_url": "https://api.github.com/users/goldenquant/followers",
"following_url": "https://api.github.com/users/goldenquant/following{/other_user}",
"gists_url": "https://api.github.com/users/goldenquant/gists{/gist_id}",
"starred_url": "https://api.github.com/users/goldenquant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/goldenquant/subscriptions",
"organizations_url": "https://api.github.com/users/goldenquant/orgs",
"repos_url": "https://api.github.com/users/goldenquant/repos",
"events_url": "https://api.github.com/users/goldenquant/events{/privacy}",
"received_events_url": "https://api.github.com/users/goldenquant/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-12-26T11:29:18
| 2024-03-12T00:12:45
| 2024-03-12T00:12:44
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
can ollama support qwen72b ?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1719/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1719/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3532
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3532/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3532/comments
|
https://api.github.com/repos/ollama/ollama/issues/3532/events
|
https://github.com/ollama/ollama/issues/3532
| 2,230,379,580
|
I_kwDOJ0Z1Ps6E8OQ8
| 3,532
|
CohereForAI / c4ai-command-r-plus-4bit
|
{
"login": "petunder",
"id": 7860649,
"node_id": "MDQ6VXNlcjc4NjA2NDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/7860649?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/petunder",
"html_url": "https://github.com/petunder",
"followers_url": "https://api.github.com/users/petunder/followers",
"following_url": "https://api.github.com/users/petunder/following{/other_user}",
"gists_url": "https://api.github.com/users/petunder/gists{/gist_id}",
"starred_url": "https://api.github.com/users/petunder/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/petunder/subscriptions",
"organizations_url": "https://api.github.com/users/petunder/orgs",
"repos_url": "https://api.github.com/users/petunder/repos",
"events_url": "https://api.github.com/users/petunder/events{/privacy}",
"received_events_url": "https://api.github.com/users/petunder/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 9
| 2024-04-08T06:37:20
| 2024-04-12T19:15:10
| 2024-04-12T19:15:10
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What model would you like?
C4AI Command R+ is an open weights research release of a 104 billion parameter model with highly advanced capabilities, including Retrieval Augmented Generation (RAG) and tool use to automate sophisticated tasks.
https://huggingface.co/CohereForAI/c4ai-command-r-plus-4bit
This model is split into individual files, which makes direct uploading difficult. Additionally, the model is in the .safetensors format.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3532/reactions",
"total_count": 8,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/3532/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/124
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/124/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/124/comments
|
https://api.github.com/repos/ollama/ollama/issues/124/events
|
https://github.com/ollama/ollama/pull/124
| 1,812,052,423
|
PR_kwDOJ0Z1Ps5V5eBJ
| 124
|
Update README.md
|
{
"login": "DavidZirinsky",
"id": 15115928,
"node_id": "MDQ6VXNlcjE1MTE1OTI4",
"avatar_url": "https://avatars.githubusercontent.com/u/15115928?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DavidZirinsky",
"html_url": "https://github.com/DavidZirinsky",
"followers_url": "https://api.github.com/users/DavidZirinsky/followers",
"following_url": "https://api.github.com/users/DavidZirinsky/following{/other_user}",
"gists_url": "https://api.github.com/users/DavidZirinsky/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DavidZirinsky/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DavidZirinsky/subscriptions",
"organizations_url": "https://api.github.com/users/DavidZirinsky/orgs",
"repos_url": "https://api.github.com/users/DavidZirinsky/repos",
"events_url": "https://api.github.com/users/DavidZirinsky/events{/privacy}",
"received_events_url": "https://api.github.com/users/DavidZirinsky/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-07-19T14:15:22
| 2023-07-19T14:44:24
| 2023-07-19T14:40:24
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/124",
"html_url": "https://github.com/ollama/ollama/pull/124",
"diff_url": "https://github.com/ollama/ollama/pull/124.diff",
"patch_url": "https://github.com/ollama/ollama/pull/124.patch",
"merged_at": "2023-07-19T14:40:24"
}
|
I needed to do this to run the project after building from source, so I think the documentation should reflect this
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/124/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/124/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7975
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7975/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7975/comments
|
https://api.github.com/repos/ollama/ollama/issues/7975/events
|
https://github.com/ollama/ollama/pull/7975
| 2,723,684,925
|
PR_kwDOJ0Z1Ps6EXGc3
| 7,975
|
readme: add llama3.3 to readme
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-06T18:32:23
| 2024-12-06T19:05:13
| 2024-12-06T19:05:11
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7975",
"html_url": "https://github.com/ollama/ollama/pull/7975",
"diff_url": "https://github.com/ollama/ollama/pull/7975.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7975.patch",
"merged_at": "2024-12-06T19:05:11"
}
|
readme: add llama3.3 to readme
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7975/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7975/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6829
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6829/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6829/comments
|
https://api.github.com/repos/ollama/ollama/issues/6829/events
|
https://github.com/ollama/ollama/pull/6829
| 2,529,478,085
|
PR_kwDOJ0Z1Ps57rMc0
| 6,829
|
CI: set platform build build_linux script to keep buildx happy
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-16T20:53:16
| 2024-09-16T21:07:43
| 2024-09-16T21:07:29
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6829",
"html_url": "https://github.com/ollama/ollama/pull/6829",
"diff_url": "https://github.com/ollama/ollama/pull/6829.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6829.patch",
"merged_at": "2024-09-16T21:07:29"
}
|
The runners don't have emulation set up, so the default multi-platform build won't work.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6829/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6829/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1255
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1255/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1255/comments
|
https://api.github.com/repos/ollama/ollama/issues/1255/events
|
https://github.com/ollama/ollama/issues/1255
| 2,008,240,054
|
I_kwDOJ0Z1Ps53s0-2
| 1,255
|
ollama listreg
|
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-11-23T13:30:13
| 2023-11-23T20:34:40
| 2023-11-23T19:33:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
We can pull from the registry with `ollama pull`, and we can push to the registry with `ollama push`,
but we don't have any way (that I can see) of listing what's in the registry.
I propose `ollama listreg`.
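A hypothetical sketch of how the proposed command might be used (none of this exists today; the command name and arguments are purely illustrative):

```
# Proposed, not implemented — illustration only.
ollama listreg                 # list models in your own registry namespace
ollama listreg iplayfast/      # list models pushed under a specific user
```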
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1255/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1255/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/914
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/914/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/914/comments
|
https://api.github.com/repos/ollama/ollama/issues/914/events
|
https://github.com/ollama/ollama/issues/914
| 1,963,845,529
|
I_kwDOJ0Z1Ps51DeeZ
| 914
|
Locally-hosted library
|
{
"login": "JDRay42",
"id": 10964706,
"node_id": "MDQ6VXNlcjEwOTY0NzA2",
"avatar_url": "https://avatars.githubusercontent.com/u/10964706?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JDRay42",
"html_url": "https://github.com/JDRay42",
"followers_url": "https://api.github.com/users/JDRay42/followers",
"following_url": "https://api.github.com/users/JDRay42/following{/other_user}",
"gists_url": "https://api.github.com/users/JDRay42/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JDRay42/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JDRay42/subscriptions",
"organizations_url": "https://api.github.com/users/JDRay42/orgs",
"repos_url": "https://api.github.com/users/JDRay42/repos",
"events_url": "https://api.github.com/users/JDRay42/events{/privacy}",
"received_events_url": "https://api.github.com/users/JDRay42/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
open
| false
| null |
[] | null | 21
| 2023-10-26T15:38:02
| 2025-01-09T07:41:05
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Organizations that want to use Ollama in their enterprise will want some sort of control over the models that are available for use and where the trained models go when they get pushed. For instance, most implementers outside AI research won't necessarily want "uncensored" models on half the laptops in the company. And, if someone trains a model on corporate data, they will want to control where that model goes when someone pushes it, and won't want it made available to the general public.
So...
1. As an enterprise implementer, I want to be able to host a local library.
2. As an enterprise implementer, I want to be able to control the configuration of the Ollama instances on my network.
3. As an enterprise implementer, I want to ensure that locally-developed libraries don't get pushed to a public repository.
I think that about covers it.
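For item 2, one concrete hook that exists today: the Linux service reads its configuration from environment variables, so a centrally managed systemd drop-in can pin settings per machine. A sketch (the path and values below are illustrative, not a recommendation):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_MODELS=/srv/ollama/models"
```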
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/914/reactions",
"total_count": 21,
"+1": 21,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/914/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4694
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4694/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4694/comments
|
https://api.github.com/repos/ollama/ollama/issues/4694/events
|
https://github.com/ollama/ollama/issues/4694
| 2,322,384,586
|
I_kwDOJ0Z1Ps6KbMbK
| 4,694
|
always fail to push models
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677370291,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw",
"url": "https://api.github.com/repos/ollama/ollama/labels/networking",
"name": "networking",
"color": "0B5368",
"default": false,
"description": "Issues relating to ollama pull and push"
}
] |
open
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-05-29T05:25:33
| 2024-07-15T02:16:42
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have tried many times, and the push always fails with the same error:
```
pushing 6d1c7eba2686... 0% ▕ ▏ 1.2 MB/ 74 GB
Error: max retries exceeded: http status 502 Bad Gateway: <html>
<head><title>502 Bad Gateway</title></head>
<body>
<center><h1>502 Bad Gateway</h1></center>
<hr><center>cloudflare</center>
</body>
</html>
```
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
1.38
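For what it's worth, transient 502s from a CDN edge usually respond to client-side retries with backoff. A generic sketch of that pattern (this is not ollama's actual retry logic; the function and its parameters are illustrative):

```python
import random
import time

def retry_with_backoff(fn, attempts=5, base=0.5):
    """Call fn(), retrying on failure with exponential backoff plus jitter.

    Generic illustration of the usual remedy for transient gateway errors;
    not the retry policy ollama itself implements.
    """
    for i in range(attempts):
        try:
            return fn()
        except RuntimeError:
            if i == attempts - 1:
                raise  # out of attempts, surface the last error
            # Exponential backoff (base * 2^i) with uniform jitter.
            time.sleep(base * (2 ** i) + random.uniform(0, base))
```

With `base=0.5` and five attempts this waits roughly 0.5s, 1s, 2s, 4s between tries before giving up.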
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4694/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4694/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7088
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7088/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7088/comments
|
https://api.github.com/repos/ollama/ollama/issues/7088/events
|
https://github.com/ollama/ollama/issues/7088
| 2,563,332,149
|
I_kwDOJ0Z1Ps6YyVg1
| 7,088
|
Add CLI Support for Setting Temperature in ollama run
|
{
"login": "Bakht-Ullah",
"id": 150267165,
"node_id": "U_kgDOCPTlHQ",
"avatar_url": "https://avatars.githubusercontent.com/u/150267165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Bakht-Ullah",
"html_url": "https://github.com/Bakht-Ullah",
"followers_url": "https://api.github.com/users/Bakht-Ullah/followers",
"following_url": "https://api.github.com/users/Bakht-Ullah/following{/other_user}",
"gists_url": "https://api.github.com/users/Bakht-Ullah/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Bakht-Ullah/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Bakht-Ullah/subscriptions",
"organizations_url": "https://api.github.com/users/Bakht-Ullah/orgs",
"repos_url": "https://api.github.com/users/Bakht-Ullah/repos",
"events_url": "https://api.github.com/users/Bakht-Ullah/events{/privacy}",
"received_events_url": "https://api.github.com/users/Bakht-Ullah/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-10-03T07:31:17
| 2024-10-03T15:50:53
| 2024-10-03T15:50:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I would like to request the ability to set the temperature parameter directly from the command line when running models with `ollama run`.
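In the meantime, the HTTP API already accepts per-request sampling options, which a CLI flag could mirror. A minimal sketch of such a payload for `/api/generate` (the model name is an assumption for illustration):

```python
import json

# Sketch of a per-request options payload for Ollama's /api/generate endpoint.
# The model name "llama3" is an assumption for illustration.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "options": {"temperature": 0.2},  # overrides the Modelfile default for this request
    "stream": False,
}
body = json.dumps(payload)
```

Sending `body` as the POST data to `http://localhost:11434/api/generate` applies the temperature for that request only.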
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7088/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7088/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7421
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7421/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7421/comments
|
https://api.github.com/repos/ollama/ollama/issues/7421/events
|
https://github.com/ollama/ollama/issues/7421
| 2,624,533,467
|
I_kwDOJ0Z1Ps6cbzPb
| 7,421
|
Docs: Add Linux manual instructions that can run without root / sudo
|
{
"login": "gabe-l-hart",
"id": 1254484,
"node_id": "MDQ6VXNlcjEyNTQ0ODQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1254484?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gabe-l-hart",
"html_url": "https://github.com/gabe-l-hart",
"followers_url": "https://api.github.com/users/gabe-l-hart/followers",
"following_url": "https://api.github.com/users/gabe-l-hart/following{/other_user}",
"gists_url": "https://api.github.com/users/gabe-l-hart/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gabe-l-hart/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gabe-l-hart/subscriptions",
"organizations_url": "https://api.github.com/users/gabe-l-hart/orgs",
"repos_url": "https://api.github.com/users/gabe-l-hart/repos",
"events_url": "https://api.github.com/users/gabe-l-hart/events{/privacy}",
"received_events_url": "https://api.github.com/users/gabe-l-hart/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 9
| 2024-10-30T15:47:42
| 2025-01-30T01:36:56
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
## Description
In the [Linux manual install instructions](https://github.com/ollama/ollama/blob/main/docs/linux.md#manual-install), the commands are shown requiring `sudo` access. This is usually fine with personal machines, but often isn't for shared or managed machines. The request here is to add instructions on how to install without `root` access.
## Proposal
I'm happy to contribute the README PR. Here are my suggested changes:
---
## Manual install
### Download and extract the package:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```
### Non-root install
For those installing without root/sudo access, the install can also be done in user space with the appropriate additions to `PATH` and `LD_LIBRARY_PATH`:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
# Substitute any place you have write access for ~/.local
mkdir -p ~/.local
tar -C ~/.local -xzf ollama-linux-amd64.tgz
# Place this in your ~/.bashrc to persist
export PATH=$HOME/.local/bin:$PATH
export LD_LIBRARY_PATH=$HOME/.local/lib/ollama:$LD_LIBRARY_PATH
```
### Start Ollama:
```shell
ollama serve
```
In another terminal, verify that Ollama is running:
```shell
ollama -v
```
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7421/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7421/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6257
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6257/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6257/comments
|
https://api.github.com/repos/ollama/ollama/issues/6257/events
|
https://github.com/ollama/ollama/issues/6257
| 2,455,568,825
|
I_kwDOJ0Z1Ps6SXQG5
| 6,257
|
Wrong output with the new Llama3.1 and llama3-groq-tool-use pull
|
{
"login": "Goekdeniz-Guelmez",
"id": 60228478,
"node_id": "MDQ6VXNlcjYwMjI4NDc4",
"avatar_url": "https://avatars.githubusercontent.com/u/60228478?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Goekdeniz-Guelmez",
"html_url": "https://github.com/Goekdeniz-Guelmez",
"followers_url": "https://api.github.com/users/Goekdeniz-Guelmez/followers",
"following_url": "https://api.github.com/users/Goekdeniz-Guelmez/following{/other_user}",
"gists_url": "https://api.github.com/users/Goekdeniz-Guelmez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Goekdeniz-Guelmez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Goekdeniz-Guelmez/subscriptions",
"organizations_url": "https://api.github.com/users/Goekdeniz-Guelmez/orgs",
"repos_url": "https://api.github.com/users/Goekdeniz-Guelmez/repos",
"events_url": "https://api.github.com/users/Goekdeniz-Guelmez/events{/privacy}",
"received_events_url": "https://api.github.com/users/Goekdeniz-Guelmez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 22
| 2024-08-08T11:34:13
| 2024-08-09T09:01:41
| 2024-08-08T22:15:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
So I just pulled the new Llama3.1 model and I get wrong outputs:
```text
(base) ~/Desktop/J.O.S.I.E.v3.5/ ollama run llama3.1
>>> hello
se// ofam oroomex b hades (--et mim n {adctadic of ad re Snd// lentam in p re d {ur n
ned
=//im halouadsead and h {ing-- tour ( and ofan;
f lion (ndil o in Sam re);
tois);
read andam to to
the mentad
ic);
toexroidas toilim d { S in nomesaram w { b =^C
>>>
Use Ctrl + d or /bye to exit.
```
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.4
# Update!
Re-pull the model with `ollama pull llama3.1`.
The way the tensors were encoded was causing memory to be over-allocated.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6257/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6257/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3439
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3439/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3439/comments
|
https://api.github.com/repos/ollama/ollama/issues/3439/events
|
https://github.com/ollama/ollama/issues/3439
| 2,218,456,716
|
I_kwDOJ0Z1Ps6EOvaM
| 3,439
|
Linux download of Ollama throws destination error
|
{
"login": "Chidu2000",
"id": 54184017,
"node_id": "MDQ6VXNlcjU0MTg0MDE3",
"avatar_url": "https://avatars.githubusercontent.com/u/54184017?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Chidu2000",
"html_url": "https://github.com/Chidu2000",
"followers_url": "https://api.github.com/users/Chidu2000/followers",
"following_url": "https://api.github.com/users/Chidu2000/following{/other_user}",
"gists_url": "https://api.github.com/users/Chidu2000/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Chidu2000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Chidu2000/subscriptions",
"organizations_url": "https://api.github.com/users/Chidu2000/orgs",
"repos_url": "https://api.github.com/users/Chidu2000/repos",
"events_url": "https://api.github.com/users/Chidu2000/events{/privacy}",
"received_events_url": "https://api.github.com/users/Chidu2000/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 6678628138,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjhPHKg",
"url": "https://api.github.com/repos/ollama/ollama/labels/install",
"name": "install",
"color": "E0B88D",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-04-01T15:04:30
| 2024-06-01T22:56:35
| 2024-06-01T22:56:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The curl command did not work and Ollama did not get installed:
```
curl -fsSL https://ollama.com/install.sh | sh
>>> Downloading ollama...
Warning: Failed to open the file /tmp/tmp.XW0NnkYWmf/ollama: No such file or %##O#-#
Warning: directory
curl: (23) Failure writing output to destination
```
### What did you expect to see?
Expected the download to complete correctly.
### Steps to reproduce
1. Navigate to the Ollama website
2. Click the Linux download option
3. Copy the curl command for Linux installation
4. Run it in your terminal
### Are there any recent changes that introduced the issue?
No
### OS
Linux
### Architecture
x86
### Platform
_No response_
### Ollama version
'llama' from snap llama (1.4.0)
### GPU
Nvidia
### GPU info
545.29.06
Cuda version: 12.3
### CPU
Intel
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3439/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3439/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/944
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/944/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/944/comments
|
https://api.github.com/repos/ollama/ollama/issues/944/events
|
https://github.com/ollama/ollama/pull/944
| 1,966,774,740
|
PR_kwDOJ0Z1Ps5eCaHz
| 944
|
feat: add webi as install option in readme
|
{
"login": "todpunk",
"id": 1166358,
"node_id": "MDQ6VXNlcjExNjYzNTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/1166358?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/todpunk",
"html_url": "https://github.com/todpunk",
"followers_url": "https://api.github.com/users/todpunk/followers",
"following_url": "https://api.github.com/users/todpunk/following{/other_user}",
"gists_url": "https://api.github.com/users/todpunk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/todpunk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/todpunk/subscriptions",
"organizations_url": "https://api.github.com/users/todpunk/orgs",
"repos_url": "https://api.github.com/users/todpunk/repos",
"events_url": "https://api.github.com/users/todpunk/events{/privacy}",
"received_events_url": "https://api.github.com/users/todpunk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-10-28T23:52:57
| 2024-02-22T19:11:55
| 2024-02-22T19:11:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/944",
"html_url": "https://github.com/ollama/ollama/pull/944",
"diff_url": "https://github.com/ollama/ollama/pull/944.diff",
"patch_url": "https://github.com/ollama/ollama/pull/944.patch",
"merged_at": null
}
|
This PR is purely for another install option in the README; it will be available shortly.
I've been getting ollama added to Webi because I use it to install CLI tools wherever I can these days (I like simple, portable installs, and Webi works well in our CI/CD jobs, which often have to run cross-platform).
The PR for that, at the time of this writing, is https://github.com/webinstall/webi-installers/pull/712
Happy to make adjustments to the wording, placement, or whatever would be helpful.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/944/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/944/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7145
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7145/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7145/comments
|
https://api.github.com/repos/ollama/ollama/issues/7145/events
|
https://github.com/ollama/ollama/pull/7145
| 2,574,451,356
|
PR_kwDOJ0Z1Ps5-Amb4
| 7,145
|
prompt: Retain image through messages in a conversation
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-10-09T00:08:04
| 2024-10-10T21:38:29
| 2024-10-10T19:04:49
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7145",
"html_url": "https://github.com/ollama/ollama/pull/7145",
"diff_url": "https://github.com/ollama/ollama/pull/7145.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7145.patch",
"merged_at": null
}
| null |
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7145/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7145/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8472
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8472/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8472/comments
|
https://api.github.com/repos/ollama/ollama/issues/8472/events
|
https://github.com/ollama/ollama/issues/8472
| 2,796,045,454
|
I_kwDOJ0Z1Ps6mqESO
| 8,472
|
Hangs with 2x P40 GPUs when switch model
|
{
"login": "fred-vaneijk",
"id": 178751132,
"node_id": "U_kgDOCqeGnA",
"avatar_url": "https://avatars.githubusercontent.com/u/178751132?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fred-vaneijk",
"html_url": "https://github.com/fred-vaneijk",
"followers_url": "https://api.github.com/users/fred-vaneijk/followers",
"following_url": "https://api.github.com/users/fred-vaneijk/following{/other_user}",
"gists_url": "https://api.github.com/users/fred-vaneijk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fred-vaneijk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fred-vaneijk/subscriptions",
"organizations_url": "https://api.github.com/users/fred-vaneijk/orgs",
"repos_url": "https://api.github.com/users/fred-vaneijk/repos",
"events_url": "https://api.github.com/users/fred-vaneijk/events{/privacy}",
"received_events_url": "https://api.github.com/users/fred-vaneijk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2025-01-17T18:14:15
| 2025-01-20T15:52:13
| 2025-01-20T15:52:11
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When running Ollama in a Docker container with two NVIDIA P40 GPUs and Open WebUI is used to switch models (from Phi to llama3.2 or vice versa), the web UI hangs when a new query is issued. In nvtop I notice a compute process that is using 100% CPU (not GPU). The only way to recover is to restart the Docker container. I suspect there is some infinite loop occurring during or prior to the model switch. I also noticed that if OLLAMA_KEEP_ALIVE is set to a very short time, like 10s, the 100% CPU issue also manifests.
Model switching works fine when configured with a single GPU, i.e. NVIDIA_VISIBLE_DEVICES=0 and count: 1.
Here is the Docker Compose config file:
```yaml
version: '3.8'
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    restart: unless-stopped
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_API_BASE_URL=http://host.docker.internal:11434/api
      - WEBUI_AUTH=false
    depends_on:
      - ollama
    volumes:
      - open-webui:/app/backend/data
    networks:
      - ollama-network
    extra_hosts:
      - "host.docker.internal:host-gateway"
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: unless-stopped
    ports:
      - "11434:11434"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 2
              capabilities: [gpu]
    environment:
      - NVIDIA_VISIBLE_DEVICES=0,1
      - OLLAMA_CUDA_VERSION=12.6
      - OLLAMA_MAX_LOADED_MODELS=1
      - OLLAMA_KEEP_ALIVE=-1
      - OLLAMA_MAX_QUEUE=1
      - OLLAMA_HOST=0.0.0.0
      - OLLAMA_DEBUG=1 # Add this for verbose logging
      - OLLAMA_TIMEOUT=10s # Add timeout for operations
      - OLLAMA_NOPRUNE=true
    runtime: nvidia
    volumes:
      - ollama:/root/.ollama
    networks:
      - ollama-network
networks:
  ollama-network:
    driver: bridge
volumes:
  open-webui:
  ollama:
```

[ollama-p40-log.txt](https://github.com/user-attachments/files/18458285/ollama-p40-log.txt)
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.7-0-ga420a45-dirty
|
{
"login": "fred-vaneijk",
"id": 178751132,
"node_id": "U_kgDOCqeGnA",
"avatar_url": "https://avatars.githubusercontent.com/u/178751132?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fred-vaneijk",
"html_url": "https://github.com/fred-vaneijk",
"followers_url": "https://api.github.com/users/fred-vaneijk/followers",
"following_url": "https://api.github.com/users/fred-vaneijk/following{/other_user}",
"gists_url": "https://api.github.com/users/fred-vaneijk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fred-vaneijk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fred-vaneijk/subscriptions",
"organizations_url": "https://api.github.com/users/fred-vaneijk/orgs",
"repos_url": "https://api.github.com/users/fred-vaneijk/repos",
"events_url": "https://api.github.com/users/fred-vaneijk/events{/privacy}",
"received_events_url": "https://api.github.com/users/fred-vaneijk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8472/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8472/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6846
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6846/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6846/comments
|
https://api.github.com/repos/ollama/ollama/issues/6846/events
|
https://github.com/ollama/ollama/pull/6846
| 2,532,275,599
|
PR_kwDOJ0Z1Ps570ral
| 6,846
|
add solar pro (preview)
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-17T22:19:38
| 2024-09-18T01:11:28
| 2024-09-18T01:11:26
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6846",
"html_url": "https://github.com/ollama/ollama/pull/6846",
"diff_url": "https://github.com/ollama/ollama/pull/6846.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6846.patch",
"merged_at": "2024-09-18T01:11:26"
}
|
Solar Pro introduces block skip connections, where blocks are connected to other, non-sequential blocks with a scale multiple.
This change adds 4 new keys to store the skip connections and one new tensor to store the scalar. The scalar is implemented as a 1-dimensional tensor with 2 elements derived from the model's `bskcn_tv` configuration; in general, the values are `bskcn_tv, 1 - bskcn_tv`.
https://huggingface.co/upstage/solar-pro-preview-instruct
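A minimal sketch of how the 2-element scale tensor described above could be derived from `bskcn_tv` (the helper name and the blending interpretation are illustrative assumptions, not the PR's actual implementation):

```python
import numpy as np

def bskcn_scale_tensor(bskcn_tv: float) -> np.ndarray:
    """Build the 1-D, 2-element scalar tensor from bskcn_tv.

    Per the description above, the stored values are
    [bskcn_tv, 1 - bskcn_tv]; a skip connection would then blend
    two block outputs with these complementary weights
    (hypothetical usage, for illustration only).
    """
    return np.array([bskcn_tv, 1.0 - bskcn_tv], dtype=np.float32)

# Example: bskcn_tv = 0.8 yields weights [0.8, 0.2]
scales = bskcn_scale_tensor(0.8)
```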
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6846/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 1,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/6846/timeline
| null | null | true
|