| url | repository_url | labels_url | comments_url | events_url | html_url | id | node_id | number | title | user | labels | state | locked | assignee | assignees | milestone | comments | created_at | updated_at | closed_at | author_association | sub_issues_summary | active_lock_reason | draft | pull_request | body | closed_by | reactions | timeline_url | performed_via_github_app | state_reason | is_pull_request |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/6789
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6789/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6789/comments
|
https://api.github.com/repos/ollama/ollama/issues/6789/events
|
https://github.com/ollama/ollama/pull/6789
| 2,524,124,073
|
PR_kwDOJ0Z1Ps57ZZmI
| 6,789
|
readme: add Obsidian Quiz Generator plugin to community integrations
|
{
"login": "ECuiDev",
"id": 37892357,
"node_id": "MDQ6VXNlcjM3ODkyMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/37892357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ECuiDev",
"html_url": "https://github.com/ECuiDev",
"followers_url": "https://api.github.com/users/ECuiDev/followers",
"following_url": "https://api.github.com/users/ECuiDev/following{/other_user}",
"gists_url": "https://api.github.com/users/ECuiDev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ECuiDev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ECuiDev/subscriptions",
"organizations_url": "https://api.github.com/users/ECuiDev/orgs",
"repos_url": "https://api.github.com/users/ECuiDev/repos",
"events_url": "https://api.github.com/users/ECuiDev/events{/privacy}",
"received_events_url": "https://api.github.com/users/ECuiDev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-13T07:48:01
| 2024-09-15T03:52:37
| 2024-09-15T03:52:37
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6789",
"html_url": "https://github.com/ollama/ollama/pull/6789",
"diff_url": "https://github.com/ollama/ollama/pull/6789.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6789.patch",
"merged_at": "2024-09-15T03:52:37"
}
|
**Plugin Demo**
https://github.com/user-attachments/assets/24e57fcf-2cbf-4797-a161-4c4a05e518bf
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6789/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6789/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2445
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2445/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2445/comments
|
https://api.github.com/repos/ollama/ollama/issues/2445/events
|
https://github.com/ollama/ollama/issues/2445
| 2,128,848,372
|
I_kwDOJ0Z1Ps5-46X0
| 2,445
|
Ollama stuck on "CUDA Compute Capability detected: 7.5"
|
{
"login": "Rhimzy",
"id": 88019073,
"node_id": "MDQ6VXNlcjg4MDE5MDcz",
"avatar_url": "https://avatars.githubusercontent.com/u/88019073?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rhimzy",
"html_url": "https://github.com/Rhimzy",
"followers_url": "https://api.github.com/users/Rhimzy/followers",
"following_url": "https://api.github.com/users/Rhimzy/following{/other_user}",
"gists_url": "https://api.github.com/users/Rhimzy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rhimzy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rhimzy/subscriptions",
"organizations_url": "https://api.github.com/users/Rhimzy/orgs",
"repos_url": "https://api.github.com/users/Rhimzy/repos",
"events_url": "https://api.github.com/users/Rhimzy/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rhimzy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-02-11T05:37:55
| 2024-02-20T07:49:09
| 2024-02-20T07:48:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Windows 11
Ubuntu WSL
Logs:
```
> OLLAMA_HOST=127.0.0.1:11435 ollama serve
time=2024-02-11T11:04:49.410+05:30 level=INFO source=images.go:863 msg="total blobs: 0"
time=2024-02-11T11:04:49.410+05:30 level=INFO source=images.go:870 msg="total unused blobs removed: 0"
time=2024-02-11T11:04:49.410+05:30 level=INFO source=routes.go:999 msg="Listening on 127.0.0.1:11435 (version 0.1.24)"
time=2024-02-11T11:04:49.411+05:30 level=INFO source=payload_common.go:106 msg="Extracting dynamic libraries..."
time=2024-02-11T11:04:51.905+05:30 level=INFO source=payload_common.go:145 msg="Dynamic LLM libraries [cpu_avx cpu_avx2 rocm_v5 rocm_v6 cpu cuda_v11]"
time=2024-02-11T11:04:51.905+05:30 level=INFO source=gpu.go:94 msg="Detecting GPU type"
time=2024-02-11T11:04:51.905+05:30 level=INFO source=gpu.go:242 msg="Searching for GPU management library libnvidia-ml.so"
time=2024-02-11T11:04:53.334+05:30 level=INFO source=gpu.go:288 msg="Discovered GPU libraries: [/usr/lib/wsl/lib/libnvidia-ml.so.1 /usr/lib/wsl/drivers/nvami.inf_amd64_99c8019dbacde1b2/libnvidia-ml.so.1]"
time=2024-02-11T11:04:54.300+05:30 level=INFO source=gpu.go:99 msg="Nvidia GPU detected"
time=2024-02-11T11:04:54.301+05:30 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-11T11:04:54.307+05:30 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 7.5"
```
And it just gets stuck there.
I am not familiar with what happens after that.
|
{
"login": "Rhimzy",
"id": 88019073,
"node_id": "MDQ6VXNlcjg4MDE5MDcz",
"avatar_url": "https://avatars.githubusercontent.com/u/88019073?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Rhimzy",
"html_url": "https://github.com/Rhimzy",
"followers_url": "https://api.github.com/users/Rhimzy/followers",
"following_url": "https://api.github.com/users/Rhimzy/following{/other_user}",
"gists_url": "https://api.github.com/users/Rhimzy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Rhimzy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Rhimzy/subscriptions",
"organizations_url": "https://api.github.com/users/Rhimzy/orgs",
"repos_url": "https://api.github.com/users/Rhimzy/repos",
"events_url": "https://api.github.com/users/Rhimzy/events{/privacy}",
"received_events_url": "https://api.github.com/users/Rhimzy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2445/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2445/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2821
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2821/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2821/comments
|
https://api.github.com/repos/ollama/ollama/issues/2821/events
|
https://github.com/ollama/ollama/issues/2821
| 2,160,166,301
|
I_kwDOJ0Z1Ps6AwYWd
| 2,821
|
Can we have the newest 1-bit model
|
{
"login": "chuangtc",
"id": 2288469,
"node_id": "MDQ6VXNlcjIyODg0Njk=",
"avatar_url": "https://avatars.githubusercontent.com/u/2288469?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chuangtc",
"html_url": "https://github.com/chuangtc",
"followers_url": "https://api.github.com/users/chuangtc/followers",
"following_url": "https://api.github.com/users/chuangtc/following{/other_user}",
"gists_url": "https://api.github.com/users/chuangtc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chuangtc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chuangtc/subscriptions",
"organizations_url": "https://api.github.com/users/chuangtc/orgs",
"repos_url": "https://api.github.com/users/chuangtc/repos",
"events_url": "https://api.github.com/users/chuangtc/events{/privacy}",
"received_events_url": "https://api.github.com/users/chuangtc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 16
| 2024-02-29T01:24:50
| 2025-01-14T02:04:26
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits
https://thegenerality.com/agi/
https://arxiv.org/abs/2402.17764
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2821/reactions",
"total_count": 29,
"+1": 27,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/2821/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1138
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1138/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1138/comments
|
https://api.github.com/repos/ollama/ollama/issues/1138/events
|
https://github.com/ollama/ollama/pull/1138
| 1,995,005,325
|
PR_kwDOJ0Z1Ps5fh_2M
| 1,138
|
Add error handling for get_summary function in the newssummary example.
|
{
"login": "Amosel",
"id": 61532,
"node_id": "MDQ6VXNlcjYxNTMy",
"avatar_url": "https://avatars.githubusercontent.com/u/61532?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Amosel",
"html_url": "https://github.com/Amosel",
"followers_url": "https://api.github.com/users/Amosel/followers",
"following_url": "https://api.github.com/users/Amosel/following{/other_user}",
"gists_url": "https://api.github.com/users/Amosel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Amosel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Amosel/subscriptions",
"organizations_url": "https://api.github.com/users/Amosel/orgs",
"repos_url": "https://api.github.com/users/Amosel/repos",
"events_url": "https://api.github.com/users/Amosel/events{/privacy}",
"received_events_url": "https://api.github.com/users/Amosel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2023-11-15T15:28:22
| 2023-11-24T18:08:21
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1138",
"html_url": "https://github.com/ollama/ollama/pull/1138",
"diff_url": "https://github.com/ollama/ollama/pull/1138.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1138.patch",
"merged_at": null
}
|
Calling `get_summary` fails when the hard-coded model `mistral-openorca` is not installed. It makes sense to save people the trouble of figuring out why.
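The PR's motivation, failing fast with an actionable message when a hard-coded model is absent, can be sketched generically. This is a dependency-free illustration, not the example's actual code: `ModelNotFoundError` and both function names are hypothetical stand-ins.

```python
# Hedged sketch: wrap a summarization call so a missing model yields a clear,
# actionable message instead of an opaque failure.
# `ModelNotFoundError` and `get_summary` are hypothetical stand-ins.

class ModelNotFoundError(Exception):
    pass

def get_summary(text: str, model: str = "mistral-openorca") -> str:
    # Stand-in for the real call; pretend the model is not installed.
    raise ModelNotFoundError(model)

def get_summary_safe(text: str) -> str:
    try:
        return get_summary(text)
    except ModelNotFoundError as err:
        return f"Model {err} is not installed. Try: ollama pull {err}"

print(get_summary_safe("some article text"))
```

The same try/except shape applies regardless of which client library raises the underlying error.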
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1138/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1138/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4953
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4953/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4953/comments
|
https://api.github.com/repos/ollama/ollama/issues/4953/events
|
https://github.com/ollama/ollama/pull/4953
| 2,342,275,701
|
PR_kwDOJ0Z1Ps5x5Dt7
| 4,953
|
refactor: modify dockerignore
|
{
"login": "Gabrielfernandes7",
"id": 78227127,
"node_id": "MDQ6VXNlcjc4MjI3MTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/78227127?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Gabrielfernandes7",
"html_url": "https://github.com/Gabrielfernandes7",
"followers_url": "https://api.github.com/users/Gabrielfernandes7/followers",
"following_url": "https://api.github.com/users/Gabrielfernandes7/following{/other_user}",
"gists_url": "https://api.github.com/users/Gabrielfernandes7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Gabrielfernandes7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Gabrielfernandes7/subscriptions",
"organizations_url": "https://api.github.com/users/Gabrielfernandes7/orgs",
"repos_url": "https://api.github.com/users/Gabrielfernandes7/repos",
"events_url": "https://api.github.com/users/Gabrielfernandes7/events{/privacy}",
"received_events_url": "https://api.github.com/users/Gabrielfernandes7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-09T14:04:30
| 2024-06-09T15:45:30
| 2024-06-09T15:45:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4953",
"html_url": "https://github.com/ollama/ollama/pull/4953",
"diff_url": "https://github.com/ollama/ollama/pull/4953.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4953.patch",
"merged_at": null
}
|
This PR modifies the `.dockerignore` file in the repository to optimize the Docker image build. References issue #4952.
|
{
"login": "Gabrielfernandes7",
"id": 78227127,
"node_id": "MDQ6VXNlcjc4MjI3MTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/78227127?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Gabrielfernandes7",
"html_url": "https://github.com/Gabrielfernandes7",
"followers_url": "https://api.github.com/users/Gabrielfernandes7/followers",
"following_url": "https://api.github.com/users/Gabrielfernandes7/following{/other_user}",
"gists_url": "https://api.github.com/users/Gabrielfernandes7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Gabrielfernandes7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Gabrielfernandes7/subscriptions",
"organizations_url": "https://api.github.com/users/Gabrielfernandes7/orgs",
"repos_url": "https://api.github.com/users/Gabrielfernandes7/repos",
"events_url": "https://api.github.com/users/Gabrielfernandes7/events{/privacy}",
"received_events_url": "https://api.github.com/users/Gabrielfernandes7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4953/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4953/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4780
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4780/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4780/comments
|
https://api.github.com/repos/ollama/ollama/issues/4780/events
|
https://github.com/ollama/ollama/issues/4780
| 2,329,548,993
|
I_kwDOJ0Z1Ps6K2hjB
| 4,780
|
Radeon VII gfx906:sramecc-:xnack- windows support
|
{
"login": "MrSteelRat",
"id": 31157848,
"node_id": "MDQ6VXNlcjMxMTU3ODQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/31157848?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MrSteelRat",
"html_url": "https://github.com/MrSteelRat",
"followers_url": "https://api.github.com/users/MrSteelRat/followers",
"following_url": "https://api.github.com/users/MrSteelRat/following{/other_user}",
"gists_url": "https://api.github.com/users/MrSteelRat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MrSteelRat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MrSteelRat/subscriptions",
"organizations_url": "https://api.github.com/users/MrSteelRat/orgs",
"repos_url": "https://api.github.com/users/MrSteelRat/repos",
"events_url": "https://api.github.com/users/MrSteelRat/events{/privacy}",
"received_events_url": "https://api.github.com/users/MrSteelRat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
},
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2024-06-02T09:08:08
| 2024-08-03T10:52:03
| 2024-07-22T16:50:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello, please add support for the Radeon VII GPU to the regular version. I can help with testing if necessary.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4780/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4780/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7525
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7525/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7525/comments
|
https://api.github.com/repos/ollama/ollama/issues/7525/events
|
https://github.com/ollama/ollama/issues/7525
| 2,637,669,131
|
I_kwDOJ0Z1Ps6dN6ML
| 7,525
|
Vector store question-answering issue
|
{
"login": "NXL333",
"id": 62203971,
"node_id": "MDQ6VXNlcjYyMjAzOTcx",
"avatar_url": "https://avatars.githubusercontent.com/u/62203971?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NXL333",
"html_url": "https://github.com/NXL333",
"followers_url": "https://api.github.com/users/NXL333/followers",
"following_url": "https://api.github.com/users/NXL333/following{/other_user}",
"gists_url": "https://api.github.com/users/NXL333/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NXL333/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NXL333/subscriptions",
"organizations_url": "https://api.github.com/users/NXL333/orgs",
"repos_url": "https://api.github.com/users/NXL333/repos",
"events_url": "https://api.github.com/users/NXL333/events{/privacy}",
"received_events_url": "https://api.github.com/users/NXL333/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-11-06T10:31:39
| 2024-11-17T14:07:43
| 2024-11-17T14:07:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
1:
Large model for a smart land border port.
I. Three major application scenarios:
(1) Overall port situation analysis:
1. Analysis of the port's four flows — people, logistics, information, and goods: e.g. personnel movement (cross-border flows, analysis and statistics of illegal entry/exit), items carried by travelers (identifying possible risks and opportunities), and integration of port information;
2. Trade analysis: using yearbooks and Ministry of Commerce documents and data, analyze changes in port trade volume, trade commodity structure (raw materials, processing, general trade, etc.), product imports and exports, and country of origin;
3. Geopolitics: the five Central Asian countries and ASEAN;
4. Policy early warning and government decision support (bilateral agreements, tariff policy, how policies are stipulated, which countries have signed tax-exemption agreements listed on the Ministry of Commerce website (for convenient automatic lookup by enterprises));
5. Inspection and quarantine policies and procedures: provide enterprises with both sides' quarantine standards and specific procedural requirements;
6. Product standards (so enterprises can check compliance with the destination country's product standards), and the standards of both sides.
(2) Analysis of the port itself:
1. Using port gazetteer data, analyze port types (some serve bilateral trade, some only support transit — corridor-type ports), their relationships, layout, and spatiotemporal characteristics (analyze in which city a port yields the best return, optimize port layout, support urban development);
2. Port history: which natural disasters or historical events caused ports to be added/removed or opened/closed;
3. Border-inspection risk warning: based on accumulated historical data, issue warnings for border and personnel inspection, e.g. cross-border locust disaster warnings;
4. How to optimize customs clearance, shorten clearance time, and improve facilitation: the main problem is information asymmetry — public awareness could perhaps be raised through official-account videos and similar formats.
(3) Enterprise and public needs:
1. Tourism: smart tourism and travel guides, with particular attention to legal issues in both countries, plus a FAQ for visa-free countries;
2. Port knowledge: local history and geography around ports, to promote knowledge sharing;
3. Patriotic-education sites: places of historical significance, cultural heritage, or unique geography;
4. Port investment potential analysis: understand the other country's situation through the system and assess investment risk;
5. Investment policies and rules: understand the other country's investment policies.

These are the chunks retrieved from my vector store. I then call the model with this code:
```python
vectorstore = FAISS.from_texts(chunk_text, embeddings)
retriever = vectorstore.as_retriever()
# 3. Ask the model
template = """Answer the question based only on the following context:
{context}
Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)
chain = (
    {"context": retriever, "question": RunnablePassthrough()}
    | prompt
    | model_local
    | StrOutputParser()
)
```
but its answer is incomplete. This is its answer:

Overall port situation analysis includes:
- People-flow analysis: personnel movement (cross-border flows, analysis and statistics of illegal entry/exit)
- Logistics analysis: items carried by travelers (identifying possible risks and opportunities)
- Information-flow analysis: integration of port information
- Goods-flow analysis: changes in trade volume, trade commodity structure (raw materials, processing, general trade, etc.), product imports and exports, country of origin.

Why did it not answer the later parts? How should I modify this? Thank you!
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
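The incomplete answer described in this issue is consistent with LangChain's default retriever behavior: `as_retriever()` returns only the top-k most similar chunks (k defaults to 4 in LangChain's `VectorStoreRetriever`), so chunks ranked below the cutoff never reach the `{context}` slot of the prompt. A minimal dependency-free sketch of that cutoff — the `retrieve` helper and chunk names are hypothetical, for illustration only:

```python
# Dependency-free sketch of top-k retrieval: only the k highest-ranked chunks
# reach the prompt; everything ranked below the cutoff is silently dropped.
# (The helper and chunk names are hypothetical, for illustration only.)

def retrieve(ranked_chunks, k=4):
    """Return the k highest-ranked chunks, mimicking a vector-store retriever."""
    return ranked_chunks[:k]

# Ten hypothetical document sections, ranked by similarity to the question.
chunks = [f"section-{i}" for i in range(1, 11)]

print(retrieve(chunks))        # default k=4: later sections never reach the model
print(retrieve(chunks, k=8))   # a larger k widens the retrieved context
```

With LangChain, the hedged fix would be along the lines of `vectorstore.as_retriever(search_kwargs={"k": 8})`, which widens the retrieved context at the cost of a longer prompt.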
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7525/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7525/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/170
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/170/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/170/comments
|
https://api.github.com/repos/ollama/ollama/issues/170/events
|
https://github.com/ollama/ollama/issues/170
| 1,816,537,190
|
I_kwDOJ0Z1Ps5sRihm
| 170
|
How to fix `Error: stream: digest mismatch`
|
{
"login": "dtgriscom",
"id": 842958,
"node_id": "MDQ6VXNlcjg0Mjk1OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/842958?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dtgriscom",
"html_url": "https://github.com/dtgriscom",
"followers_url": "https://api.github.com/users/dtgriscom/followers",
"following_url": "https://api.github.com/users/dtgriscom/following{/other_user}",
"gists_url": "https://api.github.com/users/dtgriscom/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dtgriscom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dtgriscom/subscriptions",
"organizations_url": "https://api.github.com/users/dtgriscom/orgs",
"repos_url": "https://api.github.com/users/dtgriscom/repos",
"events_url": "https://api.github.com/users/dtgriscom/events{/privacy}",
"received_events_url": "https://api.github.com/users/dtgriscom/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2023-07-22T02:04:21
| 2023-07-24T20:52:41
| 2023-07-24T20:52:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I was downloading `llama2:13b`, and for some reason the download went wrong. Now, when I try to run it, I get an error:
```
MacBook-Pro-2:~ griscom$ ollama run llama2:13b
pulling manifest
pulling f79142715bc9... 100% |█████████████████████████████████████████████████| (7.3/7.3 GB, 3.5 TB/s)
pulling 2cc93ea1ade8... 100% |████████████████████████████████████████████████████| (90/90 B, 478 kB/s)
pulling a73730bc2562... 100% |██████████████████████████████████████████████████| (509/509 B, 7.6 MB/s)
pulling 13af22070723... 100% |██████████████████████████████████████████████████| (4.4/4.4 kB, 66 MB/s)
pulling 6e004b4cefda... 100% |██████████████████████████████████████████████████| (373/373 B, 1.1 MB/s)
verifying sha256 digest
Error: stream: digest mismatch: want sha256:f79142715bc9539a2edbb4b253548db8b34fac22736593eeaa28555874476e30, got sha256:88c8d47ae981fac63e71d20e9a74324252bcd52a11cc999c09bf38ee8106c723
MacBook-Pro-2:~ griscom$
```
I expect I need to remove the model and download again, but that doesn't work:
```
MacBook-Pro-2:~ griscom$ ollama rm llama2:13b
Error: unmarshal: invalid character '{' after top-level value
MacBook-Pro-2:~ griscom$
```
How can I fix this? (I presume by deleting bits and pieces from `~/.ollama`...)
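The `digest mismatch` error reported here is content-addressed verification at work: each pulled layer's bytes must hash (SHA-256) to exactly the digest recorded in the manifest, so a corrupted or truncated download is rejected. A minimal sketch of that check — the function name and sample data are hypothetical, not ollama's actual code:

```python
import hashlib

def verify_blob(data: bytes, want: str) -> bool:
    """Accept the blob only if its SHA-256 hex digest matches the manifest entry."""
    return hashlib.sha256(data).hexdigest() == want

layer = b"hypothetical model layer bytes"
manifest_digest = hashlib.sha256(layer).hexdigest()

print(verify_blob(layer, manifest_digest))       # intact download passes
print(verify_blob(layer[:-1], manifest_digest))  # truncated download fails
```

Because verification only happens after the transfer, the usual remedy for a bad pull is to delete the offending blobs and re-pull so fresh bytes can be hashed against the manifest again.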
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/170/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/170/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2316
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2316/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2316/comments
|
https://api.github.com/repos/ollama/ollama/issues/2316/events
|
https://github.com/ollama/ollama/pull/2316
| 2,113,887,631
|
PR_kwDOJ0Z1Ps5lxott
| 2,316
|
Clear previous images when submitting a new image to `ollama run`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-02-02T02:11:43
| 2024-02-02T05:30:26
| 2024-02-02T05:30:26
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2316",
"html_url": "https://github.com/ollama/ollama/pull/2316",
"diff_url": "https://github.com/ollama/ollama/pull/2316.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2316.patch",
"merged_at": "2024-02-02T05:30:26"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2316/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2316/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6866
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6866/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6866/comments
|
https://api.github.com/repos/ollama/ollama/issues/6866/events
|
https://github.com/ollama/ollama/issues/6866
| 2,535,151,715
|
I_kwDOJ0Z1Ps6XG1hj
| 6,866
|
High CPU load with Jetson Orin NX
|
{
"login": "s0301132",
"id": 47412725,
"node_id": "MDQ6VXNlcjQ3NDEyNzI1",
"avatar_url": "https://avatars.githubusercontent.com/u/47412725?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/s0301132",
"html_url": "https://github.com/s0301132",
"followers_url": "https://api.github.com/users/s0301132/followers",
"following_url": "https://api.github.com/users/s0301132/following{/other_user}",
"gists_url": "https://api.github.com/users/s0301132/gists{/gist_id}",
"starred_url": "https://api.github.com/users/s0301132/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/s0301132/subscriptions",
"organizations_url": "https://api.github.com/users/s0301132/orgs",
"repos_url": "https://api.github.com/users/s0301132/repos",
"events_url": "https://api.github.com/users/s0301132/events{/privacy}",
"received_events_url": "https://api.github.com/users/s0301132/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 7
| 2024-09-19T02:43:22
| 2024-09-25T20:51:16
| 2024-09-25T20:51:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I used the arm64 build package and ran it successfully.
However, while the LLM is answering a question, the CPU load is 100% but the GPU is nearly 0% in `jtop`.
Is this normal, or can the arm64 build not use the GPU by default?

### OS
Linux
### GPU
Nvidia
### CPU
Other
### Ollama version
0.3.11
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6866/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6866/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4187
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4187/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4187/comments
|
https://api.github.com/repos/ollama/ollama/issues/4187/events
|
https://github.com/ollama/ollama/pull/4187
| 2,279,804,557
|
PR_kwDOJ0Z1Ps5ulIyg
| 4,187
|
Fix rare nil pointer dereference when model unloads
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-06T00:04:36
| 2024-05-06T00:18:27
| 2024-05-06T00:18:27
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4187",
"html_url": "https://github.com/ollama/ollama/pull/4187",
"diff_url": "https://github.com/ollama/ollama/pull/4187.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4187.patch",
"merged_at": "2024-05-06T00:18:27"
}
|
While testing concurrency I noticed an occasional segfault when repeatedly loading and canceling the same model with a script like this:
```
#!/bin/bash
# Command to run
COMMAND="ollama run llama3 hello"
# Number of times to run the command concurrently
N=100
# Running the command N times concurrently
for i in $(seq 1 $N); do
$COMMAND &
done
```
The error looked like this:
```
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x2 addr=0x0 pc=0x101298ae0]
goroutine 51 [running]:
github.com/ollama/ollama/server.(*runnerRef).needsReload(0x14000ffc5a0, {0x1018b17a0, 0x1400017d8b0}, 0x140003e20f0)
/Users/jmorgan/git/ollama/server/sched.go:472 +0x150
github.com/ollama/ollama/server.(*Scheduler).processPending(0x1400017d900, {0x1018b17a0, 0x1400017d8b0})
/Users/jmorgan/git/ollama/server/sched.go:143 +0x3d0
github.com/ollama/ollama/server.(*Scheduler).Run.func1()
/Users/jmorgan/git/ollama/server/sched.go:120 +0x28
created by github.com/ollama/ollama/server.(*Scheduler).Run in goroutine 1
/Users/jmorgan/git/ollama/server/sched.go:119 +0xc4
```
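The panic comes from dereferencing a runner reference after a concurrent unload has cleared it. The guard pattern is language-agnostic; a simplified illustration sketched in Python rather than the server's Go (not the actual `sched.go` change):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RunnerRef:
    """Simplified stand-in for the scheduler's loaded-runner handle."""
    model_path: str

def needs_reload(runner: Optional[RunnerRef], request_path: str) -> bool:
    """Report whether a pending request requires (re)loading the model.

    If the runner was unloaded concurrently (None), treat it as needing a
    load instead of touching its fields and crashing.
    """
    if runner is None:
        return True
    return runner.model_path != request_path
```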
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4187/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4187/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1399
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1399/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1399/comments
|
https://api.github.com/repos/ollama/ollama/issues/1399/events
|
https://github.com/ollama/ollama/pull/1399
| 2,028,709,419
|
PR_kwDOJ0Z1Ps5hUCmG
| 1,399
|
List "Send chat messages" in table of contents
|
{
"login": "calderonsamuel",
"id": 19418298,
"node_id": "MDQ6VXNlcjE5NDE4Mjk4",
"avatar_url": "https://avatars.githubusercontent.com/u/19418298?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/calderonsamuel",
"html_url": "https://github.com/calderonsamuel",
"followers_url": "https://api.github.com/users/calderonsamuel/followers",
"following_url": "https://api.github.com/users/calderonsamuel/following{/other_user}",
"gists_url": "https://api.github.com/users/calderonsamuel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/calderonsamuel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/calderonsamuel/subscriptions",
"organizations_url": "https://api.github.com/users/calderonsamuel/orgs",
"repos_url": "https://api.github.com/users/calderonsamuel/repos",
"events_url": "https://api.github.com/users/calderonsamuel/events{/privacy}",
"received_events_url": "https://api.github.com/users/calderonsamuel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-06T14:40:12
| 2023-12-06T20:34:27
| 2023-12-06T20:34:27
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1399",
"html_url": "https://github.com/ollama/ollama/pull/1399",
"diff_url": "https://github.com/ollama/ollama/pull/1399.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1399.patch",
"merged_at": "2023-12-06T20:34:27"
}
|
This PR just adds a line in `docs/api.md` to list the new endpoint in the TOC.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1399/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1399/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4019
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4019/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4019/comments
|
https://api.github.com/repos/ollama/ollama/issues/4019/events
|
https://github.com/ollama/ollama/pull/4019
| 2,268,060,797
|
PR_kwDOJ0Z1Ps5t9gXg
| 4,019
|
Fix copying model to itself
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-29T03:45:26
| 2024-04-29T03:47:50
| 2024-04-29T03:47:49
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4019",
"html_url": "https://github.com/ollama/ollama/pull/4019",
"diff_url": "https://github.com/ollama/ollama/pull/4019.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4019.patch",
"merged_at": "2024-04-29T03:47:49"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4019/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4019/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2255
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2255/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2255/comments
|
https://api.github.com/repos/ollama/ollama/issues/2255/events
|
https://github.com/ollama/ollama/issues/2255
| 2,105,597,445
|
I_kwDOJ0Z1Ps59gN4F
| 2,255
|
Output truncated in the extension
|
{
"login": "pums974",
"id": 1005109,
"node_id": "MDQ6VXNlcjEwMDUxMDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/1005109?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pums974",
"html_url": "https://github.com/pums974",
"followers_url": "https://api.github.com/users/pums974/followers",
"following_url": "https://api.github.com/users/pums974/following{/other_user}",
"gists_url": "https://api.github.com/users/pums974/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pums974/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pums974/subscriptions",
"organizations_url": "https://api.github.com/users/pums974/orgs",
"repos_url": "https://api.github.com/users/pums974/repos",
"events_url": "https://api.github.com/users/pums974/events{/privacy}",
"received_events_url": "https://api.github.com/users/pums974/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-29T14:14:34
| 2024-01-31T07:59:12
| 2024-01-31T07:59:11
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
While the model (codellama:7b) answered (badly) my prompt, and the transcript shows the entirety of it (see below), the interface shows almost no output.
This might be caused by a code block in a language not supported by the markdown interpreter?

# Transcript
This is a session transcript from [Continue](https://continue.dev) on 1/29/2024, 3:06:56 PM.
## User
```truc.py (28-35)
def mysterious_function(x):
for i in range(len(x)):
for j in range(len(x) - 1):
if x[j] > x[j + 1]:
x[j], x[j + 1] = x[j + 1], x[j]
return x
```
rewrite this python function in fortran
## Continue
Here is one way to rewrite the Python function `mysterious_function` in Fortran:
```fortran
subroutine mysterious_function(x)
integer :: i, j
real :: x(n)
do i = 1, n - 1
do j = i + 1, n
if (x(j) > x(i)) then
call swap(x(i), x(j))
end if
end do
end do
return
end subroutine
```
Here's a breakdown of the Fortran code:
* The `subroutine` keyword is used to define a new subroutine.
* The `mysterious_function` subroutine takes a single argument `x`, which is an array of real numbers with length `n`.
* The `do` loop iterates over the elements of the `x` array, starting from index 1 and ending at index `n - 1`.
* The inner `do` loop iterates over the elements of the `x` array starting from index `i + 1` and ending at index `n`.
* The `if` statement checks if the element at index `j` is greater than the element at index `i`. If it is, then we swap the two elements using the `swap` function.
* The `return` keyword is used to exit the subroutine and return control to the calling program.
Note that in Fortran, arrays are 1-based, so the first element of an array has index 1, not 0. Also, Fortran uses explicit typing for variables, so we need to specify the type of `x` as a real number array with length `n`.
## User
/share
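If the hypothesis above is right, the renderer is dropping fenced blocks whose language it cannot highlight instead of degrading to plain text. A toy sketch of the safer fallback behaviour (the `render_fence` helper and language set are hypothetical, not Continue's actual code):

```python
# Languages the (hypothetical) syntax highlighter supports.
KNOWN_LANGUAGES = {"python", "javascript", "go"}

def render_fence(lang: str, code: str) -> str:
    """Render a fenced code block; unknown languages degrade to plain <pre>."""
    if lang.lower() in KNOWN_LANGUAGES:
        return f'<pre class="lang-{lang.lower()}">{code}</pre>'
    # Fallback: never drop the block just because the language
    # (e.g. fortran) has no highlighter registered.
    return f"<pre>{code}</pre>"
```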
|
{
"login": "pums974",
"id": 1005109,
"node_id": "MDQ6VXNlcjEwMDUxMDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/1005109?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pums974",
"html_url": "https://github.com/pums974",
"followers_url": "https://api.github.com/users/pums974/followers",
"following_url": "https://api.github.com/users/pums974/following{/other_user}",
"gists_url": "https://api.github.com/users/pums974/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pums974/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pums974/subscriptions",
"organizations_url": "https://api.github.com/users/pums974/orgs",
"repos_url": "https://api.github.com/users/pums974/repos",
"events_url": "https://api.github.com/users/pums974/events{/privacy}",
"received_events_url": "https://api.github.com/users/pums974/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2255/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2255/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3090
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3090/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3090/comments
|
https://api.github.com/repos/ollama/ollama/issues/3090/events
|
https://github.com/ollama/ollama/issues/3090
| 2,182,976,943
|
I_kwDOJ0Z1Ps6CHZWv
| 3,090
|
How can I modify the model's existence duration on the GPU?
|
{
"login": "papandadj",
"id": 25424898,
"node_id": "MDQ6VXNlcjI1NDI0ODk4",
"avatar_url": "https://avatars.githubusercontent.com/u/25424898?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/papandadj",
"html_url": "https://github.com/papandadj",
"followers_url": "https://api.github.com/users/papandadj/followers",
"following_url": "https://api.github.com/users/papandadj/following{/other_user}",
"gists_url": "https://api.github.com/users/papandadj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/papandadj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/papandadj/subscriptions",
"organizations_url": "https://api.github.com/users/papandadj/orgs",
"repos_url": "https://api.github.com/users/papandadj/repos",
"events_url": "https://api.github.com/users/papandadj/events{/privacy}",
"received_events_url": "https://api.github.com/users/papandadj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-13T02:07:15
| 2024-03-13T03:30:46
| 2024-03-13T03:30:46
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Recently, I used Ollama to build my application. When I run a model, it automatically loads onto my GPU. However, after a few minutes, the model seems to be unloaded. How can I force the model to always remain loaded on the GPU?
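For reference, recent Ollama versions expose a `keep_alive` field on generate requests (and an `OLLAMA_KEEP_ALIVE` environment variable) controlling how long a model stays loaded after a request; a negative value keeps it loaded indefinitely. A sketch of the request payload, assuming the `/api/generate` endpoint on a local server (the helper functions are my own):

```python
import json
import urllib.request

def build_payload(prompt: str, model: str = "llama2") -> dict:
    """Request body for /api/generate; keep_alive=-1 keeps the model loaded."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "keep_alive": -1,  # negative = never unload; e.g. "10m" for ten minutes
    }

def generate(prompt: str, host: str = "http://localhost:11434") -> dict:
    """POST the payload to the Ollama server and return the parsed response."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```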
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3090/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3090/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1845
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1845/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1845/comments
|
https://api.github.com/repos/ollama/ollama/issues/1845/events
|
https://github.com/ollama/ollama/issues/1845
| 2,069,237,273
|
I_kwDOJ0Z1Ps57Vg4Z
| 1,845
|
Ollama from remote
|
{
"login": "HAL9KKK",
"id": 63504776,
"node_id": "MDQ6VXNlcjYzNTA0Nzc2",
"avatar_url": "https://avatars.githubusercontent.com/u/63504776?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HAL9KKK",
"html_url": "https://github.com/HAL9KKK",
"followers_url": "https://api.github.com/users/HAL9KKK/followers",
"following_url": "https://api.github.com/users/HAL9KKK/following{/other_user}",
"gists_url": "https://api.github.com/users/HAL9KKK/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HAL9KKK/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HAL9KKK/subscriptions",
"organizations_url": "https://api.github.com/users/HAL9KKK/orgs",
"repos_url": "https://api.github.com/users/HAL9KKK/repos",
"events_url": "https://api.github.com/users/HAL9KKK/events{/privacy}",
"received_events_url": "https://api.github.com/users/HAL9KKK/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2024-01-07T18:22:32
| 2024-05-16T13:17:20
| 2024-01-08T19:14:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ollama always uses localhost.
I have 2 Colab instances:
**Colab1 (server)**
```
# Set LD_LIBRARY_PATH so the system NVIDIA library is found
import os
import asyncio
os.environ.update({'LD_LIBRARY_PATH': '/usr/lib64-nvidia'})
async def run_process(cmd):
print('>>> starting', *cmd)
p = await asyncio.subprocess.create_subprocess_exec(
*cmd,
stdout=asyncio.subprocess.PIPE,
stderr=asyncio.subprocess.PIPE,
)
async def pipe(lines):
async for line in lines:
print(line.strip().decode('utf-8'))
await asyncio.gather(
pipe(p.stdout),
pipe(p.stderr),
)
await asyncio.gather(
run_process(['ollama', 'serve']),
run_process(['ngrok', 'http', '--log', 'stderr', '11434']),
)
```
```
>>> starting ollama serve
>>> starting ngrok http --log stderr 11434
2024/01/07 18:10:03 routes.go:929: Listening on 127.0.0.1:11434 (version 0.1.18)
t=2024-01-07T18:10:03+0000 lvl=info msg="started tunnel" obj=tunnels name=command_line addr=http://localhost:11434/ url=https://7b8c-34-83-27-150.ngrok-free.app/
```
**Colab2 (client)**
```
import os
os.environ["OLLAMA_HOST"]="https://7b8c-34-83-27-150.ngrok-free.app"
import subprocess
pr= subprocess.Popen(['ollama', 'run', 'openhermes'], stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
```
After running the subprocess `ollama run openhermes`, the server starts running the model, so the client-server connection is working thanks to the OLLAMA_HOST variable.
The problem is when I run Ollama from langchain:
```
from langchain.llms import Ollama
ollama_llm = Ollama(model="openhermes")
ollama_llm.generate(["hello"])
```
ConnectionError: HTTPConnectionPool(host='localhost', port=11434)
Why is OLLAMA_HOST not working with langchain?
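`OLLAMA_HOST` is read by the `ollama` CLI, but langchain's `Ollama` wrapper builds its own HTTP client and appears to default to `http://localhost:11434` without consulting the environment variable. A sketch of resolving the host yourself and passing it in explicitly (the `base_url` keyword is langchain's parameter; the `resolve_ollama_host` helper is my own):

```python
import os

def resolve_ollama_host(default: str = "http://localhost:11434") -> str:
    """Prefer the OLLAMA_HOST environment variable, mirroring the CLI."""
    return os.environ.get("OLLAMA_HOST", default).rstrip("/")

# Usage (assumes langchain's Ollama accepts base_url):
#   from langchain.llms import Ollama
#   ollama_llm = Ollama(model="openhermes", base_url=resolve_ollama_host())
```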
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1845/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1845/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6788
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6788/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6788/comments
|
https://api.github.com/repos/ollama/ollama/issues/6788/events
|
https://github.com/ollama/ollama/pull/6788
| 2,523,991,633
|
PR_kwDOJ0Z1Ps57Y8ue
| 6,788
|
add Agents-Flex Libraries in README.md
|
{
"login": "yangfuhai",
"id": 1539806,
"node_id": "MDQ6VXNlcjE1Mzk4MDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/1539806?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yangfuhai",
"html_url": "https://github.com/yangfuhai",
"followers_url": "https://api.github.com/users/yangfuhai/followers",
"following_url": "https://api.github.com/users/yangfuhai/following{/other_user}",
"gists_url": "https://api.github.com/users/yangfuhai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yangfuhai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yangfuhai/subscriptions",
"organizations_url": "https://api.github.com/users/yangfuhai/orgs",
"repos_url": "https://api.github.com/users/yangfuhai/repos",
"events_url": "https://api.github.com/users/yangfuhai/events{/privacy}",
"received_events_url": "https://api.github.com/users/yangfuhai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-13T06:37:08
| 2024-09-16T20:42:53
| 2024-09-16T20:42:52
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6788",
"html_url": "https://github.com/ollama/ollama/pull/6788",
"diff_url": "https://github.com/ollama/ollama/pull/6788.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6788.patch",
"merged_at": "2024-09-16T20:42:52"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6788/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6788/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5456
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5456/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5456/comments
|
https://api.github.com/repos/ollama/ollama/issues/5456/events
|
https://github.com/ollama/ollama/issues/5456
| 2,388,213,427
|
I_kwDOJ0Z1Ps6OWT6z
| 5,456
|
ollama push suddenly not working / giving not authorized error
|
{
"login": "ashokgit",
"id": 3615537,
"node_id": "MDQ6VXNlcjM2MTU1Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3615537?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ashokgit",
"html_url": "https://github.com/ashokgit",
"followers_url": "https://api.github.com/users/ashokgit/followers",
"following_url": "https://api.github.com/users/ashokgit/following{/other_user}",
"gists_url": "https://api.github.com/users/ashokgit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ashokgit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ashokgit/subscriptions",
"organizations_url": "https://api.github.com/users/ashokgit/orgs",
"repos_url": "https://api.github.com/users/ashokgit/repos",
"events_url": "https://api.github.com/users/ashokgit/events{/privacy}",
"received_events_url": "https://api.github.com/users/ashokgit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-03T10:02:07
| 2024-07-03T13:49:23
| 2024-07-03T13:49:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi, I get the following error on push:
ollama push myuser/model-name
retrieving manifest
pushing 663944096011... 100% ▕█████████████████████████████▏ 667 MB
pushing c3be5dc5651b... 100% ▕█████████████████████████████▏ 54 B
pushing 1cb5df2a171b... 100% ▕█████████████████████████████▏ 32 B
pushing 395c6bda52c3... 100% ▕█████████████████████████████▏ 413 B
pushing manifest
Error: you are not authorized to push to this namespace, create the model under a namespace you own
I've followed the push model guidelines.
I also copied ~/.ollama/id_ed25519.pub
and /usr/share/ollama/.ollama/id_ed25519.pub over to my account settings.
It was working just fine 2 days ago.
I believe I only restarted Ollama; should that cause this, or am I missing something?
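To double-check, these are the two keys I compared against my account settings (the same paths as above; the second path applies to the Linux systemd install):

```shell
# Key used when running `ollama push` as my own user
cat ~/.ollama/id_ed25519.pub
# Key used by the systemd service
cat /usr/share/ollama/.ollama/id_ed25519.pub
```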
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.40
|
{
"login": "ashokgit",
"id": 3615537,
"node_id": "MDQ6VXNlcjM2MTU1Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3615537?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ashokgit",
"html_url": "https://github.com/ashokgit",
"followers_url": "https://api.github.com/users/ashokgit/followers",
"following_url": "https://api.github.com/users/ashokgit/following{/other_user}",
"gists_url": "https://api.github.com/users/ashokgit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ashokgit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ashokgit/subscriptions",
"organizations_url": "https://api.github.com/users/ashokgit/orgs",
"repos_url": "https://api.github.com/users/ashokgit/repos",
"events_url": "https://api.github.com/users/ashokgit/events{/privacy}",
"received_events_url": "https://api.github.com/users/ashokgit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5456/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5456/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1956
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1956/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1956/comments
|
https://api.github.com/repos/ollama/ollama/issues/1956/events
|
https://github.com/ollama/ollama/issues/1956
| 2,079,269,605
|
I_kwDOJ0Z1Ps577yLl
| 1,956
|
Handle multiple parallel requests
|
{
"login": "lauvindra",
"id": 82690315,
"node_id": "MDQ6VXNlcjgyNjkwMzE1",
"avatar_url": "https://avatars.githubusercontent.com/u/82690315?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lauvindra",
"html_url": "https://github.com/lauvindra",
"followers_url": "https://api.github.com/users/lauvindra/followers",
"following_url": "https://api.github.com/users/lauvindra/following{/other_user}",
"gists_url": "https://api.github.com/users/lauvindra/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lauvindra/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lauvindra/subscriptions",
"organizations_url": "https://api.github.com/users/lauvindra/orgs",
"repos_url": "https://api.github.com/users/lauvindra/repos",
"events_url": "https://api.github.com/users/lauvindra/events{/privacy}",
"received_events_url": "https://api.github.com/users/lauvindra/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-01-12T16:48:13
| 2024-01-26T23:51:33
| 2024-01-26T23:51:33
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Does Ollama use some kind of scheduling algorithm to manage highly concurrent requests? Can you explain this?
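For context on what "scheduling" means here: at the time, a single loaded model served requests one at a time, with additional requests queued. Conceptually that behaves like a semaphore with one slot. A minimal self-contained sketch of the idea (this is an illustration, not Ollama's actual code; `serve` and `handle` are made-up names):

```python
import asyncio

async def handle(sem: asyncio.Semaphore, i: int, order: list) -> None:
    # Each "request" waits for a free slot, does some work, then records its turn.
    async with sem:
        await asyncio.sleep(0.01)
        order.append(i)

async def serve(num_parallel: int, num_requests: int) -> list:
    sem = asyncio.Semaphore(num_parallel)  # num_parallel concurrent slots
    order: list = []
    await asyncio.gather(*(handle(sem, i, order) for i in range(num_requests)))
    return order

# With a single slot, concurrent requests are served strictly in arrival order.
print(asyncio.run(serve(1, 4)))  # → [0, 1, 2, 3]
```

Raising the slot count lets requests overlap while the queue still bounds concurrency.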
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1956/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1956/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1495
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1495/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1495/comments
|
https://api.github.com/repos/ollama/ollama/issues/1495/events
|
https://github.com/ollama/ollama/issues/1495
| 2,038,914,078
|
I_kwDOJ0Z1Ps55h1we
| 1,495
|
ollama on Proxmox??
|
{
"login": "Paulie420",
"id": 59846077,
"node_id": "MDQ6VXNlcjU5ODQ2MDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/59846077?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Paulie420",
"html_url": "https://github.com/Paulie420",
"followers_url": "https://api.github.com/users/Paulie420/followers",
"following_url": "https://api.github.com/users/Paulie420/following{/other_user}",
"gists_url": "https://api.github.com/users/Paulie420/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Paulie420/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Paulie420/subscriptions",
"organizations_url": "https://api.github.com/users/Paulie420/orgs",
"repos_url": "https://api.github.com/users/Paulie420/repos",
"events_url": "https://api.github.com/users/Paulie420/events{/privacy}",
"received_events_url": "https://api.github.com/users/Paulie420/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 18
| 2023-12-13T04:08:53
| 2024-10-29T22:46:07
| 2024-01-27T01:39:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
So I know this is user error, but... I can install and use ollama on my Framework laptop (without GPU) easily: install w/ the curl command and get going right away. But on a Proxmox VM w/ MORE RAM than my Framework, I get an error and ollama fails at the run command.
Am I missing something simple that I can 'fix'? I feel like my server has more CPU than my laptop, and I'm wondering if others are running ollama on Proxmox w/o GPU?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1495/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1495/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6792
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6792/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6792/comments
|
https://api.github.com/repos/ollama/ollama/issues/6792/events
|
https://github.com/ollama/ollama/issues/6792
| 2,524,410,670
|
I_kwDOJ0Z1Ps6Wd3Mu
| 6,792
|
The system parameter OLLAMA_NUM_PARALLEL has no effect for the embedding model
|
{
"login": "black-fox-user",
"id": 181464167,
"node_id": "U_kgDOCtDsZw",
"avatar_url": "https://avatars.githubusercontent.com/u/181464167?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/black-fox-user",
"html_url": "https://github.com/black-fox-user",
"followers_url": "https://api.github.com/users/black-fox-user/followers",
"following_url": "https://api.github.com/users/black-fox-user/following{/other_user}",
"gists_url": "https://api.github.com/users/black-fox-user/gists{/gist_id}",
"starred_url": "https://api.github.com/users/black-fox-user/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/black-fox-user/subscriptions",
"organizations_url": "https://api.github.com/users/black-fox-user/orgs",
"repos_url": "https://api.github.com/users/black-fox-user/repos",
"events_url": "https://api.github.com/users/black-fox-user/events{/privacy}",
"received_events_url": "https://api.github.com/users/black-fox-user/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-09-13T09:50:22
| 2024-09-18T01:30:43
| 2024-09-18T01:30:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have set the system parameter, but when loading the embedding model, only one instance is still in effect. I copied this model, and surprisingly, their model IDs are the same. After importing the model, the model ID changed, but the same model was still used in the end.

### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.10
|
{
"login": "black-fox-user",
"id": 181464167,
"node_id": "U_kgDOCtDsZw",
"avatar_url": "https://avatars.githubusercontent.com/u/181464167?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/black-fox-user",
"html_url": "https://github.com/black-fox-user",
"followers_url": "https://api.github.com/users/black-fox-user/followers",
"following_url": "https://api.github.com/users/black-fox-user/following{/other_user}",
"gists_url": "https://api.github.com/users/black-fox-user/gists{/gist_id}",
"starred_url": "https://api.github.com/users/black-fox-user/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/black-fox-user/subscriptions",
"organizations_url": "https://api.github.com/users/black-fox-user/orgs",
"repos_url": "https://api.github.com/users/black-fox-user/repos",
"events_url": "https://api.github.com/users/black-fox-user/events{/privacy}",
"received_events_url": "https://api.github.com/users/black-fox-user/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6792/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6792/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4976
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4976/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4976/comments
|
https://api.github.com/repos/ollama/ollama/issues/4976/events
|
https://github.com/ollama/ollama/issues/4976
| 2,346,104,632
|
I_kwDOJ0Z1Ps6L1rc4
| 4,976
|
Error: pull model manifest: Get
|
{
"login": "funnyPhani",
"id": 58216617,
"node_id": "MDQ6VXNlcjU4MjE2NjE3",
"avatar_url": "https://avatars.githubusercontent.com/u/58216617?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/funnyPhani",
"html_url": "https://github.com/funnyPhani",
"followers_url": "https://api.github.com/users/funnyPhani/followers",
"following_url": "https://api.github.com/users/funnyPhani/following{/other_user}",
"gists_url": "https://api.github.com/users/funnyPhani/gists{/gist_id}",
"starred_url": "https://api.github.com/users/funnyPhani/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/funnyPhani/subscriptions",
"organizations_url": "https://api.github.com/users/funnyPhani/orgs",
"repos_url": "https://api.github.com/users/funnyPhani/repos",
"events_url": "https://api.github.com/users/funnyPhani/events{/privacy}",
"received_events_url": "https://api.github.com/users/funnyPhani/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 2
| 2024-06-11T10:53:33
| 2024-11-21T22:50:02
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Ollama is not able to pull the models.
ollama run moondream
pulling manifest
Error: pull model manifest: Get "https://ollama.com/token?nonce=F_Rh4t6Jrv-EM0eRltrU-Q&scope=repository%!A(MISSING)library%!F(MISSING)moondream%!A(MISSING)pull&service=ollama.com&ts=1718102949": read tcp 10.1.38.47:51154->34.120.132.20:443: read: connection reset by peer
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4976/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4976/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2626
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2626/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2626/comments
|
https://api.github.com/repos/ollama/ollama/issues/2626/events
|
https://github.com/ollama/ollama/pull/2626
| 2,145,926,276
|
PR_kwDOJ0Z1Ps5nen5g
| 2,626
|
Update big-AGI config file link
|
{
"login": "mogudian",
"id": 122781024,
"node_id": "U_kgDOB1F9YA",
"avatar_url": "https://avatars.githubusercontent.com/u/122781024?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mogudian",
"html_url": "https://github.com/mogudian",
"followers_url": "https://api.github.com/users/mogudian/followers",
"following_url": "https://api.github.com/users/mogudian/following{/other_user}",
"gists_url": "https://api.github.com/users/mogudian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mogudian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mogudian/subscriptions",
"organizations_url": "https://api.github.com/users/mogudian/orgs",
"repos_url": "https://api.github.com/users/mogudian/repos",
"events_url": "https://api.github.com/users/mogudian/events{/privacy}",
"received_events_url": "https://api.github.com/users/mogudian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-02-21T06:21:02
| 2024-02-21T06:24:49
| 2024-02-21T06:24:49
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2626",
"html_url": "https://github.com/ollama/ollama/pull/2626",
"diff_url": "https://github.com/ollama/ollama/pull/2626.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2626.patch",
"merged_at": "2024-02-21T06:24:49"
}
|
The old URL of the big-AGI config file is no longer available; replace it with the latest one.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2626/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2626/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4440
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4440/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4440/comments
|
https://api.github.com/repos/ollama/ollama/issues/4440/events
|
https://github.com/ollama/ollama/issues/4440
| 2,296,577,723
|
I_kwDOJ0Z1Ps6I4v67
| 4,440
|
Add support for third-party hosted APIs
|
{
"login": "19h",
"id": 280212,
"node_id": "MDQ6VXNlcjI4MDIxMg==",
"avatar_url": "https://avatars.githubusercontent.com/u/280212?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/19h",
"html_url": "https://github.com/19h",
"followers_url": "https://api.github.com/users/19h/followers",
"following_url": "https://api.github.com/users/19h/following{/other_user}",
"gists_url": "https://api.github.com/users/19h/gists{/gist_id}",
"starred_url": "https://api.github.com/users/19h/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/19h/subscriptions",
"organizations_url": "https://api.github.com/users/19h/orgs",
"repos_url": "https://api.github.com/users/19h/repos",
"events_url": "https://api.github.com/users/19h/events{/privacy}",
"received_events_url": "https://api.github.com/users/19h/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 9
| 2024-05-14T23:27:37
| 2024-11-06T17:36:25
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
We've been coding against the Ollama API internally, and eventually it hit me: Ollama should be able to support third-party API providers, making it a de facto gateway to LLMs.
For example, it would easily blur the lines between an OpenAI assistant/user conversation and a Gemini model/user conversation; it could transparently speak Cohere Command R+ completion-like while eloquently talking to Claude, too.
Might sound utterly off-topic, but think about it.
I implemented a hard-coded model in Ollama for local use so I can use unsupported, hosted LLMs in Cody for coding. I feel like this could very well be a Modelfile-level thing, with providers happily providing integrations, putting even more spotlight on Ollama while forcing LLM providers to be less fuzzy about their API integrations, given that the Modelfile spec is rigid enough.
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4440/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4440/timeline
| null |
reopened
| false
|
https://api.github.com/repos/ollama/ollama/issues/2790
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2790/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2790/comments
|
https://api.github.com/repos/ollama/ollama/issues/2790/events
|
https://github.com/ollama/ollama/issues/2790
| 2,157,718,012
|
I_kwDOJ0Z1Ps6AnCn8
| 2,790
|
Function calling with OpenAI API
|
{
"login": "codearranger",
"id": 80373433,
"node_id": "MDQ6VXNlcjgwMzczNDMz",
"avatar_url": "https://avatars.githubusercontent.com/u/80373433?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codearranger",
"html_url": "https://github.com/codearranger",
"followers_url": "https://api.github.com/users/codearranger/followers",
"following_url": "https://api.github.com/users/codearranger/following{/other_user}",
"gists_url": "https://api.github.com/users/codearranger/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codearranger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codearranger/subscriptions",
"organizations_url": "https://api.github.com/users/codearranger/orgs",
"repos_url": "https://api.github.com/users/codearranger/repos",
"events_url": "https://api.github.com/users/codearranger/events{/privacy}",
"received_events_url": "https://api.github.com/users/codearranger/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-02-27T22:26:04
| 2024-07-26T00:52:48
| 2024-07-26T00:52:48
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://ollama.com/joefamous/firefunction-v1
https://platform.openai.com/docs/guides/function-calling
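For readers landing here, this is roughly the request shape the linked OpenAI guide describes (field names follow OpenAI's function-calling schema; `get_weather` is a made-up example, and this is a sketch of the payload, not Ollama's implementation at the time):

```python
# An OpenAI-style tool definition, as sent in a chat completion request.
tool = {
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical function name
        "description": "Get the weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# The request body pairs the conversation with the available tools.
request = {
    "model": "firefunction-v1",
    "messages": [{"role": "user", "content": "Weather in Paris?"}],
    "tools": [tool],
}
print(request["tools"][0]["function"]["name"])  # → get_weather
```

A model supporting this would reply with a `tool_calls` entry naming the function and its JSON arguments instead of plain text.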
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2790/reactions",
"total_count": 16,
"+1": 12,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 4,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2790/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5317
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5317/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5317/comments
|
https://api.github.com/repos/ollama/ollama/issues/5317/events
|
https://github.com/ollama/ollama/issues/5317
| 2,377,175,757
|
I_kwDOJ0Z1Ps6NsNLN
| 5,317
|
Please add Florence-2
|
{
"login": "enryteam",
"id": 20081090,
"node_id": "MDQ6VXNlcjIwMDgxMDkw",
"avatar_url": "https://avatars.githubusercontent.com/u/20081090?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/enryteam",
"html_url": "https://github.com/enryteam",
"followers_url": "https://api.github.com/users/enryteam/followers",
"following_url": "https://api.github.com/users/enryteam/following{/other_user}",
"gists_url": "https://api.github.com/users/enryteam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/enryteam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/enryteam/subscriptions",
"organizations_url": "https://api.github.com/users/enryteam/orgs",
"repos_url": "https://api.github.com/users/enryteam/repos",
"events_url": "https://api.github.com/users/enryteam/events{/privacy}",
"received_events_url": "https://api.github.com/users/enryteam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 1
| 2024-06-27T05:31:19
| 2024-06-27T19:05:47
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/microsoft/Florence-2-large/tree/main (uses PyTorch)
https://huggingface.co/spaces/SixOpen/Florence-2-large-ft
thanks
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5317/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5317/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2262
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2262/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2262/comments
|
https://api.github.com/repos/ollama/ollama/issues/2262/events
|
https://github.com/ollama/ollama/issues/2262
| 2,106,694,887
|
I_kwDOJ0Z1Ps59kZzn
| 2,262
|
the tags page is confusing
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6573197867,
"node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw",
"url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com",
"name": "ollama.com",
"color": "ffffff",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-01-30T00:10:48
| 2024-03-11T17:42:03
| 2024-03-11T17:42:03
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |

And why do I need to know that a layer is 55 bytes?
|
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2262/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2262/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4455
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4455/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4455/comments
|
https://api.github.com/repos/ollama/ollama/issues/4455/events
|
https://github.com/ollama/ollama/issues/4455
| 2,298,300,142
|
I_kwDOJ0Z1Ps6I_Ubu
| 4,455
|
[REPORTING] For arch or arch based linux users storage path for models is /var/lib/ollama/.ollama/models/blobs
|
{
"login": "Greatz08",
"id": 55040435,
"node_id": "MDQ6VXNlcjU1MDQwNDM1",
"avatar_url": "https://avatars.githubusercontent.com/u/55040435?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Greatz08",
"html_url": "https://github.com/Greatz08",
"followers_url": "https://api.github.com/users/Greatz08/followers",
"following_url": "https://api.github.com/users/Greatz08/following{/other_user}",
"gists_url": "https://api.github.com/users/Greatz08/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Greatz08/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Greatz08/subscriptions",
"organizations_url": "https://api.github.com/users/Greatz08/orgs",
"repos_url": "https://api.github.com/users/Greatz08/repos",
"events_url": "https://api.github.com/users/Greatz08/events{/privacy}",
"received_events_url": "https://api.github.com/users/Greatz08/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-05-15T15:58:36
| 2024-10-23T20:54:29
| 2024-10-23T20:54:28
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Yesterday I installed Ollama on Arch Linux from the AUR and pulled two models (phi3 and llama3), but I couldn't find where they are actually stored. The paths mentioned on Reddit and in the FAQ seem to apply only to Ubuntu, so I had to use fzf and the systemd service file to locate the exact path where the models are stored. I'm sharing it here to help other Arch-based users. Please add this path to the documentation so more people can find it.
Thank you :-)

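For reference, here is a minimal sketch (a hypothetical helper, GNU grep assumed; paths may differ per distro) of locating the model directory from the systemd unit, falling back to the AUR package's default path:

```shell
# Locate the Ollama model directory on a systemd-based distro by reading
# the unit file. The AUR package runs ollama as its own user, so models
# land under /var/lib/ollama/.ollama/models unless OLLAMA_MODELS is set
# explicitly via Environment= in the unit.
find_ollama_models() {
  local unit="${1:-/etc/systemd/system/ollama.service}"
  # Extract the value of OLLAMA_MODELS= if present (strip surrounding quotes),
  # otherwise fall back to the AUR package default.
  grep -oP 'OLLAMA_MODELS=\K[^"\s]+' "$unit" 2>/dev/null \
    || echo "/var/lib/ollama/.ollama/models"
}
```

Running `find_ollama_models` with no argument checks the standard unit path; pass an explicit unit file path to inspect a different installation.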
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4455/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4455/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2500
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2500/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2500/comments
|
https://api.github.com/repos/ollama/ollama/issues/2500/events
|
https://github.com/ollama/ollama/issues/2500
| 2,135,100,345
|
I_kwDOJ0Z1Ps5_Qwu5
| 2,500
|
Auto Tagging Documents in Ollama
|
{
"login": "asanchez-appliedres",
"id": 160036440,
"node_id": "U_kgDOCYn2WA",
"avatar_url": "https://avatars.githubusercontent.com/u/160036440?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/asanchez-appliedres",
"html_url": "https://github.com/asanchez-appliedres",
"followers_url": "https://api.github.com/users/asanchez-appliedres/followers",
"following_url": "https://api.github.com/users/asanchez-appliedres/following{/other_user}",
"gists_url": "https://api.github.com/users/asanchez-appliedres/gists{/gist_id}",
"starred_url": "https://api.github.com/users/asanchez-appliedres/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/asanchez-appliedres/subscriptions",
"organizations_url": "https://api.github.com/users/asanchez-appliedres/orgs",
"repos_url": "https://api.github.com/users/asanchez-appliedres/repos",
"events_url": "https://api.github.com/users/asanchez-appliedres/events{/privacy}",
"received_events_url": "https://api.github.com/users/asanchez-appliedres/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-02-14T19:58:14
| 2024-02-14T21:54:37
| 2024-02-14T21:54:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
When uploading documents to Ollama, users are currently required to manually tag documents. I would like to request a feature that allows for automatic document tagging based on the contents of the document.
|
{
"login": "asanchez-appliedres",
"id": 160036440,
"node_id": "U_kgDOCYn2WA",
"avatar_url": "https://avatars.githubusercontent.com/u/160036440?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/asanchez-appliedres",
"html_url": "https://github.com/asanchez-appliedres",
"followers_url": "https://api.github.com/users/asanchez-appliedres/followers",
"following_url": "https://api.github.com/users/asanchez-appliedres/following{/other_user}",
"gists_url": "https://api.github.com/users/asanchez-appliedres/gists{/gist_id}",
"starred_url": "https://api.github.com/users/asanchez-appliedres/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/asanchez-appliedres/subscriptions",
"organizations_url": "https://api.github.com/users/asanchez-appliedres/orgs",
"repos_url": "https://api.github.com/users/asanchez-appliedres/repos",
"events_url": "https://api.github.com/users/asanchez-appliedres/events{/privacy}",
"received_events_url": "https://api.github.com/users/asanchez-appliedres/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2500/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2500/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4850
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4850/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4850/comments
|
https://api.github.com/repos/ollama/ollama/issues/4850/events
|
https://github.com/ollama/ollama/issues/4850
| 2,337,658,770
|
I_kwDOJ0Z1Ps6LVdeS
| 4,850
|
ollama built with docker - docker run ollama How do I set the --n-gpu-layers parameter because this results in an error that prevents running the model
|
{
"login": "mingLvft",
"id": 50644675,
"node_id": "MDQ6VXNlcjUwNjQ0Njc1",
"avatar_url": "https://avatars.githubusercontent.com/u/50644675?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mingLvft",
"html_url": "https://github.com/mingLvft",
"followers_url": "https://api.github.com/users/mingLvft/followers",
"following_url": "https://api.github.com/users/mingLvft/following{/other_user}",
"gists_url": "https://api.github.com/users/mingLvft/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mingLvft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mingLvft/subscriptions",
"organizations_url": "https://api.github.com/users/mingLvft/orgs",
"repos_url": "https://api.github.com/users/mingLvft/repos",
"events_url": "https://api.github.com/users/mingLvft/events{/privacy}",
"received_events_url": "https://api.github.com/users/mingLvft/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6849881759,
"node_id": "LA_kwDOJ0Z1Ps8AAAABmEjmnw",
"url": "https://api.github.com/repos/ollama/ollama/labels/memory",
"name": "memory",
"color": "5017EA",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-06-06T08:31:18
| 2024-07-03T23:20:20
| 2024-07-03T23:20:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = Llama-3-8B-Instruct-Gradient-1048k
llama_model_loader: - kv 2: llama.block_count u32 = 32
llama_model_loader: - kv 3: llama.context_length u32 = 1048576
llama_model_loader: - kv 4: llama.embedding_length u32 = 4096
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 6: llama.attention.head_count u32 = 32
llama_model_loader: - kv 7: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 8: llama.rope.freq_base f32 = 2804339712.000000
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 10: general.file_type u32 = 1
llama_model_loader: - kv 11: llama.vocab_size u32 = 128256
llama_model_loader: - kv 12: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 13: tokenizer.ggml.model str = gpt2
llama_model_loader: - kv 14: tokenizer.ggml.tokens arr[str,128256] = ["!", "\"", "#", "$", "%", "&", "'", ...
llama_model_loader: - kv 15: tokenizer.ggml.token_type arr[i32,128256] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
llama_model_loader: - kv 16: tokenizer.ggml.merges arr[str,280147] = ["Ġ Ġ", "Ġ ĠĠĠ", "ĠĠ ĠĠ", "...
llama_model_loader: - kv 17: tokenizer.ggml.bos_token_id u32 = 128000
llama_model_loader: - kv 18: tokenizer.ggml.eos_token_id u32 = 128001
llama_model_loader: - kv 19: tokenizer.chat_template str = {% set loop_messages = messages %}{% ...
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type f16: 226 tensors
llm_load_vocab: special tokens definition check successful ( 256/128256 ).
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = BPE
llm_load_print_meta: n_vocab = 128256
llm_load_print_meta: n_merges = 280147
llm_load_print_meta: n_ctx_train = 1048576
llm_load_print_meta: n_embd = 4096
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 8
llm_load_print_meta: n_layer = 32
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_embd_head_k = 128
llm_load_print_meta: n_embd_head_v = 128
llm_load_print_meta: n_gqa = 4
llm_load_print_meta: n_embd_k_gqa = 1024
llm_load_print_meta: n_embd_v_gqa = 1024
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: causal attn = 1
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 0
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 2804339712.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_yarn_orig_ctx = 1048576
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: model type = 7B
llm_load_print_meta: model ftype = F16
llm_load_print_meta: model params = 8.03 B
llm_load_print_meta: model size = 14.96 GiB (16.00 BPW)
llm_load_print_meta: general.name = Llama-3-8B-Instruct-Gradient-1048k
llm_load_print_meta: BOS token = 128000 '<|begin_of_text|>'
llm_load_print_meta: EOS token = 128001 '<|end_of_text|>'
llm_load_print_meta: LF token = 128 'Ä'
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: yes
ggml_cuda_init: CUDA_USE_TENSOR_CORES: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce GTX 1080 Ti, compute capability 6.1, VMM: yes
llm_load_tensors: ggml ctx size = 0.22 MiB
llm_load_tensors: offloading 23 repeating layers to GPU
llm_load_tensors: offloaded 23/33 layers to GPU
llm_load_tensors: CPU buffer size = 15317.02 MiB
llm_load_tensors: CUDA0 buffer size = 9568.72 MiB
.........................................................................................
llama_new_context_with_model: n_ctx = 2048
llama_new_context_with_model: n_batch = 512
llama_new_context_with_model: n_ubatch = 512
llama_new_context_with_model: freq_base = 2804339712.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CUDA_Host KV buffer size = 72.00 MiB
llama_kv_cache_init: CUDA0 KV buffer size = 184.00 MiB
llama_new_context_with_model: KV self size = 256.00 MiB, K (f16): 128.00 MiB, V (f16): 128.00 MiB
llama_new_context_with_model: CUDA_Host output buffer size = 0.50 MiB
ggml_backend_cuda_buffer_type_alloc_buffer: allocating 1260.50 MiB on device 0: cudaMalloc failed: out of memory
ggml_gallocr_reserve_n: failed to allocate CUDA0 buffer of size 1321730048
llama_new_context_with_model: failed to allocate compute buffers
llama_init_from_gpt_params: error: failed to create context with model '/root/.ollama/models/blobs/sha256-7e4033fc9e578584ab6675c11afbd363056b251b94d86f32ef0be780164a2c97'
{"function":"load_model","level":"ERR","line":410,"model":"/root/.ollama/models/blobs/sha256-7e4033fc9e578584ab6675c11afbd363056b251b94d86f32ef0be780164a2c97","msg":"unable to load model","tid":"129076965371904","timestamp":1717662581}
[GIN] 2024/06/06 - 08:29:41 | 500 | 4.507426839s | 172.18.0.1 | POST "/api/chat"
time=2024-06-06T08:29:41.989Z level=ERROR source=routes.go:120 msg="error loading llama server" error="llama runner process no longer running: 1 error:failed to create context with model '/root/.ollama/models/blobs/sha256-7e4033fc9e578584ab6675c11afbd363056b251b94d86f32ef0be780164a2c97'"
```
### OS
Windows, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.32
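As a workaround sketch (assumes a running Ollama server on the default port; the `num_gpu` value is illustrative, not tuned), the number of offloaded layers can be capped through the API request options so the compute buffer fits in the remaining VRAM:

```shell
# Illustrative payload: num_gpu caps the layers offloaded to the GPU
# (the failing load above offloaded 23/33); lower it until the CUDA
# compute buffer allocation succeeds.
PAYLOAD='{"model": "llama3", "prompt": "hello", "options": {"num_gpu": 16}}'
# Send it to a running server (default port assumed):
# curl http://localhost:11434/api/generate -d "$PAYLOAD"
```

The same parameter can also be baked into a model with `PARAMETER num_gpu 16` in a Modelfile.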
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4850/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4850/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7735
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7735/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7735/comments
|
https://api.github.com/repos/ollama/ollama/issues/7735/events
|
https://github.com/ollama/ollama/issues/7735
| 2,670,944,941
|
I_kwDOJ0Z1Ps6fM2Kt
| 7,735
|
docker build error
|
{
"login": "zimmortal",
"id": 23369761,
"node_id": "MDQ6VXNlcjIzMzY5NzYx",
"avatar_url": "https://avatars.githubusercontent.com/u/23369761?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zimmortal",
"html_url": "https://github.com/zimmortal",
"followers_url": "https://api.github.com/users/zimmortal/followers",
"following_url": "https://api.github.com/users/zimmortal/following{/other_user}",
"gists_url": "https://api.github.com/users/zimmortal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zimmortal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zimmortal/subscriptions",
"organizations_url": "https://api.github.com/users/zimmortal/orgs",
"repos_url": "https://api.github.com/users/zimmortal/repos",
"events_url": "https://api.github.com/users/zimmortal/events{/privacy}",
"received_events_url": "https://api.github.com/users/zimmortal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2024-11-19T06:43:42
| 2024-12-20T09:27:51
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ERROR: failed to solve: process "/bin/sh -c CMAKE_VERSION=${CMAKE_VERSION} GOLANG_VERSION=${GOLANG_VERSION} sh /rh_linux_deps.sh" did not complete successfully: exit code: 2
```
1289.9
1289.9 Complete!
1290.2 + '[' x86_64 = x86_64 ']'
1290.2 + curl -s -L https://github.com/ccache/ccache/releases/download/v4.10.2/ccache-4.10.2-linux-x86_64.tar.xz
1290.2 + tar -Jx -C /tmp --strip-components 1
2477.3 xz: (stdin): Unexpected end of input
2477.3 tar: Unexpected EOF in archive
2477.3 tar: Unexpected EOF in archive
2477.3 tar: Error is not recoverable: exiting now
2477.3 + '[' -n 3.22.1 ']'
2477.3 + tar -zx -C /usr --strip-components 1
2477.3 ++ uname -m
2477.3 + curl -s -L https://github.com/Kitware/CMake/releases/download/v3.22.1/cmake-3.22.1-linux-x86_64.tar.gz
3509.3
3509.3 gzip: stdin: unexpected end of file
3509.3 tar: Unexpected EOF in archive
3509.3 tar: Unexpected EOF in archive
3509.3 tar: Error is not recoverable: exiting now
------
```
Then I tried modifying the code:
```
if [ "${MACHINE}" = "x86_64" ] ; then
- curl -s -L https://github.com/ccache/ccache/releases/download/v4.10.2/ccache-4.10.2-linux-x86_64.tar.xz | tar -Jx -C /tmp --strip-components 1 && \
+ curl -s -L -o /tmp/ccache-4.10.2-linux-x86_64.tar.xz https://github.com/ccache/ccache/releases/download/v4.10.2/ccache-4.10.2-linux-x86_64.tar.xz && \
+ tar -xf /tmp/ccache-4.10.2-linux-x86_64.tar.xz -C /tmp --strip-components=1 && \
mv /tmp/ccache /usr/local/bin/
else
yum -y install epel-release
```
Removing the `-J` parameter makes it work (plain `tar -xf` auto-detects the compression format).
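The same workaround can be wrapped in a small helper (a sketch, not the actual Dockerfile change): download to a temporary file first so a truncated transfer fails loudly instead of being piped into tar mid-stream, and let `tar -xf` auto-detect the compression format:

```shell
# Download an archive to a temp file with retries, then extract it.
# --fail aborts on HTTP errors instead of saving an error page, and a
# short transfer surfaces as a curl failure rather than a tar EOF.
fetch_and_extract() {
  local url="$1" dest="$2"
  local tmp
  tmp=$(mktemp)
  curl --fail --retry 3 -s -L -o "$tmp" "$url" || return 1
  # tar -xf auto-detects gzip/xz, so neither -z nor -J is needed.
  tar -xf "$tmp" -C "$dest" --strip-components=1
  rm -f "$tmp"
}
```

Called as `fetch_and_extract "$URL" /tmp`, this replaces the `curl ... | tar` pipelines in the build script.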
### OS
macOS
### GPU
_No response_
### CPU
Apple
### Ollama version
0.3.13
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7735/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7735/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/964
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/964/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/964/comments
|
https://api.github.com/repos/ollama/ollama/issues/964/events
|
https://github.com/ollama/ollama/issues/964
| 1,972,603,375
|
I_kwDOJ0Z1Ps51k4nv
| 964
|
unbalanced vram usage on 2x3070 GPUs with coodbooga & nexusraven
|
{
"login": "chymian",
"id": 1899961,
"node_id": "MDQ6VXNlcjE4OTk5NjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/1899961?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chymian",
"html_url": "https://github.com/chymian",
"followers_url": "https://api.github.com/users/chymian/followers",
"following_url": "https://api.github.com/users/chymian/following{/other_user}",
"gists_url": "https://api.github.com/users/chymian/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chymian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chymian/subscriptions",
"organizations_url": "https://api.github.com/users/chymian/orgs",
"repos_url": "https://api.github.com/users/chymian/repos",
"events_url": "https://api.github.com/users/chymian/events{/privacy}",
"received_events_url": "https://api.github.com/users/chymian/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2023-11-01T15:31:28
| 2024-05-04T21:52:19
| 2024-05-04T21:52:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Running codebooga & nexusraven segfaults and makes the host unresponsive.
They load without problems and then crash "on the first token".
(zephyr works fine)
I tried this with stock ollama 0.1.7 (Linux install), Docker, and self-compiled ([516](https://github.com/jmorganca/ollama/issues/516)).
- checked the sha256: ok
- running them with `llama-bench` (self compiled), all models pass.
__host:__
ubu: 22.04
4 x 3070 8GB
i5-7400
__log:__
```log
Nov 01 13:15:02 utopia kernel: PREDICT[87637]: segfault at 90 ip 000055baddd22987 sp 00007f3d69ff56f0 error 4 in netdata[55baddb76000+474000]
Nov 01 13:15:02 utopia kernel: Code: 19 66 90 4c 89 f7 e8 b8 36 ea ff 48 83 7c 24 68 00 49 89 c4 0f 84 b9 00 00 00 49 8b 9c 24 a0 00 00 00 48 85 db 74 dc 48 8b 03 <8b> 80 90 00 00 00 a8 08 75 cf 48 8b 03 48 8b b8 98 00 00 00 e8 30
Nov 01 13:15:02 utopia kernel: traps: apport[94210] general protection fault ip:55ed725f58e0 sp:7ffe9cf1c7d0 error:0 in python3.10[55ed72523000+2b1000]
Nov 01 13:15:02 utopia kernel: Process 94210(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia systemd[1]: netdata.service: Main process exited, code=killed, status=11/SEGV
Nov 01 13:15:02 utopia dbus-daemon[781]: double free or corruption (!prev)
Nov 01 13:15:02 utopia kernel: x2gocleansessio[971]: segfault at 10 ip 0000563c3b0e2c8c sp 00007ffd36f14ec0 error 4 in perl[563c3b017000+1a0000]
Nov 01 13:15:02 utopia kernel: Code: 78 60 48 8b 70 48 44 8d 6f 01 44 89 68 60 41 83 fd 01 0f 8f 3e 04 00 00 48 8b 56 08 49 63 c5 48 8b 04 c2 48 89 85 20 01 00 00 <48> 8b 40 10 48 89 45 10 84 c9 74 71 4c 8b 00 48 8b 85 b8 00 00 00
Nov 01 13:15:02 utopia systemd[1]: x2goserver.service: Main process exited, code=dumped, status=11/SEGV
Nov 01 13:15:02 utopia systemd[1]: x2goserver.service: Failed with result 'core-dump'.
Nov 01 13:15:02 utopia systemd[1]: x2goserver.service: Consumed 2min 56.062s CPU time.
Nov 01 13:15:02 utopia kernel: apport[94212]: segfault at 158 ip 0000558fdfd3af94 sp 00007ffeadcaf248 error 4 in python3.10[558fdfc6e000+2b1000]
Nov 01 13:15:02 utopia kernel: Code: 89 e7 48 89 2c 24 e8 8b bd fe ff 48 8b 1c 24 e9 37 fe ff ff 48 01 f6 e9 5b f7 ff ff e9 a1 ec f3 ff 0f 1f 44 00 00 f3 0f 1e fa <48> 8b 87 58 01 00 00 48 85 c0 74 5f 48 8b 50 10 48 85 d2 7e 47 41
Nov 01 13:15:02 utopia kernel: Process 94212(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia kernel: apport[94211]: segfault at 0 ip 0000000000000000 sp 00007ffdf6625168 error 14 in python3.10[558edd361000+6d000]
Nov 01 13:15:02 utopia kernel: Code: Unable to access opcode bytes at RIP 0xffffffffffffffd6.
Nov 01 13:15:02 utopia kernel: Process 94211(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia kernel: systemd[1]: segfault at 0 ip 00007ff94d1b0b5e sp 00007ffdbd7d6b68 error 4 in libc.so.6[7ff94d040000+195000]
Nov 01 13:15:02 utopia kernel: Code: 2e 0f 1f 84 00 00 00 00 00 0f 1f 00 f3 0f 1e fa 89 f8 31 d2 c5 c1 ef ff 09 f0 25 ff 0f 00 00 3d 80 0f 00 00 0f 8f 52 03 00 00 <c5> fe 6f 0f c5 f5 74 06 c5 fd da c1 c5 fd 74 c7 c5 fd d7 c8 85 c9
Nov 01 13:15:02 utopia kernel: gdbus[1975]: segfault at 4 ip 00007f0a8837b127 sp 00007f0a86ad0bc0 error 4 in libglib-2.0.so.0.7200.4[7f0a88323000+8f000]
Nov 01 13:15:02 utopia kernel: Code: 48 0f 42 f0 48 8b 05 80 3e 0c 00 31 d2 4c 8b 57 08 48 f7 f6 ba 04 00 00 00 48 89 d1 48 39 d0 48 0f 43 c8 48 8b 05 a1 3e 0c 00 <42> 8b 04 80 89 ca 85 c0 75 5f 49 39 d2 73 72 48 8b 05 1b 2e 0c 00
Nov 01 13:15:02 utopia kernel: unattended-upgr[889]: segfault at 18 ip 00007fe821e6010a sp 00007ffc229dfc80 error 4 in libglib-2.0.so.0.7200.4[7fe821e08000+8f000]
Nov 01 13:15:02 utopia kernel: Code: c5 48 8d 34 9b 49 89 db 4d 89 c1 49 c1 e5 04 48 c1 e6 04 4c 01 ef 49 c1 e3 04 48 39 c6 48 0f 42 f0 48 8b 05 80 3e 0c 00 31 d2 <4c> 8b 57 08 48 f7 f6 ba 04 00 00 00 48 89 d1 48 39 d0 48 0f 43 c8
Nov 01 13:15:02 utopia rtkit-daemon[1140]: Exiting cleanly.
Nov 01 13:15:02 utopia rtkit-daemon[1140]: Demoting known real-time threads.
Nov 01 13:15:02 utopia rtkit-daemon[1140]: Demoted 0 threads.
Nov 01 13:15:02 utopia rtkit-daemon[1140]: Exiting watchdog thread.
Nov 01 13:15:02 utopia rtkit-daemon[1140]: Exiting canary thread.
Nov 01 13:15:02 utopia avahi-daemon[778]: Disconnected from D-Bus, exiting.
Nov 01 13:15:02 utopia avahi-daemon[778]: Got SIGTERM, quitting.
Nov 01 13:15:02 utopia avahi-daemon[778]: Leaving mDNS multicast group on interface docker0.IPv4 with address 172.17.0.1.
Nov 01 13:15:02 utopia avahi-daemon[778]: Leaving mDNS multicast group on interface zt4mrrjgxa.IPv6 with address fe80::6c84:49ff:fe9f:6f.
Nov 01 13:15:02 utopia avahi-daemon[778]: Leaving mDNS multicast group on interface zt4mrrjgxa.IPv4 with address 10.11.1.17.
Nov 01 13:15:02 utopia avahi-daemon[778]: Leaving mDNS multicast group on interface br0.IPv6 with address 2003:a:271a:300:804d:2fff:fe3d:c3b1.
Nov 01 13:15:02 utopia avahi-daemon[778]: Leaving mDNS multicast group on interface br0.IPv4 with address 192.168.178.17.
Nov 01 13:15:02 utopia avahi-daemon[778]: Leaving mDNS multicast group on interface lo.IPv6 with address ::1.
Nov 01 13:15:02 utopia avahi-daemon[778]: Leaving mDNS multicast group on interface lo.IPv4 with address 127.0.0.1.
Nov 01 13:15:02 utopia ModemManager[835]: <warn> could not acquire the 'org.freedesktop.ModemManager1' service name
Nov 01 13:15:02 utopia ModemManager[835]: <info> ModemManager is shut down
Nov 01 13:15:02 utopia avahi-daemon[778]: avahi-daemon 0.8 exiting.
Nov 01 13:15:02 utopia tracker-miner-fs-3[1381]: OK
Nov 01 13:15:02 utopia tracker-miner-fs-3[1390]: OK
Nov 01 13:15:02 utopia gvfs-mtp-volume-monitor[1630]: **
Nov 01 13:15:02 utopia gvfs-mtp-volume-monitor[1630]: GLib-GObject:ERROR:../../../gobject/gtype.c:2189:type_class_init_Wm: assertion failed: (node->is_classed && node->data && node->data->class.class_size && !node->data->class.class && g_atomic_int_get (&node->data->class.init_state) == UNINITIALIZED)
Nov 01 13:15:02 utopia gvfs-mtp-volume-monitor[1630]: Bail out! GLib-GObject:ERROR:../../../gobject/gtype.c:2189:type_class_init_Wm: assertion failed: (node->is_classed && node->data && node->data->class.class_size && !node->data->class.class && g_atomic_int_get (&node->data->class.init_state) == UNINITIALIZED)
Nov 01 13:15:02 utopia kernel: apport[94218]: segfault at 10 ip 0000561e3ecf2cf3 sp 00007ffe27ddc450 error 4 in python3.10[561e3ebf6000+2b1000]
Nov 01 13:15:02 utopia kernel: Code: 75 5b 48 89 fa 8b 7f 60 48 8b 4a 20 8b 41 28 01 ff 78 4a 48 83 ec 38 48 8b 71 78 4c 8b 15 85 bc 1e 00 c7 44 24 08 ff ff ff ff <48> 8b 4e 10 4c 8d 46 20 48 89 e6 4c 89 14 24 66 49 0f 6e c0 89 44
Nov 01 13:15:02 utopia kernel: Process 94218(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia kernel: apport[94216]: segfault at 30 ip 000055f1eccd8864 sp 00007ffd7a91f610 error 4 in python3.10[55f1ecc1a000+2b1000]
Nov 01 13:15:02 utopia kernel: Code: 00 41 89 94 24 b8 00 00 00 74 08 85 d2 0f 8e 27 03 00 00 48 83 c4 08 5b 5d 41 5c 41 5d c3 0f 1f 84 00 00 00 00 00 48 8b 57 08 <ff> 52 30 48 8b 75 18 48 83 eb 01 0f 83 9b fe ff ff e9 af fe ff ff
Nov 01 13:15:02 utopia kernel: Process 94216(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia systemd[1]: Caught <SEGV>, dumped core as pid 94213.
Nov 01 13:15:02 utopia systemd[1]: Freezing execution.
Nov 01 13:15:02 utopia kernel: Process 94215(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia systemd[912]: gvfs-mtp-volume-monitor.service: Main process exited, code=killed, status=6/ABRT
Nov 01 13:15:02 utopia systemd[912]: gvfs-mtp-volume-monitor.service: Failed with result 'signal'.
Nov 01 13:15:02 utopia kernel: Process 94220(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia kernel: Process 94221(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia systemd[910]: tracker-miner-fs-3.service: Main process exited, code=killed, status=11/SEGV
Nov 01 13:15:02 utopia systemd[910]: tracker-miner-fs-3.service: Failed with result 'signal'.
Nov 01 13:15:02 utopia systemd[910]: gvfs-mtp-volume-monitor.service: Main process exited, code=killed, status=11/SEGV
Nov 01 13:15:02 utopia systemd[910]: gvfs-mtp-volume-monitor.service: Failed with result 'signal'.
Nov 01 13:15:02 utopia kernel: Process 94237(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:02 utopia systemd[910]: tracker-miner-fs-3.service: Scheduled restart job, restart counter is at 1.
Nov 01 13:15:02 utopia systemd[910]: Stopped Tracker file system data miner.
Nov 01 13:15:02 utopia systemd[910]: Starting Tracker file system data miner...
Nov 01 13:15:02 utopia tracker-miner-f[94248]: Corrupt database: sqlite integrity check returned '*** in database main ***
Page 190: btreeInitPage() returns error code 11'
Nov 01 13:15:02 utopia tracker-miner-f[94248]: Could not create store/endpoint: Corrupt db file
Nov 01 13:15:02 utopia systemd[910]: tracker-miner-fs-3.service: Main process exited, code=exited, status=1/FAILURE
Nov 01 13:15:02 utopia systemd[910]: tracker-miner-fs-3.service: Failed with result 'exit-code'.
Nov 01 13:15:02 utopia systemd[910]: Failed to start Tracker file system data miner.
Nov 01 13:15:02 utopia systemd[910]: realloc(): invalid pointer
Nov 01 13:15:02 utopia kernel: Process 94253(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:02 utopia kernel: Aborting core
Nov 01 13:15:03 utopia systemd[920]: pam_unix(systemd-user:session): session closed for user pinokio
Nov 01 13:15:04 utopia kernel: Process 94267(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:04 utopia kernel: Aborting core
Nov 01 13:15:04 utopia kernel: Process 94296(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:04 utopia kernel: Aborting core
Nov 01 13:15:06 utopia kernel: Process 94336(apport) has RLIMIT_CORE set to 1
Nov 01 13:15:06 utopia kernel: Aborting core
Nov 01 13:15:07 utopia kernel: show_signal: 16 callbacks suppressed
Nov 01 13:15:07 utopia kernel: traps: python3[94361] general protection fault ip:53c600 sp:7ffc9b7591e0 error:0 in python3.11[41f000+233000]
Nov 01 13:15:10 utopia kernel: traps: python[94412] general protection fault ip:53529d sp:7ffea2782780 error:0 in python3.11[41f000+233000]
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/964/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/964/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/111
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/111/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/111/comments
|
https://api.github.com/repos/ollama/ollama/issues/111/events
|
https://github.com/ollama/ollama/issues/111
| 1,810,998,500
|
I_kwDOJ0Z1Ps5r8aTk
| 111
|
Error trying to create custom model, fresh install
|
{
"login": "saqbach",
"id": 6180399,
"node_id": "MDQ6VXNlcjYxODAzOTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6180399?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saqbach",
"html_url": "https://github.com/saqbach",
"followers_url": "https://api.github.com/users/saqbach/followers",
"following_url": "https://api.github.com/users/saqbach/following{/other_user}",
"gists_url": "https://api.github.com/users/saqbach/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saqbach/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saqbach/subscriptions",
"organizations_url": "https://api.github.com/users/saqbach/orgs",
"repos_url": "https://api.github.com/users/saqbach/repos",
"events_url": "https://api.github.com/users/saqbach/events{/privacy}",
"received_events_url": "https://api.github.com/users/saqbach/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2023-07-19T02:00:32
| 2023-07-19T03:42:44
| 2023-07-19T03:42:44
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
First off, this is awesome. Thank you for creating this. I'm running into an `Error: 400 Bad Request` when trying to follow the README and create a custom model.
Steps:
1. Download Apple Silicon app from `https://ollama.ai/download` & install to CLI
2. Run `ollama run llama2` successfully
3. Create a `Modelfile` and copy/paste the README example verbatim
4. Run `ollama create mario -f ./Modelfile`
5. Receive the `Error: 400 Bad Request`
I attempted a few of the other examples as well, but couldn't get any of them to run.
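For anyone hitting the same error, here is a minimal sketch of the README's Mario example. The Modelfile content is paraphrased from memory, and the `/api/create` payload shape (`name`/`modelfile` fields) is an assumption about what `ollama create` sends, included only for illustration:

```python
# Sketch of a Mario-style Modelfile and the JSON body a client might send
# to a hypothetical POST /api/create. Field names are assumptions.

MODELFILE = """FROM llama2
PARAMETER temperature 1
SYSTEM You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.
"""

def build_create_payload(name: str, modelfile: str) -> dict:
    """Build the JSON body for a (hypothetical) model-create request."""
    return {"name": name, "modelfile": modelfile}

payload = build_create_payload("mario", MODELFILE)
```

A 400 here usually points at the Modelfile contents rather than the CLI invocation, so checking the file against the README verbatim is a good first step.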
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/111/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/111/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2568
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2568/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2568/comments
|
https://api.github.com/repos/ollama/ollama/issues/2568/events
|
https://github.com/ollama/ollama/issues/2568
| 2,140,637,001
|
I_kwDOJ0Z1Ps5_l4dJ
| 2,568
|
`/set system` in CLI still appends to System Prompt after ollama#2542
|
{
"login": "hyjwei",
"id": 76876891,
"node_id": "MDQ6VXNlcjc2ODc2ODkx",
"avatar_url": "https://avatars.githubusercontent.com/u/76876891?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hyjwei",
"html_url": "https://github.com/hyjwei",
"followers_url": "https://api.github.com/users/hyjwei/followers",
"following_url": "https://api.github.com/users/hyjwei/following{/other_user}",
"gists_url": "https://api.github.com/users/hyjwei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hyjwei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hyjwei/subscriptions",
"organizations_url": "https://api.github.com/users/hyjwei/orgs",
"repos_url": "https://api.github.com/users/hyjwei/repos",
"events_url": "https://api.github.com/users/hyjwei/events{/privacy}",
"received_events_url": "https://api.github.com/users/hyjwei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-02-17T22:12:42
| 2024-12-20T23:48:05
| 2024-12-20T23:48:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
This is a follow-up for #2492. Thanks for fixing that issue. However, `/set system` in the CLI still appends to the System Prompt after PR ollama#2542.
In the second scenario of #2492, when I load a model and then use `/set system` to specify a custom System Prompt, it does replace the old one.
However, if I load a model, start a chat with the old System Prompt first, and then use `/set system` to specify a custom System Prompt, it appends to the old one instead of replacing it.
### How to replicate
Start a new server via `ollama serve` with `OLLAMA_DEBUG=1`.
Run the client with any model, for example `ollama run phi`.
Start a chat (for example, user prompt `hello`); the prompt debug info on the server side looks like this (which is expected):
```
time=2024-02-17T16:34:41.346-05:00 level=DEBUG source=routes.go:1165 msg="chat handler" prompt="System: A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful answers to the user's questions.\nUser: hello\nAssistant:"
```
Then, set system prompt by `/set system`
```
/set system I want you to speak French only.
```
Then input the user prompt `hello` again, and the prompt debug info becomes:
```
time=2024-02-17T16:35:05.369-05:00 level=DEBUG source=routes.go:1165 msg="chat handler" prompt="System: A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful answers to the user's questions.\nUser: hello\nAssistant: Hello! How can I assist you today?\nSystem: I want you to speak French only.\nUser: hello\nAssistant:"
```
This time, the new System Prompt is appended to the chat history, which already includes the old System Prompt and the User/Assistant conversation.
From this point on, every new system prompt set via `/set system` is appended to the old prompts, even if I don't give any new user prompt. Thus multiple `/set system` calls (with no user prompt in between) make the prompt look like:
```
time=2024-02-17T16:42:31.942-05:00 level=DEBUG source=routes.go:1165 msg="chat handler" prompt="System: A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful answers to the user's questions.\nUser: hello\nAssistant: Hello! How can I assist you today?\nSystem: I want you to speak French only.\nUser: hello\nAssistant: Bonjour! Je parle français. Comment ça va? (Hello! I speak French. How are you?)\nSystem: I want you to speak English only.\nUser: hello\nAssistant: Hello! How can I assist you today?\nSystem: I want you to speak German only.\nUser: \nAssistant:System: I want you to speak Italian only.\nUser: hello\nAssistant:"
```
You can see that the User Prompt and model response are empty between the last two System Prompts.
## My thought on this issue
I am not sure whether changing the system prompt after a user prompt is a valid scenario, nor what the right behavior would be. But I do believe that, at the very least, if I issue `/set system` twice or more without a user prompt in between, only the last one should take effect.
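To illustrate the behavior I'd expect, here is a hedged sketch (my own illustration, not Ollama's actual implementation) of a chat-history update where a new system prompt replaces any earlier system messages instead of being appended:

```python
def set_system(messages: list, new_system: str) -> list:
    """Return a copy of the chat history where all prior system messages
    are dropped and the new system prompt is placed first."""
    kept = [m for m in messages if m["role"] != "system"]
    return [{"role": "system", "content": new_system}] + kept

history = [
    {"role": "system", "content": "A chat between a curious user and an assistant."},
    {"role": "user", "content": "hello"},
    {"role": "assistant", "content": "Hello! How can I assist you today?"},
]
# Two /set system calls with no user prompt in between:
history = set_system(history, "I want you to speak French only.")
history = set_system(history, "I want you to speak German only.")
# Only the last system prompt survives; the conversation turns are kept.
```

Under this scheme, repeated `/set system` calls can never accumulate stale system messages in the prompt.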
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2568/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2568/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2186
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2186/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2186/comments
|
https://api.github.com/repos/ollama/ollama/issues/2186/events
|
https://github.com/ollama/ollama/pull/2186
| 2,100,011,110
|
PR_kwDOJ0Z1Ps5lC43c
| 2,186
|
Fix clearing kv cache between requests with the same prompt
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-25T10:03:41
| 2024-01-25T21:46:21
| 2024-01-25T21:46:21
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2186",
"html_url": "https://github.com/ollama/ollama/pull/2186",
"diff_url": "https://github.com/ollama/ollama/pull/2186.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2186.patch",
"merged_at": "2024-01-25T21:46:21"
}
|
This is a (draft) fix for #1573, as it seems that the kv cache isn't cleared properly when the exact same prompt is provided repeatedly.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2186/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 2,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2186/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7490
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7490/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7490/comments
|
https://api.github.com/repos/ollama/ollama/issues/7490/events
|
https://github.com/ollama/ollama/issues/7490
| 2,632,778,502
|
I_kwDOJ0Z1Ps6c7QMG
| 7,490
|
Return an empty embed list
|
{
"login": "utopeadia",
"id": 98788152,
"node_id": "U_kgDOBeNjOA",
"avatar_url": "https://avatars.githubusercontent.com/u/98788152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/utopeadia",
"html_url": "https://github.com/utopeadia",
"followers_url": "https://api.github.com/users/utopeadia/followers",
"following_url": "https://api.github.com/users/utopeadia/following{/other_user}",
"gists_url": "https://api.github.com/users/utopeadia/gists{/gist_id}",
"starred_url": "https://api.github.com/users/utopeadia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/utopeadia/subscriptions",
"organizations_url": "https://api.github.com/users/utopeadia/orgs",
"repos_url": "https://api.github.com/users/utopeadia/repos",
"events_url": "https://api.github.com/users/utopeadia/events{/privacy}",
"received_events_url": "https://api.github.com/users/utopeadia/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-11-04T13:10:35
| 2025-01-16T06:34:28
| 2024-11-04T13:59:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I use the [bge-m3](https://ollama.com/library/bge-m3) model, an empty list is returned regardless of the input.
My test code:
```python
import requests
url = "http://localhost:12121/api/embeddings"
payload = {
"model": "bge-m3",
"input": "lol"
}
response = requests.post(url, json=payload)
print(response.text)
```
Running the program outputs:
```
{"embedding":[]}
```
### OS
Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14
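For what it's worth, my understanding (worth verifying against the API docs) is that the legacy `/api/embeddings` endpoint expects a `prompt` field, while the newer `/api/embed` endpoint takes `input`; sending `input` to `/api/embeddings` would be silently ignored, which could explain the empty list. A sketch of building the payload each endpoint expects:

```python
def build_embeddings_payload(model: str, text: str, endpoint: str) -> dict:
    """Map the text onto the field name each endpoint expects.
    Assumption: /api/embeddings (legacy) reads `prompt`; /api/embed reads `input`."""
    field = "prompt" if endpoint.endswith("/api/embeddings") else "input"
    return {"model": model, field: text}

legacy = build_embeddings_payload("bge-m3", "lol", "http://localhost:12121/api/embeddings")
modern = build_embeddings_payload("bge-m3", "lol", "http://localhost:12121/api/embed")
```

Swapping `"input"` for `"prompt"` in the test script above (or switching to `/api/embed`) is worth trying before digging deeper.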
|
{
"login": "utopeadia",
"id": 98788152,
"node_id": "U_kgDOBeNjOA",
"avatar_url": "https://avatars.githubusercontent.com/u/98788152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/utopeadia",
"html_url": "https://github.com/utopeadia",
"followers_url": "https://api.github.com/users/utopeadia/followers",
"following_url": "https://api.github.com/users/utopeadia/following{/other_user}",
"gists_url": "https://api.github.com/users/utopeadia/gists{/gist_id}",
"starred_url": "https://api.github.com/users/utopeadia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/utopeadia/subscriptions",
"organizations_url": "https://api.github.com/users/utopeadia/orgs",
"repos_url": "https://api.github.com/users/utopeadia/repos",
"events_url": "https://api.github.com/users/utopeadia/events{/privacy}",
"received_events_url": "https://api.github.com/users/utopeadia/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7490/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7490/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/544
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/544/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/544/comments
|
https://api.github.com/repos/ollama/ollama/issues/544/events
|
https://github.com/ollama/ollama/pull/544
| 1,899,587,123
|
PR_kwDOJ0Z1Ps5agAdf
| 544
|
Linking ollama-ui in README
|
{
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-09-16T22:38:21
| 2023-09-18T18:18:21
| 2023-09-18T16:50:02
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/544",
"html_url": "https://github.com/ollama/ollama/pull/544",
"diff_url": "https://github.com/ollama/ollama/pull/544.diff",
"patch_url": "https://github.com/ollama/ollama/pull/544.patch",
"merged_at": null
}
|
Adding info from https://github.com/jmorganca/ollama/issues/538#issuecomment-1722109233 to `README.md`.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/544/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/544/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/161
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/161/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/161/comments
|
https://api.github.com/repos/ollama/ollama/issues/161/events
|
https://github.com/ollama/ollama/issues/161
| 1,815,801,366
|
I_kwDOJ0Z1Ps5sOu4W
| 161
|
Asking Llama 2 to read a local text file
|
{
"login": "wwavess",
"id": 54215600,
"node_id": "MDQ6VXNlcjU0MjE1NjAw",
"avatar_url": "https://avatars.githubusercontent.com/u/54215600?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wwavess",
"html_url": "https://github.com/wwavess",
"followers_url": "https://api.github.com/users/wwavess/followers",
"following_url": "https://api.github.com/users/wwavess/following{/other_user}",
"gists_url": "https://api.github.com/users/wwavess/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wwavess/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wwavess/subscriptions",
"organizations_url": "https://api.github.com/users/wwavess/orgs",
"repos_url": "https://api.github.com/users/wwavess/repos",
"events_url": "https://api.github.com/users/wwavess/events{/privacy}",
"received_events_url": "https://api.github.com/users/wwavess/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2023-07-21T13:20:49
| 2023-08-30T21:33:55
| 2023-08-30T21:33:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Has anyone been able to get Llama 2 to read a txt file for analysis?
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/161/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/161/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2948
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2948/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2948/comments
|
https://api.github.com/repos/ollama/ollama/issues/2948/events
|
https://github.com/ollama/ollama/issues/2948
| 2,171,037,442
|
I_kwDOJ0Z1Ps6BZ2cC
| 2,948
|
Allow `api.Client` to be constructed using URL & http.Client
|
{
"login": "jackielii",
"id": 360983,
"node_id": "MDQ6VXNlcjM2MDk4Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/360983?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jackielii",
"html_url": "https://github.com/jackielii",
"followers_url": "https://api.github.com/users/jackielii/followers",
"following_url": "https://api.github.com/users/jackielii/following{/other_user}",
"gists_url": "https://api.github.com/users/jackielii/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jackielii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jackielii/subscriptions",
"organizations_url": "https://api.github.com/users/jackielii/orgs",
"repos_url": "https://api.github.com/users/jackielii/repos",
"events_url": "https://api.github.com/users/jackielii/events{/privacy}",
"received_events_url": "https://api.github.com/users/jackielii/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5667396210,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2acg",
"url": "https://api.github.com/repos/ollama/ollama/labels/good%20first%20issue",
"name": "good first issue",
"color": "7057ff",
"default": true,
"description": "Good for newcomers"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-03-06T09:28:10
| 2024-05-07T08:00:46
| 2024-05-07T08:00:46
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm using `github.com/jmorganca/ollama/api` to connect to Ollama from my Go project, and it works great. However, I run two instances of Ollama at different URLs and ports, and at the moment the `api` package only supports constructing a client from the environment:
```go
func ClientFromEnvironment() (*Client, error) {}
```
I have to use hacks like `os.Setenv` to construct two clients.
Proposal: add a new constructor that takes a `url` and an `http.Client` as parameters.
|
{
"login": "jackielii",
"id": 360983,
"node_id": "MDQ6VXNlcjM2MDk4Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/360983?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jackielii",
"html_url": "https://github.com/jackielii",
"followers_url": "https://api.github.com/users/jackielii/followers",
"following_url": "https://api.github.com/users/jackielii/following{/other_user}",
"gists_url": "https://api.github.com/users/jackielii/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jackielii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jackielii/subscriptions",
"organizations_url": "https://api.github.com/users/jackielii/orgs",
"repos_url": "https://api.github.com/users/jackielii/repos",
"events_url": "https://api.github.com/users/jackielii/events{/privacy}",
"received_events_url": "https://api.github.com/users/jackielii/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2948/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2948/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7466
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7466/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7466/comments
|
https://api.github.com/repos/ollama/ollama/issues/7466/events
|
https://github.com/ollama/ollama/pull/7466
| 2,629,791,219
|
PR_kwDOJ0Z1Ps6Aqywe
| 7,466
|
Workaround buggy P2P ROCm copy on windows
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-01T20:07:53
| 2024-11-07T22:26:34
| 2024-11-07T22:26:31
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7466",
"html_url": "https://github.com/ollama/ollama/pull/7466",
"diff_url": "https://github.com/ollama/ollama/pull/7466.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7466.patch",
"merged_at": "2024-11-07T22:26:31"
}
|
This enables the workaround code only on Windows, which should help Windows users with multiple AMD GPUs.
While testing #7378 I've only been able to reproduce the gibberish behavior on one system, and only on Windows. Windows ROCm shouldn't allow smaller system memory compared to VRAM, so we believe enabling this flag is safe. (On Linux, enabling this flag breaks users with less RAM than VRAM when they try to load a model.)
Fixes #7461
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7466/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7466/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7089
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7089/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7089/comments
|
https://api.github.com/repos/ollama/ollama/issues/7089/events
|
https://github.com/ollama/ollama/issues/7089
| 2,563,571,951
|
I_kwDOJ0Z1Ps6YzQDv
| 7,089
|
[prompt] add ollama configuration file
|
{
"login": "abitrolly",
"id": 8781107,
"node_id": "MDQ6VXNlcjg3ODExMDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8781107?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/abitrolly",
"html_url": "https://github.com/abitrolly",
"followers_url": "https://api.github.com/users/abitrolly/followers",
"following_url": "https://api.github.com/users/abitrolly/following{/other_user}",
"gists_url": "https://api.github.com/users/abitrolly/gists{/gist_id}",
"starred_url": "https://api.github.com/users/abitrolly/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/abitrolly/subscriptions",
"organizations_url": "https://api.github.com/users/abitrolly/orgs",
"repos_url": "https://api.github.com/users/abitrolly/repos",
"events_url": "https://api.github.com/users/abitrolly/events{/privacy}",
"received_events_url": "https://api.github.com/users/abitrolly/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 1
| 2024-10-03T09:35:37
| 2024-11-12T06:50:28
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I think we can now try eating our own dog food and let an LLM write the code to solve the [second-most-voted](https://github.com/ollama/ollama/issues?q=config+file+is%3Aopen+sort%3Areactions-%2B1-desc) issue, "Please don't clutter the user home directory" (https://github.com/ollama/ollama/issues/228).
Here is my attempt at prompting.
---
Please write the code to add a config file to the `ollama` project. The config provides lookup values for `envconfig` variables. The file should be in the standard OS location, which is `~/.config/ollama/config.toml` on Linux.
The code should be commented, and the variable lookup order printed in debug mode.
An example config file should be included, with all variables commented and set to their default values.
---
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7089/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7089/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2789
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2789/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2789/comments
|
https://api.github.com/repos/ollama/ollama/issues/2789/events
|
https://github.com/ollama/ollama/pull/2789
| 2,157,717,842
|
PR_kwDOJ0Z1Ps5oGy2r
| 2,789
|
prepend image tags
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-02-27T22:25:53
| 2024-02-29T19:30:15
| 2024-02-29T19:30:14
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2789",
"html_url": "https://github.com/ollama/ollama/pull/2789",
"diff_url": "https://github.com/ollama/ollama/pull/2789.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2789.patch",
"merged_at": "2024-02-29T19:30:14"
}
|
Instead of appending image tags, prepend them, which produces better results in general.
resolves #2769
resolves #2788
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2789/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2789/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5056
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5056/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5056/comments
|
https://api.github.com/repos/ollama/ollama/issues/5056/events
|
https://github.com/ollama/ollama/issues/5056
| 2,354,580,351
|
I_kwDOJ0Z1Ps6MWAt_
| 5,056
|
qwen2 model error
|
{
"login": "misi0202",
"id": 101965629,
"node_id": "U_kgDOBhPfPQ",
"avatar_url": "https://avatars.githubusercontent.com/u/101965629?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/misi0202",
"html_url": "https://github.com/misi0202",
"followers_url": "https://api.github.com/users/misi0202/followers",
"following_url": "https://api.github.com/users/misi0202/following{/other_user}",
"gists_url": "https://api.github.com/users/misi0202/gists{/gist_id}",
"starred_url": "https://api.github.com/users/misi0202/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/misi0202/subscriptions",
"organizations_url": "https://api.github.com/users/misi0202/orgs",
"repos_url": "https://api.github.com/users/misi0202/repos",
"events_url": "https://api.github.com/users/misi0202/events{/privacy}",
"received_events_url": "https://api.github.com/users/misi0202/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-06-15T06:13:13
| 2024-06-17T02:46:39
| 2024-06-17T02:46:39
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm trying to use the Qwen2-7B model with Ollama (`ollama run qwen2`), but POST `/api/generate` times out, and when I POST `/api/chat` the reply is garbage like `GGGGGGGGG`. Does Ollama not support Qwen2?
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "misi0202",
"id": 101965629,
"node_id": "U_kgDOBhPfPQ",
"avatar_url": "https://avatars.githubusercontent.com/u/101965629?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/misi0202",
"html_url": "https://github.com/misi0202",
"followers_url": "https://api.github.com/users/misi0202/followers",
"following_url": "https://api.github.com/users/misi0202/following{/other_user}",
"gists_url": "https://api.github.com/users/misi0202/gists{/gist_id}",
"starred_url": "https://api.github.com/users/misi0202/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/misi0202/subscriptions",
"organizations_url": "https://api.github.com/users/misi0202/orgs",
"repos_url": "https://api.github.com/users/misi0202/repos",
"events_url": "https://api.github.com/users/misi0202/events{/privacy}",
"received_events_url": "https://api.github.com/users/misi0202/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5056/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5056/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4551
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4551/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4551/comments
|
https://api.github.com/repos/ollama/ollama/issues/4551/events
|
https://github.com/ollama/ollama/pull/4551
| 2,307,149,054
|
PR_kwDOJ0Z1Ps5wBZQr
| 4,551
|
Added docker healthcheck to all runtime stages
|
{
"login": "codearranger",
"id": 80373433,
"node_id": "MDQ6VXNlcjgwMzczNDMz",
"avatar_url": "https://avatars.githubusercontent.com/u/80373433?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codearranger",
"html_url": "https://github.com/codearranger",
"followers_url": "https://api.github.com/users/codearranger/followers",
"following_url": "https://api.github.com/users/codearranger/following{/other_user}",
"gists_url": "https://api.github.com/users/codearranger/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codearranger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codearranger/subscriptions",
"organizations_url": "https://api.github.com/users/codearranger/orgs",
"repos_url": "https://api.github.com/users/codearranger/repos",
"events_url": "https://api.github.com/users/codearranger/events{/privacy}",
"received_events_url": "https://api.github.com/users/codearranger/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-05-21T02:45:06
| 2024-11-23T21:12:29
| 2024-11-23T21:12:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4551",
"html_url": "https://github.com/ollama/ollama/pull/4551",
"diff_url": "https://github.com/ollama/ollama/pull/4551.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4551.patch",
"merged_at": null
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4551/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4551/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2426
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2426/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2426/comments
|
https://api.github.com/repos/ollama/ollama/issues/2426/events
|
https://github.com/ollama/ollama/issues/2426
| 2,126,843,376
|
I_kwDOJ0Z1Ps5-xQ3w
| 2,426
|
In the blog post -> https://ollama.ai/blog/openai-compatibility change the name of Autogen
|
{
"login": "Naqqash",
"id": 4791247,
"node_id": "MDQ6VXNlcjQ3OTEyNDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/4791247?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Naqqash",
"html_url": "https://github.com/Naqqash",
"followers_url": "https://api.github.com/users/Naqqash/followers",
"following_url": "https://api.github.com/users/Naqqash/following{/other_user}",
"gists_url": "https://api.github.com/users/Naqqash/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Naqqash/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Naqqash/subscriptions",
"organizations_url": "https://api.github.com/users/Naqqash/orgs",
"repos_url": "https://api.github.com/users/Naqqash/repos",
"events_url": "https://api.github.com/users/Naqqash/events{/privacy}",
"received_events_url": "https://api.github.com/users/Naqqash/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-02-09T10:19:18
| 2024-02-09T13:18:23
| 2024-02-09T13:18:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
In the blog post, the installation instruction is written as `pip install autogenpy`; it should be `pip install pyautogen`.
Reference: https://github.com/microsoft/autogen
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2426/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2426/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5018
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5018/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5018/comments
|
https://api.github.com/repos/ollama/ollama/issues/5018/events
|
https://github.com/ollama/ollama/pull/5018
| 2,350,685,626
|
PR_kwDOJ0Z1Ps5yVum6
| 5,018
|
fix utf8 parser error
|
{
"login": "007gzs",
"id": 5856259,
"node_id": "MDQ6VXNlcjU4NTYyNTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5856259?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/007gzs",
"html_url": "https://github.com/007gzs",
"followers_url": "https://api.github.com/users/007gzs/followers",
"following_url": "https://api.github.com/users/007gzs/following{/other_user}",
"gists_url": "https://api.github.com/users/007gzs/gists{/gist_id}",
"starred_url": "https://api.github.com/users/007gzs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/007gzs/subscriptions",
"organizations_url": "https://api.github.com/users/007gzs/orgs",
"repos_url": "https://api.github.com/users/007gzs/repos",
"events_url": "https://api.github.com/users/007gzs/events{/privacy}",
"received_events_url": "https://api.github.com/users/007gzs/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-06-13T09:44:18
| 2024-06-13T17:35:39
| 2024-06-13T17:35:39
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5018",
"html_url": "https://github.com/ollama/ollama/pull/5018",
"diff_url": "https://github.com/ollama/ollama/pull/5018.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5018.patch",
"merged_at": null
}
|
In `v0.1.43`, when the Modelfile contains UTF-8 characters, parsing produces `�������������`.
Test code:
```go
package main

import (
	"fmt"
	"io"
	"strings"

	"github.com/ollama/ollama/parser"
)

func main() {
	var Modelfile string = "FROM llama3:70b\nSYSTEM \"\"\"\n提问和回答都使用中文\n\"\"\""
	var sr io.Reader = strings.NewReader(Modelfile)
	f, err := parser.ParseFile(sr)
	fmt.Printf("err: %v\n", err)
	fmt.Printf("f: %v\n", f)
}
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5018/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5018/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1729
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1729/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1729/comments
|
https://api.github.com/repos/ollama/ollama/issues/1729/events
|
https://github.com/ollama/ollama/issues/1729
| 2,057,390,143
|
I_kwDOJ0Z1Ps56oUg_
| 1,729
|
Function call with Ollama and LlamaIndex
|
{
"login": "sandangel",
"id": 22189661,
"node_id": "MDQ6VXNlcjIyMTg5NjYx",
"avatar_url": "https://avatars.githubusercontent.com/u/22189661?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sandangel",
"html_url": "https://github.com/sandangel",
"followers_url": "https://api.github.com/users/sandangel/followers",
"following_url": "https://api.github.com/users/sandangel/following{/other_user}",
"gists_url": "https://api.github.com/users/sandangel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sandangel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sandangel/subscriptions",
"organizations_url": "https://api.github.com/users/sandangel/orgs",
"repos_url": "https://api.github.com/users/sandangel/repos",
"events_url": "https://api.github.com/users/sandangel/events{/privacy}",
"received_events_url": "https://api.github.com/users/sandangel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 15
| 2023-12-27T13:44:04
| 2024-07-26T00:47:05
| 2024-07-26T00:47:04
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi, I'm looking for a way to add function calling that works with Ollama and LlamaIndex.
From my research, Ollama already supports JSON-format output, so theoretically there are two ways we could support function calling:
1. Force the LLM to output JSON following a schema, then call the function based on that JSON output.
  * I'm not sure how reliable this approach is. Has anyone been able to get consistent output from the LLM for the exact same prompt?
  * The client side also needs a retry mechanism that feeds the previous output and errors back to the LLM and asks it to regenerate.
  * Which schemas and data structures should we use? Currently most people seem to go with the OpenAI function-call schema, but it does not support validation, so we would probably need a pydantic model and have to keep it up to date to validate the LLM's responses.
  * An example: https://github.com/lgrammel/modelfusion/blob/main/examples/basic/src/model-provider/ollama/ollama-chat-use-tools-or-generator-text-mistral-example.ts
2. Add an API in Ollama itself that supports function calling directly, similar to OpenAI.
  * I'm not sure how this would work, especially since OpenAI is not open source. Do you think it's possible to implement function calling directly in Ollama?
  * I'm also not sure whether we would need a specific model that supports function calls, so we can feed `{ role: "tool", content: "tool output" }` into the LLM,
  * or whether it's simply a feature we can add at the API level.
Please let me know what you think and what the right approach for this issue should be going forward.
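To make approach 1 concrete, here is a minimal, hedged sketch of the client-side validate-and-retry loop described above. The function name `call_with_retry`, the `required_keys` schema, and the error-feedback prompt wording are all illustrative assumptions, not an existing API; the `llm` callable stands in for any client that hits Ollama with `format: "json"`:

```python
import json
from typing import Callable

def call_with_retry(llm: Callable[[str], str], prompt: str,
                    required_keys: dict[str, type], max_retries: int = 3) -> dict:
    """Ask the model for JSON, validate it against a simple key/type schema,
    and feed the previous output plus the error back on each failure."""
    current_prompt = prompt
    raw = ""
    for _ in range(max_retries):
        raw = llm(current_prompt)
        try:
            parsed = json.loads(raw)
            missing = [k for k, t in required_keys.items()
                       if k not in parsed or not isinstance(parsed[k], t)]
            if not missing:
                return parsed
            error = f"missing or mistyped keys: {missing}"
        except json.JSONDecodeError as exc:
            error = str(exc)
        # Retry: include the invalid output and the error in the next prompt.
        current_prompt = (f"{prompt}\nYour previous output was invalid "
                          f"({error}):\n{raw}\nReturn valid JSON only.")
    raise ValueError("model never produced valid JSON")
```

A real implementation would likely swap the hand-rolled key check for pydantic validation, but the control flow (parse, validate, feed errors back, bound the retries) stays the same.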
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1729/reactions",
"total_count": 12,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 6
}
|
https://api.github.com/repos/ollama/ollama/issues/1729/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/613
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/613/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/613/comments
|
https://api.github.com/repos/ollama/ollama/issues/613/events
|
https://github.com/ollama/ollama/issues/613
| 1,914,519,684
|
I_kwDOJ0Z1Ps5yHUCE
| 613
|
Getting permission denied when attempting to create a model
|
{
"login": "DeanKamali",
"id": 1252959,
"node_id": "MDQ6VXNlcjEyNTI5NTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/1252959?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DeanKamali",
"html_url": "https://github.com/DeanKamali",
"followers_url": "https://api.github.com/users/DeanKamali/followers",
"following_url": "https://api.github.com/users/DeanKamali/following{/other_user}",
"gists_url": "https://api.github.com/users/DeanKamali/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DeanKamali/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DeanKamali/subscriptions",
"organizations_url": "https://api.github.com/users/DeanKamali/orgs",
"repos_url": "https://api.github.com/users/DeanKamali/repos",
"events_url": "https://api.github.com/users/DeanKamali/events{/privacy}",
"received_events_url": "https://api.github.com/users/DeanKamali/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 14
| 2023-09-27T01:20:48
| 2024-12-02T04:40:44
| 2023-11-16T00:41:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
ollama version: v0.1.0
**Steps to Reproduce:**
- Ran `curl https://ollama.ai/install.sh | sh` to install ollama.
- Navigated to ollama/examples/devops-engineer/.
- Executed `ollama create devops-engineer -f ./Modelfile`.
Error Encountered:
`couldn't open modelfile '/root/ollama/examples/devops-engineer/Modelfile'. Error: failed to open file: open /root/ollama/examples/devops-engineer/Modelfile: permission denied.`

After setting the Modelfile permissions to 755 in my WSL environment, I still couldn't get it to work. I'm wondering if there is a bug in the recent Linux release.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/613/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/613/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3078
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3078/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3078/comments
|
https://api.github.com/repos/ollama/ollama/issues/3078/events
|
https://github.com/ollama/ollama/issues/3078
| 2,181,653,843
|
I_kwDOJ0Z1Ps6CCWVT
| 3,078
|
Ollama is not using 100% of RTX4000 VRAM (18 of 20GB)
|
{
"login": "nfsecurity",
"id": 16274031,
"node_id": "MDQ6VXNlcjE2Mjc0MDMx",
"avatar_url": "https://avatars.githubusercontent.com/u/16274031?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nfsecurity",
"html_url": "https://github.com/nfsecurity",
"followers_url": "https://api.github.com/users/nfsecurity/followers",
"following_url": "https://api.github.com/users/nfsecurity/following{/other_user}",
"gists_url": "https://api.github.com/users/nfsecurity/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nfsecurity/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nfsecurity/subscriptions",
"organizations_url": "https://api.github.com/users/nfsecurity/orgs",
"repos_url": "https://api.github.com/users/nfsecurity/repos",
"events_url": "https://api.github.com/users/nfsecurity/events{/privacy}",
"received_events_url": "https://api.github.com/users/nfsecurity/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5808482718,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng",
"url": "https://api.github.com/repos/ollama/ollama/labels/performance",
"name": "performance",
"color": "A5B5C6",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
open
| false
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 29
| 2024-03-12T13:40:42
| 2025-01-08T20:34:31
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi, thank you for the wonderful ollama project and the amazing community!
<img width="742" alt="Screenshot 2024-03-12 at 8 32 31 AM" src="https://github.com/ollama/ollama/assets/16274031/a47d6ad9-3602-4ffe-984d-0ec858f95b6f">
I am testing the Mixtral 3-bit quantized model on an RTX4000 with 20GB of VRAM. The model is 20GB in size, and as you can see in the nvidia-smi screenshot, Ollama is using only 18GB; the rest of the model was loaded into system RAM.
Is this normal, or is it an issue? Can I force Ollama to use 100% of the VRAM? Thank you!
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3078/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3078/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2111
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2111/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2111/comments
|
https://api.github.com/repos/ollama/ollama/issues/2111/events
|
https://github.com/ollama/ollama/issues/2111
| 2,092,182,139
|
I_kwDOJ0Z1Ps58tCp7
| 2,111
|
Enable installation without root privilege
|
{
"login": "chunhualiao",
"id": 1627206,
"node_id": "MDQ6VXNlcjE2MjcyMDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/1627206?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chunhualiao",
"html_url": "https://github.com/chunhualiao",
"followers_url": "https://api.github.com/users/chunhualiao/followers",
"following_url": "https://api.github.com/users/chunhualiao/following{/other_user}",
"gists_url": "https://api.github.com/users/chunhualiao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chunhualiao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chunhualiao/subscriptions",
"organizations_url": "https://api.github.com/users/chunhualiao/orgs",
"repos_url": "https://api.github.com/users/chunhualiao/repos",
"events_url": "https://api.github.com/users/chunhualiao/events{/privacy}",
"received_events_url": "https://api.github.com/users/chunhualiao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 12
| 2024-01-20T17:59:05
| 2024-10-08T11:44:57
| 2024-01-21T00:01:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It seems like ollama runs sudo during its installation on Linux.
Please support installation and use by users without sudo privileges.
Thanks.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2111/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2111/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1386
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1386/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1386/comments
|
https://api.github.com/repos/ollama/ollama/issues/1386/events
|
https://github.com/ollama/ollama/issues/1386
| 2,025,089,735
|
I_kwDOJ0Z1Ps54tGrH
| 1,386
|
Linux kernel traps ollama runner with invalid opcode
|
{
"login": "clvgt12",
"id": 15834506,
"node_id": "MDQ6VXNlcjE1ODM0NTA2",
"avatar_url": "https://avatars.githubusercontent.com/u/15834506?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/clvgt12",
"html_url": "https://github.com/clvgt12",
"followers_url": "https://api.github.com/users/clvgt12/followers",
"following_url": "https://api.github.com/users/clvgt12/following{/other_user}",
"gists_url": "https://api.github.com/users/clvgt12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/clvgt12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/clvgt12/subscriptions",
"organizations_url": "https://api.github.com/users/clvgt12/orgs",
"repos_url": "https://api.github.com/users/clvgt12/repos",
"events_url": "https://api.github.com/users/clvgt12/events{/privacy}",
"received_events_url": "https://api.github.com/users/clvgt12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 8
| 2023-12-05T01:13:12
| 2024-01-27T01:55:52
| 2024-01-27T01:55:51
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have an Ubuntu laptop, and installed ollama and the llama2 model using this script:
```
$ curl https://ollama.ai/install.sh | sh
>>> The Ollama API is now available at 0.0.0.0:11434.
>>> Install complete. Run "ollama" from the command line.
$ ollama pull llama2
pulling manifest
pulling 22f7f8ef5f4c... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 3.8 GB
pulling 8c17c2ebb0ea... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 7.0 KB
pulling 7c23fb36d801... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 4.8 KB
pulling 2e0493f67d0c... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 59 B
pulling 2759286baa87... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 105 B
pulling 5407e3188df9... 100% ▕██████████████████████████████████████████████████████████████████████████████████████████████████████████████▏ 529 B
verifying sha256 digest
writing manifest
removing any unused layers
success
$ ollama run llama2
Error: llama runner process has terminated
```
I noticed these journal logs that indicate a coredump (timestamps are in reverse chronological order):
```
$ journalctl -r -n50 | grep ollama
Dec 04 19:53:42 martini ollama[25463]: [GIN] 2023/12/04 - 19:53:42 | 500 | 3.287409943s | 127.0.0.1 | POST "/api/generate"
Dec 04 19:53:42 martini ollama[25463]: 2023/12/04 19:53:42 llama.go:510: llama runner stopped successfully
Dec 04 19:53:42 martini ollama[25463]: 2023/12/04 19:53:42 llama.go:444: error starting llama runner: llama runner process has terminated
Dec 04 19:53:42 martini ollama[25463]: 2023/12/04 19:53:42 llama.go:436: signal: illegal instruction (core dumped)
Dec 04 19:53:42 martini kernel: traps: ollama-runner[25996] trap invalid opcode ip:5012fc sp:7ffee2087ef0 error:0 in ollama-runner[408000+163000]
Dec 04 19:53:41 martini ollama[25463]: 2023/12/04 19:53:41 llama.go:479: waiting for llama runner to start responding
Dec 04 19:53:41 martini ollama[25463]: 2023/12/04 19:53:41 llama.go:421: starting llama runner
Dec 04 19:53:41 martini ollama[25463]: 2023/12/04 19:53:41 llama.go:390: skipping accelerated runner because num_gpu=0
Dec 04 19:53:39 martini ollama[25463]: [GIN] 2023/12/04 - 19:53:39 | 200 | 1.466176ms | 127.0.0.1 | POST "/api/show"
Dec 04 19:53:39 martini ollama[25463]: [GIN] 2023/12/04 - 19:53:39 | 200 | 1.918687ms | 127.0.0.1 | HEAD "/"
```
Here is my Linux kernel and distribution information:
```
$ uname -a
Linux martini 6.2.0-1017-lowlatency #17~22.04.1-Ubuntu SMP PREEMPT_DYNAMIC Thu Nov 2 18:30:14 UTC 2 x86_64 x86_64 x86_64 GNU/Linux
$ cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04.3 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.3 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy
```
Note that I am using a low-latency kernel built for the Ubuntu Studio distribution.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1386/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1386/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3623
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3623/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3623/comments
|
https://api.github.com/repos/ollama/ollama/issues/3623/events
|
https://github.com/ollama/ollama/issues/3623
| 2,241,475,830
|
I_kwDOJ0Z1Ps6FmjT2
| 3,623
|
[v0.1.32-pre for Windows] ollama server does not exit quitting from the system tray icon
|
{
"login": "mann1x",
"id": 20623405,
"node_id": "MDQ6VXNlcjIwNjIzNDA1",
"avatar_url": "https://avatars.githubusercontent.com/u/20623405?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mann1x",
"html_url": "https://github.com/mann1x",
"followers_url": "https://api.github.com/users/mann1x/followers",
"following_url": "https://api.github.com/users/mann1x/following{/other_user}",
"gists_url": "https://api.github.com/users/mann1x/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mann1x/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mann1x/subscriptions",
"organizations_url": "https://api.github.com/users/mann1x/orgs",
"repos_url": "https://api.github.com/users/mann1x/repos",
"events_url": "https://api.github.com/users/mann1x/events{/privacy}",
"received_events_url": "https://api.github.com/users/mann1x/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-04-13T10:09:38
| 2024-04-14T22:33:26
| 2024-04-14T22:33:26
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ollama is not stopped when quitting using the system tray icon
### What did you expect to see?
ollama app.exe and ollama.exe not running anymore
### Steps to reproduce
just quit from the system tray icon
### Are there any recent changes that introduced the issue?
This issue is new with the pre-release version
### OS
Windows
### Architecture
amd64
### Platform
_No response_
### Ollama version
0.1.32-pre
### GPU
Nvidia
### GPU info
_No response_
### CPU
AMD
### Other software
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3623/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3623/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5691
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5691/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5691/comments
|
https://api.github.com/repos/ollama/ollama/issues/5691/events
|
https://github.com/ollama/ollama/issues/5691
| 2,407,610,489
|
I_kwDOJ0Z1Ps6PgTh5
| 5,691
|
Run model by index
|
{
"login": "peteruithoven",
"id": 523210,
"node_id": "MDQ6VXNlcjUyMzIxMA==",
"avatar_url": "https://avatars.githubusercontent.com/u/523210?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/peteruithoven",
"html_url": "https://github.com/peteruithoven",
"followers_url": "https://api.github.com/users/peteruithoven/followers",
"following_url": "https://api.github.com/users/peteruithoven/following{/other_user}",
"gists_url": "https://api.github.com/users/peteruithoven/gists{/gist_id}",
"starred_url": "https://api.github.com/users/peteruithoven/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/peteruithoven/subscriptions",
"organizations_url": "https://api.github.com/users/peteruithoven/orgs",
"repos_url": "https://api.github.com/users/peteruithoven/repos",
"events_url": "https://api.github.com/users/peteruithoven/events{/privacy}",
"received_events_url": "https://api.github.com/users/peteruithoven/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 1
| 2024-07-14T21:55:44
| 2024-07-14T22:07:54
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When running models using the CLI, the whole name needs to be used, e.g. `ollama run deepseek-coder-v2`. Some of these names are hard to remember, and I often copy them from `ollama list`. What if we could also run them by their index in `ollama list`?
You could just run `ollama list`, see the indexes, and then run `ollama run 1`, for example.
When you run the same model often, you can just remember its index.
Ideally `ollama list` would also show these indexes:
```
# NAME ID SIZE MODIFIED
1 llama3:latest 365c0bd3c000 4.7 GB 16 minutes ago
2 mistral-openorca:latest 12dc6acc14d0 4.1 GB 11 minutes ago
```
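To illustrate, here is a small sketch of how a wrapper (or the CLI itself) could resolve an index argument against the `ollama list` output. The function name `resolve_model` and the 1-based indexing are assumptions matching the example table above, not existing Ollama behavior:

```python
def resolve_model(arg: str, list_output: str) -> str:
    """Map a 1-based index into `ollama list` output to a model name.
    Non-numeric arguments (real model names) pass through unchanged."""
    if not arg.isdigit():
        return arg
    rows = list_output.strip().splitlines()[1:]  # skip the header row
    index = int(arg)
    if not 1 <= index <= len(rows):
        raise ValueError(f"index {index} out of range")
    # The model name is the first whitespace-separated column.
    return rows[index - 1].split()[0]
```

One caveat with this feature: the list order can change as models are pulled or removed, so a remembered index could silently point at a different model than intended.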
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5691/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5691/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7125
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7125/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7125/comments
|
https://api.github.com/repos/ollama/ollama/issues/7125/events
|
https://github.com/ollama/ollama/issues/7125
| 2,571,827,594
|
I_kwDOJ0Z1Ps6ZSvmK
| 7,125
|
openai: support max_completion_tokens due to deprecation of max_tokens
|
{
"login": "codefromthecrypt",
"id": 64215,
"node_id": "MDQ6VXNlcjY0MjE1",
"avatar_url": "https://avatars.githubusercontent.com/u/64215?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codefromthecrypt",
"html_url": "https://github.com/codefromthecrypt",
"followers_url": "https://api.github.com/users/codefromthecrypt/followers",
"following_url": "https://api.github.com/users/codefromthecrypt/following{/other_user}",
"gists_url": "https://api.github.com/users/codefromthecrypt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codefromthecrypt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codefromthecrypt/subscriptions",
"organizations_url": "https://api.github.com/users/codefromthecrypt/orgs",
"repos_url": "https://api.github.com/users/codefromthecrypt/repos",
"events_url": "https://api.github.com/users/codefromthecrypt/events{/privacy}",
"received_events_url": "https://api.github.com/users/codefromthecrypt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 0
| 2024-10-08T01:17:37
| 2024-11-06T00:02:14
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
max_tokens is now deprecated in favor of max_completion_tokens. I suspect we should support both. One way is to define another field in our request object and then fall back if one or the other isn't set: https://github.com/ollama/ollama/blob/defbf9425af8228f3420d567e9eeaa29d8ac87e3/openai/openai.go#L77
See https://platform.openai.com/docs/api-reference/chat/create#chat-create-max_tokens
See https://github.com/openai/openai-openapi/blob/10053bef25cd50a7424f5265ba51a7a63ba95b48/openapi.yaml#L9854-L9866
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7125/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7125/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5954
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5954/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5954/comments
|
https://api.github.com/repos/ollama/ollama/issues/5954/events
|
https://github.com/ollama/ollama/issues/5954
| 2,430,404,262
|
I_kwDOJ0Z1Ps6Q3Qam
| 5,954
|
Detecting macOS GPUs when using Podman with GPU passthrough
|
{
"login": "ThomasVitale",
"id": 8523418,
"node_id": "MDQ6VXNlcjg1MjM0MTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8523418?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ThomasVitale",
"html_url": "https://github.com/ThomasVitale",
"followers_url": "https://api.github.com/users/ThomasVitale/followers",
"following_url": "https://api.github.com/users/ThomasVitale/following{/other_user}",
"gists_url": "https://api.github.com/users/ThomasVitale/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ThomasVitale/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ThomasVitale/subscriptions",
"organizations_url": "https://api.github.com/users/ThomasVitale/orgs",
"repos_url": "https://api.github.com/users/ThomasVitale/repos",
"events_url": "https://api.github.com/users/ThomasVitale/events{/privacy}",
"received_events_url": "https://api.github.com/users/ThomasVitale/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 1
| 2024-07-25T15:58:38
| 2024-12-10T16:40:20
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Podman provides support for making the local GPU on a macOS computer available from within a container. This article describes the setup for it: https://blog.podman.io/2024/07/podman-and-libkrun/.
```shell
% podman machine ssh ls -l /dev/dri
total 0
drwxr-xr-x. 2 root root 80 Jul 25 17:12 by-path
crw-rw----. 1 root video 226, 0 Jul 25 17:12 card0
crw-rw-rw-. 1 root render 226, 128 Jul 25 17:12 renderD128
```
When I run an Ollama container, it doesn't seem to recognise the GPU. Is there an option I can use to make that work, or is a new implementation needed within the Ollama project to support it?
```shell
docker run -it --rm -p 11434:11434 --device /dev/dri -e OLLAMA_DEBUG=true ollama/ollama
```
Logs:
```
2024/07/25 15:56:42 routes.go:1100: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:true OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/root/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-07-25T15:56:42.248Z level=INFO source=images.go:784 msg="total blobs: 0"
time=2024-07-25T15:56:42.248Z level=INFO source=images.go:791 msg="total unused blobs removed: 0"
time=2024-07-25T15:56:42.248Z level=INFO source=routes.go:1147 msg="Listening on [::]:11434 (version 0.2.8)"
time=2024-07-25T15:56:42.249Z level=INFO source=payload.go:30 msg="extracting embedded files" dir=/tmp/ollama2409035362/runners
time=2024-07-25T15:56:42.249Z level=DEBUG source=payload.go:182 msg=extracting variant=cpu file=build/linux/arm64/cpu/bin/ollama_llama_server.gz
time=2024-07-25T15:56:42.249Z level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/arm64/cuda_v11/bin/libcublas.so.11.gz
time=2024-07-25T15:56:42.249Z level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/arm64/cuda_v11/bin/libcublasLt.so.11.gz
time=2024-07-25T15:56:42.249Z level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/arm64/cuda_v11/bin/libcudart.so.11.0.gz
time=2024-07-25T15:56:42.249Z level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/arm64/cuda_v11/bin/ollama_llama_server.gz
time=2024-07-25T15:56:47.047Z level=DEBUG source=payload.go:71 msg="availableServers : found" file=/tmp/ollama2409035362/runners/cpu/ollama_llama_server
time=2024-07-25T15:56:47.047Z level=DEBUG source=payload.go:71 msg="availableServers : found" file=/tmp/ollama2409035362/runners/cuda_v11/ollama_llama_server
time=2024-07-25T15:56:47.047Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cuda_v11]"
time=2024-07-25T15:56:47.047Z level=DEBUG source=payload.go:45 msg="Override detection logic by setting OLLAMA_LLM_LIBRARY"
time=2024-07-25T15:56:47.047Z level=DEBUG source=sched.go:102 msg="starting llm scheduler"
time=2024-07-25T15:56:47.047Z level=INFO source=gpu.go:205 msg="looking for compatible GPUs"
time=2024-07-25T15:56:47.048Z level=DEBUG source=gpu.go:91 msg="searching for GPU discovery libraries for NVIDIA"
time=2024-07-25T15:56:47.048Z level=DEBUG source=gpu.go:468 msg="Searching for GPU library" name=libcuda.so*
time=2024-07-25T15:56:47.049Z level=DEBUG source=gpu.go:487 msg="gpu library search" globs="[/usr/local/nvidia/lib/libcuda.so** /usr/local/nvidia/lib64/libcuda.so** /usr/local/cuda*/targets/*/lib/libcuda.so* /usr/lib/*-linux-gnu/nvidia/current/libcuda.so* /usr/lib/*-linux-gnu/libcuda.so* /usr/lib/wsl/lib/libcuda.so* /usr/lib/wsl/drivers/*/libcuda.so* /opt/cuda/lib*/libcuda.so* /usr/local/cuda/lib*/libcuda.so* /usr/lib*/libcuda.so* /usr/local/lib*/libcuda.so*]"
time=2024-07-25T15:56:47.049Z level=DEBUG source=gpu.go:521 msg="discovered GPU libraries" paths=[]
time=2024-07-25T15:56:47.049Z level=DEBUG source=gpu.go:468 msg="Searching for GPU library" name=libcudart.so*
time=2024-07-25T15:56:47.049Z level=DEBUG source=gpu.go:487 msg="gpu library search" globs="[/usr/local/nvidia/lib/libcudart.so** /usr/local/nvidia/lib64/libcudart.so** /tmp/ollama2409035362/runners/cuda*/libcudart.so* /usr/local/cuda/lib64/libcudart.so* /usr/lib/x86_64-linux-gnu/nvidia/current/libcudart.so* /usr/lib/x86_64-linux-gnu/libcudart.so* /usr/lib/wsl/lib/libcudart.so* /usr/lib/wsl/drivers/*/libcudart.so* /opt/cuda/lib64/libcudart.so* /usr/local/cuda*/targets/aarch64-linux/lib/libcudart.so* /usr/lib/aarch64-linux-gnu/nvidia/current/libcudart.so* /usr/lib/aarch64-linux-gnu/libcudart.so* /usr/local/cuda/lib*/libcudart.so* /usr/lib*/libcudart.so* /usr/local/lib*/libcudart.so*]"
time=2024-07-25T15:56:47.049Z level=DEBUG source=gpu.go:521 msg="discovered GPU libraries" paths=[/tmp/ollama2409035362/runners/cuda_v11/libcudart.so.11.0]
cudaSetDevice err: 35
time=2024-07-25T15:56:47.057Z level=DEBUG source=gpu.go:533 msg="Unable to load cudart" library=/tmp/ollama2409035362/runners/cuda_v11/libcudart.so.11.0 error="your nvidia driver is too old or missing. If you have a CUDA GPU please upgrade to run ollama"
time=2024-07-25T15:56:47.057Z level=DEBUG source=amd_linux.go:356 msg="amdgpu driver not detected /sys/module/amdgpu"
time=2024-07-25T15:56:47.057Z level=INFO source=gpu.go:346 msg="no compatible GPUs were discovered"
time=2024-07-25T15:56:47.057Z level=INFO source=types.go:105 msg="inference compute" id=0 library=cpu compute="" driver=0.0 name="" total="4.4 GiB" available="4.0 GiB"
```
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5954/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5954/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/8627
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8627/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8627/comments
|
https://api.github.com/repos/ollama/ollama/issues/8627/events
|
https://github.com/ollama/ollama/issues/8627
| 2,815,141,322
|
I_kwDOJ0Z1Ps6ny6XK
| 8,627
|
Deepseek-r1 can't read document or picture
|
{
"login": "hereshui3",
"id": 163418623,
"node_id": "U_kgDOCb2R_w",
"avatar_url": "https://avatars.githubusercontent.com/u/163418623?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hereshui3",
"html_url": "https://github.com/hereshui3",
"followers_url": "https://api.github.com/users/hereshui3/followers",
"following_url": "https://api.github.com/users/hereshui3/following{/other_user}",
"gists_url": "https://api.github.com/users/hereshui3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hereshui3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hereshui3/subscriptions",
"organizations_url": "https://api.github.com/users/hereshui3/orgs",
"repos_url": "https://api.github.com/users/hereshui3/repos",
"events_url": "https://api.github.com/users/hereshui3/events{/privacy}",
"received_events_url": "https://api.github.com/users/hereshui3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2025-01-28T09:53:59
| 2025-01-28T11:01:15
| 2025-01-28T11:01:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I use Chatbox as a front end for DeepSeek, but when I try to send a document or picture, the output is garbled.
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
deepseek
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8627/reactions",
"total_count": 1,
"+1": 0,
"-1": 1,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8627/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2176
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2176/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2176/comments
|
https://api.github.com/repos/ollama/ollama/issues/2176/events
|
https://github.com/ollama/ollama/issues/2176
| 2,098,920,149
|
I_kwDOJ0Z1Ps59GvrV
| 2,176
|
Ollama instance stuck and hanging after few hours.
|
{
"login": "jayouimet",
"id": 54856778,
"node_id": "MDQ6VXNlcjU0ODU2Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/54856778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jayouimet",
"html_url": "https://github.com/jayouimet",
"followers_url": "https://api.github.com/users/jayouimet/followers",
"following_url": "https://api.github.com/users/jayouimet/following{/other_user}",
"gists_url": "https://api.github.com/users/jayouimet/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jayouimet/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jayouimet/subscriptions",
"organizations_url": "https://api.github.com/users/jayouimet/orgs",
"repos_url": "https://api.github.com/users/jayouimet/repos",
"events_url": "https://api.github.com/users/jayouimet/events{/privacy}",
"received_events_url": "https://api.github.com/users/jayouimet/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-01-24T19:17:39
| 2024-10-02T17:09:36
| 2024-06-01T20:09:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
We have an Ollama instance that starts to hang after a few hours of use.
When using Ctrl+C to stop the server, we get a long stack trace resembling the following (lines may be missing at the top, as this is the maximum scrollback I can capture over SSH):
```
net/http/server.go:3086 +0x30 fp=0x140008e5fd0 sp=0x140008e5fa0 pc=0x104b90040
runtime.goexit()
runtime/asm_arm64.s:1197 +0x4 fp=0x140008e5fd0 sp=0x140008e5fd0 pc=0x1049679f4
created by net/http.(*Server).Serve in goroutine 1
net/http/server.go:3086 +0x4cc
goroutine 394 [sync.Mutex.Lock, 6 minutes]:
runtime.gopark(0x140008e2fc8?, 0x104953134?, 0xf0?, 0x88?, 0x140008e2fe8?)
runtime/proc.go:398 +0xc8 fp=0x140008e2f90 sp=0x140008e2f70 pc=0x1049364e8
runtime.goparkunlock(...)
runtime/proc.go:404
runtime.semacquire1(0x1055b2124, 0x7d?, 0x3, 0x1, 0x42?)
runtime/sema.go:160 +0x208 fp=0x140008e2fe0 sp=0x140008e2f90 pc=0x104947b08
sync.runtime_SemacquireMutex(0x14000348450?, 0x0?, 0x0?)
runtime/sema.go:77 +0x28 fp=0x140008e3020 sp=0x140008e2fe0 pc=0x104963248
sync.(*Mutex).lockSlow(0x1055b2120)
sync/mutex.go:171 +0x174 fp=0x140008e3070 sp=0x140008e3020 pc=0x104972114
sync.(*Mutex).Lock(...)
sync/mutex.go:90
github.com/jmorganca/ollama/server.GenerateHandler(0x140008fe200)
github.com/jmorganca/ollama/server/routes.go:140 +0x90 fp=0x140008e3720 sp=0x140008e3070 pc=0x104e0ca60
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func1(0x140008fe200)
github.com/jmorganca/ollama/server/routes.go:877 +0x78 fp=0x140008e3760 sp=0x140008e3720 pc=0x104e14dd8
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.CustomRecoveryWithWriter.func1(0x140008fe200)
github.com/gin-gonic/gin@v1.9.1/recovery.go:102 +0x80 fp=0x140008e37b0 sp=0x140008e3760 pc=0x104df3900
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.LoggerWithConfig.func1(0x140008fe200)
github.com/gin-gonic/gin@v1.9.1/logger.go:240 +0xb0 fp=0x140008e3960 sp=0x140008e37b0 pc=0x104df2ca0
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.(*Engine).handleHTTPRequest(0x1400014fa00, 0x140008fe200)
github.com/gin-gonic/gin@v1.9.1/gin.go:620 +0x524 fp=0x140008e3af0 sp=0x140008e3960 pc=0x104df1dd4
github.com/gin-gonic/gin.(*Engine).ServeHTTP(0x1400014fa00, {0x1051bc230?, 0x140008f20e0}, 0x140008fe100)
github.com/gin-gonic/gin@v1.9.1/gin.go:576 +0x1a0 fp=0x140008e3b30 sp=0x140008e3af0 pc=0x104df1720
net/http.serverHandler.ServeHTTP({0x1051ba500?}, {0x1051bc230?, 0x140008f20e0?}, 0x6?)
net/http/server.go:2938 +0xbc fp=0x140008e3b60 sp=0x140008e3b30 pc=0x104b8f92c
net/http.(*conn).serve(0x1400039e360, {0x1051bd7d8, 0x1400047a570})
net/http/server.go:2009 +0x518 fp=0x140008e3fa0 sp=0x140008e3b60 pc=0x104b8bd28
net/http.(*Server).Serve.func3()
net/http/server.go:3086 +0x30 fp=0x140008e3fd0 sp=0x140008e3fa0 pc=0x104b90040
runtime.goexit()
runtime/asm_arm64.s:1197 +0x4 fp=0x140008e3fd0 sp=0x140008e3fd0 pc=0x1049679f4
created by net/http.(*Server).Serve in goroutine 1
net/http/server.go:3086 +0x4cc
r0 0x458
r1 0xffffffffffffffff
r2 0x1
r3 0x1
r4 0x0
r5 0x1388
r6 0x34
r7 0x0
r8 0x3c
r9 0x1e6d2b9d0
r10 0x11
r11 0x0
r12 0x180
r13 0x170d8ef00
r14 0x181
r15 0x42
r16 0x18fa555f4
r17 0x1eff4e038
r18 0x0
r19 0x458
r20 0x0
r21 0x170d8ee80
r22 0x0
r23 0x17
r24 0x1388
r25 0x14000037798
r26 0x1051b4918
r27 0x820
r28 0x140006821a0
r29 0x170d8edb0
lr 0x987100018f989300
sp 0x170d8edb0
pc 0x18fa55600
fault 0x458
```
Memory usage looks fine, and generation times are usually in the range of a few seconds.
Tested on versions 0.1.17 and 0.1.20.
This is running on the Metal API.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2176/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2176/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6174
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6174/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6174/comments
|
https://api.github.com/repos/ollama/ollama/issues/6174/events
|
https://github.com/ollama/ollama/issues/6174
| 2,447,931,434
|
I_kwDOJ0Z1Ps6R6Hgq
| 6,174
|
Unable to run / pull llama3 model
|
{
"login": "Maha-vignesh09",
"id": 177517255,
"node_id": "U_kgDOCpSyxw",
"avatar_url": "https://avatars.githubusercontent.com/u/177517255?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Maha-vignesh09",
"html_url": "https://github.com/Maha-vignesh09",
"followers_url": "https://api.github.com/users/Maha-vignesh09/followers",
"following_url": "https://api.github.com/users/Maha-vignesh09/following{/other_user}",
"gists_url": "https://api.github.com/users/Maha-vignesh09/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Maha-vignesh09/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Maha-vignesh09/subscriptions",
"organizations_url": "https://api.github.com/users/Maha-vignesh09/orgs",
"repos_url": "https://api.github.com/users/Maha-vignesh09/repos",
"events_url": "https://api.github.com/users/Maha-vignesh09/events{/privacy}",
"received_events_url": "https://api.github.com/users/Maha-vignesh09/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 7
| 2024-08-05T08:26:29
| 2024-08-13T05:06:18
| 2024-08-13T05:06:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When trying to pull llama3, the pull fails even though the manifest is reachable with curl through the same proxy:

```shell
[root@fsgbu-mum-918 ~]# export https_proxy=https://www-***.com:80
[root@fsgbu-mum-918 ~]# export https_proxy=https://www-***.com:80*
[root@fsgbu-mum-918 ~]# curl https://registry.ollama.ai/v2/library/llama3/manifests/latest
{"schemaVersion":2,"mediaType":"application/vnd.docker.distribution.manifest.v2+json","config":{"digest":"sha256:3f8eb4da87fa7a3c9da615036b0dc418d31fef2a30b115ff33562588b32c691d","mediaType":"application/vnd.docker.container.image.v1+json","size":485},"layers":[{"digest":"sha256:6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa","mediaType":"application/vnd.ollama.image.model","size":4661211424},{"digest":"sha256:4fa551d4f938f68b8c1e6afa9d28befb70e3f33f75d0753248d530364aeea40f","mediaType":"application/vnd.ollama.image.license","size":12403},{"digest":"sha256:8ab4849b038cf0abc5b1c9b8ee1443dca6b93a045c2272180d985126eb40bf6f","mediaType":"application/vnd.ollama.image.template","size":254},{"digest":"sha256:577073ffcc6ce95b9981eacc77d1039568639e5638e83044994560d9ef82ce1b","mediaType":"application/vnd.ollama.image.params","size":110}]}
[root@fsgbu-mum-918 ~]# ollama run llama3
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama3/manifests/latest": proxyconnect tcp: dial tcp: lookup https on 100.76.191.242:53: no such host
```
### OS
Linux
### GPU
Other
### CPU
AMD
### Ollama version
0.3.2
|
{
"login": "Maha-vignesh09",
"id": 177517255,
"node_id": "U_kgDOCpSyxw",
"avatar_url": "https://avatars.githubusercontent.com/u/177517255?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Maha-vignesh09",
"html_url": "https://github.com/Maha-vignesh09",
"followers_url": "https://api.github.com/users/Maha-vignesh09/followers",
"following_url": "https://api.github.com/users/Maha-vignesh09/following{/other_user}",
"gists_url": "https://api.github.com/users/Maha-vignesh09/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Maha-vignesh09/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Maha-vignesh09/subscriptions",
"organizations_url": "https://api.github.com/users/Maha-vignesh09/orgs",
"repos_url": "https://api.github.com/users/Maha-vignesh09/repos",
"events_url": "https://api.github.com/users/Maha-vignesh09/events{/privacy}",
"received_events_url": "https://api.github.com/users/Maha-vignesh09/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6174/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6174/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8493
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8493/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8493/comments
|
https://api.github.com/repos/ollama/ollama/issues/8493/events
|
https://github.com/ollama/ollama/issues/8493
| 2,797,975,587
|
I_kwDOJ0Z1Ps6mxbgj
| 8,493
|
Long context for Qwen2.5 is possible but needs something to work
|
{
"login": "devlux76",
"id": 86517969,
"node_id": "MDQ6VXNlcjg2NTE3OTY5",
"avatar_url": "https://avatars.githubusercontent.com/u/86517969?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/devlux76",
"html_url": "https://github.com/devlux76",
"followers_url": "https://api.github.com/users/devlux76/followers",
"following_url": "https://api.github.com/users/devlux76/following{/other_user}",
"gists_url": "https://api.github.com/users/devlux76/gists{/gist_id}",
"starred_url": "https://api.github.com/users/devlux76/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/devlux76/subscriptions",
"organizations_url": "https://api.github.com/users/devlux76/orgs",
"repos_url": "https://api.github.com/users/devlux76/repos",
"events_url": "https://api.github.com/users/devlux76/events{/privacy}",
"received_events_url": "https://api.github.com/users/devlux76/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 1
| 2025-01-20T01:34:24
| 2025-01-20T09:37:07
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The instructions for Qwen2.5 (all of them) state quite clearly that every model from 7B on up has 128K context. However, in order to use that context you need additional configuration:
https://huggingface.co/Qwen/Qwen2.5-Coder-32B-Instruct#processing-long-texts
For supported frameworks, you could add the following to config.json to enable YaRN:
```
{
...,
"rope_scaling": {
"factor": 4.0,
"original_max_position_embeddings": 32768,
"type": "yarn"
}
}
```
What is the Ollama method of achieving this? I'd really like to use these models at their full context length.
Thanks!
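For reference, extending the context window in Ollama is typically done via a custom Modelfile (a sketch under assumptions: the `qwen2.5-coder:32b` tag is illustrative, and it is not confirmed here that Ollama applies the YaRN `rope_scaling` from config.json automatically — `num_ctx` only raises the requested context length):

```
# Hypothetical Modelfile; num_ctx sets the context window size.
FROM qwen2.5-coder:32b
PARAMETER num_ctx 131072
```

This would be built with something like `ollama create qwen2.5-coder-128k -f Modelfile`.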
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8493/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8493/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3854
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3854/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3854/comments
|
https://api.github.com/repos/ollama/ollama/issues/3854/events
|
https://github.com/ollama/ollama/issues/3854
| 2,259,789,985
|
I_kwDOJ0Z1Ps6Gsaih
| 3,854
|
request command-r-plus Q6
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-04-23T21:21:07
| 2024-05-06T23:28:00
| 2024-05-06T23:28:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
request command-r-plus Q6
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3854/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3854/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7826
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7826/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7826/comments
|
https://api.github.com/repos/ollama/ollama/issues/7826/events
|
https://github.com/ollama/ollama/pull/7826
| 2,689,335,494
|
PR_kwDOJ0Z1Ps6C-zuq
| 7,826
|
Use default transport to preserve proxy settings
|
{
"login": "Mazyod",
"id": 860511,
"node_id": "MDQ6VXNlcjg2MDUxMQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/860511?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Mazyod",
"html_url": "https://github.com/Mazyod",
"followers_url": "https://api.github.com/users/Mazyod/followers",
"following_url": "https://api.github.com/users/Mazyod/following{/other_user}",
"gists_url": "https://api.github.com/users/Mazyod/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Mazyod/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Mazyod/subscriptions",
"organizations_url": "https://api.github.com/users/Mazyod/orgs",
"repos_url": "https://api.github.com/users/Mazyod/repos",
"events_url": "https://api.github.com/users/Mazyod/events{/privacy}",
"received_events_url": "https://api.github.com/users/Mazyod/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-11-25T06:21:07
| 2024-11-26T00:32:36
| 2024-11-26T00:32:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7826",
"html_url": "https://github.com/ollama/ollama/pull/7826",
"diff_url": "https://github.com/ollama/ollama/pull/7826.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7826.patch",
"merged_at": null
}
|
Attempts to fix the regression in 0.4.3, as per #7788.
To test this change, I wrote a small program verifying that the client now respects proxy settings:
```go
package main
import (
"fmt"
"net/http"
"os"
)
// BuggyClient builds a client with a zero-value Transport,
// which drops the default proxy settings (HTTP_PROXY etc.).
func BuggyClient() *http.Client {
	return &http.Client{
		Transport: &http.Transport{},
	}
}
// CorrectClient clones http.DefaultTransport, preserving
// ProxyFromEnvironment and the other default settings.
func CorrectClient() *http.Client {
	transport := http.DefaultTransport.(*http.Transport).Clone()
	return &http.Client{
		Transport: transport,
	}
}
func main() {
targetURL := "http://httpbin.org/get"
fmt.Println("Testing with buggy client:")
buggyResp, err := BuggyClient().Get(targetURL)
if err != nil {
fmt.Printf("Buggy client error: %v\n", err)
} else {
defer buggyResp.Body.Close()
fmt.Printf("Buggy client status: %s\n", buggyResp.Status)
}
fmt.Println("\nTesting with correct client:")
correctResp, err := CorrectClient().Get(targetURL)
if err != nil {
fmt.Printf("Correct client error: %v\n", err)
} else {
defer correctResp.Body.Close()
fmt.Printf("Correct client status: %s\n", correctResp.Status)
}
fmt.Println("\nProxy settings:")
fmt.Printf("HTTP_PROXY: %s\n", os.Getenv("HTTP_PROXY"))
}
```
```bash
# For testing, tinyproxy is running locally, configured to deny all requests
% go run main.go
Testing with buggy client:
Buggy client status: 200 OK
Testing with correct client:
Correct client status: 403 Access denied
Proxy settings:
HTTP_PROXY: http://localhost:8888
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7826/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7826/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8373
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8373/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8373/comments
|
https://api.github.com/repos/ollama/ollama/issues/8373/events
|
https://github.com/ollama/ollama/issues/8373
| 2,780,327,075
|
I_kwDOJ0Z1Ps6luGyj
| 8,373
|
ollama rm xxx failed to delete file /usr/share/ollama/.ollama/models/blobs/sha256-xxx
|
{
"login": "SDAIer",
"id": 174102361,
"node_id": "U_kgDOCmCXWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/174102361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SDAIer",
"html_url": "https://github.com/SDAIer",
"followers_url": "https://api.github.com/users/SDAIer/followers",
"following_url": "https://api.github.com/users/SDAIer/following{/other_user}",
"gists_url": "https://api.github.com/users/SDAIer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SDAIer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SDAIer/subscriptions",
"organizations_url": "https://api.github.com/users/SDAIer/orgs",
"repos_url": "https://api.github.com/users/SDAIer/repos",
"events_url": "https://api.github.com/users/SDAIer/events{/privacy}",
"received_events_url": "https://api.github.com/users/SDAIer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2025-01-10T14:32:17
| 2025-01-10T15:23:22
| 2025-01-10T15:23:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After running `ollama rm xxx`, how can the corresponding model files be deleted automatically to free up disk space? `ollama rm xxx` failed to delete the file `/usr/share/ollama/.ollama/models/blobs/sha256-xxx`.
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "SDAIer",
"id": 174102361,
"node_id": "U_kgDOCmCXWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/174102361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SDAIer",
"html_url": "https://github.com/SDAIer",
"followers_url": "https://api.github.com/users/SDAIer/followers",
"following_url": "https://api.github.com/users/SDAIer/following{/other_user}",
"gists_url": "https://api.github.com/users/SDAIer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SDAIer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SDAIer/subscriptions",
"organizations_url": "https://api.github.com/users/SDAIer/orgs",
"repos_url": "https://api.github.com/users/SDAIer/repos",
"events_url": "https://api.github.com/users/SDAIer/events{/privacy}",
"received_events_url": "https://api.github.com/users/SDAIer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8373/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8373/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2982
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2982/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2982/comments
|
https://api.github.com/repos/ollama/ollama/issues/2982/events
|
https://github.com/ollama/ollama/issues/2982
| 2,174,103,800
|
I_kwDOJ0Z1Ps6BljD4
| 2,982
|
add a support matrix to the docs
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-03-07T15:16:27
| 2024-03-21T11:32:19
| 2024-03-21T11:32:19
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
A table showing which GPUs are supported on each OS would be nice, to allow users to evaluate whether their hardware is supported.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2982/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2982/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1173
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1173/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1173/comments
|
https://api.github.com/repos/ollama/ollama/issues/1173/events
|
https://github.com/ollama/ollama/issues/1173
| 1,999,313,317
|
I_kwDOJ0Z1Ps53Kxml
| 1,173
|
Provide model metadata with `ollama show`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-11-17T14:55:48
| 2024-07-24T21:07:13
| 2024-07-24T21:07:13
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
`ollama show` should provide metadata like:
* Context size
* Parameter count
* Quantization
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1173/reactions",
"total_count": 6,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1173/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5427
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5427/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5427/comments
|
https://api.github.com/repos/ollama/ollama/issues/5427/events
|
https://github.com/ollama/ollama/issues/5427
| 2,385,310,905
|
I_kwDOJ0Z1Ps6OLPS5
| 5,427
|
Models built with a Modelfile fail to run
|
{
"login": "yinjianjie",
"id": 54103299,
"node_id": "MDQ6VXNlcjU0MTAzMjk5",
"avatar_url": "https://avatars.githubusercontent.com/u/54103299?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yinjianjie",
"html_url": "https://github.com/yinjianjie",
"followers_url": "https://api.github.com/users/yinjianjie/followers",
"following_url": "https://api.github.com/users/yinjianjie/following{/other_user}",
"gists_url": "https://api.github.com/users/yinjianjie/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yinjianjie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yinjianjie/subscriptions",
"organizations_url": "https://api.github.com/users/yinjianjie/orgs",
"repos_url": "https://api.github.com/users/yinjianjie/repos",
"events_url": "https://api.github.com/users/yinjianjie/events{/privacy}",
"received_events_url": "https://api.github.com/users/yinjianjie/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 3
| 2024-07-02T06:04:03
| 2024-07-08T19:52:19
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
NAME                          ID            SIZE    MODIFIED
glm-4-9b-chat:latest          5356a47a9286  6.3 GB  3 minutes ago
llama3:latest                 71a106a91016  4.7 GB  2 months ago
llava:latest                  8dd30f6b0cb1  4.7 GB  2 months ago
mistral:latest                61e88e884507  4.1 GB  7 weeks ago
phi3:latest                   a2c89ceaed85  2.3 GB  2 months ago
qwen:1.8b                     b6e8ec2e7126  1.1 GB  2 months ago
yi:6b                         a7f031bb846f  3.5 GB  7 weeks ago
znbang/bge:large-zh-v1.5-f16  9020b4d2aeca  650 MB  2 months ago

~$ ollama run glm-4-9b-chat:latest
Error: llama runner process no longer running: -1
~$ ollama run phi3:latest
>>>
```
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5427/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5427/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7683
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7683/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7683/comments
|
https://api.github.com/repos/ollama/ollama/issues/7683/events
|
https://github.com/ollama/ollama/issues/7683
| 2,661,305,171
|
I_kwDOJ0Z1Ps6eoEtT
| 7,683
|
Does ollama support batched generation?
|
{
"login": "Wu-tn",
"id": 54966661,
"node_id": "MDQ6VXNlcjU0OTY2NjYx",
"avatar_url": "https://avatars.githubusercontent.com/u/54966661?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Wu-tn",
"html_url": "https://github.com/Wu-tn",
"followers_url": "https://api.github.com/users/Wu-tn/followers",
"following_url": "https://api.github.com/users/Wu-tn/following{/other_user}",
"gists_url": "https://api.github.com/users/Wu-tn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Wu-tn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Wu-tn/subscriptions",
"organizations_url": "https://api.github.com/users/Wu-tn/orgs",
"repos_url": "https://api.github.com/users/Wu-tn/repos",
"events_url": "https://api.github.com/users/Wu-tn/events{/privacy}",
"received_events_url": "https://api.github.com/users/Wu-tn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-11-15T09:00:12
| 2024-11-17T12:18:35
| 2024-11-17T12:18:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7683/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7683/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7960
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7960/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7960/comments
|
https://api.github.com/repos/ollama/ollama/issues/7960/events
|
https://github.com/ollama/ollama/pull/7960
| 2,721,539,864
|
PR_kwDOJ0Z1Ps6EPp9N
| 7,960
|
Update OpenAI docs to reflect tool use functionality
|
{
"login": "yannickgloster",
"id": 19475841,
"node_id": "MDQ6VXNlcjE5NDc1ODQx",
"avatar_url": "https://avatars.githubusercontent.com/u/19475841?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yannickgloster",
"html_url": "https://github.com/yannickgloster",
"followers_url": "https://api.github.com/users/yannickgloster/followers",
"following_url": "https://api.github.com/users/yannickgloster/following{/other_user}",
"gists_url": "https://api.github.com/users/yannickgloster/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yannickgloster/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yannickgloster/subscriptions",
"organizations_url": "https://api.github.com/users/yannickgloster/orgs",
"repos_url": "https://api.github.com/users/yannickgloster/repos",
"events_url": "https://api.github.com/users/yannickgloster/events{/privacy}",
"received_events_url": "https://api.github.com/users/yannickgloster/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-05T22:12:49
| 2024-12-08T06:16:21
| 2024-12-08T06:16:21
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7960",
"html_url": "https://github.com/ollama/ollama/pull/7960",
"diff_url": "https://github.com/ollama/ollama/pull/7960.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7960.patch",
"merged_at": "2024-12-08T06:16:21"
}
|
Tool use while streaming was fixed in #7836; see [comment](https://github.com/ollama/ollama/pull/7836#issuecomment-2521505633).
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7960/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7960/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7087
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7087/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7087/comments
|
https://api.github.com/repos/ollama/ollama/issues/7087/events
|
https://github.com/ollama/ollama/issues/7087
| 2,563,140,687
|
I_kwDOJ0Z1Ps6YxmxP
| 7,087
|
I would like to be able to download, extract and run Ollama on an Intel GPU
|
{
"login": "xiangyang-95",
"id": 18331729,
"node_id": "MDQ6VXNlcjE4MzMxNzI5",
"avatar_url": "https://avatars.githubusercontent.com/u/18331729?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xiangyang-95",
"html_url": "https://github.com/xiangyang-95",
"followers_url": "https://api.github.com/users/xiangyang-95/followers",
"following_url": "https://api.github.com/users/xiangyang-95/following{/other_user}",
"gists_url": "https://api.github.com/users/xiangyang-95/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xiangyang-95/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xiangyang-95/subscriptions",
"organizations_url": "https://api.github.com/users/xiangyang-95/orgs",
"repos_url": "https://api.github.com/users/xiangyang-95/repos",
"events_url": "https://api.github.com/users/xiangyang-95/events{/privacy}",
"received_events_url": "https://api.github.com/users/xiangyang-95/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6677491450,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgJu-g",
"url": "https://api.github.com/repos/ollama/ollama/labels/intel",
"name": "intel",
"color": "226E5B",
"default": false,
"description": "issues relating to Intel GPUs"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-10-03T05:13:28
| 2024-10-03T16:14:00
| 2024-10-03T16:13:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
In order to use Ollama easily on an Intel discrete GPU, I would like to be able to directly download an ollama binary built with the Intel oneAPI SYCL runtime.
Example:
```
curl -L https://ollama.com/download/ollama-linux-amd64-sycl.tgz -o ollama-linux-amd64-sycl.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-sycl.tgz
```
I am willing to contribute this feature if needed.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7087/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7087/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7750
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7750/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7750/comments
|
https://api.github.com/repos/ollama/ollama/issues/7750/events
|
https://github.com/ollama/ollama/pull/7750
| 2,673,960,240
|
PR_kwDOJ0Z1Ps6CdLeb
| 7,750
|
Disallow Tool Streaming
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-11-20T00:15:52
| 2024-11-22T00:42:17
| 2024-11-22T00:42:17
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7750",
"html_url": "https://github.com/ollama/ollama/pull/7750",
"diff_url": "https://github.com/ollama/ollama/pull/7750.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7750.patch",
"merged_at": null
}
|
While tool streaming is scoped to be supported in the future, we currently allow it, which can lead to some weird edge cases.
The tool gets added to capabilities and is passed into the runner without checking whether this behavior should occur.
https://github.com/ollama/ollama/blob/807ace5b1f4fc9de7347297b3c8a695c566d9fd9/server/routes.go#L1427-L1432
The error raised now:
```
File "/Users/parth/Documents/repos/ollama-python/ollama/_client.py", line 668, in inner
raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: streaming is not supported with tools
```
I'm a bit wary of getting this PR in since people might have some workarounds for now. We could also just table this for now and it'll get fixed in the streamed tool calls PR.
Will add tests depending on what we decide.
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7750/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7750/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8449
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8449/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8449/comments
|
https://api.github.com/repos/ollama/ollama/issues/8449/events
|
https://github.com/ollama/ollama/pull/8449
| 2,791,807,480
|
PR_kwDOJ0Z1Ps6H9SJp
| 8,449
|
parser: fix parsing Modelfiles with multiple FROM commands
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2025-01-16T06:24:20
| 2025-01-16T08:14:08
| 2025-01-16T08:14:07
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8449",
"html_url": "https://github.com/ollama/ollama/pull/8449",
"diff_url": "https://github.com/ollama/ollama/pull/8449.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8449.patch",
"merged_at": "2025-01-16T08:14:06"
}
|
Fixes https://github.com/ollama/ollama/issues/8448
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8449/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8449/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8008
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8008/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8008/comments
|
https://api.github.com/repos/ollama/ollama/issues/8008/events
|
https://github.com/ollama/ollama/issues/8008
| 2,726,348,426
|
I_kwDOJ0Z1Ps6igMaK
| 8,008
|
Return prompt cache utilization on completion responses
|
{
"login": "reckart",
"id": 1410238,
"node_id": "MDQ6VXNlcjE0MTAyMzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/1410238?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/reckart",
"html_url": "https://github.com/reckart",
"followers_url": "https://api.github.com/users/reckart/followers",
"following_url": "https://api.github.com/users/reckart/following{/other_user}",
"gists_url": "https://api.github.com/users/reckart/gists{/gist_id}",
"starred_url": "https://api.github.com/users/reckart/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/reckart/subscriptions",
"organizations_url": "https://api.github.com/users/reckart/orgs",
"repos_url": "https://api.github.com/users/reckart/repos",
"events_url": "https://api.github.com/users/reckart/events{/privacy}",
"received_events_url": "https://api.github.com/users/reckart/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2024-12-09T08:28:49
| 2024-12-09T08:28:49
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Since Ollama has prompt caching now (right?), it would be great if the utilization of the cache could be returned in completion responses.
E.g. the OpenAI-compatible API could be extended with the new [`usage/prompt_tokens_details/cached_tokens`](https://platform.openai.com/docs/guides/prompt-caching) field.
A similar field in the Ollama API would also be great.
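For illustration, this is how a client would read the proposed field from an OpenAI-style response. The payload below is a hand-written example, not real Ollama output.

```python
# Hypothetical chat completion response extended with the
# prompt_tokens_details/cached_tokens field from the OpenAI spec.
response = {
    "usage": {
        "prompt_tokens": 128,
        "completion_tokens": 32,
        "prompt_tokens_details": {"cached_tokens": 96},
    }
}
cached = response["usage"]["prompt_tokens_details"]["cached_tokens"]
print(cached)  # 96
```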
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8008/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8008/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6292
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6292/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6292/comments
|
https://api.github.com/repos/ollama/ollama/issues/6292/events
|
https://github.com/ollama/ollama/issues/6292
| 2,458,680,597
|
I_kwDOJ0Z1Ps6SjH0V
| 6,292
|
Docs: tfs_z description incorrect
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-08-09T21:20:07
| 2024-08-09T21:20:07
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
In the Modelfile docs (https://github.com/ollama/ollama/blob/main/docs/modelfile.md#parameter) tfs_z is defined as:
```
Tail free sampling is used to reduce the impact of less probable tokens from the output. A
higher value (e.g., 2.0) will reduce the impact more, while a value of 1.0 disables this
setting. (default: 1)
```
So that says the range will be 1 to something higher. But this part of the llama.cpp docs suggests that it’s a range of 0 to 1.
https://github.com/ggerganov/llama.cpp/blob/master/examples/main/README.md#tail-free-sampling-tfs
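If the llama.cpp range is the authoritative one, a corrected Modelfile example would look like this (value illustrative):

```
# Per the llama.cpp docs, tfs_z takes values between 0 and 1;
# 1.0 disables tail free sampling, lower values prune more tokens.
PARAMETER tfs_z 0.95
```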
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6292/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6292/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4/comments
|
https://api.github.com/repos/ollama/ollama/issues/4/events
|
https://github.com/ollama/ollama/issues/4
| 1,777,852,025
|
I_kwDOJ0Z1Ps5p9955
| 4
|
blinking cursor is ambiguous
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-06-27T22:43:08
| 2023-07-10T10:14:47
| 2023-07-10T10:14:46
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When I ask a question, I just see a blinking cursor. Is the model loading? Is it thinking? Is there something else going on? It would be nice to see some sort of status to see what it is doing. Do I need to kill the app?
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2601
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2601/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2601/comments
|
https://api.github.com/repos/ollama/ollama/issues/2601/events
|
https://github.com/ollama/ollama/pull/2601
| 2,143,236,469
|
PR_kwDOJ0Z1Ps5nVavx
| 2,601
|
add faqs for memory pre-loading and the keep_alive setting
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-02-19T22:31:17
| 2024-02-19T22:45:26
| 2024-02-19T22:45:25
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2601",
"html_url": "https://github.com/ollama/ollama/pull/2601",
"diff_url": "https://github.com/ollama/ollama/pull/2601.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2601.patch",
"merged_at": "2024-02-19T22:45:25"
}
| null |
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2601/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2601/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/735
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/735/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/735/comments
|
https://api.github.com/repos/ollama/ollama/issues/735/events
|
https://github.com/ollama/ollama/issues/735
| 1,931,667,129
|
I_kwDOJ0Z1Ps5zIua5
| 735
|
What is the supported context length? llama2-chinese:13b-chat-q6_K
|
{
"login": "Friedrich-hue",
"id": 61929816,
"node_id": "MDQ6VXNlcjYxOTI5ODE2",
"avatar_url": "https://avatars.githubusercontent.com/u/61929816?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Friedrich-hue",
"html_url": "https://github.com/Friedrich-hue",
"followers_url": "https://api.github.com/users/Friedrich-hue/followers",
"following_url": "https://api.github.com/users/Friedrich-hue/following{/other_user}",
"gists_url": "https://api.github.com/users/Friedrich-hue/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Friedrich-hue/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Friedrich-hue/subscriptions",
"organizations_url": "https://api.github.com/users/Friedrich-hue/orgs",
"repos_url": "https://api.github.com/users/Friedrich-hue/repos",
"events_url": "https://api.github.com/users/Friedrich-hue/events{/privacy}",
"received_events_url": "https://api.github.com/users/Friedrich-hue/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-10-08T06:25:26
| 2023-10-30T22:25:08
| 2023-10-30T22:25:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/735/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/735/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/341
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/341/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/341/comments
|
https://api.github.com/repos/ollama/ollama/issues/341/events
|
https://github.com/ollama/ollama/pull/341
| 1,849,795,598
|
PR_kwDOJ0Z1Ps5X4kKR
| 341
|
do not regenerate embeddings
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-08-14T13:37:55
| 2023-08-15T19:10:25
| 2023-08-15T19:10:23
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/341",
"html_url": "https://github.com/ollama/ollama/pull/341",
"diff_url": "https://github.com/ollama/ollama/pull/341.diff",
"patch_url": "https://github.com/ollama/ollama/pull/341.patch",
"merged_at": "2023-08-15T19:10:23"
}
|
- re-use previously evaluated embeddings when possible
- change embeddings digest identifier to be based on model name and embedded file path
This change opens previously generated embeddings for the same model/file and re-uses them when possible. This means that running create on the same file will not regenerate the embeddings. It also means that only the difference between the current and old versions of the file will have its embeddings regenerated.
resolves #331
resolves #332
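A sketch of the digest scheme described above, keying an embedding set on model name and embedded file path so repeated `create` runs can find and reuse prior embeddings. The function is illustrative, not the PR's actual implementation.

```python
import hashlib

def embeddings_digest(model: str, path: str) -> str:
    # Identify an embedding set by model name + file path, so the same
    # model/file pair always maps to the same stored embeddings.
    return hashlib.sha256(f"{model}:{path}".encode()).hexdigest()

# Same model and file -> same digest, so embeddings are reused.
print(embeddings_digest("llama2", "/data/doc.txt") ==
      embeddings_digest("llama2", "/data/doc.txt"))  # True
```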
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/341/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/341/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8095
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8095/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8095/comments
|
https://api.github.com/repos/ollama/ollama/issues/8095/events
|
https://github.com/ollama/ollama/issues/8095
| 2,739,795,525
|
I_kwDOJ0Z1Ps6jTfZF
| 8,095
|
Using structured output with tools always produces empty tool_calls array
|
{
"login": "grabbou",
"id": 2464966,
"node_id": "MDQ6VXNlcjI0NjQ5NjY=",
"avatar_url": "https://avatars.githubusercontent.com/u/2464966?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/grabbou",
"html_url": "https://github.com/grabbou",
"followers_url": "https://api.github.com/users/grabbou/followers",
"following_url": "https://api.github.com/users/grabbou/following{/other_user}",
"gists_url": "https://api.github.com/users/grabbou/gists{/gist_id}",
"starred_url": "https://api.github.com/users/grabbou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/grabbou/subscriptions",
"organizations_url": "https://api.github.com/users/grabbou/orgs",
"repos_url": "https://api.github.com/users/grabbou/repos",
"events_url": "https://api.github.com/users/grabbou/events{/privacy}",
"received_events_url": "https://api.github.com/users/grabbou/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
open
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-12-14T11:39:33
| 2025-01-20T06:57:55
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
With the OpenAI API, you can pass both tools and response_format. In case the model wants to call tools, message content will be `null` and the tools will be called.
With Ollama, it appears that when response_format is present as a JSON schema, `tool_calls` is an empty array, despite the model wanting to call the tools.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.5.1
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8095/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8095/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6551
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6551/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6551/comments
|
https://api.github.com/repos/ollama/ollama/issues/6551/events
|
https://github.com/ollama/ollama/issues/6551
| 2,494,058,273
|
I_kwDOJ0Z1Ps6UqE8h
| 6,551
|
Need cli ollama stop
|
{
"login": "HomunMage",
"id": 144320229,
"node_id": "U_kgDOCJom5Q",
"avatar_url": "https://avatars.githubusercontent.com/u/144320229?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HomunMage",
"html_url": "https://github.com/HomunMage",
"followers_url": "https://api.github.com/users/HomunMage/followers",
"following_url": "https://api.github.com/users/HomunMage/following{/other_user}",
"gists_url": "https://api.github.com/users/HomunMage/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HomunMage/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HomunMage/subscriptions",
"organizations_url": "https://api.github.com/users/HomunMage/orgs",
"repos_url": "https://api.github.com/users/HomunMage/repos",
"events_url": "https://api.github.com/users/HomunMage/events{/privacy}",
"received_events_url": "https://api.github.com/users/HomunMage/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-29T10:30:04
| 2024-09-02T00:02:57
| 2024-09-02T00:02:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
We need an `ollama stop` command that can stop the Ollama server without using systemctl,
because we need to manage it from Python and C++ subprocesses or threads.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6551/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6551/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8605
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8605/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8605/comments
|
https://api.github.com/repos/ollama/ollama/issues/8605/events
|
https://github.com/ollama/ollama/issues/8605
| 2,812,486,439
|
I_kwDOJ0Z1Ps6noyMn
| 8,605
|
Error fetching ANY model locally
|
{
"login": "devroopsaha744",
"id": 130696540,
"node_id": "U_kgDOB8pFXA",
"avatar_url": "https://avatars.githubusercontent.com/u/130696540?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/devroopsaha744",
"html_url": "https://github.com/devroopsaha744",
"followers_url": "https://api.github.com/users/devroopsaha744/followers",
"following_url": "https://api.github.com/users/devroopsaha744/following{/other_user}",
"gists_url": "https://api.github.com/users/devroopsaha744/gists{/gist_id}",
"starred_url": "https://api.github.com/users/devroopsaha744/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/devroopsaha744/subscriptions",
"organizations_url": "https://api.github.com/users/devroopsaha744/orgs",
"repos_url": "https://api.github.com/users/devroopsaha744/repos",
"events_url": "https://api.github.com/users/devroopsaha744/events{/privacy}",
"received_events_url": "https://api.github.com/users/devroopsaha744/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677370291,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw",
"url": "https://api.github.com/repos/ollama/ollama/labels/networking",
"name": "networking",
"color": "0B5368",
"default": false,
"description": "Issues relating to ollama pull and push"
}
] |
open
| false
| null |
[] | null | 4
| 2025-01-27T09:25:12
| 2025-01-28T17:14:43
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
This is the Error message that I am getting:
pulling manifest
pulling 6e9f90f02bb3... 0% ▕ ▏ 0 B/9.0 GB
Error: max retries exceeded: Get "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/6e/6e9f90f02bb3b39b59e81916e8cfce9deb45aeaeb9a54a5be4414486b907dc1e/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20250127%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20250127T091701Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&X-Amz-Signature=1afc1dcb356baf92d38344324c5f1504f216d4c77efb000442698708739d0298": dial tcp: lookup dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com: no such host
the command I am using:
`ollama pull <model-name>`
I have also tried using:
`ollama run <model-name>`
Any model I try to fetch locally throws this error. I believe I have installed Ollama properly.
How can I resolve this?
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.4
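The `no such host` at the end of the error indicates DNS resolution failed before the download even started, which usually points at the local resolver, a VPN, or a proxy rather than at Ollama itself. A quick way to check resolution from Python (the Cloudflare hostname is taken from the error message above; whether it resolves depends on your network setup):

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if the local resolver can resolve hostname."""
    try:
        socket.getaddrinfo(hostname, 443)
        return True
    except socket.gaierror:
        return False

# "localhost" should always resolve; the CDN host from the error may not
# if DNS is broken, filtered, or routed through a misconfigured VPN/proxy.
print("localhost resolves:", resolves("localhost"))
```

If the CDN hostname fails here too, the fix lies in DNS settings (e.g. trying a public resolver) rather than in Ollama.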
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8605/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8605/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/762
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/762/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/762/comments
|
https://api.github.com/repos/ollama/ollama/issues/762/events
|
https://github.com/ollama/ollama/issues/762
| 1,938,820,901
|
I_kwDOJ0Z1Ps5zkA8l
| 762
|
Support for HuggingFaceH4/zephyr-7b-alpha
|
{
"login": "shauryr",
"id": 12604876,
"node_id": "MDQ6VXNlcjEyNjA0ODc2",
"avatar_url": "https://avatars.githubusercontent.com/u/12604876?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shauryr",
"html_url": "https://github.com/shauryr",
"followers_url": "https://api.github.com/users/shauryr/followers",
"following_url": "https://api.github.com/users/shauryr/following{/other_user}",
"gists_url": "https://api.github.com/users/shauryr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shauryr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shauryr/subscriptions",
"organizations_url": "https://api.github.com/users/shauryr/orgs",
"repos_url": "https://api.github.com/users/shauryr/repos",
"events_url": "https://api.github.com/users/shauryr/events{/privacy}",
"received_events_url": "https://api.github.com/users/shauryr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-10-11T22:14:43
| 2023-10-12T13:34:24
| 2023-10-11T23:09:28
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha : the zephyr-7b-alpha model outperforms Llama 2 Chat 70B
It would be great to have this running inside Ollama!
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/762/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/762/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7113
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7113/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7113/comments
|
https://api.github.com/repos/ollama/ollama/issues/7113/events
|
https://github.com/ollama/ollama/issues/7113
| 2,569,653,831
|
I_kwDOJ0Z1Ps6ZKc5H
| 7,113
|
llama runner process has terminated: error loading model: error loading model vocabulary: invalid string position
|
{
"login": "ImValll",
"id": 107722816,
"node_id": "U_kgDOBmu4QA",
"avatar_url": "https://avatars.githubusercontent.com/u/107722816?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ImValll",
"html_url": "https://github.com/ImValll",
"followers_url": "https://api.github.com/users/ImValll/followers",
"following_url": "https://api.github.com/users/ImValll/following{/other_user}",
"gists_url": "https://api.github.com/users/ImValll/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ImValll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ImValll/subscriptions",
"organizations_url": "https://api.github.com/users/ImValll/orgs",
"repos_url": "https://api.github.com/users/ImValll/repos",
"events_url": "https://api.github.com/users/ImValll/events{/privacy}",
"received_events_url": "https://api.github.com/users/ImValll/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-10-07T07:54:24
| 2024-10-21T04:13:21
| 2024-10-09T09:00:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I fine-tuned the Gemma 2 model and converted it to GGUF. I try to run it with this code, but it isn't working. Do you have any idea?
import ollama
import asyncio
from ollama import AsyncClient

async def chat(human_message):
    # note: the Ollama chat API expects the role 'user', not 'human'
    message = {'role': 'human', 'content': human_message}
    async for part in await AsyncClient().chat(model='Gemma_chat_bot', messages=[message], stream=True):
        print(part['message']['content'], end='', flush=True)

modelfile = '''
FROM ./llama.cpp/quantized_model/FP16.gguf
'''
ollama.create(model='Gemma_chat_bot', modelfile=modelfile)

asyncio.run(chat('Salut, comment puis-je faire pour envoyer un mail ?'))
### OS
Windows
### GPU
AMD
### CPU
AMD
### Ollama version
ollama version is 0.3.12
|
{
"login": "ImValll",
"id": 107722816,
"node_id": "U_kgDOBmu4QA",
"avatar_url": "https://avatars.githubusercontent.com/u/107722816?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ImValll",
"html_url": "https://github.com/ImValll",
"followers_url": "https://api.github.com/users/ImValll/followers",
"following_url": "https://api.github.com/users/ImValll/following{/other_user}",
"gists_url": "https://api.github.com/users/ImValll/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ImValll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ImValll/subscriptions",
"organizations_url": "https://api.github.com/users/ImValll/orgs",
"repos_url": "https://api.github.com/users/ImValll/repos",
"events_url": "https://api.github.com/users/ImValll/events{/privacy}",
"received_events_url": "https://api.github.com/users/ImValll/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7113/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7113/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1011
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1011/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1011/comments
|
https://api.github.com/repos/ollama/ollama/issues/1011/events
|
https://github.com/ollama/ollama/pull/1011
| 1,978,293,843
|
PR_kwDOJ0Z1Ps5epKE2
| 1,011
|
Updated README.md. Added a new feature to the ollama project: GitHub Codespaces integration.
|
{
"login": "TouchstoneTheDev",
"id": 101004444,
"node_id": "U_kgDOBgU0nA",
"avatar_url": "https://avatars.githubusercontent.com/u/101004444?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TouchstoneTheDev",
"html_url": "https://github.com/TouchstoneTheDev",
"followers_url": "https://api.github.com/users/TouchstoneTheDev/followers",
"following_url": "https://api.github.com/users/TouchstoneTheDev/following{/other_user}",
"gists_url": "https://api.github.com/users/TouchstoneTheDev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TouchstoneTheDev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TouchstoneTheDev/subscriptions",
"organizations_url": "https://api.github.com/users/TouchstoneTheDev/orgs",
"repos_url": "https://api.github.com/users/TouchstoneTheDev/repos",
"events_url": "https://api.github.com/users/TouchstoneTheDev/events{/privacy}",
"received_events_url": "https://api.github.com/users/TouchstoneTheDev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-06T04:59:29
| 2023-11-06T16:36:20
| 2023-11-06T16:19:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1011",
"html_url": "https://github.com/ollama/ollama/pull/1011",
"diff_url": "https://github.com/ollama/ollama/pull/1011.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1011.patch",
"merged_at": null
}
|
This pull request adds a new feature to the ollama project: GitHub Codespaces integration. With this feature, you can easily create a cloud-based development environment for ollama with just one click. You can edit, debug, test, and deploy your code from anywhere, using any device.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1011/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1011/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/731
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/731/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/731/comments
|
https://api.github.com/repos/ollama/ollama/issues/731/events
|
https://github.com/ollama/ollama/issues/731
| 1,931,602,155
|
I_kwDOJ0Z1Ps5zIejr
| 731
|
Wrong with Chinese spelling
|
{
"login": "1linguowei",
"id": 31962248,
"node_id": "MDQ6VXNlcjMxOTYyMjQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/31962248?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/1linguowei",
"html_url": "https://github.com/1linguowei",
"followers_url": "https://api.github.com/users/1linguowei/followers",
"following_url": "https://api.github.com/users/1linguowei/following{/other_user}",
"gists_url": "https://api.github.com/users/1linguowei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/1linguowei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/1linguowei/subscriptions",
"organizations_url": "https://api.github.com/users/1linguowei/orgs",
"repos_url": "https://api.github.com/users/1linguowei/repos",
"events_url": "https://api.github.com/users/1linguowei/events{/privacy}",
"received_events_url": "https://api.github.com/users/1linguowei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-10-08T02:22:57
| 2023-12-22T03:37:42
| 2023-12-22T03:37:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Here is a case where I used Chinese input:
User: ollama run llama2-chinese:7b-chat-q4_0 "你会Swift编程吗"
Assistant: 您好,我是一个AI语言模型。我能够回答类似于人类的问题,包括指导如何使用Swift编程。如果您想知道Swift编程的基本概念或者是如何编写一个简单的应用程序,我会提供相关的建议和指导。请告诉我您需要了解的Swift编程方面,以便更好地帮助您。
User: ollama run llama2-chinese:7b-chat-q4_0
User: 你会Swift编程吗
Assistant: 我是AI语言模
我是AI语言模型
我是AI语言模型
我是AI语言模型
...
我是AI语言模型,无法判断您的人格特点。如果您想要学习Swift语言,建议您查看相关书籍、视频教程等,尝试在代码编写过程中解决问题并深入了解语言的特性
It seems like the bug is in Ollama's handling of Chinese input rather than in the Mac terminal.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/731/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/731/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2249
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2249/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2249/comments
|
https://api.github.com/repos/ollama/ollama/issues/2249/events
|
https://github.com/ollama/ollama/pull/2249
| 2,104,528,878
|
PR_kwDOJ0Z1Ps5lRjM1
| 2,249
|
Add README.md
|
{
"login": "Yuan-ManX",
"id": 68322456,
"node_id": "MDQ6VXNlcjY4MzIyNDU2",
"avatar_url": "https://avatars.githubusercontent.com/u/68322456?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Yuan-ManX",
"html_url": "https://github.com/Yuan-ManX",
"followers_url": "https://api.github.com/users/Yuan-ManX/followers",
"following_url": "https://api.github.com/users/Yuan-ManX/following{/other_user}",
"gists_url": "https://api.github.com/users/Yuan-ManX/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Yuan-ManX/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Yuan-ManX/subscriptions",
"organizations_url": "https://api.github.com/users/Yuan-ManX/orgs",
"repos_url": "https://api.github.com/users/Yuan-ManX/repos",
"events_url": "https://api.github.com/users/Yuan-ManX/events{/privacy}",
"received_events_url": "https://api.github.com/users/Yuan-ManX/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-29T02:45:30
| 2024-02-22T19:03:44
| 2024-02-22T19:03:44
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2249",
"html_url": "https://github.com/ollama/ollama/pull/2249",
"diff_url": "https://github.com/ollama/ollama/pull/2249.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2249.patch",
"merged_at": "2024-02-22T19:03:44"
}
| null |
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2249/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2249/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8165
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8165/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8165/comments
|
https://api.github.com/repos/ollama/ollama/issues/8165/events
|
https://github.com/ollama/ollama/pull/8165
| 2,748,924,641
|
PR_kwDOJ0Z1Ps6FtV9r
| 8,165
|
server: add options to dry run and debug for chat and generate
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 12
| 2024-12-18T23:20:47
| 2025-01-02T20:07:40
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8165",
"html_url": "https://github.com/ollama/ollama/pull/8165",
"diff_url": "https://github.com/ollama/ollama/pull/8165.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8165.patch",
"merged_at": null
}
|
- Doesn't actually load the model
- No tokenization or context length clipping
- Barebones implementation of the `chatPrompt` function
Precursor to enabling tokenization endpoints: https://github.com/ollama/ollama/pull/8106
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8165/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8165/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8080
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8080/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8080/comments
|
https://api.github.com/repos/ollama/ollama/issues/8080/events
|
https://github.com/ollama/ollama/pull/8080
| 2,737,444,182
|
PR_kwDOJ0Z1Ps6FGath
| 8,080
|
Ollama Docker usage for Jetson devices added to documentation
|
{
"login": "openzeka-birol-kuyumcu",
"id": 174419215,
"node_id": "U_kgDOCmVtDw",
"avatar_url": "https://avatars.githubusercontent.com/u/174419215?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/openzeka-birol-kuyumcu",
"html_url": "https://github.com/openzeka-birol-kuyumcu",
"followers_url": "https://api.github.com/users/openzeka-birol-kuyumcu/followers",
"following_url": "https://api.github.com/users/openzeka-birol-kuyumcu/following{/other_user}",
"gists_url": "https://api.github.com/users/openzeka-birol-kuyumcu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/openzeka-birol-kuyumcu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/openzeka-birol-kuyumcu/subscriptions",
"organizations_url": "https://api.github.com/users/openzeka-birol-kuyumcu/orgs",
"repos_url": "https://api.github.com/users/openzeka-birol-kuyumcu/repos",
"events_url": "https://api.github.com/users/openzeka-birol-kuyumcu/events{/privacy}",
"received_events_url": "https://api.github.com/users/openzeka-birol-kuyumcu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2024-12-13T05:31:23
| 2024-12-13T05:31:23
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8080",
"html_url": "https://github.com/ollama/ollama/pull/8080",
"diff_url": "https://github.com/ollama/ollama/pull/8080.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8080.patch",
"merged_at": null
}
| null | null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8080/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8080/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1429
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1429/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1429/comments
|
https://api.github.com/repos/ollama/ollama/issues/1429/events
|
https://github.com/ollama/ollama/issues/1429
| 2,031,782,067
|
I_kwDOJ0Z1Ps55Goiz
| 1,429
|
Can you explain the difference between query and complete? Why one versus the other? Thanks!
|
{
"login": "OpenSpacesAndPlaces",
"id": 30755002,
"node_id": "MDQ6VXNlcjMwNzU1MDAy",
"avatar_url": "https://avatars.githubusercontent.com/u/30755002?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/OpenSpacesAndPlaces",
"html_url": "https://github.com/OpenSpacesAndPlaces",
"followers_url": "https://api.github.com/users/OpenSpacesAndPlaces/followers",
"following_url": "https://api.github.com/users/OpenSpacesAndPlaces/following{/other_user}",
"gists_url": "https://api.github.com/users/OpenSpacesAndPlaces/gists{/gist_id}",
"starred_url": "https://api.github.com/users/OpenSpacesAndPlaces/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OpenSpacesAndPlaces/subscriptions",
"organizations_url": "https://api.github.com/users/OpenSpacesAndPlaces/orgs",
"repos_url": "https://api.github.com/users/OpenSpacesAndPlaces/repos",
"events_url": "https://api.github.com/users/OpenSpacesAndPlaces/events{/privacy}",
"received_events_url": "https://api.github.com/users/OpenSpacesAndPlaces/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2023-12-08T01:30:34
| 2023-12-09T01:16:29
| 2023-12-09T01:16:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
e.g.
query_engine = index.as_query_engine()
retrieved_nodes = query_engine.query("What is the price of apples?")
vs.
prompt ="What is the price of apples?";
response = llm.complete(prompt)
----
I saw this example feeding the query into complete. Why might you want to do that vs. just using query?
https://www.educative.io/answers/how-to-train-gpt-4-on-custom-datasets-using-llamaindex
|
{
"login": "OpenSpacesAndPlaces",
"id": 30755002,
"node_id": "MDQ6VXNlcjMwNzU1MDAy",
"avatar_url": "https://avatars.githubusercontent.com/u/30755002?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/OpenSpacesAndPlaces",
"html_url": "https://github.com/OpenSpacesAndPlaces",
"followers_url": "https://api.github.com/users/OpenSpacesAndPlaces/followers",
"following_url": "https://api.github.com/users/OpenSpacesAndPlaces/following{/other_user}",
"gists_url": "https://api.github.com/users/OpenSpacesAndPlaces/gists{/gist_id}",
"starred_url": "https://api.github.com/users/OpenSpacesAndPlaces/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OpenSpacesAndPlaces/subscriptions",
"organizations_url": "https://api.github.com/users/OpenSpacesAndPlaces/orgs",
"repos_url": "https://api.github.com/users/OpenSpacesAndPlaces/repos",
"events_url": "https://api.github.com/users/OpenSpacesAndPlaces/events{/privacy}",
"received_events_url": "https://api.github.com/users/OpenSpacesAndPlaces/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1429/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1429/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6641
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6641/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6641/comments
|
https://api.github.com/repos/ollama/ollama/issues/6641/events
|
https://github.com/ollama/ollama/pull/6641
| 2,506,123,890
|
PR_kwDOJ0Z1Ps56b_Dx
| 6,641
|
Add curl to container
|
{
"login": "nopoz",
"id": 460545,
"node_id": "MDQ6VXNlcjQ2MDU0NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/460545?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nopoz",
"html_url": "https://github.com/nopoz",
"followers_url": "https://api.github.com/users/nopoz/followers",
"following_url": "https://api.github.com/users/nopoz/following{/other_user}",
"gists_url": "https://api.github.com/users/nopoz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nopoz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nopoz/subscriptions",
"organizations_url": "https://api.github.com/users/nopoz/orgs",
"repos_url": "https://api.github.com/users/nopoz/repos",
"events_url": "https://api.github.com/users/nopoz/events{/privacy}",
"received_events_url": "https://api.github.com/users/nopoz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-09-04T19:31:13
| 2024-11-21T10:39:23
| 2024-11-21T09:52:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6641",
"html_url": "https://github.com/ollama/ollama/pull/6641",
"diff_url": "https://github.com/ollama/ollama/pull/6641.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6641.patch",
"merged_at": null
}
|
Adds the curl package to the container so users can create a custom healthcheck in their own docker compose files.
This is a compromise following the rejected PR https://github.com/ollama/ollama/pull/1909 - instead of adding an integrated health check, this just adds the tool so users can create one themselves in their local docker compose files.
Solves feature request: https://github.com/ollama/ollama/issues/5389
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6641/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6641/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3535
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3535/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3535/comments
|
https://api.github.com/repos/ollama/ollama/issues/3535/events
|
https://github.com/ollama/ollama/issues/3535
| 2,230,550,428
|
I_kwDOJ0Z1Ps6E83-c
| 3,535
|
tid in log always be the same
|
{
"login": "mofanke",
"id": 54242816,
"node_id": "MDQ6VXNlcjU0MjQyODE2",
"avatar_url": "https://avatars.githubusercontent.com/u/54242816?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mofanke",
"html_url": "https://github.com/mofanke",
"followers_url": "https://api.github.com/users/mofanke/followers",
"following_url": "https://api.github.com/users/mofanke/following{/other_user}",
"gists_url": "https://api.github.com/users/mofanke/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mofanke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mofanke/subscriptions",
"organizations_url": "https://api.github.com/users/mofanke/orgs",
"repos_url": "https://api.github.com/users/mofanke/repos",
"events_url": "https://api.github.com/users/mofanke/events{/privacy}",
"received_events_url": "https://api.github.com/users/mofanke/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-04-08T08:12:32
| 2024-07-25T15:43:30
| 2024-07-25T15:43:30
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
https://github.com/ggerganov/llama.cpp/issues/6534
### What did you expect to see?
I understand that 'tid' represents the thread ID, which should change upon restarting, but I've noticed that 'tid':'0x1fc50fac0' keeps appearing repeatedly. I've also noticed some other values, but I'm not sure why.
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
macOS
### Architecture
arm64
### Platform
_No response_
### Ollama version
_No response_
### GPU
_No response_
### GPU info
_No response_
### CPU
Apple
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3535/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3535/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7155
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7155/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7155/comments
|
https://api.github.com/repos/ollama/ollama/issues/7155/events
|
https://github.com/ollama/ollama/pull/7155
| 2,576,955,384
|
PR_kwDOJ0Z1Ps5-IaIs
| 7,155
|
fix vendoring attribute
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-09T21:04:20
| 2024-10-09T21:21:05
| 2024-10-09T21:21:02
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7155",
"html_url": "https://github.com/ollama/ollama/pull/7155",
"diff_url": "https://github.com/ollama/ollama/pull/7155.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7155.patch",
"merged_at": "2024-10-09T21:21:02"
}
|
Expand out the file extensions for vendored code so git reports their status correctly,
e.g.:
```
% git check-attr -a -- ./llama/ggml.c
./llama/ggml.c: text: auto
./llama/ggml.c: linguist-vendored: set
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7155/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7155/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6278
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6278/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6278/comments
|
https://api.github.com/repos/ollama/ollama/issues/6278/events
|
https://github.com/ollama/ollama/pull/6278
| 2,457,259,095
|
PR_kwDOJ0Z1Ps536d4D
| 6,278
|
cmd: print proxy info when OLLAMA_DEBUG is true
|
{
"login": "zhangyunhao116",
"id": 18065074,
"node_id": "MDQ6VXNlcjE4MDY1MDc0",
"avatar_url": "https://avatars.githubusercontent.com/u/18065074?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhangyunhao116",
"html_url": "https://github.com/zhangyunhao116",
"followers_url": "https://api.github.com/users/zhangyunhao116/followers",
"following_url": "https://api.github.com/users/zhangyunhao116/following{/other_user}",
"gists_url": "https://api.github.com/users/zhangyunhao116/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhangyunhao116/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhangyunhao116/subscriptions",
"organizations_url": "https://api.github.com/users/zhangyunhao116/orgs",
"repos_url": "https://api.github.com/users/zhangyunhao116/repos",
"events_url": "https://api.github.com/users/zhangyunhao116/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhangyunhao116/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-08-09T07:08:07
| 2024-12-24T07:55:32
| 2024-12-24T03:56:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6278",
"html_url": "https://github.com/ollama/ollama/pull/6278",
"diff_url": "https://github.com/ollama/ollama/pull/6278.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6278.patch",
"merged_at": null
}
|
This PR prints proxy information when OLLAMA_DEBUG is true.
I've noticed that users often encounter issues with an HTTP proxy in their environment (like https://github.com/ollama/ollama/issues/6195 and https://github.com/ollama/ollama/issues/4834), but setting OLLAMA_DEBUG to true doesn't provide additional debugging info for this problem. All we got was `something went wrong, please see the ollama server logs for details`. Since this response comes from the HTTP proxy instead of the Ollama server, we still can't see anything in the server's log.
In my case, the proxy server returns 502 without any content for all requests to localhost. When I run `ollama ps`, `something went wrong, please see the ollama server logs for details` is the only thing I can see.
After this PR, when I run `env OLLAMA_DEBUG=1 ollama ps`, I can see:
```
proxy found: http://url.to.my.proxy
Error: something went wrong, please see the ollama server logs for details
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6278/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6278/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4041
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4041/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4041/comments
|
https://api.github.com/repos/ollama/ollama/issues/4041/events
|
https://github.com/ollama/ollama/issues/4041
| 2,270,745,686
|
I_kwDOJ0Z1Ps6HWNRW
| 4,041
|
Chat2DB-SQL-7B
|
{
"login": "akan",
"id": 170169,
"node_id": "MDQ6VXNlcjE3MDE2OQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/170169?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/akan",
"html_url": "https://github.com/akan",
"followers_url": "https://api.github.com/users/akan/followers",
"following_url": "https://api.github.com/users/akan/following{/other_user}",
"gists_url": "https://api.github.com/users/akan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/akan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akan/subscriptions",
"organizations_url": "https://api.github.com/users/akan/orgs",
"repos_url": "https://api.github.com/users/akan/repos",
"events_url": "https://api.github.com/users/akan/events{/privacy}",
"received_events_url": "https://api.github.com/users/akan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-04-30T07:45:43
| 2024-04-30T07:45:43
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/bartowski/Chat2DB-SQL-7B-exl2
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4041/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4041/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4732
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4732/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4732/comments
|
https://api.github.com/repos/ollama/ollama/issues/4732/events
|
https://github.com/ollama/ollama/issues/4732
| 2,326,637,374
|
I_kwDOJ0Z1Ps6Kras-
| 4,732
|
Unable to Change Ollama Models Directory on Linux (Rocky 9)
|
{
"login": "pykeras",
"id": 52103105,
"node_id": "MDQ6VXNlcjUyMTAzMTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/52103105?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pykeras",
"html_url": "https://github.com/pykeras",
"followers_url": "https://api.github.com/users/pykeras/followers",
"following_url": "https://api.github.com/users/pykeras/following{/other_user}",
"gists_url": "https://api.github.com/users/pykeras/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pykeras/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pykeras/subscriptions",
"organizations_url": "https://api.github.com/users/pykeras/orgs",
"repos_url": "https://api.github.com/users/pykeras/repos",
"events_url": "https://api.github.com/users/pykeras/events{/privacy}",
"received_events_url": "https://api.github.com/users/pykeras/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 24
| 2024-05-30T22:41:42
| 2025-01-06T09:22:10
| 2024-09-08T06:47:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am following every instruction in the documentation and every suggestion from previous issues. However, I am unable to change the Ollama models directory to another directory on Rocky Linux 9. I have more than 100GB of models that I don't want to download again.
**Steps to Reproduce**:
1. Execute `sudo systemctl edit ollama.service`
2. Add the following:
```bash
[Service]
Environment="OLLAMA_MODELS=/home/myuser/.ollama/models"
```
3. Save and run `sudo systemctl daemon-reload`
4. Execute `sudo systemctl restart ollama.service`
**Expected Behavior**:
The Ollama service should start properly with the models directory set to `/home/myuser/.ollama/models`.
**Actual Behavior**:
The service doesn't start properly. Here are the logs from running `sudo journalctl -u ollama.service | tail`:
```
May 31 01:44:36 myuserPC ollama[354959]: Error: mkdir /home/myuser/.ollama: permission denied
May 31 01:44:36 myuserPC systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
May 31 01:44:36 myuserPC systemd[1]: ollama.service: Failed with result 'exit-code'.
May 31 01:44:39 myuserPC systemd[1]: ollama.service: Scheduled restart job, restart counter is at 3.
May 31 01:44:39 myuserPC systemd[1]: Stopped Ollama Service.
May 31 01:44:39 myuserPC systemd[1]: Started Ollama Service.
May 31 01:44:39 myuserPC ollama[355046]: 2024/05/31 01:44:39 routes.go:1028: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST: OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS: OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR: OLLAMA_TMPDIR:]"
May 31 01:44:39 myuserPC ollama[355046]: Error: mkdir /home/myuser/.ollama: permission denied
May 31 01:44:39 myuserPC systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
May 31 01:44:39 myuserPC systemd[1]: ollama.service: Failed with result 'exit-code'.
```
As you can see, Ollama tries to create the `.ollama` directory in the user's home directory, regardless of whether the directory already exists.
**Additional Information**:
- I have tried different permissions and ownerships, including an exact file-by-file copy of permissions and ownership from `/usr/share/ollama`.
- Running `/usr/local/bin/ollama serve` does not show any error, unlike `sudo systemctl restart ollama`.
**Other Methods Attempted**:
1. **Symlink Attempt**: I tried using `ln -s` to map models from the default directory to a new location at `/home/myuser/.ollama/models/` with correct permissions and ownership. This resulted in a "mkdir failed" error, as the models directory already exists.
2. **Editing Service Environment**: Editing the Service Environment using `sudo systemctl edit ollama.service` is problematic. If an error occurs, the added lines cannot be removed easily, requiring the removal of the override file using `sudo rm -rf /etc/systemd/system/ollama.service.d/override.conf`.
I would appreciate any help or guidance on how to resolve this issue.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.39
|
{
"login": "pykeras",
"id": 52103105,
"node_id": "MDQ6VXNlcjUyMTAzMTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/52103105?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pykeras",
"html_url": "https://github.com/pykeras",
"followers_url": "https://api.github.com/users/pykeras/followers",
"following_url": "https://api.github.com/users/pykeras/following{/other_user}",
"gists_url": "https://api.github.com/users/pykeras/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pykeras/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pykeras/subscriptions",
"organizations_url": "https://api.github.com/users/pykeras/orgs",
"repos_url": "https://api.github.com/users/pykeras/repos",
"events_url": "https://api.github.com/users/pykeras/events{/privacy}",
"received_events_url": "https://api.github.com/users/pykeras/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4732/reactions",
"total_count": 11,
"+1": 9,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4732/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2015
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2015/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2015/comments
|
https://api.github.com/repos/ollama/ollama/issues/2015/events
|
https://github.com/ollama/ollama/pull/2015
| 2,084,043,315
|
PR_kwDOJ0Z1Ps5kMsc3
| 2,015
|
fix: differentiate floats/ints properly (resolve: #2011)
|
{
"login": "Robitx",
"id": 8431097,
"node_id": "MDQ6VXNlcjg0MzEwOTc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8431097?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Robitx",
"html_url": "https://github.com/Robitx",
"followers_url": "https://api.github.com/users/Robitx/followers",
"following_url": "https://api.github.com/users/Robitx/following{/other_user}",
"gists_url": "https://api.github.com/users/Robitx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Robitx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Robitx/subscriptions",
"organizations_url": "https://api.github.com/users/Robitx/orgs",
"repos_url": "https://api.github.com/users/Robitx/repos",
"events_url": "https://api.github.com/users/Robitx/events{/privacy}",
"received_events_url": "https://api.github.com/users/Robitx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-16T14:06:12
| 2024-01-16T17:37:51
| 2024-01-16T17:37:50
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2015",
"html_url": "https://github.com/ollama/ollama/pull/2015",
"diff_url": "https://github.com/ollama/ollama/pull/2015.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2015.patch",
"merged_at": null
}
|
The parsing might deserve a bigger refactor, but for now - since all numbers fall into the `case float64:` branch - I put the differentiation in there.
#2011
|
{
"login": "Robitx",
"id": 8431097,
"node_id": "MDQ6VXNlcjg0MzEwOTc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8431097?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Robitx",
"html_url": "https://github.com/Robitx",
"followers_url": "https://api.github.com/users/Robitx/followers",
"following_url": "https://api.github.com/users/Robitx/following{/other_user}",
"gists_url": "https://api.github.com/users/Robitx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Robitx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Robitx/subscriptions",
"organizations_url": "https://api.github.com/users/Robitx/orgs",
"repos_url": "https://api.github.com/users/Robitx/repos",
"events_url": "https://api.github.com/users/Robitx/events{/privacy}",
"received_events_url": "https://api.github.com/users/Robitx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2015/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2015/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3498
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3498/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3498/comments
|
https://api.github.com/repos/ollama/ollama/issues/3498/events
|
https://github.com/ollama/ollama/issues/3498
| 2,226,833,658
|
I_kwDOJ0Z1Ps6Eusj6
| 3,498
|
Teflon (a new part of Mesa on Linux) NPU delegate support
|
{
"login": "leaf-node",
"id": 342930,
"node_id": "MDQ6VXNlcjM0MjkzMA==",
"avatar_url": "https://avatars.githubusercontent.com/u/342930?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leaf-node",
"html_url": "https://github.com/leaf-node",
"followers_url": "https://api.github.com/users/leaf-node/followers",
"following_url": "https://api.github.com/users/leaf-node/following{/other_user}",
"gists_url": "https://api.github.com/users/leaf-node/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leaf-node/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leaf-node/subscriptions",
"organizations_url": "https://api.github.com/users/leaf-node/orgs",
"repos_url": "https://api.github.com/users/leaf-node/repos",
"events_url": "https://api.github.com/users/leaf-node/events{/privacy}",
"received_events_url": "https://api.github.com/users/leaf-node/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 1
| 2024-04-05T02:24:28
| 2024-11-21T10:33:39
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
[Teflon](https://docs.mesa3d.org/teflon.html) is a [new](https://www.phoronix.com/news/Gallium3D-Teflon-Merged) front-end library for NPU acceleration that is part of the latest versions of [Mesa](https://www.mesa3d.org/) on Linux. It's in its early stages, but more NPU drivers may be added in the future.
I'd like to use ollama with an NPU via an open source stack once a device that I own is supported by Mesa.
### How should we solve this?
I hope to see Teflon support in ollama. According to the Teflon documentation, it's already supported in TensorFlow Lite as a "delegate".
### What is the impact of not solving this?
When an upcoming Arm-based processor with strong NPU compute capabilities appears on the market, Teflon support for that device may eventually be added to Mesa. If ollama doesn't support Teflon, it may miss the opportunity to be an early adopter of this open source, general-purpose API.
### Anything else?
Thanks! : )
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3498/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3498/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1705
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1705/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1705/comments
|
https://api.github.com/repos/ollama/ollama/issues/1705/events
|
https://github.com/ollama/ollama/issues/1705
| 2,055,556,977
|
I_kwDOJ0Z1Ps56hU9x
| 1,705
|
generating embeddings with OllamaEmbeddings taking forever
|
{
"login": "lorenzoromani1983",
"id": 24575445,
"node_id": "MDQ6VXNlcjI0NTc1NDQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/24575445?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lorenzoromani1983",
"html_url": "https://github.com/lorenzoromani1983",
"followers_url": "https://api.github.com/users/lorenzoromani1983/followers",
"following_url": "https://api.github.com/users/lorenzoromani1983/following{/other_user}",
"gists_url": "https://api.github.com/users/lorenzoromani1983/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lorenzoromani1983/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lorenzoromani1983/subscriptions",
"organizations_url": "https://api.github.com/users/lorenzoromani1983/orgs",
"repos_url": "https://api.github.com/users/lorenzoromani1983/repos",
"events_url": "https://api.github.com/users/lorenzoromani1983/events{/privacy}",
"received_events_url": "https://api.github.com/users/lorenzoromani1983/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-12-25T09:34:39
| 2024-05-10T00:23:35
| 2024-05-10T00:23:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am trying to generate embeddings with the OllamaEmbeddings class, but it takes forever on a Mac M2 Pro.
I am embedding 22000 posts from a forum's threads; this is my config:
```python
from langchain.embeddings import OllamaEmbeddings
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama2:13b")
...
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=OllamaEmbeddings(model="llama2:13b"),
)
```
With OpenAI it takes minutes.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1705/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1705/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/835
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/835/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/835/comments
|
https://api.github.com/repos/ollama/ollama/issues/835/events
|
https://github.com/ollama/ollama/issues/835
| 1,949,294,149
|
I_kwDOJ0Z1Ps50L95F
| 835
|
Improve GPU scheduling
|
{
"login": "slychief",
"id": 831947,
"node_id": "MDQ6VXNlcjgzMTk0Nw==",
"avatar_url": "https://avatars.githubusercontent.com/u/831947?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/slychief",
"html_url": "https://github.com/slychief",
"followers_url": "https://api.github.com/users/slychief/followers",
"following_url": "https://api.github.com/users/slychief/following{/other_user}",
"gists_url": "https://api.github.com/users/slychief/gists{/gist_id}",
"starred_url": "https://api.github.com/users/slychief/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/slychief/subscriptions",
"organizations_url": "https://api.github.com/users/slychief/orgs",
"repos_url": "https://api.github.com/users/slychief/repos",
"events_url": "https://api.github.com/users/slychief/events{/privacy}",
"received_events_url": "https://api.github.com/users/slychief/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 11
| 2023-10-18T09:32:58
| 2024-07-03T10:26:32
| 2024-03-12T15:31:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
we have several GPUs in our server and use SLURM to manage the resources. SLURM uses CUDA_VISIBLE_DEVICES to assign GPUs to jobs/processes.
When I run ollama directly from the command line - within a SLURM-managed context with 1 GPU assigned - it uses all available GPUs in the server and ignores CUDA_VISIBLE_DEVICES.
Is there a parameter or any recommendation for how I can specify which GPUs ollama can use?
PS: a workaround is to use the docker container, but is there another solution for this, too?
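The pinning the workaround relies on can be sketched as building a child environment with CUDA_VISIBLE_DEVICES set before launching the server process. The SLURM-to-index mapping and the launch line are hypothetical illustrations, not part of Ollama — and whether the server honors the variable is exactly what this issue is about:

```python
import os

def restricted_env(gpu_ids, base=None):
    """Return an environment mapping that exposes only the given GPU
    indices to CUDA-aware children (e.g. an `ollama serve` subprocess)."""
    env = dict(os.environ if base is None else base)
    env["CUDA_VISIBLE_DEVICES"] = ",".join(str(g) for g in gpu_ids)
    return env

# Usage (sketch):
# import subprocess
# subprocess.Popen(["ollama", "serve"], env=restricted_env([0]))
```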
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/835/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/835/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5639
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5639/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5639/comments
|
https://api.github.com/repos/ollama/ollama/issues/5639/events
|
https://github.com/ollama/ollama/pull/5639
| 2,404,173,128
|
PR_kwDOJ0Z1Ps51Jhyn
| 5,639
|
do not automatically aggregate system messages
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-11T21:40:26
| 2024-07-12T00:48:52
| 2024-07-12T00:48:50
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5639",
"html_url": "https://github.com/ollama/ollama/pull/5639",
"diff_url": "https://github.com/ollama/ollama/pull/5639.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5639.patch",
"merged_at": "2024-07-12T00:48:50"
}
|
add a helper for aggregating system prompts
revert embedded templates to use prompt/response templates for better compatibility
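The aggregation helper the PR mentions might look like the following Python sketch — the actual Ollama implementation is in Go, and the message shape here is an assumption:

```python
def aggregate_system(messages):
    """Collapse all system messages into a single leading system message,
    preserving the relative order of the remaining messages (sketch)."""
    system = [m["content"] for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    if not system:
        return rest
    return [{"role": "system", "content": "\n\n".join(system)}] + rest
```

Making this an explicit helper, rather than automatic behavior, lets templates that support interleaved system messages receive them unmodified.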
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5639/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5639/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7246
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7246/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7246/comments
|
https://api.github.com/repos/ollama/ollama/issues/7246/events
|
https://github.com/ollama/ollama/pull/7246
| 2,595,840,485
|
PR_kwDOJ0Z1Ps5_Bec7
| 7,246
|
Reuse type InvalidModelNameErrMsg, unify the const parameters.
|
{
"login": "zhanluxianshen",
"id": 161462588,
"node_id": "U_kgDOCZ-5PA",
"avatar_url": "https://avatars.githubusercontent.com/u/161462588?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhanluxianshen",
"html_url": "https://github.com/zhanluxianshen",
"followers_url": "https://api.github.com/users/zhanluxianshen/followers",
"following_url": "https://api.github.com/users/zhanluxianshen/following{/other_user}",
"gists_url": "https://api.github.com/users/zhanluxianshen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhanluxianshen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhanluxianshen/subscriptions",
"organizations_url": "https://api.github.com/users/zhanluxianshen/orgs",
"repos_url": "https://api.github.com/users/zhanluxianshen/repos",
"events_url": "https://api.github.com/users/zhanluxianshen/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhanluxianshen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-17T21:59:30
| 2024-12-18T21:47:40
| 2024-12-18T21:47:36
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7246",
"html_url": "https://github.com/ollama/ollama/pull/7246",
"diff_url": "https://github.com/ollama/ollama/pull/7246.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7246.patch",
"merged_at": null
}
|
Reuse type InvalidModelNameErrMsg, unify the const parameters.
|
{
"login": "zhanluxianshen",
"id": 161462588,
"node_id": "U_kgDOCZ-5PA",
"avatar_url": "https://avatars.githubusercontent.com/u/161462588?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhanluxianshen",
"html_url": "https://github.com/zhanluxianshen",
"followers_url": "https://api.github.com/users/zhanluxianshen/followers",
"following_url": "https://api.github.com/users/zhanluxianshen/following{/other_user}",
"gists_url": "https://api.github.com/users/zhanluxianshen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhanluxianshen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhanluxianshen/subscriptions",
"organizations_url": "https://api.github.com/users/zhanluxianshen/orgs",
"repos_url": "https://api.github.com/users/zhanluxianshen/repos",
"events_url": "https://api.github.com/users/zhanluxianshen/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhanluxianshen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7246/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7246/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5753
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5753/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5753/comments
|
https://api.github.com/repos/ollama/ollama/issues/5753/events
|
https://github.com/ollama/ollama/pull/5753
| 2,414,282,944
|
PR_kwDOJ0Z1Ps51ranz
| 5,753
|
parse tool call as individual objects
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-17T18:22:04
| 2024-07-17T18:47:55
| 2024-07-17T18:47:54
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5753",
"html_url": "https://github.com/ollama/ollama/pull/5753",
"diff_url": "https://github.com/ollama/ollama/pull/5753.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5753.patch",
"merged_at": "2024-07-17T18:47:54"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5753/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5753/timeline
| null | null | true
|