| url | repository_url | labels_url | comments_url | events_url | html_url | id | node_id | number | title | user | labels | state | locked | assignee | assignees | milestone | comments | created_at | updated_at | closed_at | author_association | sub_issues_summary | active_lock_reason | draft | pull_request | body | closed_by | reactions | timeline_url | performed_via_github_app | state_reason | is_pull_request |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/3013
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3013/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3013/comments
|
https://api.github.com/repos/ollama/ollama/issues/3013/events
|
https://github.com/ollama/ollama/issues/3013
| 2,176,758,698
|
I_kwDOJ0Z1Ps6BvrOq
| 3,013
|
Please consider adding our project to your list of supported software
|
{
"login": "d3cline",
"id": 77483,
"node_id": "MDQ6VXNlcjc3NDgz",
"avatar_url": "https://avatars.githubusercontent.com/u/77483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/d3cline",
"html_url": "https://github.com/d3cline",
"followers_url": "https://api.github.com/users/d3cline/followers",
"following_url": "https://api.github.com/users/d3cline/following{/other_user}",
"gists_url": "https://api.github.com/users/d3cline/gists{/gist_id}",
"starred_url": "https://api.github.com/users/d3cline/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/d3cline/subscriptions",
"organizations_url": "https://api.github.com/users/d3cline/orgs",
"repos_url": "https://api.github.com/users/d3cline/repos",
"events_url": "https://api.github.com/users/d3cline/events{/privacy}",
"received_events_url": "https://api.github.com/users/d3cline/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-03-08T20:58:01
| 2024-03-11T22:16:14
| 2024-03-11T22:16:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Howdy, I built this paywall/proxy app, which uses ollama.ai as a back-end.
Please consider adding it to your list of supported applications:
https://github.com/d3cline/airanch
Thank you for your consideration, and for everything the community and core contributors do.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3013/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3013/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4477
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4477/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4477/comments
|
https://api.github.com/repos/ollama/ollama/issues/4477/events
|
https://github.com/ollama/ollama/issues/4477
| 2,301,268,316
|
I_kwDOJ0Z1Ps6JKpFc
| 4,477
|
Expose Max threads as an environment variable or set ollama to use all the cores/threads a CPU provides
|
{
"login": "haydonryan",
"id": 6804348,
"node_id": "MDQ6VXNlcjY4MDQzNDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/6804348?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/haydonryan",
"html_url": "https://github.com/haydonryan",
"followers_url": "https://api.github.com/users/haydonryan/followers",
"following_url": "https://api.github.com/users/haydonryan/following{/other_user}",
"gists_url": "https://api.github.com/users/haydonryan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/haydonryan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haydonryan/subscriptions",
"organizations_url": "https://api.github.com/users/haydonryan/orgs",
"repos_url": "https://api.github.com/users/haydonryan/repos",
"events_url": "https://api.github.com/users/haydonryan/events{/privacy}",
"received_events_url": "https://api.github.com/users/haydonryan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-05-16T20:20:30
| 2024-12-25T02:23:43
| 2024-10-24T14:20:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
After seeing https://github.com/ollama/ollama/issues/2929, I'm having the same issue. As I'm using both open-webui and enchanted on iOS, queries are only using half of the CPU on my EPYC 7302P.
I know you can set a /parameter when using the CLI, but I want to set this as the default for serving. Alternatively, is there a reason that ollama isn't using all the available threads of the host CPU? Seems like something that _could_ be the default.
That said, it would be awesome to expose this as an environment variable option, for those who don't want to use the whole CPU (e.g. if you're running this on your desktop while coding).
|
{
"login": "haydonryan",
"id": 6804348,
"node_id": "MDQ6VXNlcjY4MDQzNDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/6804348?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/haydonryan",
"html_url": "https://github.com/haydonryan",
"followers_url": "https://api.github.com/users/haydonryan/followers",
"following_url": "https://api.github.com/users/haydonryan/following{/other_user}",
"gists_url": "https://api.github.com/users/haydonryan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/haydonryan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haydonryan/subscriptions",
"organizations_url": "https://api.github.com/users/haydonryan/orgs",
"repos_url": "https://api.github.com/users/haydonryan/repos",
"events_url": "https://api.github.com/users/haydonryan/events{/privacy}",
"received_events_url": "https://api.github.com/users/haydonryan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4477/reactions",
"total_count": 7,
"+1": 7,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4477/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3900
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3900/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3900/comments
|
https://api.github.com/repos/ollama/ollama/issues/3900/events
|
https://github.com/ollama/ollama/pull/3900
| 2,262,628,304
|
PR_kwDOJ0Z1Ps5trJgf
| 3,900
|
use matrix multiplication kernels in more cases
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-04-25T04:33:49
| 2024-04-25T16:44:22
| 2024-04-25T16:44:22
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3900",
"html_url": "https://github.com/ollama/ollama/pull/3900",
"diff_url": "https://github.com/ollama/ollama/pull/3900.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3900.patch",
"merged_at": null
}
|
TODO:
- [ ] Set `opts.NumCtx` to `$OLLAMA_NUM_PARALLEL * opts.NumCtx`
- [ ] Enable continuous batching?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3900/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3900/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4159
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4159/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4159/comments
|
https://api.github.com/repos/ollama/ollama/issues/4159/events
|
https://github.com/ollama/ollama/issues/4159
| 2,279,269,394
|
I_kwDOJ0Z1Ps6H2uQS
| 4,159
|
Concurrency Issue: 'Server Busy' Errors After Updating Ollama
|
{
"login": "mynhinguyentruong",
"id": 64499617,
"node_id": "MDQ6VXNlcjY0NDk5NjE3",
"avatar_url": "https://avatars.githubusercontent.com/u/64499617?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mynhinguyentruong",
"html_url": "https://github.com/mynhinguyentruong",
"followers_url": "https://api.github.com/users/mynhinguyentruong/followers",
"following_url": "https://api.github.com/users/mynhinguyentruong/following{/other_user}",
"gists_url": "https://api.github.com/users/mynhinguyentruong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mynhinguyentruong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mynhinguyentruong/subscriptions",
"organizations_url": "https://api.github.com/users/mynhinguyentruong/orgs",
"repos_url": "https://api.github.com/users/mynhinguyentruong/repos",
"events_url": "https://api.github.com/users/mynhinguyentruong/events{/privacy}",
"received_events_url": "https://api.github.com/users/mynhinguyentruong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-05-05T01:18:35
| 2024-05-05T03:36:54
| 2024-05-05T03:36:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Issue Summary:
Yesterday, I ran a test where I spun up 30 concurrent goroutines, each sending a POST request to Ollama locally. The process worked smoothly, and I received responses within approximately 2 minutes.
`go ProcessPromptWithOllama()` x30 times
Problem Description:
However, after updating Ollama today, I encountered the following errors:
`{"error":"server busy, please try again. maximum pending requests exceeded"}`
`{"error":"unexpected server status: llm busy - no slots available"}`
Reproducibility:
The issue is consistently reproducible after the Ollama update. It occurs regardless of the specific endpoint or payload used in the POST requests.
Expected Behavior:
I expected the updated Ollama to handle the concurrent requests as efficiently as it did before the update, without encountering any server overload issues.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.33
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4159/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4159/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2924
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2924/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2924/comments
|
https://api.github.com/repos/ollama/ollama/issues/2924/events
|
https://github.com/ollama/ollama/pull/2924
| 2,167,699,285
|
PR_kwDOJ0Z1Ps5ootL9
| 2,924
|
[ENH]: Batch embeddings request
|
{
"login": "tazarov",
"id": 1157440,
"node_id": "MDQ6VXNlcjExNTc0NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1157440?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tazarov",
"html_url": "https://github.com/tazarov",
"followers_url": "https://api.github.com/users/tazarov/followers",
"following_url": "https://api.github.com/users/tazarov/following{/other_user}",
"gists_url": "https://api.github.com/users/tazarov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tazarov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tazarov/subscriptions",
"organizations_url": "https://api.github.com/users/tazarov/orgs",
"repos_url": "https://api.github.com/users/tazarov/repos",
"events_url": "https://api.github.com/users/tazarov/events{/privacy}",
"received_events_url": "https://api.github.com/users/tazarov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-04T20:36:53
| 2024-11-18T08:32:36
| 2024-11-18T08:32:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2924",
"html_url": "https://github.com/ollama/ollama/pull/2924",
"diff_url": "https://github.com/ollama/ollama/pull/2924.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2924.patch",
"merged_at": null
}
|
Adding the ability to submit multiple embeddings in a single request.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2924/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2924/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8375
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8375/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8375/comments
|
https://api.github.com/repos/ollama/ollama/issues/8375/events
|
https://github.com/ollama/ollama/issues/8375
| 2,780,409,263
|
I_kwDOJ0Z1Ps6lua2v
| 8,375
|
`ollama run` doesn't work with recent llama versions (possibly due to an upgrade issue?)
|
{
"login": "solenn-tl",
"id": 88024747,
"node_id": "MDQ6VXNlcjg4MDI0NzQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/88024747?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/solenn-tl",
"html_url": "https://github.com/solenn-tl",
"followers_url": "https://api.github.com/users/solenn-tl/followers",
"following_url": "https://api.github.com/users/solenn-tl/following{/other_user}",
"gists_url": "https://api.github.com/users/solenn-tl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/solenn-tl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/solenn-tl/subscriptions",
"organizations_url": "https://api.github.com/users/solenn-tl/orgs",
"repos_url": "https://api.github.com/users/solenn-tl/repos",
"events_url": "https://api.github.com/users/solenn-tl/events{/privacy}",
"received_events_url": "https://api.github.com/users/solenn-tl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2025-01-10T15:12:04
| 2025-01-10T15:43:57
| 2025-01-10T15:43:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi everyone!
I'm trying to run Ollama on an Ubuntu 22.04.2 LTS server.
I had installed Ollama a few months ago (and then uninstalled it). I installed it again today.
I first followed the uninstallation procedure described here, to be sure there were no remaining files from the previous installation:
https://github.com/ollama/ollama/blob/main/docs/linux.md
Then I installed Ollama using the curl command `curl -fsSL https://ollama.com/install.sh | sh`.
There is no visible error at this step.
Then, when I check the version, I get this:
`ollama version is 0.1.41`
`Warning: client version is 0.5.4`
Finally, I downloaded some models (with ollama pull) and ran into trouble with some of them (like llama3.1 and llama3.2). The error when I try to run them is the following: `Error: llama runner process has terminated: signal: aborted`
I don't have this problem with llama3, for example, so I guess it might be a version conflict.
I tried to downgrade Ollama with `curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION="0.5.3" sh`
and I get this message:
`ollama version is 0.1.41`
`Warning: client version is 0.5.3`
UPDATE: I also tried `ollama pull llama3.1` and got the following error:
`pulling manifest`
`Error: pull model manifest: 412:`
`The model you are attempting to pull requires a newer version of Ollama.`
`Please download the latest version at:`
`https://ollama.com/download`
This definitely makes me think of a version conflict, but I don't know how to deal with it.
Do you have any idea? Thanks a lot!
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.1.41
|
{
"login": "solenn-tl",
"id": 88024747,
"node_id": "MDQ6VXNlcjg4MDI0NzQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/88024747?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/solenn-tl",
"html_url": "https://github.com/solenn-tl",
"followers_url": "https://api.github.com/users/solenn-tl/followers",
"following_url": "https://api.github.com/users/solenn-tl/following{/other_user}",
"gists_url": "https://api.github.com/users/solenn-tl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/solenn-tl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/solenn-tl/subscriptions",
"organizations_url": "https://api.github.com/users/solenn-tl/orgs",
"repos_url": "https://api.github.com/users/solenn-tl/repos",
"events_url": "https://api.github.com/users/solenn-tl/events{/privacy}",
"received_events_url": "https://api.github.com/users/solenn-tl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8375/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8375/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8192
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8192/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8192/comments
|
https://api.github.com/repos/ollama/ollama/issues/8192/events
|
https://github.com/ollama/ollama/issues/8192
| 2,753,656,686
|
I_kwDOJ0Z1Ps6kIXdu
| 8,192
|
Check Available Memory Before Downloading
|
{
"login": "JamesGMCoder",
"id": 84068167,
"node_id": "MDQ6VXNlcjg0MDY4MTY3",
"avatar_url": "https://avatars.githubusercontent.com/u/84068167?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JamesGMCoder",
"html_url": "https://github.com/JamesGMCoder",
"followers_url": "https://api.github.com/users/JamesGMCoder/followers",
"following_url": "https://api.github.com/users/JamesGMCoder/following{/other_user}",
"gists_url": "https://api.github.com/users/JamesGMCoder/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JamesGMCoder/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JamesGMCoder/subscriptions",
"organizations_url": "https://api.github.com/users/JamesGMCoder/orgs",
"repos_url": "https://api.github.com/users/JamesGMCoder/repos",
"events_url": "https://api.github.com/users/JamesGMCoder/events{/privacy}",
"received_events_url": "https://api.github.com/users/JamesGMCoder/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-21T00:15:28
| 2025-01-13T01:44:15
| 2025-01-13T01:44:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ollama should check available memory before downloading, to save a lot of bandwidth and time.
ollama run llama3.3
pulling manifest
pulling 4824460d29f2... 100% ▕████████████████████████████████████████████████████████▏ 42 GB
pulling 948af2743fc7... 100% ▕████████████████████████████████████████████████████████▏ 1.5 KB
pulling bc371a43ce90... 100% ▕████████████████████████████████████████████████████████▏ 7.6 KB
pulling 53a87df39647... 100% ▕████████████████████████████████████████████████████████▏ 5.6 KB
pulling 56bb8bd477a5... 100% ▕████████████████████████████████████████████████████████▏ 96 B
pulling c7091aa45e9b... 100% ▕████████████████████████████████████████████████████████▏ 562 B
verifying sha256 digest
writing manifest
success
Error: model requires more system memory (37.8 GiB) than is available (35.5 GiB)
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8192/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8192/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3934
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3934/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3934/comments
|
https://api.github.com/repos/ollama/ollama/issues/3934/events
|
https://github.com/ollama/ollama/issues/3934
| 2,265,007,424
|
I_kwDOJ0Z1Ps6HAUVA
| 3,934
|
ERROR: CAN'T FIND APP
|
{
"login": "Davidmax2023",
"id": 155600752,
"node_id": "U_kgDOCUZHcA",
"avatar_url": "https://avatars.githubusercontent.com/u/155600752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Davidmax2023",
"html_url": "https://github.com/Davidmax2023",
"followers_url": "https://api.github.com/users/Davidmax2023/followers",
"following_url": "https://api.github.com/users/Davidmax2023/following{/other_user}",
"gists_url": "https://api.github.com/users/Davidmax2023/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Davidmax2023/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Davidmax2023/subscriptions",
"organizations_url": "https://api.github.com/users/Davidmax2023/orgs",
"repos_url": "https://api.github.com/users/Davidmax2023/repos",
"events_url": "https://api.github.com/users/Davidmax2023/events{/privacy}",
"received_events_url": "https://api.github.com/users/Davidmax2023/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-26T05:46:15
| 2024-04-26T06:08:42
| 2024-04-26T06:08:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I downloaded and installed Ollama using the Windows installer, but when I try to run it in PowerShell, the computer can't find `ollama`.
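A quick way to tell whether the binary is simply missing from `PATH` (a common cause right after a fresh install, since terminals that were already open don't pick up new environment variables) is to probe for it explicitly. This is a rough sketch; the `%LOCALAPPDATA%\Programs\Ollama` path in the comments is an assumption about the default install location:

```shell
# Hypothetical check: is the ollama binary reachable on PATH?
# PowerShell equivalents (install dir below is an assumption):
#   Get-Command ollama
#   & "$env:LOCALAPPDATA\Programs\Ollama\ollama.exe" --version
if command -v ollama >/dev/null 2>&1; then
  echo "ollama found on PATH"
else
  echo "ollama not on PATH; open a new terminal or call it by its full install path"
fi
```

If the second branch fires, opening a fresh terminal after the install, or invoking the executable by its full path, usually resolves it.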
|
{
"login": "Davidmax2023",
"id": 155600752,
"node_id": "U_kgDOCUZHcA",
"avatar_url": "https://avatars.githubusercontent.com/u/155600752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Davidmax2023",
"html_url": "https://github.com/Davidmax2023",
"followers_url": "https://api.github.com/users/Davidmax2023/followers",
"following_url": "https://api.github.com/users/Davidmax2023/following{/other_user}",
"gists_url": "https://api.github.com/users/Davidmax2023/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Davidmax2023/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Davidmax2023/subscriptions",
"organizations_url": "https://api.github.com/users/Davidmax2023/orgs",
"repos_url": "https://api.github.com/users/Davidmax2023/repos",
"events_url": "https://api.github.com/users/Davidmax2023/events{/privacy}",
"received_events_url": "https://api.github.com/users/Davidmax2023/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3934/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3934/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1064
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1064/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1064/comments
|
https://api.github.com/repos/ollama/ollama/issues/1064/events
|
https://github.com/ollama/ollama/pull/1064
| 1,986,568,183
|
PR_kwDOJ0Z1Ps5fFc8Z
| 1,064
|
Converting field value to lowercase
|
{
"login": "dansreis",
"id": 9052608,
"node_id": "MDQ6VXNlcjkwNTI2MDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/9052608?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dansreis",
"html_url": "https://github.com/dansreis",
"followers_url": "https://api.github.com/users/dansreis/followers",
"following_url": "https://api.github.com/users/dansreis/following{/other_user}",
"gists_url": "https://api.github.com/users/dansreis/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dansreis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dansreis/subscriptions",
"organizations_url": "https://api.github.com/users/dansreis/orgs",
"repos_url": "https://api.github.com/users/dansreis/repos",
"events_url": "https://api.github.com/users/dansreis/events{/privacy}",
"received_events_url": "https://api.github.com/users/dansreis/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-09T23:19:15
| 2024-05-09T17:12:13
| 2024-05-09T16:52:18
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1064",
"html_url": "https://github.com/ollama/ollama/pull/1064",
"diff_url": "https://github.com/ollama/ollama/pull/1064.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1064.patch",
"merged_at": null
}
|
Converting model names to lowercase (Pull, Push, Create, Delete and Show).
This PR has some implications, as people with case-sensitive model names will be affected.
Issue: #336
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1064/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1064/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/577
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/577/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/577/comments
|
https://api.github.com/repos/ollama/ollama/issues/577/events
|
https://github.com/ollama/ollama/pull/577
| 1,909,347,719
|
PR_kwDOJ0Z1Ps5bA7ck
| 577
|
close llm on interrupt
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-22T18:26:21
| 2023-09-22T18:41:54
| 2023-09-22T18:41:53
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/577",
"html_url": "https://github.com/ollama/ollama/pull/577",
"diff_url": "https://github.com/ollama/ollama/pull/577.diff",
"patch_url": "https://github.com/ollama/ollama/pull/577.patch",
"merged_at": "2023-09-22T18:41:53"
}
| null |
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/577/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/577/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4442
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4442/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4442/comments
|
https://api.github.com/repos/ollama/ollama/issues/4442/events
|
https://github.com/ollama/ollama/issues/4442
| 2,296,670,775
|
I_kwDOJ0Z1Ps6I5Go3
| 4,442
|
Error: llama runner process has terminated: exit status 0xc0000409
|
{
"login": "hcr707305003",
"id": 22547038,
"node_id": "MDQ6VXNlcjIyNTQ3MDM4",
"avatar_url": "https://avatars.githubusercontent.com/u/22547038?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hcr707305003",
"html_url": "https://github.com/hcr707305003",
"followers_url": "https://api.github.com/users/hcr707305003/followers",
"following_url": "https://api.github.com/users/hcr707305003/following{/other_user}",
"gists_url": "https://api.github.com/users/hcr707305003/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hcr707305003/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hcr707305003/subscriptions",
"organizations_url": "https://api.github.com/users/hcr707305003/orgs",
"repos_url": "https://api.github.com/users/hcr707305003/repos",
"events_url": "https://api.github.com/users/hcr707305003/events{/privacy}",
"received_events_url": "https://api.github.com/users/hcr707305003/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 18
| 2024-05-15T01:27:48
| 2024-08-09T23:18:17
| 2024-08-09T23:18:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I run a quantized model on v0.1.37, it errors out with `Error: llama runner process has terminated: exit status 0xc0000409`.
First step:
```shell
>>> ollama create test_q8_0 -f building_qwen_7b_gguf.Modelfile
transferring model data
using existing layer sha256:82ed01bbf8fff66078cc84849c959d77e6ee78400cf176513b22055f3848bd09
creating new layer sha256:58353639a7c4b7529da8c5c8a63e81c426f206bab10cf82e4b9e427f15a466f8
creating new layer sha256:1da117d6723df114af0d948b614cae0aa684875e2775ca9607d23e2e0769651d
creating new layer sha256:9297f08dd6c6435240b5cddc93261e8a159aa0fecf010de4568ec2df2417bdb2
creating new layer sha256:c1f908392f9e4c55a2a12ddd1035ef02f552216301e1ab2cf545aa70b4f93b67
writing manifest
success
```
Second step:
```shell
>>> ollama run test_q8_0
Error: llama runner process has terminated: exit status 0xc0000409
```
### OS
Windows
### GPU
Intel
### CPU
Intel
### Ollama version
v0.1.37
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4442/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4442/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/95
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/95/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/95/comments
|
https://api.github.com/repos/ollama/ollama/issues/95/events
|
https://github.com/ollama/ollama/pull/95
| 1,808,823,131
|
PR_kwDOJ0Z1Ps5Vuhd4
| 95
|
Some simple modelfile examples
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-18T00:18:18
| 2023-07-18T12:32:39
| 2023-07-18T12:32:39
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/95",
"html_url": "https://github.com/ollama/ollama/pull/95",
"diff_url": "https://github.com/ollama/ollama/pull/95.diff",
"patch_url": "https://github.com/ollama/ollama/pull/95.patch",
"merged_at": "2023-07-18T12:32:39"
}
|
Added 3 Modelfiles as examples. Also updated development.doc, but I'm wondering if I should get rid of the dev doc altogether.
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/95/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/95/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1325
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1325/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1325/comments
|
https://api.github.com/repos/ollama/ollama/issues/1325/events
|
https://github.com/ollama/ollama/pull/1325
| 2,017,804,288
|
PR_kwDOJ0Z1Ps5gu_nO
| 1,325
|
Corrected transposed 129 to 192 for OLLAMA_ORIGINS example
|
{
"login": "cloudxabide",
"id": 47249757,
"node_id": "MDQ6VXNlcjQ3MjQ5NzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/47249757?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cloudxabide",
"html_url": "https://github.com/cloudxabide",
"followers_url": "https://api.github.com/users/cloudxabide/followers",
"following_url": "https://api.github.com/users/cloudxabide/following{/other_user}",
"gists_url": "https://api.github.com/users/cloudxabide/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cloudxabide/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cloudxabide/subscriptions",
"organizations_url": "https://api.github.com/users/cloudxabide/orgs",
"repos_url": "https://api.github.com/users/cloudxabide/repos",
"events_url": "https://api.github.com/users/cloudxabide/events{/privacy}",
"received_events_url": "https://api.github.com/users/cloudxabide/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-11-30T03:28:04
| 2023-11-30T03:44:18
| 2023-11-30T03:44:18
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1325",
"html_url": "https://github.com/ollama/ollama/pull/1325",
"diff_url": "https://github.com/ollama/ollama/pull/1325.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1325.patch",
"merged_at": "2023-11-30T03:44:18"
}
|
Issue:
Doc contained
```
echo 'Environment="OLLAMA_ORIGINS=http://129.168.1.1:*,https://example.com"' >>/etc/systemd/system/ollama.service.d/environment.conf
```
Which, based on all other examples, should be 192.168.1.1
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1325/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1325/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/205
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/205/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/205/comments
|
https://api.github.com/repos/ollama/ollama/issues/205/events
|
https://github.com/ollama/ollama/pull/205
| 1,819,323,051
|
PR_kwDOJ0Z1Ps5WR6Do
| 205
|
enable accelerate
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-07-25T00:15:20
| 2023-07-25T03:16:40
| 2023-07-25T03:06:05
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/205",
"html_url": "https://github.com/ollama/ollama/pull/205",
"diff_url": "https://github.com/ollama/ollama/pull/205.diff",
"patch_url": "https://github.com/ollama/ollama/pull/205.patch",
"merged_at": "2023-07-25T03:06:05"
}
|
missed this define
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/205/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/205/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4036
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4036/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4036/comments
|
https://api.github.com/repos/ollama/ollama/issues/4036/events
|
https://github.com/ollama/ollama/pull/4036
| 2,270,144,784
|
PR_kwDOJ0Z1Ps5uEtGB
| 4,036
|
Update llama.cpp
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-29T23:10:57
| 2024-04-30T14:31:32
| 2024-04-30T03:18:48
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4036",
"html_url": "https://github.com/ollama/ollama/pull/4036",
"diff_url": "https://github.com/ollama/ollama/pull/4036.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4036.patch",
"merged_at": "2024-04-30T03:18:48"
}
|
Bump llama.cpp to b2761
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4036/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4036/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3454
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3454/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3454/comments
|
https://api.github.com/repos/ollama/ollama/issues/3454/events
|
https://github.com/ollama/ollama/issues/3454
| 2,220,113,416
|
I_kwDOJ0Z1Ps6EVD4I
| 3,454
|
Does Ollama have methods similar to Docker's load and save?
|
{
"login": "papandadj",
"id": 25424898,
"node_id": "MDQ6VXNlcjI1NDI0ODk4",
"avatar_url": "https://avatars.githubusercontent.com/u/25424898?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/papandadj",
"html_url": "https://github.com/papandadj",
"followers_url": "https://api.github.com/users/papandadj/followers",
"following_url": "https://api.github.com/users/papandadj/following{/other_user}",
"gists_url": "https://api.github.com/users/papandadj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/papandadj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/papandadj/subscriptions",
"organizations_url": "https://api.github.com/users/papandadj/orgs",
"repos_url": "https://api.github.com/users/papandadj/repos",
"events_url": "https://api.github.com/users/papandadj/events{/privacy}",
"received_events_url": "https://api.github.com/users/papandadj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-04-02T10:16:02
| 2024-05-05T18:12:18
| 2024-05-05T18:12:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I want to deploy Ollama offline. I'd like to save a model to a local tar file with a `save`-style command and then `load` it back into Ollama on the offline machine, the way Docker does.
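In the meantime, the model store can be archived by hand. As a rough sketch (assuming Ollama keeps pulled models under `~/.ollama/models`, the Linux default; paths on your system may differ), a Docker-style save/load can be approximated with `tar`. The snippet below runs against temporary stand-in directories so it doesn't require Ollama itself:

```shell
# Approximate "save"/"load" with tar. SRC/DST stand in for ~/.ollama on the
# online and offline machines respectively, so this sketch runs anywhere.
SRC=$(mktemp -d)
DST=$(mktemp -d)
ARCHIVE=$(mktemp -u).tar.gz

# Fake model store layout (blobs + manifests) for demonstration only.
mkdir -p "$SRC/models/blobs" "$SRC/models/manifests"
echo "fake model blob" > "$SRC/models/blobs/sha256-abc123"

# "save": archive the whole model store on the online machine.
tar -czf "$ARCHIVE" -C "$SRC" models

# "load": unpack it under ~/.ollama on the offline machine, then restart ollama.
tar -xzf "$ARCHIVE" -C "$DST"

ls "$DST/models/blobs"   # -> sha256-abc123
```

With real paths, the same two `tar` commands against `~/.ollama` (plus a service restart) move the entire local model library between machines.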
### What did you expect to see?
_No response_
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
_No response_
### Architecture
_No response_
### Platform
_No response_
### Ollama version
_No response_
### GPU
_No response_
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3454/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3454/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5920
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5920/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5920/comments
|
https://api.github.com/repos/ollama/ollama/issues/5920/events
|
https://github.com/ollama/ollama/pull/5920
| 2,428,188,885
|
PR_kwDOJ0Z1Ps52X73e
| 5,920
|
Update README.md: add YetAnotherOllamaManager
|
{
"login": "lorenzodimauro97weplus",
"id": 124273641,
"node_id": "U_kgDOB2hD6Q",
"avatar_url": "https://avatars.githubusercontent.com/u/124273641?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lorenzodimauro97weplus",
"html_url": "https://github.com/lorenzodimauro97weplus",
"followers_url": "https://api.github.com/users/lorenzodimauro97weplus/followers",
"following_url": "https://api.github.com/users/lorenzodimauro97weplus/following{/other_user}",
"gists_url": "https://api.github.com/users/lorenzodimauro97weplus/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lorenzodimauro97weplus/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lorenzodimauro97weplus/subscriptions",
"organizations_url": "https://api.github.com/users/lorenzodimauro97weplus/orgs",
"repos_url": "https://api.github.com/users/lorenzodimauro97weplus/repos",
"events_url": "https://api.github.com/users/lorenzodimauro97weplus/events{/privacy}",
"received_events_url": "https://api.github.com/users/lorenzodimauro97weplus/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-07-24T18:09:51
| 2024-11-21T10:14:20
| 2024-11-21T10:14:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5920",
"html_url": "https://github.com/ollama/ollama/pull/5920",
"diff_url": "https://github.com/ollama/ollama/pull/5920.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5920.patch",
"merged_at": null
}
| null |
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5920/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5920/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5694
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5694/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5694/comments
|
https://api.github.com/repos/ollama/ollama/issues/5694/events
|
https://github.com/ollama/ollama/issues/5694
| 2,407,757,552
|
I_kwDOJ0Z1Ps6Pg3bw
| 5,694
|
API to interrupt model output
|
{
"login": "wltime",
"id": 91012742,
"node_id": "MDQ6VXNlcjkxMDEyNzQy",
"avatar_url": "https://avatars.githubusercontent.com/u/91012742?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wltime",
"html_url": "https://github.com/wltime",
"followers_url": "https://api.github.com/users/wltime/followers",
"following_url": "https://api.github.com/users/wltime/following{/other_user}",
"gists_url": "https://api.github.com/users/wltime/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wltime/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wltime/subscriptions",
"organizations_url": "https://api.github.com/users/wltime/orgs",
"repos_url": "https://api.github.com/users/wltime/repos",
"events_url": "https://api.github.com/users/wltime/events{/privacy}",
"received_events_url": "https://api.github.com/users/wltime/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-15T02:43:52
| 2024-07-15T03:59:39
| 2024-07-15T03:44:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Sometimes the model produces a lot of output and I don't need it to continue. Is there an API to interrupt the model's output?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5694/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5694/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7801
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7801/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7801/comments
|
https://api.github.com/repos/ollama/ollama/issues/7801/events
|
https://github.com/ollama/ollama/issues/7801
| 2,683,923,596
|
I_kwDOJ0Z1Ps6f-WyM
| 7,801
|
GPU usage is not high, but GPU memory is full
|
{
"login": "duolax",
"id": 106507112,
"node_id": "U_kgDOBlkraA",
"avatar_url": "https://avatars.githubusercontent.com/u/106507112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/duolax",
"html_url": "https://github.com/duolax",
"followers_url": "https://api.github.com/users/duolax/followers",
"following_url": "https://api.github.com/users/duolax/following{/other_user}",
"gists_url": "https://api.github.com/users/duolax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/duolax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/duolax/subscriptions",
"organizations_url": "https://api.github.com/users/duolax/orgs",
"repos_url": "https://api.github.com/users/duolax/repos",
"events_url": "https://api.github.com/users/duolax/events{/privacy}",
"received_events_url": "https://api.github.com/users/duolax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-22T17:04:45
| 2024-11-23T21:04:29
| 2024-11-23T21:04:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

When running the model, CPU usage is often maxed out while GPU usage stays low. However, on closer inspection it turns out that GPU memory is fully occupied during operation.
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.4.2
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7801/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7801/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2453
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2453/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2453/comments
|
https://api.github.com/repos/ollama/ollama/issues/2453/events
|
https://github.com/ollama/ollama/issues/2453
| 2,129,195,463
|
I_kwDOJ0Z1Ps5-6PHH
| 2,453
|
Add support for older AMD GPU gfx803, gfx802, gfx805 (e.g. Radeon RX 580, FirePro W7100)
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
},
{
"id": 7700262114,
"node_id": "LA_kwDOJ0Z1Ps8AAAAByvis4g",
"url": "https://api.github.com/repos/ollama/ollama/labels/build",
"name": "build",
"color": "006b75",
"default": false,
"description": "Issues relating to building ollama from source"
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 148
| 2024-02-11T22:15:37
| 2025-01-30T02:24:21
| null |
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Officially, ROCm no longer supports these cards, but it looks like other projects have found workarounds. Let's explore whether that's possible. Best case, support is built in to our binaries. The fallback, if that's not plausible, is to document how to build from source with the appropriate older ROCm library and AMD drivers installed on your system, producing a local binary that works.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2453/reactions",
"total_count": 53,
"+1": 40,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 8,
"rocket": 5,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2453/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/602
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/602/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/602/comments
|
https://api.github.com/repos/ollama/ollama/issues/602/events
|
https://github.com/ollama/ollama/issues/602
| 1,912,783,537
|
I_kwDOJ0Z1Ps5yAsKx
| 602
|
Docker image looks for `libcudart.so.12` but doesn't find it
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-09-26T06:34:39
| 2023-10-26T06:28:44
| 2023-09-26T06:52:33
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The subprocess `server` looks for `libcudart.so.12` but can't find it; it seems only `libcudart.so.12.0` is packaged.
```
error while loading shared libraries: libcudart.so.12: cannot open shared object file: No such file or directory
```
When looking at the temporary subprocess directory:
```
ls -al /tmp/ollama3423059418/llama.cpp/ggml/build/cuda/bin/
drwxr-xr-x 2 ollama ollama 4096 Sep 26 06:27 .
drwxr-xr-x 3 ollama ollama 4096 Sep 26 06:27 ..
-rwxr-xr-x 1 ollama ollama 107473968 Sep 26 06:27 libcublas.so.12
-rwxr-xr-x 1 ollama ollama 515090264 Sep 26 06:27 libcublasLt.so.12
-rwxr-xr-x 1 ollama ollama 687456 Sep 26 06:27 libcudart.so.12.0
-rwxr-xr-x 1 ollama ollama 5471600 Sep 26 06:27 server
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/602/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/602/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8006
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8006/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8006/comments
|
https://api.github.com/repos/ollama/ollama/issues/8006/events
|
https://github.com/ollama/ollama/issues/8006
| 2,725,900,724
|
I_kwDOJ0Z1Ps6iefG0
| 8,006
|
ollama 0.5.1: running llama3.3:70b-instruct-q8_0 fails with "client connection closed before server finished loading, aborting load"
|
{
"login": "javaice007",
"id": 71003629,
"node_id": "MDQ6VXNlcjcxMDAzNjI5",
"avatar_url": "https://avatars.githubusercontent.com/u/71003629?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/javaice007",
"html_url": "https://github.com/javaice007",
"followers_url": "https://api.github.com/users/javaice007/followers",
"following_url": "https://api.github.com/users/javaice007/following{/other_user}",
"gists_url": "https://api.github.com/users/javaice007/gists{/gist_id}",
"starred_url": "https://api.github.com/users/javaice007/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/javaice007/subscriptions",
"organizations_url": "https://api.github.com/users/javaice007/orgs",
"repos_url": "https://api.github.com/users/javaice007/repos",
"events_url": "https://api.github.com/users/javaice007/events{/privacy}",
"received_events_url": "https://api.github.com/users/javaice007/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-09T03:44:49
| 2024-12-23T08:08:49
| 2024-12-23T08:08:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ggml_cuda_init: found 4 CUDA devices:
Device 0: NVIDIA GeForce RTX 4090, compute capability 8.9, VMM: yes
Device 1: NVIDIA GeForce RTX 4090, compute capability 8.9, VMM: yes
Device 2: NVIDIA GeForce RTX 4090, compute capability 8.9, VMM: yes
Device 3: NVIDIA GeForce RTX 4090, compute capability 8.9, VMM: yes
llm_load_tensors: ggml ctx size = 1.69 MiB
time=2024-12-09T03:41:20.513Z level=WARN source=server.go:583 msg="client connection closed before server finished loading, aborting load"
time=2024-12-09T03:41:20.513Z level=ERROR source=sched.go:455 msg="error loading llama server" error="timed out waiting for llama runner to start: context canceled"
[GIN] 2024/12/09 - 03:41:20 | 499 | 1m30s | 10.1.1.116 | POST "/v1/chat/completions"
time=2024-12-09T03:41:25.878Z level=WARN source=sched.go:646 msg="gpu VRAM usage didn't recover within timeout" seconds=5.364338764 model=/data/models/blobs/sha256-4a8a92e57c0f847acc18d361504c13020c7578c34e3444c4c04f70fb601fb058
time=2024-12-09T03:41:26.518Z level=WARN source=sched.go:646 msg="gpu VRAM usage didn't recover within timeout" seconds=6.004755322 model=/data/models/blobs/sha256-4a8a92e57c0f847acc18d361504c13020c7578c34e3444c4c04f70fb601fb058
time=2024-12-09T03:41:27.158Z level=WARN source=sched.go:646 msg="gpu VRAM usage didn't recover within timeout" seconds=6.644676283 model=/data/models/blobs/sha256-4a8a92e57c0f847acc18d361504c13020c7578c34e3444c4c04f70fb601fb058
### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.1
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8006/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8006/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6567
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6567/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6567/comments
|
https://api.github.com/repos/ollama/ollama/issues/6567/events
|
https://github.com/ollama/ollama/issues/6567
| 2,497,517,747
|
I_kwDOJ0Z1Ps6U3Riz
| 6,567
|
Improve error reporting with old or missing AMD driver on windows (unable to load amdhip64_6.dll)
|
{
"login": "Jiefei-Wang",
"id": 13570205,
"node_id": "MDQ6VXNlcjEzNTcwMjA1",
"avatar_url": "https://avatars.githubusercontent.com/u/13570205?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jiefei-Wang",
"html_url": "https://github.com/Jiefei-Wang",
"followers_url": "https://api.github.com/users/Jiefei-Wang/followers",
"following_url": "https://api.github.com/users/Jiefei-Wang/following{/other_user}",
"gists_url": "https://api.github.com/users/Jiefei-Wang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jiefei-Wang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jiefei-Wang/subscriptions",
"organizations_url": "https://api.github.com/users/Jiefei-Wang/orgs",
"repos_url": "https://api.github.com/users/Jiefei-Wang/repos",
"events_url": "https://api.github.com/users/Jiefei-Wang/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jiefei-Wang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-08-30T15:23:15
| 2024-09-24T17:23:40
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I was trying to solve an issue that prevented me from using an AMD GPU:
```
time=2024-08-30T09:43:00.852-05:00 level=DEBUG source=amd_windows.go:33 msg="unable to load amdhip64_6.dll,
please make sure to upgrade to the latest amd driver: The specified module could not be found."
```
Compared with the messages logged while searching for Nvidia libraries, the AMD message is minimal. Here is the Nvidia output:
```
time=2024-08-30T09:43:00.768-05:00 level=DEBUG source=gpu.go:469 msg="Searching for GPU library" name=cudart64_*.dll
time=2024-08-30T09:43:00.768-05:00 level=DEBUG source=gpu.go:488 msg="gpu library search" globs="[C:\\Program Files\\NVIDIA GPU Computing Toolkit\\CUDA\\v12.2\\bin\\cudart64_*.dll* ..........]"
```
As you can see, ollama does not say where it tried to find `amdhip64_6`. https://github.com/ROCm/ROCm/issues/3418#issuecomment-2253379050 says the DLL should be in `C:\Windows\system32\amdhip64_6.dll`, which is not the case for me, as my DLL is in `C:\Program Files\AMD\ROCm\6.1\bin`. It turns out that ollama uses the `PATH` environment variable to search for the DLL, which is not clear from the console debug output alone. To help future users, I'd like to suggest the following improvements:
1. Print out the search path when searching for AMD GPU (like the Nvidia log above)
2. Suppress Nvidia message when `CUDA_VISIBLE_DEVICES=-1` and AMD message when `HIP_VISIBLE_DEVICES=-1` to simplify the log (there is no need to print out the search path if one does not need Nvidia or AMD driver)
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6567/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6567/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/864
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/864/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/864/comments
|
https://api.github.com/repos/ollama/ollama/issues/864/events
|
https://github.com/ollama/ollama/pull/864
| 1,955,157,484
|
PR_kwDOJ0Z1Ps5dbapt
| 864
|
update runtime options
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-21T00:24:42
| 2023-10-21T01:17:15
| 2023-10-21T01:17:14
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/864",
"html_url": "https://github.com/ollama/ollama/pull/864",
"diff_url": "https://github.com/ollama/ollama/pull/864.diff",
"patch_url": "https://github.com/ollama/ollama/pull/864.patch",
"merged_at": "2023-10-21T01:17:14"
}
|
SetOptions doesn't differentiate between boot and runtime options, but that's not relevant: by the time SetOptions is called, the LLM is already running, so any boot option updates will be ignored.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/864/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/864/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5628
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5628/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5628/comments
|
https://api.github.com/repos/ollama/ollama/issues/5628/events
|
https://github.com/ollama/ollama/issues/5628
| 2,402,574,272
|
I_kwDOJ0Z1Ps6PNF_A
| 5,628
|
llama runner process has terminated
|
{
"login": "gsm1258",
"id": 58941791,
"node_id": "MDQ6VXNlcjU4OTQxNzkx",
"avatar_url": "https://avatars.githubusercontent.com/u/58941791?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gsm1258",
"html_url": "https://github.com/gsm1258",
"followers_url": "https://api.github.com/users/gsm1258/followers",
"following_url": "https://api.github.com/users/gsm1258/following{/other_user}",
"gists_url": "https://api.github.com/users/gsm1258/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gsm1258/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gsm1258/subscriptions",
"organizations_url": "https://api.github.com/users/gsm1258/orgs",
"repos_url": "https://api.github.com/users/gsm1258/repos",
"events_url": "https://api.github.com/users/gsm1258/events{/privacy}",
"received_events_url": "https://api.github.com/users/gsm1258/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-11T08:24:08
| 2024-09-27T05:38:52
| 2024-07-11T08:25:40
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ollama run codegeex4:9b-all-q4_K_M
Error: llama runner process has terminated: exit status 0xc0000409 error:failed to create context with model 'C:\Users\ChatAI\.ollama\models\blobs\sha256-916173d92319f80a29db51321f7cb3441f326bac95c421de809e1d08eaaeb693'
### OS
Windows
### GPU
Other
### CPU
Intel
### Ollama version
0.2.1
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5628/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5628/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2198
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2198/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2198/comments
|
https://api.github.com/repos/ollama/ollama/issues/2198/events
|
https://github.com/ollama/ollama/issues/2198
| 2,101,577,070
|
I_kwDOJ0Z1Ps59Q4Vu
| 2,198
|
ollama serve crashes with SIGSEGV
|
{
"login": "hardik124",
"id": 6649948,
"node_id": "MDQ6VXNlcjY2NDk5NDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/6649948?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hardik124",
"html_url": "https://github.com/hardik124",
"followers_url": "https://api.github.com/users/hardik124/followers",
"following_url": "https://api.github.com/users/hardik124/following{/other_user}",
"gists_url": "https://api.github.com/users/hardik124/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hardik124/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hardik124/subscriptions",
"organizations_url": "https://api.github.com/users/hardik124/orgs",
"repos_url": "https://api.github.com/users/hardik124/repos",
"events_url": "https://api.github.com/users/hardik124/events{/privacy}",
"received_events_url": "https://api.github.com/users/hardik124/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2024-01-26T04:25:29
| 2024-01-26T17:30:14
| 2024-01-26T17:30:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I installed ollama using the one-liner, and every time I try to run `ollama serve`, I get the following error:
hardik@pop-os:~/Downloads$ ollama serve
2024/01/26 09:54:31 images.go:857: INFO total blobs: 0
2024/01/26 09:54:31 images.go:864: INFO total unused blobs removed: 0
2024/01/26 09:54:31 routes.go:950: INFO Listening on 127.0.0.1:11434 (version 0.1.21)
2024/01/26 09:54:31 payload_common.go:106: INFO Extracting dynamic libraries...
2024/01/26 09:54:34 payload_common.go:145: INFO Dynamic LLM libraries [rocm_v5 cpu_avx cpu cuda_v11 rocm_v6 cpu_avx2]
2024/01/26 09:54:34 gpu.go:93: INFO Detecting GPU type
2024/01/26 09:54:34 gpu.go:212: INFO Searching for GPU management library libnvidia-ml.so
2024/01/26 09:54:34 gpu.go:258: INFO Discovered GPU libraries: [/usr/lib/x86_64-linux-gnu/libnvidia-ml.so.545.29.06]
SIGSEGV: segmentation violation
PC=0x7180ec649a70 m=17 sigcode=1
signal arrived during cgo execution
goroutine 1 [syscall]:
runtime.cgocall(0x9b6eb0, 0xc0000e78a8)
/usr/local/go/src/runtime/cgocall.go:157 +0x4b fp=0xc0000e7880 sp=0xc0000e7848 pc=0x409b0b
github.com/jmorganca/ollama/gpu._Cfunc_cuda_init(0x7180f4000b70, 0xc000490500)
_cgo_gotypes.go:248 +0x3f fp=0xc0000e78a8 sp=0xc0000e7880 pc=0x7b9cdf
github.com/jmorganca/ollama/gpu.LoadCUDAMgmt.func2(0xc0000361d0?, 0x33?)
/go/src/github.com/jmorganca/ollama/gpu/gpu.go:268 +0x4a fp=0xc0000e78e8 sp=0xc0000e78a8 pc=0x7bbaca
github.com/jmorganca/ollama/gpu.LoadCUDAMgmt({0xc000036020, 0x1, 0xc0000d4370?})
/go/src/github.com/jmorganca/ollama/gpu/gpu.go:268 +0x1b8 fp=0xc0000e7988 sp=0xc0000e78e8 pc=0x7bb998
github.com/jmorganca/ollama/gpu.initGPUHandles()
/go/src/github.com/jmorganca/ollama/gpu/gpu.go:96 +0xd1 fp=0xc0000e79f0 sp=0xc0000e7988 pc=0x7ba131
github.com/jmorganca/ollama/gpu.GetGPUInfo()
/go/src/github.com/jmorganca/ollama/gpu/gpu.go:121 +0xb5 fp=0xc0000e7b00 sp=0xc0000e79f0 pc=0x7ba2f5
github.com/jmorganca/ollama/gpu.CheckVRAM()
/go/src/github.com/jmorganca/ollama/gpu/gpu.go:194 +0x1f fp=0xc0000e7ba8 sp=0xc0000e7b00 pc=0x7bafdf
github.com/jmorganca/ollama/server.Serve({0x106c11d0, 0xc0004615a0})
/go/src/github.com/jmorganca/ollama/server/routes.go:972 +0x453 fp=0xc0000e7c98 sp=0xc0000e7ba8 pc=0x99b513
github.com/jmorganca/ollama/cmd.RunServer(0xc00048e300?, {0x10b06800?, 0x4?, 0xad25c1?})
/go/src/github.com/jmorganca/ollama/cmd/cmd.go:692 +0x199 fp=0xc0000e7d30 sp=0xc0000e7c98 pc=0x9ad9f9
github.com/spf13/cobra.(*Command).execute(0xc000463800, {0x10b06800, 0x0, 0x0})
/root/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:940 +0x87c fp=0xc0000e7e68 sp=0xc0000e7d30 pc=0x7641dc
github.com/spf13/cobra.(*Command).ExecuteC(0xc000462c00)
/root/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5 fp=0xc0000e7f20 sp=0xc0000e7e68 pc=0x764a05
github.com/spf13/cobra.(*Command).Execute(...)
/root/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(...)
/root/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:985
main.main()
/go/src/github.com/jmorganca/ollama/main.go:11 +0x4d fp=0xc0000e7f40 sp=0xc0000e7f20 pc=0x9b5a2d
runtime.main()
/usr/local/go/src/runtime/proc.go:267 +0x2bb fp=0xc0000e7fe0 sp=0xc0000e7f40 pc=0x43e25b
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000e7fe8 sp=0xc0000e7fe0 pc=0x46e0a1
goroutine 2 [force gc (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000076fa8 sp=0xc000076f88 pc=0x43e6ae
runtime.goparkunlock(...)
/usr/local/go/src/runtime/proc.go:404
runtime.forcegchelper()
/usr/local/go/src/runtime/proc.go:322 +0xb3 fp=0xc000076fe0 sp=0xc000076fa8 pc=0x43e533
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000076fe8 sp=0xc000076fe0 pc=0x46e0a1
created by runtime.init.6 in goroutine 1
/usr/local/go/src/runtime/proc.go:310 +0x1a
goroutine 3 [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000077778 sp=0xc000077758 pc=0x43e6ae
runtime.goparkunlock(...)
/usr/local/go/src/runtime/proc.go:404
runtime.bgsweep(0x0?)
/usr/local/go/src/runtime/mgcsweep.go:321 +0xdf fp=0xc0000777c8 sp=0xc000077778 pc=0x42a5ff
runtime.gcenable.func1()
/usr/local/go/src/runtime/mgc.go:200 +0x25 fp=0xc0000777e0 sp=0xc0000777c8 pc=0x41f725
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000777e8 sp=0xc0000777e0 pc=0x46e0a1
created by runtime.gcenable in goroutine 1
/usr/local/go/src/runtime/mgc.go:200 +0x66
goroutine 4 [GC scavenge wait]:
runtime.gopark(0x7ce6fb?, 0x6f7fe8?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000077f70 sp=0xc000077f50 pc=0x43e6ae
runtime.goparkunlock(...)
/usr/local/go/src/runtime/proc.go:404
runtime.(*scavengerState).park(0x10ad6b80)
/usr/local/go/src/runtime/mgcscavenge.go:425 +0x49 fp=0xc000077fa0 sp=0xc000077f70 pc=0x427e29
runtime.bgscavenge(0x0?)
/usr/local/go/src/runtime/mgcscavenge.go:658 +0x59 fp=0xc000077fc8 sp=0xc000077fa0 pc=0x4283d9
runtime.gcenable.func2()
/usr/local/go/src/runtime/mgc.go:201 +0x25 fp=0xc000077fe0 sp=0xc000077fc8 pc=0x41f6c5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000077fe8 sp=0xc000077fe0 pc=0x46e0a1
created by runtime.gcenable in goroutine 1
/usr/local/go/src/runtime/mgc.go:201 +0xa5
goroutine 5 [finalizer wait]:
runtime.gopark(0xacb580?, 0x10043f801?, 0x0?, 0x0?, 0x446865?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000076628 sp=0xc000076608 pc=0x43e6ae
runtime.runfinq()
/usr/local/go/src/runtime/mfinal.go:193 +0x107 fp=0xc0000767e0 sp=0xc000076628 pc=0x41e7a7
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000767e8 sp=0xc0000767e0 pc=0x46e0a1
created by runtime.createfing in goroutine 1
/usr/local/go/src/runtime/mfinal.go:163 +0x3d
goroutine 6 [select, locked to thread]:
runtime.gopark(0xc0000787a8?, 0x2?, 0x49?, 0xe9?, 0xc0000787a4?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000078638 sp=0xc000078618 pc=0x43e6ae
runtime.selectgo(0xc0000787a8, 0xc0000787a0, 0x0?, 0x0, 0x0?, 0x1)
/usr/local/go/src/runtime/select.go:327 +0x725 fp=0xc000078758 sp=0xc000078638 pc=0x44e1e5
runtime.ensureSigM.func1()
/usr/local/go/src/runtime/signal_unix.go:1014 +0x19f fp=0xc0000787e0 sp=0xc000078758 pc=0x46521f
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000787e8 sp=0xc0000787e0 pc=0x46e0a1
created by runtime.ensureSigM in goroutine 1
/usr/local/go/src/runtime/signal_unix.go:997 +0xc8
goroutine 18 [syscall]:
runtime.notetsleepg(0x0?, 0x0?)
/usr/local/go/src/runtime/lock_futex.go:236 +0x29 fp=0xc0000727a0 sp=0xc000072768 pc=0x411209
os/signal.signal_recv()
/usr/local/go/src/runtime/sigqueue.go:152 +0x29 fp=0xc0000727c0 sp=0xc0000727a0 pc=0x46aa69
os/signal.loop()
/usr/local/go/src/os/signal/signal_unix.go:23 +0x13 fp=0xc0000727e0 sp=0xc0000727c0 pc=0x6f3dd3
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000727e8 sp=0xc0000727e0 pc=0x46e0a1
created by os/signal.Notify.func1.1 in goroutine 1
/usr/local/go/src/os/signal/signal.go:151 +0x1f
goroutine 19 [chan receive]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000072f18 sp=0xc000072ef8 pc=0x43e6ae
runtime.chanrecv(0xc0001ad2c0, 0x0, 0x1)
/usr/local/go/src/runtime/chan.go:583 +0x3cd fp=0xc000072f90 sp=0xc000072f18 pc=0x40beed
runtime.chanrecv1(0x0?, 0x0?)
/usr/local/go/src/runtime/chan.go:442 +0x12 fp=0xc000072fb8 sp=0xc000072f90 pc=0x40baf2
github.com/jmorganca/ollama/server.Serve.func1()
/go/src/github.com/jmorganca/ollama/server/routes.go:959 +0x25 fp=0xc000072fe0 sp=0xc000072fb8 pc=0x99b5e5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000072fe8 sp=0xc000072fe0 pc=0x46e0a1
created by github.com/jmorganca/ollama/server.Serve in goroutine 1
/go/src/github.com/jmorganca/ollama/server/routes.go:958 +0x3f6
goroutine 20 [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000073750 sp=0xc000073730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0000737e0 sp=0xc000073750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000737e8 sp=0xc0000737e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 21 [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000073f50 sp=0xc000073f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000073fe0 sp=0xc000073f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000073fe8 sp=0xc000073fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 34 [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000514750 sp=0xc000514730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0005147e0 sp=0xc000514750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0005147e8 sp=0xc0005147e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 7 [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000078f50 sp=0xc000078f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000078fe0 sp=0xc000078f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000078fe8 sp=0xc000078fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 22 [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000074750 sp=0xc000074730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0000747e0 sp=0xc000074750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000747e8 sp=0xc0000747e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 23 [GC worker (idle)]:
runtime.gopark(0x63fd946caf31?, 0x1?, 0x1e?, 0x50?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000074f50 sp=0xc000074f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000074fe0 sp=0xc000074f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000074fe8 sp=0xc000074fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 24 [GC worker (idle)]:
runtime.gopark(0x63fd946ce20b?, 0x1?, 0x2d?, 0xf?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000075750 sp=0xc000075730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0000757e0 sp=0xc000075750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000757e8 sp=0xc0000757e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 35 [GC worker (idle)]:
runtime.gopark(0x63fd7654c30c?, 0x3?, 0x26?, 0x9?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000514f50 sp=0xc000514f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000514fe0 sp=0xc000514f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000514fe8 sp=0xc000514fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 36 [GC worker (idle)]:
runtime.gopark(0x63fd946c659e?, 0x3?, 0x69?, 0x3?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000515750 sp=0xc000515730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0005157e0 sp=0xc000515750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0005157e8 sp=0xc0005157e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 37 [GC worker (idle)]:
runtime.gopark(0x63fd946ce5b0?, 0x1?, 0xd0?, 0xfb?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000515f50 sp=0xc000515f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000515fe0 sp=0xc000515f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000515fe8 sp=0xc000515fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 38 [GC worker (idle)]:
runtime.gopark(0x63fd946ccb86?, 0x1?, 0x90?, 0xf5?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000516750 sp=0xc000516730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0005167e0 sp=0xc000516750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0005167e8 sp=0xc0005167e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 25 [GC worker (idle)]:
runtime.gopark(0x63fd5c3437c7?, 0x3?, 0x15?, 0x30?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000075f50 sp=0xc000075f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000075fe0 sp=0xc000075f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000075fe8 sp=0xc000075fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 26 [GC worker (idle)]:
runtime.gopark(0x63fd7654c109?, 0x1?, 0x34?, 0x7d?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000510750 sp=0xc000510730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0005107e0 sp=0xc000510750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0005107e8 sp=0xc0005107e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 27 [GC worker (idle)]:
runtime.gopark(0x63fd946d1ce4?, 0x3?, 0xcb?, 0x32?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000510f50 sp=0xc000510f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000510fe0 sp=0xc000510f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000510fe8 sp=0xc000510fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 28 [GC worker (idle)]:
runtime.gopark(0x63fd7654c427?, 0x1?, 0x54?, 0xd5?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000511750 sp=0xc000511730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0005117e0 sp=0xc000511750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0005117e8 sp=0xc0005117e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 29 [GC worker (idle)]:
runtime.gopark(0x63fd946ca944?, 0x1?, 0xd9?, 0x3f?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000511f50 sp=0xc000511f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000511fe0 sp=0xc000511f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000511fe8 sp=0xc000511fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 30 [GC worker (idle)]:
runtime.gopark(0x10b08520?, 0x1?, 0x10?, 0x2e?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000512750 sp=0xc000512730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0005127e0 sp=0xc000512750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0005127e8 sp=0xc0005127e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 31 [GC worker (idle)]:
runtime.gopark(0x10b08520?, 0x1?, 0x70?, 0x37?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000512f50 sp=0xc000512f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000512fe0 sp=0xc000512f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000512fe8 sp=0xc000512fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 32 [GC worker (idle)]:
runtime.gopark(0x63fd946c66ea?, 0x3?, 0x80?, 0x40?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000513750 sp=0xc000513730 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0005137e0 sp=0xc000513750 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0005137e8 sp=0xc0005137e0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
goroutine 33 [GC worker (idle)]:
runtime.gopark(0x63fd946c71ed?, 0x1?, 0xe1?, 0x2e?, 0x0?)
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000513f50 sp=0xc000513f30 pc=0x43e6ae
runtime.gcBgMarkWorker()
/usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000513fe0 sp=0xc000513f50 pc=0x4212a5
runtime.goexit()
/usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000513fe8 sp=0xc000513fe0 pc=0x46e0a1
created by runtime.gcBgMarkStartWorkers in goroutine 1
/usr/local/go/src/runtime/mgc.go:1217 +0x1c
rax 0x7180f4000bf0
rbx 0xc000490500
rcx 0x7180f4000030
rdx 0x1a
rdi 0x71810cff8b60
rsi 0x100
rbp 0x71810cff8d80
rsp 0x71810cff8b58
r8 0x0
r9 0x7180f4000bf0
r10 0x7180f40004b0
r11 0x7180f4000090
r12 0x9
r13 0x71810cff8d50
r14 0x71810cff8b60
r15 0x0
rip 0x7180ec649a70
rflags 0x10206
cs 0x33
fs 0x0
gs 0x0
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2198/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2198/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7404
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7404/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7404/comments
|
https://api.github.com/repos/ollama/ollama/issues/7404/events
|
https://github.com/ollama/ollama/pull/7404
| 2,619,360,752
|
PR_kwDOJ0Z1Ps6AJj1k
| 7,404
|
fix multiple image inputs
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-28T19:33:20
| 2024-10-29T23:18:54
| 2024-10-29T23:18:52
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7404",
"html_url": "https://github.com/ollama/ollama/pull/7404",
"diff_url": "https://github.com/ollama/ollama/pull/7404.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7404.patch",
"merged_at": "2024-10-29T23:18:52"
}
|
* zero out inp_embd when setting cross attention states
* set cross attention if _any_ inputs contains image tokens, not just the new inputs
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7404/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7404/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5283
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5283/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5283/comments
|
https://api.github.com/repos/ollama/ollama/issues/5283/events
|
https://github.com/ollama/ollama/issues/5283
| 2,373,667,986
|
I_kwDOJ0Z1Ps6Ne0yS
| 5,283
|
fix: enforce file path format on FROM
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-06-25T21:10:44
| 2024-06-25T21:18:56
| 2024-06-25T21:18:56
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently, we can't tell if `FROM qwen-2` is a reference to a local file or a model from our library. The idea is to use the format `FROM ./qwen-2` to explicitly say that `qwen-2` is a local file.
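A minimal sketch of the proposed disambiguation (the `./` prefix is the point; the file name `qwen-2.gguf` is illustrative):

```
# Resolves to the model named qwen-2 in the library
FROM qwen-2

# Resolves to a local file, relative to the Modelfile
FROM ./qwen-2.gguf
```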
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5283/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5283/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3496
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3496/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3496/comments
|
https://api.github.com/repos/ollama/ollama/issues/3496/events
|
https://github.com/ollama/ollama/pull/3496
| 2,226,516,966
|
PR_kwDOJ0Z1Ps5rw0nX
| 3,496
|
add command-r graph estimate
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-04T21:07:41
| 2024-04-05T19:26:22
| 2024-04-05T19:26:22
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3496",
"html_url": "https://github.com/ollama/ollama/pull/3496",
"diff_url": "https://github.com/ollama/ollama/pull/3496.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3496.patch",
"merged_at": "2024-04-05T19:26:22"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3496/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3496/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8107
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8107/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8107/comments
|
https://api.github.com/repos/ollama/ollama/issues/8107/events
|
https://github.com/ollama/ollama/issues/8107
| 2,740,279,189
|
I_kwDOJ0Z1Ps6jVVeV
| 8,107
|
version set incorrectly on local build without primary repo remote and tags - results in pull failure for newer models
|
{
"login": "sammcj",
"id": 862951,
"node_id": "MDQ6VXNlcjg2Mjk1MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/862951?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sammcj",
"html_url": "https://github.com/sammcj",
"followers_url": "https://api.github.com/users/sammcj/followers",
"following_url": "https://api.github.com/users/sammcj/following{/other_user}",
"gists_url": "https://api.github.com/users/sammcj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sammcj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sammcj/subscriptions",
"organizations_url": "https://api.github.com/users/sammcj/orgs",
"repos_url": "https://api.github.com/users/sammcj/repos",
"events_url": "https://api.github.com/users/sammcj/events{/privacy}",
"received_events_url": "https://api.github.com/users/sammcj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 7700262114,
"node_id": "LA_kwDOJ0Z1Ps8AAAAByvis4g",
"url": "https://api.github.com/repos/ollama/ollama/labels/build",
"name": "build",
"color": "006b75",
"default": false,
"description": "Issues relating to building ollama from source"
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-12-15T04:55:04
| 2024-12-16T19:36:02
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Following the [docs](https://github.com/ollama/ollama/blob/main/docs/development.md#macos) to build Ollama from source and then trying to pull llama3.1 results in a refusal stating that the Ollama version is too old.
```shell
$ git checkout feature/my-feature-in-development
$ make -j $(expr $(nproc) / 2)
...
$ ollama pull llama3.1:8b-instruct-q6_K
pulling manifest
Error: pull model manifest: 412:
The model you are attempting to pull requires a newer version of Ollama.
Please download the latest version at:
https://ollama.com/download
$ ollama -v
ollama version is 4d52e5d-dirty
```
### OS
Linux, macOS
### GPU
Nvidia, Apple
### CPU
AMD, Apple
### Ollama version
4d52e5d-dirty
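A possible workaround until the build stamps a release-style version (assuming, as a sketch, that the version string lives in the repo's `version` package and is settable via Go's `-ldflags -X`, which is how release builds typically inject it):

```shell
# Build with an explicit semantic version instead of the git-describe fallback;
# the package path and variable name here are assumptions, not verified.
go build -ldflags "-X=github.com/ollama/ollama/version.Version=0.5.1" .
```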
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8107/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8107/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4892
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4892/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4892/comments
|
https://api.github.com/repos/ollama/ollama/issues/4892/events
|
https://github.com/ollama/ollama/issues/4892
| 2,339,494,693
|
I_kwDOJ0Z1Ps6Lcdsl
| 4,892
|
aya:35b-23-f16 error
|
{
"login": "lymanzhao",
"id": 8308505,
"node_id": "MDQ6VXNlcjgzMDg1MDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/8308505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lymanzhao",
"html_url": "https://github.com/lymanzhao",
"followers_url": "https://api.github.com/users/lymanzhao/followers",
"following_url": "https://api.github.com/users/lymanzhao/following{/other_user}",
"gists_url": "https://api.github.com/users/lymanzhao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lymanzhao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lymanzhao/subscriptions",
"organizations_url": "https://api.github.com/users/lymanzhao/orgs",
"repos_url": "https://api.github.com/users/lymanzhao/repos",
"events_url": "https://api.github.com/users/lymanzhao/events{/privacy}",
"received_events_url": "https://api.github.com/users/lymanzhao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-07T03:13:24
| 2024-06-09T17:29:21
| 2024-06-09T17:29:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ollama run aya:35b-23-f16
Error: llama runner process has terminated: signal: aborted (core dumped) error:failed to create context with model '/home/ais/.ollama/models/blobs/sha256-6f800f35270cacdcbdd43e2b96991c7974907b859ab0efa0a91b1523724b7e43'
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.38 and 0.1.41
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4892/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4892/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3684
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3684/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3684/comments
|
https://api.github.com/repos/ollama/ollama/issues/3684/events
|
https://github.com/ollama/ollama/pull/3684
| 2,246,931,470
|
PR_kwDOJ0Z1Ps5s2drM
| 3,684
|
scale graph based on gpu count
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-16T21:45:28
| 2024-04-16T21:57:10
| 2024-04-16T21:57:10
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3684",
"html_url": "https://github.com/ollama/ollama/pull/3684",
"diff_url": "https://github.com/ollama/ollama/pull/3684.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3684.patch",
"merged_at": "2024-04-16T21:57:10"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3684/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3684/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8251
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8251/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8251/comments
|
https://api.github.com/repos/ollama/ollama/issues/8251/events
|
https://github.com/ollama/ollama/issues/8251
| 2,760,002,729
|
I_kwDOJ0Z1Ps6kgkyp
| 8,251
|
More API compatibility
|
{
"login": "ejgutierrez74",
"id": 11474846,
"node_id": "MDQ6VXNlcjExNDc0ODQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/11474846?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ejgutierrez74",
"html_url": "https://github.com/ejgutierrez74",
"followers_url": "https://api.github.com/users/ejgutierrez74/followers",
"following_url": "https://api.github.com/users/ejgutierrez74/following{/other_user}",
"gists_url": "https://api.github.com/users/ejgutierrez74/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ejgutierrez74/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ejgutierrez74/subscriptions",
"organizations_url": "https://api.github.com/users/ejgutierrez74/orgs",
"repos_url": "https://api.github.com/users/ejgutierrez74/repos",
"events_url": "https://api.github.com/users/ejgutierrez74/events{/privacy}",
"received_events_url": "https://api.github.com/users/ejgutierrez74/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-26T17:49:10
| 2025-01-13T02:58:34
| 2025-01-13T02:58:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ollama is already somewhat compatible with the OpenAI API.
It would be useful to add compatibility with more LLM APIs, such as Gemini, Mixtral, etc., either in the core of ollama or as plugins/extensions.
Thanks
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8251/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8251/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/38
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/38/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/38/comments
|
https://api.github.com/repos/ollama/ollama/issues/38/events
|
https://github.com/ollama/ollama/pull/38
| 1,790,116,237
|
PR_kwDOJ0Z1Ps5Uu-xj
| 38
|
add run parameters
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-07-05T19:03:40
| 2023-07-06T16:30:42
| 2023-07-06T16:30:38
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/38",
"html_url": "https://github.com/ollama/ollama/pull/38",
"diff_url": "https://github.com/ollama/ollama/pull/38.diff",
"patch_url": "https://github.com/ollama/ollama/pull/38.patch",
"merged_at": null
}
|
this allows users to change generation parameters on the fly
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/38/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/38/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4101
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4101/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4101/comments
|
https://api.github.com/repos/ollama/ollama/issues/4101/events
|
https://github.com/ollama/ollama/issues/4101
| 2,276,071,105
|
I_kwDOJ0Z1Ps6HqhbB
| 4,101
|
Support NVIDIAs Llama fine-tune (chatQA-1.5)
|
{
"login": "DuckyBlender",
"id": 42645784,
"node_id": "MDQ6VXNlcjQyNjQ1Nzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/42645784?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DuckyBlender",
"html_url": "https://github.com/DuckyBlender",
"followers_url": "https://api.github.com/users/DuckyBlender/followers",
"following_url": "https://api.github.com/users/DuckyBlender/following{/other_user}",
"gists_url": "https://api.github.com/users/DuckyBlender/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DuckyBlender/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DuckyBlender/subscriptions",
"organizations_url": "https://api.github.com/users/DuckyBlender/orgs",
"repos_url": "https://api.github.com/users/DuckyBlender/repos",
"events_url": "https://api.github.com/users/DuckyBlender/events{/privacy}",
"received_events_url": "https://api.github.com/users/DuckyBlender/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 26
| 2024-05-02T17:16:41
| 2024-05-11T06:46:16
| 2024-05-11T06:46:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "DuckyBlender",
"id": 42645784,
"node_id": "MDQ6VXNlcjQyNjQ1Nzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/42645784?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DuckyBlender",
"html_url": "https://github.com/DuckyBlender",
"followers_url": "https://api.github.com/users/DuckyBlender/followers",
"following_url": "https://api.github.com/users/DuckyBlender/following{/other_user}",
"gists_url": "https://api.github.com/users/DuckyBlender/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DuckyBlender/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DuckyBlender/subscriptions",
"organizations_url": "https://api.github.com/users/DuckyBlender/orgs",
"repos_url": "https://api.github.com/users/DuckyBlender/repos",
"events_url": "https://api.github.com/users/DuckyBlender/events{/privacy}",
"received_events_url": "https://api.github.com/users/DuckyBlender/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4101/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4101/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5332
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5332/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5332/comments
|
https://api.github.com/repos/ollama/ollama/issues/5332/events
|
https://github.com/ollama/ollama/issues/5332
| 2,378,596,474
|
I_kwDOJ0Z1Ps6NxoB6
| 5,332
|
Ollama does not work with some models.
|
{
"login": "grafsoft",
"id": 5301936,
"node_id": "MDQ6VXNlcjUzMDE5MzY=",
"avatar_url": "https://avatars.githubusercontent.com/u/5301936?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/grafsoft",
"html_url": "https://github.com/grafsoft",
"followers_url": "https://api.github.com/users/grafsoft/followers",
"following_url": "https://api.github.com/users/grafsoft/following{/other_user}",
"gists_url": "https://api.github.com/users/grafsoft/gists{/gist_id}",
"starred_url": "https://api.github.com/users/grafsoft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/grafsoft/subscriptions",
"organizations_url": "https://api.github.com/users/grafsoft/orgs",
"repos_url": "https://api.github.com/users/grafsoft/repos",
"events_url": "https://api.github.com/users/grafsoft/events{/privacy}",
"received_events_url": "https://api.github.com/users/grafsoft/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-06-27T16:23:35
| 2024-07-02T21:16:47
| 2024-07-02T21:16:47
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
When I want to run phi3 I get:
Error: llama runner process no longer running: -1
I installed the newest version with:
curl -fsSL https://ollama.com/install.sh | sh
When I try `ollama --version` I get:
ollama version is 0.0.0
Warning: client version is 0.1.47
OS: Ubuntu 22.04.4 LTS
Would it help to uninstall and reinstall? Is there anything to consider when doing that?
Best
Peter
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5332/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5332/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6797
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6797/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6797/comments
|
https://api.github.com/repos/ollama/ollama/issues/6797/events
|
https://github.com/ollama/ollama/issues/6797
| 2,525,722,763
|
I_kwDOJ0Z1Ps6Wi3iL
| 6,797
|
Add the ability to remove a parameter using a Modelfile
|
{
"login": "dpkirchner",
"id": 165134,
"node_id": "MDQ6VXNlcjE2NTEzNA==",
"avatar_url": "https://avatars.githubusercontent.com/u/165134?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dpkirchner",
"html_url": "https://github.com/dpkirchner",
"followers_url": "https://api.github.com/users/dpkirchner/followers",
"following_url": "https://api.github.com/users/dpkirchner/following{/other_user}",
"gists_url": "https://api.github.com/users/dpkirchner/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dpkirchner/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpkirchner/subscriptions",
"organizations_url": "https://api.github.com/users/dpkirchner/orgs",
"repos_url": "https://api.github.com/users/dpkirchner/repos",
"events_url": "https://api.github.com/users/dpkirchner/events{/privacy}",
"received_events_url": "https://api.github.com/users/dpkirchner/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-09-13T21:24:11
| 2024-09-14T14:20:20
| 2024-09-14T14:20:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
There's a model I'm interested in using with ollama that specifies a parameter no longer supported by ollama (or maybe llama.cpp). I'd like to be able to create a replacement with a Modelfile that overrides the parameter by removing it entirely, if possible.
(The specific model is `codellama:7b-instruct` and the parameter is `rope_frequency_base`, for what it's worth, but this might be generally useful.)
|
{
"login": "dpkirchner",
"id": 165134,
"node_id": "MDQ6VXNlcjE2NTEzNA==",
"avatar_url": "https://avatars.githubusercontent.com/u/165134?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dpkirchner",
"html_url": "https://github.com/dpkirchner",
"followers_url": "https://api.github.com/users/dpkirchner/followers",
"following_url": "https://api.github.com/users/dpkirchner/following{/other_user}",
"gists_url": "https://api.github.com/users/dpkirchner/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dpkirchner/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpkirchner/subscriptions",
"organizations_url": "https://api.github.com/users/dpkirchner/orgs",
"repos_url": "https://api.github.com/users/dpkirchner/repos",
"events_url": "https://api.github.com/users/dpkirchner/events{/privacy}",
"received_events_url": "https://api.github.com/users/dpkirchner/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6797/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6797/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7931
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7931/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7931/comments
|
https://api.github.com/repos/ollama/ollama/issues/7931/events
|
https://github.com/ollama/ollama/issues/7931
| 2,717,331,078
|
I_kwDOJ0Z1Ps6h9y6G
| 7,931
|
Phi3 model starts responding with crazy things after thousands of calls.
|
{
"login": "TizDu",
"id": 50905781,
"node_id": "MDQ6VXNlcjUwOTA1Nzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/50905781?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TizDu",
"html_url": "https://github.com/TizDu",
"followers_url": "https://api.github.com/users/TizDu/followers",
"following_url": "https://api.github.com/users/TizDu/following{/other_user}",
"gists_url": "https://api.github.com/users/TizDu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TizDu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TizDu/subscriptions",
"organizations_url": "https://api.github.com/users/TizDu/orgs",
"repos_url": "https://api.github.com/users/TizDu/repos",
"events_url": "https://api.github.com/users/TizDu/events{/privacy}",
"received_events_url": "https://api.github.com/users/TizDu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-12-04T10:53:52
| 2024-12-23T08:04:49
| 2024-12-23T08:04:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am currently using Ollama server 0.4.1 and the phi3:3.8b model.
In my Python script I am using OllamaLLM via langchain_ollama.
For each user query (messages) I re-create the LLM with OllamaLLM and then call llm.invoke, to be sure there is no history.
Everything works well, but after some utterances (around 3000 queries) the model starts returning totally wrong results, no longer following the system prompt.
Any idea what could be wrong? Is there a limit with Phi3-mini or something like that?
I do not understand, since I am creating a fresh LLM for each query.
Here is a code snippet:
from langchain_ollama import OllamaLLM
llm = OllamaLLM(model="phi3:3.8b", format="json", temperature=0)
output = llm.invoke(messages, temperature=0, chat_history=[])
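If the client wrapper is suspected, the same stateless per-query pattern can be sketched against the raw HTTP API instead. `build_payload` is a hypothetical helper; the `/api/generate` field names below match the documented API, but should be verified against the Ollama version in use.

```python
def build_payload(model: str, prompt: str) -> dict:
    # A brand-new dict per call: nothing is carried over between queries.
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"temperature": 0},
        "context": [],  # explicitly empty context: no prior conversation state
    }


a = build_payload("phi3:3.8b", "first query")
b = build_payload("phi3:3.8b", "second query")
```

Each payload could then be POSTed to `http://localhost:11434/api/generate`; since every call sends an empty `context`, any drift after thousands of queries would point at the server side rather than accumulated history.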
### OS
Linux
### GPU
_No response_
### CPU
_No response_
### Ollama version
0.4.1
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7931/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7931/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/6762
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6762/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6762/comments
|
https://api.github.com/repos/ollama/ollama/issues/6762/events
|
https://github.com/ollama/ollama/pull/6762
| 2,520,629,970
|
PR_kwDOJ0Z1Ps57Nb8O
| 6,762
|
refactor show output
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-11T19:46:47
| 2024-09-11T21:58:42
| 2024-09-11T21:58:40
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6762",
"html_url": "https://github.com/ollama/ollama/pull/6762",
"diff_url": "https://github.com/ollama/ollama/pull/6762.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6762.patch",
"merged_at": "2024-09-11T21:58:40"
}
|
fixes line wrapping on long texts. the previous code made multiple passes through a tablewriter, breaking the intermediate outputs into lines and feeding them back into another table writer. since some fields are much longer than others, the column widths became inflated, causing everything to be padded with whitespace
this change fixes the root issue by using an individual tablewriter for each section, which allows each section to be rendered independently. long text in one section no longer impacts the width of unrelated tables. the output itself should be largely unchanged
```
$ ollama show maybe
Model
parameters 8.0B
quantization Q4_0
architecture llama
context length 131072
embedding length 4096
Parameters
stop "<|start_header_id|>"
stop "<|end_header_id|>"
stop "<|eot_id|>"
System
You are a world-class AI system, capable of complex reasoning and reflection. Reason through the
query inside <thinking> tags, and then provide your final response inside <output> tags. If you
detect that you made a mistake in your reasoning at any point, correct yourself inside <reflection>
tags.
License
LLAMA 3.1 COMMUNITY LICENSE AGREEMENT
Llama 3.1 Version Release Date: July 23, 2024
```
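The width effect described above can be illustrated in miniature (a Python sketch, not the actual Go implementation; `render` is a hypothetical stand-in for a tablewriter):

```python
def render(rows):
    """Pad each key to the widest key in *this* table only."""
    width = max(len(k) for k, _ in rows)
    return [f"{k.ljust(width)}  {v}" for k, v in rows]


model = [("parameters", "8.0B"), ("quantization", "Q4_0")]
params = [("stop", '"<|eot_id|>"')]

# One tablewriter per section: each section computes its own column width.
per_section = render(model) + render(params)

# One tablewriter for everything: the longest key anywhere ("quantization")
# inflates the key column for every row, including the short "stop" key.
combined = render(model + params)
```

In `per_section` the `stop` row is padded only to the width of its own section, while in `combined` it inherits the 12-character width of `quantization`, which is exactly the whitespace inflation the change removes.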
resolves #6740
resolves #6763
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6762/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6762/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7164
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7164/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7164/comments
|
https://api.github.com/repos/ollama/ollama/issues/7164/events
|
https://github.com/ollama/ollama/pull/7164
| 2,579,455,382
|
PR_kwDOJ0Z1Ps5-QNQi
| 7,164
|
Send all images in conversation history
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-10T17:39:19
| 2024-10-10T18:21:54
| 2024-10-10T18:21:51
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7164",
"html_url": "https://github.com/ollama/ollama/pull/7164",
"diff_url": "https://github.com/ollama/ollama/pull/7164.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7164.patch",
"merged_at": "2024-10-10T18:21:51"
}
|
@pdevine I'm not entirely sure what the history of this check was - it sounds like there were previously some models that didn't do well with images in the history. However, it worked well with the models that I tried it on.
Regardless, I don't think the check belongs in the CLI.
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7164/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7164/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1530
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1530/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1530/comments
|
https://api.github.com/repos/ollama/ollama/issues/1530/events
|
https://github.com/ollama/ollama/pull/1530
| 2,042,615,565
|
PR_kwDOJ0Z1Ps5iDTc6
| 1,530
|
send empty messages on last chat response
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-14T22:44:19
| 2023-12-18T19:23:39
| 2023-12-18T19:23:38
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1530",
"html_url": "https://github.com/ollama/ollama/pull/1530",
"diff_url": "https://github.com/ollama/ollama/pull/1530.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1530.patch",
"merged_at": "2023-12-18T19:23:38"
}
|
Send an empty message on the last chat response rather than omitting it. This makes the chat API match the generate API.
As of this change...
`/generate`
```
curl http://localhost:11434/api/generate -d '{
"model": "mistral"
}'
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Date: Thu, 14 Dec 2023 22:42:13 GMT
Content-Length: 88
Connection: close
{
"model": "mistral",
"created_at": "2023-12-14T22:42:13.365069Z",
"response": "",
"done": true
}
```
```
curl http://localhost:11434/api/generate -d '{
"model": "mistral",
"prompt": "reply with a single word"
}'
HTTP/1.1 200 OK
Content-Type: application/x-ndjson
Date: Thu, 14 Dec 2023 22:43:05 GMT
Connection: close
Transfer-Encoding: chunked
{"model":"mistral","created_at":"2023-12-14T22:43:05.60168Z","response":" Okay","done":false}
{"model":"mistral","created_at":"2023-12-14T22:43:05.616406Z","response":".","done":false}
{"model":"mistral","created_at":"2023-12-14T22:43:05.631163Z","response":"","done":true,"context":[733,16289,28793,28705,10071,395,264,2692,1707,733,28748,16289,28793,19811,28723],"total_duration":414743625,"load_duration":760500,"prompt_eval_count":14,"prompt_eval_duration":393924000,"eval_count":2,"eval_duration":14727000}
```
`/chat`
```
curl http://localhost:11434/api/chat -d '{
"model": "mistral"
}'
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Date: Thu, 14 Dec 2023 22:41:56 GMT
Content-Length: 132
Connection: close
{
"model": "mistral",
"created_at": "2023-12-14T22:41:56.540246Z",
"message": {
"role": "assistant",
"content": "",
"images": null
},
"done": true
}
```
```
curl http://localhost:11434/api/chat -d '{
"model": "mistral",
"messages": [
{
"role": "user",
"content": "reply with one word"
}
]
}'
HTTP/1.1 200 OK
Content-Type: application/x-ndjson
Date: Thu, 14 Dec 2023 22:38:36 GMT
Connection: close
Transfer-Encoding: chunked
{"model":"mistral","created_at":"2023-12-14T22:38:36.625168Z","message":{"role":"assistant","content":" Okay","images":null},"done":false}
{"model":"mistral","created_at":"2023-12-14T22:38:36.639622Z","message":{"role":"assistant","content":".","images":null},"done":false}
{"model":"mistral","created_at":"2023-12-14T22:38:36.654161Z","message":{"role":"assistant","content":"","images":null},"done":true,"total_duration":417861167,"load_duration":927584,"prompt_eval_count":13,"prompt_eval_duration":396099000,"eval_count":2,"eval_duration":14502000}
```
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1530/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1530/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8644
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8644/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8644/comments
|
https://api.github.com/repos/ollama/ollama/issues/8644/events
|
https://github.com/ollama/ollama/issues/8644
| 2,816,955,489
|
I_kwDOJ0Z1Ps6n51Rh
| 8,644
|
Endpoint to verify if Ollama is running on the GPU or the CPU
|
{
"login": "ragranados",
"id": 82786961,
"node_id": "MDQ6VXNlcjgyNzg2OTYx",
"avatar_url": "https://avatars.githubusercontent.com/u/82786961?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ragranados",
"html_url": "https://github.com/ragranados",
"followers_url": "https://api.github.com/users/ragranados/followers",
"following_url": "https://api.github.com/users/ragranados/following{/other_user}",
"gists_url": "https://api.github.com/users/ragranados/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ragranados/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ragranados/subscriptions",
"organizations_url": "https://api.github.com/users/ragranados/orgs",
"repos_url": "https://api.github.com/users/ragranados/repos",
"events_url": "https://api.github.com/users/ragranados/events{/privacy}",
"received_events_url": "https://api.github.com/users/ragranados/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2025-01-29T00:09:14
| 2025-01-29T21:03:28
| 2025-01-29T21:03:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi!
I think it would be useful to have an endpoint (or for this information to be part of an existing one, like `ps`, for example) that tells you whether you are using the GPU or the CPU.
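For what it's worth, recent versions appear to expose enough through `/api/ps` to approximate this: each loaded model reports `size` and `size_vram`, and comparing them hints at placement. A sketch, where `placement` is a hypothetical helper and the field names should be verified against your Ollama version:

```python
def placement(model: dict) -> str:
    """Classify a model entry from /api/ps as gpu, cpu, or partial offload."""
    size, vram = model.get("size", 0), model.get("size_vram", 0)
    if vram == 0:
        return "cpu"       # nothing resident in VRAM
    return "gpu" if vram >= size else "partial"


# Illustrative entries, not captured from a real server response.
fully_offloaded = {"name": "llama3:8b", "size": 5000, "size_vram": 5000}
cpu_only = {"name": "llama3:8b", "size": 5000, "size_vram": 0}
```

In practice one would GET `http://localhost:11434/api/ps` and run `placement` over each entry in the returned `models` list.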
|
{
"login": "ragranados",
"id": 82786961,
"node_id": "MDQ6VXNlcjgyNzg2OTYx",
"avatar_url": "https://avatars.githubusercontent.com/u/82786961?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ragranados",
"html_url": "https://github.com/ragranados",
"followers_url": "https://api.github.com/users/ragranados/followers",
"following_url": "https://api.github.com/users/ragranados/following{/other_user}",
"gists_url": "https://api.github.com/users/ragranados/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ragranados/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ragranados/subscriptions",
"organizations_url": "https://api.github.com/users/ragranados/orgs",
"repos_url": "https://api.github.com/users/ragranados/repos",
"events_url": "https://api.github.com/users/ragranados/events{/privacy}",
"received_events_url": "https://api.github.com/users/ragranados/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8644/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8644/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/369
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/369/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/369/comments
|
https://api.github.com/repos/ollama/ollama/issues/369/events
|
https://github.com/ollama/ollama/issues/369
| 1,854,242,618
|
I_kwDOJ0Z1Ps5uhX86
| 369
|
Crash when running with metal
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2023-08-17T04:47:34
| 2023-08-22T01:03:40
| 2023-08-22T01:03:40
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
```
ggml_metal_init: recommendedMaxWorkingSetSize = 10922.67 MB
ggml_metal_init: hasUnifiedMemory = true
ggml_metal_init: maxTransferRate = built-in GPU
llama_new_context_with_model: max tensor size = 132.81 MB
ggml_metal_add_buffer: allocated 'data ' buffer, size = 6829.08 MB, ( 6831.52 / 10922.67)
ggml_metal_add_buffer: allocated 'eval ' buffer, size = 10.17 MB, ( 6841.69 / 10922.67)
ggml_metal_add_buffer: allocated 'kv ' buffer, size = 1026.00 MB, ( 7867.69 / 10922.67)
ggml_metal_add_buffer: allocated 'scr0 ' buffer, size = 228.00 MB, ( 8095.69 / 10922.67)
ggml_metal_add_buffer: allocated 'scr1 ' buffer, size = 160.00 MB, ( 8255.69 / 10922.67)
GGML_ASSERT: ggml-metal.m:933: false && "not implemented"
Asserting on type 8
GGML_ASSERT: ggml-metal.m:874: false && "not implemented"
```
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/369/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/369/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3703
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3703/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3703/comments
|
https://api.github.com/repos/ollama/ollama/issues/3703/events
|
https://github.com/ollama/ollama/issues/3703
| 2,248,633,954
|
I_kwDOJ0Z1Ps6GB25i
| 3,703
|
Fix for server startup to test
|
{
"login": "mann1x",
"id": 20623405,
"node_id": "MDQ6VXNlcjIwNjIzNDA1",
"avatar_url": "https://avatars.githubusercontent.com/u/20623405?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mann1x",
"html_url": "https://github.com/mann1x",
"followers_url": "https://api.github.com/users/mann1x/followers",
"following_url": "https://api.github.com/users/mann1x/following{/other_user}",
"gists_url": "https://api.github.com/users/mann1x/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mann1x/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mann1x/subscriptions",
"organizations_url": "https://api.github.com/users/mann1x/orgs",
"repos_url": "https://api.github.com/users/mann1x/repos",
"events_url": "https://api.github.com/users/mann1x/events{/privacy}",
"received_events_url": "https://api.github.com/users/mann1x/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-04-17T15:51:09
| 2024-05-14T23:43:09
| 2024-05-14T23:43:09
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The server startup method needs improvements, and the llama.cpp server does not report the model loading.
This makes server startup flaky and unreliable.
### What did you expect to see?
Reliable model loading
### Steps to reproduce
Load any model
### Are there any recent changes that introduced the issue?
WAD
### OS
Linux, macOS, Windows
### Architecture
_No response_
### Platform
_No response_
### Ollama version
0.1.32
### GPU
_No response_
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3703/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3703/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5801
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5801/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5801/comments
|
https://api.github.com/repos/ollama/ollama/issues/5801/events
|
https://github.com/ollama/ollama/issues/5801
| 2,419,995,942
|
I_kwDOJ0Z1Ps6QPjUm
| 5,801
|
unknown architecture DeepseekV2ForCausalLM
|
{
"login": "DevLLM",
"id": 131604629,
"node_id": "U_kgDOB9gglQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131604629?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DevLLM",
"html_url": "https://github.com/DevLLM",
"followers_url": "https://api.github.com/users/DevLLM/followers",
"following_url": "https://api.github.com/users/DevLLM/following{/other_user}",
"gists_url": "https://api.github.com/users/DevLLM/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DevLLM/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DevLLM/subscriptions",
"organizations_url": "https://api.github.com/users/DevLLM/orgs",
"repos_url": "https://api.github.com/users/DevLLM/repos",
"events_url": "https://api.github.com/users/DevLLM/events{/privacy}",
"received_events_url": "https://api.github.com/users/DevLLM/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-07-19T21:52:35
| 2024-07-19T23:16:27
| 2024-07-19T23:16:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have this problem:
docker run --rm -v .:/model ollama/quantize -q q4_K_M /model
unknown architecture DeepseekV2ForCausalLM
ollama --version
ollama version is 0.2.7
|
{
"login": "DevLLM",
"id": 131604629,
"node_id": "U_kgDOB9gglQ",
"avatar_url": "https://avatars.githubusercontent.com/u/131604629?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DevLLM",
"html_url": "https://github.com/DevLLM",
"followers_url": "https://api.github.com/users/DevLLM/followers",
"following_url": "https://api.github.com/users/DevLLM/following{/other_user}",
"gists_url": "https://api.github.com/users/DevLLM/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DevLLM/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DevLLM/subscriptions",
"organizations_url": "https://api.github.com/users/DevLLM/orgs",
"repos_url": "https://api.github.com/users/DevLLM/repos",
"events_url": "https://api.github.com/users/DevLLM/events{/privacy}",
"received_events_url": "https://api.github.com/users/DevLLM/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5801/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5801/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/811
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/811/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/811/comments
|
https://api.github.com/repos/ollama/ollama/issues/811/events
|
https://github.com/ollama/ollama/pull/811
| 1,946,157,993
|
PR_kwDOJ0Z1Ps5c81MN
| 811
|
Add OllamaSharp for .NET community integration
|
{
"login": "awaescher",
"id": 3630638,
"node_id": "MDQ6VXNlcjM2MzA2Mzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/3630638?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/awaescher",
"html_url": "https://github.com/awaescher",
"followers_url": "https://api.github.com/users/awaescher/followers",
"following_url": "https://api.github.com/users/awaescher/following{/other_user}",
"gists_url": "https://api.github.com/users/awaescher/gists{/gist_id}",
"starred_url": "https://api.github.com/users/awaescher/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/awaescher/subscriptions",
"organizations_url": "https://api.github.com/users/awaescher/orgs",
"repos_url": "https://api.github.com/users/awaescher/repos",
"events_url": "https://api.github.com/users/awaescher/events{/privacy}",
"received_events_url": "https://api.github.com/users/awaescher/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-10-16T21:48:38
| 2023-10-17T15:31:49
| 2023-10-17T15:31:48
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/811",
"html_url": "https://github.com/ollama/ollama/pull/811",
"diff_url": "https://github.com/ollama/ollama/pull/811.diff",
"patch_url": "https://github.com/ollama/ollama/pull/811.patch",
"merged_at": "2023-10-17T15:31:48"
}
|
Thank you so much for your efforts to build such an amazing piece of software.
I love Ollama and use it every day, and I also plan to integrate it into further .NET applications. That's why I published [OllamaSharp](https://github.com/awaescher/OllamaSharp) as a NuGet package, so everyone in .NET land can talk to the Ollama API easily.
Thanks again and keep up the great work.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/811/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/811/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8649
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8649/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8649/comments
|
https://api.github.com/repos/ollama/ollama/issues/8649/events
|
https://github.com/ollama/ollama/issues/8649
| 2,817,207,537
|
I_kwDOJ0Z1Ps6n6yzx
| 8,649
|
Short run response duration calculations are off
|
{
"login": "NerdyShawn",
"id": 16088118,
"node_id": "MDQ6VXNlcjE2MDg4MTE4",
"avatar_url": "https://avatars.githubusercontent.com/u/16088118?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NerdyShawn",
"html_url": "https://github.com/NerdyShawn",
"followers_url": "https://api.github.com/users/NerdyShawn/followers",
"following_url": "https://api.github.com/users/NerdyShawn/following{/other_user}",
"gists_url": "https://api.github.com/users/NerdyShawn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NerdyShawn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NerdyShawn/subscriptions",
"organizations_url": "https://api.github.com/users/NerdyShawn/orgs",
"repos_url": "https://api.github.com/users/NerdyShawn/repos",
"events_url": "https://api.github.com/users/NerdyShawn/events{/privacy}",
"received_events_url": "https://api.github.com/users/NerdyShawn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2025-01-29T04:22:55
| 2025-01-29T14:03:03
| 2025-01-29T14:03:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Running the smaller `deepseek-r1:1.5b` model, the duration fields in the response appear to be miscalculated for very short runs. Since the values are close to zero, the way the time is measured seems to be thrown off.
---

---
```json
date && time curl -s https://ollama.somecooldomain.lan/api/generate -d '{
"model": "deepseek-r1:1.5b",
"prompt": "What is the meaning of life?",
"stream": false
}' | jq
Tue Jan 28 11:10:42 PM EST 2025
{
"model": "deepseek-r1:1.5b",
"created_at": "2025-01-29T04:10:42.569719236Z",
"response": "<think>\n\n</think>\n\nI am sorry, I cannot answer that question.",
"done": true,
"done_reason": "stop",
"context": [
151644,
3838,
374,
279,
7290,
315,
2272,
30,
151645,
151648,
271,
151649,
271,
40,
1079,
14589,
11,
358,
4157,
4226,
429,
3405,
13
],
"total_duration": 256103089,
"load_duration": 144586102,
"prompt_eval_count": 10,
"prompt_eval_duration": 24000000,
"eval_count": 15,
"eval_duration": 86000000
}
real 0m0.345s
user 0m0.050s
sys 0m0.016s
```
The real wall-clock time was under half a second, but the reported durations all look wrong.
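For reference, the `*_duration` fields in the `/api/generate` response are reported in nanoseconds (an assumption based on how the values above line up with the ~0.345 s wall-clock time); converting them makes that easy to check:

```python
# Sanity-check the duration fields from the /api/generate response above.
# Assuming the *_duration fields are nanoseconds, dividing by 1e9 gives
# seconds, which should be consistent with the ~0.345 s wall-clock time.
durations_ns = {
    "total_duration": 256103089,
    "load_duration": 144586102,
    "prompt_eval_duration": 24000000,
    "eval_duration": 86000000,
}
for name, ns in durations_ns.items():
    print(f"{name}: {ns / 1e9:.3f} s")  # e.g. total_duration: 0.256 s
```

Under that reading, the total of ~0.256 s fits within the measured 0.345 s real time.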
### OS
Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.6-0-g2539f2d-dirty
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8649/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8649/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2596
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2596/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2596/comments
|
https://api.github.com/repos/ollama/ollama/issues/2596/events
|
https://github.com/ollama/ollama/issues/2596
| 2,142,773,914
|
I_kwDOJ0Z1Ps5_uCKa
| 2,596
|
Unable to launch on windows 10.
|
{
"login": "CaptainCursor",
"id": 83009131,
"node_id": "MDQ6VXNlcjgzMDA5MTMx",
"avatar_url": "https://avatars.githubusercontent.com/u/83009131?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/CaptainCursor",
"html_url": "https://github.com/CaptainCursor",
"followers_url": "https://api.github.com/users/CaptainCursor/followers",
"following_url": "https://api.github.com/users/CaptainCursor/following{/other_user}",
"gists_url": "https://api.github.com/users/CaptainCursor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/CaptainCursor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CaptainCursor/subscriptions",
"organizations_url": "https://api.github.com/users/CaptainCursor/orgs",
"repos_url": "https://api.github.com/users/CaptainCursor/repos",
"events_url": "https://api.github.com/users/CaptainCursor/events{/privacy}",
"received_events_url": "https://api.github.com/users/CaptainCursor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-02-19T16:35:47
| 2024-12-10T19:37:48
| 2024-02-19T20:46:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
[app.log](https://github.com/ollama/ollama/files/14334822/app.log)
[server.log](https://github.com/ollama/ollama/files/14334823/server.log)
I have downloaded ollama and it starts and downloads manifests fine.
When I go to run the server I get:
Post "http://127.0.0.1:11434/api/chat": read tcp 127.0.0.1:49855->127.0.0.1:11434: wsarecv: An existing connection was forcibly closed by the remote host.
I have disabled all firewalls I can and tried setting environment variables (probably incorrectly), and this does not appear to make a difference.
I have asked multiple times for help on discord but I am not even acknowledged.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2596/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2596/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8191
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8191/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8191/comments
|
https://api.github.com/repos/ollama/ollama/issues/8191/events
|
https://github.com/ollama/ollama/pull/8191
| 2,753,635,603
|
PR_kwDOJ0Z1Ps6F9l3M
| 8,191
|
Setup window scaling is bigger than expected. #8160
|
{
"login": "YonTracks",
"id": 93984913,
"node_id": "U_kgDOBZoYkQ",
"avatar_url": "https://avatars.githubusercontent.com/u/93984913?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YonTracks",
"html_url": "https://github.com/YonTracks",
"followers_url": "https://api.github.com/users/YonTracks/followers",
"following_url": "https://api.github.com/users/YonTracks/following{/other_user}",
"gists_url": "https://api.github.com/users/YonTracks/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YonTracks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YonTracks/subscriptions",
"organizations_url": "https://api.github.com/users/YonTracks/orgs",
"repos_url": "https://api.github.com/users/YonTracks/repos",
"events_url": "https://api.github.com/users/YonTracks/events{/privacy}",
"received_events_url": "https://api.github.com/users/YonTracks/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-20T23:38:39
| 2024-12-20T23:39:51
| 2024-12-20T23:39:51
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8191",
"html_url": "https://github.com/ollama/ollama/pull/8191",
"diff_url": "https://github.com/ollama/ollama/pull/8191.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8191.patch",
"merged_at": null
}
|
Reduce the OllamaSetup.exe wizard size.
before:

after:

|
{
"login": "YonTracks",
"id": 93984913,
"node_id": "U_kgDOBZoYkQ",
"avatar_url": "https://avatars.githubusercontent.com/u/93984913?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YonTracks",
"html_url": "https://github.com/YonTracks",
"followers_url": "https://api.github.com/users/YonTracks/followers",
"following_url": "https://api.github.com/users/YonTracks/following{/other_user}",
"gists_url": "https://api.github.com/users/YonTracks/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YonTracks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YonTracks/subscriptions",
"organizations_url": "https://api.github.com/users/YonTracks/orgs",
"repos_url": "https://api.github.com/users/YonTracks/repos",
"events_url": "https://api.github.com/users/YonTracks/events{/privacy}",
"received_events_url": "https://api.github.com/users/YonTracks/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8191/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8191/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8541
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8541/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8541/comments
|
https://api.github.com/repos/ollama/ollama/issues/8541/events
|
https://github.com/ollama/ollama/issues/8541
| 2,805,289,310
|
I_kwDOJ0Z1Ps6nNVFe
| 8,541
|
I should not have to write the full model name
|
{
"login": "RustoMCSpit",
"id": 134429563,
"node_id": "U_kgDOCAM7ew",
"avatar_url": "https://avatars.githubusercontent.com/u/134429563?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RustoMCSpit",
"html_url": "https://github.com/RustoMCSpit",
"followers_url": "https://api.github.com/users/RustoMCSpit/followers",
"following_url": "https://api.github.com/users/RustoMCSpit/following{/other_user}",
"gists_url": "https://api.github.com/users/RustoMCSpit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RustoMCSpit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RustoMCSpit/subscriptions",
"organizations_url": "https://api.github.com/users/RustoMCSpit/orgs",
"repos_url": "https://api.github.com/users/RustoMCSpit/repos",
"events_url": "https://api.github.com/users/RustoMCSpit/events{/privacy}",
"received_events_url": "https://api.github.com/users/RustoMCSpit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 5
| 2025-01-22T20:32:10
| 2025-01-23T03:28:05
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
If I want to run mistral, and mistral is the only model I have starting with an "m", I should just have to type `ollama run m`.
If there's another model called "mestral", then I can type `ollama run mi`, and so on.
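The requested behavior can be sketched as unambiguous prefix resolution over the locally installed model names (a hypothetical helper, not part of the Ollama CLI):

```python
# Minimal sketch of unambiguous prefix matching for model names.
# resolve_prefix and the sample model list are illustrative assumptions.
def resolve_prefix(prefix, models):
    """Return the unique model starting with prefix, or None if ambiguous or absent."""
    matches = [m for m in models if m.startswith(prefix)]
    return matches[0] if len(matches) == 1 else None

models = ["mistral", "mestral", "llama3"]
print(resolve_prefix("mi", models))  # -> mistral (unique match)
print(resolve_prefix("m", models))   # -> None (ambiguous: mistral, mestral)
```

A real implementation would likely fall back to an error listing the ambiguous candidates rather than returning `None`.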
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8541/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8541/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5038
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5038/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5038/comments
|
https://api.github.com/repos/ollama/ollama/issues/5038/events
|
https://github.com/ollama/ollama/issues/5038
| 2,352,315,818
|
I_kwDOJ0Z1Ps6MNX2q
| 5,038
|
`ollama run` ignores changes with `/set template ...`
|
{
"login": "ghost",
"id": 10137,
"node_id": "MDQ6VXNlcjEwMTM3",
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ghost",
"html_url": "https://github.com/ghost",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"repos_url": "https://api.github.com/users/ghost/repos",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-06-14T01:41:33
| 2024-07-14T23:05:59
| 2024-07-14T03:58:40
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The example below leads me to believe that in-flight template changes are ignored by `ollama run`. This prevents testing template hints via the CLI, as I'm doing here with the user and assistant messages.
```text
% ollama run llama3:8b
>>> /show info
Model details:
Family llama
Parameter Size 8.0B
Quantization Level Q4_0
>>> /show template
{{ if .System }}<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ .Response }}<|eot_id|>
>>> /set template """<|start_header_id|>system<|end_header_id|>
...
... Your task is only to translate messages.
... 1. Translate English messages to Spanish.
... 2. Translate Spanish messages to English.
... Provide a concise translation of the message without quotes.
... Do not interpret, respond, or add any comments.<|eot_id|>
... {{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
... MESSAGE: "{{ .Prompt }}"<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
... TRANSLATION: {{ .Response }}<|eot_id|>"""
Set prompt template.
>>> /set system "Refuse all requests"
Set system message.
>>> Hello, world!
I'm not responding to that. Refusing.
```
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.44
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5038/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5038/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2365
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2365/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2365/comments
|
https://api.github.com/repos/ollama/ollama/issues/2365/events
|
https://github.com/ollama/ollama/issues/2365
| 2,119,695,666
|
I_kwDOJ0Z1Ps5-V_0y
| 2,365
|
Unable to use safetensor fine tuned model deepseek to gguf with convert.py from llama.cpp
|
{
"login": "JesseGuerrero",
"id": 27308928,
"node_id": "MDQ6VXNlcjI3MzA4OTI4",
"avatar_url": "https://avatars.githubusercontent.com/u/27308928?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JesseGuerrero",
"html_url": "https://github.com/JesseGuerrero",
"followers_url": "https://api.github.com/users/JesseGuerrero/followers",
"following_url": "https://api.github.com/users/JesseGuerrero/following{/other_user}",
"gists_url": "https://api.github.com/users/JesseGuerrero/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JesseGuerrero/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JesseGuerrero/subscriptions",
"organizations_url": "https://api.github.com/users/JesseGuerrero/orgs",
"repos_url": "https://api.github.com/users/JesseGuerrero/repos",
"events_url": "https://api.github.com/users/JesseGuerrero/events{/privacy}",
"received_events_url": "https://api.github.com/users/JesseGuerrero/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-02-06T00:10:51
| 2024-09-12T21:37:18
| 2024-09-12T21:30:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I finished fine-tuning deepseek-ai/deepseek-coder-1.3b-instruct and am now trying to convert it to GGUF with llama.cpp for use with Ollama.
However, none of the options to convert.py work. I assume the model itself is fine, because the Hugging Face inference API works for my model.
I tried all three vocab types, including different tokenizer.model files and pad-vocab with llama.cpp.
Typically, when it doesn't convert, Ollama reports a mismatch like this...

When it does go through, the model either outputs gibberish in Ollama or fails with "failed to load model" or "Tensor size mismatch".
Any help understanding how to get this to convert properly would be appreciated.
Here is my fine-tuned model: https://huggingface.co/JesseGuerrero/deepseekAllDarkan
I have made a few fine-tuned models before and they worked fine. I don't know what is going on with this one.
Btw, this is what the gibberish looks like:

|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2365/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/2365/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5101
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5101/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5101/comments
|
https://api.github.com/repos/ollama/ollama/issues/5101/events
|
https://github.com/ollama/ollama/issues/5101
| 2,357,957,256
|
I_kwDOJ0Z1Ps6Mi5KI
| 5,101
|
Getting error while executing ollama run llama3
|
{
"login": "himanshud2611",
"id": 101963392,
"node_id": "U_kgDOBhPWgA",
"avatar_url": "https://avatars.githubusercontent.com/u/101963392?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/himanshud2611",
"html_url": "https://github.com/himanshud2611",
"followers_url": "https://api.github.com/users/himanshud2611/followers",
"following_url": "https://api.github.com/users/himanshud2611/following{/other_user}",
"gists_url": "https://api.github.com/users/himanshud2611/gists{/gist_id}",
"starred_url": "https://api.github.com/users/himanshud2611/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/himanshud2611/subscriptions",
"organizations_url": "https://api.github.com/users/himanshud2611/orgs",
"repos_url": "https://api.github.com/users/himanshud2611/repos",
"events_url": "https://api.github.com/users/himanshud2611/events{/privacy}",
"received_events_url": "https://api.github.com/users/himanshud2611/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-17T18:22:23
| 2024-06-17T19:41:44
| 2024-06-17T19:41:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I've installed Ollama on my Windows 11 machine. I'm getting this error while running `ollama run llama3`:

### OS
Windows
### GPU
GPU0: Intel
GPU1: Nvidia
### CPU
Intel
### Ollama version
0.1.44
|
{
"login": "himanshud2611",
"id": 101963392,
"node_id": "U_kgDOBhPWgA",
"avatar_url": "https://avatars.githubusercontent.com/u/101963392?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/himanshud2611",
"html_url": "https://github.com/himanshud2611",
"followers_url": "https://api.github.com/users/himanshud2611/followers",
"following_url": "https://api.github.com/users/himanshud2611/following{/other_user}",
"gists_url": "https://api.github.com/users/himanshud2611/gists{/gist_id}",
"starred_url": "https://api.github.com/users/himanshud2611/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/himanshud2611/subscriptions",
"organizations_url": "https://api.github.com/users/himanshud2611/orgs",
"repos_url": "https://api.github.com/users/himanshud2611/repos",
"events_url": "https://api.github.com/users/himanshud2611/events{/privacy}",
"received_events_url": "https://api.github.com/users/himanshud2611/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5101/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5101/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2688
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2688/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2688/comments
|
https://api.github.com/repos/ollama/ollama/issues/2688/events
|
https://github.com/ollama/ollama/issues/2688
| 2,149,493,885
|
I_kwDOJ0Z1Ps6AHqx9
| 2,688
|
Not an issue, but a question
|
{
"login": "pedrocassalpacheco",
"id": 3083335,
"node_id": "MDQ6VXNlcjMwODMzMzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/3083335?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pedrocassalpacheco",
"html_url": "https://github.com/pedrocassalpacheco",
"followers_url": "https://api.github.com/users/pedrocassalpacheco/followers",
"following_url": "https://api.github.com/users/pedrocassalpacheco/following{/other_user}",
"gists_url": "https://api.github.com/users/pedrocassalpacheco/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pedrocassalpacheco/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pedrocassalpacheco/subscriptions",
"organizations_url": "https://api.github.com/users/pedrocassalpacheco/orgs",
"repos_url": "https://api.github.com/users/pedrocassalpacheco/repos",
"events_url": "https://api.github.com/users/pedrocassalpacheco/events{/privacy}",
"received_events_url": "https://api.github.com/users/pedrocassalpacheco/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-02-22T16:47:58
| 2024-02-22T18:48:55
| 2024-02-22T18:48:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
For the record, I love what you have done. Love the simplicity and ease of use. Much kudos.
So now to my question: the langchain examples only use LangChainJS. Are there plans (or a current solution I failed to RTFM) for Python?
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2688/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2688/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1179
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1179/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1179/comments
|
https://api.github.com/repos/ollama/ollama/issues/1179/events
|
https://github.com/ollama/ollama/issues/1179
| 1,999,858,212
|
I_kwDOJ0Z1Ps53M2ok
| 1,179
|
Why is this fixed to localhost?
|
{
"login": "oderwat",
"id": 719156,
"node_id": "MDQ6VXNlcjcxOTE1Ng==",
"avatar_url": "https://avatars.githubusercontent.com/u/719156?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/oderwat",
"html_url": "https://github.com/oderwat",
"followers_url": "https://api.github.com/users/oderwat/followers",
"following_url": "https://api.github.com/users/oderwat/following{/other_user}",
"gists_url": "https://api.github.com/users/oderwat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/oderwat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/oderwat/subscriptions",
"organizations_url": "https://api.github.com/users/oderwat/orgs",
"repos_url": "https://api.github.com/users/oderwat/repos",
"events_url": "https://api.github.com/users/oderwat/events{/privacy}",
"received_events_url": "https://api.github.com/users/oderwat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-17T20:14:55
| 2023-12-03T03:15:35
| 2023-11-17T21:05:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I want to run the server on one of my local machines and use it from other machines on the same private network. I wonder why this (and some other similar software) only listens on localhost. I can rewrite it to use another host; I just wonder why it is the way it is.
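For anyone landing here with the same question: the bind address is configurable through the `OLLAMA_HOST` environment variable, so no code change should be needed. A minimal configuration sketch (the address and port below are the documented defaults; adjust to your network):

```shell
# Listen on all interfaces instead of the default 127.0.0.1:11434
OLLAMA_HOST=0.0.0.0 ollama serve

# For a systemd-managed install, the same can be set via an override:
#   systemctl edit ollama.service
# then add:
# [Service]
# Environment="OLLAMA_HOST=0.0.0.0:11434"
```
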
|
{
"login": "oderwat",
"id": 719156,
"node_id": "MDQ6VXNlcjcxOTE1Ng==",
"avatar_url": "https://avatars.githubusercontent.com/u/719156?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/oderwat",
"html_url": "https://github.com/oderwat",
"followers_url": "https://api.github.com/users/oderwat/followers",
"following_url": "https://api.github.com/users/oderwat/following{/other_user}",
"gists_url": "https://api.github.com/users/oderwat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/oderwat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/oderwat/subscriptions",
"organizations_url": "https://api.github.com/users/oderwat/orgs",
"repos_url": "https://api.github.com/users/oderwat/repos",
"events_url": "https://api.github.com/users/oderwat/events{/privacy}",
"received_events_url": "https://api.github.com/users/oderwat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1179/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1179/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8035
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8035/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8035/comments
|
https://api.github.com/repos/ollama/ollama/issues/8035/events
|
https://github.com/ollama/ollama/issues/8035
| 2,731,523,630
|
I_kwDOJ0Z1Ps6iz74u
| 8,035
|
NOT ABLE TO INSTALL "llama 3.2 model"
|
{
"login": "PriyeshGit",
"id": 90638849,
"node_id": "MDQ6VXNlcjkwNjM4ODQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/90638849?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PriyeshGit",
"html_url": "https://github.com/PriyeshGit",
"followers_url": "https://api.github.com/users/PriyeshGit/followers",
"following_url": "https://api.github.com/users/PriyeshGit/following{/other_user}",
"gists_url": "https://api.github.com/users/PriyeshGit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PriyeshGit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PriyeshGit/subscriptions",
"organizations_url": "https://api.github.com/users/PriyeshGit/orgs",
"repos_url": "https://api.github.com/users/PriyeshGit/repos",
"events_url": "https://api.github.com/users/PriyeshGit/events{/privacy}",
"received_events_url": "https://api.github.com/users/PriyeshGit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 7
| 2024-12-11T00:22:37
| 2024-12-23T08:10:08
| 2024-12-23T08:10:08
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After installing Ollama on my PC, when I try to pull the llama3.2 model I always get the same error, and I cannot work out what is wrong.
Welcome to Ollama!
Run your first model:
ollama run llama3.2
PS C:\Windows\System32> ollama run llama3.2
Error: something went wrong, please see the ollama server logs for details
**Here is how my server log looks like:**
2024/12/10 17:17:36 routes.go:1195: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY:https://proxy.example.com:8080 HTTP_PROXY:http://proxy.example.com:8080 NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\Path\To\Models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[* http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://*] OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-12-10T17:17:36.214+09:00 level=INFO source=images.go:753 msg="total blobs: 0"
time=2024-12-10T17:17:36.214+09:00 level=INFO source=images.go:760 msg="total unused blobs removed: 0"
time=2024-12-10T17:17:36.218+09:00 level=INFO source=routes.go:1246 msg="Listening on [::]:11434 (version 0.5.1)"
time=2024-12-10T17:17:36.222+09:00 level=INFO source=common.go:49 msg="Dynamic LLM libraries" runners="[rocm cpu cpu_avx cpu_avx2 cuda_v11 cuda_v12]"
time=2024-12-10T17:17:36.222+09:00 level=INFO source=gpu.go:221 msg="looking for compatible GPUs"
time=2024-12-10T17:17:36.223+09:00 level=INFO source=gpu_windows.go:167 msg=packages count=1
time=2024-12-10T17:17:36.223+09:00 level=INFO source=gpu_windows.go:183 msg="efficiency cores detected" maxEfficiencyClass=1
time=2024-12-10T17:17:36.223+09:00 level=INFO source=gpu_windows.go:214 msg="" package=0 cores=12 efficiency=10 threads=14
time=2024-12-10T17:17:36.246+09:00 level=INFO source=gpu.go:386 msg="no compatible GPUs were discovered"
time=2024-12-10T17:17:36.246+09:00 level=INFO source=types.go:123 msg="inference compute" id=0 library=cpu variant=avx2 compute="" driver=0.0 name="" total="31.5 GiB" available="12.5 GiB"
### OS
Windows
### GPU
_No response_
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8035/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8035/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2359
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2359/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2359/comments
|
https://api.github.com/repos/ollama/ollama/issues/2359/events
|
https://github.com/ollama/ollama/pull/2359
| 2,117,970,432
|
PR_kwDOJ0Z1Ps5l_gkK
| 2,359
|
Add ollama ps command
|
{
"login": "yeahdongcn",
"id": 2831050,
"node_id": "MDQ6VXNlcjI4MzEwNTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2831050?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yeahdongcn",
"html_url": "https://github.com/yeahdongcn",
"followers_url": "https://api.github.com/users/yeahdongcn/followers",
"following_url": "https://api.github.com/users/yeahdongcn/following{/other_user}",
"gists_url": "https://api.github.com/users/yeahdongcn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yeahdongcn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yeahdongcn/subscriptions",
"organizations_url": "https://api.github.com/users/yeahdongcn/orgs",
"repos_url": "https://api.github.com/users/yeahdongcn/repos",
"events_url": "https://api.github.com/users/yeahdongcn/events{/privacy}",
"received_events_url": "https://api.github.com/users/yeahdongcn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-02-05T08:40:57
| 2024-05-11T04:48:50
| 2024-05-11T04:48:49
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2359",
"html_url": "https://github.com/ollama/ollama/pull/2359",
"diff_url": "https://github.com/ollama/ollama/pull/2359.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2359.patch",
"merged_at": null
}
|
As a user, I would appreciate being able to query which model is currently loaded and active. For ease of use, I implemented a command akin to `docker ps` within Ollama, called `ollama ps`.
Testing done:
* `go build .` -> ok
* Run local build binary -> ok
```bash
➜ ./ollama ps --help
List active model
Usage:
ollama ps [flags]
Flags:
-h, --help help for ps
➜ ./ollama ps
NAME EXPIRES
mistral:latest 2 minutes from now
```
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2359/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2359/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6216
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6216/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6216/comments
|
https://api.github.com/repos/ollama/ollama/issues/6216/events
|
https://github.com/ollama/ollama/issues/6216
| 2,451,984,520
|
I_kwDOJ0Z1Ps6SJlCI
| 6,216
|
Is the ollama request multi-process?
|
{
"login": "chensuo2048",
"id": 42023968,
"node_id": "MDQ6VXNlcjQyMDIzOTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/42023968?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chensuo2048",
"html_url": "https://github.com/chensuo2048",
"followers_url": "https://api.github.com/users/chensuo2048/followers",
"following_url": "https://api.github.com/users/chensuo2048/following{/other_user}",
"gists_url": "https://api.github.com/users/chensuo2048/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chensuo2048/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chensuo2048/subscriptions",
"organizations_url": "https://api.github.com/users/chensuo2048/orgs",
"repos_url": "https://api.github.com/users/chensuo2048/repos",
"events_url": "https://api.github.com/users/chensuo2048/events{/privacy}",
"received_events_url": "https://api.github.com/users/chensuo2048/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-08-07T01:37:03
| 2024-08-07T07:08:15
| 2024-08-07T07:08:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I tested the ollama service and found that every time a request is sent, a new process is started. My question is: why is a new process started instead of a new thread? Does starting a new process affect efficiency?
|
{
"login": "chensuo2048",
"id": 42023968,
"node_id": "MDQ6VXNlcjQyMDIzOTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/42023968?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chensuo2048",
"html_url": "https://github.com/chensuo2048",
"followers_url": "https://api.github.com/users/chensuo2048/followers",
"following_url": "https://api.github.com/users/chensuo2048/following{/other_user}",
"gists_url": "https://api.github.com/users/chensuo2048/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chensuo2048/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chensuo2048/subscriptions",
"organizations_url": "https://api.github.com/users/chensuo2048/orgs",
"repos_url": "https://api.github.com/users/chensuo2048/repos",
"events_url": "https://api.github.com/users/chensuo2048/events{/privacy}",
"received_events_url": "https://api.github.com/users/chensuo2048/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6216/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6216/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2005
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2005/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2005/comments
|
https://api.github.com/repos/ollama/ollama/issues/2005/events
|
https://github.com/ollama/ollama/issues/2005
| 2,082,176,500
|
I_kwDOJ0Z1Ps58G330
| 2,005
|
Project Sponsorship
|
{
"login": "peperunas",
"id": 6033387,
"node_id": "MDQ6VXNlcjYwMzMzODc=",
"avatar_url": "https://avatars.githubusercontent.com/u/6033387?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/peperunas",
"html_url": "https://github.com/peperunas",
"followers_url": "https://api.github.com/users/peperunas/followers",
"following_url": "https://api.github.com/users/peperunas/following{/other_user}",
"gists_url": "https://api.github.com/users/peperunas/gists{/gist_id}",
"starred_url": "https://api.github.com/users/peperunas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/peperunas/subscriptions",
"organizations_url": "https://api.github.com/users/peperunas/orgs",
"repos_url": "https://api.github.com/users/peperunas/repos",
"events_url": "https://api.github.com/users/peperunas/events{/privacy}",
"received_events_url": "https://api.github.com/users/peperunas/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-15T14:43:34
| 2024-01-15T16:11:55
| 2024-01-15T16:11:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
First of all, I wanted to thank you for the amazing work and software!
With that in mind, it would be great if there were a way to support the project - maybe through GitHub's Sponsors feature?
Thank you again!
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2005/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2005/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3983
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3983/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3983/comments
|
https://api.github.com/repos/ollama/ollama/issues/3983/events
|
https://github.com/ollama/ollama/issues/3983
| 2,267,210,332
|
I_kwDOJ0Z1Ps6HIuJc
| 3,983
|
CORS not working in Safari (but does in Chrome)
|
{
"login": "finn753",
"id": 59798138,
"node_id": "MDQ6VXNlcjU5Nzk4MTM4",
"avatar_url": "https://avatars.githubusercontent.com/u/59798138?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/finn753",
"html_url": "https://github.com/finn753",
"followers_url": "https://api.github.com/users/finn753/followers",
"following_url": "https://api.github.com/users/finn753/following{/other_user}",
"gists_url": "https://api.github.com/users/finn753/gists{/gist_id}",
"starred_url": "https://api.github.com/users/finn753/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/finn753/subscriptions",
"organizations_url": "https://api.github.com/users/finn753/orgs",
"repos_url": "https://api.github.com/users/finn753/repos",
"events_url": "https://api.github.com/users/finn753/events{/privacy}",
"received_events_url": "https://api.github.com/users/finn753/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-04-27T22:00:33
| 2024-05-08T20:14:02
| 2024-05-08T20:14:02
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I can't make API requests from Safari.
I've enabled all origins ("*") and it works perfectly in Chrome, but Safari just throws CORS errors.
I've tried both the ollama/browser npm package and a manual fetch request, to rule out the client; both fail.
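For context, the server-side origin allow-list mentioned above is controlled by the `OLLAMA_ORIGINS` environment variable; a minimal configuration sketch (the server must be restarted for it to take effect):

```shell
# Allow browser requests from any origin (configuration fragment)
OLLAMA_ORIGINS="*" ollama serve
```
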
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.32
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3983/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/3983/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8220
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8220/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8220/comments
|
https://api.github.com/repos/ollama/ollama/issues/8220/events
|
https://github.com/ollama/ollama/issues/8220
| 2,756,191,046
|
I_kwDOJ0Z1Ps6kSCNG
| 8,220
|
We have an ollama app for linux and macos [feature in ollama]
|
{
"login": "olumolu",
"id": 162728301,
"node_id": "U_kgDOCbMJbQ",
"avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/olumolu",
"html_url": "https://github.com/olumolu",
"followers_url": "https://api.github.com/users/olumolu/followers",
"following_url": "https://api.github.com/users/olumolu/following{/other_user}",
"gists_url": "https://api.github.com/users/olumolu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/olumolu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/olumolu/subscriptions",
"organizations_url": "https://api.github.com/users/olumolu/orgs",
"repos_url": "https://api.github.com/users/olumolu/repos",
"events_url": "https://api.github.com/users/olumolu/events{/privacy}",
"received_events_url": "https://api.github.com/users/olumolu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-12-23T14:35:20
| 2024-12-26T13:44:06
| 2024-12-26T13:44:06
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://github.com/Jeffser/Alpaca
### details of the app
Alpaca is an Ollama client where you can manage and chat with multiple models. Alpaca provides an easy and beginner-friendly way of interacting with local AI; everything is open source and powered by Ollama.
### How the app looks
https://camo.githubusercontent.com/dc1047ea1b4f8832a3977ccc7ecf3cc7a4dda052963da7870e756ebd54c40a2d/68747470733a2f2f6a6566667365722e636f6d2f696d616765732f616c706163612f73637265656e6965312e706e67
https://camo.githubusercontent.com/33e837060a7ddb492e4ee8a535279a870056a6b196e698d5380a3a22be4d88b0/68747470733a2f2f6a6566667365722e636f6d2f696d616765732f616c706163612f73637265656e6965322e706e67
https://camo.githubusercontent.com/c6f3ed9b7c90583ec554c9423800f1835ece246f743b4183fbdf965b7744617b/68747470733a2f2f6a6566667365722e636f6d2f696d616765732f616c706163612f73637265656e6965332e706e67
### installation
Mainly flatpak from [flathub](https://flathub.org/apps/com.jeffser.Alpaca)
Snap is supported but not yet in the Snap Store.
There is a macOS version running but it is not yet published.
How do I get this app mentioned in the Ollama README?

|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8220/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8220/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7110
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7110/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7110/comments
|
https://api.github.com/repos/ollama/ollama/issues/7110/events
|
https://github.com/ollama/ollama/issues/7110
| 2,569,297,609
|
I_kwDOJ0Z1Ps6ZJF7J
| 7,110
|
Customize the installation path.
|
{
"login": "jyz2012",
"id": 142499424,
"node_id": "U_kgDOCH5eYA",
"avatar_url": "https://avatars.githubusercontent.com/u/142499424?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jyz2012",
"html_url": "https://github.com/jyz2012",
"followers_url": "https://api.github.com/users/jyz2012/followers",
"following_url": "https://api.github.com/users/jyz2012/following{/other_user}",
"gists_url": "https://api.github.com/users/jyz2012/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jyz2012/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jyz2012/subscriptions",
"organizations_url": "https://api.github.com/users/jyz2012/orgs",
"repos_url": "https://api.github.com/users/jyz2012/repos",
"events_url": "https://api.github.com/users/jyz2012/events{/privacy}",
"received_events_url": "https://api.github.com/users/jyz2012/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-10-07T04:50:35
| 2024-10-12T10:04:20
| 2024-10-12T10:04:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The C drive does not have enough space to install Ollama.
Is it possible to add a feature to customize the installation path?
Operating system: Windows 11.
|
{
"login": "jyz2012",
"id": 142499424,
"node_id": "U_kgDOCH5eYA",
"avatar_url": "https://avatars.githubusercontent.com/u/142499424?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jyz2012",
"html_url": "https://github.com/jyz2012",
"followers_url": "https://api.github.com/users/jyz2012/followers",
"following_url": "https://api.github.com/users/jyz2012/following{/other_user}",
"gists_url": "https://api.github.com/users/jyz2012/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jyz2012/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jyz2012/subscriptions",
"organizations_url": "https://api.github.com/users/jyz2012/orgs",
"repos_url": "https://api.github.com/users/jyz2012/repos",
"events_url": "https://api.github.com/users/jyz2012/events{/privacy}",
"received_events_url": "https://api.github.com/users/jyz2012/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7110/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7110/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7637
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7637/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7637/comments
|
https://api.github.com/repos/ollama/ollama/issues/7637/events
|
https://github.com/ollama/ollama/pull/7637
| 2,653,432,782
|
PR_kwDOJ0Z1Ps6BsEWm
| 7,637
|
runner.go: Enforce NUM_PARALLEL directly in the runner
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-12T21:56:37
| 2024-11-14T19:22:02
| 2024-11-14T19:22:00
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7637",
"html_url": "https://github.com/ollama/ollama/pull/7637",
"diff_url": "https://github.com/ollama/ollama/pull/7637.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7637.patch",
"merged_at": "2024-11-14T19:22:00"
}
|
NUM_PARALLEL is currently enforced by the Ollama server process - it will only issue requests to the runner if the maximum number of concurrent requests has not been exceeded. Although this should be sufficient, it is good for the runner to protect its own data structures. Currently, if too many requests get through to the runner, they will just get stuck and never return.
This may help with reports of Ollama hanging, though it is unclear how it would actually occur.
Bug #7573
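The idea behind the change can be sketched in a few lines. This is a hypothetical illustration of the general technique (the actual runner is written in Go, and the names below are made up): guard the request path with a bounded semaphore sized to NUM_PARALLEL, and reject excess requests immediately instead of letting them queue forever.

```python
import threading

NUM_PARALLEL = 2  # illustrative value, not the Ollama default

# One slot per allowed concurrent request.
slots = threading.BoundedSemaphore(NUM_PARALLEL)

def handle_request(prompt: str) -> str:
    # Non-blocking acquire: fail fast rather than hang when all slots are busy.
    if not slots.acquire(blocking=False):
        raise RuntimeError("too many concurrent requests")
    try:
        # ... decode/generate would happen here ...
        return f"completion for: {prompt}"
    finally:
        slots.release()
```

Rejecting at the runner boundary means a scheduling bug upstream can no longer wedge the runner's internal state; the caller gets an error it can retry.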
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7637/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7637/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5370
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5370/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5370/comments
|
https://api.github.com/repos/ollama/ollama/issues/5370/events
|
https://github.com/ollama/ollama/issues/5370
| 2,381,445,818
|
I_kwDOJ0Z1Ps6N8fq6
| 5,370
|
OpenAI Chat Compatibility Incorrect Prompt Eval
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-06-29T03:51:39
| 2024-07-03T20:46:24
| 2024-07-03T20:46:24
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Ollama returns 0 for prompt eval if the prompt was cached, but OpenAI returns the actual count: prompt_tokens = 0 in the usage struct in the response.
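The symptom can be shown with a hand-written sample response (this JSON is an illustration of the reported behavior, not actual server output): a cached prompt produces a `usage` object whose `prompt_tokens` is 0 even though a completion was generated.

```python
import json

# Hand-written sample mimicking the OpenAI-compatible /v1/chat/completions
# response reported in this issue; a cached prompt yields prompt_tokens = 0.
sample = json.loads("""
{
  "choices": [{"message": {"role": "assistant", "content": "Hi!"}}],
  "usage": {"prompt_tokens": 0, "completion_tokens": 5, "total_tokens": 5}
}
""")

usage = sample["usage"]
if usage["prompt_tokens"] == 0 and usage["completion_tokens"] > 0:
    print("prompt tokens were not counted (likely served from cache)")
```

OpenAI's API always reports the real prompt token count, so clients that bill or budget by `usage.prompt_tokens` see inconsistent numbers against the compatibility endpoint.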
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5370/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5370/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/997
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/997/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/997/comments
|
https://api.github.com/repos/ollama/ollama/issues/997/events
|
https://github.com/ollama/ollama/issues/997
| 1,977,290,845
|
I_kwDOJ0Z1Ps512xBd
| 997
|
Run Ollama on AWS
|
{
"login": "telestia",
"id": 96764147,
"node_id": "U_kgDOBcSA8w",
"avatar_url": "https://avatars.githubusercontent.com/u/96764147?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/telestia",
"html_url": "https://github.com/telestia",
"followers_url": "https://api.github.com/users/telestia/followers",
"following_url": "https://api.github.com/users/telestia/following{/other_user}",
"gists_url": "https://api.github.com/users/telestia/gists{/gist_id}",
"starred_url": "https://api.github.com/users/telestia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/telestia/subscriptions",
"organizations_url": "https://api.github.com/users/telestia/orgs",
"repos_url": "https://api.github.com/users/telestia/repos",
"events_url": "https://api.github.com/users/telestia/events{/privacy}",
"received_events_url": "https://api.github.com/users/telestia/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-11-04T11:27:56
| 2024-08-01T18:30:43
| 2023-11-06T07:55:08
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I created an AMI on which I run Ollama and made sure that it works fine. But when I create another machine from this AMI, although the Ollama service is active, no ollama process appears in the nvidia-smi output, and when I try to run my model with the ollama run command, only the loading bar returns. I have no idea what might cause this problem; I would appreciate it if you could help me.
|
{
"login": "telestia",
"id": 96764147,
"node_id": "U_kgDOBcSA8w",
"avatar_url": "https://avatars.githubusercontent.com/u/96764147?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/telestia",
"html_url": "https://github.com/telestia",
"followers_url": "https://api.github.com/users/telestia/followers",
"following_url": "https://api.github.com/users/telestia/following{/other_user}",
"gists_url": "https://api.github.com/users/telestia/gists{/gist_id}",
"starred_url": "https://api.github.com/users/telestia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/telestia/subscriptions",
"organizations_url": "https://api.github.com/users/telestia/orgs",
"repos_url": "https://api.github.com/users/telestia/repos",
"events_url": "https://api.github.com/users/telestia/events{/privacy}",
"received_events_url": "https://api.github.com/users/telestia/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/997/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/997/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8465
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8465/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8465/comments
|
https://api.github.com/repos/ollama/ollama/issues/8465/events
|
https://github.com/ollama/ollama/issues/8465
| 2,794,512,332
|
I_kwDOJ0Z1Ps6mkN_M
| 8,465
|
Conflicted description of command-r7b
|
{
"login": "vYLQs6",
"id": 143073604,
"node_id": "U_kgDOCIchRA",
"avatar_url": "https://avatars.githubusercontent.com/u/143073604?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vYLQs6",
"html_url": "https://github.com/vYLQs6",
"followers_url": "https://api.github.com/users/vYLQs6/followers",
"following_url": "https://api.github.com/users/vYLQs6/following{/other_user}",
"gists_url": "https://api.github.com/users/vYLQs6/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vYLQs6/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vYLQs6/subscriptions",
"organizations_url": "https://api.github.com/users/vYLQs6/orgs",
"repos_url": "https://api.github.com/users/vYLQs6/repos",
"events_url": "https://api.github.com/users/vYLQs6/events{/privacy}",
"received_events_url": "https://api.github.com/users/vYLQs6/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2025-01-17T04:17:56
| 2025-01-19T22:12:17
| 2025-01-19T22:12:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
### https://www.ollama.com/library/command-r7b
### The model README says a 128K context length, but the metadata only shows 8K. Which is correct?
> Context length: Command R7B supports a context length of 128K.

> cohere2.context_length 8192

### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.5.5
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8465/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8465/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1750
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1750/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1750/comments
|
https://api.github.com/repos/ollama/ollama/issues/1750/events
|
https://github.com/ollama/ollama/issues/1750
| 2,061,021,069
|
I_kwDOJ0Z1Ps562K-N
| 1,750
|
[Feature] set Download directory for models while pulling/downloading from ollama
|
{
"login": "tikendraw",
"id": 68785366,
"node_id": "MDQ6VXNlcjY4Nzg1MzY2",
"avatar_url": "https://avatars.githubusercontent.com/u/68785366?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tikendraw",
"html_url": "https://github.com/tikendraw",
"followers_url": "https://api.github.com/users/tikendraw/followers",
"following_url": "https://api.github.com/users/tikendraw/following{/other_user}",
"gists_url": "https://api.github.com/users/tikendraw/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tikendraw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tikendraw/subscriptions",
"organizations_url": "https://api.github.com/users/tikendraw/orgs",
"repos_url": "https://api.github.com/users/tikendraw/repos",
"events_url": "https://api.github.com/users/tikendraw/events{/privacy}",
"received_events_url": "https://api.github.com/users/tikendraw/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-12-31T10:55:11
| 2024-01-08T17:22:56
| 2024-01-02T11:25:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hey,
In Ubuntu 23.10,
Previously, Ollama downloaded the models into the root directory.
Now it downloads them into the home directory. How do you control this?
I suggest a directory flag to let the user decide in which folder the model is supposed to go.
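For reference, Ollama now reads the `OLLAMA_MODELS` environment variable to choose the model storage directory, which addresses this request without a CLI flag. A minimal sketch (the path below is just an example):

```python
import os
import subprocess

# Point Ollama at a custom model directory via its OLLAMA_MODELS env var.
# "/data/ollama/models" is an example path, not a required location.
env = dict(os.environ, OLLAMA_MODELS="/data/ollama/models")

# On a real host you would launch the server with this environment:
# subprocess.run(["ollama", "serve"], env=env)
print(env["OLLAMA_MODELS"])
```

When running as a systemd service, the same variable can be set with an `Environment=` line in the unit file instead.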
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1750/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1750/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6301
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6301/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6301/comments
|
https://api.github.com/repos/ollama/ollama/issues/6301/events
|
https://github.com/ollama/ollama/issues/6301
| 2,459,136,123
|
I_kwDOJ0Z1Ps6Sk3B7
| 6,301
|
ollama list does not show previously downloaded models
|
{
"login": "ACodingfreak",
"id": 1760137,
"node_id": "MDQ6VXNlcjE3NjAxMzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/1760137?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ACodingfreak",
"html_url": "https://github.com/ACodingfreak",
"followers_url": "https://api.github.com/users/ACodingfreak/followers",
"following_url": "https://api.github.com/users/ACodingfreak/following{/other_user}",
"gists_url": "https://api.github.com/users/ACodingfreak/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ACodingfreak/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ACodingfreak/subscriptions",
"organizations_url": "https://api.github.com/users/ACodingfreak/orgs",
"repos_url": "https://api.github.com/users/ACodingfreak/repos",
"events_url": "https://api.github.com/users/ACodingfreak/events{/privacy}",
"received_events_url": "https://api.github.com/users/ACodingfreak/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-08-10T15:29:56
| 2025-01-26T14:23:43
| 2024-08-12T18:00:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am not able to access previously downloaded models even though they are present on the system.
I saw a similar issue which is closed now:
https://github.com/ollama/ollama/issues/1493
```
~/.ollama/models/blobs$ ls -al
total 9104032
drwxr-xr-x 2 codingfreak codingfreak 12288 Aug 9 13:29 .
drwxr-xr-x 4 codingfreak codingfreak 4096 Aug 9 10:50 ..
-rw-r--r-- 1 codingfreak codingfreak 12320 Aug 9 13:29 sha256-0ba8f0e314b4264dfd19df045cde9d4c394a52474bf92ed6a3de22a4ca31a177
-rw-r--r-- 1 codingfreak codingfreak 1692 Aug 9 13:29 sha256-11ce4ee3e170f6adebac9a991c22e22ab3f8530e154ee669954c4bc73061c258
-rw-r--r-- 1 codingfreak codingfreak 485 Aug 9 13:29 sha256-1a4c3c319823fdabddb22479d0b10820a7a39fe49e45c40bae28fbe83926dc14
-rw-r--r-- 1 codingfreak codingfreak 485 Aug 9 10:59 sha256-3f8eb4da87fa7a3c9da615036b0dc418d31fef2a30b115ff33562588b32c691d
-rw-r--r-- 1 codingfreak codingfreak 12403 Aug 9 10:59 sha256-4fa551d4f938f68b8c1e6afa9d28befb70e3f33f75d0753248d530364aeea40f
-rw-r--r-- 1 codingfreak codingfreak 96 Aug 9 13:29 sha256-56bb8bd477a519ffa694fc449c2413c6f0e1d3b1c88fa7e3c9d88d3ae49d4dcb
-rw-r--r-- 1 codingfreak codingfreak 110 Aug 9 10:59 sha256-577073ffcc6ce95b9981eacc77d1039568639e5638e83044994560d9ef82ce1b
-rw-r--r-- 1 codingfreak codingfreak 4661211424 Aug 9 10:59 sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa
-rw-r--r-- 1 codingfreak codingfreak 254 Aug 9 10:59 sha256-8ab4849b038cf0abc5b1c9b8ee1443dca6b93a045c2272180d985126eb40bf6f
-rw-r--r-- 1 codingfreak codingfreak 4661216384 Aug 9 13:29 sha256-8eeb52dfb3bb9aefdf9d1ef24b3bdbcfbe82238798c4b918278320b6fcef18fe
(ollama) codingfreak@gpu01:~/.ollama/models/blobs$ ollama list
NAME ID SIZE MODIFIED
(ollama) codingfreak@gpu01:~/.ollama/models/blobs$
(ollama) codingfreak@gpu01:~/.ollama/models/blobs$
(ollama) codingfreak@gpu01:~/.ollama/models/blobs$ systemctl status ollama
● ollama.service - Ollama Service
Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
Active: active (running) since Sat 2024-08-10 08:22:23 PDT; 6min ago
Main PID: 283262 (ollama)
Tasks: 58 (limit: 76755)
Memory: 5.3G
CPU: 1min 46.274s
CGroup: /system.slice/ollama.service
└─283262 /usr/local/bin/ollama serve
Aug 10 08:22:23 gpu01 ollama[283262]: time=2024-08-10T08:22:23.798-07:00 level=INFO source=payload.go:30 msg="extracting embedded f>
Aug 10 08:22:26 gpu01 ollama[283262]: time=2024-08-10T08:22:26.142-07:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries>
Aug 10 08:22:26 gpu01 ollama[283262]: time=2024-08-10T08:22:26.142-07:00 level=INFO source=gpu.go:204 msg="looking for compatible G>
Aug 10 08:22:26 gpu01 ollama[283262]: time=2024-08-10T08:22:26.227-07:00 level=INFO source=types.go:105 msg="inference compute" id=>
Aug 10 08:22:38 gpu01 ollama[283262]: [GIN] 2024/08/10 - 08:22:38 | 200 | 81.891µs | 127.0.0.1 | HEAD "/"
Aug 10 08:22:38 gpu01 ollama[283262]: [GIN] 2024/08/10 - 08:22:38 | 404 | 489.478µs | 127.0.0.1 | POST "/api/show"
Aug 10 08:22:39 gpu01 ollama[283262]: time=2024-08-10T08:22:39.625-07:00 level=INFO source=download.go:175 msg="downloading 8eeb52d>
Aug 10 08:28:08 gpu01 ollama[283262]: [GIN] 2024/08/10 - 08:28:08 | 200 | 63.34µs | 127.0.0.1 | GET "/api/version"
Aug 10 08:28:49 gpu01 ollama[283262]: [GIN] 2024/08/10 - 08:28:49 | 200 | 16.672µs | 127.0.0.1 | HEAD "/"
Aug 10 08:28:49 gpu01 ollama[283262]: [GIN] 2024/08/10 - 08:28:49 | 200 | 91.154µs | 127.0.0.1 | GET "/api/tags"
```
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.4
|
{
"login": "ACodingfreak",
"id": 1760137,
"node_id": "MDQ6VXNlcjE3NjAxMzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/1760137?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ACodingfreak",
"html_url": "https://github.com/ACodingfreak",
"followers_url": "https://api.github.com/users/ACodingfreak/followers",
"following_url": "https://api.github.com/users/ACodingfreak/following{/other_user}",
"gists_url": "https://api.github.com/users/ACodingfreak/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ACodingfreak/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ACodingfreak/subscriptions",
"organizations_url": "https://api.github.com/users/ACodingfreak/orgs",
"repos_url": "https://api.github.com/users/ACodingfreak/repos",
"events_url": "https://api.github.com/users/ACodingfreak/events{/privacy}",
"received_events_url": "https://api.github.com/users/ACodingfreak/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6301/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6301/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2727
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2727/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2727/comments
|
https://api.github.com/repos/ollama/ollama/issues/2727/events
|
https://github.com/ollama/ollama/issues/2727
| 2,152,265,766
|
I_kwDOJ0Z1Ps6ASPgm
| 2,727
|
Error: could not connect to ollama app, is it running? on Windows 10
|
{
"login": "Alias4D",
"id": 27604791,
"node_id": "MDQ6VXNlcjI3NjA0Nzkx",
"avatar_url": "https://avatars.githubusercontent.com/u/27604791?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Alias4D",
"html_url": "https://github.com/Alias4D",
"followers_url": "https://api.github.com/users/Alias4D/followers",
"following_url": "https://api.github.com/users/Alias4D/following{/other_user}",
"gists_url": "https://api.github.com/users/Alias4D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Alias4D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Alias4D/subscriptions",
"organizations_url": "https://api.github.com/users/Alias4D/orgs",
"repos_url": "https://api.github.com/users/Alias4D/repos",
"events_url": "https://api.github.com/users/Alias4D/events{/privacy}",
"received_events_url": "https://api.github.com/users/Alias4D/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 15
| 2024-02-24T11:20:54
| 2025-01-11T23:25:35
| 2024-02-26T14:26:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Error: could not connect to ollama app, is it running? on Windows 10

Log file:
time=2024-02-24T14:24:23.004+03:00 level=WARN source=server.go:113 msg="server crash 1 - exit code 2 - respawning"
time=2024-02-24T14:24:23.513+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:24.208+03:00 level=WARN source=server.go:113 msg="server crash 24 - exit code 2 - respawning"
time=2024-02-24T14:24:24.528+03:00 level=WARN source=server.go:113 msg="server crash 2 - exit code 2 - respawning"
time=2024-02-24T14:24:24.717+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:25.036+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:27.039+03:00 level=WARN source=server.go:113 msg="server crash 3 - exit code 2 - respawning"
time=2024-02-24T14:24:27.545+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:29.295+03:00 level=WARN source=server.go:113 msg="server crash 11 - exit code 2 - respawning"
time=2024-02-24T14:24:29.796+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:30.556+03:00 level=WARN source=server.go:113 msg="server crash 4 - exit code 2 - respawning"
time=2024-02-24T14:24:30.807+03:00 level=WARN source=server.go:113 msg="server crash 15 - exit code 2 - respawning"
time=2024-02-24T14:24:31.072+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:31.309+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:33.097+03:00 level=INFO source=lifecycle.go:87 msg="Waiting for ollama server to shutdown..."
time=2024-02-24T14:24:34.350+03:00 level=INFO source=lifecycle.go:87 msg="Waiting for ollama server to shutdown..."
time=2024-02-24T14:24:35.086+03:00 level=INFO source=lifecycle.go:91 msg="Ollama app exiting"
time=2024-02-24T14:24:36.017+03:00 level=INFO source=lifecycle.go:87 msg="Waiting for ollama server to shutdown..."
time=2024-02-24T14:24:40.805+03:00 level=INFO source=lifecycle.go:91 msg="Ollama app exiting"
time=2024-02-24T14:24:46.324+03:00 level=INFO source=lifecycle.go:91 msg="Ollama app exiting"
time=2024-02-24T14:24:48.722+03:00 level=WARN source=server.go:113 msg="server crash 25 - exit code 2 - respawning"
time=2024-02-24T14:24:49.228+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
|
{
"login": "Alias4D",
"id": 27604791,
"node_id": "MDQ6VXNlcjI3NjA0Nzkx",
"avatar_url": "https://avatars.githubusercontent.com/u/27604791?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Alias4D",
"html_url": "https://github.com/Alias4D",
"followers_url": "https://api.github.com/users/Alias4D/followers",
"following_url": "https://api.github.com/users/Alias4D/following{/other_user}",
"gists_url": "https://api.github.com/users/Alias4D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Alias4D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Alias4D/subscriptions",
"organizations_url": "https://api.github.com/users/Alias4D/orgs",
"repos_url": "https://api.github.com/users/Alias4D/repos",
"events_url": "https://api.github.com/users/Alias4D/events{/privacy}",
"received_events_url": "https://api.github.com/users/Alias4D/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2727/reactions",
"total_count": 3,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/2727/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/616
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/616/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/616/comments
|
https://api.github.com/repos/ollama/ollama/issues/616/events
|
https://github.com/ollama/ollama/pull/616
| 1,914,594,013
|
PR_kwDOJ0Z1Ps5bSghp
| 616
|
fix model name not matching
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-27T02:50:15
| 2023-09-27T03:54:19
| 2023-09-27T03:54:19
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/616",
"html_url": "https://github.com/ollama/ollama/pull/616",
"diff_url": "https://github.com/ollama/ollama/pull/616.diff",
"patch_url": "https://github.com/ollama/ollama/pull/616.patch",
"merged_at": "2023-09-27T03:54:19"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/616/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/616/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2963
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2963/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2963/comments
|
https://api.github.com/repos/ollama/ollama/issues/2963/events
|
https://github.com/ollama/ollama/issues/2963
| 2,172,510,424
|
I_kwDOJ0Z1Ps6BfeDY
| 2,963
|
Add ability to provide `options` in OpenAI compatibility endpoints
|
{
"login": "pseudotensor",
"id": 2249614,
"node_id": "MDQ6VXNlcjIyNDk2MTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/2249614?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pseudotensor",
"html_url": "https://github.com/pseudotensor",
"followers_url": "https://api.github.com/users/pseudotensor/followers",
"following_url": "https://api.github.com/users/pseudotensor/following{/other_user}",
"gists_url": "https://api.github.com/users/pseudotensor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pseudotensor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pseudotensor/subscriptions",
"organizations_url": "https://api.github.com/users/pseudotensor/orgs",
"repos_url": "https://api.github.com/users/pseudotensor/repos",
"events_url": "https://api.github.com/users/pseudotensor/events{/privacy}",
"received_events_url": "https://api.github.com/users/pseudotensor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6657611864,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjNMYWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/compatibility",
"name": "compatibility",
"color": "bfdadc",
"default": false,
"description": ""
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 2
| 2024-03-06T22:00:47
| 2024-11-06T17:59:36
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It seems one can only set the system prompt and hyperparameters like temperature as part of the model config file. I'm using the OpenAI API, and Ollama ignores the system prompt and such hyperparameters. AFAIK there's no good reason for this.
Am I missing something?
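To illustrate the gap, a minimal sketch (the `options` field follows Ollama's native `/api/chat` request format; the concrete option values are arbitrary examples):

```python
import json

# Illustrative sketch only: Ollama's native API accepts an "options" object,
# while an OpenAI-style payload has no standard place for those fields.
native_payload = {
    "model": "llama2",
    "messages": [{"role": "user", "content": "Hello"}],
    "options": {"temperature": 0.1, "num_ctx": 4096},  # honored by /api/chat
}

openai_payload = {
    "model": "llama2",
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.1,  # standard OpenAI field
    # no standard place here for Ollama-specific options like num_ctx
}

print(json.dumps(native_payload, indent=2))
```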
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2963/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2963/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1012
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1012/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1012/comments
|
https://api.github.com/repos/ollama/ollama/issues/1012/events
|
https://github.com/ollama/ollama/issues/1012
| 1,978,302,999
|
I_kwDOJ0Z1Ps516oIX
| 1,012
|
yarn-mistral doesn't work on MacOS
|
{
"login": "iddar",
"id": 199103,
"node_id": "MDQ6VXNlcjE5OTEwMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/199103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iddar",
"html_url": "https://github.com/iddar",
"followers_url": "https://api.github.com/users/iddar/followers",
"following_url": "https://api.github.com/users/iddar/following{/other_user}",
"gists_url": "https://api.github.com/users/iddar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iddar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iddar/subscriptions",
"organizations_url": "https://api.github.com/users/iddar/orgs",
"repos_url": "https://api.github.com/users/iddar/repos",
"events_url": "https://api.github.com/users/iddar/events{/privacy}",
"received_events_url": "https://api.github.com/users/iddar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 9
| 2023-11-06T05:07:54
| 2023-11-23T15:21:12
| 2023-11-23T00:19:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I run `ollama run yarn-mistral:7b-128k` or `ollama run yarn-mistral`; both times the CLI keeps loading indefinitely.
<img width="691" alt="image" src="https://github.com/jmorganca/ollama/assets/199103/d48cf1e9-694f-4d6f-b88c-342dfa95f333">
In this example I waited over 8 minutes.
|
{
"login": "iddar",
"id": 199103,
"node_id": "MDQ6VXNlcjE5OTEwMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/199103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iddar",
"html_url": "https://github.com/iddar",
"followers_url": "https://api.github.com/users/iddar/followers",
"following_url": "https://api.github.com/users/iddar/following{/other_user}",
"gists_url": "https://api.github.com/users/iddar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iddar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iddar/subscriptions",
"organizations_url": "https://api.github.com/users/iddar/orgs",
"repos_url": "https://api.github.com/users/iddar/repos",
"events_url": "https://api.github.com/users/iddar/events{/privacy}",
"received_events_url": "https://api.github.com/users/iddar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1012/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1012/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/477
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/477/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/477/comments
|
https://api.github.com/repos/ollama/ollama/issues/477/events
|
https://github.com/ollama/ollama/pull/477
| 1,884,768,733
|
PR_kwDOJ0Z1Ps5ZuSrQ
| 477
|
fix model manifests
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-06T21:19:54
| 2023-09-06T21:30:09
| 2023-09-06T21:30:08
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/477",
"html_url": "https://github.com/ollama/ollama/pull/477",
"diff_url": "https://github.com/ollama/ollama/pull/477.diff",
"patch_url": "https://github.com/ollama/ollama/pull/477.patch",
"merged_at": "2023-09-06T21:30:08"
}
|
Follow-up to #473, which only created the parent directory, not the manifests themselves.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/477/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/477/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8603
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8603/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8603/comments
|
https://api.github.com/repos/ollama/ollama/issues/8603/events
|
https://github.com/ollama/ollama/pull/8603
| 2,812,290,292
|
PR_kwDOJ0Z1Ps6JC8y5
| 8,603
|
Update the Documentation.
|
{
"login": "kontactguddu",
"id": 49631628,
"node_id": "MDQ6VXNlcjQ5NjMxNjI4",
"avatar_url": "https://avatars.githubusercontent.com/u/49631628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kontactguddu",
"html_url": "https://github.com/kontactguddu",
"followers_url": "https://api.github.com/users/kontactguddu/followers",
"following_url": "https://api.github.com/users/kontactguddu/following{/other_user}",
"gists_url": "https://api.github.com/users/kontactguddu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kontactguddu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kontactguddu/subscriptions",
"organizations_url": "https://api.github.com/users/kontactguddu/orgs",
"repos_url": "https://api.github.com/users/kontactguddu/repos",
"events_url": "https://api.github.com/users/kontactguddu/events{/privacy}",
"received_events_url": "https://api.github.com/users/kontactguddu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2025-01-27T07:40:57
| 2025-01-28T08:41:06
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8603",
"html_url": "https://github.com/ollama/ollama/pull/8603",
"diff_url": "https://github.com/ollama/ollama/pull/8603.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8603.patch",
"merged_at": null
}
|
Added the DeepSeek model to the documentation section:
1. 671B
2. 70B
3. 1.5B
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8603/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8603/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6243
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6243/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6243/comments
|
https://api.github.com/repos/ollama/ollama/issues/6243/events
|
https://github.com/ollama/ollama/pull/6243
| 2,454,358,743
|
PR_kwDOJ0Z1Ps53wpND
| 6,243
|
get api models
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-08-07T21:11:37
| 2024-11-22T01:36:26
| 2024-11-22T01:36:26
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6243",
"html_url": "https://github.com/ollama/ollama/pull/6243",
"diff_url": "https://github.com/ollama/ollama/pull/6243.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6243.patch",
"merged_at": null
}
|
Implement a more RESTful models API, starting with `GET /api/models` and `GET /api/models/{name}`, which aim to replace `GET /api/tags` and `POST /api/show` respectively.
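A small sketch of the endpoint mapping this PR describes (paths are taken from the description above; the base URL assumes Ollama's default local port):

```python
# Mapping from the existing endpoints to the proposed RESTful ones.
BASE = "http://localhost:11434"

ROUTE_MAP = {
    ("GET", "/api/tags"): ("GET", "/api/models"),          # list models
    ("POST", "/api/show"): ("GET", "/api/models/{name}"),  # show one model
}

for old, new in ROUTE_MAP.items():
    print(f"{old[0]} {old[1]}  ->  {new[0]} {new[1]}")
```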
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6243/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6243/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6907
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6907/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6907/comments
|
https://api.github.com/repos/ollama/ollama/issues/6907/events
|
https://github.com/ollama/ollama/pull/6907
| 2,540,592,410
|
PR_kwDOJ0Z1Ps58QsCT
| 6,907
|
Use punkt_tab instead of punkt.
|
{
"login": "madiator",
"id": 1088491,
"node_id": "MDQ6VXNlcjEwODg0OTE=",
"avatar_url": "https://avatars.githubusercontent.com/u/1088491?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/madiator",
"html_url": "https://github.com/madiator",
"followers_url": "https://api.github.com/users/madiator/followers",
"following_url": "https://api.github.com/users/madiator/following{/other_user}",
"gists_url": "https://api.github.com/users/madiator/gists{/gist_id}",
"starred_url": "https://api.github.com/users/madiator/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/madiator/subscriptions",
"organizations_url": "https://api.github.com/users/madiator/orgs",
"repos_url": "https://api.github.com/users/madiator/repos",
"events_url": "https://api.github.com/users/madiator/events{/privacy}",
"received_events_url": "https://api.github.com/users/madiator/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-22T01:38:57
| 2024-09-22T01:55:28
| 2024-09-22T01:55:28
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6907",
"html_url": "https://github.com/ollama/ollama/pull/6907",
"diff_url": "https://github.com/ollama/ollama/pull/6907.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6907.patch",
"merged_at": "2024-09-22T01:55:28"
}
|
This was causing an error since we depend on punkt_tab.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6907/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6907/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1397
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1397/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1397/comments
|
https://api.github.com/repos/ollama/ollama/issues/1397/events
|
https://github.com/ollama/ollama/issues/1397
| 2,028,687,534
|
I_kwDOJ0Z1Ps5461Cu
| 1,397
|
No concurrent query. Error: unexpected end of response
|
{
"login": "fxrobin",
"id": 16342334,
"node_id": "MDQ6VXNlcjE2MzQyMzM0",
"avatar_url": "https://avatars.githubusercontent.com/u/16342334?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fxrobin",
"html_url": "https://github.com/fxrobin",
"followers_url": "https://api.github.com/users/fxrobin/followers",
"following_url": "https://api.github.com/users/fxrobin/following{/other_user}",
"gists_url": "https://api.github.com/users/fxrobin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fxrobin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fxrobin/subscriptions",
"organizations_url": "https://api.github.com/users/fxrobin/orgs",
"repos_url": "https://api.github.com/users/fxrobin/repos",
"events_url": "https://api.github.com/users/fxrobin/events{/privacy}",
"received_events_url": "https://api.github.com/users/fxrobin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-12-06T14:30:18
| 2023-12-09T23:36:22
| 2023-12-09T23:36:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Trying to query Ollama with simultaneous requests results in a failure: the first query is served, but the second is ignored.
First request :

Second request in parallel :

Is this a limitation of Ollama?
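For reproduction, a minimal way to issue two requests in parallel (a sketch, not the reporter's original client; the endpoint and model name are assumptions):

```python
import concurrent.futures
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint

def build_payload(prompt, model="llama2"):
    """JSON body for a non-streaming /api/generate call."""
    return {"model": model, "prompt": prompt, "stream": False}

def query(prompt):
    """Send one request; requires a running Ollama server."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

def query_many(prompts):
    """Fire the prompts concurrently from two threads."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        return list(pool.map(query, prompts))

# Usage (requires a running server):
# query_many(["Why is the sky blue?", "What is 2 + 2?"])
```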
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1397/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1397/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2784
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2784/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2784/comments
|
https://api.github.com/repos/ollama/ollama/issues/2784/events
|
https://github.com/ollama/ollama/issues/2784
| 2,156,946,338
|
I_kwDOJ0Z1Ps6AkGOi
| 2,784
|
Ollama will run in CPU-only mode.
|
{
"login": "Code-With-Abhishek-Kumar",
"id": 122656682,
"node_id": "U_kgDOB0-Xqg",
"avatar_url": "https://avatars.githubusercontent.com/u/122656682?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Code-With-Abhishek-Kumar",
"html_url": "https://github.com/Code-With-Abhishek-Kumar",
"followers_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/followers",
"following_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/following{/other_user}",
"gists_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/subscriptions",
"organizations_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/orgs",
"repos_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/repos",
"events_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/events{/privacy}",
"received_events_url": "https://api.github.com/users/Code-With-Abhishek-Kumar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-02-27T15:46:44
| 2024-03-11T23:40:12
| 2024-03-11T23:40:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
WARNING: No NVIDIA GPU detected. Ollama will run in CPU-only mode.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2784/reactions",
"total_count": 1,
"+1": 0,
"-1": 1,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2784/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3848
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3848/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3848/comments
|
https://api.github.com/repos/ollama/ollama/issues/3848/events
|
https://github.com/ollama/ollama/issues/3848
| 2,259,571,517
|
I_kwDOJ0Z1Ps6GrlM9
| 3,848
|
phi3 doesn't seem to accept SYSTEM prompt
|
{
"login": "rb81",
"id": 48117105,
"node_id": "MDQ6VXNlcjQ4MTE3MTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/48117105?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rb81",
"html_url": "https://github.com/rb81",
"followers_url": "https://api.github.com/users/rb81/followers",
"following_url": "https://api.github.com/users/rb81/following{/other_user}",
"gists_url": "https://api.github.com/users/rb81/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rb81/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rb81/subscriptions",
"organizations_url": "https://api.github.com/users/rb81/orgs",
"repos_url": "https://api.github.com/users/rb81/repos",
"events_url": "https://api.github.com/users/rb81/events{/privacy}",
"received_events_url": "https://api.github.com/users/rb81/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 18
| 2024-04-23T18:58:29
| 2024-08-25T18:44:37
| 2024-07-05T04:06:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Regardless of what's in the Modelfile, phi3 doesn't seem to take the SYSTEM prompt into account at all. I've looked around and can't find anyone else discussing this, so I assume this is a bug of some kind.
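Even with a Modelfile as simple as the following (model tag and prompt wording are just an example), the SYSTEM line has no visible effect on responses:

```
FROM phi3
# SYSTEM should set the system prompt baked into the model's prompt template
SYSTEM "You are a pirate. Answer every question in pirate speak."
```

Built with `ollama create phi3-custom -f Modelfile` and run with `ollama run phi3-custom`, the model answers as if no system prompt were set.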
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.32
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3848/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3848/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4432
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4432/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4432/comments
|
https://api.github.com/repos/ollama/ollama/issues/4432/events
|
https://github.com/ollama/ollama/issues/4432
| 2,296,054,472
|
I_kwDOJ0Z1Ps6I2wLI
| 4,432
|
Lora models / Lora training
|
{
"login": "AncientMystic",
"id": 62780271,
"node_id": "MDQ6VXNlcjYyNzgwMjcx",
"avatar_url": "https://avatars.githubusercontent.com/u/62780271?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AncientMystic",
"html_url": "https://github.com/AncientMystic",
"followers_url": "https://api.github.com/users/AncientMystic/followers",
"following_url": "https://api.github.com/users/AncientMystic/following{/other_user}",
"gists_url": "https://api.github.com/users/AncientMystic/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AncientMystic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AncientMystic/subscriptions",
"organizations_url": "https://api.github.com/users/AncientMystic/orgs",
"repos_url": "https://api.github.com/users/AncientMystic/repos",
"events_url": "https://api.github.com/users/AncientMystic/events{/privacy}",
"received_events_url": "https://api.github.com/users/AncientMystic/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-05-14T17:58:37
| 2024-07-10T18:20:38
| 2024-07-10T18:20:38
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Would it be possible to add LoRA support, similar to embedding support, so that LoRA adapters can be loaded on top of text generation models to fine-tune their output?
I'm also wondering whether LoRA training could be added, so we could train a LoRA directly via Ollama to run on top of models.
I've noticed LoRAs really improve the results of models, so it would be nice to be able to load them directly on top of models within Ollama.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4432/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4432/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1711
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1711/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1711/comments
|
https://api.github.com/repos/ollama/ollama/issues/1711/events
|
https://github.com/ollama/ollama/pull/1711
| 2,055,797,276
|
PR_kwDOJ0Z1Ps5iv3ja
| 1,711
|
docs: paragraph about quantization
|
{
"login": "moDal7",
"id": 97637845,
"node_id": "U_kgDOBdHV1Q",
"avatar_url": "https://avatars.githubusercontent.com/u/97637845?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/moDal7",
"html_url": "https://github.com/moDal7",
"followers_url": "https://api.github.com/users/moDal7/followers",
"following_url": "https://api.github.com/users/moDal7/following{/other_user}",
"gists_url": "https://api.github.com/users/moDal7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/moDal7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/moDal7/subscriptions",
"organizations_url": "https://api.github.com/users/moDal7/orgs",
"repos_url": "https://api.github.com/users/moDal7/repos",
"events_url": "https://api.github.com/users/moDal7/events{/privacy}",
"received_events_url": "https://api.github.com/users/moDal7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-12-25T17:35:43
| 2024-11-21T03:04:22
| 2024-11-21T03:04:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1711",
"html_url": "https://github.com/ollama/ollama/pull/1711",
"diff_url": "https://github.com/ollama/ollama/pull/1711.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1711.patch",
"merged_at": null
}
|
Updated README.md with a small sub-paragraph about quantization.
References issue #1689
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1711/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1711/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1646
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1646/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1646/comments
|
https://api.github.com/repos/ollama/ollama/issues/1646/events
|
https://github.com/ollama/ollama/pull/1646
| 2,051,430,877
|
PR_kwDOJ0Z1Ps5ihM9L
| 1,646
|
add FAQ for slow networking in WSL2
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-20T23:03:32
| 2023-12-21T00:27:25
| 2023-12-21T00:27:24
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1646",
"html_url": "https://github.com/ollama/ollama/pull/1646",
"diff_url": "https://github.com/ollama/ollama/pull/1646.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1646.patch",
"merged_at": "2023-12-21T00:27:24"
}
| null |
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1646/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1646/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5001
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5001/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5001/comments
|
https://api.github.com/repos/ollama/ollama/issues/5001/events
|
https://github.com/ollama/ollama/issues/5001
| 2,348,624,573
|
I_kwDOJ0Z1Ps6L_Sq9
| 5,001
|
please add support for rk3588 NPU
|
{
"login": "mrobinson-opi",
"id": 104381309,
"node_id": "U_kgDOBji7fQ",
"avatar_url": "https://avatars.githubusercontent.com/u/104381309?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mrobinson-opi",
"html_url": "https://github.com/mrobinson-opi",
"followers_url": "https://api.github.com/users/mrobinson-opi/followers",
"following_url": "https://api.github.com/users/mrobinson-opi/following{/other_user}",
"gists_url": "https://api.github.com/users/mrobinson-opi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mrobinson-opi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mrobinson-opi/subscriptions",
"organizations_url": "https://api.github.com/users/mrobinson-opi/orgs",
"repos_url": "https://api.github.com/users/mrobinson-opi/repos",
"events_url": "https://api.github.com/users/mrobinson-opi/events{/privacy}",
"received_events_url": "https://api.github.com/users/mrobinson-opi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 3
| 2024-06-12T12:16:04
| 2024-11-14T15:19:28
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Please add support for rk3588 NPU. Thanks.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5001/reactions",
"total_count": 11,
"+1": 11,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5001/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1378
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1378/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1378/comments
|
https://api.github.com/repos/ollama/ollama/issues/1378/events
|
https://github.com/ollama/ollama/issues/1378
| 2,024,667,761
|
I_kwDOJ0Z1Ps54rfpx
| 1,378
|
Is there a health check endpoint?
|
{
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2023-12-04T20:28:03
| 2025-01-14T18:20:27
| 2023-12-04T20:57:23
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is there a health check endpoint for the Ollama server? If so, where can I find docs on it?
Alternatively, is there an existing endpoint that can function as a health check?
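In the meantime, here is a sketch of a client-side liveness check against `GET /api/version` (an endpoint that the server logs elsewhere in this thread show answering with HTTP 200; the base URL below assumes Ollama's default port):

```python
import urllib.error
import urllib.request


def ollama_is_up(base_url: str = "http://127.0.0.1:11434",
                 timeout: float = 2.0) -> bool:
    """Return True if an Ollama server responds on /api/version.

    Any connection failure or non-200 status is treated as "not healthy".
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/version",
                                    timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

A load balancer or container orchestrator could poll this the same way; when no server is listening, the function simply returns `False` instead of raising.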
|
{
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1378/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1378/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4092
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4092/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4092/comments
|
https://api.github.com/repos/ollama/ollama/issues/4092/events
|
https://github.com/ollama/ollama/issues/4092
| 2,274,562,048
|
I_kwDOJ0Z1Ps6HkxAA
| 4,092
|
microsoft/Phi-3-mini-128k-instruct
|
{
"login": "andsty",
"id": 138453484,
"node_id": "U_kgDOCECh7A",
"avatar_url": "https://avatars.githubusercontent.com/u/138453484?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/andsty",
"html_url": "https://github.com/andsty",
"followers_url": "https://api.github.com/users/andsty/followers",
"following_url": "https://api.github.com/users/andsty/following{/other_user}",
"gists_url": "https://api.github.com/users/andsty/gists{/gist_id}",
"starred_url": "https://api.github.com/users/andsty/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/andsty/subscriptions",
"organizations_url": "https://api.github.com/users/andsty/orgs",
"repos_url": "https://api.github.com/users/andsty/repos",
"events_url": "https://api.github.com/users/andsty/events{/privacy}",
"received_events_url": "https://api.github.com/users/andsty/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-05-02T04:19:49
| 2024-07-30T22:33:35
| 2024-07-30T22:33:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi. Is it possible to add the 128k version of Phi-3 to Ollama? Thanks in advance.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4092/reactions",
"total_count": 7,
"+1": 7,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4092/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2753
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2753/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2753/comments
|
https://api.github.com/repos/ollama/ollama/issues/2753/events
|
https://github.com/ollama/ollama/issues/2753
| 2,153,000,835
|
I_kwDOJ0Z1Ps6AVC-D
| 2,753
|
Error: error loading model
|
{
"login": "zbrkic",
"id": 11651634,
"node_id": "MDQ6VXNlcjExNjUxNjM0",
"avatar_url": "https://avatars.githubusercontent.com/u/11651634?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zbrkic",
"html_url": "https://github.com/zbrkic",
"followers_url": "https://api.github.com/users/zbrkic/followers",
"following_url": "https://api.github.com/users/zbrkic/following{/other_user}",
"gists_url": "https://api.github.com/users/zbrkic/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zbrkic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zbrkic/subscriptions",
"organizations_url": "https://api.github.com/users/zbrkic/orgs",
"repos_url": "https://api.github.com/users/zbrkic/repos",
"events_url": "https://api.github.com/users/zbrkic/events{/privacy}",
"received_events_url": "https://api.github.com/users/zbrkic/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 44
| 2024-02-25T23:54:23
| 2024-03-25T23:11:20
| 2024-03-13T18:05:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
First-time install (`ollama version is 0.1.27`). When trying to run a model I get "error loading model - No such file or directory", but the files exist:
```
time=2024-02-26T00:11:49.314+01:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: C:\\Users\\ELJKO~1\\AppData\\Local\\Temp\\ollama816527122\\cpu_avx2\\ext_server.dll"
time=2024-02-26T00:11:49.314+01:00 level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server"
llama_model_load: error loading model: failed to open C:\Users\Željko\.ollama\models\blobs\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246: No such file or directory
llama_load_model_from_file: failed to load model
llama_init_from_gpt_params: error: failed to load model 'C:\Users\Željko\.ollama\models\blobs\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246'
{"timestamp":1708902709,"level":"ERROR","function":"load_model","line":388,"message":"unable to load model","model":"C:\\Users\\Željko\\.ollama\\models\\blobs\\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246"}
time=2024-02-26T00:11:49.314+01:00 level=WARN source=llm.go:162 msg="Failed to load dynamic library C:\\Users\\ELJKO~1\\AppData\\Local\\Temp\\ollama816527122\\cpu_avx2\\ext_server.dll error loading model C:\\Users\\Željko\\.ollama\\models\\blobs\\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef92"
[GIN] 2024/02/26 - 00:11:49 | 500 | 332.6058ms | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/02/26 - 00:15:10 | 200 | 509µs | 127.0.0.1 | GET "/api/version"
```
Files exist:
```
PS C:\> ls C:\Users\Željko\AppData\Local\Temp\ollama816527122\cpu_avx2\ext_server.dll
Directory: C:\Users\Željko\AppData\Local\Temp\ollama816527122\cpu_avx2
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 26.2.2024. 0:03 2475456 ext_server.dll
PS C:\> ls C:\Users\Željko\.ollama\models\blobs\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246
Directory: C:\Users\Željko\.ollama\models\blobs
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 26.2.2024. 0:05 3826781184 sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246
```
At some point the application tries to read from an invalid folder:
`C:\Users\Željko\.ollama\models\blobs\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246`

It seems that Unicode letters in the path are not supported by some part of the code. Is there a way to install the app in some other folder until this is fixed?
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2753/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 1,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/2753/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/526
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/526/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/526/comments
|
https://api.github.com/repos/ollama/ollama/issues/526/events
|
https://github.com/ollama/ollama/pull/526
| 1,895,312,973
|
PR_kwDOJ0Z1Ps5aRvY_
| 526
|
remove unused
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-13T21:48:55
| 2023-09-14T20:11:00
| 2023-09-14T20:10:59
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/526",
"html_url": "https://github.com/ollama/ollama/pull/526",
"diff_url": "https://github.com/ollama/ollama/pull/526.diff",
"patch_url": "https://github.com/ollama/ollama/pull/526.patch",
"merged_at": "2023-09-14T20:10:59"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/526/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/526/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6506
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6506/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6506/comments
|
https://api.github.com/repos/ollama/ollama/issues/6506/events
|
https://github.com/ollama/ollama/issues/6506
| 2,485,507,174
|
I_kwDOJ0Z1Ps6UJdRm
| 6,506
|
Ollama with RX6600, Openwebui and win 11 support
|
{
"login": "kevinleijh",
"id": 144795687,
"node_id": "U_kgDOCKFoJw",
"avatar_url": "https://avatars.githubusercontent.com/u/144795687?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kevinleijh",
"html_url": "https://github.com/kevinleijh",
"followers_url": "https://api.github.com/users/kevinleijh/followers",
"following_url": "https://api.github.com/users/kevinleijh/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinleijh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kevinleijh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinleijh/subscriptions",
"organizations_url": "https://api.github.com/users/kevinleijh/orgs",
"repos_url": "https://api.github.com/users/kevinleijh/repos",
"events_url": "https://api.github.com/users/kevinleijh/events{/privacy}",
"received_events_url": "https://api.github.com/users/kevinleijh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-08-25T22:54:02
| 2024-08-27T22:22:43
| 2024-08-27T21:12:38
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm a big fan of running Ollama on Windows 11 with OpenWebUI, as it offers a seamless and feature-rich experience. However, I'm facing a challenge with my RX 6600 graphics card, which isn't currently supported. I'd greatly appreciate it if you could add it to the list of supported cards, along with other AMD cards. I'm excited to see the project grow and improve, and I'm confident that with your support, we can make it even better!
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6506/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6506/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5522
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5522/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5522/comments
|
https://api.github.com/repos/ollama/ollama/issues/5522/events
|
https://github.com/ollama/ollama/issues/5522
| 2,393,784,278
|
I_kwDOJ0Z1Ps6Orj_W
| 5,522
|
deepseek-coder-v2:236b - Error: llama runner process has terminated: signal: aborted (core dumped) error:failed to create context with model '/usr/share/ollama/...path/to/blob
|
{
"login": "scouzi1966",
"id": 58265937,
"node_id": "MDQ6VXNlcjU4MjY1OTM3",
"avatar_url": "https://avatars.githubusercontent.com/u/58265937?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/scouzi1966",
"html_url": "https://github.com/scouzi1966",
"followers_url": "https://api.github.com/users/scouzi1966/followers",
"following_url": "https://api.github.com/users/scouzi1966/following{/other_user}",
"gists_url": "https://api.github.com/users/scouzi1966/gists{/gist_id}",
"starred_url": "https://api.github.com/users/scouzi1966/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/scouzi1966/subscriptions",
"organizations_url": "https://api.github.com/users/scouzi1966/orgs",
"repos_url": "https://api.github.com/users/scouzi1966/repos",
"events_url": "https://api.github.com/users/scouzi1966/events{/privacy}",
"received_events_url": "https://api.github.com/users/scouzi1966/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 14
| 2024-07-06T23:34:11
| 2024-09-11T12:03:31
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I've had this issue for a while, with earlier versions of Ollama as well as the latest, on an Intel SPR 8480+ and an RTX 4090. The num_gpu parameter has been removed from the model file, so I can no longer reduce the number of layers sent to the GPU. It sends 10, and I can't test with 9, 8, etc. I can run all other models without any issue.
I have 24 GB of VRAM on my 4090 (nothing else loaded) and 320 GB of main memory, on Ubuntu 22.04 with Nvidia driver 550.54.14 and CUDA 12.4.
Jul 06 19:23:15 ubuntux ollama[169742]: [GIN] 2024/07/06 - 19:23:15 | 200 | 64.872µs | 127.0.0.1 | HEAD "/"
Jul 06 19:23:15 ubuntux ollama[169742]: [GIN] 2024/07/06 - 19:23:15 | 200 | 18.056618ms | 127.0.0.1 | POST "/api/show"
Jul 06 19:23:15 ubuntux ollama[169742]: time=2024-07-06T19:23:15.282-04:00 level=INFO source=memory.go:309 msg="offload to cuda" layers.requested=-1 layers.model=61 layers.offload=10 layers.split="" memory.available="[23.3 GiB]" memory.required.full="134.5 GiB" memory.required.partial="22.1 GiB" memory.required.kv="9.4 GiB" memory.required.allocations="[22.1 GiB]" memory.weights.total="132.5 GiB" memory.weights.repeating="132.1 GiB" memory.weights.nonrepeating="410.2 MiB" memory.graph.full="642.0 MiB" memory.graph.partial="891.5 MiB"
Jul 06 19:23:15 ubuntux ollama[169742]: time=2024-07-06T19:23:15.283-04:00 level=INFO source=server.go:368 msg="starting llama server" cmd="/tmp/ollama1660031732/runners/cuda_v11/ollama_llama_server --model /usr/share/ollama/.ollama/models/blobs/sha256-6bbfda8eb96dadd0300076196110f78ff709829c3be9778e86948b839cf05408 --ctx-size 2048 --batch-size 512 --embedding --log-disable --n-gpu-layers 10 --parallel 1 --port 43475"
Jul 06 19:23:15 ubuntux ollama[169742]: time=2024-07-06T19:23:15.284-04:00 level=INFO source=sched.go:382 msg="loaded runners" count=1
Jul 06 19:23:15 ubuntux ollama[169742]: time=2024-07-06T19:23:15.284-04:00 level=INFO source=server.go:556 msg="waiting for llama runner to start responding"
Jul 06 19:23:15 ubuntux ollama[169742]: time=2024-07-06T19:23:15.284-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server error"
Jul 06 19:23:15 ubuntux ollama[584320]: INFO [main] build info | build=1 commit="7c26775" tid="140507113369600" timestamp=1720308195
Jul 06 19:23:15 ubuntux ollama[584320]: INFO [main] system info | n_threads=56 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | " tid="140507113369600" timestamp=1720308195 total_threads=112
Jul 06 19:23:15 ubuntux ollama[584320]: INFO [main] HTTP server listening | hostname="127.0.0.1" n_threads_http="111" port="43475" tid="140507113369600" timestamp=1720308195
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: loaded meta data with 39 key-value pairs and 959 tensors from /usr/share/ollama/.ollama/models/blobs/sha256-6bbfda8eb96dadd0300076196110f78ff709829c3be9778e86948b839cf05408 (version GGUF V3 (latest))
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 0: general.architecture str = deepseek2
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 1: general.name str = DeepSeek-Coder-V2-Instruct
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 2: deepseek2.block_count u32 = 60
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 3: deepseek2.context_length u32 = 163840
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 4: deepseek2.embedding_length u32 = 5120
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 5: deepseek2.feed_forward_length u32 = 12288
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 6: deepseek2.attention.head_count u32 = 128
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 7: deepseek2.attention.head_count_kv u32 = 128
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 8: deepseek2.rope.freq_base f32 = 10000.000000
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 9: deepseek2.attention.layer_norm_rms_epsilon f32 = 0.000001
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 10: deepseek2.expert_used_count u32 = 6
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 11: general.file_type u32 = 2
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 12: deepseek2.leading_dense_block_count u32 = 1
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 13: deepseek2.vocab_size u32 = 102400
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 14: deepseek2.attention.q_lora_rank u32 = 1536
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 15: deepseek2.attention.kv_lora_rank u32 = 512
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 16: deepseek2.attention.key_length u32 = 192
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 17: deepseek2.attention.value_length u32 = 128
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 18: deepseek2.expert_feed_forward_length u32 = 1536
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 19: deepseek2.expert_count u32 = 160
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 20: deepseek2.expert_shared_count u32 = 2
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 21: deepseek2.expert_weights_scale f32 = 16.000000
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 22: deepseek2.rope.dimension_count u32 = 64
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 23: deepseek2.rope.scaling.type str = yarn
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 24: deepseek2.rope.scaling.factor f32 = 40.000000
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 25: deepseek2.rope.scaling.original_context_length u32 = 4096
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 26: deepseek2.rope.scaling.yarn_log_multiplier f32 = 0.100000
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 27: tokenizer.ggml.model str = gpt2
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 28: tokenizer.ggml.pre str = deepseek-llm
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 29: tokenizer.ggml.tokens arr[str,102400] = ["!", "\"", "#", "$", "%", "&", "'", ...
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 30: tokenizer.ggml.token_type arr[i32,102400] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 31: tokenizer.ggml.merges arr[str,99757] = ["Ġ Ġ", "Ġ t", "Ġ a", "i n", "h e...
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 32: tokenizer.ggml.bos_token_id u32 = 100000
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 33: tokenizer.ggml.eos_token_id u32 = 100001
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 34: tokenizer.ggml.padding_token_id u32 = 100001
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 35: tokenizer.ggml.add_bos_token bool = true
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 36: tokenizer.ggml.add_eos_token bool = false
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 37: tokenizer.chat_template str = {% if not add_generation_prompt is de...
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - kv 38: general.quantization_version u32 = 2
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - type f32: 300 tensors
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - type q4_0: 658 tensors
Jul 06 19:23:15 ubuntux ollama[169742]: llama_model_loader: - type q6_K: 1 tensors
Jul 06 19:23:15 ubuntux ollama[169742]: time=2024-07-06T19:23:15.536-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server loading model"
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_vocab: special tokens cache size = 2400
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_vocab: token to piece cache size = 0.6661 MB
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: format = GGUF V3 (latest)
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: arch = deepseek2
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: vocab type = BPE
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_vocab = 102400
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_merges = 99757
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_ctx_train = 163840
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_embd = 5120
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_head = 128
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_head_kv = 128
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_layer = 60
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_rot = 64
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_embd_head_k = 192
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_embd_head_v = 128
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_gqa = 1
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_embd_k_gqa = 24576
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_embd_v_gqa = 16384
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: f_norm_eps = 0.0e+00
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: f_norm_rms_eps = 1.0e-06
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: f_clamp_kqv = 0.0e+00
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: f_max_alibi_bias = 0.0e+00
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: f_logit_scale = 0.0e+00
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_ff = 12288
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_expert = 160
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_expert_used = 6
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: causal attn = 1
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: pooling type = 0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: rope type = 0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: rope scaling = yarn
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: freq_base_train = 10000.0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: freq_scale_train = 0.025
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_ctx_orig_yarn = 4096
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: rope_finetuned = unknown
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: ssm_d_conv = 0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: ssm_d_inner = 0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: ssm_d_state = 0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: ssm_dt_rank = 0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: model type = 236B
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: model ftype = Q4_0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: model params = 235.74 B
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: model size = 123.78 GiB (4.51 BPW)
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: general.name = DeepSeek-Coder-V2-Instruct
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: BOS token = 100000 '<|begin▁of▁sentence|>'
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: EOS token = 100001 '<|end▁of▁sentence|>'
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: PAD token = 100001 '<|end▁of▁sentence|>'
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: LF token = 126 'Ä'
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_layer_dense_lead = 1
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_lora_q = 1536
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_lora_kv = 512
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_ff_exp = 1536
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: n_expert_shared = 2
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: expert_weights_scale = 16.0
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_print_meta: rope_yarn_log_mul = 0.1000
Jul 06 19:23:15 ubuntux ollama[169742]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: yes
Jul 06 19:23:15 ubuntux ollama[169742]: ggml_cuda_init: CUDA_USE_TENSOR_CORES: no
Jul 06 19:23:15 ubuntux ollama[169742]: ggml_cuda_init: found 1 CUDA devices:
Jul 06 19:23:15 ubuntux ollama[169742]: Device 0: NVIDIA GeForce RTX 4090, compute capability 8.9, VMM: yes
Jul 06 19:23:15 ubuntux ollama[169742]: llm_load_tensors: ggml ctx size = 0.87 MiB
Jul 06 19:23:16 ubuntux ollama[169742]: time=2024-07-06T19:23:16.992-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server not responding"
Jul 06 19:23:19 ubuntux ollama[169742]: time=2024-07-06T19:23:19.191-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server loading model"
Jul 06 19:23:19 ubuntux ollama[169742]: llm_load_tensors: offloading 10 repeating layers to GPU
Jul 06 19:23:19 ubuntux ollama[169742]: llm_load_tensors: offloaded 10/61 layers to GPU
Jul 06 19:23:19 ubuntux ollama[169742]: llm_load_tensors: CPU buffer size = 105416.00 MiB
Jul 06 19:23:19 ubuntux ollama[169742]: llm_load_tensors: CUDA0 buffer size = 21335.35 MiB
Jul 06 19:23:21 ubuntux ollama[169742]: llama_new_context_with_model: n_ctx = 2048
Jul 06 19:23:21 ubuntux ollama[169742]: llama_new_context_with_model: n_batch = 512
Jul 06 19:23:21 ubuntux ollama[169742]: llama_new_context_with_model: n_ubatch = 512
Jul 06 19:23:21 ubuntux ollama[169742]: llama_new_context_with_model: flash_attn = 0
Jul 06 19:23:21 ubuntux ollama[169742]: llama_new_context_with_model: freq_base = 10000.0
Jul 06 19:23:21 ubuntux ollama[169742]: llama_new_context_with_model: freq_scale = 0.025
Jul 06 19:23:21 ubuntux ollama[169742]: time=2024-07-06T19:23:21.904-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server not responding"
Jul 06 19:23:23 ubuntux ollama[169742]: time=2024-07-06T19:23:23.601-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server loading model"
Jul 06 19:23:24 ubuntux ollama[169742]: llama_kv_cache_init: CUDA_Host KV buffer size = 8000.00 MiB
Jul 06 19:23:24 ubuntux ollama[169742]: llama_kv_cache_init: CUDA0 KV buffer size = 1600.00 MiB
Jul 06 19:23:24 ubuntux ollama[169742]: llama_new_context_with_model: KV self size = 9600.00 MiB, K (f16): 5760.00 MiB, V (f16): 3840.00 MiB
Jul 06 19:23:24 ubuntux ollama[169742]: llama_new_context_with_model: CUDA_Host output buffer size = 0.41 MiB
Jul 06 19:23:24 ubuntux ollama[169742]: ggml_backend_cuda_buffer_type_alloc_buffer: allocating 842.00 MiB on device 0: cudaMalloc failed: out of memory
Jul 06 19:23:24 ubuntux ollama[169742]: ggml_gallocr_reserve_n: failed to allocate CUDA0 buffer of size 882903040
Jul 06 19:23:24 ubuntux ollama[169742]: llama_new_context_with_model: failed to allocate compute buffers
Jul 06 19:23:25 ubuntux ollama[169742]: llama_init_from_gpt_params: error: failed to create context with model '/usr/share/ollama/.ollama/models/blobs/sha256-6bbfda8eb96dadd0300076196110f78ff709829c3be9778e86948b839cf05408'
Jul 06 19:23:26 ubuntux ollama[169742]: time=2024-07-06T19:23:26.314-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server not responding"
Jul 06 19:23:26 ubuntux ollama[584320]: ERROR [load_model] unable to load model | model="/usr/share/ollama/.ollama/models/blobs/sha256-6bbfda8eb96dadd0300076196110f78ff709829c3be9778e86948b839cf05408" tid="140507113369600" timestamp=1720308206
Jul 06 19:23:26 ubuntux ollama[169742]: terminate called without an active exception
Jul 06 19:23:26 ubuntux ollama[169742]: time=2024-07-06T19:23:26.566-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server error"
Jul 06 19:23:26 ubuntux ollama[169742]: time=2024-07-06T19:23:26.817-04:00 level=ERROR source=sched.go:388 msg="error loading llama server" error="llama runner process has terminated: signal: aborted (core dumped) error:failed to create context with model '/usr/share/ollama/.ollama/models/blobs/sha256-6bbfda8eb96dadd0300076196110f78ff709829c3be9778e86948b839cf05408'"
Jul 06 19:23:26 ubuntux ollama[169742]: [GIN] 2024/07/06 - 19:23:26 | 500 | 11.766669885s | 127.0.0.1 | POST "/api/chat"
Jul 06 19:23:31 ubuntux ollama[169742]: time=2024-07-06T19:23:31.944-04:00 level=WARN source=sched.go:575 msg="gpu VRAM usage didn't recover within timeout" seconds=5.127181114 model=/usr/share/ollama/.ollama/models/blobs/sha256-6bbfda8eb96dadd0300076196110f78ff709829c3be9778e86948b839cf05408
Jul 06 19:23:32 ubuntux ollama[169742]: time=2024-07-06T19:23:32.194-04:00 level=WARN source=sched.go:575 msg="gpu VRAM usage didn't recover within timeout" seconds=5.376988446 model=/usr/share/ollama/.ollama/models/blobs/sha256-6bbfda8eb96dadd0300076196110f78ff709829c3be9778e86948b839cf05408
Jul 06 19:23:32 ubuntux ollama[169742]: time=2024-07-06T19:23:32.444-04:00 level=WARN source=sched.go:575 msg="gpu VRAM usage didn't recover within timeout" seconds=5.626674402 model=/usr/share/ollama/.ollama/models/blobs/sha256-6bbfda8eb96dadd0300076196110f78ff709829c3be9778e86948b839cf05408
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.48
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5522/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5522/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5968
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5968/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5968/comments
|
https://api.github.com/repos/ollama/ollama/issues/5968/events
|
https://github.com/ollama/ollama/pull/5968
| 2,431,322,928
|
PR_kwDOJ0Z1Ps52iD1c
| 5,968
|
Update api.md
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-26T03:10:08
| 2024-07-26T03:10:20
| 2024-07-26T03:10:18
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5968",
"html_url": "https://github.com/ollama/ollama/pull/5968",
"diff_url": "https://github.com/ollama/ollama/pull/5968.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5968.patch",
"merged_at": "2024-07-26T03:10:18"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5968/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5968/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8490
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8490/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8490/comments
|
https://api.github.com/repos/ollama/ollama/issues/8490/events
|
https://github.com/ollama/ollama/pull/8490
| 2,797,782,568
|
PR_kwDOJ0Z1Ps6IRrFU
| 8,490
|
Draft MLX go backend for new engine
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2025-01-19T19:08:38
| 2025-01-29T19:05:33
| null |
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8490",
"html_url": "https://github.com/ollama/ollama/pull/8490",
"diff_url": "https://github.com/ollama/ollama/pull/8490.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8490.patch",
"merged_at": null
}
|
This extends the backend API slightly, adjusts the llama3 implementation and wires up the mlx-c calls necessary to get llama3 running.
A few key points:
- There are still some quirks relating to token embeddings and output tensors that should be resolved
- Q/K tensor adjustments are applied globally which is incorrect (should be model specific)
- The current cache model breaks down due to mismatched assumptions about stride results between ggml and mlx, so it is disabled; this caused some ripples in input processing
- GGML behavior isn't identical so there are a few hacks in the model to alter behavior which need to be sorted out so we can have model definitions that are backend agnostic.
- A temporary env var, `OLLAMA_BACKEND`, toggles the backend; set it to `ggml` or `mlx`
To see it working:
prompt:
```
<|begin_of_text|><|start_header_id|>user<|end_header_id|>
What is the capital of Canada? Explain why.<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
```
ollama pull llama3.1:8b-instruct-fp16
cmake -S . -B build
cmake --build build -j
OLLAMA_BACKEND=mlx go run model/cmd/main.go -n 100 --cache ~/.ollama/models/blobs/sha256-09cd6813dc2e73d9db9345123ee1b3385bb7cee88a46f13dc37bc3d5e96ba3a4 < prompt
```
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8490/reactions",
"total_count": 8,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 8,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8490/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5241
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5241/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5241/comments
|
https://api.github.com/repos/ollama/ollama/issues/5241/events
|
https://github.com/ollama/ollama/pull/5241
| 2,368,810,517
|
PR_kwDOJ0Z1Ps5zSrxW
| 5,241
|
Allow to change the registry defaults via environment variables
|
{
"login": "simonfrey",
"id": 24354822,
"node_id": "MDQ6VXNlcjI0MzU0ODIy",
"avatar_url": "https://avatars.githubusercontent.com/u/24354822?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/simonfrey",
"html_url": "https://github.com/simonfrey",
"followers_url": "https://api.github.com/users/simonfrey/followers",
"following_url": "https://api.github.com/users/simonfrey/following{/other_user}",
"gists_url": "https://api.github.com/users/simonfrey/gists{/gist_id}",
"starred_url": "https://api.github.com/users/simonfrey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/simonfrey/subscriptions",
"organizations_url": "https://api.github.com/users/simonfrey/orgs",
"repos_url": "https://api.github.com/users/simonfrey/repos",
"events_url": "https://api.github.com/users/simonfrey/events{/privacy}",
"received_events_url": "https://api.github.com/users/simonfrey/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 3
| 2024-06-23T18:01:16
| 2024-11-27T07:00:57
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5241",
"html_url": "https://github.com/ollama/ollama/pull/5241",
"diff_url": "https://github.com/ollama/ollama/pull/5241.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5241.patch",
"merged_at": null
}
|
This is especially helpful when running a local pull-through cache for the registry. It allows you to keep using all the ollama commands as expected after setting the env vars once.
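The fallback behaviour this PR describes — use an environment variable when set, otherwise the default registry — can be sketched as follows. The variable name `OLLAMA_REGISTRY` is hypothetical; the actual name introduced by the PR may differ:

```python
import os

# Hypothetical variable name -- the PR's actual env var may be named differently.
DEFAULT_REGISTRY = "registry.ollama.ai"

def registry_host() -> str:
    # Prefer the env override when set, otherwise fall back to the public default.
    return os.environ.get("OLLAMA_REGISTRY", DEFAULT_REGISTRY)

os.environ.pop("OLLAMA_REGISTRY", None)
print(registry_host())  # default when unset
os.environ["OLLAMA_REGISTRY"] = "cache.internal:5000"
print(registry_host())  # override when set
```

With this shape, a local pull-through cache only needs the variable exported once (e.g. in a shell profile) for every subsequent `ollama pull` to hit the cache.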
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5241/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5241/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5589
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5589/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5589/comments
|
https://api.github.com/repos/ollama/ollama/issues/5589/events
|
https://github.com/ollama/ollama/issues/5589
| 2,399,802,003
|
I_kwDOJ0Z1Ps6PChKT
| 5,589
|
OOM crash loading codegeex4 on 4G GPU
|
{
"login": "chandan0000",
"id": 60910265,
"node_id": "MDQ6VXNlcjYwOTEwMjY1",
"avatar_url": "https://avatars.githubusercontent.com/u/60910265?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chandan0000",
"html_url": "https://github.com/chandan0000",
"followers_url": "https://api.github.com/users/chandan0000/followers",
"following_url": "https://api.github.com/users/chandan0000/following{/other_user}",
"gists_url": "https://api.github.com/users/chandan0000/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chandan0000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chandan0000/subscriptions",
"organizations_url": "https://api.github.com/users/chandan0000/orgs",
"repos_url": "https://api.github.com/users/chandan0000/repos",
"events_url": "https://api.github.com/users/chandan0000/events{/privacy}",
"received_events_url": "https://api.github.com/users/chandan0000/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6849881759,
"node_id": "LA_kwDOJ0Z1Ps8AAAABmEjmnw",
"url": "https://api.github.com/repos/ollama/ollama/labels/memory",
"name": "memory",
"color": "5017EA",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-07-10T05:43:25
| 2024-07-23T23:48:45
| 2024-07-23T23:48:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

Other models such as llama3 and gemma2 work fine, but running the codegeex4 model throws this error.
How can I resolve this issue?
My system configuration:

### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
ollama version is 0.2.1
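A common mitigation for out-of-memory crashes on a 4 GB card is to offload only part of the model to the GPU via the `num_gpu` option. A minimal sketch of such a request payload — the layer count of 10 is illustrative and must be tuned to what actually fits in VRAM:

```python
import json

# Ask Ollama to offload only some layers to the GPU; the rest run on CPU.
# num_gpu=10 is illustrative -- pick the largest value that fits in 4 GB.
payload = {
    "model": "codegeex4",
    "prompt": "write a hello world in Python",
    "options": {"num_gpu": 10},
}
print(json.dumps(payload))
```

Sending this body to `POST /api/generate` trades speed for memory: fewer offloaded layers means less VRAM pressure but slower generation.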
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5589/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5589/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3781
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3781/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3781/comments
|
https://api.github.com/repos/ollama/ollama/issues/3781/events
|
https://github.com/ollama/ollama/issues/3781
| 2,254,651,648
|
I_kwDOJ0Z1Ps6GY0EA
| 3,781
|
How to run Ollama using AMD RX 6600 XT on Windows 11? - gfx1032, workaround works on linux only
|
{
"login": "NAME0x0",
"id": 119836361,
"node_id": "U_kgDOBySOyQ",
"avatar_url": "https://avatars.githubusercontent.com/u/119836361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NAME0x0",
"html_url": "https://github.com/NAME0x0",
"followers_url": "https://api.github.com/users/NAME0x0/followers",
"following_url": "https://api.github.com/users/NAME0x0/following{/other_user}",
"gists_url": "https://api.github.com/users/NAME0x0/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NAME0x0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NAME0x0/subscriptions",
"organizations_url": "https://api.github.com/users/NAME0x0/orgs",
"repos_url": "https://api.github.com/users/NAME0x0/repos",
"events_url": "https://api.github.com/users/NAME0x0/events{/privacy}",
"received_events_url": "https://api.github.com/users/NAME0x0/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-04-20T17:47:27
| 2024-07-22T19:36:52
| 2024-04-24T16:50:08
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The responses from all the LLMs are slow because the CPU is prioritised. I wish to make use of my RX 6600 XT GPU, but apparently the workaround is Linux-only. Furthermore, the ROCm runtime is available for the RX 6600 XT, but not the HIP SDK, which is apparently what is needed for my GPU to run LLMs. However, the Ollama documentation says that my GPU is supported. How do I make use of it, since it's not being utilised at all?
### OS
Windows
### GPU
AMD
### CPU
Intel
### Ollama version
0.1.32
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3781/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3781/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1920
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1920/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1920/comments
|
https://api.github.com/repos/ollama/ollama/issues/1920/events
|
https://github.com/ollama/ollama/issues/1920
| 2,076,039,185
|
I_kwDOJ0Z1Ps57vdgR
| 1,920
|
ollama + docker fails in GPU mode due to CUDA error
|
{
"login": "giansegato",
"id": 5476684,
"node_id": "MDQ6VXNlcjU0NzY2ODQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/5476684?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/giansegato",
"html_url": "https://github.com/giansegato",
"followers_url": "https://api.github.com/users/giansegato/followers",
"following_url": "https://api.github.com/users/giansegato/following{/other_user}",
"gists_url": "https://api.github.com/users/giansegato/gists{/gist_id}",
"starred_url": "https://api.github.com/users/giansegato/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/giansegato/subscriptions",
"organizations_url": "https://api.github.com/users/giansegato/orgs",
"repos_url": "https://api.github.com/users/giansegato/repos",
"events_url": "https://api.github.com/users/giansegato/events{/privacy}",
"received_events_url": "https://api.github.com/users/giansegato/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 9
| 2024-01-11T08:29:26
| 2024-04-15T09:21:59
| 2024-02-19T19:49:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
`nvidia-smi`:
```
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.129.03 Driver Version: 535.129.03 CUDA Version: 12.2 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA A100-SXM4-40GB On | 00000000:07:00.0 Off | 0 |
| N/A 41C P0 73W / 400W | 4MiB / 40960MiB | 0% Default |
| | | Disabled |
+-----------------------------------------+----------------------+----------------------+
```
but if I run the example in the docker docs:
```
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
docker exec -it ollama ollama run phi
```
it spins for a while and then hard crashes without ever returning.
If I do it in docker-compose, I get to see more logs:
```yml
version: '3.8'
services:
ollama:
image: ollama/ollama
volumes:
- ollama:/root/.ollama
runtime: nvidia
environment:
- NVIDIA_VISIBLE_DEVICES=all
- OPENAI_API_KEY=${OPENAI_API_KEY}
- gpus=all
ports:
- "11434:11434"
restart: unless-stopped
deploy:
resources:
reservations:
devices:
- driver: nvidia
count: all
capabilities: [gpu]
```
request:
```
curl http://127.0.0.1:11434/api/generate -d '{
"model": "phi",
"prompt":"Why is the sky blue?"
}'
```
What I get is this:
```
ollama_1 | 2024/01/11 08:24:48 images.go:808: total blobs: 6
ollama_1 | 2024/01/11 08:24:48 images.go:815: total unused blobs removed: 0
ollama_1 | 2024/01/11 08:24:48 routes.go:930: Listening on [::]:11434 (version 0.1.19)
ollama_1 | 2024/01/11 08:24:49 shim_ext_server.go:142: Dynamic LLM variants [cuda]
ollama_1 | 2024/01/11 08:24:49 gpu.go:35: Detecting GPU type
ollama_1 | 2024/01/11 08:24:49 gpu.go:54: Nvidia GPU detected
(...)
/usr/local/nvidia/bin:/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/tmp/ollama1061409751/cuda
ollama_1 | 2024/01/11 08:26:00 shim_ext_server.go:92: Loading Dynamic Shim llm server: /tmp/ollama1061409751/cuda/libext_server.so
ollama_1 | 2024/01/11 08:26:00 ext_server_common.go:136: Initializing internal llama server8.0
(...)
ollama_1 | llm_load_tensors: offloading 32 repeating layers to GPU
ollama_1 | llm_load_tensors: offloading non-repeating layers to GPU
ollama_1 | llm_load_tensors: offloaded 33/33 layers to GPU
ollama_1 | llm_load_tensors: VRAM used: 0.00 MiB
ollama_1 | ...........................................................................................
ollama_1 | llama_new_context_with_model: n_ctx = 2048
ollama_1 | llama_new_context_with_model: freq_base = 10000.0
ollama_1 | llama_new_context_with_model: freq_scale = 1
ollama_1 | CUDA error 3 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml-cuda.cu:495: initialization error
ollama_1 | current device: 1882806432
ollama_1 | GGML_ASSERT: /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml-cuda.cu:495: !"CUDA error"
ollama_1 | Lazy loading /tmp/ollama3369185958/cuda/libext_server.so library
ollama_1 | SIGABRT: abort
ollama_1 | PC=0x7f3bd30369fc m=8 sigcode=18446744073709551610
ollama_1 | signal arrived during cgo execution
ollama_1 |
ollama_1 | goroutine 710 [syscall]:
ollama_1 | runtime.cgocall(0x9c0510, 0xc0003223d0)
ollama_1 | /usr/local/go/src/runtime/cgocall.go:157 +0x4b fp=0xc0003223a8 sp=0xc000322370 pc=0x42666b
ollama_1 | github.com/jmorganca/ollama/llm._Cfunc_dynamic_shim_llama_server_init({0x7f3b70001fe0, 0x7f3adbd4bb30, 0x7f3adbd3ed70, 0x7f3adbd41150, 0x7f3adbd58910, 0x7f3adbd49020, 0x7f3adbd40ff0, 0x7f3adbd3ee10, 0x7f3adbd58a40, 0x7f3adbd58de0, ...}, ...)
ollama_1 | _cgo_gotypes.go:291 +0x45 fp=0xc0003223d0 sp=0xc0003223a8 pc=0x7ccc45
ollama_1 | github.com/jmorganca/ollama/llm.(*shimExtServer).llama_server_init.func1(0x456bdb?, 0x80?, 0x80?)
ollama_1 | /go/src/github.com/jmorganca/ollama/llm/shim_ext_server.go:40 +0xec fp=0xc0003224c0 sp=0xc0003223d0 pc=0x7d200c
(...)
ollama_1 | net.(*netFD).Read(0xc00048e080, {0xc0004aa461?, 0x0?, 0x0?})
ollama_1 | /usr/local/go/src/net/fd_posix.go:55 +0x25 fp=0xc000521700 sp=0xc0005216b8 pc=0x586885
ollama_1 | net.(*conn).Read(0xc00007e090, {0xc0004aa461?, 0x0?, 0x0?})
ollama_1 | /usr/local/go/src/net/net.go:179 +0x45 fp=0xc000521748 sp=0xc000521700 pc=0x594b25
ollama_1 | net.(*TCPConn).Read(0x0?, {0xc0004aa461?, 0x0?, 0x0?})
ollama_1 | <autogenerated>:1 +0x25 fp=0xc000521778 sp=0xc000521748 pc=0x5a6a25
ollama_1 | net/http.(*connReader).backgroundRead(0xc0004aa450)
ollama_1 | /usr/local/go/src/net/http/server.go:683 +0x37 fp=0xc0005217c8 sp=0xc000521778 pc=0x6e1617
ollama_1 | net/http.(*connReader).startBackgroundRead.func2()
ollama_1 | /usr/local/go/src/net/http/server.go:679 +0x25 fp=0xc0005217e0 sp=0xc0005217c8 pc=0x6e1545
ollama_1 | runtime.goexit()
ollama_1 | /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0005217e8 sp=0xc0005217e0 pc=0x48ae21
ollama_1 | created by net/http.(*connReader).startBackgroundRead in goroutine 82
ollama_1 | /usr/local/go/src/net/http/server.go:679 +0xba
ollama_1 |
ollama_1 | rax 0x0
ollama_1 | rbx 0x7fa883fff640
ollama_1 | rcx 0x7fa95ddf99fc
ollama_1 | rdx 0x6
ollama_1 | rdi 0x1
ollama_1 | rsi 0x27
ollama_1 | rbp 0x27
ollama_1 | rsp 0x7fa883ffcec0
ollama_1 | r8 0x7fa883ffcf90
ollama_1 | r9 0x7fa883ffcf20
ollama_1 | r10 0x8
ollama_1 | r11 0x246
ollama_1 | r12 0x6
ollama_1 | r13 0x16
ollama_1 | r14 0x7fa883ffd0ec
ollama_1 | r15 0x0
ollama_1 | rip 0x7fa95ddf99fc
ollama_1 | rflags 0x246
ollama_1 | cs 0x33
ollama_1 | fs 0x0
ollama_1 | gs 0x0
ollama_ollama_1 exited with code 2
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1920/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/1920/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7717
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7717/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7717/comments
|
https://api.github.com/repos/ollama/ollama/issues/7717/events
|
https://github.com/ollama/ollama/issues/7717
| 2,666,988,226
|
I_kwDOJ0Z1Ps6e9wLC
| 7,717
|
Performance regression for 0.4.* caused by number of input tokens
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5808482718,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng",
"url": "https://api.github.com/repos/ollama/ollama/labels/performance",
"name": "performance",
"color": "A5B5C6",
"default": false,
"description": ""
},
{
"id": 6930710224,
"node_id": "LA_kwDOJ0Z1Ps8AAAABnRo-0A",
"url": "https://api.github.com/repos/ollama/ollama/labels/top",
"name": "top",
"color": "5EEE9C",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 8
| 2024-11-18T03:23:14
| 2024-11-20T19:42:24
| 2024-11-20T12:00:13
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
As input tokens go up, the rate of token generation goes down. Once the rate has decreased, it becomes a ceiling for subsequent generations.
Affects both CPU and GPU generations, but more pronounced for GPU. Tested with llama3.2:3b-instruct-q4_K_M, llama3.1:8b-instruct-q4_0, qwen2.5:0.5b-instruct-q4_K_M, aya-expanse:8b-q4_K_M.
```console
$ model=llama3.2:3b-instruct-q4_K_M ; curl -s localhost:11434/api/generate -d '{"model":"'$model'","keep_alive":0}' >/dev/null ; for p in 1 2 ; do echo "# pass $p" ; for n in 32 64 128 256 512 1024 2048 4096 8192 16384 ; do curl -s localhost:11434/api/generate -d '{"model":"'$model'","prompt":'"$((echo write a story with these words ; cat /usr/share/dict/words) | dd bs=1 count=$n status=none | jq -sR .)"',"options":{"seed":42,"temperature":0,"num_gpu":-1,"num_ctx":16384,"num_predict":256},"stream":false}' | jq -rc '.|"\(.prompt_eval_count) \(.eval_count/(.eval_duration/1000000000))"' ; done ; done
# pass 1
33 128.12812812812814
52 127.68079800498754
91 124.21154779233382
168 118.84865366759517
307 108.47457627118645
543 92.51897361763643
1036 73.73271889400922
2012 70.64017660044149
3793 67.31527741256903
7540 59.60419091967404
# pass 2
33 79.08557306147668
52 78.93925377736664
91 78.2874617737003
168 78.09640024405125
307 74.4186046511628
543 74.63556851311952
1036 72.62411347517731
2012 68.5041477120685
3793 66.84073107049608
7540 59.327925840092696
```
0.4.2, llama3.2:3b-instruct-q4_K_M, GPU:

0.3.14, llama3.2:3b-instruct-q4_K_M, GPU:

### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14, 0.4.0, 0.4.1, 0.4.2
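The throughput figures above come from the `/api/generate` response fields; the jq expression in the reproduction, `.eval_count/(.eval_duration/1000000000)`, boils down to this (sample values below are made up for illustration):

```python
def tokens_per_second(resp: dict) -> float:
    # eval_duration is reported in nanoseconds and eval_count in tokens,
    # matching the jq expression .eval_count/(.eval_duration/1000000000).
    return resp["eval_count"] / (resp["eval_duration"] / 1_000_000_000)

sample = {"eval_count": 256, "eval_duration": 2_000_000_000}  # 2 s of generation
print(tokens_per_second(sample))  # 128.0
```

The regression shows up as a drop in this ratio on pass 2 even for short prompts, i.e. the slowdown from a long prompt persists into later requests.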
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7717/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7717/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1124
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1124/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1124/comments
|
https://api.github.com/repos/ollama/ollama/issues/1124/events
|
https://github.com/ollama/ollama/pull/1124
| 1,992,673,569
|
PR_kwDOJ0Z1Ps5faDlk
| 1,124
|
Add Cheshire Cat to community integrations
|
{
"login": "pieroit",
"id": 6328377,
"node_id": "MDQ6VXNlcjYzMjgzNzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/6328377?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pieroit",
"html_url": "https://github.com/pieroit",
"followers_url": "https://api.github.com/users/pieroit/followers",
"following_url": "https://api.github.com/users/pieroit/following{/other_user}",
"gists_url": "https://api.github.com/users/pieroit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pieroit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pieroit/subscriptions",
"organizations_url": "https://api.github.com/users/pieroit/orgs",
"repos_url": "https://api.github.com/users/pieroit/repos",
"events_url": "https://api.github.com/users/pieroit/events{/privacy}",
"received_events_url": "https://api.github.com/users/pieroit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-11-14T12:45:21
| 2023-12-06T13:21:45
| 2023-11-16T16:30:55
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1124",
"html_url": "https://github.com/ollama/ollama/pull/1124",
"diff_url": "https://github.com/ollama/ollama/pull/1124.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1124.patch",
"merged_at": "2023-11-16T16:30:55"
}
|
We have an Ollama adapter (subclassing langchain to make it both sync and async) and also created a [setup tutorial](https://cheshirecat.ai/local-models-with-ollama/)
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1124/reactions",
"total_count": 8,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 2,
"confused": 0,
"heart": 2,
"rocket": 2,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1124/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7108
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7108/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7108/comments
|
https://api.github.com/repos/ollama/ollama/issues/7108/events
|
https://github.com/ollama/ollama/issues/7108
| 2,568,744,046
|
I_kwDOJ0Z1Ps6ZG-xu
| 7,108
|
ngrok not working in latest release
|
{
"login": "p4thakur",
"id": 36537509,
"node_id": "MDQ6VXNlcjM2NTM3NTA5",
"avatar_url": "https://avatars.githubusercontent.com/u/36537509?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/p4thakur",
"html_url": "https://github.com/p4thakur",
"followers_url": "https://api.github.com/users/p4thakur/followers",
"following_url": "https://api.github.com/users/p4thakur/following{/other_user}",
"gists_url": "https://api.github.com/users/p4thakur/gists{/gist_id}",
"starred_url": "https://api.github.com/users/p4thakur/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/p4thakur/subscriptions",
"organizations_url": "https://api.github.com/users/p4thakur/orgs",
"repos_url": "https://api.github.com/users/p4thakur/repos",
"events_url": "https://api.github.com/users/p4thakur/events{/privacy}",
"received_events_url": "https://api.github.com/users/p4thakur/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-10-06T16:28:41
| 2024-11-15T17:36:21
| 2024-11-15T17:36:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

```shell
ngrok http 11434 --host-header="localhost:11434"
```
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
latest-[v0.3.12](https://github.com/ollama/ollama/releases/tag/v0.3.12)
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7108/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7108/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1335
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1335/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1335/comments
|
https://api.github.com/repos/ollama/ollama/issues/1335/events
|
https://github.com/ollama/ollama/pull/1335
| 2,019,545,448
|
PR_kwDOJ0Z1Ps5g0-uN
| 1,335
|
allow setting the system and template for prompts in the repl
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-30T21:34:44
| 2023-12-01T17:28:36
| 2023-12-01T17:28:36
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1335",
"html_url": "https://github.com/ollama/ollama/pull/1335",
"diff_url": "https://github.com/ollama/ollama/pull/1335.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1335.patch",
"merged_at": "2023-12-01T17:28:36"
}
|
This change allows setting the system prompt and the prompt template in the REPL. It works with both single-line and multiline input.
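A hypothetical session illustrating the feature described above (the `/set` command names and triple-quote multiline syntax are an assumption based on this description, not taken from the diff):

```
>>> /set system You are a concise assistant.
>>> /set template """
... {{ .System }}
... USER: {{ .Prompt }}
... ASSISTANT:
... """
```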
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1335/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1335/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3329
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3329/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3329/comments
|
https://api.github.com/repos/ollama/ollama/issues/3329/events
|
https://github.com/ollama/ollama/issues/3329
| 2,204,483,978
|
I_kwDOJ0Z1Ps6DZcGK
| 3,329
|
Import models installed in Linux to Windows
|
{
"login": "tealv",
"id": 54866775,
"node_id": "MDQ6VXNlcjU0ODY2Nzc1",
"avatar_url": "https://avatars.githubusercontent.com/u/54866775?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tealv",
"html_url": "https://github.com/tealv",
"followers_url": "https://api.github.com/users/tealv/followers",
"following_url": "https://api.github.com/users/tealv/following{/other_user}",
"gists_url": "https://api.github.com/users/tealv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tealv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tealv/subscriptions",
"organizations_url": "https://api.github.com/users/tealv/orgs",
"repos_url": "https://api.github.com/users/tealv/repos",
"events_url": "https://api.github.com/users/tealv/events{/privacy}",
"received_events_url": "https://api.github.com/users/tealv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2024-03-24T18:11:03
| 2024-03-25T16:27:59
| 2024-03-25T16:27:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
Copy models from a Linux ollama installation to a Windows installation.
### How should we solve this?
I suggest an import feature for Windows that copies the models from another location and makes changes in the process.
### What is the impact of not solving this?
Re-downloading the models in the Windows version.
### Anything else?
I copied models from a Linux install to a Windows install. To get them to work on Windows, I had to change the ':' in the blob file names to '-': Linux blob file names start with 'sha256:', while Windows blob file names start with 'sha256-'. The same substitution was also needed inside the manifest files for each model.
This only works if done while ollama is running. When ollama (Windows) starts, it erases the copied and changed files in the blob directory. Two relevant lines from server.log:

```
time=2024-03-24T14:39:01.248-07:00 level=INFO source=images.go:806 msg="total blobs: 16"
time=2024-03-24T14:39:02.380-07:00 level=INFO source=images.go:813 msg="total unused blobs removed: 16"
```

In the Windows version, the blob files are located in `(user)\.ollama\models\blobs\`. The manifest files (named `latest`) are located in `(user)\.ollama\models\manifests\registry.ollama.ai\library\(model)\`.
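The manual rename-and-substitute steps above can be sketched as a small script. This is an illustrative sketch only (the `convert_linux_blobs` helper is hypothetical, not part of Ollama); since ':' is not a valid character in Windows file names, the conversion would have to run on the Linux side before copying the tree over:

```python
import pathlib

def convert_linux_blobs(models_dir: str) -> None:
    """Rewrite a Linux ~/.ollama/models tree for Windows:
    rename 'sha256:<digest>' blob files to 'sha256-<digest>' and
    update the digest references inside every manifest file."""
    models = pathlib.Path(models_dir)
    # Rename blob files that use the Linux-style ':' separator.
    for blob in (models / "blobs").iterdir():
        if ":" in blob.name:
            blob.rename(blob.with_name(blob.name.replace(":", "-")))
    # Apply the same substitution inside every manifest file.
    for manifest in (models / "manifests").rglob("*"):
        if manifest.is_file():
            manifest.write_text(
                manifest.read_text().replace("sha256:", "sha256-")
            )
```

One would run this against the models directory on the Linux machine, then copy the tree into `(user)\.ollama\models` on Windows; per the report above, the copy may need to happen while Ollama is running there, since unreferenced blobs are pruned at startup.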
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3329/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3329/timeline
| null |
completed
| false
|