| url | repository_url | labels_url | comments_url | events_url | html_url | id | node_id | number | title | user | labels | state | locked | assignee | assignees | milestone | comments | created_at | updated_at | closed_at | author_association | sub_issues_summary | active_lock_reason | draft | pull_request | body | closed_by | reactions | timeline_url | performed_via_github_app | state_reason | is_pull_request |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/1437
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1437/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1437/comments
|
https://api.github.com/repos/ollama/ollama/issues/1437/events
|
https://github.com/ollama/ollama/issues/1437
| 2,032,991,819
|
I_kwDOJ0Z1Ps55LP5L
| 1,437
|
Update Script and Documentation for non-systemd Linux systems
|
{
"login": "NikeshKhatiwada",
"id": 55629421,
"node_id": "MDQ6VXNlcjU1NjI5NDIx",
"avatar_url": "https://avatars.githubusercontent.com/u/55629421?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NikeshKhatiwada",
"html_url": "https://github.com/NikeshKhatiwada",
"followers_url": "https://api.github.com/users/NikeshKhatiwada/followers",
"following_url": "https://api.github.com/users/NikeshKhatiwada/following{/other_user}",
"gists_url": "https://api.github.com/users/NikeshKhatiwada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NikeshKhatiwada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NikeshKhatiwada/subscriptions",
"organizations_url": "https://api.github.com/users/NikeshKhatiwada/orgs",
"repos_url": "https://api.github.com/users/NikeshKhatiwada/repos",
"events_url": "https://api.github.com/users/NikeshKhatiwada/events{/privacy}",
"received_events_url": "https://api.github.com/users/NikeshKhatiwada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2023-12-08T16:41:34
| 2024-03-12T16:36:53
| 2024-03-12T16:36:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I tried the default installation script on Alpine Linux (WSL), and though it apparently installed, I couldn't use the ollama command. Also, the manual install guide needs alternative steps for non-systemd systems.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1437/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1437/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/4466
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4466/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4466/comments
|
https://api.github.com/repos/ollama/ollama/issues/4466/events
|
https://github.com/ollama/ollama/issues/4466
| 2,299,176,383
|
I_kwDOJ0Z1Ps6JCqW_
| 4,466
|
Add new model error
|
{
"login": "momo8zero",
"id": 22167486,
"node_id": "MDQ6VXNlcjIyMTY3NDg2",
"avatar_url": "https://avatars.githubusercontent.com/u/22167486?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/momo8zero",
"html_url": "https://github.com/momo8zero",
"followers_url": "https://api.github.com/users/momo8zero/followers",
"following_url": "https://api.github.com/users/momo8zero/following{/other_user}",
"gists_url": "https://api.github.com/users/momo8zero/gists{/gist_id}",
"starred_url": "https://api.github.com/users/momo8zero/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/momo8zero/subscriptions",
"organizations_url": "https://api.github.com/users/momo8zero/orgs",
"repos_url": "https://api.github.com/users/momo8zero/repos",
"events_url": "https://api.github.com/users/momo8zero/events{/privacy}",
"received_events_url": "https://api.github.com/users/momo8zero/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-05-16T02:08:53
| 2024-07-25T22:56:32
| 2024-07-25T22:56:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi, I'm a developer.
I want to add a new model, but I get this error:
```
Error: Models based on 'LlamaForCausalLM' are not yet supported
```
Do you have any solutions? Thanks!
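This error means `ollama create` could not import that Hugging Face checkpoint architecture directly. A common workaround (an assumption here, not a fix stated in this thread) is to convert the weights to GGUF with llama.cpp's converter and import the resulting file through a Modelfile; all paths, the model name, and the converter script name below are illustrative.

```python
from pathlib import Path

# Assumed workaround: convert the Hugging Face checkpoint to GGUF first,
# using llama.cpp's converter (run separately from a llama.cpp checkout;
# the script's exact name varies across llama.cpp versions):
#
#   python convert_hf_to_gguf.py /path/to/hf-model --outfile model.gguf
#
# Then write a minimal Modelfile pointing Ollama at the converted weights.
modelfile = "FROM ./model.gguf\n"  # illustrative path to the converted file
Path("Modelfile").write_text(modelfile)

# Finally, register the model with Ollama (run separately):
#   ollama create mymodel -f Modelfile
print(modelfile.strip())
```

The `FROM` line can point at any local GGUF file; `ollama create` then packages it as a named local model.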
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4466/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4466/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3113
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3113/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3113/comments
|
https://api.github.com/repos/ollama/ollama/issues/3113/events
|
https://github.com/ollama/ollama/issues/3113
| 2,184,295,914
|
I_kwDOJ0Z1Ps6CMbXq
| 3,113
|
Integrated Intel GPU support
|
{
"login": "clvgt12",
"id": 15834506,
"node_id": "MDQ6VXNlcjE1ODM0NTA2",
"avatar_url": "https://avatars.githubusercontent.com/u/15834506?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/clvgt12",
"html_url": "https://github.com/clvgt12",
"followers_url": "https://api.github.com/users/clvgt12/followers",
"following_url": "https://api.github.com/users/clvgt12/following{/other_user}",
"gists_url": "https://api.github.com/users/clvgt12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/clvgt12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/clvgt12/subscriptions",
"organizations_url": "https://api.github.com/users/clvgt12/orgs",
"repos_url": "https://api.github.com/users/clvgt12/repos",
"events_url": "https://api.github.com/users/clvgt12/events{/privacy}",
"received_events_url": "https://api.github.com/users/clvgt12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6677491450,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgJu-g",
"url": "https://api.github.com/repos/ollama/ollama/labels/intel",
"name": "intel",
"color": "226E5B",
"default": false,
"description": "issues relating to Intel GPUs"
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 19
| 2024-03-13T15:27:19
| 2024-12-08T09:07:17
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
Please consider adapting Ollama to use Intel Integrated Graphics Processors (such as the Intel Iris Xe Graphics cores) in the future.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3113/reactions",
"total_count": 40,
"+1": 40,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3113/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2164
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2164/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2164/comments
|
https://api.github.com/repos/ollama/ollama/issues/2164/events
|
https://github.com/ollama/ollama/pull/2164
| 2,097,053,692
|
PR_kwDOJ0Z1Ps5k4303
| 2,164
|
Add LangChain4J
|
{
"login": "eddumelendez",
"id": 1810547,
"node_id": "MDQ6VXNlcjE4MTA1NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/1810547?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eddumelendez",
"html_url": "https://github.com/eddumelendez",
"followers_url": "https://api.github.com/users/eddumelendez/followers",
"following_url": "https://api.github.com/users/eddumelendez/following{/other_user}",
"gists_url": "https://api.github.com/users/eddumelendez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eddumelendez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eddumelendez/subscriptions",
"organizations_url": "https://api.github.com/users/eddumelendez/orgs",
"repos_url": "https://api.github.com/users/eddumelendez/repos",
"events_url": "https://api.github.com/users/eddumelendez/events{/privacy}",
"received_events_url": "https://api.github.com/users/eddumelendez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-23T21:55:39
| 2024-02-20T03:20:45
| 2024-02-20T02:17:32
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2164",
"html_url": "https://github.com/ollama/ollama/pull/2164",
"diff_url": "https://github.com/ollama/ollama/pull/2164.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2164.patch",
"merged_at": "2024-02-20T02:17:32"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2164/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2164/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7278
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7278/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7278/comments
|
https://api.github.com/repos/ollama/ollama/issues/7278/events
|
https://github.com/ollama/ollama/issues/7278
| 2,600,617,156
|
I_kwDOJ0Z1Ps6bAkTE
| 7,278
|
llama3.2:latest not running and giving Error: llama runner process no longer running: -1
|
{
"login": "ishu121992",
"id": 11437477,
"node_id": "MDQ6VXNlcjExNDM3NDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/11437477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ishu121992",
"html_url": "https://github.com/ishu121992",
"followers_url": "https://api.github.com/users/ishu121992/followers",
"following_url": "https://api.github.com/users/ishu121992/following{/other_user}",
"gists_url": "https://api.github.com/users/ishu121992/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ishu121992/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ishu121992/subscriptions",
"organizations_url": "https://api.github.com/users/ishu121992/orgs",
"repos_url": "https://api.github.com/users/ishu121992/repos",
"events_url": "https://api.github.com/users/ishu121992/events{/privacy}",
"received_events_url": "https://api.github.com/users/ishu121992/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-10-20T16:14:09
| 2024-10-21T19:34:56
| 2024-10-21T19:34:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have been using Ollama for a while and have never encountered this error while running any other LLMs (including llama3.1).
Below is a snapshot of the server log with the error:
[Screenshot 2024-10-20 095948: server log showing the error]
The key issue seems to be related to a wrong number of tensors. Any help? I have a 3070 Ti GPU with 8 GB VRAM.
### OS
WSL2
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.32
|
{
"login": "ishu121992",
"id": 11437477,
"node_id": "MDQ6VXNlcjExNDM3NDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/11437477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ishu121992",
"html_url": "https://github.com/ishu121992",
"followers_url": "https://api.github.com/users/ishu121992/followers",
"following_url": "https://api.github.com/users/ishu121992/following{/other_user}",
"gists_url": "https://api.github.com/users/ishu121992/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ishu121992/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ishu121992/subscriptions",
"organizations_url": "https://api.github.com/users/ishu121992/orgs",
"repos_url": "https://api.github.com/users/ishu121992/repos",
"events_url": "https://api.github.com/users/ishu121992/events{/privacy}",
"received_events_url": "https://api.github.com/users/ishu121992/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7278/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7278/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7762
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7762/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7762/comments
|
https://api.github.com/repos/ollama/ollama/issues/7762/events
|
https://github.com/ollama/ollama/issues/7762
| 2,676,164,130
|
I_kwDOJ0Z1Ps6fgwYi
| 7,762
|
What happened with the recent update?
|
{
"login": "JTMarsh556",
"id": 163940208,
"node_id": "U_kgDOCcWHcA",
"avatar_url": "https://avatars.githubusercontent.com/u/163940208?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JTMarsh556",
"html_url": "https://github.com/JTMarsh556",
"followers_url": "https://api.github.com/users/JTMarsh556/followers",
"following_url": "https://api.github.com/users/JTMarsh556/following{/other_user}",
"gists_url": "https://api.github.com/users/JTMarsh556/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JTMarsh556/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JTMarsh556/subscriptions",
"organizations_url": "https://api.github.com/users/JTMarsh556/orgs",
"repos_url": "https://api.github.com/users/JTMarsh556/repos",
"events_url": "https://api.github.com/users/JTMarsh556/events{/privacy}",
"received_events_url": "https://api.github.com/users/JTMarsh556/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 21
| 2024-11-20T14:57:51
| 2024-11-21T00:47:26
| 2024-11-20T20:49:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I just updated this morning, and applications that worked flawlessly no longer work. It is as if RAG was decimated: the LLMs are just providing generic answers, like they always do without RAG.
What happened, and how can we fix it?
Until then, how can I revert back?
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.4.2
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7762/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7762/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/581
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/581/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/581/comments
|
https://api.github.com/repos/ollama/ollama/issues/581/events
|
https://github.com/ollama/ollama/issues/581
| 1,910,030,624
|
I_kwDOJ0Z1Ps5x2MEg
| 581
|
How to use `num_predict`?
|
{
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2023-09-23T23:20:10
| 2023-09-27T01:47:32
| 2023-09-27T01:47:32
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
From https://github.com/jmorganca/ollama/issues/318#issuecomment-1710181439, I see `num_predict` exists, and am trying to figure out how to use it.
Where are the docs on parameters like this?
More specifically, I am trying to figure out how to specify `num_predict` (and similar parameters) to the Ollama server process and/or `/generate` API calls.
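As a sketch of the per-request route: `num_predict` (the maximum number of tokens to generate) can be sent inside the `"options"` object of a `POST /api/generate` request. The model name and prompt below are illustrative; the endpoint and option name follow Ollama's API documentation.

```python
import json

# Build a /api/generate request body that caps generation at 64 tokens.
# "llama2" and the prompt are placeholders for illustration.
payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "stream": False,
    "options": {"num_predict": 64},  # stop after at most 64 tokens
}

body = json.dumps(payload).encode()
print(body.decode())

# To actually send it (needs a running Ollama server on localhost:11434):
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

The same parameter can also be baked into a model at build time with a `PARAMETER num_predict 64` line in a Modelfile, rather than being passed per request.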
|
{
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/581/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/581/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1805
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1805/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1805/comments
|
https://api.github.com/repos/ollama/ollama/issues/1805/events
|
https://github.com/ollama/ollama/issues/1805
| 2,067,289,116
|
I_kwDOJ0Z1Ps57OFQc
| 1,805
|
which model to use for what's the root of 256256?
|
{
"login": "dcasota",
"id": 14890243,
"node_id": "MDQ6VXNlcjE0ODkwMjQz",
"avatar_url": "https://avatars.githubusercontent.com/u/14890243?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dcasota",
"html_url": "https://github.com/dcasota",
"followers_url": "https://api.github.com/users/dcasota/followers",
"following_url": "https://api.github.com/users/dcasota/following{/other_user}",
"gists_url": "https://api.github.com/users/dcasota/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dcasota/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dcasota/subscriptions",
"organizations_url": "https://api.github.com/users/dcasota/orgs",
"repos_url": "https://api.github.com/users/dcasota/repos",
"events_url": "https://api.github.com/users/dcasota/events{/privacy}",
"received_events_url": "https://api.github.com/users/dcasota/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2024-01-05T12:40:07
| 2024-01-12T07:19:35
| 2024-01-12T07:17:28
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "dcasota",
"id": 14890243,
"node_id": "MDQ6VXNlcjE0ODkwMjQz",
"avatar_url": "https://avatars.githubusercontent.com/u/14890243?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dcasota",
"html_url": "https://github.com/dcasota",
"followers_url": "https://api.github.com/users/dcasota/followers",
"following_url": "https://api.github.com/users/dcasota/following{/other_user}",
"gists_url": "https://api.github.com/users/dcasota/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dcasota/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dcasota/subscriptions",
"organizations_url": "https://api.github.com/users/dcasota/orgs",
"repos_url": "https://api.github.com/users/dcasota/repos",
"events_url": "https://api.github.com/users/dcasota/events{/privacy}",
"received_events_url": "https://api.github.com/users/dcasota/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1805/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1805/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4111
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4111/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4111/comments
|
https://api.github.com/repos/ollama/ollama/issues/4111/events
|
https://github.com/ollama/ollama/pull/4111
| 2,276,709,145
|
PR_kwDOJ0Z1Ps5ubC1w
| 4,111
|
Update README.md
|
{
"login": "bernardo-bruning",
"id": 4602873,
"node_id": "MDQ6VXNlcjQ2MDI4NzM=",
"avatar_url": "https://avatars.githubusercontent.com/u/4602873?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bernardo-bruning",
"html_url": "https://github.com/bernardo-bruning",
"followers_url": "https://api.github.com/users/bernardo-bruning/followers",
"following_url": "https://api.github.com/users/bernardo-bruning/following{/other_user}",
"gists_url": "https://api.github.com/users/bernardo-bruning/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bernardo-bruning/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bernardo-bruning/subscriptions",
"organizations_url": "https://api.github.com/users/bernardo-bruning/orgs",
"repos_url": "https://api.github.com/users/bernardo-bruning/repos",
"events_url": "https://api.github.com/users/bernardo-bruning/events{/privacy}",
"received_events_url": "https://api.github.com/users/bernardo-bruning/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-03T00:33:51
| 2024-05-05T21:45:32
| 2024-05-05T21:45:32
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4111",
"html_url": "https://github.com/ollama/ollama/pull/4111",
"diff_url": "https://github.com/ollama/ollama/pull/4111.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4111.patch",
"merged_at": "2024-05-05T21:45:32"
}
|
Includes a proxy plugin for ollama to work like github copilot.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4111/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4111/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6009
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6009/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6009/comments
|
https://api.github.com/repos/ollama/ollama/issues/6009/events
|
https://github.com/ollama/ollama/issues/6009
| 2,433,335,019
|
I_kwDOJ0Z1Ps6RCb7r
| 6,009
|
when trying to download multiple models at same time it cancels automatically
|
{
"login": "hemangjoshi37a",
"id": 12392345,
"node_id": "MDQ6VXNlcjEyMzkyMzQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/12392345?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hemangjoshi37a",
"html_url": "https://github.com/hemangjoshi37a",
"followers_url": "https://api.github.com/users/hemangjoshi37a/followers",
"following_url": "https://api.github.com/users/hemangjoshi37a/following{/other_user}",
"gists_url": "https://api.github.com/users/hemangjoshi37a/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hemangjoshi37a/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hemangjoshi37a/subscriptions",
"organizations_url": "https://api.github.com/users/hemangjoshi37a/orgs",
"repos_url": "https://api.github.com/users/hemangjoshi37a/repos",
"events_url": "https://api.github.com/users/hemangjoshi37a/events{/privacy}",
"received_events_url": "https://api.github.com/users/hemangjoshi37a/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-27T07:21:24
| 2024-07-27T07:58:34
| 2024-07-27T07:58:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When trying to download multiple models at the same time, the downloads are cancelled automatically.
### OS
Linux, Docker
### GPU
Nvidia
### CPU
AMD
### Ollama version
latest docker container
|
{
"login": "hemangjoshi37a",
"id": 12392345,
"node_id": "MDQ6VXNlcjEyMzkyMzQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/12392345?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hemangjoshi37a",
"html_url": "https://github.com/hemangjoshi37a",
"followers_url": "https://api.github.com/users/hemangjoshi37a/followers",
"following_url": "https://api.github.com/users/hemangjoshi37a/following{/other_user}",
"gists_url": "https://api.github.com/users/hemangjoshi37a/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hemangjoshi37a/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hemangjoshi37a/subscriptions",
"organizations_url": "https://api.github.com/users/hemangjoshi37a/orgs",
"repos_url": "https://api.github.com/users/hemangjoshi37a/repos",
"events_url": "https://api.github.com/users/hemangjoshi37a/events{/privacy}",
"received_events_url": "https://api.github.com/users/hemangjoshi37a/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6009/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6009/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1113
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1113/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1113/comments
|
https://api.github.com/repos/ollama/ollama/issues/1113/events
|
https://github.com/ollama/ollama/issues/1113
| 1,991,251,754
|
I_kwDOJ0Z1Ps52sBcq
| 1,113
|
I am trying to Create Model File But I am getting permission Denied Error.
|
{
"login": "Sridatta0808",
"id": 10744330,
"node_id": "MDQ6VXNlcjEwNzQ0MzMw",
"avatar_url": "https://avatars.githubusercontent.com/u/10744330?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sridatta0808",
"html_url": "https://github.com/Sridatta0808",
"followers_url": "https://api.github.com/users/Sridatta0808/followers",
"following_url": "https://api.github.com/users/Sridatta0808/following{/other_user}",
"gists_url": "https://api.github.com/users/Sridatta0808/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Sridatta0808/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Sridatta0808/subscriptions",
"organizations_url": "https://api.github.com/users/Sridatta0808/orgs",
"repos_url": "https://api.github.com/users/Sridatta0808/repos",
"events_url": "https://api.github.com/users/Sridatta0808/events{/privacy}",
"received_events_url": "https://api.github.com/users/Sridatta0808/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2023-11-13T18:48:01
| 2023-11-16T00:41:15
| 2023-11-16T00:41:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Project Structure :
bin/
src/
models/
requirements.txt
Readme.md
Steps followed:
$ nano Modelfile -> inserted -> FROM ./models/mistral-7b-instruct-v0.1.Q3_K_M.gguf
$ ollama create example -f Modelfile
-> Returns the following error:
couldn't open modelfile '/home/sridatta/projects/basic_llm/langchain/Modelfile' Error: failed to open file: open /home/sridatta/projects/basic_llm/langchain/Modelfile: permission denied
Tried the following approach:
-> chmod -R o+rx on the Modelfile path (didn't work)
Could anyone please let me know how to load a GGUF model into ollama chat?
Thanks in advance.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1113/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1113/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1287
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1287/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1287/comments
|
https://api.github.com/repos/ollama/ollama/issues/1287/events
|
https://github.com/ollama/ollama/pull/1287
| 2,012,839,670
|
PR_kwDOJ0Z1Ps5geAMX
| 1,287
|
ignore jetbrain ides
|
{
"login": "rootedbox",
"id": 3997890,
"node_id": "MDQ6VXNlcjM5OTc4OTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/3997890?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rootedbox",
"html_url": "https://github.com/rootedbox",
"followers_url": "https://api.github.com/users/rootedbox/followers",
"following_url": "https://api.github.com/users/rootedbox/following{/other_user}",
"gists_url": "https://api.github.com/users/rootedbox/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rootedbox/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rootedbox/subscriptions",
"organizations_url": "https://api.github.com/users/rootedbox/orgs",
"repos_url": "https://api.github.com/users/rootedbox/repos",
"events_url": "https://api.github.com/users/rootedbox/events{/privacy}",
"received_events_url": "https://api.github.com/users/rootedbox/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-27T18:18:07
| 2023-11-27T20:57:45
| 2023-11-27T20:57:45
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1287",
"html_url": "https://github.com/ollama/ollama/pull/1287",
"diff_url": "https://github.com/ollama/ollama/pull/1287.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1287.patch",
"merged_at": "2023-11-27T20:57:45"
}
|
ignore jetbrain ides
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1287/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1287/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8213
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8213/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8213/comments
|
https://api.github.com/repos/ollama/ollama/issues/8213/events
|
https://github.com/ollama/ollama/issues/8213
| 2,755,058,207
|
I_kwDOJ0Z1Ps6kNtof
| 8,213
|
do embedding request: Post \"http://127.0.0.1:57955/embedding\": read tcp 127.0.0.1:57957->127.0.0.1:57955: wsarecv: An existing connection was forcibly closed by the remote host.
|
{
"login": "conflictpeng",
"id": 75059708,
"node_id": "MDQ6VXNlcjc1MDU5NzA4",
"avatar_url": "https://avatars.githubusercontent.com/u/75059708?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/conflictpeng",
"html_url": "https://github.com/conflictpeng",
"followers_url": "https://api.github.com/users/conflictpeng/followers",
"following_url": "https://api.github.com/users/conflictpeng/following{/other_user}",
"gists_url": "https://api.github.com/users/conflictpeng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/conflictpeng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/conflictpeng/subscriptions",
"organizations_url": "https://api.github.com/users/conflictpeng/orgs",
"repos_url": "https://api.github.com/users/conflictpeng/repos",
"events_url": "https://api.github.com/users/conflictpeng/events{/privacy}",
"received_events_url": "https://api.github.com/users/conflictpeng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-12-23T02:14:09
| 2024-12-23T09:10:29
| 2024-12-23T09:10:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
This seems to fail when processing larger PDF files.
### OS
Windows
### GPU
Nvidia
### CPU
Intel, AMD
### Ollama version
0.5.1
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8213/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8213/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/809
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/809/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/809/comments
|
https://api.github.com/repos/ollama/ollama/issues/809/events
|
https://github.com/ollama/ollama/pull/809
| 1,946,147,820
|
PR_kwDOJ0Z1Ps5c8y8H
| 809
|
fix: regression unsupported metal types
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-16T21:40:27
| 2023-10-17T15:40:41
| 2023-10-17T15:40:40
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/809",
"html_url": "https://github.com/ollama/ollama/pull/809",
"diff_url": "https://github.com/ollama/ollama/pull/809.diff",
"patch_url": "https://github.com/ollama/ollama/pull/809.patch",
"merged_at": "2023-10-17T15:40:40"
}
|
Omitting `--n-gpu-layers` means Metal is used on macOS, which isn't correct since ollama uses `num_gpu=0` to explicitly disable the GPU for file types that are not implemented in Metal.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/809/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/809/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7/comments
|
https://api.github.com/repos/ollama/ollama/issues/7/events
|
https://github.com/ollama/ollama/pull/7
| 1,777,858,042
|
PR_kwDOJ0Z1Ps5UFYG5
| 7
|
add prompt templates as j2 templates
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-06-27T22:50:30
| 2023-06-28T16:27:28
| 2023-06-28T14:37:03
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7",
"html_url": "https://github.com/ollama/ollama/pull/7",
"diff_url": "https://github.com/ollama/ollama/pull/7.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7.patch",
"merged_at": "2023-06-28T14:37:03"
}
|
Easier to read and maintain since diffs are much more obvious. This also provides a future opportunity for users to define their own prompt templates.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/86
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/86/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/86/comments
|
https://api.github.com/repos/ollama/ollama/issues/86/events
|
https://github.com/ollama/ollama/pull/86
| 1,808,245,303
|
PR_kwDOJ0Z1Ps5Vsk0U
| 86
|
welcome screen improvements
|
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-17T17:30:06
| 2023-07-17T17:44:57
| 2023-07-17T17:44:53
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/86",
"html_url": "https://github.com/ollama/ollama/pull/86",
"diff_url": "https://github.com/ollama/ollama/pull/86.diff",
"patch_url": "https://github.com/ollama/ollama/pull/86.patch",
"merged_at": "2023-07-17T17:44:53"
}
|
- make window draggable
- improve copy command experience on the finish page
|
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/86/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/86/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6736
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6736/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6736/comments
|
https://api.github.com/repos/ollama/ollama/issues/6736/events
|
https://github.com/ollama/ollama/pull/6736
| 2,517,978,310
|
PR_kwDOJ0Z1Ps57ESYG
| 6,736
|
Verify permissions for AMD GPU
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-09-10T22:03:00
| 2024-10-23T16:49:46
| 2024-09-11T18:38:25
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6736",
"html_url": "https://github.com/ollama/ollama/pull/6736",
"diff_url": "https://github.com/ollama/ollama/pull/6736.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6736.patch",
"merged_at": "2024-09-11T18:38:25"
}
|
This adds back a check, lost several releases ago, that verifies /dev/kfd permissions; when they are lacking, this can lead to confusing failure modes such as:
"rocBLAS error: Could not initialize Tensile host: No devices found"
This implementation does not hard-fail the serve command but instead falls back to CPU with an error log. In the future we can include this in the GPU discovery UX to show detected but unsupported devices.
Fixes #6685
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6736/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6736/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6898
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6898/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6898/comments
|
https://api.github.com/repos/ollama/ollama/issues/6898/events
|
https://github.com/ollama/ollama/pull/6898
| 2,539,825,980
|
PR_kwDOJ0Z1Ps58ORcE
| 6,898
|
CI: win arm adjustments
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-20T23:53:51
| 2024-09-20T23:58:58
| 2024-09-20T23:58:56
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6898",
"html_url": "https://github.com/ollama/ollama/pull/6898",
"diff_url": "https://github.com/ollama/ollama/pull/6898.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6898.patch",
"merged_at": "2024-09-20T23:58:56"
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6898/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6898/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3812
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3812/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3812/comments
|
https://api.github.com/repos/ollama/ollama/issues/3812/events
|
https://github.com/ollama/ollama/issues/3812
| 2,255,670,278
|
I_kwDOJ0Z1Ps6GcswG
| 3,812
|
Open a web UI directly after Ollama starts
|
{
"login": "elarbor",
"id": 43592730,
"node_id": "MDQ6VXNlcjQzNTkyNzMw",
"avatar_url": "https://avatars.githubusercontent.com/u/43592730?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/elarbor",
"html_url": "https://github.com/elarbor",
"followers_url": "https://api.github.com/users/elarbor/followers",
"following_url": "https://api.github.com/users/elarbor/following{/other_user}",
"gists_url": "https://api.github.com/users/elarbor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/elarbor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/elarbor/subscriptions",
"organizations_url": "https://api.github.com/users/elarbor/orgs",
"repos_url": "https://api.github.com/users/elarbor/repos",
"events_url": "https://api.github.com/users/elarbor/events{/privacy}",
"received_events_url": "https://api.github.com/users/elarbor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-04-22T06:14:15
| 2024-05-01T23:59:00
| 2024-05-01T23:59:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The page only shows "Ollama is running"; it would be great if a web UI could open directly after startup.
<img width="470" alt="image" src="https://github.com/ollama/ollama/assets/43592730/96392aca-b563-4da8-b823-bf6b82641f51">
<img width="1920" alt="image" src="https://github.com/ollama/ollama/assets/43592730/5e117f86-e436-451f-a00f-a531373c25b3">
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3812/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3812/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3791
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3791/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3791/comments
|
https://api.github.com/repos/ollama/ollama/issues/3791/events
|
https://github.com/ollama/ollama/issues/3791
| 2,254,883,036
|
I_kwDOJ0Z1Ps6GZsjc
| 3,791
|
Rename files with prefix "sha256:" to "sha256_"
|
{
"login": "ker2xu",
"id": 31959917,
"node_id": "MDQ6VXNlcjMxOTU5OTE3",
"avatar_url": "https://avatars.githubusercontent.com/u/31959917?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ker2xu",
"html_url": "https://github.com/ker2xu",
"followers_url": "https://api.github.com/users/ker2xu/followers",
"following_url": "https://api.github.com/users/ker2xu/following{/other_user}",
"gists_url": "https://api.github.com/users/ker2xu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ker2xu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ker2xu/subscriptions",
"organizations_url": "https://api.github.com/users/ker2xu/orgs",
"repos_url": "https://api.github.com/users/ker2xu/repos",
"events_url": "https://api.github.com/users/ker2xu/events{/privacy}",
"received_events_url": "https://api.github.com/users/ker2xu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-04-21T03:36:51
| 2024-04-21T13:24:57
| 2024-04-21T03:38:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Some network file systems do not handle ":" well and interpret the string preceding ":" as a remote host, leading to permission errors (from resolving non-existent locations/hosts).
It is also better to avoid a special symbol like ":" in favor of "_", which is accepted by all operating systems and file systems.
FYI.
https://www.ibm.com/docs/en/zvm/7.2?topic=occ-understanding-network-file-system-nfs-path-name-syntax
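As a stop-gap on such filesystems, the blobs could be renamed along these lines (a hypothetical workaround script, not part of Ollama, and Ollama itself would still expect the original names; illustrative only):

```python
from pathlib import Path

def rename_colon_blobs(blob_dir):
    """Rename files whose names contain ':' (e.g. 'sha256:abc...') to use
    '_' instead, since some NFS clients parse 'host:path' out of ':'."""
    renamed = []
    for p in Path(blob_dir).iterdir():
        if ":" in p.name:
            target = p.with_name(p.name.replace(":", "_"))
            p.rename(target)
            renamed.append(target.name)
    return renamed
```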
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.17
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3791/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3791/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7393
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7393/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7393/comments
|
https://api.github.com/repos/ollama/ollama/issues/7393/events
|
https://github.com/ollama/ollama/issues/7393
| 2,617,327,451
|
I_kwDOJ0Z1Ps6cAT9b
| 7,393
|
EOF error on pull with different model
|
{
"login": "bdytx5",
"id": 32812705,
"node_id": "MDQ6VXNlcjMyODEyNzA1",
"avatar_url": "https://avatars.githubusercontent.com/u/32812705?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bdytx5",
"html_url": "https://github.com/bdytx5",
"followers_url": "https://api.github.com/users/bdytx5/followers",
"following_url": "https://api.github.com/users/bdytx5/following{/other_user}",
"gists_url": "https://api.github.com/users/bdytx5/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bdytx5/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bdytx5/subscriptions",
"organizations_url": "https://api.github.com/users/bdytx5/orgs",
"repos_url": "https://api.github.com/users/bdytx5/repos",
"events_url": "https://api.github.com/users/bdytx5/events{/privacy}",
"received_events_url": "https://api.github.com/users/bdytx5/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 13
| 2024-10-28T05:16:48
| 2024-11-05T22:21:46
| 2024-11-05T22:21:46
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
brett@brett:~$ ollama pull llama3.2
Error: registry.ollama.ai/library/phi3:latest: EOF
Really confused: this is not an out-of-memory error, and the error names phi3 even though I pulled llama3.2. I also tried resetting the systemd service as described in https://github.com/ollama/ollama/issues/1859
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7393/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7393/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6330
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6330/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6330/comments
|
https://api.github.com/repos/ollama/ollama/issues/6330/events
|
https://github.com/ollama/ollama/issues/6330
| 2,462,388,075
|
I_kwDOJ0Z1Ps6SxQ9r
| 6,330
|
Finetuned LLAMA 3.1 8B Instruct is giving random output
|
{
"login": "krisbianprabowo",
"id": 32126694,
"node_id": "MDQ6VXNlcjMyMTI2Njk0",
"avatar_url": "https://avatars.githubusercontent.com/u/32126694?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/krisbianprabowo",
"html_url": "https://github.com/krisbianprabowo",
"followers_url": "https://api.github.com/users/krisbianprabowo/followers",
"following_url": "https://api.github.com/users/krisbianprabowo/following{/other_user}",
"gists_url": "https://api.github.com/users/krisbianprabowo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/krisbianprabowo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/krisbianprabowo/subscriptions",
"organizations_url": "https://api.github.com/users/krisbianprabowo/orgs",
"repos_url": "https://api.github.com/users/krisbianprabowo/repos",
"events_url": "https://api.github.com/users/krisbianprabowo/events{/privacy}",
"received_events_url": "https://api.github.com/users/krisbianprabowo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-13T04:59:41
| 2024-08-13T07:21:39
| 2024-08-13T05:40:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hello, I tried running my fine-tuned model, which is based on the Llama 3.1 8B Instruct model.
It looks like it gives random output, as you can see below:
<img width="1371" alt="Screen Shot 2024-08-13 at 11 41 33" src="https://github.com/user-attachments/assets/b4757e3b-e18d-411f-ad20-de6890a242cc">
I double-checked whether my quantized GGUF model was actually the problem by trying it in another text-generation app, LM Studio. There it works fine and does not give random output.
<img width="837" alt="Screen Shot 2024-08-13 at 11 55 39" src="https://github.com/user-attachments/assets/fa153cfe-5fcd-498c-94a3-c350cd2fc597">
I also ran the Llama 3.1 8B Instruct model from Ollama itself, and it works fine too.
I have already updated Ollama to the latest version, and it still gives the same output. Perhaps I'm missing a step before running my model that would make it give appropriate output?
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.5
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6330/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 1,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6330/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3177
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3177/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3177/comments
|
https://api.github.com/repos/ollama/ollama/issues/3177/events
|
https://github.com/ollama/ollama/issues/3177
| 2,189,717,372
|
I_kwDOJ0Z1Ps6ChG98
| 3,177
|
GPU utilization & Context Length and Max Tokens & Command-line windows crash & Server connection failed
|
{
"login": "HWiwoiiii",
"id": 103039908,
"node_id": "U_kgDOBiRDpA",
"avatar_url": "https://avatars.githubusercontent.com/u/103039908?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HWiwoiiii",
"html_url": "https://github.com/HWiwoiiii",
"followers_url": "https://api.github.com/users/HWiwoiiii/followers",
"following_url": "https://api.github.com/users/HWiwoiiii/following{/other_user}",
"gists_url": "https://api.github.com/users/HWiwoiiii/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HWiwoiiii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HWiwoiiii/subscriptions",
"organizations_url": "https://api.github.com/users/HWiwoiiii/orgs",
"repos_url": "https://api.github.com/users/HWiwoiiii/repos",
"events_url": "https://api.github.com/users/HWiwoiiii/events{/privacy}",
"received_events_url": "https://api.github.com/users/HWiwoiiii/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-03-16T02:40:48
| 2024-04-28T19:07:15
| 2024-04-28T19:07:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
1. How do I set up Ollama so models use my GPU?
I'm using Windows with 32GB of DDR4-2667 memory (16GB + 16GB), an NVIDIA GeForce RTX 2080 Super with Max-Q Design (8GB / Dell), and Intel(R) UHD Graphics (1GB / Dell). However, Ollama doesn't seem to utilize my GPU even when both my CPU and memory are running at 99% utilization. Is this a bug, or are there settings I am accidentally overlooking?
2. Does adjusting Context Length and Max Tokens have any impact on running the model?
Are Context Length and Max Tokens predefined, or can we change them in Open WebUI? Can I extend my inputs and outputs by adjusting them? I know this is a silly question, so if anyone is tired of explaining it, leaving a link here would be greatly appreciated as well.
3. How do I keep Ollama running?
The cmd window that I use to run a model always crashes and disappears when I want to type something in it; is that normal? By the way, do I need to run `ollama run gemma:7b` every time I restart my PC?



### What did you expect to see?
1. How do I set up Ollama so models use my GPU?
GPU being used
2. Does adjusting Context Length and Max Tokens have any impact on running the model?
3. How do I keep Ollama running?
Ollama running correctly and accessible from Open WebUI

### Steps to reproduce
restart my pc
### Are there any recent changes that introduced the issue?
not working
### OS
Windows
### Architecture
Other
### Platform
_No response_
### Ollama version
latest
### GPU
Intel
### GPU info
NVIDIA GeForce RTX 2080 Super with Max-Q Design (8GB / Dell). Intel(R) UHD Graphics (1GB / Dell).
### CPU
Intel
### Other software
open webui
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3177/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3177/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7291
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7291/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7291/comments
|
https://api.github.com/repos/ollama/ollama/issues/7291/events
|
https://github.com/ollama/ollama/issues/7291
| 2,602,033,810
|
I_kwDOJ0Z1Ps6bF-KS
| 7,291
|
ollama._types.ResponseError
|
{
"login": "1214summer",
"id": 116168346,
"node_id": "U_kgDOBuyWmg",
"avatar_url": "https://avatars.githubusercontent.com/u/116168346?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/1214summer",
"html_url": "https://github.com/1214summer",
"followers_url": "https://api.github.com/users/1214summer/followers",
"following_url": "https://api.github.com/users/1214summer/following{/other_user}",
"gists_url": "https://api.github.com/users/1214summer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/1214summer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/1214summer/subscriptions",
"organizations_url": "https://api.github.com/users/1214summer/orgs",
"repos_url": "https://api.github.com/users/1214summer/repos",
"events_url": "https://api.github.com/users/1214summer/events{/privacy}",
"received_events_url": "https://api.github.com/users/1214summer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 7706485628,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1ejfA",
"url": "https://api.github.com/repos/ollama/ollama/labels/python",
"name": "python",
"color": "59642B",
"default": false,
"description": "relating to the ollama-python client library"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-10-21T10:06:16
| 2024-12-02T07:59:58
| 2024-12-02T07:59:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
$ python test.py
import ollama
res=ollama.chat(model="qwen2.5:0.5b",stream=False,messages=[{"role": "user","content": "who are you"}],options={"temperature":0})
print(res)
Traceback (most recent call last):
File "xxxxx", line 2, in <module>
res=ollama.chat(model="qwen2.5:0.5b",stream=False,messages=[{"role": "user","content": "who are you"}],options={"temperature":0})
File "xxxxx/lib/python3.10/site-packages/ollama/_client.py", line 236, in chat
return self._request_stream(
File "xxxxx/lib/python3.10/site-packages/ollama/_client.py", line 99, in _request_stream
return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
File "xxxxx/lib/python3.10/site-packages/ollama/_client.py", line 75, in _request
raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError
Why can't my Python code access the locally running Ollama?
curl http://localhost:11434/api/chat -d '{
"model": "qwen2.5:0.5b",
"messages": [
{ "role": "user", "content": "why is the sky blue?" }
]
}'
but calling the API this way works.
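One way to see what the server is actually rejecting is to bypass the client library and print the HTTP error body yourself. This sketch uses only the standard library, mirroring the curl call above (the helper names are illustrative, not part of ollama-python):

```python
import json
import urllib.request
import urllib.error

def build_chat_request(host, model, messages):
    """Build the same POST to /api/chat that the curl example sends."""
    payload = {"model": model, "messages": messages, "stream": False}
    return urllib.request.Request(
        f"{host}/api/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

def chat(host, model, messages):
    req = build_chat_request(host, model, messages)
    try:
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)
    except urllib.error.HTTPError as e:
        # Surface the server's explanation instead of a bare ResponseError.
        raise RuntimeError(f"HTTP {e.code}: {e.read().decode()}") from None
```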
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.3.13
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7291/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7291/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6095
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6095/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6095/comments
|
https://api.github.com/repos/ollama/ollama/issues/6095/events
|
https://github.com/ollama/ollama/issues/6095
| 2,439,738,630
|
I_kwDOJ0Z1Ps6Ra3UG
| 6,095
|
Keeps switching between cached and wired memory
|
{
"login": "chigkim",
"id": 22120994,
"node_id": "MDQ6VXNlcjIyMTIwOTk0",
"avatar_url": "https://avatars.githubusercontent.com/u/22120994?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chigkim",
"html_url": "https://github.com/chigkim",
"followers_url": "https://api.github.com/users/chigkim/followers",
"following_url": "https://api.github.com/users/chigkim/following{/other_user}",
"gists_url": "https://api.github.com/users/chigkim/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chigkim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chigkim/subscriptions",
"organizations_url": "https://api.github.com/users/chigkim/orgs",
"repos_url": "https://api.github.com/users/chigkim/repos",
"events_url": "https://api.github.com/users/chigkim/events{/privacy}",
"received_events_url": "https://api.github.com/users/chigkim/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 5
| 2024-07-31T10:47:36
| 2024-08-11T23:57:26
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I offloaded 47 out of 127 layers of Llama 3.1 405b q2 on an M3 Max with 64GB of RAM.
When I run the inference, the memory usage shows only about 8GB, while the cached memory is 56GB. This state persists most of the time, likely indicating that the CPU is in use and data is streaming directly from the disk.
Occasionally, the cached memory decreases, and the wired memory and memory usage increase, suggesting that the GPU is being utilized. Then, the memory usage drops back down to 8GB with the cached memory size at 56GB, repeating the cycle.
Shouldn't the 47 layers be kept in wired memory at all times instead of cached to avoid the constant switching between cached and wired memory? It takes a while to transfer between cached and wired.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
v0.3.0
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6095/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6095/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5032
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5032/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5032/comments
|
https://api.github.com/repos/ollama/ollama/issues/5032/events
|
https://github.com/ollama/ollama/pull/5032
| 2,351,980,783
|
PR_kwDOJ0Z1Ps5yaJW-
| 5,032
|
Actually skip PhysX on windows
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-13T20:17:42
| 2024-06-19T00:59:10
| 2024-06-13T20:26:09
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5032",
"html_url": "https://github.com/ollama/ollama/pull/5032",
"diff_url": "https://github.com/ollama/ollama/pull/5032.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5032.patch",
"merged_at": "2024-06-13T20:26:09"
}
|
Fixes #4984
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5032/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5032/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7388
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7388/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7388/comments
|
https://api.github.com/repos/ollama/ollama/issues/7388/events
|
https://github.com/ollama/ollama/issues/7388
| 2,616,961,185
|
I_kwDOJ0Z1Ps6b-6ih
| 7,388
|
Llama3.2-vision - fails to process png files
|
{
"login": "pitimespi",
"id": 534183,
"node_id": "MDQ6VXNlcjUzNDE4Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/534183?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pitimespi",
"html_url": "https://github.com/pitimespi",
"followers_url": "https://api.github.com/users/pitimespi/followers",
"following_url": "https://api.github.com/users/pitimespi/following{/other_user}",
"gists_url": "https://api.github.com/users/pitimespi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pitimespi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pitimespi/subscriptions",
"organizations_url": "https://api.github.com/users/pitimespi/orgs",
"repos_url": "https://api.github.com/users/pitimespi/repos",
"events_url": "https://api.github.com/users/pitimespi/events{/privacy}",
"received_events_url": "https://api.github.com/users/pitimespi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 21
| 2024-10-27T23:45:21
| 2024-10-29T04:26:24
| 2024-10-29T04:26:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
Couldn't process image: "invalid image type: application/octet-stream"
Error: invalid image type: application/octet-stream
```
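The `images` field of the Ollama API expects the raw file contents as plain base64 (no `data:` URI prefix); a hedged sketch of building such a payload follows (the stand-in bytes and model tag are illustrative, not a real image):

```python
import base64

def encode_image_bytes(data: bytes) -> str:
    """Base64-encode image bytes for the API's `images` field (no data: prefix)."""
    return base64.b64encode(data).decode("ascii")

# Stand-in bytes only; in practice read your real file with open(path, "rb").
fake_png = b"\x89PNG\r\n\x1a\n" + b"\x00" * 16

payload = {
    "model": "llama3.2-vision",
    "prompt": "What is in this picture?",
    "images": [encode_image_bytes(fake_png)],
}
```

In reports of this class of error, sending a `data:` URI or truncated base64 instead of the bare encoded bytes is a common culprit, since the server sniffs the image type from the decoded bytes.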
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
3.2-vision 0.4.0-rc5
|
{
"login": "pitimespi",
"id": 534183,
"node_id": "MDQ6VXNlcjUzNDE4Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/534183?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pitimespi",
"html_url": "https://github.com/pitimespi",
"followers_url": "https://api.github.com/users/pitimespi/followers",
"following_url": "https://api.github.com/users/pitimespi/following{/other_user}",
"gists_url": "https://api.github.com/users/pitimespi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pitimespi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pitimespi/subscriptions",
"organizations_url": "https://api.github.com/users/pitimespi/orgs",
"repos_url": "https://api.github.com/users/pitimespi/repos",
"events_url": "https://api.github.com/users/pitimespi/events{/privacy}",
"received_events_url": "https://api.github.com/users/pitimespi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7388/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7388/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3339
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3339/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3339/comments
|
https://api.github.com/repos/ollama/ollama/issues/3339/events
|
https://github.com/ollama/ollama/issues/3339
| 2,205,400,860
|
I_kwDOJ0Z1Ps6Dc78c
| 3,339
|
ollama tries to contact registry even when adding a local model
|
{
"login": "noahhaon",
"id": 170715,
"node_id": "MDQ6VXNlcjE3MDcxNQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/170715?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/noahhaon",
"html_url": "https://github.com/noahhaon",
"followers_url": "https://api.github.com/users/noahhaon/followers",
"following_url": "https://api.github.com/users/noahhaon/following{/other_user}",
"gists_url": "https://api.github.com/users/noahhaon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/noahhaon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/noahhaon/subscriptions",
"organizations_url": "https://api.github.com/users/noahhaon/orgs",
"repos_url": "https://api.github.com/users/noahhaon/repos",
"events_url": "https://api.github.com/users/noahhaon/events{/privacy}",
"received_events_url": "https://api.github.com/users/noahhaon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-03-25T10:28:11
| 2024-03-25T11:47:11
| 2024-03-25T11:46:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have a Modelfile which has a full path to the model I wish to load; however, I get an error when running `ollama create` regarding an invalid certificate from registry.ollama.ai (see #3336)
For stability and privacy reasons, it would be best if ollama does not try to connect to external resources when creating models from the local filesystem.
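A minimal offline reproduction sketch (the model path here is hypothetical):

```
# Modelfile
FROM /models/my-model.gguf
```

Running `ollama create my-local-model -f Modelfile` against such a file should, in principle, require no network access at all.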
### What did you expect to see?
`ollama create` succeeds
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
_No response_
### Architecture
_No response_
### Platform
_No response_
### Ollama version
0.1.29
### GPU
_No response_
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
|
{
"login": "noahhaon",
"id": 170715,
"node_id": "MDQ6VXNlcjE3MDcxNQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/170715?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/noahhaon",
"html_url": "https://github.com/noahhaon",
"followers_url": "https://api.github.com/users/noahhaon/followers",
"following_url": "https://api.github.com/users/noahhaon/following{/other_user}",
"gists_url": "https://api.github.com/users/noahhaon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/noahhaon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/noahhaon/subscriptions",
"organizations_url": "https://api.github.com/users/noahhaon/orgs",
"repos_url": "https://api.github.com/users/noahhaon/repos",
"events_url": "https://api.github.com/users/noahhaon/events{/privacy}",
"received_events_url": "https://api.github.com/users/noahhaon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3339/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3339/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3860
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3860/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3860/comments
|
https://api.github.com/repos/ollama/ollama/issues/3860/events
|
https://github.com/ollama/ollama/issues/3860
| 2,260,184,173
|
I_kwDOJ0Z1Ps6Gt6xt
| 3,860
|
Serial generation performance regression from v0.1.32 on main
|
{
"login": "brycereitano",
"id": 1928691,
"node_id": "MDQ6VXNlcjE5Mjg2OTE=",
"avatar_url": "https://avatars.githubusercontent.com/u/1928691?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/brycereitano",
"html_url": "https://github.com/brycereitano",
"followers_url": "https://api.github.com/users/brycereitano/followers",
"following_url": "https://api.github.com/users/brycereitano/following{/other_user}",
"gists_url": "https://api.github.com/users/brycereitano/gists{/gist_id}",
"starred_url": "https://api.github.com/users/brycereitano/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/brycereitano/subscriptions",
"organizations_url": "https://api.github.com/users/brycereitano/orgs",
"repos_url": "https://api.github.com/users/brycereitano/repos",
"events_url": "https://api.github.com/users/brycereitano/events{/privacy}",
"received_events_url": "https://api.github.com/users/brycereitano/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 9
| 2024-04-24T02:49:59
| 2024-04-28T18:27:35
| 2024-04-25T16:24:09
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
In an effort to test the latest code, which includes the recently merged concurrency branch (#3418), I noticed a performance regression when prompting a model already loaded in VRAM. This appears on the latest main (2ac3dd6853a45237ac049d0a4982becf91ca8c45) branch, and I haven't been able to identify the commit that caused the regression yet, as the Docker builds take a long time.
I have confirmed that a Docker build of v0.1.32 works as intended and subsequent calls to the same model are snappy, whereas on the 2ac3dd68 commit, subsequent calls can take up to 20 seconds instead of the normal 1 second for a complete reply.
The severity of this regression seems to depend on model size: the larger the model, the longer the delay between prompts.
I have attached the debug logs for the few prompts I ran. [2ac3dd68.log](https://github.com/ollama/ollama/files/15087058/2ac3dd68.log) Although these logs are from an interactive session in open-webui, I can reproduce by calling the following serially multiple times.
```
curl https://ollama.zete.dev/api/generate -d '{
"model": "llama3:8b-instruct-q8_0",
"prompt": "Why is the sky blue?"
}'
```
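One way to quantify the delay between serial calls is to time each request; below is a sketch using only the Python standard library (the endpoint and model are taken from the reproduction above; the helper itself is illustrative):

```python
import json
import time
import urllib.request

def timed_generate(url: str, model: str, prompt: str) -> float:
    """POST one /api/generate request and return the elapsed wall time in seconds."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")
    req = urllib.request.Request(url, data=body,
                                 headers={"Content-Type": "application/json"})
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        resp.read()
    return time.monotonic() - start

# Uncomment against a live server; on v0.1.32 the serial runs should all be
# fast, while on 2ac3dd68 the reported delays of up to ~20s should show up.
# for i in range(5):
#     t = timed_generate("http://localhost:11434/api/generate",
#                        "llama3:8b-instruct-q8_0", "Why is the sky blue?")
#     print(f"run {i}: {t:.1f}s")
```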
### OS
Docker
### GPU
Nvidia
### CPU
AMD
### Ollama version
2ac3dd6
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3860/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3860/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2161
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2161/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2161/comments
|
https://api.github.com/repos/ollama/ollama/issues/2161/events
|
https://github.com/ollama/ollama/issues/2161
| 2,096,835,108
|
I_kwDOJ0Z1Ps58-yok
| 2,161
|
Provide Docker images with pre-downloaded models
|
{
"login": "eddumelendez",
"id": 1810547,
"node_id": "MDQ6VXNlcjE4MTA1NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/1810547?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eddumelendez",
"html_url": "https://github.com/eddumelendez",
"followers_url": "https://api.github.com/users/eddumelendez/followers",
"following_url": "https://api.github.com/users/eddumelendez/following{/other_user}",
"gists_url": "https://api.github.com/users/eddumelendez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eddumelendez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eddumelendez/subscriptions",
"organizations_url": "https://api.github.com/users/eddumelendez/orgs",
"repos_url": "https://api.github.com/users/eddumelendez/repos",
"events_url": "https://api.github.com/users/eddumelendez/events{/privacy}",
"received_events_url": "https://api.github.com/users/eddumelendez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-01-23T19:28:00
| 2024-10-31T12:53:26
| 2024-03-11T19:12:10
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Libraries and frameworks such as [LangChain4J](https://github.com/langchain4j/langchain4j) have been built around Ollama, and pulling models is part of the process of using them.
Currently, in order to test the library integration there is a setup done using [Testcontainers](https://testcontainers.com/) to start Ollama's Docker image and pull the image to provide the infrastructure needed for the Integration Test. See [this code snippet](https://github.com/langchain4j/langchain4j/blob/50f32ba1985826ce7dc81300400c6c5fd2c22576/langchain4j-ollama/src/test/java/dev/langchain4j/model/ollama/AbstractOllamaInfrastructure.java#L65-L75).
I think https://github.com/jmorganca/ollama/issues/2160 could help in this scenario as well, but for those who want to run specific models supported by Ollama, pre-built images provide a nice getting-started experience IMO. There is an effort on the project to maintain such images at https://hub.docker.com/u/langchain4j, but it would be nice to provide official images with the model included.
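One common workaround (a sketch, not an officially supported pattern; the model name is illustrative) is to bake a model into a derived image by pulling it at build time:

```dockerfile
FROM ollama/ollama
# Start the server in the background just long enough to pull the model
# into an image layer; the shell exiting ends the background process.
RUN ollama serve & sleep 5 && ollama pull orca-mini
```

The resulting image carries the model blobs under /root/.ollama, at the cost of a much larger image.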
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2161/reactions",
"total_count": 10,
"+1": 8,
"-1": 2,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2161/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/6638
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6638/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6638/comments
|
https://api.github.com/repos/ollama/ollama/issues/6638/events
|
https://github.com/ollama/ollama/issues/6638
| 2,505,948,271
|
I_kwDOJ0Z1Ps6VXbxv
| 6,638
|
Llama 3.1 8b not generating answers since past few days
|
{
"login": "ToshiKBhat",
"id": 97841687,
"node_id": "U_kgDOBdTyFw",
"avatar_url": "https://avatars.githubusercontent.com/u/97841687?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ToshiKBhat",
"html_url": "https://github.com/ToshiKBhat",
"followers_url": "https://api.github.com/users/ToshiKBhat/followers",
"following_url": "https://api.github.com/users/ToshiKBhat/following{/other_user}",
"gists_url": "https://api.github.com/users/ToshiKBhat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ToshiKBhat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ToshiKBhat/subscriptions",
"organizations_url": "https://api.github.com/users/ToshiKBhat/orgs",
"repos_url": "https://api.github.com/users/ToshiKBhat/repos",
"events_url": "https://api.github.com/users/ToshiKBhat/events{/privacy}",
"received_events_url": "https://api.github.com/users/ToshiKBhat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 9
| 2024-09-04T17:41:11
| 2024-11-17T09:33:35
| 2024-11-17T09:33:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The Llama 3.1 8b model was generating answers in my RAG app until a few days back. Now it says "I cannot help with that" even when I use a simple system prompt: "You are a helpful assistant; use the context provided to you to answer the user's questions."
The 70b model seems to work fine. I also noticed the 8b model was updated recently.
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6638/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6638/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1226
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1226/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1226/comments
|
https://api.github.com/repos/ollama/ollama/issues/1226/events
|
https://github.com/ollama/ollama/issues/1226
| 2,004,922,004
|
I_kwDOJ0Z1Ps53gK6U
| 1,226
|
Support for Intel neural-chat
|
{
"login": "erima2020",
"id": 63055709,
"node_id": "MDQ6VXNlcjYzMDU1NzA5",
"avatar_url": "https://avatars.githubusercontent.com/u/63055709?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/erima2020",
"html_url": "https://github.com/erima2020",
"followers_url": "https://api.github.com/users/erima2020/followers",
"following_url": "https://api.github.com/users/erima2020/following{/other_user}",
"gists_url": "https://api.github.com/users/erima2020/gists{/gist_id}",
"starred_url": "https://api.github.com/users/erima2020/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/erima2020/subscriptions",
"organizations_url": "https://api.github.com/users/erima2020/orgs",
"repos_url": "https://api.github.com/users/erima2020/repos",
"events_url": "https://api.github.com/users/erima2020/events{/privacy}",
"received_events_url": "https://api.github.com/users/erima2020/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-21T18:39:10
| 2023-11-21T20:06:12
| 2023-11-21T20:06:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
Thank you for the great software! I have found zephyr particularly good. I was wondering whether support for Intel neural-chat (e.g., 3.1) is planned.
Best wishes,
Eric
|
{
"login": "erima2020",
"id": 63055709,
"node_id": "MDQ6VXNlcjYzMDU1NzA5",
"avatar_url": "https://avatars.githubusercontent.com/u/63055709?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/erima2020",
"html_url": "https://github.com/erima2020",
"followers_url": "https://api.github.com/users/erima2020/followers",
"following_url": "https://api.github.com/users/erima2020/following{/other_user}",
"gists_url": "https://api.github.com/users/erima2020/gists{/gist_id}",
"starred_url": "https://api.github.com/users/erima2020/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/erima2020/subscriptions",
"organizations_url": "https://api.github.com/users/erima2020/orgs",
"repos_url": "https://api.github.com/users/erima2020/repos",
"events_url": "https://api.github.com/users/erima2020/events{/privacy}",
"received_events_url": "https://api.github.com/users/erima2020/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1226/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1226/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/215
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/215/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/215/comments
|
https://api.github.com/repos/ollama/ollama/issues/215/events
|
https://github.com/ollama/ollama/issues/215
| 1,821,331,068
|
I_kwDOJ0Z1Ps5sj058
| 215
|
Function calling
|
{
"login": "nathanleclaire",
"id": 1476820,
"node_id": "MDQ6VXNlcjE0NzY4MjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1476820?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nathanleclaire",
"html_url": "https://github.com/nathanleclaire",
"followers_url": "https://api.github.com/users/nathanleclaire/followers",
"following_url": "https://api.github.com/users/nathanleclaire/following{/other_user}",
"gists_url": "https://api.github.com/users/nathanleclaire/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nathanleclaire/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nathanleclaire/subscriptions",
"organizations_url": "https://api.github.com/users/nathanleclaire/orgs",
"repos_url": "https://api.github.com/users/nathanleclaire/repos",
"events_url": "https://api.github.com/users/nathanleclaire/events{/privacy}",
"received_events_url": "https://api.github.com/users/nathanleclaire/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 5
| 2023-07-25T23:41:25
| 2023-12-09T02:02:59
| 2023-12-04T18:53:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Trying to get structured/consistent responses out of LLMs can be pretty brutal.
OpenAI recently rolled out [Function Calling](https://openai.com/blog/function-calling-and-other-api-updates) to get the models to stick to pre-defined schemas.
It would be excellent if you could specify something like this (ins/outs) in the Modelfile:
```
FROM llama
INPUT sentence string
ENUM Sentiment ["good", "bad", "neutral"]
OUTPUT classification Sentiment
PROMPT """
You are skilled at detecting tone in user comments.
Classify the following comment:
${sentence}
"""
```
then something like:
```
$ ollama run sentiment "ClosedAI has no moat"
bad
```
(or better yet, with API :) )
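As a rough sketch of how the requested flow could look over the existing HTTP API (the `sentiment` model name mirrors the hypothetical Modelfile above; the payload shape assumes Ollama's `/api/generate` endpoint with its `format: "json"` option), one might build the request and validate the label client-side:

```python
import json

def build_sentiment_request(sentence: str) -> dict:
    """Build a /api/generate payload asking the model for one label."""
    return {
        "model": "sentiment",   # hypothetical model from the Modelfile above
        "prompt": sentence,
        "stream": False,
        "format": "json",       # ask the server to constrain output to valid JSON
    }

def parse_classification(raw: str, allowed=("good", "bad", "neutral")) -> str:
    """Extract the classification field, falling back to 'neutral' if invalid."""
    try:
        label = json.loads(raw).get("classification", "").lower()
    except json.JSONDecodeError:
        return "neutral"
    return label if label in allowed else "neutral"
```

The client-side validation step is what the proposed `ENUM`/`OUTPUT` directives would make unnecessary.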
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/215/reactions",
"total_count": 4,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/215/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/785
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/785/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/785/comments
|
https://api.github.com/repos/ollama/ollama/issues/785/events
|
https://github.com/ollama/ollama/pull/785
| 1,942,706,455
|
PR_kwDOJ0Z1Ps5cxvmg
| 785
|
check update response
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-13T22:04:48
| 2023-10-13T22:05:47
| 2023-10-13T22:05:46
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/785",
"html_url": "https://github.com/ollama/ollama/pull/785",
"diff_url": "https://github.com/ollama/ollama/pull/785.diff",
"patch_url": "https://github.com/ollama/ollama/pull/785.patch",
"merged_at": "2023-10-13T22:05:46"
}
| null |
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/785/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/785/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6207
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6207/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6207/comments
|
https://api.github.com/repos/ollama/ollama/issues/6207/events
|
https://github.com/ollama/ollama/pull/6207
| 2,451,401,133
|
PR_kwDOJ0Z1Ps53mXV-
| 6,207
|
Ensure sparse files on windows during download
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-08-06T17:47:50
| 2024-08-06T18:06:09
| 2024-08-06T18:06:06
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6207",
"html_url": "https://github.com/ollama/ollama/pull/6207",
"diff_url": "https://github.com/ollama/ollama/pull/6207.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6207.patch",
"merged_at": "2024-08-06T18:06:06"
}
|
The file.Truncate call on windows will write the whole file unless you set the sparse flag, leading to heavy I/O at the beginning of the download. This should improve our I/O behavior on windows and put less stress on the user's disk.
Fixes #5852
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6207/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6207/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7305
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7305/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7305/comments
|
https://api.github.com/repos/ollama/ollama/issues/7305/events
|
https://github.com/ollama/ollama/pull/7305
| 2,603,736,770
|
PR_kwDOJ0Z1Ps5_YRFV
| 7,305
|
Fix rocm windows build and clean up dependency gathering
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-21T21:48:58
| 2024-10-22T19:54:18
| 2024-10-22T19:54:16
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7305",
"html_url": "https://github.com/ollama/ollama/pull/7305",
"diff_url": "https://github.com/ollama/ollama/pull/7305.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7305.patch",
"merged_at": "2024-10-22T19:54:15"
}
|
On Windows, ensure the Windows version define is properly set for ROCm. Remove duplicate ROCm arch flags.
Resolve wildcards in the targets so parallel builds don't race. Use readlink to resolve ROCm dependencies, since wildcards omit libelf. Keep Windows ROCm deps aligned with the unified packaging model.
Fixes #7279
Fixes #7320
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7305/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7305/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8471
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8471/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8471/comments
|
https://api.github.com/repos/ollama/ollama/issues/8471/events
|
https://github.com/ollama/ollama/issues/8471
| 2,795,871,839
|
I_kwDOJ0Z1Ps6mpZ5f
| 8,471
|
command-7b:7b-12-2024-fp16 chat completion results in 500 error
|
{
"login": "MarkWard0110",
"id": 90335263,
"node_id": "MDQ6VXNlcjkwMzM1MjYz",
"avatar_url": "https://avatars.githubusercontent.com/u/90335263?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MarkWard0110",
"html_url": "https://github.com/MarkWard0110",
"followers_url": "https://api.github.com/users/MarkWard0110/followers",
"following_url": "https://api.github.com/users/MarkWard0110/following{/other_user}",
"gists_url": "https://api.github.com/users/MarkWard0110/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MarkWard0110/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MarkWard0110/subscriptions",
"organizations_url": "https://api.github.com/users/MarkWard0110/orgs",
"repos_url": "https://api.github.com/users/MarkWard0110/repos",
"events_url": "https://api.github.com/users/MarkWard0110/events{/privacy}",
"received_events_url": "https://api.github.com/users/MarkWard0110/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2025-01-17T16:36:14
| 2025-01-28T21:18:27
| 2025-01-28T21:18:27
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
GPU Nvidia RTX 4070 TI Super 16 GB
System RAM: 96 GB
When I issue a chat completion request for the model `command-7b:7b-12-2024-fp16` I get a 500 response error from Ollama under the following conditions.
500 response error when using
Context: 2048
Max Predict: 2048
The following is the Ollama log:
```
Jan 17 16:29:56 quorra ollama[4173706]: [GIN] 2025/01/17 - 16:29:56 | 500 | 1.998027857s | 10.0.0.123 | POST "/api/chat"
Jan 17 16:29:56 quorra ollama[4173706]: time=2025-01-17T16:29:56.213Z level=DEBUG source=gpu.go:406 msg="updating system memory data" before.total="94.0 GiB" before.free="90.9 GiB" before.free_swap="7.8 GiB" now.total="94.0 GiB" now.free="90.9 GiB" now.free_swap="7.8 GiB"
Jan 17 16:29:56 quorra ollama[4173706]: initializing /usr/lib/x86_64-linux-gnu/libcuda.so.565.57.01
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuInit - 0x7f7c4d321ec0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDriverGetVersion - 0x7f7c4d321ee0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetCount - 0x7f7c4d321f20
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGet - 0x7f7c4d321f00
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetAttribute - 0x7f7c4d322000
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetUuid - 0x7f7c4d321f60
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetName - 0x7f7c4d321f40
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuCtxCreate_v3 - 0x7f7c4d3221e0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuMemGetInfo_v2 - 0x7f7c4d322960
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuCtxDestroy - 0x7f7c4d36e5a0
Jan 17 16:29:56 quorra ollama[4173706]: calling cuInit
Jan 17 16:29:56 quorra ollama[4173706]: calling cuDriverGetVersion
Jan 17 16:29:56 quorra ollama[4173706]: raw version 0x2f26
Jan 17 16:29:56 quorra ollama[4173706]: CUDA driver version: 12.7
Jan 17 16:29:56 quorra ollama[4173706]: calling cuDeviceGetCount
Jan 17 16:29:56 quorra ollama[4173706]: device count 1
Jan 17 16:29:56 quorra ollama[4173706]: time=2025-01-17T16:29:56.404Z level=DEBUG source=gpu.go:456 msg="updating cuda memory data" gpu=GPU-007c9d9a-8177-bd6f-7654-45652102b937 name="NVIDIA GeForce RTX 4070 Ti SUPER" overhead="0 B" before.total="15.6 GiB" before.free="15.4 GiB" now.total="15.6 GiB" now.free="15.4 GiB" now.used="217.2 MiB"
Jan 17 16:29:56 quorra ollama[4173706]: releasing cuda driver library
Jan 17 16:29:56 quorra ollama[4173706]: time=2025-01-17T16:29:56.404Z level=DEBUG source=server.go:1079 msg="stopping llama server"
Jan 17 16:29:56 quorra ollama[4173706]: time=2025-01-17T16:29:56.404Z level=DEBUG source=sched.go:380 msg="runner released" modelPath=/usr/share/ollama/.ollama/models/blobs/sha256-d565c4f8340747fb2ae26613a785bd1168d1311ad4f76ce4845cad170c7f3f98
Jan 17 16:29:56 quorra ollama[4173706]: time=2025-01-17T16:29:56.655Z level=DEBUG source=gpu.go:406 msg="updating system memory data" before.total="94.0 GiB" before.free="90.9 GiB" before.free_swap="7.8 GiB" now.total="94.0 GiB" now.free="90.8 GiB" now.free_swap="7.8 GiB"
Jan 17 16:29:56 quorra ollama[4173706]: initializing /usr/lib/x86_64-linux-gnu/libcuda.so.565.57.01
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuInit - 0x7f7c4d321ec0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDriverGetVersion - 0x7f7c4d321ee0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetCount - 0x7f7c4d321f20
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGet - 0x7f7c4d321f00
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetAttribute - 0x7f7c4d322000
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetUuid - 0x7f7c4d321f60
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetName - 0x7f7c4d321f40
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuCtxCreate_v3 - 0x7f7c4d3221e0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuMemGetInfo_v2 - 0x7f7c4d322960
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuCtxDestroy - 0x7f7c4d36e5a0
Jan 17 16:29:56 quorra ollama[4173706]: calling cuInit
Jan 17 16:29:56 quorra ollama[4173706]: calling cuDriverGetVersion
Jan 17 16:29:56 quorra ollama[4173706]: raw version 0x2f26
Jan 17 16:29:56 quorra ollama[4173706]: CUDA driver version: 12.7
Jan 17 16:29:56 quorra ollama[4173706]: calling cuDeviceGetCount
Jan 17 16:29:56 quorra ollama[4173706]: device count 1
Jan 17 16:29:56 quorra ollama[4173706]: time=2025-01-17T16:29:56.747Z level=DEBUG source=gpu.go:456 msg="updating cuda memory data" gpu=GPU-007c9d9a-8177-bd6f-7654-45652102b937 name="NVIDIA GeForce RTX 4070 Ti SUPER" overhead="0 B" before.total="15.6 GiB" before.free="15.4 GiB" now.total="15.6 GiB" now.free="15.4 GiB" now.used="217.2 MiB"
Jan 17 16:29:56 quorra ollama[4173706]: releasing cuda driver library
Jan 17 16:29:56 quorra ollama[4173706]: time=2025-01-17T16:29:56.904Z level=DEBUG source=gpu.go:406 msg="updating system memory data" before.total="94.0 GiB" before.free="90.8 GiB" before.free_swap="7.8 GiB" now.total="94.0 GiB" now.free="90.8 GiB" now.free_swap="7.8 GiB"
Jan 17 16:29:56 quorra ollama[4173706]: initializing /usr/lib/x86_64-linux-gnu/libcuda.so.565.57.01
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuInit - 0x7f7c4d321ec0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDriverGetVersion - 0x7f7c4d321ee0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetCount - 0x7f7c4d321f20
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGet - 0x7f7c4d321f00
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetAttribute - 0x7f7c4d322000
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetUuid - 0x7f7c4d321f60
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuDeviceGetName - 0x7f7c4d321f40
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuCtxCreate_v3 - 0x7f7c4d3221e0
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuMemGetInfo_v2 - 0x7f7c4d322960
Jan 17 16:29:56 quorra ollama[4173706]: dlsym: cuCtxDestroy - 0x7f7c4d36e5a0
Jan 17 16:29:56 quorra ollama[4173706]: calling cuInit
Jan 17 16:29:56 quorra ollama[4173706]: calling cuDriverGetVersion
Jan 17 16:29:56 quorra ollama[4173706]: raw version 0x2f26
Jan 17 16:29:56 quorra ollama[4173706]: CUDA driver version: 12.7
Jan 17 16:29:56 quorra ollama[4173706]: calling cuDeviceGetCount
Jan 17 16:29:56 quorra ollama[4173706]: device count 1
Jan 17 16:29:56 quorra ollama[4173706]: time=2025-01-17T16:29:56.999Z level=DEBUG source=gpu.go:456 msg="updating cuda memory data" gpu=GPU-007c9d9a-8177-bd6f-7654-45652102b937 name="NVIDIA GeForce RTX 4070 Ti SUPER" overhead="0 B" before.total="15.6 GiB" before.free="15.4 GiB" now.total="15.6 GiB" now.free="15.4 GiB" now.used="217.2 MiB"
Jan 17 16:29:56 quorra ollama[4173706]: releasing cuda driver library
```
When I use a context size of 64 and max predict of 64 it works. I get a response from the model.
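For reference, the two configurations above map to the `num_ctx` and `num_predict` options of an `/api/chat` request; a minimal payload sketch reproducing both cases (model name taken from this report, message content is a placeholder) might look like:

```python
def chat_payload(num_ctx: int, num_predict: int) -> dict:
    """Build an Ollama /api/chat payload with explicit context/predict limits."""
    return {
        "model": "command-7b:7b-12-2024-fp16",
        "messages": [{"role": "user", "content": "Hello"}],
        "stream": False,
        "options": {"num_ctx": num_ctx, "num_predict": num_predict},
    }

failing = chat_payload(2048, 2048)  # returned HTTP 500 in this report
working = chat_payload(64, 64)      # returned a normal response
```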
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.7
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8471/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8471/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8038
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8038/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8038/comments
|
https://api.github.com/repos/ollama/ollama/issues/8038/events
|
https://github.com/ollama/ollama/issues/8038
| 2,731,808,477
|
I_kwDOJ0Z1Ps6i1Bbd
| 8,038
|
undefined reference to `ggml_backend_cuda_reg'
|
{
"login": "regularRandom",
"id": 14252934,
"node_id": "MDQ6VXNlcjE0MjUyOTM0",
"avatar_url": "https://avatars.githubusercontent.com/u/14252934?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/regularRandom",
"html_url": "https://github.com/regularRandom",
"followers_url": "https://api.github.com/users/regularRandom/followers",
"following_url": "https://api.github.com/users/regularRandom/following{/other_user}",
"gists_url": "https://api.github.com/users/regularRandom/gists{/gist_id}",
"starred_url": "https://api.github.com/users/regularRandom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/regularRandom/subscriptions",
"organizations_url": "https://api.github.com/users/regularRandom/orgs",
"repos_url": "https://api.github.com/users/regularRandom/repos",
"events_url": "https://api.github.com/users/regularRandom/events{/privacy}",
"received_events_url": "https://api.github.com/users/regularRandom/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 7700262114,
"node_id": "LA_kwDOJ0Z1Ps8AAAAByvis4g",
"url": "https://api.github.com/repos/ollama/ollama/labels/build",
"name": "build",
"color": "006b75",
"default": false,
"description": "Issues relating to building ollama from source"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-12-11T04:45:17
| 2024-12-14T04:09:22
| 2024-12-14T04:09:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
0.5.2-rc0 build fails with the following error message:
> /usr/bin/ld: /tmp/go-link-3808825391/000013.o: in function `ggml_backend_registry::ggml_backend_registry()':
> /_/github.com/ollama/ollama/llama/ggml-backend-reg.cpp:164: undefined reference to `ggml_backend_cuda_reg'
> collect2: error: ld returned 1 exit status
>
> make[1]: *** [make/gpu.make:63: llama/build/linux-amd64/runners/cuda_v12/ollama_llama_server] Error 1
> make: *** [Makefile:50: cuda_v12] Error 2
Same for 0.5.1.
nvcc --version
> nvcc: NVIDIA (R) Cuda compiler driver
> Copyright (c) 2005-2024 NVIDIA Corporation
> Built on Tue_Oct_29_23:50:19_PDT_2024
> Cuda compilation tools, release 12.6, V12.6.85
> Build cuda_12.6.r12.6/compiler.35059454_0
>
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.2-rc0-0-g527cc97
|
{
"login": "regularRandom",
"id": 14252934,
"node_id": "MDQ6VXNlcjE0MjUyOTM0",
"avatar_url": "https://avatars.githubusercontent.com/u/14252934?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/regularRandom",
"html_url": "https://github.com/regularRandom",
"followers_url": "https://api.github.com/users/regularRandom/followers",
"following_url": "https://api.github.com/users/regularRandom/following{/other_user}",
"gists_url": "https://api.github.com/users/regularRandom/gists{/gist_id}",
"starred_url": "https://api.github.com/users/regularRandom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/regularRandom/subscriptions",
"organizations_url": "https://api.github.com/users/regularRandom/orgs",
"repos_url": "https://api.github.com/users/regularRandom/repos",
"events_url": "https://api.github.com/users/regularRandom/events{/privacy}",
"received_events_url": "https://api.github.com/users/regularRandom/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8038/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8038/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3233
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3233/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3233/comments
|
https://api.github.com/repos/ollama/ollama/issues/3233/events
|
https://github.com/ollama/ollama/issues/3233
| 2,193,952,020
|
I_kwDOJ0Z1Ps6CxQ0U
| 3,233
|
Add FinTral model
|
{
"login": "tqangxl",
"id": 9669944,
"node_id": "MDQ6VXNlcjk2Njk5NDQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/9669944?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tqangxl",
"html_url": "https://github.com/tqangxl",
"followers_url": "https://api.github.com/users/tqangxl/followers",
"following_url": "https://api.github.com/users/tqangxl/following{/other_user}",
"gists_url": "https://api.github.com/users/tqangxl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tqangxl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tqangxl/subscriptions",
"organizations_url": "https://api.github.com/users/tqangxl/orgs",
"repos_url": "https://api.github.com/users/tqangxl/repos",
"events_url": "https://api.github.com/users/tqangxl/events{/privacy}",
"received_events_url": "https://api.github.com/users/tqangxl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-03-19T03:50:56
| 2024-07-12T23:13:33
| 2024-07-12T23:13:33
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What model would you like?
pls add FinTral
|
{
"login": "tqangxl",
"id": 9669944,
"node_id": "MDQ6VXNlcjk2Njk5NDQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/9669944?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tqangxl",
"html_url": "https://github.com/tqangxl",
"followers_url": "https://api.github.com/users/tqangxl/followers",
"following_url": "https://api.github.com/users/tqangxl/following{/other_user}",
"gists_url": "https://api.github.com/users/tqangxl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tqangxl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tqangxl/subscriptions",
"organizations_url": "https://api.github.com/users/tqangxl/orgs",
"repos_url": "https://api.github.com/users/tqangxl/repos",
"events_url": "https://api.github.com/users/tqangxl/events{/privacy}",
"received_events_url": "https://api.github.com/users/tqangxl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3233/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3233/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/573
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/573/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/573/comments
|
https://api.github.com/repos/ollama/ollama/issues/573/events
|
https://github.com/ollama/ollama/issues/573
| 1,909,134,882
|
I_kwDOJ0Z1Ps5xyxYi
| 573
|
"Invalid file magic" with falcon models
|
{
"login": "vadim0x60",
"id": 3543310,
"node_id": "MDQ6VXNlcjM1NDMzMTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/3543310?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vadim0x60",
"html_url": "https://github.com/vadim0x60",
"followers_url": "https://api.github.com/users/vadim0x60/followers",
"following_url": "https://api.github.com/users/vadim0x60/following{/other_user}",
"gists_url": "https://api.github.com/users/vadim0x60/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vadim0x60/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vadim0x60/subscriptions",
"organizations_url": "https://api.github.com/users/vadim0x60/orgs",
"repos_url": "https://api.github.com/users/vadim0x60/repos",
"events_url": "https://api.github.com/users/vadim0x60/events{/privacy}",
"received_events_url": "https://api.github.com/users/vadim0x60/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-09-22T15:39:55
| 2023-09-25T13:40:35
| 2023-09-25T13:40:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
This happens every time I try to interact with a falcon model:
```
❯ ollama run falcon:40b
>>> hi
Error: invalid file magic
```
Hardware is Apple silicon with 96GB of RAM
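
For anyone diagnosing this, a minimal sketch of the kind of magic-byte check that produces this error. This assumes the model blob is expected to be in GGUF format, whose files begin with the ASCII magic `GGUF`; `check_file_magic` is a hypothetical helper, not part of Ollama:

```python
GGUF_MAGIC = b"GGUF"  # GGUF model files begin with these four ASCII bytes

def check_file_magic(path: str) -> bool:
    """Return True if the file at `path` starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        return f.read(4) == GGUF_MAGIC
```

If the blob was produced in an older ggml container format, the first four bytes will not match and a loader performing this check would reject the file.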
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/573/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/573/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4394
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4394/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4394/comments
|
https://api.github.com/repos/ollama/ollama/issues/4394/events
|
https://github.com/ollama/ollama/issues/4394
| 2,292,273,782
|
I_kwDOJ0Z1Ps6IoVJ2
| 4,394
|
Modelfile containing "home" in its name breaks model execution
|
{
"login": "leon-rgb",
"id": 56979997,
"node_id": "MDQ6VXNlcjU2OTc5OTk3",
"avatar_url": "https://avatars.githubusercontent.com/u/56979997?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leon-rgb",
"html_url": "https://github.com/leon-rgb",
"followers_url": "https://api.github.com/users/leon-rgb/followers",
"following_url": "https://api.github.com/users/leon-rgb/following{/other_user}",
"gists_url": "https://api.github.com/users/leon-rgb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leon-rgb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leon-rgb/subscriptions",
"organizations_url": "https://api.github.com/users/leon-rgb/orgs",
"repos_url": "https://api.github.com/users/leon-rgb/repos",
"events_url": "https://api.github.com/users/leon-rgb/events{/privacy}",
"received_events_url": "https://api.github.com/users/leon-rgb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-05-13T09:21:01
| 2024-05-14T02:04:17
| 2024-05-14T02:04:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
### What have I done?
Created a model through `ollama create sh-llama -f ./home_modelfile`.
Got the usual output.
Running the model also works: `ollama run sh-llama`.
But when given an input, no output is generated.
### Solution
Renaming the `home_modelfile` to anything that doesn't contain 'home' in its name.
### Note
This problem only occurs on my Ubuntu 20.04 system; there are no problems on a Windows system.
### OS
Linux
### GPU
Other
### CPU
Intel
### Ollama version
0.1.33
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4394/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4394/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4877
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4877/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4877/comments
|
https://api.github.com/repos/ollama/ollama/issues/4877/events
|
https://github.com/ollama/ollama/pull/4877
| 2,338,945,967
|
PR_kwDOJ0Z1Ps5xuAbh
| 4,877
|
Update README.md to add Shinkai Desktop
|
{
"login": "nicarq",
"id": 1622112,
"node_id": "MDQ6VXNlcjE2MjIxMTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/1622112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nicarq",
"html_url": "https://github.com/nicarq",
"followers_url": "https://api.github.com/users/nicarq/followers",
"following_url": "https://api.github.com/users/nicarq/following{/other_user}",
"gists_url": "https://api.github.com/users/nicarq/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nicarq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nicarq/subscriptions",
"organizations_url": "https://api.github.com/users/nicarq/orgs",
"repos_url": "https://api.github.com/users/nicarq/repos",
"events_url": "https://api.github.com/users/nicarq/events{/privacy}",
"received_events_url": "https://api.github.com/users/nicarq/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-06-06T18:50:13
| 2024-11-21T08:16:19
| 2024-11-21T08:16:18
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4877",
"html_url": "https://github.com/ollama/ollama/pull/4877",
"diff_url": "https://github.com/ollama/ollama/pull/4877.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4877.patch",
"merged_at": "2024-11-21T08:16:18"
}
|
Adding Shinkai Desktop to the list of apps using Ollama. It is open source, free, and a two-click install; no Docker required.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4877/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4877/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3575
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3575/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3575/comments
|
https://api.github.com/repos/ollama/ollama/issues/3575/events
|
https://github.com/ollama/ollama/issues/3575
| 2,235,514,582
|
I_kwDOJ0Z1Ps6FPz7W
| 3,575
|
Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.
|
{
"login": "Coder-Vishali",
"id": 60731083,
"node_id": "MDQ6VXNlcjYwNzMxMDgz",
"avatar_url": "https://avatars.githubusercontent.com/u/60731083?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Coder-Vishali",
"html_url": "https://github.com/Coder-Vishali",
"followers_url": "https://api.github.com/users/Coder-Vishali/followers",
"following_url": "https://api.github.com/users/Coder-Vishali/following{/other_user}",
"gists_url": "https://api.github.com/users/Coder-Vishali/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Coder-Vishali/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Coder-Vishali/subscriptions",
"organizations_url": "https://api.github.com/users/Coder-Vishali/orgs",
"repos_url": "https://api.github.com/users/Coder-Vishali/repos",
"events_url": "https://api.github.com/users/Coder-Vishali/events{/privacy}",
"received_events_url": "https://api.github.com/users/Coder-Vishali/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 11
| 2024-04-10T12:46:02
| 2025-01-22T05:58:27
| 2024-04-11T10:49:36
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I execute `ollama serve`, I get the following error:
Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.

Things I have tried:
1. Restarted my machine
2. Stopped and started the Ollama server
3. Killed the port using: `netstat -ano | findstr :<PORT>`, then `taskkill /PID <PID> /F`
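
The port conflict above can be reproduced with a minimal Python sketch: try to bind 127.0.0.1:11434 yourself, and if the bind fails, another process (often an already-running Ollama instance or its tray app) still holds the port. `port_in_use` is a hypothetical helper for diagnosis, not part of Ollama:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Try to bind the port; an OSError means another process already holds it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return False
        except OSError:
            return True
```

If `port_in_use(11434)` returns True, stopping the existing Ollama process (including any tray/background instance) should clear the bind error.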
### What did you expect to see?
_No response_
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
Windows
### Architecture
x86
### Platform
_No response_
### Ollama version
_No response_
### GPU
_No response_
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
|
{
"login": "Coder-Vishali",
"id": 60731083,
"node_id": "MDQ6VXNlcjYwNzMxMDgz",
"avatar_url": "https://avatars.githubusercontent.com/u/60731083?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Coder-Vishali",
"html_url": "https://github.com/Coder-Vishali",
"followers_url": "https://api.github.com/users/Coder-Vishali/followers",
"following_url": "https://api.github.com/users/Coder-Vishali/following{/other_user}",
"gists_url": "https://api.github.com/users/Coder-Vishali/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Coder-Vishali/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Coder-Vishali/subscriptions",
"organizations_url": "https://api.github.com/users/Coder-Vishali/orgs",
"repos_url": "https://api.github.com/users/Coder-Vishali/repos",
"events_url": "https://api.github.com/users/Coder-Vishali/events{/privacy}",
"received_events_url": "https://api.github.com/users/Coder-Vishali/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3575/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3575/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5736
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5736/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5736/comments
|
https://api.github.com/repos/ollama/ollama/issues/5736/events
|
https://github.com/ollama/ollama/issues/5736
| 2,412,334,627
|
I_kwDOJ0Z1Ps6PyU4j
| 5,736
|
bug: Open WebUI RAG Malfunction with Ollama Versions Post 0.2.1
|
{
"login": "silentoplayz",
"id": 50341825,
"node_id": "MDQ6VXNlcjUwMzQxODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/50341825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/silentoplayz",
"html_url": "https://github.com/silentoplayz",
"followers_url": "https://api.github.com/users/silentoplayz/followers",
"following_url": "https://api.github.com/users/silentoplayz/following{/other_user}",
"gists_url": "https://api.github.com/users/silentoplayz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/silentoplayz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/silentoplayz/subscriptions",
"organizations_url": "https://api.github.com/users/silentoplayz/orgs",
"repos_url": "https://api.github.com/users/silentoplayz/repos",
"events_url": "https://api.github.com/users/silentoplayz/events{/privacy}",
"received_events_url": "https://api.github.com/users/silentoplayz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 16
| 2024-07-17T01:02:28
| 2024-12-08T01:10:07
| 2024-07-28T02:51:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
**Summary:**
Retrieval-Augmented Generation (RAG) functionality within Open WebUI breaks when using Ollama versions later than 0.2.1 for local models. While external models (e.g., GroqCloud's LLama 3 8B) function correctly with RAG, local models fail to utilize the selected document, returning irrelevant or fabricated information. This issue occurs with both `SentenceTransformers` and `Ollama` RAG embedding models.
**Affected Versions:**
* Ollama: 0.2.2, 0.2.3, 0.2.4, 0.2.5, 0.2.6, 0.2.7, 0.2.8
* Open WebUI: Latest `dev` and `main` branches
**Unaffected Versions:**
* Ollama: versions prior to 0.2.1
* Open WebUI: ?
**Steps to Reproduce:**
1. **Clean Slate:**
* Downgrade Ollama to version 0.2.0 (`ollama --version`).
* In Open WebUI, clear all documents from the `Workspace` > `Documents` tab.
* Navigate to `Admin Panel` > `Settings` > `Documents` and click `Reset Upload Directory` and `Reset Vector Storage`.
2. **Successful RAG Test (Ollama 0.2.0 & 0.2.1):**
* Add a `.txt` document to the Open WebUI `Documents` workspace.
* Start a new chat and select the document using the `#` key.
* Input a query related to the document content.
* Verify that both local and external LLMs respond accurately, incorporating information from the selected document.
* Repeat steps 1 & 2 for Ollama version 0.2.1 after upgrading (`ollama --version`).
3. **Failing RAG Test (Ollama 0.2.2 onwards):**
* Upgrade Ollama to version 0.2.2 (`ollama --version`).
* Start a new chat, select the same document from step 2 using the `#` key, and input the same query.
* Observe that local LLMs fail to utilize the document content, providing irrelevant or fabricated responses.
* Verify that external LLMs still function correctly with RAG.
* Repeat step 3 for Ollama versions 0.2.3-0.3.0, observing the same behavior.
**Expected Behavior:**
Local LLMs should successfully utilize the selected document for RAG, providing accurate and relevant responses based on its content, regardless of the Ollama version used.
**Actual Behavior:**
Local LLMs fail to perform RAG accurately when using Ollama versions 0.2.2 and later, while external models remain unaffected. This occurs despite successful document loading and embedding generation (confirmed by testing with both `SentenceTransformers` and `Ollama` embedding models).
**Additional Notes:**
* The issue persists across multiple attempts, regenerations, and message edits.
* The problem is not specific to a particular document or query, as it consistently occurs with different types of documents.
* Resetting the Open WebUI upload directory and vector storage, as well as re-uploading documents, does not resolve the issue.
* The issue is not related to `Tika` document extraction for RAG within Open WebUI, as confirmed through testing.
* Downgrading Ollama to version 0.2.0 completely resolves the RAG malfunction within Open WebUI.
**Conclusion:**
A regression appears to have been introduced in Ollama versions after 0.2.1, specifically impacting the interaction between Ollama and Open WebUI for local model RAG functionality. This issue necessitates investigation and resolution to ensure the proper functioning of RAG across all supported Ollama versions.
### Related issue on the Open WebUI repo: https://github.com/open-webui/open-webui/discussions/3907
The maintainer of Open WebUI has also confirmed this bug on the latest version of Open WebUI in combination with the latest version of Ollama:


Latest Open WebUI RAG + Ollama v0.2.2 (failures 100% of the time with local models it seems):

### OS
Windows, Docker
### GPU
AMD RX 6800 XT
### CPU
Intel i7-12700K
### Ollama version
0.2.2, 0.2.3, 0.2.4, 0.2.5, 0.2.6, 0.2.7, 0.2.8, 0.2.9, 0.3.0 (latest)
|
{
"login": "silentoplayz",
"id": 50341825,
"node_id": "MDQ6VXNlcjUwMzQxODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/50341825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/silentoplayz",
"html_url": "https://github.com/silentoplayz",
"followers_url": "https://api.github.com/users/silentoplayz/followers",
"following_url": "https://api.github.com/users/silentoplayz/following{/other_user}",
"gists_url": "https://api.github.com/users/silentoplayz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/silentoplayz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/silentoplayz/subscriptions",
"organizations_url": "https://api.github.com/users/silentoplayz/orgs",
"repos_url": "https://api.github.com/users/silentoplayz/repos",
"events_url": "https://api.github.com/users/silentoplayz/events{/privacy}",
"received_events_url": "https://api.github.com/users/silentoplayz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5736/reactions",
"total_count": 6,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/5736/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6057
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6057/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6057/comments
|
https://api.github.com/repos/ollama/ollama/issues/6057/events
|
https://github.com/ollama/ollama/issues/6057
| 2,435,778,412
|
I_kwDOJ0Z1Ps6RLwds
| 6,057
|
Ollama create from Model failed
|
{
"login": "rentianxiang",
"id": 45681984,
"node_id": "MDQ6VXNlcjQ1NjgxOTg0",
"avatar_url": "https://avatars.githubusercontent.com/u/45681984?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rentianxiang",
"html_url": "https://github.com/rentianxiang",
"followers_url": "https://api.github.com/users/rentianxiang/followers",
"following_url": "https://api.github.com/users/rentianxiang/following{/other_user}",
"gists_url": "https://api.github.com/users/rentianxiang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rentianxiang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rentianxiang/subscriptions",
"organizations_url": "https://api.github.com/users/rentianxiang/orgs",
"repos_url": "https://api.github.com/users/rentianxiang/repos",
"events_url": "https://api.github.com/users/rentianxiang/events{/privacy}",
"received_events_url": "https://api.github.com/users/rentianxiang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-29T15:50:17
| 2024-09-02T00:10:26
| 2024-09-02T00:10:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have downloaded the llama3.1:70b model directly from llama.meta.com, and I am trying to import it into Ollama.
It stopped at processing tensors.
I have tried multiple times today, all failed at this stage.
Did I do anything wrong?
This is related to another issue I raised: since I am not able to download it from Ollama directly, I decided to download it first and then import it into Ollama.
https://github.com/ollama/ollama/issues/5852
**Commands:**
PS D:\ollama> ollama create llama3.1:70b
transferring model data
unpacking model metadata
processing tensors
**My Modelfile:**
FROM D:\LLMs\Meta-Llama-3.1-70B-Instruct
The Blob files are generated

Model file also available then deleted

**Server Log:**
2024/07/29 22:48:44 routes.go:1099: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\\Users\\rtx\\.ollama\\models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR:C:\\Users\\rtx\\AppData\\Local\\Programs\\Ollama\\ollama_runners OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-07-29T22:48:44.747+08:00 level=ERROR source=images.go:774 msg="couldn't remove blob" blob=1164794173 error="remove C:\\Users\\rtx\\.ollama\\models\\blobs\\1164794173: The directory is not empty."
time=2024-07-29T22:48:44.748+08:00 level=INFO source=images.go:784 msg="total blobs: 11"
time=2024-07-29T22:48:44.814+08:00 level=INFO source=images.go:791 msg="total unused blobs removed: 1"
time=2024-07-29T22:48:44.815+08:00 level=INFO source=routes.go:1146 msg="Listening on 127.0.0.1:11434 (version 0.3.0)"
time=2024-07-29T22:48:44.819+08:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [rocm_v6.1 cpu cpu_avx cpu_avx2 cuda_v11.3]"
time=2024-07-29T22:48:44.819+08:00 level=INFO source=gpu.go:205 msg="looking for compatible GPUs"
time=2024-07-29T22:48:45.102+08:00 level=INFO source=types.go:105 msg="inference compute" id=GPU-e3ce22d3-ac09-e72f-5795-3c3f0a60b4d2 library=cuda compute=8.9 driver=12.5 name="NVIDIA GeForce RTX 4080 SUPER" total="16.0 GiB" available="14.7 GiB"
[GIN] 2024/07/29 - 22:54:14 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/29 - 22:54:14 | 200 | 1.6607ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/29 - 22:56:15 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/29 - 22:56:15 | 404 | 503.3µs | 127.0.0.1 | POST "/api/show"
[GIN] 2024/07/29 - 22:56:29 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/29 - 22:56:29 | 404 | 0s | 127.0.0.1 | POST "/api/show"
[GIN] 2024/07/29 - 23:01:35 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/29 - 23:01:35 | 200 | 0s | 127.0.0.1 | GET "/api/ps"
[GIN] 2024/07/29 - 23:01:38 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/29 - 23:01:38 | 200 | 612.8µs | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/07/29 - 23:01:43 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/29 - 23:01:43 | 200 | 18.7638ms | 127.0.0.1 | POST "/api/show"
time=2024-07-29T23:01:43.811+08:00 level=INFO source=sched.go:701 msg="new model will fit in available VRAM in single GPU, loading" model=C:\Users\rtx\.ollama\models\blobs\sha256-87048bcd55216712ef14c11c2c303728463207b165bf18440b9b84b07ec00f87 gpu=GPU-e3ce22d3-ac09-e72f-5795-3c3f0a60b4d2 parallel=4 available=15753904128 required="6.2 GiB"
time=2024-07-29T23:01:43.811+08:00 level=INFO source=memory.go:309 msg="offload to cuda" layers.requested=-1 layers.model=33 layers.offload=33 layers.split="" memory.available="[14.7 GiB]" memory.required.full="6.2 GiB" memory.required.partial="6.2 GiB" memory.required.kv="1.0 GiB" memory.required.allocations="[6.2 GiB]" memory.weights.total="4.7 GiB" memory.weights.repeating="4.3 GiB" memory.weights.nonrepeating="411.0 MiB" memory.graph.full="560.0 MiB" memory.graph.partial="677.5 MiB"
time=2024-07-29T23:01:43.816+08:00 level=INFO source=server.go:383 msg="starting llama server" cmd="C:\\Users\\rtx\\AppData\\Local\\Programs\\Ollama\\ollama_runners\\cuda_v11.3\\ollama_llama_server.exe --model C:\\Users\\rtx\\.ollama\\models\\blobs\\sha256-87048bcd55216712ef14c11c2c303728463207b165bf18440b9b84b07ec00f87 --ctx-size 8192 --batch-size 512 --embedding --log-disable --n-gpu-layers 33 --no-mmap --parallel 4 --port 63814"
time=2024-07-29T23:01:43.820+08:00 level=INFO source=sched.go:437 msg="loaded runners" count=1
time=2024-07-29T23:01:43.820+08:00 level=INFO source=server.go:583 msg="waiting for llama runner to start responding"
time=2024-07-29T23:01:43.820+08:00 level=INFO source=server.go:617 msg="waiting for server to become available" status="llm server error"
INFO [wmain] build info | build=3440 commit="d94c6e0c" tid="21640" timestamp=1722265303
INFO [wmain] system info | n_threads=16 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 0 | " tid="21640" timestamp=1722265303 total_threads=32
INFO [wmain] HTTP server listening | hostname="127.0.0.1" n_threads_http="31" port="63814" tid="21640" timestamp=1722265303
llama_model_loader: loaded meta data with 29 key-value pairs and 291 tensors from C:\Users\rtx\.ollama\models\blobs\sha256-87048bcd55216712ef14c11c2c303728463207b165bf18440b9b84b07ec00f87 (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.type str = model
llama_model_loader: - kv 2: general.name str = Meta Llama 3.1 8B Instruct
llama_model_loader: - kv 3: general.finetune str = Instruct
llama_model_loader: - kv 4: general.basename str = Meta-Llama-3.1
llama_model_loader: - kv 5: general.size_label str = 8B
llama_model_loader: - kv 6: general.license str = llama3.1
llama_model_loader: - kv 7: general.tags arr[str,6] = ["facebook", "meta", "pytorch", "llam...
llama_model_loader: - kv 8: general.languages arr[str,8] = ["en", "de", "fr", "it", "pt", "hi", ...
llama_model_loader: - kv 9: llama.block_count u32 = 32
llama_model_loader: - kv 10: llama.context_length u32 = 131072
llama_model_loader: - kv 11: llama.embedding_length u32 = 4096
llama_model_loader: - kv 12: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 13: llama.attention.head_count u32 = 32
llama_model_loader: - kv 14: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 15: llama.rope.freq_base f32 = 500000.000000
llama_model_loader: - kv 16: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 17: general.file_type u32 = 2
llama_model_loader: - kv 18: llama.vocab_size u32 = 128256
llama_model_loader: - kv 19: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 20: tokenizer.ggml.model str = gpt2
llama_model_loader: - kv 21: tokenizer.ggml.pre str = llama-bpe
llama_model_loader: - kv 22: tokenizer.ggml.tokens arr[str,128256] = ["!", "\"", "#", "$", "%", "&", "'", ...
llama_model_loader: - kv 23: tokenizer.ggml.token_type arr[i32,128256] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
llama_model_loader: - kv 24: tokenizer.ggml.merges arr[str,280147] = ["Ġ Ġ", "Ġ ĠĠĠ", "ĠĠ ĠĠ", "...
llama_model_loader: - kv 25: tokenizer.ggml.bos_token_id u32 = 128000
llama_model_loader: - kv 26: tokenizer.ggml.eos_token_id u32 = 128009
llama_model_loader: - kv 27: tokenizer.chat_template str = {% set loop_messages = messages %}{% ...
llama_model_loader: - kv 28: general.quantization_version u32 = 2
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type q4_0: 225 tensors
llama_model_loader: - type q6_K: 1 tensors
time=2024-07-29T23:01:44.074+08:00 level=INFO source=server.go:617 msg="waiting for server to become available" status="llm server loading model"
llm_load_vocab: special tokens cache size = 256
llm_load_vocab: token to piece cache size = 0.7999 MB
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = BPE
llm_load_print_meta: n_vocab = 128256
llm_load_print_meta: n_merges = 280147
llm_load_print_meta: vocab_only = 0
llm_load_print_meta: n_ctx_train = 131072
llm_load_print_meta: n_embd = 4096
llm_load_print_meta: n_layer = 32
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 8
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_swa = 0
llm_load_print_meta: n_embd_head_k = 128
llm_load_print_meta: n_embd_head_v = 128
llm_load_print_meta: n_gqa = 4
llm_load_print_meta: n_embd_k_gqa = 1024
llm_load_print_meta: n_embd_v_gqa = 1024
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: causal attn = 1
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 0
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 500000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_ctx_orig_yarn = 131072
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: model type = 8B
llm_load_print_meta: model ftype = Q4_0
llm_load_print_meta: model params = 8.03 B
llm_load_print_meta: model size = 4.33 GiB (4.64 BPW)
llm_load_print_meta: general.name = Meta Llama 3.1 8B Instruct
llm_load_print_meta: BOS token = 128000 '<|begin_of_text|>'
llm_load_print_meta: EOS token = 128009 '<|eot_id|>'
llm_load_print_meta: LF token = 128 'Ä'
llm_load_print_meta: EOT token = 128009 '<|eot_id|>'
llm_load_print_meta: max token length = 256
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 4080 SUPER, compute capability 8.9, VMM: yes
llm_load_tensors: ggml ctx size = 0.27 MiB
llm_load_tensors: offloading 32 repeating layers to GPU
llm_load_tensors: offloading non-repeating layers to GPU
llm_load_tensors: offloaded 33/33 layers to GPU
llm_load_tensors: CUDA_Host buffer size = 281.81 MiB
llm_load_tensors: CUDA0 buffer size = 4155.99 MiB
llama_new_context_with_model: n_ctx = 8192
llama_new_context_with_model: n_batch = 512
llama_new_context_with_model: n_ubatch = 512
llama_new_context_with_model: flash_attn = 0
llama_new_context_with_model: freq_base = 500000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CUDA0 KV buffer size = 1024.00 MiB
llama_new_context_with_model: KV self size = 1024.00 MiB, K (f16): 512.00 MiB, V (f16): 512.00 MiB
llama_new_context_with_model: CUDA_Host output buffer size = 2.02 MiB
llama_new_context_with_model: CUDA0 compute buffer size = 560.00 MiB
llama_new_context_with_model: CUDA_Host compute buffer size = 24.01 MiB
llama_new_context_with_model: graph nodes = 1030
llama_new_context_with_model: graph splits = 2
INFO [wmain] model loaded | tid="21640" timestamp=1722265307
time=2024-07-29T23:01:47.646+08:00 level=INFO source=server.go:622 msg="llama runner started in 3.83 seconds"
[GIN] 2024/07/29 - 23:01:47 | 200 | 3.8840834s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/07/29 - 23:02:02 | 200 | 6.8410567s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/07/29 - 23:03:03 | 200 | 7.8408725s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/07/29 - 23:04:10 | 200 | 6.8992154s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/07/29 - 23:13:44 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/07/29 - 23:30:20 | 201 | 5m34s | 127.0.0.1 | POST "/api/blobs/sha256:e95f4b961ddd29031a98e9f84e2e2469d1005ee58d88dee7058cccf71a48cac5"
runtime: VirtualAlloc of 117440512 bytes failed with errno=1455
fatal error: out of memory
runtime stack:
runtime.throw({0x182ef79?, 0xe9f5baa000?})
runtime/panic.go:1023 +0x65 fp=0x346afffcf0 sp=0x346afffcc0 pc=0x85e9a5
runtime.sysUsedOS(0xe9f3e00000, 0x7000000)
runtime/mem_windows.go:83 +0x1bb fp=0x346afffd50 sp=0x346afffcf0 pc=0x83d21b
runtime.sysUsed(...)
runtime/mem.go:77
runtime.(*mheap).allocSpan(0x21267a0, 0x3800, 0x0, 0x1)
runtime/mheap.go:1347 +0x487 fp=0x346afffdf0 sp=0x346afffd50 pc=0x84f767
runtime.(*mheap).alloc.func1()
runtime/mheap.go:964 +0x5c fp=0x346afffe38 sp=0x346afffdf0 pc=0x84ef1c
runtime.systemstack(0xc000581180)
runtime/asm_amd64.s:509 +0x49 fp=0x346afffe48 sp=0x346afffe38 pc=0x8904a9
goroutine 195 gp=0xc000105a40 m=12 mp=0xc0003df008 [running]:
runtime.systemstack_switch()
runtime/asm_amd64.s:474 +0x8 fp=0xc00039a910 sp=0xc00039a900 pc=0x890448
runtime.(*mheap).alloc(0x7000000?, 0x3800?, 0xa0?)
runtime/mheap.go:958 +0x5b fp=0xc00039a958 sp=0xc00039a910 pc=0x84ee7b
runtime.(*mcache).allocLarge(0x83babd?, 0x7000000, 0x1)
runtime/mcache.go:234 +0x87 fp=0xc00039a9a8 sp=0xc00039a958 pc=0x83bfa7
runtime.mallocgc(0x7000000, 0x16cda80, 0x1)
runtime/malloc.go:1165 +0x597 fp=0xc00039aa30 sp=0xc00039a9a8 pc=0x832fb7
runtime.makeslice(0xc0003df008?, 0xd9cb6b6480?, 0x0?)
runtime/slice.go:107 +0x49 fp=0xc00039aa58 sp=0xc00039aa30 pc=0x874b89
github.com/nlpodyssey/gopickle/pytorch.(*BFloat16Storage).SetFromFileWithSize(0xd9cb6b6480, {0x155c6ed0048, 0xdb2c0282d0}, 0x1c00000)
github.com/nlpodyssey/gopickle@v0.3.0/pytorch/storage.go:395 +0x45 fp=0xc00039aad8 sp=0xc00039aa58 pc=0x117a285
github.com/nlpodyssey/gopickle/pytorch.loadTensor({0x155c69b6d20, 0x21a4860}, 0x1c00000, {0xd8a87779cb, 0x3}, {0xc88d0f733e, 0x2}, 0x91e219?)
github.com/nlpodyssey/gopickle@v0.3.0/pytorch/pytorch.go:127 +0x209 fp=0xc00039ab88 sp=0xc00039aad8 pc=0x1176a69
github.com/nlpodyssey/gopickle/pytorch.loadZipFile.func1({0x1744640?, 0xc00036e768?})
github.com/nlpodyssey/gopickle@v0.3.0/pytorch/pytorch.go:99 +0x32b fp=0xc00039ac50 sp=0xc00039ab88 pc=0x11761ab
github.com/nlpodyssey/gopickle/pickle.loadBinPersId(0xd8a87f4600)
github.com/nlpodyssey/gopickle@v0.3.0/pickle/pickle.go:439 +0x3e fp=0xc00039acb0 sp=0xc00039ac50 pc=0x116ebbe
github.com/nlpodyssey/gopickle/pickle.(*Unpickler).Load(0xd8a87f4600)
github.com/nlpodyssey/gopickle@v0.3.0/pickle/pickle.go:102 +0xe6 fp=0xc00039ad08 sp=0xc00039acb0 pc=0x116d1c6
github.com/nlpodyssey/gopickle/pytorch.loadZipFile({0xc0000c11a0, 0x40}, 0x18970a0)
github.com/nlpodyssey/gopickle@v0.3.0/pytorch/pytorch.go:107 +0x5f9 fp=0xc00039ae68 sp=0xc00039ad08 pc=0x1175d39
github.com/nlpodyssey/gopickle/pytorch.LoadWithUnpickler({0xc0000c11a0, 0x40}, 0x18970a0)
github.com/nlpodyssey/gopickle@v0.3.0/pytorch/pytorch.go:40 +0x3d fp=0xc00039ae90 sp=0xc00039ae68 pc=0x11756dd
github.com/nlpodyssey/gopickle/pytorch.Load({0xc0000c11a0?, 0xc00039b0f0?})
github.com/nlpodyssey/gopickle@v0.3.0/pytorch/pytorch.go:31 +0x1f fp=0xc00039aeb8 sp=0xc00039ae90 pc=0x117565f
github.com/ollama/ollama/convert.(*TorchFormat).GetTensors(0x21a4860, {0xc0008584e0, 0x2c}, 0xc0005746e0)
github.com/ollama/ollama/convert/torch.go:46 +0x29f fp=0xc00039b180 sp=0xc00039aeb8 pc=0x118405f
github.com/ollama/ollama/convert.(*LlamaModel).GetTensors(0xc000854360)
github.com/ollama/ollama/convert/llama.go:24 +0x42 fp=0xc00039b2e0 sp=0xc00039b180 pc=0x117cd82
github.com/ollama/ollama/server.parseFromZipFile({0x19bf440?, 0xc0000ed2c0?}, 0xc0001600e0, {0xc000036871, 0x47}, 0xc000238230)
github.com/ollama/ollama/server/model.go:162 +0x229 fp=0xc00039b4e0 sp=0xc00039b2e0 pc=0x133ea29
github.com/ollama/ollama/server.parseFromFile({0x19c6e50, 0xc00011a3c0}, 0xc0001600e0, {0xc000036871, 0x47}, 0xc000238230)
github.com/ollama/ollama/server/model.go:222 +0x177 fp=0xc00039b5c0 sp=0xc00039b4e0 pc=0x133f737
github.com/ollama/ollama/server.CreateModel({0x19c6e50, 0xc00011a3c0}, {{0x18346cd, 0x12}, {0x1828b55, 0x7}, {0xc0002222a0, 0x8}, {0xc0002222a9, 0x3}}, ...)
github.com/ollama/ollama/server/images.go:418 +0x7dd fp=0xc00039be78 sp=0xc00039b5c0 pc=0x13343bd
github.com/ollama/ollama/server.(*Server).CreateModelHandler.func1()
github.com/ollama/ollama/server/routes.go:612 +0x26b fp=0xc00039bfe0 sp=0xc00039be78 pc=0x134942b
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00039bfe8 sp=0xc00039bfe0 pc=0x892481
created by github.com/ollama/ollama/server.(*Server).CreateModelHandler in goroutine 111
github.com/ollama/ollama/server/routes.go:602 +0x9b9
goroutine 1 gp=0xc0000a2000 m=nil [IO wait, 20 minutes]:
runtime.gopark(0xc0000c3808?, 0x16365e0?, 0x20?, 0x99?, 0xc000499950?)
runtime/proc.go:402 +0xce fp=0xc0002d5670 sp=0xc0002d5650 pc=0x86176e
runtime.netpollblock(0x1c8?, 0x828e06?, 0x0?)
runtime/netpoll.go:573 +0xf7 fp=0xc0002d56a8 sp=0xc0002d5670 pc=0x859017
internal/poll.runtime_pollWait(0x155c6bb8820, 0x72)
runtime/netpoll.go:345 +0x85 fp=0xc0002d56c8 sp=0xc0002d56a8 pc=0x88c025
internal/poll.(*pollDesc).wait(0x840476?, 0x21a6820?, 0x0)
internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc0002d56f0 sp=0xc0002d56c8 pc=0x930887
internal/poll.execIO(0xc000499920, 0xc0002d5790)
internal/poll/fd_windows.go:175 +0xe6 fp=0xc0002d5760 sp=0xc0002d56f0 pc=0x931d66
internal/poll.(*FD).acceptOne(0xc000499908, 0x440, {0xc0002b00f0?, 0x0?, 0x1500000000?}, 0xc0000c3808?)
internal/poll/fd_windows.go:944 +0x67 fp=0xc0002d57c0 sp=0xc0002d5760 pc=0x936427
internal/poll.(*FD).Accept(0xc000499908, 0xc0002d5970)
internal/poll/fd_windows.go:978 +0x1bc fp=0xc0002d5878 sp=0xc0002d57c0 pc=0x93675c
net.(*netFD).accept(0xc000499908)
net/fd_windows.go:178 +0x54 fp=0xc0002d5990 sp=0xc0002d5878 pc=0x9c8594
net.(*TCPListener).accept(0xc000543620)
net/tcpsock_posix.go:159 +0x1e fp=0xc0002d59b8 sp=0xc0002d5990 pc=0x9de95e
net.(*TCPListener).Accept(0xc000543620)
net/tcpsock.go:327 +0x30 fp=0xc0002d59e8 sp=0xc0002d59b8 pc=0x9dd750
net/http.(*onceCloseListener).Accept(0xc0001c5200?)
<autogenerated>:1 +0x24 fp=0xc0002d5a00 sp=0xc0002d59e8 pc=0xb53924
net/http.(*Server).Serve(0xc00056a2d0, {0x19c4480, 0xc000543620})
net/http/server.go:3260 +0x33e fp=0xc0002d5b30 sp=0xc0002d5a00 pc=0xb312de
github.com/ollama/ollama/server.Serve({0x19c4480, 0xc000543620})
github.com/ollama/ollama/server/routes.go:1182 +0x7c5 fp=0xc0002d5cd0 sp=0xc0002d5b30 pc=0x13500c5
github.com/ollama/ollama/cmd.RunServer(0xc00004d500?, {0x21a4860?, 0x4?, 0x181f0ab?})
github.com/ollama/ollama/cmd/cmd.go:1084 +0x105 fp=0xc0002d5d58 sp=0xc0002d5cd0 pc=0x1373805
github.com/spf13/cobra.(*Command).execute(0xc000570908, {0x21a4860, 0x0, 0x0})
github.com/spf13/cobra@v1.7.0/command.go:940 +0x882 fp=0xc0002d5e78 sp=0xc0002d5d58 pc=0xbcd2e2
github.com/spf13/cobra.(*Command).ExecuteC(0xc000139808)
github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5 fp=0xc0002d5f30 sp=0xc0002d5e78 pc=0xbcdb25
github.com/spf13/cobra.(*Command).Execute(...)
github.com/spf13/cobra@v1.7.0/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(...)
github.com/spf13/cobra@v1.7.0/command.go:985
main.main()
github.com/ollama/ollama/main.go:11 +0x4d fp=0xc0002d5f50 sp=0xc0002d5f30 pc=0x137c4cd
runtime.main()
runtime/proc.go:271 +0x28b fp=0xc0002d5fe0 sp=0xc0002d5f50 pc=0x86136b
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0002d5fe8 sp=0xc0002d5fe0 pc=0x892481
goroutine 2 gp=0xc0000a2700 m=nil [force gc (idle), 2 minutes]:
runtime.gopark(0x63eee3204744?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0000a5fa8 sp=0xc0000a5f88 pc=0x86176e
runtime.goparkunlock(...)
runtime/proc.go:408
runtime.forcegchelper()
runtime/proc.go:326 +0xb8 fp=0xc0000a5fe0 sp=0xc0000a5fa8 pc=0x8615f8
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0000a5fe8 sp=0xc0000a5fe0 pc=0x892481
created by runtime.init.6 in goroutine 1
runtime/proc.go:314 +0x1a
goroutine 3 gp=0xc0000a2a80 m=nil [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0000a7f80 sp=0xc0000a7f60 pc=0x86176e
runtime.goparkunlock(...)
runtime/proc.go:408
runtime.bgsweep(0xc00003a070)
runtime/mgcsweep.go:318 +0xdf fp=0xc0000a7fc8 sp=0xc0000a7f80 pc=0x84b81f
runtime.gcenable.gowrap1()
runtime/mgc.go:203 +0x25 fp=0xc0000a7fe0 sp=0xc0000a7fc8 pc=0x8400c5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0000a7fe8 sp=0xc0000a7fe0 pc=0x892481
created by runtime.gcenable in goroutine 1
runtime/mgc.go:203 +0x66
goroutine 4 gp=0xc0000a2c40 m=nil [GC scavenge wait]:
runtime.gopark(0xf89bc?, 0x96420?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0000b7f78 sp=0xc0000b7f58 pc=0x86176e
runtime.goparkunlock(...)
runtime/proc.go:408
runtime.(*scavengerState).park(0x2118260)
runtime/mgcscavenge.go:425 +0x49 fp=0xc0000b7fa8 sp=0xc0000b7f78 pc=0x8491a9
runtime.bgscavenge(0xc00003a070)
runtime/mgcscavenge.go:658 +0x59 fp=0xc0000b7fc8 sp=0xc0000b7fa8 pc=0x849759
runtime.gcenable.gowrap2()
runtime/mgc.go:204 +0x25 fp=0xc0000b7fe0 sp=0xc0000b7fc8 pc=0x840065
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0000b7fe8 sp=0xc0000b7fe0 pc=0x892481
created by runtime.gcenable in goroutine 1
runtime/mgc.go:204 +0xa5
goroutine 5 gp=0xc0000a3180 m=nil [finalizer wait, 54 minutes]:
runtime.gopark(0xc0000a9e48?, 0x833465?, 0xa8?, 0x1?, 0xc0000a2000?)
runtime/proc.go:402 +0xce fp=0xc0000a9e20 sp=0xc0000a9e00 pc=0x86176e
runtime.runfinq()
runtime/mfinal.go:194 +0x107 fp=0xc0000a9fe0 sp=0xc0000a9e20 pc=0x83f147
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0000a9fe8 sp=0xc0000a9fe0 pc=0x892481
created by runtime.createfing in goroutine 1
runtime/mfinal.go:164 +0x3d
goroutine 6 gp=0xc00021cc40 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0000b9f50 sp=0xc0000b9f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0000b9fe0 sp=0xc0000b9f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0000b9fe8 sp=0xc0000b9fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 7 gp=0xc00021ce00 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0000b3f50 sp=0xc0000b3f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0000b3fe0 sp=0xc0000b3f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0000b3fe8 sp=0xc0000b3fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 18 gp=0xc000500000 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000507f50 sp=0xc000507f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000507fe0 sp=0xc000507f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000507fe8 sp=0xc000507fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 19 gp=0xc0005001c0 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000509f50 sp=0xc000509f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000509fe0 sp=0xc000509f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000509fe8 sp=0xc000509fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 34 gp=0xc0001041c0 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000503f50 sp=0xc000503f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000503fe0 sp=0xc000503f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000503fe8 sp=0xc000503fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 35 gp=0xc000104380 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000505f50 sp=0xc000505f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000505fe0 sp=0xc000505f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000505fe8 sp=0xc000505fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 8 gp=0xc00021cfc0 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc0000b5f50 sp=0xc0000b5f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc0000b5fe0 sp=0xc0000b5f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0000b5fe8 sp=0xc0000b5fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 20 gp=0xc000500380 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000513f50 sp=0xc000513f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000513fe0 sp=0xc000513f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000513fe8 sp=0xc000513fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 21 gp=0xc000500540 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000515f50 sp=0xc000515f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000515fe0 sp=0xc000515f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000515fe8 sp=0xc000515fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 9 gp=0xc00021d180 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00050ff50 sp=0xc00050ff30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00050ffe0 sp=0xc00050ff50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00050ffe8 sp=0xc00050ffe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 10 gp=0xc00021d340 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000511f50 sp=0xc000511f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000511fe0 sp=0xc000511f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000511fe8 sp=0xc000511fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 11 gp=0xc00021d500 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000487f50 sp=0xc000487f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000487fe0 sp=0xc000487f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000487fe8 sp=0xc000487fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 36 gp=0xc000104540 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000483f50 sp=0xc000483f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000483fe0 sp=0xc000483f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000483fe8 sp=0xc000483fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 50 gp=0xc000580000 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000587f50 sp=0xc000587f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000587fe0 sp=0xc000587f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000587fe8 sp=0xc000587fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 37 gp=0xc000104700 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000485f50 sp=0xc000485f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000485fe0 sp=0xc000485f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000485fe8 sp=0xc000485fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 51 gp=0xc0005801c0 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000589f50 sp=0xc000589f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000589fe0 sp=0xc000589f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000589fe8 sp=0xc000589fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 12 gp=0xc00021d6c0 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000489f50 sp=0xc000489f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000489fe0 sp=0xc000489f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000489fe8 sp=0xc000489fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 13 gp=0xc00021d880 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000583f50 sp=0xc000583f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000583fe0 sp=0xc000583f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000583fe8 sp=0xc000583fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 22 gp=0xc000500700 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00051bf50 sp=0xc00051bf30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00051bfe0 sp=0xc00051bf50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00051bfe8 sp=0xc00051bfe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 52 gp=0xc000580380 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000517f50 sp=0xc000517f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000517fe0 sp=0xc000517f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000517fe8 sp=0xc000517fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 53 gp=0xc000580540 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000519f50 sp=0xc000519f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000519fe0 sp=0xc000519f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000519fe8 sp=0xc000519fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 23 gp=0xc0005008c0 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00051df50 sp=0xc00051df30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00051dfe0 sp=0xc00051df50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00051dfe8 sp=0xc00051dfe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 24 gp=0xc000500a80 m=nil [GC worker (idle), 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000523f50 sp=0xc000523f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000523fe0 sp=0xc000523f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000523fe8 sp=0xc000523fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 14 gp=0xc00021da40 m=nil [GC worker (idle), 2 minutes]:
runtime.gopark(0x21a6820?, 0x1?, 0xbc?, 0x1c?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000585f50 sp=0xc000585f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000585fe0 sp=0xc000585f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000585fe8 sp=0xc000585fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 25 gp=0xc000500c40 m=nil [GC worker (idle)]:
runtime.gopark(0x63eee4216ee8?, 0x1?, 0x9c?, 0x7a?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000525f50 sp=0xc000525f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000525fe0 sp=0xc000525f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000525fe8 sp=0xc000525fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 15 gp=0xc00021dc00 m=nil [GC worker (idle)]:
runtime.gopark(0x21a6820?, 0x1?, 0x40?, 0x43?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00051ff50 sp=0xc00051ff30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00051ffe0 sp=0xc00051ff50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00051ffe8 sp=0xc00051ffe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 26 gp=0xc000500e00 m=nil [GC worker (idle), 7 minutes]:
runtime.gopark(0x638de805c1f0?, 0x1?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00052bf50 sp=0xc00052bf30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00052bfe0 sp=0xc00052bf50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00052bfe8 sp=0xc00052bfe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 16 gp=0xc00021ddc0 m=nil [GC worker (idle)]:
runtime.gopark(0x21a6820?, 0x1?, 0x48?, 0xae?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000521f50 sp=0xc000521f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000521fe0 sp=0xc000521f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000521fe8 sp=0xc000521fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 27 gp=0xc000500fc0 m=nil [GC worker (idle)]:
runtime.gopark(0x21a6820?, 0x1?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc00052df50 sp=0xc00052df30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc00052dfe0 sp=0xc00052df50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc00052dfe8 sp=0xc00052dfe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 66 gp=0xc00048a000 m=nil [GC worker (idle)]:
runtime.gopark(0x21a6820?, 0x1?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000527f50 sp=0xc000527f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000527fe0 sp=0xc000527f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000527fe8 sp=0xc000527fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 28 gp=0xc000501180 m=nil [GC worker (idle)]:
runtime.gopark(0x63eee4216ee8?, 0x1?, 0xd0?, 0x43?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000535f50 sp=0xc000535f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000535fe0 sp=0xc000535f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000535fe8 sp=0xc000535fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 67 gp=0xc00048a1c0 m=nil [GC worker (idle)]:
runtime.gopark(0x21a6820?, 0x1?, 0x40?, 0x5?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000529f50 sp=0xc000529f30 pc=0x86176e
runtime.gcBgMarkWorker()
runtime/mgc.go:1310 +0xe5 fp=0xc000529fe0 sp=0xc000529f50 pc=0x842205
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000529fe8 sp=0xc000529fe0 pc=0x892481
created by runtime.gcBgMarkStartWorkers in goroutine 1
runtime/mgc.go:1234 +0x1c
goroutine 38 gp=0xc00048a380 m=8 mp=0xc000600008 [syscall, 54 minutes]:
runtime.notetsleepg(0x21a5460, 0xffffffffffffffff)
runtime/lock_sema.go:296 +0x31 fp=0xc000533fa0 sp=0xc000533f68 pc=0x831a31
os/signal.signal_recv()
runtime/sigqueue.go:152 +0x29 fp=0xc000533fc0 sp=0xc000533fa0 pc=0x88e189
os/signal.loop()
os/signal/signal_unix.go:23 +0x13 fp=0xc000533fe0 sp=0xc000533fc0 pc=0xb55d73
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000533fe8 sp=0xc000533fe0 pc=0x892481
created by os/signal.Notify.func1.1 in goroutine 1
os/signal/signal.go:151 +0x1f
goroutine 39 gp=0xc00048a540 m=nil [chan receive, 54 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:402 +0xce fp=0xc000537f00 sp=0xc000537ee0 pc=0x86176e
runtime.chanrecv(0xc0001725a0, 0x0, 0x1)
runtime/chan.go:583 +0x3cd fp=0xc000537f78 sp=0xc000537f00 pc=0x82b9ad
runtime.chanrecv1(0x0?, 0x0?)
runtime/chan.go:442 +0x12 fp=0xc000537fa0 sp=0xc000537f78 pc=0x82b5b2
github.com/ollama/ollama/server.Serve.func2()
github.com/ollama/ollama/server/routes.go:1163 +0x3d fp=0xc000537fe0 sp=0xc000537fa0 pc=0x13501dd
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000537fe8 sp=0xc000537fe0 pc=0x892481
created by github.com/ollama/ollama/server.Serve in goroutine 1
github.com/ollama/ollama/server/routes.go:1162 +0x72c
goroutine 40 gp=0xc00048a700 m=nil [select, 34 minutes]:
runtime.gopark(0xc0003abf50?, 0x3?, 0x40?, 0xac?, 0xc0003abd12?)
runtime/proc.go:402 +0xce fp=0xc0003abb98 sp=0xc0003abb78 pc=0x86176e
runtime.selectgo(0xc0003abf50, 0xc0003abd0c, 0x21a4860?, 0x0, 0x185739b?, 0x1)
runtime/select.go:327 +0x725 fp=0xc0003abcb8 sp=0xc0003abb98 pc=0x871bc5
github.com/ollama/ollama/server.(*Scheduler).processPending(0xc000172180, {0x19c6e50, 0xc0000be640})
github.com/ollama/ollama/server/sched.go:114 +0xcf fp=0xc0003abfb8 sp=0xc0003abcb8 pc=0x1352f4f
github.com/ollama/ollama/server.(*Scheduler).Run.func1()
github.com/ollama/ollama/server/sched.go:104 +0x1f fp=0xc0003abfe0 sp=0xc0003abfb8 pc=0x1352e5f
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0003abfe8 sp=0xc0003abfe0 pc=0x892481
created by github.com/ollama/ollama/server.(*Scheduler).Run in goroutine 1
github.com/ollama/ollama/server/sched.go:103 +0xb4
goroutine 41 gp=0xc00048a8c0 m=nil [select, 34 minutes]:
runtime.gopark(0xc000047f50?, 0x3?, 0x8?, 0x7c?, 0xc000047d52?)
runtime/proc.go:402 +0xce fp=0xc000531be0 sp=0xc000531bc0 pc=0x86176e
runtime.selectgo(0xc000531f50, 0xc000047d4c, 0x21a4860?, 0x0, 0x183d4a1?, 0x1)
runtime/select.go:327 +0x725 fp=0xc000531d00 sp=0xc000531be0 pc=0x871bc5
github.com/ollama/ollama/server.(*Scheduler).processCompleted(0xc000172180, {0x19c6e50, 0xc0000be640})
github.com/ollama/ollama/server/sched.go:303 +0xec fp=0xc000531fb8 sp=0xc000531d00 pc=0x135410c
github.com/ollama/ollama/server.(*Scheduler).Run.func2()
github.com/ollama/ollama/server/sched.go:108 +0x1f fp=0xc000531fe0 sp=0xc000531fb8 pc=0x1352e1f
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000531fe8 sp=0xc000531fe0 pc=0x892481
created by github.com/ollama/ollama/server.(*Scheduler).Run in goroutine 1
github.com/ollama/ollama/server/sched.go:107 +0x110
goroutine 111 gp=0xc000581340 m=nil [chan receive, 8 minutes]:
runtime.gopark(0x4?, 0xc0001c5200?, 0x1?, 0x0?, 0xc0000472f0?)
runtime/proc.go:402 +0xce fp=0xc000047290 sp=0xc000047270 pc=0x86176e
runtime.chanrecv(0xc0000c0480, 0xc000047388, 0x1)
runtime/chan.go:583 +0x3cd fp=0xc000047308 sp=0xc000047290 pc=0x82b9ad
runtime.chanrecv2(0xc000116040?, 0xc000380080?)
runtime/chan.go:447 +0x12 fp=0xc000047330 sp=0xc000047308 pc=0x82b5d2
github.com/ollama/ollama/server.streamResponse.func1({0x155c6da18e0, 0xc00010c000})
github.com/ollama/ollama/server/routes.go:1224 +0x36 fp=0xc0000473a8 sp=0xc000047330 pc=0x13507f6
github.com/gin-gonic/gin.(*Context).Stream(0xc000047428?, 0xc000047418)
github.com/gin-gonic/gin@v1.10.0/context.go:1124 +0x79 fp=0xc0000473f0 sp=0xc0000473a8 pc=0x130fa59
github.com/ollama/ollama/server.streamResponse(0xc00010c000, 0xc0000c0480)
github.com/ollama/ollama/server/routes.go:1223 +0x65 fp=0xc000047438 sp=0xc0000473f0 pc=0x1350785
github.com/ollama/ollama/server.(*Server).CreateModelHandler(0x18021a685d?, 0xc00010c000)
github.com/ollama/ollama/server/routes.go:624 +0xa25 fp=0xc000047660 sp=0xc000047438 pc=0x1348f45
github.com/ollama/ollama/server.(*Server).CreateModelHandler-fm(0x9?)
<autogenerated>:1 +0x26 fp=0xc000047680 sp=0xc000047660 pc=0x1363386
github.com/gin-gonic/gin.(*Context).Next(0xc00010c000)
github.com/gin-gonic/gin@v1.10.0/context.go:185 +0x2b fp=0xc0000476a0 sp=0xc000047680 pc=0x1309c4b
github.com/ollama/ollama/server.(*Server).GenerateRoutes.allowedHostsMiddleware.func3(0xc00010c000)
github.com/ollama/ollama/server/routes.go:1022 +0x115 fp=0xc0000476f8 sp=0xc0000476a0 pc=0x134f855
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.10.0/context.go:185
github.com/gin-gonic/gin.CustomRecoveryWithWriter.func1(0xc00010c000)
github.com/gin-gonic/gin@v1.10.0/recovery.go:102 +0x7a fp=0xc000047748 sp=0xc0000476f8 pc=0x1317cba
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.10.0/context.go:185
github.com/gin-gonic/gin.LoggerWithConfig.func1(0xc00010c000)
github.com/gin-gonic/gin@v1.10.0/logger.go:249 +0xe5 fp=0xc000047900 sp=0xc000047748 pc=0x1316de5
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.10.0/context.go:185
github.com/gin-gonic/gin.(*Engine).handleHTTPRequest(0xc0001321a0, 0xc00010c000)
github.com/gin-gonic/gin@v1.10.0/gin.go:633 +0x892 fp=0xc000047ad8 sp=0xc000047900 pc=0x1316212
github.com/gin-gonic/gin.(*Engine).ServeHTTP(0xc0001321a0, {0x19c4690, 0xc0005560e0}, 0xc0004e6240)
github.com/gin-gonic/gin@v1.10.0/gin.go:589 +0x1b2 fp=0xc000047b10 sp=0xc000047ad8 pc=0x13157b2
net/http.(*ServeMux).ServeHTTP(0x833465?, {0x19c4690, 0xc0005560e0}, 0xc0004e6240)
net/http/server.go:2688 +0x1ad fp=0xc000047b60 sp=0xc000047b10 pc=0xb2f6ad
net/http.serverHandler.ServeHTTP({0x19c21d0?}, {0x19c4690?, 0xc0005560e0?}, 0x6?)
net/http/server.go:3142 +0x8e fp=0xc000047b90 sp=0xc000047b60 pc=0xb30eae
net/http.(*conn).serve(0xc0001c5200, {0x19c6e18, 0xc0000ece10})
net/http/server.go:2044 +0x5e8 fp=0xc000047fb8 sp=0xc000047b90 pc=0xb2c1a8
net/http.(*Server).Serve.gowrap3()
net/http/server.go:3290 +0x28 fp=0xc000047fe0 sp=0xc000047fb8 pc=0xb316c8
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000047fe8 sp=0xc000047fe0 pc=0x892481
created by net/http.(*Server).Serve in goroutine 1
net/http/server.go:3290 +0x4b4
goroutine 194 gp=0xc000105180 m=nil [IO wait, 14 minutes]:
runtime.gopark(0x0?, 0xc000553920?, 0xd0?, 0x39?, 0xc000553950?)
runtime/proc.go:402 +0xce fp=0xc0006b5d28 sp=0xc0006b5d08 pc=0x86176e
runtime.netpollblock(0x538?, 0x828e06?, 0x0?)
runtime/netpoll.go:573 +0xf7 fp=0xc0006b5d60 sp=0xc0006b5d28 pc=0x859017
internal/poll.runtime_pollWait(0x155c6bb8728, 0x72)
runtime/netpoll.go:345 +0x85 fp=0xc0006b5d80 sp=0xc0006b5d60 pc=0x88c025
internal/poll.(*pollDesc).wait(0x10?, 0x10?, 0x0)
internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc0006b5da8 sp=0xc0006b5d80 pc=0x930887
internal/poll.execIO(0xc000553920, 0x1896ac0)
internal/poll/fd_windows.go:175 +0xe6 fp=0xc0006b5e18 sp=0xc0006b5da8 pc=0x931d66
internal/poll.(*FD).Read(0xc000553908, {0xc000690041, 0x1, 0x1})
internal/poll/fd_windows.go:436 +0x2b1 fp=0xc0006b5ec0 sp=0xc0006b5e18 pc=0x932a11
net.(*netFD).Read(0xc000553908, {0xc000690041?, 0xc0006b5f48?, 0x88ded0?})
net/fd_posix.go:55 +0x25 fp=0xc0006b5f08 sp=0xc0006b5ec0 pc=0x9c66a5
net.(*conn).Read(0xc000160000, {0xc000690041?, 0xc0000bcd80?, 0x21a4860?})
net/net.go:185 +0x45 fp=0xc0006b5f50 sp=0xc0006b5f08 pc=0x9d6345
net.(*TCPConn).Read(0x160ee10?, {0xc000690041?, 0x0?, 0xc00063e3c0?})
<autogenerated>:1 +0x25 fp=0xc0006b5f80 sp=0xc0006b5f50 pc=0x9e6665
net/http.(*connReader).backgroundRead(0xc000690030)
net/http/server.go:681 +0x37 fp=0xc0006b5fc8 sp=0xc0006b5f80 pc=0xb26117
net/http.(*connReader).startBackgroundRead.gowrap2()
net/http/server.go:677 +0x25 fp=0xc0006b5fe0 sp=0xc0006b5fc8 pc=0xb26045
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0006b5fe8 sp=0xc0006b5fe0 pc=0x892481
created by net/http.(*connReader).startBackgroundRead in goroutine 111
net/http/server.go:677 +0xba
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.0
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6057/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6057/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6342
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6342/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6342/comments
|
https://api.github.com/repos/ollama/ollama/issues/6342/events
|
https://github.com/ollama/ollama/issues/6342
| 2,463,951,443
|
I_kwDOJ0Z1Ps6S3OpT
| 6,342
|
Windows Defender
|
{
"login": "Eniti-Codes",
"id": 106023124,
"node_id": "U_kgDOBlHI1A",
"avatar_url": "https://avatars.githubusercontent.com/u/106023124?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Eniti-Codes",
"html_url": "https://github.com/Eniti-Codes",
"followers_url": "https://api.github.com/users/Eniti-Codes/followers",
"following_url": "https://api.github.com/users/Eniti-Codes/following{/other_user}",
"gists_url": "https://api.github.com/users/Eniti-Codes/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Eniti-Codes/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Eniti-Codes/subscriptions",
"organizations_url": "https://api.github.com/users/Eniti-Codes/orgs",
"repos_url": "https://api.github.com/users/Eniti-Codes/repos",
"events_url": "https://api.github.com/users/Eniti-Codes/events{/privacy}",
"received_events_url": "https://api.github.com/users/Eniti-Codes/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-08-13T18:21:41
| 2024-08-13T18:34:17
| 2024-08-13T18:34:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
So for some reason, with the newest update, Windows Defender thinks this is a Trojan.
<img width="210" alt="ApplicationFrameHost_ye04kXZxFA" src="https://github.com/user-attachments/assets/85525c54-a0ca-40e5-a764-f4227bf73ba3">
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
I don't know — Windows Defender removed it.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6342/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6342/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7623
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7623/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7623/comments
|
https://api.github.com/repos/ollama/ollama/issues/7623/events
|
https://github.com/ollama/ollama/issues/7623
| 2,650,674,272
|
I_kwDOJ0Z1Ps6d_hRg
| 7,623
|
ollama 70B model on 10x32G vram rtx5000 - loading to 256G ram and cpu
|
{
"login": "paolss",
"id": 18089673,
"node_id": "MDQ6VXNlcjE4MDg5Njcz",
"avatar_url": "https://avatars.githubusercontent.com/u/18089673?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/paolss",
"html_url": "https://github.com/paolss",
"followers_url": "https://api.github.com/users/paolss/followers",
"following_url": "https://api.github.com/users/paolss/following{/other_user}",
"gists_url": "https://api.github.com/users/paolss/gists{/gist_id}",
"starred_url": "https://api.github.com/users/paolss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/paolss/subscriptions",
"organizations_url": "https://api.github.com/users/paolss/orgs",
"repos_url": "https://api.github.com/users/paolss/repos",
"events_url": "https://api.github.com/users/paolss/events{/privacy}",
"received_events_url": "https://api.github.com/users/paolss/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-11-11T23:54:20
| 2024-12-13T11:42:30
| 2024-12-13T11:42:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
As in the title: something changed for the worse in Ollama. A 70B model that worked before the update no longer loads correctly — it now loads into RAM and runs on the CPU instead of the 10x 32 GB RTX 5000 GPUs.
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
_No response_
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7623/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7623/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/126
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/126/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/126/comments
|
https://api.github.com/repos/ollama/ollama/issues/126/events
|
https://github.com/ollama/ollama/pull/126
| 1,812,177,085
|
PR_kwDOJ0Z1Ps5V55P5
| 126
|
add llama2:13b model to the readme
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-19T15:16:40
| 2023-07-19T15:21:29
| 2023-07-19T15:21:29
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/126",
"html_url": "https://github.com/ollama/ollama/pull/126",
"diff_url": "https://github.com/ollama/ollama/pull/126.diff",
"patch_url": "https://github.com/ollama/ollama/pull/126.patch",
"merged_at": "2023-07-19T15:21:29"
}
| null |
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/126/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/126/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3261
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3261/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3261/comments
|
https://api.github.com/repos/ollama/ollama/issues/3261/events
|
https://github.com/ollama/ollama/issues/3261
| 2,196,395,575
|
I_kwDOJ0Z1Ps6C6lY3
| 3,261
|
404 while installing NVIDIA repository
|
{
"login": "gnumoksha",
"id": 696797,
"node_id": "MDQ6VXNlcjY5Njc5Nw==",
"avatar_url": "https://avatars.githubusercontent.com/u/696797?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gnumoksha",
"html_url": "https://github.com/gnumoksha",
"followers_url": "https://api.github.com/users/gnumoksha/followers",
"following_url": "https://api.github.com/users/gnumoksha/following{/other_user}",
"gists_url": "https://api.github.com/users/gnumoksha/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gnumoksha/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gnumoksha/subscriptions",
"organizations_url": "https://api.github.com/users/gnumoksha/orgs",
"repos_url": "https://api.github.com/users/gnumoksha/repos",
"events_url": "https://api.github.com/users/gnumoksha/events{/privacy}",
"received_events_url": "https://api.github.com/users/gnumoksha/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-03-20T00:57:56
| 2024-07-29T21:24:21
| 2024-07-29T21:24:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```console
$ curl -fsSL https://ollama.com/install.sh | sh
>>> Downloading ollama...
######################################################################## 100.0%#=#=# ######################################################################## 100.0%
>>> Installing ollama to /usr/local/bin...
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> Installing NVIDIA repository...
curl: (22) The requested URL returned error: 404
```
### What did you expect to see?
ollama installed and running
### Steps to reproduce
- Create a AWS instance of type `g5g.xlarge`
- Try to install ollama
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
arm64
### Platform
_No response_
### Ollama version
_No response_
### GPU
Nvidia
### GPU info
```console
$ lspci -d '10de:'
00:1f.0 3D controller: NVIDIA Corporation TU104GL [T4G] (rev a1)
```
### CPU
Other
### Other software
```console
$ lscpu
Architecture: aarch64
CPU op-mode(s): 32-bit, 64-bit
Byte Order: Little Endian
CPU(s): 4
On-line CPU(s) list: 0-3
Vendor ID: ARM
Model name: Neoverse-N1
Model: 1
Thread(s) per core: 1
Core(s) per socket: 4
Socket(s): 1
Stepping: r3p1
BogoMIPS: 243.75
Flags: fp asimd evtstrm aes pmull sha1 sha2 crc32 atomics fphp asimdhp cpuid asimdrdm lrcpc dcpop asimd
dp ssbs
[...]
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3261/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3261/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5145
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5145/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5145/comments
|
https://api.github.com/repos/ollama/ollama/issues/5145/events
|
https://github.com/ollama/ollama/pull/5145
| 2,362,708,795
|
PR_kwDOJ0Z1Ps5y-05J
| 5,145
|
Fix bad symbol load detection
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-19T15:56:58
| 2024-06-19T16:12:35
| 2024-06-19T16:12:33
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5145",
"html_url": "https://github.com/ollama/ollama/pull/5145",
"diff_url": "https://github.com/ollama/ollama/pull/5145.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5145.patch",
"merged_at": "2024-06-19T16:12:33"
}
|
Pointer dereferences weren't correct for a few libraries, which explains some crashes on older systems or with miswired symlinks for discovery libraries.
Fixes #4982
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5145/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5145/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1140
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1140/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1140/comments
|
https://api.github.com/repos/ollama/ollama/issues/1140/events
|
https://github.com/ollama/ollama/issues/1140
| 1,995,273,108
|
I_kwDOJ0Z1Ps527XOU
| 1,140
|
Model push is not working
|
{
"login": "eramax",
"id": 542413,
"node_id": "MDQ6VXNlcjU0MjQxMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/542413?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eramax",
"html_url": "https://github.com/eramax",
"followers_url": "https://api.github.com/users/eramax/followers",
"following_url": "https://api.github.com/users/eramax/following{/other_user}",
"gists_url": "https://api.github.com/users/eramax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eramax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eramax/subscriptions",
"organizations_url": "https://api.github.com/users/eramax/orgs",
"repos_url": "https://api.github.com/users/eramax/repos",
"events_url": "https://api.github.com/users/eramax/events{/privacy}",
"received_events_url": "https://api.github.com/users/eramax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 18
| 2023-11-15T18:02:57
| 2024-12-12T01:58:22
| 2023-11-16T21:44:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have tried many times to push my models to ollama, but I always get this error:
```
retrieving manifest
Error: max retries exceeded
```
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1140/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1140/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2147
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2147/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2147/comments
|
https://api.github.com/repos/ollama/ollama/issues/2147/events
|
https://github.com/ollama/ollama/issues/2147
| 2,094,834,807
|
I_kwDOJ0Z1Ps583KR3
| 2,147
|
permission denied when setting OLLAMA_MODELS in service file
|
{
"login": "lasseedfast",
"id": 8794658,
"node_id": "MDQ6VXNlcjg3OTQ2NTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8794658?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lasseedfast",
"html_url": "https://github.com/lasseedfast",
"followers_url": "https://api.github.com/users/lasseedfast/followers",
"following_url": "https://api.github.com/users/lasseedfast/following{/other_user}",
"gists_url": "https://api.github.com/users/lasseedfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lasseedfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lasseedfast/subscriptions",
"organizations_url": "https://api.github.com/users/lasseedfast/orgs",
"repos_url": "https://api.github.com/users/lasseedfast/repos",
"events_url": "https://api.github.com/users/lasseedfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/lasseedfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 14
| 2024-01-22T21:59:56
| 2024-12-30T23:00:24
| 2024-03-12T18:45:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm trying to set the OLLAMA_MODELS environment variable in /etc/systemd/system/ollama.service.d, but the logs show that the service fails to create the directory:
```
Jan 22 21:25:41 airig systemd[1]: ollama.service: Scheduled restart job, restart counter is at 151.
Jan 22 21:25:41 airig systemd[1]: Stopped ollama.service - Ollama Service.
Jan 22 21:25:41 airig systemd[1]: Started ollama.service - Ollama Service.
Jan 22 21:25:41 airig sh[301002]: Error: mkdir /home/lasse/model_drive: permission denied
Jan 22 21:25:41 airig systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
Jan 22 21:25:41 airig systemd[1]: ollama.service: Failed with result 'exit-code'.
```
environment.conf:
```
~$ cat /etc/systemd/system/ollama.service.d/environment.conf
[Service]
Environment="OLLAMA_MODELS=/home/lasse/model_drive/ollama"
```
The model_drive folder is a mount point for an SSD disk, and the permissions look fine for both my user and the ollama user:
`drwxrwxrwx 5 lasse lasse 4096 Jan 21 19:18 model_drive`
When starting the service manually with `OLLAMA_MODELS=~/model_drive/ollama ollama serve` everything works fine; it only fails when using the conf file as proposed in the [FAQ](https://github.com/jmorganca/ollama/blob/main/docs/faq.md#where-are-models-stored).
This might be related to the bug in https://github.com/jmorganca/ollama/issues/1066
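A likely cause (an assumption, not confirmed in this issue) is that the ollama service user cannot traverse a parent directory such as `/home/lasse`, even though `model_drive` itself is world-writable: every component on the path needs the execute (`x`) bit for the service user. A minimal sketch that walks the path and reports each component's mode, so permission gaps are easy to spot:

```python
import os
import stat


def check_path_traversal(path):
    """Report the mode of each component of `path`.

    Every directory on the way down needs the execute bit for the
    service user (or its group / others) to be traversable.
    """
    results = []
    current = os.sep
    for part in os.path.abspath(path).split(os.sep):
        if not part:
            continue
        current = os.path.join(current, part)
        if not os.path.exists(current):
            results.append((current, "missing"))
            continue
        # filemode renders st_mode as e.g. 'drwxr-xr-x'
        results.append((current, stat.filemode(os.stat(current).st_mode)))
    return results


if __name__ == "__main__":
    # Path taken from this issue's environment.conf
    for component, mode in check_path_traversal("/home/lasse/model_drive/ollama"):
        print(f"{mode}  {component}")
```

Any component whose mode lacks `x` for the ollama user would explain the `mkdir ... permission denied` seen in the journal.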
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2147/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2147/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/12
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/12/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/12/comments
|
https://api.github.com/repos/ollama/ollama/issues/12/events
|
https://github.com/ollama/ollama/pull/12
| 1,779,528,163
|
PR_kwDOJ0Z1Ps5ULEGP
| 12
|
add prompt templates as j2 templates
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-06-28T18:51:07
| 2023-06-28T18:53:54
| 2023-06-28T18:53:50
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/12",
"html_url": "https://github.com/ollama/ollama/pull/12",
"diff_url": "https://github.com/ollama/ollama/pull/12.diff",
"patch_url": "https://github.com/ollama/ollama/pull/12.patch",
"merged_at": "2023-06-28T18:53:50"
}
|
#7 is missing from main
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/12/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/12/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5882
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5882/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5882/comments
|
https://api.github.com/repos/ollama/ollama/issues/5882/events
|
https://github.com/ollama/ollama/issues/5882
| 2,425,791,666
|
I_kwDOJ0Z1Ps6QlqSy
| 5,882
|
Generate actionable error message when a model meets insufficient GPU memory or RAM
|
{
"login": "sagarrandive",
"id": 11855008,
"node_id": "MDQ6VXNlcjExODU1MDA4",
"avatar_url": "https://avatars.githubusercontent.com/u/11855008?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sagarrandive",
"html_url": "https://github.com/sagarrandive",
"followers_url": "https://api.github.com/users/sagarrandive/followers",
"following_url": "https://api.github.com/users/sagarrandive/following{/other_user}",
"gists_url": "https://api.github.com/users/sagarrandive/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sagarrandive/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sagarrandive/subscriptions",
"organizations_url": "https://api.github.com/users/sagarrandive/orgs",
"repos_url": "https://api.github.com/users/sagarrandive/repos",
"events_url": "https://api.github.com/users/sagarrandive/events{/privacy}",
"received_events_url": "https://api.github.com/users/sagarrandive/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-07-23T17:57:26
| 2024-08-11T18:30:21
| 2024-08-11T18:30:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When a model is too large for the GPU or RAM of the underlying machine, it would be helpful if Ollama generated a message explicitly calling out that the model does not fit in memory. Currently that is not the case.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5882/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5882/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8227
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8227/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8227/comments
|
https://api.github.com/repos/ollama/ollama/issues/8227/events
|
https://github.com/ollama/ollama/pull/8227
| 2,757,337,721
|
PR_kwDOJ0Z1Ps6GJWri
| 8,227
|
README.md inclusion of a project alpaca
|
{
"login": "olumolu",
"id": 162728301,
"node_id": "U_kgDOCbMJbQ",
"avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/olumolu",
"html_url": "https://github.com/olumolu",
"followers_url": "https://api.github.com/users/olumolu/followers",
"following_url": "https://api.github.com/users/olumolu/following{/other_user}",
"gists_url": "https://api.github.com/users/olumolu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/olumolu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/olumolu/subscriptions",
"organizations_url": "https://api.github.com/users/olumolu/orgs",
"repos_url": "https://api.github.com/users/olumolu/repos",
"events_url": "https://api.github.com/users/olumolu/events{/privacy}",
"received_events_url": "https://api.github.com/users/olumolu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-24T07:32:36
| 2024-12-25T04:05:36
| 2024-12-25T04:05:36
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8227",
"html_url": "https://github.com/ollama/ollama/pull/8227",
"diff_url": "https://github.com/ollama/ollama/pull/8227.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8227.patch",
"merged_at": "2024-12-25T04:05:36"
}
|
Alpaca: an Ollama client application for Linux and macOS made with GTK4 and Adwaita. https://github.com/ollama/ollama/issues/8220
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8227/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8227/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/380
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/380/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/380/comments
|
https://api.github.com/repos/ollama/ollama/issues/380/events
|
https://github.com/ollama/ollama/issues/380
| 1,857,036,496
|
I_kwDOJ0Z1Ps5usCDQ
| 380
|
Increase Inference Throughput by Employing Parallelism
|
{
"login": "gusanmaz",
"id": 2552975,
"node_id": "MDQ6VXNlcjI1NTI5NzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/2552975?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gusanmaz",
"html_url": "https://github.com/gusanmaz",
"followers_url": "https://api.github.com/users/gusanmaz/followers",
"following_url": "https://api.github.com/users/gusanmaz/following{/other_user}",
"gists_url": "https://api.github.com/users/gusanmaz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gusanmaz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gusanmaz/subscriptions",
"organizations_url": "https://api.github.com/users/gusanmaz/orgs",
"repos_url": "https://api.github.com/users/gusanmaz/repos",
"events_url": "https://api.github.com/users/gusanmaz/events{/privacy}",
"received_events_url": "https://api.github.com/users/gusanmaz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 5
| 2023-08-18T17:13:46
| 2024-05-12T15:11:01
| 2023-12-22T03:33:00
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am running the llama2 model for inference on a Mac Mini M2 Pro using Langchain. According to System Monitor, the ollama process doesn't consume significant CPU but uses around 95% GPU and around 3 GB of memory. When I run two instances of almost the same code, inference speed decreases roughly two-fold.
The code I am running looks like this:
```python
import json
import time

from langchain.llms import Ollama

# Load the queries to run and any previously saved answers.
with open("queries.json", "r") as file:
    queries = json.load(file)

try:
    with open("output.json", "r") as file:
        output_data = json.load(file)
except FileNotFoundError:
    output_data = {}

ollama = Ollama(base_url='http://localhost:11434', model="llama2")

prev_time = None
for query in queries:
    if query not in output_data:
        current_time = time.time()
        if prev_time:
            elapsed_time = current_time - prev_time
            print(f"Elapsed Time: {elapsed_time:.2f} seconds\n")

        out = ollama(query)
        output_data[query] = out

        # Persist after every answer so progress survives interruption.
        with open("output.json", "w") as file:
            json.dump(output_data, file, indent=4)

        print("\nQuery: " + query)
        print("Answer: " + out)
        print("\n")
        prev_time = current_time
```
Is there a way to increase inference throughput using parallelism or other methods?
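If the backend can actually serve requests concurrently (for example, several Ollama instances on different ports, or later Ollama versions with `OLLAMA_NUM_PARALLEL`; a single server that processes requests serially may still queue them), one client-side approach is to fan queries out with a thread pool. A sketch, where `run_query` is a hypothetical stand-in for the actual `ollama(query)` call:

```python
from concurrent.futures import ThreadPoolExecutor


def run_queries_parallel(queries, run_query, max_workers=2):
    """Run `run_query` over `queries` concurrently.

    Returns a dict mapping each query to its answer. `run_query`
    is any callable taking a query string, e.g. `lambda q: ollama(q)`.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order, so zip pairs correctly.
        answers = pool.map(run_query, queries)
        return dict(zip(queries, answers))


if __name__ == "__main__":
    # Demo with a trivial worker instead of a real model call.
    print(run_queries_parallel(["a", "b"], lambda q: q.upper()))
```

Note that on a single GPU this mostly hides request latency rather than adding compute: as observed above, two simultaneous inferences contend for the same GPU and each runs about half as fast.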
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/380/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/380/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2532
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2532/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2532/comments
|
https://api.github.com/repos/ollama/ollama/issues/2532/events
|
https://github.com/ollama/ollama/pull/2532
| 2,137,750,643
|
PR_kwDOJ0Z1Ps5nCp-i
| 2,532
|
add gguf file types
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-02-16T02:12:04
| 2024-02-21T00:06:30
| 2024-02-21T00:06:29
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2532",
"html_url": "https://github.com/ollama/ollama/pull/2532",
"diff_url": "https://github.com/ollama/ollama/pull/2532.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2532.patch",
"merged_at": "2024-02-21T00:06:29"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2532/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2532/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1243
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1243/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1243/comments
|
https://api.github.com/repos/ollama/ollama/issues/1243/events
|
https://github.com/ollama/ollama/issues/1243
| 2,006,806,790
|
I_kwDOJ0Z1Ps53nXEG
| 1,243
|
Set system prompt in `ollama run`
|
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5667396210,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2acg",
"url": "https://api.github.com/repos/ollama/ollama/labels/good%20first%20issue",
"name": "good first issue",
"color": "7057ff",
"default": true,
"description": "Good for newcomers"
}
] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2023-11-22T17:29:45
| 2023-12-07T05:25:44
| 2023-12-04T21:32:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
We can see the system prompt with /show system, but have no way to set it. It would be nice to be able to set it from the command line.
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1243/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1243/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3633
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3633/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3633/comments
|
https://api.github.com/repos/ollama/ollama/issues/3633/events
|
https://github.com/ollama/ollama/pull/3633
| 2,241,764,299
|
PR_kwDOJ0Z1Ps5sk88V
| 3,633
|
Update README.md with Discord-Ollama project
|
{
"login": "JT2M0L3Y",
"id": 67881240,
"node_id": "MDQ6VXNlcjY3ODgxMjQw",
"avatar_url": "https://avatars.githubusercontent.com/u/67881240?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JT2M0L3Y",
"html_url": "https://github.com/JT2M0L3Y",
"followers_url": "https://api.github.com/users/JT2M0L3Y/followers",
"following_url": "https://api.github.com/users/JT2M0L3Y/following{/other_user}",
"gists_url": "https://api.github.com/users/JT2M0L3Y/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JT2M0L3Y/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JT2M0L3Y/subscriptions",
"organizations_url": "https://api.github.com/users/JT2M0L3Y/orgs",
"repos_url": "https://api.github.com/users/JT2M0L3Y/repos",
"events_url": "https://api.github.com/users/JT2M0L3Y/events{/privacy}",
"received_events_url": "https://api.github.com/users/JT2M0L3Y/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-13T20:16:45
| 2024-04-23T00:14:20
| 2024-04-23T00:14:20
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3633",
"html_url": "https://github.com/ollama/ollama/pull/3633",
"diff_url": "https://github.com/ollama/ollama/pull/3633.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3633.patch",
"merged_at": "2024-04-23T00:14:20"
}
|
A generalized TypeScript Discord bot that integrates the ollama-js library so any Ollama model can be pulled or built for use.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3633/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3633/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6935
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6935/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6935/comments
|
https://api.github.com/repos/ollama/ollama/issues/6935/events
|
https://github.com/ollama/ollama/issues/6935
| 2,545,440,166
|
I_kwDOJ0Z1Ps6XuFWm
| 6,935
|
Typo in Linux uninstallation docs
|
{
"login": "jasondunsmore",
"id": 53437,
"node_id": "MDQ6VXNlcjUzNDM3",
"avatar_url": "https://avatars.githubusercontent.com/u/53437?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jasondunsmore",
"html_url": "https://github.com/jasondunsmore",
"followers_url": "https://api.github.com/users/jasondunsmore/followers",
"following_url": "https://api.github.com/users/jasondunsmore/following{/other_user}",
"gists_url": "https://api.github.com/users/jasondunsmore/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jasondunsmore/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jasondunsmore/subscriptions",
"organizations_url": "https://api.github.com/users/jasondunsmore/orgs",
"repos_url": "https://api.github.com/users/jasondunsmore/repos",
"events_url": "https://api.github.com/users/jasondunsmore/events{/privacy}",
"received_events_url": "https://api.github.com/users/jasondunsmore/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-09-24T13:37:16
| 2024-10-24T03:48:34
| 2024-10-24T03:48:11
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
At the bottom of docs/linux.md, it should say `sudo rm -r /usr/lib/ollama/` instead of `sudo rm -r /usr/share/ollama`.
### OS
Linux
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6935/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6935/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/145
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/145/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/145/comments
|
https://api.github.com/repos/ollama/ollama/issues/145/events
|
https://github.com/ollama/ollama/pull/145
| 1,814,620,135
|
PR_kwDOJ0Z1Ps5WCTDi
| 145
|
verify blob digest
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-20T18:54:21
| 2023-07-20T19:14:25
| 2023-07-20T19:14:21
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/145",
"html_url": "https://github.com/ollama/ollama/pull/145",
"diff_url": "https://github.com/ollama/ollama/pull/145.diff",
"patch_url": "https://github.com/ollama/ollama/pull/145.patch",
"merged_at": "2023-07-20T19:14:21"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/145/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/145/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6446
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6446/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6446/comments
|
https://api.github.com/repos/ollama/ollama/issues/6446/events
|
https://github.com/ollama/ollama/issues/6446
| 2,476,169,712
|
I_kwDOJ0Z1Ps6Tl1nw
| 6,446
|
Model Library: Ability to update model manifest via editor
|
{
"login": "MaxJa4",
"id": 74194322,
"node_id": "MDQ6VXNlcjc0MTk0MzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/74194322?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MaxJa4",
"html_url": "https://github.com/MaxJa4",
"followers_url": "https://api.github.com/users/MaxJa4/followers",
"following_url": "https://api.github.com/users/MaxJa4/following{/other_user}",
"gists_url": "https://api.github.com/users/MaxJa4/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MaxJa4/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MaxJa4/subscriptions",
"organizations_url": "https://api.github.com/users/MaxJa4/orgs",
"repos_url": "https://api.github.com/users/MaxJa4/repos",
"events_url": "https://api.github.com/users/MaxJa4/events{/privacy}",
"received_events_url": "https://api.github.com/users/MaxJa4/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2024-08-20T17:26:06
| 2024-08-20T17:26:06
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
# TLDR
Allow direct updates of small, text-based files like the parameters, template, license, and system message **within the Ollama library**, to avoid the time-consuming and bandwidth-heavy process of pulling, updating, and pushing every tag of a model. This would simplify updates without affecting what is already possible via `ollama push`.
# Details
Currently, when changes to a manifest/Modelfile need to be made (e.g. to the template or parameters), all `N` tags of a model need to be pulled, updated, and then pushed again, one by one.
This can be quite time-consuming (especially for 70B+ models with many quantizations) and wasteful of the Ollama library's bandwidth, especially since things like the template are often shared between all tags, so only a single object would actually need to be updated.
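The per-tag churn described above can be sketched as a dry run (the model name and tag list here are hypothetical; the real set depends on the model):

```shell
#!/bin/sh
# Dry-run sketch: with N tags, the same pull/create/push cycle repeats
# N times, even when only a shared text object (e.g. the template) changed.
TAGS="7b 7b-q4_0 7b-q8_0"
for tag in $TAGS; do
  echo "ollama pull myuser/mymodel:$tag"
  echo "ollama create myuser/mymodel:$tag -f Modelfile.updated"
  echo "ollama push myuser/mymodel:$tag"
done
```

The `echo`s stand in for the real commands; the point is that the loop body runs once per tag.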
Perhaps updating purely text-based, small files/objects like parameters, license, system message and template could be done directly in the Ollama library, like the model readme can (by the author). That would leave the fine-grained control via ollama-push untouched and not complicate anything there.
That's just what came to mind first though, there might be other sensible approaches as well.
Since the library is not open source as far as I'm aware, I can't contribute there myself, so here's a feature request in the hope that you'll keep an eye on this topic. Many thanks!
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6446/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6446/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5119
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5119/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5119/comments
|
https://api.github.com/repos/ollama/ollama/issues/5119/events
|
https://github.com/ollama/ollama/pull/5119
| 2,360,455,078
|
PR_kwDOJ0Z1Ps5y3Ch2
| 5,119
|
Add a few missing server settings and sort the list
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-06-18T18:28:48
| 2024-07-29T21:26:38
| 2024-07-29T21:26:37
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5119",
"html_url": "https://github.com/ollama/ollama/pull/5119",
"diff_url": "https://github.com/ollama/ollama/pull/5119.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5119.patch",
"merged_at": null
}
|
Fixes #5093
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5119/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5119/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3308
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3308/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3308/comments
|
https://api.github.com/repos/ollama/ollama/issues/3308/events
|
https://github.com/ollama/ollama/pull/3308
| 2,203,833,834
|
PR_kwDOJ0Z1Ps5qjzUJ
| 3,308
|
Bump llama.cpp to b2510
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-23T11:24:46
| 2024-03-25T19:56:18
| 2024-03-25T19:56:12
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3308",
"html_url": "https://github.com/ollama/ollama/pull/3308",
"diff_url": "https://github.com/ollama/ollama/pull/3308.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3308.patch",
"merged_at": "2024-03-25T19:56:12"
}
|
~latest release, after the cuda refactoring
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3308/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3308/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4662
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4662/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4662/comments
|
https://api.github.com/repos/ollama/ollama/issues/4662/events
|
https://github.com/ollama/ollama/issues/4662
| 2,319,068,140
|
I_kwDOJ0Z1Ps6KOivs
| 4,662
|
Can Ollama be ran with Nemo Guardrails
|
{
"login": "ShreyasChhetri",
"id": 143403865,
"node_id": "U_kgDOCIwrWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/143403865?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ShreyasChhetri",
"html_url": "https://github.com/ShreyasChhetri",
"followers_url": "https://api.github.com/users/ShreyasChhetri/followers",
"following_url": "https://api.github.com/users/ShreyasChhetri/following{/other_user}",
"gists_url": "https://api.github.com/users/ShreyasChhetri/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ShreyasChhetri/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ShreyasChhetri/subscriptions",
"organizations_url": "https://api.github.com/users/ShreyasChhetri/orgs",
"repos_url": "https://api.github.com/users/ShreyasChhetri/repos",
"events_url": "https://api.github.com/users/ShreyasChhetri/events{/privacy}",
"received_events_url": "https://api.github.com/users/ShreyasChhetri/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-05-27T12:41:02
| 2024-10-23T03:07:29
| 2024-10-23T03:07:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I want to run the Ollama Mistral 7B model with NeMo Guardrails, but with the model configured in my config file like this:
```yaml
models:
  - type: main
    engine: ollama
    model: mistral
    parameters:
      base_url: http://127.0.0.1:11434
```
I'm not able to access it. The error is:
```
Error while execution self_check_input: Ollama call failed with status code 404.
I'm sorry, an internal error has occurred.
```
Can anyone help me with this?
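A quick sanity check before wiring the config into Guardrails is to confirm that the server at `base_url` is reachable and that the model has actually been pulled; a 404 often just means the model name isn't present on the server. This is a sketch assuming Ollama's default REST API (`/api/tags` lists local models); the helper name is mine:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

base_url = "http://127.0.0.1:11434"

def model_available(name: str) -> bool:
    """Return True if `name` appears in the server's local model list."""
    try:
        with urlopen(base_url + "/api/tags", timeout=5) as resp:
            models = json.load(resp).get("models", [])
    except (URLError, OSError):
        return False  # server not reachable at base_url
    return any(m.get("name", "").startswith(name) for m in models)
```

If `model_available("mistral")` is False, either the server isn't running at that address or `ollama pull mistral` still needs to be run.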
### OS
Windows
### GPU
_No response_
### CPU
Intel
### Ollama version
latest
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4662/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4662/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1074
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1074/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1074/comments
|
https://api.github.com/repos/ollama/ollama/issues/1074/events
|
https://github.com/ollama/ollama/pull/1074
| 1,987,788,391
|
PR_kwDOJ0Z1Ps5fJlqm
| 1,074
|
Log Analysis Example
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-11-10T14:57:55
| 2023-11-17T00:33:08
| 2023-11-17T00:33:07
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1074",
"html_url": "https://github.com/ollama/ollama/pull/1074",
"diff_url": "https://github.com/ollama/ollama/pull/1074.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1074.patch",
"merged_at": "2023-11-17T00:33:07"
}
|
At kubecon and other events and on discord, we have been asked how to analyse logs using ollama. This is a simple example of one approach to this.
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1074/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1074/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7190
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7190/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7190/comments
|
https://api.github.com/repos/ollama/ollama/issues/7190/events
|
https://github.com/ollama/ollama/issues/7190
| 2,583,782,889
|
I_kwDOJ0Z1Ps6aAWXp
| 7,190
|
ollama was built for Mac OS X 12 instead of 11
|
{
"login": "josergc",
"id": 7774952,
"node_id": "MDQ6VXNlcjc3NzQ5NTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/7774952?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/josergc",
"html_url": "https://github.com/josergc",
"followers_url": "https://api.github.com/users/josergc/followers",
"following_url": "https://api.github.com/users/josergc/following{/other_user}",
"gists_url": "https://api.github.com/users/josergc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/josergc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/josergc/subscriptions",
"organizations_url": "https://api.github.com/users/josergc/orgs",
"repos_url": "https://api.github.com/users/josergc/repos",
"events_url": "https://api.github.com/users/josergc/events{/privacy}",
"received_events_url": "https://api.github.com/users/josergc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677279472,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjf8y8A",
"url": "https://api.github.com/repos/ollama/ollama/labels/macos",
"name": "macos",
"color": "E2DBC0",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-10-13T08:36:08
| 2024-10-13T17:47:43
| 2024-10-13T17:47:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The website states: "Requires macOS 11 Big Sur or later."
I'm using version 11.7.10; it installs fine, but when launched from the command line it requires Mac OS X 12.0.
Cannot get version
```
% ollama --version
dyld: Symbol not found: __ZTTNSt3__114basic_ifstreamIcNS_11char_traitsIcEEEE
Referenced from: /usr/local/bin/ollama (which was built for Mac OS X 12.0)
Expected in: /usr/lib/libc++.1.dylib
in /usr/local/bin/ollama
zsh: abort ollama --version
```
I had to get the version from the downloaded package's info, so it is 0.3.13.
Cannot run a model
```
% ollama run llama3
dyld: Symbol not found: __ZTTNSt3__114basic_ifstreamIcNS_11char_traitsIcEEEE
Referenced from: /usr/local/bin/ollama (which was built for Mac OS X 12.0)
Expected in: /usr/lib/libc++.1.dylib
in /usr/local/bin/ollama
zsh: abort ollama run llama3
```
### OS
macOS
### GPU
Intel
### CPU
Intel
### Ollama version
0.3.13
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7190/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7190/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2297
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2297/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2297/comments
|
https://api.github.com/repos/ollama/ollama/issues/2297/events
|
https://github.com/ollama/ollama/issues/2297
| 2,111,239,234
|
I_kwDOJ0Z1Ps591vRC
| 2,297
|
Wingman-AI, please add the new extension
|
{
"login": "RussellCanfield",
"id": 17344904,
"node_id": "MDQ6VXNlcjE3MzQ0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/17344904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RussellCanfield",
"html_url": "https://github.com/RussellCanfield",
"followers_url": "https://api.github.com/users/RussellCanfield/followers",
"following_url": "https://api.github.com/users/RussellCanfield/following{/other_user}",
"gists_url": "https://api.github.com/users/RussellCanfield/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RussellCanfield/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RussellCanfield/subscriptions",
"organizations_url": "https://api.github.com/users/RussellCanfield/orgs",
"repos_url": "https://api.github.com/users/RussellCanfield/repos",
"events_url": "https://api.github.com/users/RussellCanfield/events{/privacy}",
"received_events_url": "https://api.github.com/users/RussellCanfield/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-02-01T00:39:44
| 2024-02-01T19:17:24
| 2024-02-01T19:17:24
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi, we’re huge fans of Ollama!
We built a VSCode extension around it called Wingman:
https://marketplace.visualstudio.com/items?itemName=WingMan.wing-man
Would we be able to add it to your readme? Do you accept PRs?
Thanks!
|
{
"login": "RussellCanfield",
"id": 17344904,
"node_id": "MDQ6VXNlcjE3MzQ0OTA0",
"avatar_url": "https://avatars.githubusercontent.com/u/17344904?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RussellCanfield",
"html_url": "https://github.com/RussellCanfield",
"followers_url": "https://api.github.com/users/RussellCanfield/followers",
"following_url": "https://api.github.com/users/RussellCanfield/following{/other_user}",
"gists_url": "https://api.github.com/users/RussellCanfield/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RussellCanfield/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RussellCanfield/subscriptions",
"organizations_url": "https://api.github.com/users/RussellCanfield/orgs",
"repos_url": "https://api.github.com/users/RussellCanfield/repos",
"events_url": "https://api.github.com/users/RussellCanfield/events{/privacy}",
"received_events_url": "https://api.github.com/users/RussellCanfield/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2297/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2297/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3088
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3088/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3088/comments
|
https://api.github.com/repos/ollama/ollama/issues/3088/events
|
https://github.com/ollama/ollama/pull/3088
| 2,182,870,941
|
PR_kwDOJ0Z1Ps5pcg1z
| 3,088
|
Fix iGPU detection for linux
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-13T00:01:25
| 2024-03-13T01:47:30
| 2024-03-13T00:20:28
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3088",
"html_url": "https://github.com/ollama/ollama/pull/3088",
"diff_url": "https://github.com/ollama/ollama/pull/3088.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3088.patch",
"merged_at": "2024-03-13T00:20:28"
}
|
This fixes a few bugs in the new sysfs discovery logic. iGPUs are now correctly identified by the <1G of VRAM they report. The sysfs IDs are off by one compared to what HIP wants, because amdgpu reports the CPU as a node while HIP only cares about GPUs.
Tested on a Ryzen 9 7900X system with an RX 7900 XTX. The amdgpu driver exposes 3 nodes: 0 is the CPU, 1 is the discrete GPU, and 2 is the iGPU. This logic now correctly detects this system and sets the visible devices properly.
Example scenario 1:
```
% OLLAMA_DEBUG=1 ./ollama-linux-amd64 serve
..
time=2024-03-13T00:03:55.440Z level=DEBUG source=amd_linux.go:110 msg="rocm supported GPU types [gfx1030 gfx1100 gfx1101 gfx1102 gfx900 gfx906 gfx908 gfx90a gfx940 gfx941 gfx942]"
time=2024-03-13T00:03:55.440Z level=INFO source=amd_linux.go:119 msg="amdgpu [0] gfx1100 is supported"
time=2024-03-13T00:03:55.440Z level=WARN source=amd_linux.go:114 msg="amdgpu [1] gfx1036 is not supported by /tmp/ollama2436258655/rocm [gfx1030 gfx1100 gfx1101 gfx1102 gfx900 gfx906 gfx908 gfx90a gfx940 gfx941 gfx942]"
time=2024-03-13T00:03:55.440Z level=WARN source=amd_linux.go:116 msg="See https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md for HSA_OVERRIDE_GFX_VERSION usage"
time=2024-03-13T00:03:55.440Z level=DEBUG source=amd_linux.go:152 msg="discovering VRAM for amdgpu devices"
time=2024-03-13T00:03:55.440Z level=DEBUG source=amd_linux.go:171 msg="amdgpu devices [0 1]"
time=2024-03-13T00:03:55.441Z level=INFO source=amd_linux.go:246 msg="[0] amdgpu totalMemory 24560M"
time=2024-03-13T00:03:55.441Z level=INFO source=amd_linux.go:247 msg="[0] amdgpu freeMemory 24560M"
time=2024-03-13T00:03:55.441Z level=INFO source=amd_common.go:54 msg="Setting HIP_VISIBLE_DEVICES=0"
time=2024-03-13T00:03:55.441Z level=DEBUG source=gpu.go:180 msg="rocm detected 1 devices with 22104M available memory"
```
If I force the override to bypass the compatibility check, the new iGPU detection logic kicks in (both of these scenarios work)
```
% OLLAMA_DEBUG=1 HSA_OVERRIDE_GFX_VERSION=11.0.0 ./ollama-linux-amd64 serve
...
time=2024-03-13T00:04:59.799Z level=DEBUG source=amd_linux.go:123 msg="skipping rocm gfx compatibility check with HSA_OVERRIDE_GFX_VERSION=11.0.0"
time=2024-03-13T00:04:59.799Z level=DEBUG source=amd_linux.go:152 msg="discovering VRAM for amdgpu devices"
time=2024-03-13T00:04:59.799Z level=DEBUG source=amd_linux.go:171 msg="amdgpu devices [0 1]"
time=2024-03-13T00:04:59.799Z level=INFO source=amd_linux.go:246 msg="[0] amdgpu totalMemory 24560M"
time=2024-03-13T00:04:59.799Z level=INFO source=amd_linux.go:247 msg="[0] amdgpu freeMemory 24560M"
time=2024-03-13T00:04:59.799Z level=INFO source=amd_linux.go:217 msg="amdgpu [1] appears to be an iGPU with 512M reported total memory, skipping"
time=2024-03-13T00:04:59.799Z level=INFO source=amd_common.go:54 msg="Setting HIP_VISIBLE_DEVICES=0"
time=2024-03-13T00:04:59.799Z level=DEBUG source=gpu.go:180 msg="rocm detected 1 devices with 22104M available memory"
```
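The skip logic in the second log above can be summarized as a simple threshold check; the sketch below is an illustrative Python analogue of that heuristic (the function name and numbers are hypothetical, not Ollama's actual discovery code):

```python
# Sketch of the heuristic described above: amdgpu nodes reporting less
# than 1 GiB of total VRAM are treated as iGPUs and skipped.
IGPU_THRESHOLD = 1 << 30  # 1 GiB

def visible_devices(total_mem_bytes):
    # Keep only devices at or above the threshold, preserving their indices.
    return [i for i, mem in enumerate(total_mem_bytes) if mem >= IGPU_THRESHOLD]

# Device 0: 24 GiB discrete GPU; device 1: 512 MiB iGPU -> only device 0 kept.
print(visible_devices([24 << 30, 512 << 20]))
```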
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3088/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3088/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/599
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/599/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/599/comments
|
https://api.github.com/repos/ollama/ollama/issues/599/events
|
https://github.com/ollama/ollama/pull/599
| 1,912,511,182
|
PR_kwDOJ0Z1Ps5bLZ1F
| 599
|
update install.sh
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-26T01:10:06
| 2023-09-26T01:24:14
| 2023-09-26T01:24:13
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/599",
"html_url": "https://github.com/ollama/ollama/pull/599",
"diff_url": "https://github.com/ollama/ollama/pull/599.diff",
"patch_url": "https://github.com/ollama/ollama/pull/599.patch",
"merged_at": "2023-09-26T01:24:13"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/599/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/599/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3716
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3716/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3716/comments
|
https://api.github.com/repos/ollama/ollama/issues/3716/events
|
https://github.com/ollama/ollama/issues/3716
| 2,249,471,075
|
I_kwDOJ0Z1Ps6GFDRj
| 3,716
|
I can't push a model
|
{
"login": "jonathanhecl",
"id": 1691623,
"node_id": "MDQ6VXNlcjE2OTE2MjM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1691623?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jonathanhecl",
"html_url": "https://github.com/jonathanhecl",
"followers_url": "https://api.github.com/users/jonathanhecl/followers",
"following_url": "https://api.github.com/users/jonathanhecl/following{/other_user}",
"gists_url": "https://api.github.com/users/jonathanhecl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jonathanhecl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jonathanhecl/subscriptions",
"organizations_url": "https://api.github.com/users/jonathanhecl/orgs",
"repos_url": "https://api.github.com/users/jonathanhecl/repos",
"events_url": "https://api.github.com/users/jonathanhecl/events{/privacy}",
"received_events_url": "https://api.github.com/users/jonathanhecl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-04-17T23:58:23
| 2024-04-18T02:13:45
| 2024-04-18T00:07:57
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
> ollama push command-r-plus
retrieving manifest
pushing 503c8cac166f... 100% ▕████████████████████████████████████████████████████████▏ 59 GB
pushing f0624a2393a5... 100% ▕████████████████████████████████████████████████████████▏ 13 KB
pushing 42499e38acdf... 100% ▕████████████████████████████████████████████████████████▏ 270 B
pushing 36b9655abe6a... 100% ▕████████████████████████████████████████████████████████▏ 81 B
pushing 748dd5320e31... 100% ▕████████████████████████████████████████████████████████▏ 493 B
pushing manifest
Error: unauthorized
```
Why can I pull a model but not push one?
I have also tried it from an administrator terminal.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.32
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3716/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3716/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1136
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1136/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1136/comments
|
https://api.github.com/repos/ollama/ollama/issues/1136/events
|
https://github.com/ollama/ollama/issues/1136
| 1,994,569,908
|
I_kwDOJ0Z1Ps524ri0
| 1,136
|
please support neural-chat-7b-v3-1
|
{
"login": "eramax",
"id": 542413,
"node_id": "MDQ6VXNlcjU0MjQxMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/542413?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eramax",
"html_url": "https://github.com/eramax",
"followers_url": "https://api.github.com/users/eramax/followers",
"following_url": "https://api.github.com/users/eramax/following{/other_user}",
"gists_url": "https://api.github.com/users/eramax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eramax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eramax/subscriptions",
"organizations_url": "https://api.github.com/users/eramax/orgs",
"repos_url": "https://api.github.com/users/eramax/repos",
"events_url": "https://api.github.com/users/eramax/events{/privacy}",
"received_events_url": "https://api.github.com/users/eramax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-11-15T11:12:17
| 2023-11-16T23:40:13
| 2023-11-16T23:40:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Definitely one of the most amazing models I've ever seen: [neural-chat-7b-v3-1](https://huggingface.co/Intel/neural-chat-7b-v3-1).
Please add it to the models page.
I used it through this command:
```bash
ollama run fakezeta/neural-chat-7b-v3-1:Q5_K_M
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1136/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1136/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6851
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6851/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6851/comments
|
https://api.github.com/repos/ollama/ollama/issues/6851/events
|
https://github.com/ollama/ollama/issues/6851
| 2,532,501,475
|
I_kwDOJ0Z1Ps6W8ufj
| 6,851
|
Data privacy question
|
{
"login": "deict",
"id": 112455517,
"node_id": "U_kgDOBrPvXQ",
"avatar_url": "https://avatars.githubusercontent.com/u/112455517?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/deict",
"html_url": "https://github.com/deict",
"followers_url": "https://api.github.com/users/deict/followers",
"following_url": "https://api.github.com/users/deict/following{/other_user}",
"gists_url": "https://api.github.com/users/deict/gists{/gist_id}",
"starred_url": "https://api.github.com/users/deict/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/deict/subscriptions",
"organizations_url": "https://api.github.com/users/deict/orgs",
"repos_url": "https://api.github.com/users/deict/repos",
"events_url": "https://api.github.com/users/deict/events{/privacy}",
"received_events_url": "https://api.github.com/users/deict/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-18T01:38:54
| 2024-09-21T00:38:55
| 2024-09-21T00:38:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
If I deploy Ollama locally on Windows and use the ollama3.1 model, will the data from the questions I ask be received and stored by you?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6851/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6851/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1606
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1606/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1606/comments
|
https://api.github.com/repos/ollama/ollama/issues/1606/events
|
https://github.com/ollama/ollama/pull/1606
| 2,048,760,694
|
PR_kwDOJ0Z1Ps5iYBNf
| 1,606
|
Added support for specifying an arbitrary GBNF-compatible grammar
|
{
"login": "clevcode",
"id": 1842180,
"node_id": "MDQ6VXNlcjE4NDIxODA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1842180?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/clevcode",
"html_url": "https://github.com/clevcode",
"followers_url": "https://api.github.com/users/clevcode/followers",
"following_url": "https://api.github.com/users/clevcode/following{/other_user}",
"gists_url": "https://api.github.com/users/clevcode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/clevcode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/clevcode/subscriptions",
"organizations_url": "https://api.github.com/users/clevcode/orgs",
"repos_url": "https://api.github.com/users/clevcode/repos",
"events_url": "https://api.github.com/users/clevcode/events{/privacy}",
"received_events_url": "https://api.github.com/users/clevcode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 26
| 2023-12-19T14:21:53
| 2024-12-05T00:43:26
| 2024-12-05T00:43:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1606",
"html_url": "https://github.com/ollama/ollama/pull/1606",
"diff_url": "https://github.com/ollama/ollama/pull/1606.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1606.patch",
"merged_at": null
}
|
…in the Modelfile, for models running on the llama.cpp backend.
Note that this is basically the same PR as the one submitted by SyrupThinker in September (#565), which has been mentioned in issues #1507 and #808 since then.
There are plenty of users who would appreciate this feature, so I really hope it can get merged.
It's great that support for JSON output specifically has been added, by setting the corresponding GBNF grammar when JSON format is requested, but letting the user specify an arbitrary grammar opens up many more possibilities than that.
Pull request #830 adds support for specifying JSON schemas, which is yet another great convenience feature for a specific and common use case, but with support for arbitrary GBNF grammars it would be possible to have any model output data in any format, including custom DSLs and text-based file formats in general.
This is a tremendously useful thing to have when building various kinds of automation, so I really hope this can get merged to avoid having to maintain separate forks. Ollama is a great project; let's keep making it even better.
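For readers unfamiliar with the format being requested, here is what a tiny GBNF grammar (llama.cpp's grammar notation) looks like, held in a Python string purely for illustration, together with a naive well-formedness check; the check is a hypothetical sketch, not part of llama.cpp or this PR:

```python
import re

# A minimal GBNF grammar constraining model output to a yes/no answer.
grammar = r'''
root ::= answer
answer ::= "yes" | "no"
'''

# Naive sanity check: every rule referenced on a right-hand side must be
# defined somewhere in the grammar (quoted literals are ignored).
defined = set(re.findall(r'^(\w+) ::=', grammar, re.M))
referenced = set()
for line in grammar.splitlines():
    if "::=" not in line:
        continue
    rhs = re.sub(r'"[^"]*"', '', line.split("::=", 1)[1])
    referenced.update(re.findall(r'\b\w+\b', rhs))
print(sorted(referenced - defined))  # rules used but never defined
```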
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1606/reactions",
"total_count": 71,
"+1": 40,
"-1": 0,
"laugh": 0,
"hooray": 3,
"confused": 0,
"heart": 21,
"rocket": 4,
"eyes": 3
}
|
https://api.github.com/repos/ollama/ollama/issues/1606/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1782
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1782/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1782/comments
|
https://api.github.com/repos/ollama/ollama/issues/1782/events
|
https://github.com/ollama/ollama/issues/1782
| 2,065,339,126
|
I_kwDOJ0Z1Ps57GpL2
| 1,782
|
Model kept unloading no matter what
|
{
"login": "Opaatia",
"id": 118029983,
"node_id": "U_kgDOBwj-nw",
"avatar_url": "https://avatars.githubusercontent.com/u/118029983?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Opaatia",
"html_url": "https://github.com/Opaatia",
"followers_url": "https://api.github.com/users/Opaatia/followers",
"following_url": "https://api.github.com/users/Opaatia/following{/other_user}",
"gists_url": "https://api.github.com/users/Opaatia/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Opaatia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Opaatia/subscriptions",
"organizations_url": "https://api.github.com/users/Opaatia/orgs",
"repos_url": "https://api.github.com/users/Opaatia/repos",
"events_url": "https://api.github.com/users/Opaatia/events{/privacy}",
"received_events_url": "https://api.github.com/users/Opaatia/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 11
| 2024-01-04T09:54:19
| 2024-01-28T22:33:09
| 2024-01-28T22:33:08
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Greetings. I have modified ollama/server/routes.go to set the following variable:
```go
var defaultSessionDuration = 1440 * time.Minute
```
However, when running Ollama it keeps unloading the **exact same** model over and over for every single invocation of the /api/generate endpoint. This is visible in the nvtop CLI, where I can watch host memory climb first and then the GPU finally get the model loaded.
This makes Ollama very impractical for production environments, since it takes a significant amount of time to load the model for each and every API invocation. It should be noted that this is **NOT** running from Docker; that is an intentional decision.
Is there a recommended way to work around this?
Please and thank you.
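For what it's worth, later Ollama versions expose a per-request alternative to patching routes.go: a `keep_alive` field on /api/generate. A minimal Python sketch of such a payload (the field comes from Ollama's public API docs; the model name and values here are purely illustrative):

```python
import json

# Sketch: a /api/generate payload asking the server to keep the model
# resident. keep_alive accepts a duration string (e.g. "24h") or -1 to
# keep the model loaded indefinitely.
payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "keep_alive": -1,
}
print(json.dumps(payload, sort_keys=True))
```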
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1782/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1782/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5256
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5256/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5256/comments
|
https://api.github.com/repos/ollama/ollama/issues/5256/events
|
https://github.com/ollama/ollama/issues/5256
| 2,370,267,702
|
I_kwDOJ0Z1Ps6NR2o2
| 5,256
|
OpenAI ChatCompletionRequest missing tools field?
|
{
"login": "bingo789",
"id": 40812718,
"node_id": "MDQ6VXNlcjQwODEyNzE4",
"avatar_url": "https://avatars.githubusercontent.com/u/40812718?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bingo789",
"html_url": "https://github.com/bingo789",
"followers_url": "https://api.github.com/users/bingo789/followers",
"following_url": "https://api.github.com/users/bingo789/following{/other_user}",
"gists_url": "https://api.github.com/users/bingo789/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bingo789/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bingo789/subscriptions",
"organizations_url": "https://api.github.com/users/bingo789/orgs",
"repos_url": "https://api.github.com/users/bingo789/repos",
"events_url": "https://api.github.com/users/bingo789/events{/privacy}",
"received_events_url": "https://api.github.com/users/bingo789/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-06-24T13:31:21
| 2024-07-24T19:07:04
| 2024-07-24T19:07:04
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I use the ChatOpenAI client in LangChain, the Ollama server receives:
```
{
"messages": [{
"content": "what is the weather in Boston?",
"role": "user"
}],
"model": "llama3:8b-instruct-q4_0",
"n": 1,
"stream": false,
"temperature": 0.0,
"tools": [{
"type": "function",
"function": {
"name": "tavily_search_results_json",
"description": "A search engine optimized for comprehensive, accurate, and trusted results. Useful for when you need to answer questions about current events. Input should be a search query.",
"parameters": {
"type": "object",
"properties": {
"location": {
"description": "The city and state, e.g. San Francisco, CA",
"type": "string"
}
},
"required": ["location"]
}
}
}]
}
```
But in the Middleware() function this is bound into a ChatCompletionRequest, and ChatCompletionRequest has no tools field:
```
type ChatCompletionRequest struct {
Model string `json:"model"`
Messages []Message `json:"messages"`
Stream bool `json:"stream"`
MaxTokens *int `json:"max_tokens"`
Seed *int `json:"seed"`
Stop any `json:"stop"`
Temperature *float64 `json:"temperature"`
FrequencyPenalty *float64 `json:"frequency_penalty"`
PresencePenalty *float64 `json:"presence_penalty_penalty"`
TopP *float64 `json:"top_p"`
ResponseFormat *ResponseFormat `json:"response_format"`
}
func Middleware() gin.HandlerFunc {
return func(c *gin.Context) {
var req ChatCompletionRequest
err := c.ShouldBindJSON(&req)
......}
```
So I take it function calling is not supported?
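The silent drop described here can be mimicked in a few lines. This Python analogue is hypothetical (it is not Ollama's actual code) and only illustrates why `tools` vanishes when the bound schema has no such field:

```python
import json

# Fields the hypothetical request schema knows about; "tools" is absent,
# mirroring the ChatCompletionRequest struct quoted above.
KNOWN_FIELDS = {"model", "messages", "stream", "temperature"}

raw = json.loads('{"model": "llama3", "messages": [], '
                 '"tools": [{"type": "function"}]}')
# Binding keeps only known fields, so "tools" is silently discarded.
bound = {k: v for k, v in raw.items() if k in KNOWN_FIELDS}
print("tools" in bound)
```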
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
v0.1.46
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5256/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5256/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4519
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4519/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4519/comments
|
https://api.github.com/repos/ollama/ollama/issues/4519/events
|
https://github.com/ollama/ollama/issues/4519
| 2,304,610,082
|
I_kwDOJ0Z1Ps6JXY8i
| 4,519
|
ollama run codellama:34b issue
|
{
"login": "Iliceth",
"id": 68381834,
"node_id": "MDQ6VXNlcjY4MzgxODM0",
"avatar_url": "https://avatars.githubusercontent.com/u/68381834?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Iliceth",
"html_url": "https://github.com/Iliceth",
"followers_url": "https://api.github.com/users/Iliceth/followers",
"following_url": "https://api.github.com/users/Iliceth/following{/other_user}",
"gists_url": "https://api.github.com/users/Iliceth/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Iliceth/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Iliceth/subscriptions",
"organizations_url": "https://api.github.com/users/Iliceth/orgs",
"repos_url": "https://api.github.com/users/Iliceth/repos",
"events_url": "https://api.github.com/users/Iliceth/events{/privacy}",
"received_events_url": "https://api.github.com/users/Iliceth/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 15
| 2024-05-19T13:19:02
| 2024-06-19T16:28:50
| 2024-06-19T16:28:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Every model I tested with Ollama runs fine, but when trying `ollama run codellama:34b`, I get `Error: llama runner process has terminated: signal: segmentation fault (core dumped)`. I then tried the 13B version, and it works fine.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.37
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4519/reactions",
"total_count": 7,
"+1": 7,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4519/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8145
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8145/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8145/comments
|
https://api.github.com/repos/ollama/ollama/issues/8145/events
|
https://github.com/ollama/ollama/pull/8145
| 2,746,269,027
|
PR_kwDOJ0Z1Ps6FkOyb
| 8,145
|
Embedding Normalization Options
|
{
"login": "gabe-l-hart",
"id": 1254484,
"node_id": "MDQ6VXNlcjEyNTQ0ODQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1254484?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gabe-l-hart",
"html_url": "https://github.com/gabe-l-hart",
"followers_url": "https://api.github.com/users/gabe-l-hart/followers",
"following_url": "https://api.github.com/users/gabe-l-hart/following{/other_user}",
"gists_url": "https://api.github.com/users/gabe-l-hart/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gabe-l-hart/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gabe-l-hart/subscriptions",
"organizations_url": "https://api.github.com/users/gabe-l-hart/orgs",
"repos_url": "https://api.github.com/users/gabe-l-hart/repos",
"events_url": "https://api.github.com/users/gabe-l-hart/events{/privacy}",
"received_events_url": "https://api.github.com/users/gabe-l-hart/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-12-17T22:48:19
| 2024-12-23T15:39:38
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8145",
"html_url": "https://github.com/ollama/ollama/pull/8145",
"diff_url": "https://github.com/ollama/ollama/pull/8145.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8145.patch",
"merged_at": null
}
|
## Description
This PR introduces a new API parameter, `normalize`, for the `/api/embed` and `/api/embeddings` endpoints that allows the user to explicitly enable or disable normalization. The default behavior of both endpoints is preserved (normalization for `embed`, no normalization for `embeddings`), but this allows them to produce equivalent output and avoids future confusion.
## Issues
- Related to confusion in https://github.com/ollama/ollama/issues/8094
- Closes https://github.com/ollama/ollama/issues/7858
- Closes https://github.com/ollama/ollama/issues/6496
## Discussion
The `/api/embeddings` endpoint is deprecated according to [the docs](https://github.com/ollama/ollama/blob/main/docs/api.md#generate-embedding). An argument could be made that we should not add this functionality to this endpoint and simply document around the discrepancy. I'd be totally open to this!
## Testing
I updated the `embed_test.go` to include the various combinations against `embed` and `embedding`. I ran them as follows:
```sh
OLLAMA_TEST_EXISTING=true go test -tags=integration ./integration/ -test.run TestAllMiniLM.*
```
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8145/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8145/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2721
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2721/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2721/comments
|
https://api.github.com/repos/ollama/ollama/issues/2721/events
|
https://github.com/ollama/ollama/issues/2721
| 2,152,084,864
|
I_kwDOJ0Z1Ps6ARjWA
| 2,721
|
Add latest tag for docker image with ROCm support
|
{
"login": "robertvazan",
"id": 3514517,
"node_id": "MDQ6VXNlcjM1MTQ1MTc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3514517?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robertvazan",
"html_url": "https://github.com/robertvazan",
"followers_url": "https://api.github.com/users/robertvazan/followers",
"following_url": "https://api.github.com/users/robertvazan/following{/other_user}",
"gists_url": "https://api.github.com/users/robertvazan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/robertvazan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robertvazan/subscriptions",
"organizations_url": "https://api.github.com/users/robertvazan/orgs",
"repos_url": "https://api.github.com/users/robertvazan/repos",
"events_url": "https://api.github.com/users/robertvazan/events{/privacy}",
"received_events_url": "https://api.github.com/users/robertvazan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-02-24T03:23:16
| 2024-02-27T19:29:09
| 2024-02-27T19:29:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Using the Docker image with ROCm support requires specifying the version in the tag, e.g. `0.1.27-rocm`. Please add a `rocm` tag that always points to the latest version with ROCm support, so that we can upgrade by running `docker pull`.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2721/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2721/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3954
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3954/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3954/comments
|
https://api.github.com/repos/ollama/ollama/issues/3954/events
|
https://github.com/ollama/ollama/pull/3954
| 2,266,407,759
|
PR_kwDOJ0Z1Ps5t4F4E
| 3,954
|
Put back non-avx CPU build for windows
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-26T19:44:47
| 2024-04-26T20:09:24
| 2024-04-26T20:09:04
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3954",
"html_url": "https://github.com/ollama/ollama/pull/3954",
"diff_url": "https://github.com/ollama/ollama/pull/3954.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3954.patch",
"merged_at": "2024-04-26T20:09:04"
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3954/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3954/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/96
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/96/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/96/comments
|
https://api.github.com/repos/ollama/ollama/issues/96/events
|
https://github.com/ollama/ollama/pull/96
| 1,808,836,936
|
PR_kwDOJ0Z1Ps5Vukbl
| 96
|
add modelpaths
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-18T00:35:39
| 2023-07-18T05:44:21
| 2023-07-18T05:44:21
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/96",
"html_url": "https://github.com/ollama/ollama/pull/96",
"diff_url": "https://github.com/ollama/ollama/pull/96.diff",
"patch_url": "https://github.com/ollama/ollama/pull/96.patch",
"merged_at": "2023-07-18T05:44:21"
}
|
This change adds `ModelPath{}`, which takes care of figuring out the various URLs and file paths for a given model.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/96/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/96/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3533
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3533/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3533/comments
|
https://api.github.com/repos/ollama/ollama/issues/3533/events
|
https://github.com/ollama/ollama/issues/3533
| 2,230,380,627
|
I_kwDOJ0Z1Ps6E8OhT
| 3,533
|
Suggestion: AnomalibGPT
|
{
"login": "monkeycc",
"id": 6490927,
"node_id": "MDQ6VXNlcjY0OTA5Mjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/6490927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/monkeycc",
"html_url": "https://github.com/monkeycc",
"followers_url": "https://api.github.com/users/monkeycc/followers",
"following_url": "https://api.github.com/users/monkeycc/following{/other_user}",
"gists_url": "https://api.github.com/users/monkeycc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/monkeycc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/monkeycc/subscriptions",
"organizations_url": "https://api.github.com/users/monkeycc/orgs",
"repos_url": "https://api.github.com/users/monkeycc/repos",
"events_url": "https://api.github.com/users/monkeycc/events{/privacy}",
"received_events_url": "https://api.github.com/users/monkeycc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2024-04-08T06:38:04
| 2024-04-19T15:41:11
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
I discovered an interesting defect-detection large language model:
https://github.com/CASIA-IVA-Lab/AnomalyGPT


### How should we solve this?
_No response_
### What is the impact of not solving this?
_No response_
### Anything else?
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3533/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3533/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/887
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/887/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/887/comments
|
https://api.github.com/repos/ollama/ollama/issues/887/events
|
https://github.com/ollama/ollama/issues/887
| 1,958,301,788
|
I_kwDOJ0Z1Ps50uVBc
| 887
|
invalid version
|
{
"login": "UICJohn",
"id": 4167985,
"node_id": "MDQ6VXNlcjQxNjc5ODU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4167985?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/UICJohn",
"html_url": "https://github.com/UICJohn",
"followers_url": "https://api.github.com/users/UICJohn/followers",
"following_url": "https://api.github.com/users/UICJohn/following{/other_user}",
"gists_url": "https://api.github.com/users/UICJohn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/UICJohn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/UICJohn/subscriptions",
"organizations_url": "https://api.github.com/users/UICJohn/orgs",
"repos_url": "https://api.github.com/users/UICJohn/repos",
"events_url": "https://api.github.com/users/UICJohn/events{/privacy}",
"received_events_url": "https://api.github.com/users/UICJohn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-10-24T01:32:18
| 2023-10-24T14:29:00
| 2023-10-24T14:29:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi guys
I am trying to build a model from a Modelfile by following this [guide](https://github.com/jmorganca/ollama/blob/main/docs/import.md). It works fine until I try to create the model from the Modelfile using this command:
`ollama create 7b-32k-instruct -f ./Modelfile`
Here is my Modelfile:
```
FROM ./q4_0.bin
TEMPLATE "[INST] {{ .Prompt }} [/INST]"
```
It returns:
`⠋ creating model layer Error: invalid version`
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/887/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/887/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8004
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8004/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8004/comments
|
https://api.github.com/repos/ollama/ollama/issues/8004/events
|
https://github.com/ollama/ollama/issues/8004
| 2,725,725,090
|
I_kwDOJ0Z1Ps6id0Oi
| 8,004
|
QwQ 32B Preview: Q4_K_M better than Q8_0 at coding
|
{
"login": "leikareipa",
"id": 18671947,
"node_id": "MDQ6VXNlcjE4NjcxOTQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/18671947?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leikareipa",
"html_url": "https://github.com/leikareipa",
"followers_url": "https://api.github.com/users/leikareipa/followers",
"following_url": "https://api.github.com/users/leikareipa/following{/other_user}",
"gists_url": "https://api.github.com/users/leikareipa/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leikareipa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leikareipa/subscriptions",
"organizations_url": "https://api.github.com/users/leikareipa/orgs",
"repos_url": "https://api.github.com/users/leikareipa/repos",
"events_url": "https://api.github.com/users/leikareipa/events{/privacy}",
"received_events_url": "https://api.github.com/users/leikareipa/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 21
| 2024-12-09T01:34:20
| 2024-12-29T20:43:10
| 2024-12-29T20:08:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
It seems the Q4 version of QwQ Preview may consistently produce better coding responses than the Q8 version, even though I'd expect the opposite.
Both versions were downloaded via Ollama and tested varyingly with Ollama 0.5.1 and I think 0.4.7, can't remember at what point I updated. Context length is 8k-32k depending on the task, but I believe always more than the total number of input + output tokens. Otherwise default settings.
In my private 5-test coding/software development benchmark including JavaScript, C++ and assembly, Q4_K_M scores 70% and Q8_0 scores 50%. For reference, Bartowski's GGUFs score 60% for Q5_K_M, 40% for Q3_K_M and 10% for Q2_K. But each test is run only once per model, so there's room for noise in the results.
In other tests, Q4 still appears to produce better responses. For the prompt `Write a program using QBasic that draws a natural-looking lightning bolt.`, Q4 gives code that works, albeit one that only partly looks like a lightning bolt, while Q8's code has various syntax errors. For a prompt asking the model to read ~10k tokens of a JavaScript 3D software renderer's reference manual and then write a fisheye pixel shader, Q4's code does it while Q8's code produces a broken effect.
These aren't average-over-x-runs results since even one run takes a while to finish on CPU, but on the whole it appears Q4 may well produce better solutions than Q8. I don't have any meaningful results for the FP16 version at this point, takes long to run.
The Q4_K_M I'm using has the ID hash 1211a3265dc8, and the Q8_0 9c62a2e770b7. This isn't the same Q4_K_M that's on Ollama right now, the model was updated on there about a day after I got it. But if the new version performs worse then that's a problem as well.
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.5.1
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8004/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8004/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1717
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1717/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1717/comments
|
https://api.github.com/repos/ollama/ollama/issues/1717/events
|
https://github.com/ollama/ollama/issues/1717
| 2,056,045,538
|
I_kwDOJ0Z1Ps56jMPi
| 1,717
|
[Feature request] update models from CLI
|
{
"login": "ThatOneCalculator",
"id": 44733677,
"node_id": "MDQ6VXNlcjQ0NzMzNjc3",
"avatar_url": "https://avatars.githubusercontent.com/u/44733677?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ThatOneCalculator",
"html_url": "https://github.com/ThatOneCalculator",
"followers_url": "https://api.github.com/users/ThatOneCalculator/followers",
"following_url": "https://api.github.com/users/ThatOneCalculator/following{/other_user}",
"gists_url": "https://api.github.com/users/ThatOneCalculator/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ThatOneCalculator/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ThatOneCalculator/subscriptions",
"organizations_url": "https://api.github.com/users/ThatOneCalculator/orgs",
"repos_url": "https://api.github.com/users/ThatOneCalculator/repos",
"events_url": "https://api.github.com/users/ThatOneCalculator/events{/privacy}",
"received_events_url": "https://api.github.com/users/ThatOneCalculator/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 5
| 2023-12-26T05:31:16
| 2024-01-25T22:46:34
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When an update is available for an already installed model, something like `ollama pull` (without an argument) or `ollama update` would be great!
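In the meantime, a rough workaround sketch is to re-pull every installed model (this assumes `ollama list` keeps its current output format: a header row followed by NAME / ID / SIZE / MODIFIED columns, with the model tag in the first column):

```shell
#!/bin/sh
# Workaround sketch: re-pull every installed model so each one is
# refreshed to the latest version in the registry.
# Assumes `ollama list` prints a header row followed by columns
# whose first field is the model tag.
ollama list | awk 'NR > 1 {print $1}' | while read -r model; do
    echo "Updating ${model}..."
    ollama pull "${model}"
done
```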
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1717/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1717/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/8557
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8557/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8557/comments
|
https://api.github.com/repos/ollama/ollama/issues/8557/events
|
https://github.com/ollama/ollama/issues/8557
| 2,808,386,818
|
I_kwDOJ0Z1Ps6nZJUC
| 8,557
|
Please separate deepseek-r1 from deepseek-r1-Distill!
|
{
"login": "win10ogod",
"id": 125795763,
"node_id": "U_kgDOB399sw",
"avatar_url": "https://avatars.githubusercontent.com/u/125795763?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/win10ogod",
"html_url": "https://github.com/win10ogod",
"followers_url": "https://api.github.com/users/win10ogod/followers",
"following_url": "https://api.github.com/users/win10ogod/following{/other_user}",
"gists_url": "https://api.github.com/users/win10ogod/gists{/gist_id}",
"starred_url": "https://api.github.com/users/win10ogod/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/win10ogod/subscriptions",
"organizations_url": "https://api.github.com/users/win10ogod/orgs",
"repos_url": "https://api.github.com/users/win10ogod/repos",
"events_url": "https://api.github.com/users/win10ogod/events{/privacy}",
"received_events_url": "https://api.github.com/users/win10ogod/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-24T03:04:57
| 2025-01-28T15:27:58
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Please separate deepseek-r1 from deepseek-r1-Distill!
These are not the same models, and their architectures are different!
The naming on the official Ollama website completely obscures this distinction!
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8557/reactions",
"total_count": 15,
"+1": 15,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8557/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5799
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5799/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5799/comments
|
https://api.github.com/repos/ollama/ollama/issues/5799/events
|
https://github.com/ollama/ollama/pull/5799
| 2,419,631,795
|
PR_kwDOJ0Z1Ps517xJn
| 5,799
|
README: Added LLMStack to the list of UI integrations
|
{
"login": "ajhai",
"id": 431988,
"node_id": "MDQ6VXNlcjQzMTk4OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/431988?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ajhai",
"html_url": "https://github.com/ajhai",
"followers_url": "https://api.github.com/users/ajhai/followers",
"following_url": "https://api.github.com/users/ajhai/following{/other_user}",
"gists_url": "https://api.github.com/users/ajhai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ajhai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ajhai/subscriptions",
"organizations_url": "https://api.github.com/users/ajhai/orgs",
"repos_url": "https://api.github.com/users/ajhai/repos",
"events_url": "https://api.github.com/users/ajhai/events{/privacy}",
"received_events_url": "https://api.github.com/users/ajhai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-19T18:52:41
| 2024-07-23T18:50:32
| 2024-07-23T18:40:23
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5799",
"html_url": "https://github.com/ollama/ollama/pull/5799",
"diff_url": "https://github.com/ollama/ollama/pull/5799.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5799.patch",
"merged_at": "2024-07-23T18:40:23"
}
|
Also wrote a quick guide showing how to use `ollama` with `LLMStack` at https://docs.trypromptly.com/guides/using-llama3-with-ollama.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5799/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5799/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4538
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4538/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4538/comments
|
https://api.github.com/repos/ollama/ollama/issues/4538/events
|
https://github.com/ollama/ollama/issues/4538
| 2,306,057,386
|
I_kwDOJ0Z1Ps6Jc6Sq
| 4,538
|
Error: no safetensors or torch files found
|
{
"login": "SreeHaran",
"id": 62993067,
"node_id": "MDQ6VXNlcjYyOTkzMDY3",
"avatar_url": "https://avatars.githubusercontent.com/u/62993067?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SreeHaran",
"html_url": "https://github.com/SreeHaran",
"followers_url": "https://api.github.com/users/SreeHaran/followers",
"following_url": "https://api.github.com/users/SreeHaran/following{/other_user}",
"gists_url": "https://api.github.com/users/SreeHaran/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SreeHaran/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SreeHaran/subscriptions",
"organizations_url": "https://api.github.com/users/SreeHaran/orgs",
"repos_url": "https://api.github.com/users/SreeHaran/repos",
"events_url": "https://api.github.com/users/SreeHaran/events{/privacy}",
"received_events_url": "https://api.github.com/users/SreeHaran/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-05-20T13:54:22
| 2024-06-04T13:43:46
| 2024-06-04T13:43:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm trying to create a model using a Modelfile. My Modelfile looks the same as the example given:
```
FROM llama3
# set the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1
# set the system message
SYSTEM """
You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.
"""
```
But when I execute it, it throws the error "Error: no safetensors or torch files found".

I tried the same with ```FROM llama2-uncensored``` in the Modelfile and that seems to work fine.
Other commands like ```ollama run llama3``` are working properly.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.38
|
{
"login": "SreeHaran",
"id": 62993067,
"node_id": "MDQ6VXNlcjYyOTkzMDY3",
"avatar_url": "https://avatars.githubusercontent.com/u/62993067?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SreeHaran",
"html_url": "https://github.com/SreeHaran",
"followers_url": "https://api.github.com/users/SreeHaran/followers",
"following_url": "https://api.github.com/users/SreeHaran/following{/other_user}",
"gists_url": "https://api.github.com/users/SreeHaran/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SreeHaran/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SreeHaran/subscriptions",
"organizations_url": "https://api.github.com/users/SreeHaran/orgs",
"repos_url": "https://api.github.com/users/SreeHaran/repos",
"events_url": "https://api.github.com/users/SreeHaran/events{/privacy}",
"received_events_url": "https://api.github.com/users/SreeHaran/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4538/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4538/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2765
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2765/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2765/comments
|
https://api.github.com/repos/ollama/ollama/issues/2765/events
|
https://github.com/ollama/ollama/pull/2765
| 2,154,407,092
|
PR_kwDOJ0Z1Ps5n7YWk
| 2,765
|
Added BoltAI as a desktop UI for Ollama
|
{
"login": "longseespace",
"id": 187720,
"node_id": "MDQ6VXNlcjE4NzcyMA==",
"avatar_url": "https://avatars.githubusercontent.com/u/187720?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/longseespace",
"html_url": "https://github.com/longseespace",
"followers_url": "https://api.github.com/users/longseespace/followers",
"following_url": "https://api.github.com/users/longseespace/following{/other_user}",
"gists_url": "https://api.github.com/users/longseespace/gists{/gist_id}",
"starred_url": "https://api.github.com/users/longseespace/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/longseespace/subscriptions",
"organizations_url": "https://api.github.com/users/longseespace/orgs",
"repos_url": "https://api.github.com/users/longseespace/repos",
"events_url": "https://api.github.com/users/longseespace/events{/privacy}",
"received_events_url": "https://api.github.com/users/longseespace/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2024-02-26T15:06:55
| 2024-07-31T12:23:12
| 2024-07-31T12:23:11
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2765",
"html_url": "https://github.com/ollama/ollama/pull/2765",
"diff_url": "https://github.com/ollama/ollama/pull/2765.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2765.patch",
"merged_at": null
}
|
### Overview
BoltAI supports Ollama natively. It automatically synchronizes with the Ollama model list and allows users to use advanced features such as AI Command and AI Inline.
### Screenshots
**Main Chat UI**

**Model Management**

|
{
"login": "longseespace",
"id": 187720,
"node_id": "MDQ6VXNlcjE4NzcyMA==",
"avatar_url": "https://avatars.githubusercontent.com/u/187720?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/longseespace",
"html_url": "https://github.com/longseespace",
"followers_url": "https://api.github.com/users/longseespace/followers",
"following_url": "https://api.github.com/users/longseespace/following{/other_user}",
"gists_url": "https://api.github.com/users/longseespace/gists{/gist_id}",
"starred_url": "https://api.github.com/users/longseespace/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/longseespace/subscriptions",
"organizations_url": "https://api.github.com/users/longseespace/orgs",
"repos_url": "https://api.github.com/users/longseespace/repos",
"events_url": "https://api.github.com/users/longseespace/events{/privacy}",
"received_events_url": "https://api.github.com/users/longseespace/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2765/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2765/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8671
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8671/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8671/comments
|
https://api.github.com/repos/ollama/ollama/issues/8671/events
|
https://github.com/ollama/ollama/issues/8671
| 2,819,221,143
|
I_kwDOJ0Z1Ps6oCeaX
| 8,671
|
How do I remove a model which I have downloaded
|
{
"login": "santakd",
"id": 11453706,
"node_id": "MDQ6VXNlcjExNDUzNzA2",
"avatar_url": "https://avatars.githubusercontent.com/u/11453706?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/santakd",
"html_url": "https://github.com/santakd",
"followers_url": "https://api.github.com/users/santakd/followers",
"following_url": "https://api.github.com/users/santakd/following{/other_user}",
"gists_url": "https://api.github.com/users/santakd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/santakd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/santakd/subscriptions",
"organizations_url": "https://api.github.com/users/santakd/orgs",
"repos_url": "https://api.github.com/users/santakd/repos",
"events_url": "https://api.github.com/users/santakd/events{/privacy}",
"received_events_url": "https://api.github.com/users/santakd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 1
| 2025-01-29T20:28:31
| 2025-01-29T23:20:31
| 2025-01-29T23:20:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
I want to know how I can remove a model that I have downloaded.
Is there a process or built-in functionality for this, or can I simply go to the folder and remove that particular model?
The locations of the models are:
```
macOS: ~/.ollama/models
Linux: /usr/share/ollama/.ollama/models
Windows: C:\Users\%username%\.ollama\models
```
I see entries here too
`/Users/<userid>/.ollama/models/manifests/registry.ollama.ai/library/`
If there is a process or functionality for this, it would really help.
Thank you
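For reference, the CLI appears to provide a built-in command for this (a sketch of the usage; `rm` is listed in `ollama --help`):

```shell
# Remove a downloaded model through the CLI rather than deleting
# files by hand (manual deletion leaves stale manifest entries
# under ~/.ollama/models/manifests):
ollama list            # show installed models and their tags
ollama rm llama3       # remove the "llama3" model and free its disk space
```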
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8671/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8671/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3657
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3657/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3657/comments
|
https://api.github.com/repos/ollama/ollama/issues/3657/events
|
https://github.com/ollama/ollama/pull/3657
| 2,244,253,927
|
PR_kwDOJ0Z1Ps5stTEm
| 3,657
|
Add support for IQ1_S, IQ3_S, IQ2_S, IQ4_XS. IQ4_NL is not functional
|
{
"login": "mann1x",
"id": 20623405,
"node_id": "MDQ6VXNlcjIwNjIzNDA1",
"avatar_url": "https://avatars.githubusercontent.com/u/20623405?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mann1x",
"html_url": "https://github.com/mann1x",
"followers_url": "https://api.github.com/users/mann1x/followers",
"following_url": "https://api.github.com/users/mann1x/following{/other_user}",
"gists_url": "https://api.github.com/users/mann1x/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mann1x/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mann1x/subscriptions",
"organizations_url": "https://api.github.com/users/mann1x/orgs",
"repos_url": "https://api.github.com/users/mann1x/repos",
"events_url": "https://api.github.com/users/mann1x/events{/privacy}",
"received_events_url": "https://api.github.com/users/mann1x/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 14
| 2024-04-15T17:33:46
| 2024-05-25T07:32:22
| 2024-05-10T20:26:47
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3657",
"html_url": "https://github.com/ollama/ollama/pull/3657",
"diff_url": "https://github.com/ollama/ollama/pull/3657.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3657.patch",
"merged_at": null
}
|
This patch adds support for IQ1_S, IQ3_S, IQ2_S and IQ4_XS.
IQ4_NL uses a different format; I have to investigate further what the differences are.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3657/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3657/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5919
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5919/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5919/comments
|
https://api.github.com/repos/ollama/ollama/issues/5919/events
|
https://github.com/ollama/ollama/issues/5919
| 2,427,969,916
|
I_kwDOJ0Z1Ps6Qt-F8
| 5,919
|
`ollama run (modelname)` runs instruction-finetuned model
|
{
"login": "d-kleine",
"id": 53251018,
"node_id": "MDQ6VXNlcjUzMjUxMDE4",
"avatar_url": "https://avatars.githubusercontent.com/u/53251018?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/d-kleine",
"html_url": "https://github.com/d-kleine",
"followers_url": "https://api.github.com/users/d-kleine/followers",
"following_url": "https://api.github.com/users/d-kleine/following{/other_user}",
"gists_url": "https://api.github.com/users/d-kleine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/d-kleine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/d-kleine/subscriptions",
"organizations_url": "https://api.github.com/users/d-kleine/orgs",
"repos_url": "https://api.github.com/users/d-kleine/repos",
"events_url": "https://api.github.com/users/d-kleine/events{/privacy}",
"received_events_url": "https://api.github.com/users/d-kleine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-07-24T16:04:46
| 2024-07-29T08:21:03
| 2024-07-29T08:21:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I just have noticed that `ollama run llama3` runs ***Llama 3 8B Instruct*** (the instruction-finetuned variant) instead of ***Llama 3 8B***:
https://ollama.com/library/llama3/blobs/6a0746a1ec1a
These are different models:
**Llama 3 8B**: https://huggingface.co/meta-llama/Meta-Llama-3-8B
**Llama 3 8B Instruct**: https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct
Would it be possible to fix this?
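For what it's worth, both variants appear to be published in the library under separate tags, so the base model can still be pulled explicitly (tag names assumed from the current library listing):

```shell
# The default tag resolves to the instruction-tuned variant;
# the pretrained base model is published under a ":text" tag.
ollama run llama3         # Llama 3 8B Instruct (default)
ollama run llama3:text    # Llama 3 8B base (pretrained)
```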
|
{
"login": "d-kleine",
"id": 53251018,
"node_id": "MDQ6VXNlcjUzMjUxMDE4",
"avatar_url": "https://avatars.githubusercontent.com/u/53251018?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/d-kleine",
"html_url": "https://github.com/d-kleine",
"followers_url": "https://api.github.com/users/d-kleine/followers",
"following_url": "https://api.github.com/users/d-kleine/following{/other_user}",
"gists_url": "https://api.github.com/users/d-kleine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/d-kleine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/d-kleine/subscriptions",
"organizations_url": "https://api.github.com/users/d-kleine/orgs",
"repos_url": "https://api.github.com/users/d-kleine/repos",
"events_url": "https://api.github.com/users/d-kleine/events{/privacy}",
"received_events_url": "https://api.github.com/users/d-kleine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5919/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5919/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8396
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8396/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8396/comments
|
https://api.github.com/repos/ollama/ollama/issues/8396/events
|
https://github.com/ollama/ollama/issues/8396
| 2,782,462,306
|
I_kwDOJ0Z1Ps6l2QFi
| 8,396
|
Error: could not connect to ollama app, is it running?
|
{
"login": "Eyion",
"id": 26318038,
"node_id": "MDQ6VXNlcjI2MzE4MDM4",
"avatar_url": "https://avatars.githubusercontent.com/u/26318038?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Eyion",
"html_url": "https://github.com/Eyion",
"followers_url": "https://api.github.com/users/Eyion/followers",
"following_url": "https://api.github.com/users/Eyion/following{/other_user}",
"gists_url": "https://api.github.com/users/Eyion/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Eyion/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Eyion/subscriptions",
"organizations_url": "https://api.github.com/users/Eyion/orgs",
"repos_url": "https://api.github.com/users/Eyion/repos",
"events_url": "https://api.github.com/users/Eyion/events{/privacy}",
"received_events_url": "https://api.github.com/users/Eyion/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 8
| 2025-01-12T12:33:25
| 2025-01-15T23:58:42
| 2025-01-15T23:58:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I type "ollama -v", it shows "Warning: could not connect to a running Ollama instance
Warning: client version is 0.5.4"
When I type "ollama run qwen2.5:7b", it shows "Error: could not connect to ollama app, is it running?"
When I type "ollama serve", it shows "Error: listen tcp 127.0.0.1:11434: bind: An attempt was made to access a socket in a way forbidden by its access permissions."
[server.log](https://github.com/user-attachments/files/18389384/server.log)
I have reinstalled the software once again.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.4
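For reference, the bind failure above can be checked programmatically. This is a minimal diagnostic sketch (not part of Ollama): it simply tests whether a TCP socket can be bound to the address Ollama wants.

```python
import socket

def can_bind(host: str = "127.0.0.1", port: int = 11434) -> bool:
    """Try to bind a TCP socket to host:port.

    False means another process already holds the port, or an OS-level
    port reservation is blocking it (the likely cause of the
    "forbidden by its access permissions" error above on Windows).
    """
    try:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.bind((host, port))
        return True
    except OSError:
        return False

# Port 0 asks the OS for any free ephemeral port, so this should succeed
# even when 11434 itself is blocked.
print(can_bind(port=0))
```

On Windows, `netsh interface ipv4 show excludedportrange protocol=tcp` lists reserved port ranges (often created by Hyper-V/WinNAT), which are a common cause of this exact bind error when 11434 falls inside one of them.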
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8396/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8396/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1055
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1055/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1055/comments
|
https://api.github.com/repos/ollama/ollama/issues/1055/events
|
https://github.com/ollama/ollama/pull/1055
| 1,985,521,157
|
PR_kwDOJ0Z1Ps5fB3QB
| 1,055
|
Fixed incorrect base model name
|
{
"login": "dansreis",
"id": 9052608,
"node_id": "MDQ6VXNlcjkwNTI2MDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/9052608?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dansreis",
"html_url": "https://github.com/dansreis",
"followers_url": "https://api.github.com/users/dansreis/followers",
"following_url": "https://api.github.com/users/dansreis/following{/other_user}",
"gists_url": "https://api.github.com/users/dansreis/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dansreis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dansreis/subscriptions",
"organizations_url": "https://api.github.com/users/dansreis/orgs",
"repos_url": "https://api.github.com/users/dansreis/repos",
"events_url": "https://api.github.com/users/dansreis/events{/privacy}",
"received_events_url": "https://api.github.com/users/dansreis/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-09T12:28:31
| 2023-11-13T17:46:20
| 2023-11-13T16:42:55
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1055",
"html_url": "https://github.com/ollama/ollama/pull/1055",
"diff_url": "https://github.com/ollama/ollama/pull/1055.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1055.patch",
"merged_at": "2023-11-13T16:42:55"
}
|
Added the tag version to the 'GetNamespaceRepository' method so that the correct model tag version is used.
(This PR fixes bug/issue: #946 )
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1055/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1055/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3445
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3445/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3445/comments
|
https://api.github.com/repos/ollama/ollama/issues/3445/events
|
https://github.com/ollama/ollama/pull/3445
| 2,219,441,851
|
PR_kwDOJ0Z1Ps5rYWX6
| 3,445
|
Add CI full build capability
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-04-02T02:34:26
| 2024-11-21T18:22:46
| 2024-11-21T18:22:46
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3445",
"html_url": "https://github.com/ollama/ollama/pull/3445",
"diff_url": "https://github.com/ollama/ollama/pull/3445.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3445.patch",
"merged_at": null
}
|
For labeled PRs, generate a full build for testing
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3445/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3445/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2992
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2992/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2992/comments
|
https://api.github.com/repos/ollama/ollama/issues/2992/events
|
https://github.com/ollama/ollama/issues/2992
| 2,174,938,351
|
I_kwDOJ0Z1Ps6Bouzv
| 2,992
|
Support Roberta embedding models
|
{
"login": "eliranwong",
"id": 25262722,
"node_id": "MDQ6VXNlcjI1MjYyNzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/25262722?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eliranwong",
"html_url": "https://github.com/eliranwong",
"followers_url": "https://api.github.com/users/eliranwong/followers",
"following_url": "https://api.github.com/users/eliranwong/following{/other_user}",
"gists_url": "https://api.github.com/users/eliranwong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eliranwong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eliranwong/subscriptions",
"organizations_url": "https://api.github.com/users/eliranwong/orgs",
"repos_url": "https://api.github.com/users/eliranwong/repos",
"events_url": "https://api.github.com/users/eliranwong/events{/privacy}",
"received_events_url": "https://api.github.com/users/eliranwong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-03-07T22:33:40
| 2024-06-13T17:47:19
| 2024-06-13T17:47:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Can ollama support multi-language embedding models, like "paraphrase-multilingual-mpnet-base-v2"?
https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2
Much appreciated.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2992/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2992/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7612
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7612/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7612/comments
|
https://api.github.com/repos/ollama/ollama/issues/7612/events
|
https://github.com/ollama/ollama/issues/7612
| 2,648,195,355
|
I_kwDOJ0Z1Ps6d2EEb
| 7,612
|
Feature for Filter models by type option
|
{
"login": "Abubakkar13",
"id": 45032674,
"node_id": "MDQ6VXNlcjQ1MDMyNjc0",
"avatar_url": "https://avatars.githubusercontent.com/u/45032674?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Abubakkar13",
"html_url": "https://github.com/Abubakkar13",
"followers_url": "https://api.github.com/users/Abubakkar13/followers",
"following_url": "https://api.github.com/users/Abubakkar13/following{/other_user}",
"gists_url": "https://api.github.com/users/Abubakkar13/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Abubakkar13/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Abubakkar13/subscriptions",
"organizations_url": "https://api.github.com/users/Abubakkar13/orgs",
"repos_url": "https://api.github.com/users/Abubakkar13/repos",
"events_url": "https://api.github.com/users/Abubakkar13/events{/privacy}",
"received_events_url": "https://api.github.com/users/Abubakkar13/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-11-11T05:27:50
| 2024-11-11T05:34:27
| 2024-11-11T05:34:27
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Feature Request:
Model Filtering by Type
Objective: Add a model filtering feature that allows users to filter available models by predefined categories, specifically:
1. Normal Models
2. Tool Models
3. Vision Models
4. Embedding Models
(New type if needed in future)
This feature on the ollama site will improve usability by allowing users to easily find and select models according to their specific type or use case.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7612/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7612/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4524
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4524/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4524/comments
|
https://api.github.com/repos/ollama/ollama/issues/4524/events
|
https://github.com/ollama/ollama/issues/4524
| 2,304,734,344
|
I_kwDOJ0Z1Ps6JX3SI
| 4,524
|
Default Stop Sequence is not working when user provides additional stop sequences
|
{
"login": "Nanthagopal-Eswaran",
"id": 115451020,
"node_id": "U_kgDOBuGkjA",
"avatar_url": "https://avatars.githubusercontent.com/u/115451020?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Nanthagopal-Eswaran",
"html_url": "https://github.com/Nanthagopal-Eswaran",
"followers_url": "https://api.github.com/users/Nanthagopal-Eswaran/followers",
"following_url": "https://api.github.com/users/Nanthagopal-Eswaran/following{/other_user}",
"gists_url": "https://api.github.com/users/Nanthagopal-Eswaran/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Nanthagopal-Eswaran/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Nanthagopal-Eswaran/subscriptions",
"organizations_url": "https://api.github.com/users/Nanthagopal-Eswaran/orgs",
"repos_url": "https://api.github.com/users/Nanthagopal-Eswaran/repos",
"events_url": "https://api.github.com/users/Nanthagopal-Eswaran/events{/privacy}",
"received_events_url": "https://api.github.com/users/Nanthagopal-Eswaran/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-05-19T18:47:36
| 2024-05-19T19:24:44
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I want to stop generation at the stop sequence "\nObservation".
My expectation is that generation will stop at either "\nObservation" or the model's default stop sequence "<|eot_id|>" (in llama3's case).
But that does not happen, and the model keeps generating the answer.
Request:

Response:

> I think we should add user-provided stop sequences to the default stop sequences instead of replacing them
### OS
Windows
### GPU
_No response_
### CPU
Intel
### Ollama version
0.1.27
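A workaround sketch for the behaviour described above: since user-supplied stop sequences replace the model's defaults, explicitly append the default stop token when building the request payload for `/api/generate`. The `<|eot_id|>` token is assumed here for llama3; other models use different default stop tokens.

```python
import json

def build_generate_payload(model: str, prompt: str, stop: list) -> dict:
    # Workaround: user-supplied stop sequences replace the model's
    # defaults, so append the default stop token (assumed here to be
    # llama3's "<|eot_id|>") so generation still ends where expected.
    default_stop = "<|eot_id|>"
    if default_stop not in stop:
        stop = stop + [default_stop]
    return {
        "model": model,
        "prompt": prompt,
        "options": {"stop": stop},
        "stream": False,
    }

payload = build_generate_payload("llama3", "What is 2+2?", ["\nObservation"])
print(json.dumps(payload, indent=2))
```

Posting this payload to `http://localhost:11434/api/generate` should then stop at whichever sequence appears first.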
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4524/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4524/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6124
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6124/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6124/comments
|
https://api.github.com/repos/ollama/ollama/issues/6124/events
|
https://github.com/ollama/ollama/issues/6124
| 2,442,856,881
|
I_kwDOJ0Z1Ps6Rmwmx
| 6,124
|
Do not generate a history
|
{
"login": "zipfile6209",
"id": 141074644,
"node_id": "U_kgDOCGig1A",
"avatar_url": "https://avatars.githubusercontent.com/u/141074644?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zipfile6209",
"html_url": "https://github.com/zipfile6209",
"followers_url": "https://api.github.com/users/zipfile6209/followers",
"following_url": "https://api.github.com/users/zipfile6209/following{/other_user}",
"gists_url": "https://api.github.com/users/zipfile6209/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zipfile6209/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zipfile6209/subscriptions",
"organizations_url": "https://api.github.com/users/zipfile6209/orgs",
"repos_url": "https://api.github.com/users/zipfile6209/repos",
"events_url": "https://api.github.com/users/zipfile6209/events{/privacy}",
"received_events_url": "https://api.github.com/users/zipfile6209/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-08-01T16:15:35
| 2024-08-01T16:21:35
| 2024-08-01T16:21:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I would like to request that the history can be disabled, as it can be undesirable to write sensitive data to disk for no reason.
I would also like to point out that ollama is a bit aggressive about it: the usual tricks, like turning the history file into a symbolic link, giving ownership to root, or removing all permissions, didn't work :-/
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6124/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6124/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1555
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1555/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1555/comments
|
https://api.github.com/repos/ollama/ollama/issues/1555/events
|
https://github.com/ollama/ollama/issues/1555
| 2,044,488,739
|
I_kwDOJ0Z1Ps553Gwj
| 1,555
|
GGUF in Docker?
|
{
"login": "jimmyjam-50066",
"id": 153751346,
"node_id": "U_kgDOCSoPMg",
"avatar_url": "https://avatars.githubusercontent.com/u/153751346?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jimmyjam-50066",
"html_url": "https://github.com/jimmyjam-50066",
"followers_url": "https://api.github.com/users/jimmyjam-50066/followers",
"following_url": "https://api.github.com/users/jimmyjam-50066/following{/other_user}",
"gists_url": "https://api.github.com/users/jimmyjam-50066/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jimmyjam-50066/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jimmyjam-50066/subscriptions",
"organizations_url": "https://api.github.com/users/jimmyjam-50066/orgs",
"repos_url": "https://api.github.com/users/jimmyjam-50066/repos",
"events_url": "https://api.github.com/users/jimmyjam-50066/events{/privacy}",
"received_events_url": "https://api.github.com/users/jimmyjam-50066/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-12-15T23:25:18
| 2024-01-22T23:52:38
| 2024-01-22T23:52:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
To support GGUF files in Docker, could we have a script in the Docker image that takes the arguments and creates the Modelfile for ollama to use?
example with solar-10.7b being the target local model name:
```
docker exec ollama_cat pull_gguf_from_url.sh solar-10.7b https://huggingface.co/TheBloke/SOLAR-10.7B-Instruct-v1.0-GGUF
```
*magic* (really: create a new Modelfile from the first parameter and download the second parameter into a gguf directory, or something similar)
then
```
docker exec ollama_cat ollama create solar-10.7b -f solar-10.7bModel
```
Just an idea; I'm sure there's a better way to accomplish this.
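The `pull_gguf_from_url.sh` script above is hypothetical. A rough Python sketch of the Modelfile-generation half of that idea follows; the download step is deliberately stubbed out, and the GGUF filename used in the example is invented for illustration.

```python
from pathlib import Path

def write_modelfile(model_name: str, gguf_path: str, out_dir: str = ".") -> Path:
    """Emit a Modelfile that tells `ollama create` to use a local GGUF.

    A real helper would first download the GGUF from a URL into
    gguf_path; that step is omitted in this sketch.
    """
    modelfile = Path(out_dir) / f"{model_name}.Modelfile"
    # `FROM <path>` is the Modelfile directive that points at a GGUF.
    modelfile.write_text(f"FROM {gguf_path}\n")
    return modelfile

# Hypothetical GGUF filename, chosen only for the example.
mf = write_modelfile("solar-10.7b", "./gguf/solar-10.7b.Q4_K_M.gguf", "/tmp")
print(mf.read_text())
```

The emitted file could then be passed to `ollama create solar-10.7b -f /tmp/solar-10.7b.Modelfile` inside the container.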
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1555/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1555/timeline
| null |
completed
| false
|