Dataset schema (one row per ollama/ollama GitHub issue):

| column | type | range / values |
|---|---|---|
| url | string | length 51–54 |
| repository_url | string | 1 class (constant) |
| labels_url | string | length 65–68 |
| comments_url | string | length 60–63 |
| events_url | string | length 58–61 |
| html_url | string | length 39–44 |
| id | int64 | 1.78B–2.82B |
| node_id | string | length 18–19 |
| number | int64 | 1–8.69k |
| title | string | length 1–382 |
| user | dict | |
| labels | list | length 0–5 |
| state | string | 2 classes |
| locked | bool | 1 class |
| assignee | dict | |
| assignees | list | length 0–2 |
| milestone | null | |
| comments | int64 | 0–323 |
| created_at | timestamp[s] | |
| updated_at | timestamp[s] | |
| closed_at | timestamp[s] | |
| author_association | string | 4 classes |
| sub_issues_summary | dict | |
| active_lock_reason | null | |
| draft | bool | 2 classes |
| pull_request | dict | |
| body | string | length 2–118k, nullable (⌀) |
| closed_by | dict | |
| reactions | dict | |
| timeline_url | string | length 60–63 |
| performed_via_github_app | null | |
| state_reason | string | 4 classes |
| is_pull_request | bool | 2 classes |
Row 1:
- url: https://api.github.com/repos/ollama/ollama/issues/5714
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/5714/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/5714/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/5714/events
- html_url: https://github.com/ollama/ollama/pull/5714
- id: 2410097651
- node_id: PR_kwDOJ0Z1Ps51dX7E
- number: 5714
- title: README.md: Package managers: add Gentoo
- user:
{
"login": "vitaly-zdanevich",
"id": 3514015,
"node_id": "MDQ6VXNlcjM1MTQwMTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/3514015?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vitaly-zdanevich",
"html_url": "https://github.com/vitaly-zdanevich",
"followers_url": "https://api.github.com/users/vitaly-zdanevich/followers",
"following_url": "https://api.github.com/users/vitaly-zdanevich/following{/other_user}",
"gists_url": "https://api.github.com/users/vitaly-zdanevich/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vitaly-zdanevich/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vitaly-zdanevich/subscriptions",
"organizations_url": "https://api.github.com/users/vitaly-zdanevich/orgs",
"repos_url": "https://api.github.com/users/vitaly-zdanevich/repos",
"events_url": "https://api.github.com/users/vitaly-zdanevich/events{/privacy}",
"received_events_url": "https://api.github.com/users/vitaly-zdanevich/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels: []
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 2
- created_at: 2024-07-16T03:09:17
- updated_at: 2024-09-05T16:58:14
- closed_at: 2024-09-05T16:58:14
- author_association: CONTRIBUTOR
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: false
- pull_request:
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5714",
"html_url": "https://github.com/ollama/ollama/pull/5714",
"diff_url": "https://github.com/ollama/ollama/pull/5714.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5714.patch",
"merged_at": "2024-09-05T16:58:14"
}
- body: null
- closed_by:
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5714/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/5714/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: true
Row 2:
- url: https://api.github.com/repos/ollama/ollama/issues/5892
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/5892/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/5892/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/5892/events
- html_url: https://github.com/ollama/ollama/issues/5892
- id: 2426119832
- node_id: I_kwDOJ0Z1Ps6Qm6aY
- number: 5892
- title: Ollama: 500 error on Larger Models
- user:
{
"login": "nicholhai",
"id": 96297412,
"node_id": "U_kgDOBb1hxA",
"avatar_url": "https://avatars.githubusercontent.com/u/96297412?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nicholhai",
"html_url": "https://github.com/nicholhai",
"followers_url": "https://api.github.com/users/nicholhai/followers",
"following_url": "https://api.github.com/users/nicholhai/following{/other_user}",
"gists_url": "https://api.github.com/users/nicholhai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nicholhai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nicholhai/subscriptions",
"organizations_url": "https://api.github.com/users/nicholhai/orgs",
"repos_url": "https://api.github.com/users/nicholhai/repos",
"events_url": "https://api.github.com/users/nicholhai/events{/privacy}",
"received_events_url": "https://api.github.com/users/nicholhai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
]
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 62
- created_at: 2024-07-23T21:00:26
- updated_at: 2024-08-19T14:37:13
- closed_at: 2024-07-24T15:35:58
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
### What is the issue?
Whenever I try to run a model larger than 7B or 8B, I get the following error; the regular 7B and 8B models run just fine.
`Ollama: 500, message='Internal Server Error', url=URL('http://localhost:11434/api/chat')`
- Running Ubuntu Server 24.04
- Running through docker
- i7 2.1GHz
- 64GB RAM
- GeForce RTX 4060 Ti 16GB
Any assistance would be appreciated
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
- closed_by:
{
"login": "nicholhai",
"id": 96297412,
"node_id": "U_kgDOBb1hxA",
"avatar_url": "https://avatars.githubusercontent.com/u/96297412?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nicholhai",
"html_url": "https://github.com/nicholhai",
"followers_url": "https://api.github.com/users/nicholhai/followers",
"following_url": "https://api.github.com/users/nicholhai/following{/other_user}",
"gists_url": "https://api.github.com/users/nicholhai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nicholhai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nicholhai/subscriptions",
"organizations_url": "https://api.github.com/users/nicholhai/orgs",
"repos_url": "https://api.github.com/users/nicholhai/repos",
"events_url": "https://api.github.com/users/nicholhai/events{/privacy}",
"received_events_url": "https://api.github.com/users/nicholhai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5892/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/5892/timeline
- performed_via_github_app: null
- state_reason: completed
- is_pull_request: false
Row 3:
- url: https://api.github.com/repos/ollama/ollama/issues/3788
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/3788/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/3788/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/3788/events
- html_url: https://github.com/ollama/ollama/pull/3788
- id: 2254790213
- node_id: PR_kwDOJ0Z1Ps5tQoHh
- number: 3788
- title: types/model: export IsValidNamePart
- user:
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels: []
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 0
- created_at: 2024-04-21T01:11:55
- updated_at: 2024-04-21T01:26:35
- closed_at: 2024-04-21T01:26:34
- author_association: CONTRIBUTOR
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: false
- pull_request:
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3788",
"html_url": "https://github.com/ollama/ollama/pull/3788",
"diff_url": "https://github.com/ollama/ollama/pull/3788.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3788.patch",
"merged_at": "2024-04-21T01:26:34"
}
- body: null
- closed_by:
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3788/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/3788/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: true
Row 4:
- url: https://api.github.com/repos/ollama/ollama/issues/1159
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/1159/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/1159/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/1159/events
- html_url: https://github.com/ollama/ollama/pull/1159
- id: 1998029489
- node_id: PR_kwDOJ0Z1Ps5fsW43
- number: 1159
- title: Example: Function Calling in Typescript
- user:
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels: []
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 1
- created_at: 2023-11-17T00:32:33
- updated_at: 2023-11-21T18:06:56
- closed_at: 2023-11-21T18:06:55
- author_association: CONTRIBUTOR
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: false
- pull_request:
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1159",
"html_url": "https://github.com/ollama/ollama/pull/1159",
"diff_url": "https://github.com/ollama/ollama/pull/1159.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1159.patch",
"merged_at": "2023-11-21T18:06:55"
}
- body: Two examples here: one lists the characters in the first few pages of War and Peace; the other parses emails for events and addresses.
- closed_by:
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1159/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/1159/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: true
Row 5:
- url: https://api.github.com/repos/ollama/ollama/issues/4664
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/4664/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/4664/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/4664/events
- html_url: https://github.com/ollama/ollama/issues/4664
- id: 2319099097
- node_id: I_kwDOJ0Z1Ps6KOqTZ
- number: 4664
- title: OLLAMA support MiniCPM-Llama3-V 2.5
- user:
{
"login": "zhqfdn",
"id": 25156863,
"node_id": "MDQ6VXNlcjI1MTU2ODYz",
"avatar_url": "https://avatars.githubusercontent.com/u/25156863?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhqfdn",
"html_url": "https://github.com/zhqfdn",
"followers_url": "https://api.github.com/users/zhqfdn/followers",
"following_url": "https://api.github.com/users/zhqfdn/following{/other_user}",
"gists_url": "https://api.github.com/users/zhqfdn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhqfdn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhqfdn/subscriptions",
"organizations_url": "https://api.github.com/users/zhqfdn/orgs",
"repos_url": "https://api.github.com/users/zhqfdn/repos",
"events_url": "https://api.github.com/users/zhqfdn/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhqfdn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels:
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
]
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 3
- created_at: 2024-05-27T12:56:32
- updated_at: 2024-06-09T17:11:30
- closed_at: 2024-06-09T17:11:30
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body: https://github.com/OpenBMB/ollama/tree/minicpm-v2.5/examples/minicpm-v2.5
- closed_by:
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4664/reactions",
"total_count": 4,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 4,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/4664/timeline
- performed_via_github_app: null
- state_reason: completed
- is_pull_request: false
Row 6:
- url: https://api.github.com/repos/ollama/ollama/issues/841
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/841/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/841/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/841/events
- html_url: https://github.com/ollama/ollama/pull/841
- id: 1950446984
- node_id: PR_kwDOJ0Z1Ps5dLciz
- number: 841
- title: cleanup: command args
- user:
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels: []
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 0
- created_at: 2023-10-18T19:05:33
- updated_at: 2023-10-19T18:22:41
- closed_at: 2023-10-19T18:22:40
- author_association: CONTRIBUTOR
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: false
- pull_request:
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/841",
"html_url": "https://github.com/ollama/ollama/pull/841",
"diff_url": "https://github.com/ollama/ollama/pull/841.diff",
"patch_url": "https://github.com/ollama/ollama/pull/841.patch",
"merged_at": "2023-10-19T18:22:40"
}
- body:
A number of subcommands incorrectly set `MinimumNArgs` instead of `ExactArgs`, which leads to confusion.
Related #803
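For readers unfamiliar with cobra's argument validators, the difference behaves roughly like this standalone sketch (`exactArgs` and `minimumNArgs` are local stand-ins for `cobra.ExactArgs` and `cobra.MinimumNArgs`; the argument values are made up):

```go
package main

import "fmt"

// exactArgs mimics cobra.ExactArgs: fail unless exactly n positional args.
func exactArgs(n int) func([]string) error {
	return func(args []string) error {
		if len(args) != n {
			return fmt.Errorf("accepts %d arg(s), received %d", n, len(args))
		}
		return nil
	}
}

// minimumNArgs mimics cobra.MinimumNArgs: require at least n args, but
// silently accept any extras — the confusing behavior this PR removes.
func minimumNArgs(n int) func([]string) error {
	return func(args []string) error {
		if len(args) < n {
			return fmt.Errorf("requires at least %d arg(s), received %d", n, len(args))
		}
		return nil
	}
}

func main() {
	args := []string{"llama2", "unexpected-extra"}
	fmt.Println("ExactArgs(1):", exactArgs(1)(args))       // rejects the extra argument
	fmt.Println("MinimumNArgs(1):", minimumNArgs(1)(args)) // nil: extra silently accepted
}
```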
- closed_by:
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/841/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/841/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: true
Row 7:
- url: https://api.github.com/repos/ollama/ollama/issues/3679
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/3679/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/3679/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/3679/events
- html_url: https://github.com/ollama/ollama/pull/3679
- id: 2246710964
- node_id: PR_kwDOJ0Z1Ps5s1t5h
- number: 3679
- title: Update install.sh added /etc/default/ollama
- user:
{
"login": "digitalw00t",
"id": 593045,
"node_id": "MDQ6VXNlcjU5MzA0NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/593045?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/digitalw00t",
"html_url": "https://github.com/digitalw00t",
"followers_url": "https://api.github.com/users/digitalw00t/followers",
"following_url": "https://api.github.com/users/digitalw00t/following{/other_user}",
"gists_url": "https://api.github.com/users/digitalw00t/gists{/gist_id}",
"starred_url": "https://api.github.com/users/digitalw00t/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/digitalw00t/subscriptions",
"organizations_url": "https://api.github.com/users/digitalw00t/orgs",
"repos_url": "https://api.github.com/users/digitalw00t/repos",
"events_url": "https://api.github.com/users/digitalw00t/events{/privacy}",
"received_events_url": "https://api.github.com/users/digitalw00t/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels: []
- state: closed
- locked: false
- assignee:
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- assignees:
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
]
- milestone: null
- comments: 0
- created_at: 2024-04-16T19:05:15
- updated_at: 2024-05-16T01:06:44
- closed_at: 2024-05-16T01:06:44
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: false
- pull_request:
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3679",
"html_url": "https://github.com/ollama/ollama/pull/3679",
"diff_url": "https://github.com/ollama/ollama/pull/3679.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3679.patch",
"merged_at": null
}
- body:
Added a persistent env file for the server, so a change to /etc/default/ollama survives updates.
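The pattern being proposed here is the common Debian-style environment file read by the systemd unit; a minimal sketch, assuming the installer adds an `EnvironmentFile=` line to `ollama.service` (the variable names `OLLAMA_HOST` and `OLLAMA_MODELS` are real ollama settings, but the values are illustrative):

```
# /etc/default/ollama — survives reinstalls, unlike edits to the unit file
OLLAMA_HOST=0.0.0.0:11434
OLLAMA_MODELS=/srv/ollama/models

# ollama.service (excerpt):
# [Service]
# EnvironmentFile=-/etc/default/ollama   # leading "-": ignore if the file is absent
```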
- closed_by:
{
"login": "digitalw00t",
"id": 593045,
"node_id": "MDQ6VXNlcjU5MzA0NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/593045?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/digitalw00t",
"html_url": "https://github.com/digitalw00t",
"followers_url": "https://api.github.com/users/digitalw00t/followers",
"following_url": "https://api.github.com/users/digitalw00t/following{/other_user}",
"gists_url": "https://api.github.com/users/digitalw00t/gists{/gist_id}",
"starred_url": "https://api.github.com/users/digitalw00t/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/digitalw00t/subscriptions",
"organizations_url": "https://api.github.com/users/digitalw00t/orgs",
"repos_url": "https://api.github.com/users/digitalw00t/repos",
"events_url": "https://api.github.com/users/digitalw00t/events{/privacy}",
"received_events_url": "https://api.github.com/users/digitalw00t/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3679/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/3679/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: true
Row 8:
- url: https://api.github.com/repos/ollama/ollama/issues/5071
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/5071/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/5071/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/5071/events
- html_url: https://github.com/ollama/ollama/issues/5071
- id: 2355158147
- node_id: I_kwDOJ0Z1Ps6MYNyD
- number: 5071
- title: ollama not utilizing AMD GPU through METAL
- user:
{
"login": "dbl001",
"id": 3105499,
"node_id": "MDQ6VXNlcjMxMDU0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/3105499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dbl001",
"html_url": "https://github.com/dbl001",
"followers_url": "https://api.github.com/users/dbl001/followers",
"following_url": "https://api.github.com/users/dbl001/following{/other_user}",
"gists_url": "https://api.github.com/users/dbl001/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dbl001/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dbl001/subscriptions",
"organizations_url": "https://api.github.com/users/dbl001/orgs",
"repos_url": "https://api.github.com/users/dbl001/repos",
"events_url": "https://api.github.com/users/dbl001/events{/privacy}",
"received_events_url": "https://api.github.com/users/dbl001/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
]
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 1
- created_at: 2024-06-15T19:20:08
- updated_at: 2024-06-18T19:40:41
- closed_at: 2024-06-18T19:40:40
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
### What is the issue?
Here's my build command:
```
% OLLAMA_CUSTOM_CPU_DEFS="-DLLAMA_AVX=on -DLLAMA_AVX2=on -DLLAMA_F16C=on -DLLAMA_FMA=on -DLLAMA_METAL=on -DLLAMA_METAL_EMBED_LIBRARY=on -DGGML_USE_METAL=on -DLLAMA_METAL_COMPILE_SERIALIZED=1" go generate -v ./...
```
The `go generate` script subsequently turns Metal off (`-DLLAMA_METAL=off`):
```
+ cmake -S ../llama.cpp -B ../build/darwin/x86_64/cpu_avx2 -DCMAKE_OSX_DEPLOYMENT_TARGET=11.3 -DLLAMA_METAL_MACOSX_VERSION_MIN=11.3 -DCMAKE_SYSTEM_NAME=Darwin -DLLAMA_METAL_EMBED_LIBRARY=on -DCMAKE_SYSTEM_PROCESSOR=x86_64 -DCMAKE_OSX_ARCHITECTURES=x86_64 -DLLAMA_METAL=off -DLLAMA_NATIVE=off -DLLAMA_ACCELERATE=on -DLLAMA_AVX=on -DLLAMA_AVX2=on -DLLAMA_AVX512=off -DLLAMA_FMA=on -DLLAMA_F16C=on -DCMAKE_BUILD_TYPE=Release -DLLAMA_SERVER_VERBOSE=off
```
Finally, the server runs without utilizing the GPU.
```
% ollama serve
2024/06/15 10:36:43 routes.go:1011: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS:/Users/davidlaxer/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_TMPDIR:]"
time=2024-06-15T10:36:43.742-07:00 level=INFO source=images.go:725 msg="total blobs: 28"
time=2024-06-15T10:36:43.743-07:00 level=INFO source=images.go:732 msg="total unused blobs removed: 0"
time=2024-06-15T10:36:43.744-07:00 level=INFO source=routes.go:1057 msg="Listening on 127.0.0.1:11434 (version 0.1.44)"
time=2024-06-15T10:36:43.744-07:00 level=INFO source=payload.go:30 msg="extracting embedded files" dir=/var/folders/3n/56fpv14n4wj0c1l1sb106pzw0000gn/T/ollama2746628305/runners
time=2024-06-15T10:36:43.770-07:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu_avx2 cpu cpu_avx]"
time=2024-06-15T10:36:43.770-07:00 level=INFO source=types.go:71 msg="inference compute" id="" library=cpu compute="" driver=0.0 name="" total="128.0 GiB" available="0 B"
time=2024-06-15T10:41:36.771-07:00 level=INFO source=memory.go:133 msg="offload to gpu" layers.requested=-1 layers.real=0 memory.available="0 B" memory.required.full="4.6 GiB" memory.required.partial="794.5 MiB" memory.required.kv="256.0 MiB" memory.weights.total="4.1 GiB" memory.weights.repeating="3.7 GiB" memory.weights.nonrepeating="411.0 MiB" memory.graph.full="164.0 MiB" memory.graph.partial="677.5 MiB"
time=2024-06-15T10:41:36.772-07:00 level=INFO source=server.go:341 msg="starting llama server" cmd="/var/folders/3n/56fpv14n4wj0c1l1sb106pzw0000gn/T/ollama2746628305/runners/cpu_avx2/ollama_llama_server --model /Users/davidlaxer/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa --ctx-size 2048 --batch-size 512 --embedding --log-disable --parallel 1 --port 63042"
time=2024-06-15T10:41:36.780-07:00 level=INFO source=sched.go:338 msg="loaded runners" count=1
time=2024-06-15T10:41:36.780-07:00 level=INFO source=server.go:529 msg="waiting for llama runner to start responding"
time=2024-06-15T10:41:36.780-07:00 level=INFO source=server.go:567 msg="waiting for server to become available" status="llm server error"
INFO [main] build info | build=3051 commit="5921b8f0" tid="0x7ff85e144fc0" timestamp=1718473296
INFO [main] system info | n_threads=8 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 1 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | " tid="0x7ff85e144fc0" timestamp=1718473296 total_threads=16
INFO [main] HTTP server listening | hostname="127.0.0.1" n_threads_http="15" port="63042" tid="0x7ff85e144fc0" timestamp=1718473296
time=2024-06-15T10:41:37.032-07:00 level=INFO source=server.go:567 msg="waiting for server to become available" status="llm server loading model"
llama_model_loader: loaded meta data with 22 key-value pairs and 291 tensors from /Users/davidlaxer/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = Meta-Llama-3-8B-Instruct
llama_model_loader: - kv 2: llama.block_count u32 = 32
llama_model_loader: - kv 3: llama.context_length u32 = 8192
llama_model_loader: - kv 4: llama.embedding_length u32 = 4096
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 6: llama.attention.head_count u32 = 32
llama_model_loader: - kv 7: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 8: llama.rope.freq_base f32 = 500000.000000
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 10: general.file_type u32 = 2
llama_model_loader: - kv 11: llama.vocab_size u32 = 128256
llama_model_loader: - kv 12: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 13: tokenizer.ggml.model str = gpt2
llama_model_loader: - kv 14: tokenizer.ggml.pre str = llama-bpe
llama_model_loader: - kv 15: tokenizer.ggml.tokens arr[str,128256] = ["!", "\"", "#", "$", "%", "&", "'", ...
llama_model_loader: - kv 16: tokenizer.ggml.token_type arr[i32,128256] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
llama_model_loader: - kv 17: tokenizer.ggml.merges arr[str,280147] = ["Ġ Ġ", "Ġ ĠĠĠ", "ĠĠ ĠĠ", "...
llama_model_loader: - kv 18: tokenizer.ggml.bos_token_id u32 = 128000
llama_model_loader: - kv 19: tokenizer.ggml.eos_token_id u32 = 128009
llama_model_loader: - kv 20: tokenizer.chat_template str = {% set loop_messages = messages %}{% ...
llama_model_loader: - kv 21: general.quantization_version u32 = 2
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type q4_0: 225 tensors
llama_model_loader: - type q6_K: 1 tensors
llm_load_vocab: special tokens cache size = 256
llm_load_vocab: token to piece cache size = 1.5928 MB
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = BPE
llm_load_print_meta: n_vocab = 128256
llm_load_print_meta: n_merges = 280147
llm_load_print_meta: n_ctx_train = 8192
llm_load_print_meta: n_embd = 4096
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 8
llm_load_print_meta: n_layer = 32
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_embd_head_k = 128
llm_load_print_meta: n_embd_head_v = 128
llm_load_print_meta: n_gqa = 4
llm_load_print_meta: n_embd_k_gqa = 1024
llm_load_print_meta: n_embd_v_gqa = 1024
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: causal attn = 1
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 0
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 500000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_yarn_orig_ctx = 8192
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: model type = 8B
llm_load_print_meta: model ftype = Q4_0
llm_load_print_meta: model params = 8.03 B
llm_load_print_meta: model size = 4.33 GiB (4.64 BPW)
llm_load_print_meta: general.name = Meta-Llama-3-8B-Instruct
llm_load_print_meta: BOS token = 128000 '<|begin_of_text|>'
llm_load_print_meta: EOS token = 128009 '<|eot_id|>'
llm_load_print_meta: LF token = 128 'Ä'
llm_load_print_meta: EOT token = 128009 '<|eot_id|>'
llm_load_tensors: ggml ctx size = 0.15 MiB
llm_load_tensors: CPU buffer size = 4437.80 MiB
llama_new_context_with_model: n_ctx = 2048
llama_new_context_with_model: n_batch = 512
llama_new_context_with_model: n_ubatch = 512
llama_new_context_with_model: flash_attn = 0
llama_new_context_with_model: freq_base = 500000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CPU KV buffer size = 256.00 MiB
llama_new_context_with_model: KV self size = 256.00 MiB, K (f16): 128.00 MiB, V (f16): 128.00 MiB
llama_new_context_with_model: CPU output buffer size = 0.50 MiB
llama_new_context_with_model: CPU compute buffer size = 258.50 MiB
llama_new_context_with_model: graph nodes = 1030
llama_new_context_with_model: graph splits = 1
INFO [main] model loaded | tid="0x7ff85e144fc0" timestamp=1718473304
time=2024-06-15T10:41:44.296-07:00 level=INFO source=server.go:572 msg="llama runner started in 7.52 seconds"
[GIN] 2024/06/15 - 10:41:44 | 200 | 9.093560734s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:41:54 | 200 | 1.154057317s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:42:35 | 200 | 40.688860055s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:42:41 | 200 | 6.229453908s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:42:43 | 200 | 1.270069572s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:43:23 | 200 | 40.445274886s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:43:29 | 200 | 5.92720864s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:43:31 | 200 | 1.186419337s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:44:11 | 200 | 40.475555077s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:44:17 | 200 | 6.143890785s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:44:19 | 200 | 1.327419018s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:44:59 | 200 | 40.358735272s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:45:05 | 200 | 5.842486079s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:45:06 | 200 | 1.151830787s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:45:45 | 200 | 38.130374809s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:45:50 | 200 | 5.863281373s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:45:52 | 200 | 763.567512ms | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:46:16 | 200 | 24.464886509s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:46:17 | 200 | 844.612204ms | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:46:25 | 200 | 7.366777251s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:46:27 | 200 | 1.314771295s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:46:45 | 200 | 18.025285278s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:46:47 | 200 | 1.448278338s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:47:26 | 200 | 38.918308755s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:47:45 | 200 | 18.653427075s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:47:47 | 200 | 1.097321882s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:48:29 | 200 | 41.37452429s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:48:32 | 200 | 1.331141018s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:49:11 | 200 | 39.111446616s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:49:31 | 200 | 20.771630418s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:49:33 | 200 | 1.171729854s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:50:14 | 200 | 40.365819016s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:50:17 | 200 | 1.213320125s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:50:35 | 200 | 18.183581597s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:50:38 | 200 | 1.575906212s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:51:17 | 200 | 39.023216091s | 127.0.0.1 | POST "/api/embeddings"
[GIN] 2024/06/15 - 10:51:37 | 200 | 19.720073759s | 127.0.0.1 | POST "/api/embeddings"
```
If I run
```
% ./main -m /Users/davidlaxer/llama.cpp/models/7B/ggml-model-q4_0.gguf -n 128 -ngl 1
```
the AMD GPU is detected
```
ggml_metal_init: allocating
ggml_metal_init: found device: AMD Radeon Pro 5700 XT
ggml_metal_init: picking default device: AMD Radeon Pro 5700 XT
ggml_metal_init: default.metallib not found, loading from source
ggml_metal_init: GGML_METAL_PATH_RESOURCES = nil
ggml_metal_init: loading '/Users/davidlaxer/ollama/llm/llama.cpp/ggml-metal.metal'
ggml_metal_init: GPU name: AMD Radeon Pro 5700 XT
ggml_metal_init: GPU family: MTLGPUFamilyCommon3 (3003)
ggml_metal_init: GPU family: MTLGPUFamilyMetal3 (5001)
ggml_metal_init: simdgroup reduction support = true
ggml_metal_init: simdgroup matrix mul. support = false
ggml_metal_init: hasUnifiedMemory = false
ggml_metal_init: recommendedMaxWorkingSetSize = 17163.09 MB
ggml_metal_init: skipping kernel_mul_mm_f32_f32 (not supported)
ggml_metal_init: skipping kernel_mul_mm_f16_f32 (not supported)
ggml_metal_init: skipping kernel_mul_mm_q4_0_f32 (not supported)
ggml_metal_init: skipping kernel_mul_mm_q4_1_f32 (not supported)
ggml_metal_init: skipping kernel_mul_mm_q5_0_f32 (not supported)
ggml_metal_init: skipping kernel_mul_mm_q5_1_f32 (not supported)
ggml_metal_init: skipping kernel_mul_mm_q8_0_f32 (not supported)
ggml_metal_init: skipping kernel_mul_mm_q2_K_f32 (not supported)
...
```
### OS
macOS
### GPU
AMD
### CPU
Intel
### Ollama version
0.2.1
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5071/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5071/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5499
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5499/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5499/comments
|
https://api.github.com/repos/ollama/ollama/issues/5499/events
|
https://github.com/ollama/ollama/issues/5499
| 2,392,964,252
|
I_kwDOJ0Z1Ps6Oobyc
| 5,499
|
Error Pull Model Manifest
|
{
"login": "Moonlight1220",
"id": 172665223,
"node_id": "U_kgDOCkqphw",
"avatar_url": "https://avatars.githubusercontent.com/u/172665223?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Moonlight1220",
"html_url": "https://github.com/Moonlight1220",
"followers_url": "https://api.github.com/users/Moonlight1220/followers",
"following_url": "https://api.github.com/users/Moonlight1220/following{/other_user}",
"gists_url": "https://api.github.com/users/Moonlight1220/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Moonlight1220/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Moonlight1220/subscriptions",
"organizations_url": "https://api.github.com/users/Moonlight1220/orgs",
"repos_url": "https://api.github.com/users/Moonlight1220/repos",
"events_url": "https://api.github.com/users/Moonlight1220/events{/privacy}",
"received_events_url": "https://api.github.com/users/Moonlight1220/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-07-05T17:53:36
| 2024-09-26T00:14:43
| 2024-09-26T00:14:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
### Error Pulling Manifest Repeatedly
Hello Ollama Community,
When I first installed Ollama on my early 2015 13in MacBook Air (1.6GHz Dual Core Intel Core i5, 8 GB 1600 MHz DDR3, Intel HD Graphics 600 1536 MB) it worked perfectly fine. Once I used it again after installation, I got the error:
`Error: pull model manifest: file does not exist`
I looked through the GitHub issues and was told to re-install Ollama, but I am still getting the same error. Please advise on where to go next.
### OS
macOS
### GPU
Intel
### CPU
Intel
### Ollama version
0.1.48
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5499/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5499/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/295
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/295/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/295/comments
|
https://api.github.com/repos/ollama/ollama/issues/295/events
|
https://github.com/ollama/ollama/issues/295
| 1,838,028,667
|
I_kwDOJ0Z1Ps5tjhd7
| 295
|
`stop` parameter values don't always stop generation
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2023-08-06T03:35:35
| 2023-08-30T04:17:43
| 2023-08-08T04:29:28
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Stop words don't always stop generation
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/295/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/295/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1661
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1661/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1661/comments
|
https://api.github.com/repos/ollama/ollama/issues/1661/events
|
https://github.com/ollama/ollama/pull/1661
| 2,052,871,624
|
PR_kwDOJ0Z1Ps5imIOM
| 1,661
|
Fix `template` api doc description
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-21T18:12:50
| 2024-01-03T16:01:00
| 2024-01-03T16:00:59
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1661",
"html_url": "https://github.com/ollama/ollama/pull/1661",
"diff_url": "https://github.com/ollama/ollama/pull/1661.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1661.patch",
"merged_at": "2024-01-03T16:00:59"
}
|
The API docs specify that `template` overrides the prompt which isn't the case (verified back to v0.1.13), this is the functionality that `raw` mode enables. This change fixes the description.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1661/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1661/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3975
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3975/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3975/comments
|
https://api.github.com/repos/ollama/ollama/issues/3975/events
|
https://github.com/ollama/ollama/issues/3975
| 2,266,883,233
|
I_kwDOJ0Z1Ps6HHeSh
| 3,975
|
When used, it is always cpu full instead of gpu full, and gpu usage is almost zero
|
{
"login": "KritoAndAsuna",
"id": 59231253,
"node_id": "MDQ6VXNlcjU5MjMxMjUz",
"avatar_url": "https://avatars.githubusercontent.com/u/59231253?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/KritoAndAsuna",
"html_url": "https://github.com/KritoAndAsuna",
"followers_url": "https://api.github.com/users/KritoAndAsuna/followers",
"following_url": "https://api.github.com/users/KritoAndAsuna/following{/other_user}",
"gists_url": "https://api.github.com/users/KritoAndAsuna/gists{/gist_id}",
"starred_url": "https://api.github.com/users/KritoAndAsuna/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/KritoAndAsuna/subscriptions",
"organizations_url": "https://api.github.com/users/KritoAndAsuna/orgs",
"repos_url": "https://api.github.com/users/KritoAndAsuna/repos",
"events_url": "https://api.github.com/users/KritoAndAsuna/events{/privacy}",
"received_events_url": "https://api.github.com/users/KritoAndAsuna/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-04-27T07:14:05
| 2024-05-21T18:20:44
| 2024-05-21T18:20:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When in use, the CPU is always fully loaded instead of the GPU, and GPU usage is almost zero
### OS
Windows
### GPU
AMD
### CPU
AMD
### Ollama version
0.1.32
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3975/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3975/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1326
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1326/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1326/comments
|
https://api.github.com/repos/ollama/ollama/issues/1326/events
|
https://github.com/ollama/ollama/issues/1326
| 2,017,827,752
|
I_kwDOJ0Z1Ps54RZuo
| 1,326
|
Installation fails on Fedora 39 (38+)
|
{
"login": "cephalization",
"id": 8948924,
"node_id": "MDQ6VXNlcjg5NDg5MjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/8948924?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cephalization",
"html_url": "https://github.com/cephalization",
"followers_url": "https://api.github.com/users/cephalization/followers",
"following_url": "https://api.github.com/users/cephalization/following{/other_user}",
"gists_url": "https://api.github.com/users/cephalization/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cephalization/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cephalization/subscriptions",
"organizations_url": "https://api.github.com/users/cephalization/orgs",
"repos_url": "https://api.github.com/users/cephalization/repos",
"events_url": "https://api.github.com/users/cephalization/events{/privacy}",
"received_events_url": "https://api.github.com/users/cephalization/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-11-30T03:58:35
| 2024-01-18T22:23:43
| 2024-01-18T22:23:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Nvidia hasn't uploaded specific CUDA drivers for later versions of Fedora here: https://developer.download.nvidia.com/compute/cuda/repos/
So installation fails when trying to install them on Fedora 38 and 39.
To fix this, you can follow the steps for Fedora 35 and later here: https://rpmfusion.org/Howto/CUDA
```sh
sudo dnf config-manager --add-repo https://developer.download.nvidia.com/compute/cuda/repos/fedora35/x86_64/cuda-fedora35.repo
sudo dnf clean all
sudo dnf module disable nvidia-driver
sudo dnf -y install cuda
```
Then redo the ollama install. Everything seems to work afterwards.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1326/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1326/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3322
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3322/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3322/comments
|
https://api.github.com/repos/ollama/ollama/issues/3322/events
|
https://github.com/ollama/ollama/issues/3322
| 2,204,186,202
|
I_kwDOJ0Z1Ps6DYTZa
| 3,322
|
I can't make vision models work
|
{
"login": "donnadulcinea",
"id": 34122487,
"node_id": "MDQ6VXNlcjM0MTIyNDg3",
"avatar_url": "https://avatars.githubusercontent.com/u/34122487?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/donnadulcinea",
"html_url": "https://github.com/donnadulcinea",
"followers_url": "https://api.github.com/users/donnadulcinea/followers",
"following_url": "https://api.github.com/users/donnadulcinea/following{/other_user}",
"gists_url": "https://api.github.com/users/donnadulcinea/gists{/gist_id}",
"starred_url": "https://api.github.com/users/donnadulcinea/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/donnadulcinea/subscriptions",
"organizations_url": "https://api.github.com/users/donnadulcinea/orgs",
"repos_url": "https://api.github.com/users/donnadulcinea/repos",
"events_url": "https://api.github.com/users/donnadulcinea/events{/privacy}",
"received_events_url": "https://api.github.com/users/donnadulcinea/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 8
| 2024-03-24T05:17:58
| 2024-11-24T22:17:59
| 2024-11-24T22:17:59
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am running ollama via Docker. Everything works smoothly except vision models.
I tried `llava` and `bakllava` with no success.
### What did you expect to see?
The description of the image I provided.
### Steps to reproduce
Run an instance of ollama with docker, pull latest model of llava or bakllava.
Make a test query exactly as in https://github.com/ollama/ollama/blob/main/docs/api.md#request-with-images
The answer is not as expected; it is always random, for example:
```
{
"model": "llava",
"created_at": "2024-03-24T05:02:22.859351985Z",
"response": " The image shows a person sitting at a table with some papers or documents. The focus is on the person's face, which appears to be in deep thought or concentration. There are no other discernable objects or details in the picture. ",
"done": true,
"context": [...
```
I tried llava and bakllava; every other model seems to work smoothly. I tried both high-quality and simple-content images.
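For reference, the request shape from the linked docs can be reproduced offline. This is a minimal sketch; the image bytes below are a stand-in (replace them with the contents of a real file), and no request is actually sent:

```python
import base64
import json

# Stand-in for a real image file; replace with open("image.png", "rb").read().
image_bytes = b"\x89PNG fake image data"

# Per docs/api.md, images are sent as a list of base64-encoded strings.
payload = {
    "model": "llava",
    "prompt": "What is in this picture?",
    "images": [base64.b64encode(image_bytes).decode("ascii")],
    "stream": False,
}

body = json.dumps(payload)
print(body[:60])
```

Sending `body` as a POST to the server's `/api/generate` endpoint should then return a description of the image.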
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
arm64
### Platform
Docker
### Ollama version
ollama version is 0.1.28
### GPU
_No response_
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3322/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3322/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1389
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1389/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1389/comments
|
https://api.github.com/repos/ollama/ollama/issues/1389/events
|
https://github.com/ollama/ollama/issues/1389
| 2,026,755,277
|
I_kwDOJ0Z1Ps54zdTN
| 1,389
|
Request: The ability to load multiple models into the same GPUs and run them concurrently.
|
{
"login": "phalexo",
"id": 4603365,
"node_id": "MDQ6VXNlcjQ2MDMzNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4603365?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phalexo",
"html_url": "https://github.com/phalexo",
"followers_url": "https://api.github.com/users/phalexo/followers",
"following_url": "https://api.github.com/users/phalexo/following{/other_user}",
"gists_url": "https://api.github.com/users/phalexo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/phalexo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/phalexo/subscriptions",
"organizations_url": "https://api.github.com/users/phalexo/orgs",
"repos_url": "https://api.github.com/users/phalexo/repos",
"events_url": "https://api.github.com/users/phalexo/events{/privacy}",
"received_events_url": "https://api.github.com/users/phalexo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2023-12-05T17:19:34
| 2024-03-12T16:46:44
| 2024-03-12T16:46:36
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently ollama UNLOADS the previously loaded model and loads the last model you try to use. The load is reasonably fast if you intend to enter text manually, but if you want to use it with AutoGen or similar, loads and unloads put additional latency into the system, when token generation can already be pretty slow.
I am going to try to separate the GPUs into different groups and run different models within each group, BUT that does not really solve the problem of resource utilization.
Thanks.
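The grouping workaround can be sketched as one server process per GPU group, each with its own device mask and listen address. `CUDA_VISIBLE_DEVICES` and `OLLAMA_HOST` are real environment variables, but the two-group split and port numbers are assumptions for illustration; the actual launch is left commented out so the snippet only builds the per-server environments:

```python
import os

# Hypothetical two-group split; adjust device IDs and ports for your machine.
groups = [
    {"CUDA_VISIBLE_DEVICES": "0,1", "OLLAMA_HOST": "127.0.0.1:11434"},
    {"CUDA_VISIBLE_DEVICES": "2,3", "OLLAMA_HOST": "127.0.0.1:11435"},
]

envs = []
for group in groups:
    env = os.environ.copy()
    env.update(group)  # each server sees only its own GPUs and port
    envs.append(env)

# To actually start the servers (requires ollama to be installed):
# import subprocess
# procs = [subprocess.Popen(["ollama", "serve"], env=e) for e in envs]
```

Each client then targets the instance holding the model it needs, avoiding the load/unload churn at the cost of statically partitioned VRAM.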
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1389/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1389/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8601
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8601/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8601/comments
|
https://api.github.com/repos/ollama/ollama/issues/8601/events
|
https://github.com/ollama/ollama/pull/8601
| 2,812,057,230
|
PR_kwDOJ0Z1Ps6JCJq5
| 8,601
|
README: Add handy-ollama to tutorial
|
{
"login": "AXYZdong",
"id": 45477220,
"node_id": "MDQ6VXNlcjQ1NDc3MjIw",
"avatar_url": "https://avatars.githubusercontent.com/u/45477220?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AXYZdong",
"html_url": "https://github.com/AXYZdong",
"followers_url": "https://api.github.com/users/AXYZdong/followers",
"following_url": "https://api.github.com/users/AXYZdong/following{/other_user}",
"gists_url": "https://api.github.com/users/AXYZdong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AXYZdong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AXYZdong/subscriptions",
"organizations_url": "https://api.github.com/users/AXYZdong/orgs",
"repos_url": "https://api.github.com/users/AXYZdong/repos",
"events_url": "https://api.github.com/users/AXYZdong/events{/privacy}",
"received_events_url": "https://api.github.com/users/AXYZdong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2025-01-27T04:29:41
| 2025-01-27T17:08:48
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8601",
"html_url": "https://github.com/ollama/ollama/pull/8601",
"diff_url": "https://github.com/ollama/ollama/pull/8601.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8601.patch",
"merged_at": null
}
|
A Chinese tutorial for Ollama by [Datawhale](https://github.com/datawhalechina), China's largest open-source AI learning community.
We'd like to contribute to the Ollama community by announcing the release of our open-source Chinese tutorial.
This tutorial aims to be comprehensive and easy to understand, covering:
- Ollama Introduction
- Ollama Installation and Configuration
- Custom Model Import
- Ollama REST API
- Using Ollama with LangChain
- Deployment of Ollama Visual Interfaces
- Application Examples
The repo is at: https://github.com/datawhalechina/handy-ollama
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8601/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8601/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4971
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4971/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4971/comments
|
https://api.github.com/repos/ollama/ollama/issues/4971/events
|
https://github.com/ollama/ollama/issues/4971
| 2,345,302,568
|
I_kwDOJ0Z1Ps6Lynoo
| 4,971
|
How to disallow the use of both gpu and cpu
|
{
"login": "xiaohanglei",
"id": 32543872,
"node_id": "MDQ6VXNlcjMyNTQzODcy",
"avatar_url": "https://avatars.githubusercontent.com/u/32543872?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xiaohanglei",
"html_url": "https://github.com/xiaohanglei",
"followers_url": "https://api.github.com/users/xiaohanglei/followers",
"following_url": "https://api.github.com/users/xiaohanglei/following{/other_user}",
"gists_url": "https://api.github.com/users/xiaohanglei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xiaohanglei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xiaohanglei/subscriptions",
"organizations_url": "https://api.github.com/users/xiaohanglei/orgs",
"repos_url": "https://api.github.com/users/xiaohanglei/repos",
"events_url": "https://api.github.com/users/xiaohanglei/events{/privacy}",
"received_events_url": "https://api.github.com/users/xiaohanglei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 4
| 2024-06-11T03:28:49
| 2024-06-14T02:45:19
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When using both the GPU and CPU, the output is garbled, so I want to prohibit this scenario.

| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4971/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4971/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4633
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4633/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4633/comments
|
https://api.github.com/repos/ollama/ollama/issues/4633/events
|
https://github.com/ollama/ollama/issues/4633
| 2,316,911,164
|
I_kwDOJ0Z1Ps6KGUI8
| 4,633
|
Problem while pulling some models
|
{
"login": "skrew",
"id": 738170,
"node_id": "MDQ6VXNlcjczODE3MA==",
"avatar_url": "https://avatars.githubusercontent.com/u/738170?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/skrew",
"html_url": "https://github.com/skrew",
"followers_url": "https://api.github.com/users/skrew/followers",
"following_url": "https://api.github.com/users/skrew/following{/other_user}",
"gists_url": "https://api.github.com/users/skrew/gists{/gist_id}",
"starred_url": "https://api.github.com/users/skrew/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/skrew/subscriptions",
"organizations_url": "https://api.github.com/users/skrew/orgs",
"repos_url": "https://api.github.com/users/skrew/repos",
"events_url": "https://api.github.com/users/skrew/events{/privacy}",
"received_events_url": "https://api.github.com/users/skrew/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-05-25T10:32:28
| 2024-05-28T13:21:34
| 2024-05-25T16:29:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
While using the command `ollama pull aya:35b-23-q8_0`, the download gets stuck at 98-99%.
I tested multiple times.
Then I tested with version 0.1.37, and I could pull this model without a problem.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.39
|
{
"login": "skrew",
"id": 738170,
"node_id": "MDQ6VXNlcjczODE3MA==",
"avatar_url": "https://avatars.githubusercontent.com/u/738170?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/skrew",
"html_url": "https://github.com/skrew",
"followers_url": "https://api.github.com/users/skrew/followers",
"following_url": "https://api.github.com/users/skrew/following{/other_user}",
"gists_url": "https://api.github.com/users/skrew/gists{/gist_id}",
"starred_url": "https://api.github.com/users/skrew/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/skrew/subscriptions",
"organizations_url": "https://api.github.com/users/skrew/orgs",
"repos_url": "https://api.github.com/users/skrew/repos",
"events_url": "https://api.github.com/users/skrew/events{/privacy}",
"received_events_url": "https://api.github.com/users/skrew/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4633/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4633/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/348
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/348/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/348/comments
|
https://api.github.com/repos/ollama/ollama/issues/348/events
|
https://github.com/ollama/ollama/pull/348
| 1,850,616,235
|
PR_kwDOJ0Z1Ps5X7UvX
| 348
|
cross repo blob mount
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-08-14T22:08:27
| 2023-08-16T16:20:37
| 2023-08-16T16:20:36
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/348",
"html_url": "https://github.com/ollama/ollama/pull/348",
"diff_url": "https://github.com/ollama/ollama/pull/348.diff",
"patch_url": "https://github.com/ollama/ollama/pull/348.patch",
"merged_at": "2023-08-16T16:20:36"
}
|
Implement the registry's cross-repo blob mount.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/348/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/348/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4926
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4926/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4926/comments
|
https://api.github.com/repos/ollama/ollama/issues/4926/events
|
https://github.com/ollama/ollama/issues/4926
| 2,341,485,627
|
I_kwDOJ0Z1Ps6LkDw7
| 4,926
|
fail to upload models due to max try
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 7
| 2024-06-08T05:26:26
| 2024-06-10T06:00:09
| 2024-06-09T23:16:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
pushing c165271d7cbb... 73% ▕████████████████ ▏ 44 GB/ 61 GB 4.3 MB/s 1h3m
Error: max retries exceeded: Put "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/repositories/taozhiyuai/qwen2-57b-a14b-instruct/_uploads/cb319ba7-5ab8-40c3-a59b-03a007ffbae6/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%!F(MISSING)20240607%!F(MISSING)auto%!F(MISSING)s3%!F(MISSING)aws4_request&X-Amz-Date=20240607T235721Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&partNumber=48&uploadId=AHVfUfgfMc11-7PxdRLFlN2yqnUV68oRPODmHTg-a_dG_1nK5pBAEcAFf7Jt-79U2UCiAWGFDI-cFvkNxjzWL23Ly7xfr6VkHhyZWUETuxPDm3ADSL8KM2P51ZZ-Vye1olRgYFzt2L6NhedUCxtrAogdpZKh6lGd25X25VLx7NOLr0V4Obd-W9w5HuPYyVIbegFqH8cHYrPbFbiso8zt8kzb3LwqINTnKF7IDzakHe5rzmztMAH-J5HI0Lu6ELbmS-k3qeZF4-Xwuf9rNp9jsBSkJOvkBbECUpt2FLKchLqT2uAVH4-zxRqVXu9UGkA-d-XzB4FglNJrk6fGd__EGRw&X-Amz-Signature=06249a03866edad37e421d5a34bf016714a2fcb46019545544c03dcdce29a470": write tcp 192.168.31.110:62212->104.18.9.90:443: write: broken pipe
```
How can I ignore the max-retries limit?
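Since the client's retry limit does not appear to be configurable, one workaround is to re-run the push in an external retry loop with backoff. A hedged sketch: `fake_push` below is a stand-in for the real command (with a real client it might be something like `subprocess.run(["ollama", "push", name]).returncode == 0`, which is an assumption about your setup):

```python
import time

def retry(action, attempts=10, base_delay=1.0):
    """Call action() until it returns True, backing off between failures."""
    for i in range(attempts):
        if action():
            return True
        time.sleep(min(base_delay * 2 ** i, 60))  # capped exponential backoff
    return False

# Stand-in for the real push command; fails twice, then succeeds.
calls = {"n": 0}
def fake_push():
    calls["n"] += 1
    return calls["n"] >= 3

ok = retry(fake_push, attempts=5, base_delay=0)
```

This does not make any single attempt more reliable, but it keeps a flaky connection from requiring manual restarts.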
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
1.40
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4926/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4926/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3564
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3564/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3564/comments
|
https://api.github.com/repos/ollama/ollama/issues/3564/events
|
https://github.com/ollama/ollama/pull/3564
| 2,234,426,010
|
PR_kwDOJ0Z1Ps5sLzNh
| 3,564
|
Revert "build.go: introduce a friendlier way to build Ollama (#3548)"
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-09T22:40:31
| 2024-04-09T22:57:46
| 2024-04-09T22:57:45
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3564",
"html_url": "https://github.com/ollama/ollama/pull/3564",
"diff_url": "https://github.com/ollama/ollama/pull/3564.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3564.patch",
"merged_at": "2024-04-09T22:57:45"
}
|
This reverts commit fccf3eecaaecc94178a12084aabe6e0bcb24a1d9.
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3564/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3564/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7126
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7126/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7126/comments
|
https://api.github.com/repos/ollama/ollama/issues/7126/events
|
https://github.com/ollama/ollama/pull/7126
| 2,571,830,579
|
PR_kwDOJ0Z1Ps594mrb
| 7,126
|
Add web management tool to Community Integrations
|
{
"login": "lemonit-eric-mao",
"id": 68628461,
"node_id": "MDQ6VXNlcjY4NjI4NDYx",
"avatar_url": "https://avatars.githubusercontent.com/u/68628461?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lemonit-eric-mao",
"html_url": "https://github.com/lemonit-eric-mao",
"followers_url": "https://api.github.com/users/lemonit-eric-mao/followers",
"following_url": "https://api.github.com/users/lemonit-eric-mao/following{/other_user}",
"gists_url": "https://api.github.com/users/lemonit-eric-mao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lemonit-eric-mao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lemonit-eric-mao/subscriptions",
"organizations_url": "https://api.github.com/users/lemonit-eric-mao/orgs",
"repos_url": "https://api.github.com/users/lemonit-eric-mao/repos",
"events_url": "https://api.github.com/users/lemonit-eric-mao/events{/privacy}",
"received_events_url": "https://api.github.com/users/lemonit-eric-mao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-08T01:21:12
| 2024-11-21T10:51:46
| 2024-11-21T10:51:46
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7126",
"html_url": "https://github.com/ollama/ollama/pull/7126",
"diff_url": "https://github.com/ollama/ollama/pull/7126.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7126.patch",
"merged_at": "2024-11-21T10:51:46"
}
|
"Add web management tool to Community Integrations"
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7126/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7126/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2431
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2431/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2431/comments
|
https://api.github.com/repos/ollama/ollama/issues/2431/events
|
https://github.com/ollama/ollama/issues/2431
| 2,127,716,970
|
I_kwDOJ0Z1Ps5-0mJq
| 2,431
|
Ability to preload a model?
|
{
"login": "powellnorma",
"id": 101364699,
"node_id": "U_kgDOBgqz2w",
"avatar_url": "https://avatars.githubusercontent.com/u/101364699?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/powellnorma",
"html_url": "https://github.com/powellnorma",
"followers_url": "https://api.github.com/users/powellnorma/followers",
"following_url": "https://api.github.com/users/powellnorma/following{/other_user}",
"gists_url": "https://api.github.com/users/powellnorma/gists{/gist_id}",
"starred_url": "https://api.github.com/users/powellnorma/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/powellnorma/subscriptions",
"organizations_url": "https://api.github.com/users/powellnorma/orgs",
"repos_url": "https://api.github.com/users/powellnorma/repos",
"events_url": "https://api.github.com/users/powellnorma/events{/privacy}",
"received_events_url": "https://api.github.com/users/powellnorma/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
}
] |
closed
| false
| null |
[] | null | 7
| 2024-02-09T19:15:38
| 2024-05-15T18:58:59
| 2024-02-19T23:20:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is it possible to preload a model without actually using it? For example, if the user starts typing their request, it would be useful to be able to "preload" the model, instead of just loading it once the request is submitted.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2431/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2431/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6957
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6957/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6957/comments
|
https://api.github.com/repos/ollama/ollama/issues/6957/events
|
https://github.com/ollama/ollama/issues/6957
| 2,548,409,065
|
I_kwDOJ0Z1Ps6X5aLp
| 6,957
|
`ollama stop` fails if the model has been deleted
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-25T16:13:07
| 2024-10-01T22:45:44
| 2024-10-01T22:45:44
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
PS C:\Users\jmorgan> ollama ps
NAME ID SIZE PROCESSOR UNTIL
solar-pro:latest 9a8c71c441ca 18 GB 100% GPU 4 minutes from now
PS C:\Users\jmorgan> ollama rm solar-pro
deleted 'solar-pro'
PS C:\Users\jmorgan> ollama stop solar-pro
Error: couldn't find model "solar-pro" to stop
PS C:\Users\jmorgan> ollama ps
NAME ID SIZE PROCESSOR UNTIL
solar-pro:latest 9a8c71c441ca 18 GB 100% GPU 4 minutes from now
```
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6957/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6957/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5487
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5487/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5487/comments
|
https://api.github.com/repos/ollama/ollama/issues/5487/events
|
https://github.com/ollama/ollama/issues/5487
| 2,391,234,042
|
I_kwDOJ0Z1Ps6Oh1X6
| 5,487
|
granite code page does not show the 20 and 34 b models
|
{
"login": "olumolu",
"id": 162728301,
"node_id": "U_kgDOCbMJbQ",
"avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/olumolu",
"html_url": "https://github.com/olumolu",
"followers_url": "https://api.github.com/users/olumolu/followers",
"following_url": "https://api.github.com/users/olumolu/following{/other_user}",
"gists_url": "https://api.github.com/users/olumolu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/olumolu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/olumolu/subscriptions",
"organizations_url": "https://api.github.com/users/olumolu/orgs",
"repos_url": "https://api.github.com/users/olumolu/repos",
"events_url": "https://api.github.com/users/olumolu/events{/privacy}",
"received_events_url": "https://api.github.com/users/olumolu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6573197867,
"node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw",
"url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com",
"name": "ollama.com",
"color": "ffffff",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 5
| 2024-07-04T17:10:19
| 2024-10-24T02:42:38
| 2024-10-24T02:42:30
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

granite code page does not show the 20 and 34 b models; we can see there are 4 models, but the sizes are not mentioned on the website end
### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5487/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5487/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8007
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8007/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8007/comments
|
https://api.github.com/repos/ollama/ollama/issues/8007/events
|
https://github.com/ollama/ollama/issues/8007
| 2,725,925,227
|
I_kwDOJ0Z1Ps6ielFr
| 8,007
|
EXAONE-3.5 2.4B, 7.8B, and 32B
|
{
"login": "vYLQs6",
"id": 143073604,
"node_id": "U_kgDOCIchRA",
"avatar_url": "https://avatars.githubusercontent.com/u/143073604?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vYLQs6",
"html_url": "https://github.com/vYLQs6",
"followers_url": "https://api.github.com/users/vYLQs6/followers",
"following_url": "https://api.github.com/users/vYLQs6/following{/other_user}",
"gists_url": "https://api.github.com/users/vYLQs6/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vYLQs6/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vYLQs6/subscriptions",
"organizations_url": "https://api.github.com/users/vYLQs6/orgs",
"repos_url": "https://api.github.com/users/vYLQs6/repos",
"events_url": "https://api.github.com/users/vYLQs6/events{/privacy}",
"received_events_url": "https://api.github.com/users/vYLQs6/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-12-09T04:04:57
| 2024-12-10T08:04:52
| 2024-12-10T08:04:51
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | ERROR: type should be string, got "https://huggingface.co/collections/LGAI-EXAONE/exaone-35-674d0e1bb3dcd2ab6f39dbb4\r\n\r\n---\r\n\r\nNote: \r\n\r\nI enabled Q8 KV Cache, and when I tried the gguf uploaded by LG, I always get this error: `Error: llama runner process has terminated: GGML_ASSERT(hparams.n_embd_head_k % ggml_blck_size(type_k) == 0) failed`\r\n\r\nIdk if this means this model just doesn't support kv cache or ollama needs an update.\r\n\r\nEverything works fine after I disabled kv cache & flash attention\r\n\r\n---\r\n\r\n<table>\r\n <tr>\r\n <th>Models</th>\r\n <th>MT-Bench</th>\r\n <th>LiveBench</th>\r\n <th>Arena-Hard</th>\r\n <th>AlpacaEval</th>\r\n <th>IFEval</th>\r\n <th>KoMT-Bench[1]</th>\r\n <th>LogicKor</th>\r\n </tr>\r\n <tr>\r\n <td>EXAONE 3.5 32B</td>\r\n <td align=\"center\"><strong>8.51</strong></td>\r\n <td align=\"center\">43.0</td>\r\n <td align=\"center\"><strong>78.6</strong></td>\r\n <td align=\"center\"><strong>60.6</strong></td>\r\n <td align=\"center\"><strong>81.7</strong></td>\r\n <td align=\"center\"><strong>8.05</strong></td>\r\n <td align=\"center\"><strong>9.06</strong></td>\r\n </tr>\r\n <tr>\r\n <td>Qwen 2.5 32B</td>\r\n <td align=\"center\">8.49</td>\r\n <td align=\"center\"><strong>50.6</strong></td>\r\n <td align=\"center\">67.0</td>\r\n <td align=\"center\">41.0</td>\r\n <td align=\"center\">78.7</td>\r\n <td align=\"center\">7.75</td>\r\n <td align=\"center\">8.89</td>\r\n </tr>\r\n <tr>\r\n <td>C4AI Command R 32B</td>\r\n <td align=\"center\">7.38</td>\r\n <td align=\"center\">29.7</td>\r\n <td align=\"center\">17.0</td>\r\n <td align=\"center\">25.9</td>\r\n <td align=\"center\">26.1</td>\r\n <td align=\"center\">6.72</td>\r\n <td align=\"center\">8.24</td>\r\n </tr>\r\n <tr>\r\n <td>Gemma 2 27B</td>\r\n <td align=\"center\">8.28</td>\r\n <td align=\"center\">40.0</td>\r\n <td align=\"center\">57.5</td>\r\n <td align=\"center\">52.2</td>\r\n <td align=\"center\">59.7</td>\r\n <td 
align=\"center\">7.19</td>\r\n <td align=\"center\">8.56</td>\r\n </tr>\r\n <tr>\r\n <td>Yi 1.5 34B</td>\r\n <td align=\"center\">7.64</td>\r\n <td align=\"center\">26.2</td>\r\n <td align=\"center\">23.1</td>\r\n <td align=\"center\">34.8</td>\r\n <td align=\"center\">55.5</td>\r\n <td align=\"center\">4.88</td>\r\n <td align=\"center\">6.33</td>\r\n </tr>\r\n</table>"
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8007/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8007/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/867
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/867/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/867/comments
|
https://api.github.com/repos/ollama/ollama/issues/867/events
|
https://github.com/ollama/ollama/issues/867
| 1,955,179,445
|
I_kwDOJ0Z1Ps50iau1
| 867
|
ollama API not responding
|
{
"login": "abulka",
"id": 11467530,
"node_id": "MDQ6VXNlcjExNDY3NTMw",
"avatar_url": "https://avatars.githubusercontent.com/u/11467530?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/abulka",
"html_url": "https://github.com/abulka",
"followers_url": "https://api.github.com/users/abulka/followers",
"following_url": "https://api.github.com/users/abulka/following{/other_user}",
"gists_url": "https://api.github.com/users/abulka/gists{/gist_id}",
"starred_url": "https://api.github.com/users/abulka/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/abulka/subscriptions",
"organizations_url": "https://api.github.com/users/abulka/orgs",
"repos_url": "https://api.github.com/users/abulka/repos",
"events_url": "https://api.github.com/users/abulka/events{/privacy}",
"received_events_url": "https://api.github.com/users/abulka/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-10-21T01:01:00
| 2023-10-21T01:20:05
| 2023-10-21T01:06:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
ollama isn't responding to
```
curl http://localhost:11434/api/show --json '{"name": "codellama:7b-instruct"}'
404 page not found
```
and I didn't configure ollama to start on a particular port, just a default install.
I have the models:
```
% ollama list
NAME SIZE MODIFIED
codellama:7b-instruct 3.8 GB 7 weeks ago
llama2:latest 3.8 GB 2 months ago
llama2-uncensored:latest 3.8 GB 2 months ago
```
Is there a way of checking what port ollama is running on to verify this?
This API issue is stopping me using `oterm`:
https://github.com/ggozad/oterm/issues/2
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/867/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/867/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7848
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7848/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7848/comments
|
https://api.github.com/repos/ollama/ollama/issues/7848/events
|
https://github.com/ollama/ollama/issues/7848
| 2,696,108,449
|
I_kwDOJ0Z1Ps6gs1mh
| 7,848
|
Teuken-7b
|
{
"login": "tilllt",
"id": 1854364,
"node_id": "MDQ6VXNlcjE4NTQzNjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1854364?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tilllt",
"html_url": "https://github.com/tilllt",
"followers_url": "https://api.github.com/users/tilllt/followers",
"following_url": "https://api.github.com/users/tilllt/following{/other_user}",
"gists_url": "https://api.github.com/users/tilllt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tilllt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tilllt/subscriptions",
"organizations_url": "https://api.github.com/users/tilllt/orgs",
"repos_url": "https://api.github.com/users/tilllt/repos",
"events_url": "https://api.github.com/users/tilllt/events{/privacy}",
"received_events_url": "https://api.github.com/users/tilllt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 2
| 2024-11-26T21:13:22
| 2024-12-05T15:15:06
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/openGPT-X/Teuken-7B-instruct-research-v0.4
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7848/reactions",
"total_count": 32,
"+1": 32,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7848/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2719
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2719/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2719/comments
|
https://api.github.com/repos/ollama/ollama/issues/2719/events
|
https://github.com/ollama/ollama/pull/2719
| 2,152,000,317
|
PR_kwDOJ0Z1Ps5nzf0G
| 2,719
|
remove format private key
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-02-24T00:54:51
| 2024-03-28T18:20:59
| 2024-02-24T01:15:14
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2719",
"html_url": "https://github.com/ollama/ollama/pull/2719",
"diff_url": "https://github.com/ollama/ollama/pull/2719.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2719.patch",
"merged_at": "2024-02-24T01:15:14"
}
|
the utility format/openssh.go is no longer necessary since x/crypto/ssh v0.14.0 introduced MarshalPrivateKey
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2719/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2719/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6511
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6511/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6511/comments
|
https://api.github.com/repos/ollama/ollama/issues/6511/events
|
https://github.com/ollama/ollama/issues/6511
| 2,486,332,769
|
I_kwDOJ0Z1Ps6UMm1h
| 6,511
|
Embedding model text2vec-large-chinese
|
{
"login": "icetech233",
"id": 17383321,
"node_id": "MDQ6VXNlcjE3MzgzMzIx",
"avatar_url": "https://avatars.githubusercontent.com/u/17383321?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/icetech233",
"html_url": "https://github.com/icetech233",
"followers_url": "https://api.github.com/users/icetech233/followers",
"following_url": "https://api.github.com/users/icetech233/following{/other_user}",
"gists_url": "https://api.github.com/users/icetech233/gists{/gist_id}",
"starred_url": "https://api.github.com/users/icetech233/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/icetech233/subscriptions",
"organizations_url": "https://api.github.com/users/icetech233/orgs",
"repos_url": "https://api.github.com/users/icetech233/repos",
"events_url": "https://api.github.com/users/icetech233/events{/privacy}",
"received_events_url": "https://api.github.com/users/icetech233/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-08-26T09:00:11
| 2024-08-26T09:00:11
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
www.modelscope.cn/Jerry0/text2vec-large-chinese
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6511/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6511/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3650
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3650/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3650/comments
|
https://api.github.com/repos/ollama/ollama/issues/3650/events
|
https://github.com/ollama/ollama/issues/3650
| 2,243,361,026
|
I_kwDOJ0Z1Ps6FtvkC
| 3,650
|
Default command R Modelfile template does not respect specification
|
{
"login": "GiovanniGatti",
"id": 1745450,
"node_id": "MDQ6VXNlcjE3NDU0NTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1745450?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/GiovanniGatti",
"html_url": "https://github.com/GiovanniGatti",
"followers_url": "https://api.github.com/users/GiovanniGatti/followers",
"following_url": "https://api.github.com/users/GiovanniGatti/following{/other_user}",
"gists_url": "https://api.github.com/users/GiovanniGatti/gists{/gist_id}",
"starred_url": "https://api.github.com/users/GiovanniGatti/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/GiovanniGatti/subscriptions",
"organizations_url": "https://api.github.com/users/GiovanniGatti/orgs",
"repos_url": "https://api.github.com/users/GiovanniGatti/repos",
"events_url": "https://api.github.com/users/GiovanniGatti/events{/privacy}",
"received_events_url": "https://api.github.com/users/GiovanniGatti/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-04-15T10:55:08
| 2024-04-15T19:10:12
| 2024-04-15T19:10:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After reading the documentation of [Command R](https://docs.cohere.com/docs/prompting-command-r#components-of-a-structured-prompt) I found it strange that the (mandatory) `<BOS_TOKEN>` wasn't specified in the default Modelfile.template [here](https://ollama.com/library/command-r:latest/blobs/42499e38acdf).
I'm not sure where the best place is to report this sort of issue.
### What did you expect to see?
_No response_
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
_No response_
### Architecture
_No response_
### Platform
_No response_
### Ollama version
_No response_
### GPU
_No response_
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3650/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3650/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2210
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2210/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2210/comments
|
https://api.github.com/repos/ollama/ollama/issues/2210/events
|
https://github.com/ollama/ollama/issues/2210
| 2,102,604,557
|
I_kwDOJ0Z1Ps59UzMN
| 2,210
|
Keep models in RAM
|
{
"login": "LeoPiresDeSouza",
"id": 40829469,
"node_id": "MDQ6VXNlcjQwODI5NDY5",
"avatar_url": "https://avatars.githubusercontent.com/u/40829469?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LeoPiresDeSouza",
"html_url": "https://github.com/LeoPiresDeSouza",
"followers_url": "https://api.github.com/users/LeoPiresDeSouza/followers",
"following_url": "https://api.github.com/users/LeoPiresDeSouza/following{/other_user}",
"gists_url": "https://api.github.com/users/LeoPiresDeSouza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LeoPiresDeSouza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LeoPiresDeSouza/subscriptions",
"organizations_url": "https://api.github.com/users/LeoPiresDeSouza/orgs",
"repos_url": "https://api.github.com/users/LeoPiresDeSouza/repos",
"events_url": "https://api.github.com/users/LeoPiresDeSouza/events{/privacy}",
"received_events_url": "https://api.github.com/users/LeoPiresDeSouza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-01-26T17:37:29
| 2024-01-28T22:29:53
| 2024-01-28T22:29:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am testing llama2:7b models both using ollama and calling direct from a langchain python script.
My models are stored on an Ubuntu server with 12 cores and 36 GB of RAM, but no GPU.
When I call the model directly from Python, setting the memlock parameter to true, my memory usage goes above 6 GB, but when using Ollama it stays below 3 GB.
It seems that Ollama is not keeping the model entirely in RAM, and it is taking a long time to respond.
Is there a parameter like memlock to be set in Ollama to make it use my RAM extensively?
I have installed Ollama using curl https://ollama.ai/install.sh | sh.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2210/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2210/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7947
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7947/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7947/comments
|
https://api.github.com/repos/ollama/ollama/issues/7947/events
|
https://github.com/ollama/ollama/issues/7947
| 2,719,646,444
|
I_kwDOJ0Z1Ps6iGoLs
| 7,947
|
Not using GPU
|
{
"login": "frenzybiscuit",
"id": 190028151,
"node_id": "U_kgDOC1OZdw",
"avatar_url": "https://avatars.githubusercontent.com/u/190028151?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/frenzybiscuit",
"html_url": "https://github.com/frenzybiscuit",
"followers_url": "https://api.github.com/users/frenzybiscuit/followers",
"following_url": "https://api.github.com/users/frenzybiscuit/following{/other_user}",
"gists_url": "https://api.github.com/users/frenzybiscuit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/frenzybiscuit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/frenzybiscuit/subscriptions",
"organizations_url": "https://api.github.com/users/frenzybiscuit/orgs",
"repos_url": "https://api.github.com/users/frenzybiscuit/repos",
"events_url": "https://api.github.com/users/frenzybiscuit/events{/privacy}",
"received_events_url": "https://api.github.com/users/frenzybiscuit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 10
| 2024-12-05T07:49:00
| 2024-12-23T08:05:46
| 2024-12-23T08:05:46
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have the following setup:
7950x3d (AMD iGPU)
3090 + 2080ti
When using Ollama with open-webui, the GPU (3090) gets used BRIEFLY. It starts using the GPU, the GPU ramps up to 90% utilization, and then it just stops and falls back to the CPU.
I have installed Ollama and built from source on Fedora 41. I have installed the cuda toolkit manually. I call the following environmental variables from bashrc:
```
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH
```
I can build llamacpp from source and it works with CUDA.
I am launching Ollama with the following:
`CUDA_VISIBLE_DEVICES=0 ROCR_VISIBLE_DEVICES=55 OLLAMA_HOST=127.0.0.1 OLLAMA_FLASH_ATTENTION=1 OLLAMA_KV_CACHE_TYPE=q8_0 ollama-0.4.8-rc0/./ollama serve`
CUDA_VISIBLE_DEVICES=0 should force it to just use the 3090. I am using ROCR_VISIBLE_DEVICES=55 (fake number) so it doesn't use the AMD iGPU and fall back to CPU.
Any idea why this setup isn't working?
Ollama keeps briefly showing up on nvidia-smi and then vanishing.
It does show the following error:
`WARN source=sched.go:646 GPU VRAM usage didn't recover within timeout`
### OS
Linux
### GPU
Other
### CPU
AMD
### Ollama version
0.4.8-rc0
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7947/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7947/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/151
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/151/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/151/comments
|
https://api.github.com/repos/ollama/ollama/issues/151/events
|
https://github.com/ollama/ollama/pull/151
| 1,814,871,849
|
PR_kwDOJ0Z1Ps5WDKbV
| 151
|
add rm command for models
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-20T22:19:23
| 2023-07-20T23:09:23
| 2023-07-20T23:09:23
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/151",
"html_url": "https://github.com/ollama/ollama/pull/151",
"diff_url": "https://github.com/ollama/ollama/pull/151.diff",
"patch_url": "https://github.com/ollama/ollama/pull/151.patch",
"merged_at": "2023-07-20T23:09:23"
}
|
This change adds an "rm" command so that you can remove models that you don't want anymore. The handler determines if other manifests require a given layer and will save anything still required.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/151/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/151/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/968
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/968/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/968/comments
|
https://api.github.com/repos/ollama/ollama/issues/968/events
|
https://github.com/ollama/ollama/pull/968
| 1,973,577,853
|
PR_kwDOJ0Z1Ps5eZcY3
| 968
|
Use default RoPE params for new models
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-02T06:13:35
| 2023-11-02T15:41:31
| 2023-11-02T15:41:30
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/968",
"html_url": "https://github.com/ollama/ollama/pull/968",
"diff_url": "https://github.com/ollama/ollama/pull/968.diff",
"patch_url": "https://github.com/ollama/ollama/pull/968.patch",
"merged_at": "2023-11-02T15:41:30"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/968/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/968/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7154
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7154/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7154/comments
|
https://api.github.com/repos/ollama/ollama/issues/7154/events
|
https://github.com/ollama/ollama/pull/7154
| 2,576,932,647
|
PR_kwDOJ0Z1Ps5-IV1-
| 7,154
|
update .gitattributes with proper linguist-vendored entry
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-09T20:52:51
| 2024-10-10T00:25:10
| 2024-10-10T00:25:10
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7154",
"html_url": "https://github.com/ollama/ollama/pull/7154",
"diff_url": "https://github.com/ollama/ollama/pull/7154.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7154.patch",
"merged_at": null
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7154/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7154/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1772
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1772/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1772/comments
|
https://api.github.com/repos/ollama/ollama/issues/1772/events
|
https://github.com/ollama/ollama/issues/1772
| 2,064,537,335
|
I_kwDOJ0Z1Ps57Dlb3
| 1,772
|
Metadata field for multimodal models
|
{
"login": "shreyaskarnik",
"id": 311217,
"node_id": "MDQ6VXNlcjMxMTIxNw==",
"avatar_url": "https://avatars.githubusercontent.com/u/311217?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shreyaskarnik",
"html_url": "https://github.com/shreyaskarnik",
"followers_url": "https://api.github.com/users/shreyaskarnik/followers",
"following_url": "https://api.github.com/users/shreyaskarnik/following{/other_user}",
"gists_url": "https://api.github.com/users/shreyaskarnik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shreyaskarnik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shreyaskarnik/subscriptions",
"organizations_url": "https://api.github.com/users/shreyaskarnik/orgs",
"repos_url": "https://api.github.com/users/shreyaskarnik/repos",
"events_url": "https://api.github.com/users/shreyaskarnik/events{/privacy}",
"received_events_url": "https://api.github.com/users/shreyaskarnik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-01-03T19:26:03
| 2024-01-04T01:34:14
| 2024-01-04T00:12:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Would it be possible to add some metadata to the model indicating that it is multimodal? This will help to select the right model in applications that are built on top of the API to support multimodal architecture. I believe this will also help to search through models at https://ollama.ai/library and filter based on multimodal support.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1772/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1772/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8194
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8194/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8194/comments
|
https://api.github.com/repos/ollama/ollama/issues/8194/events
|
https://github.com/ollama/ollama/pull/8194
| 2,753,677,449
|
PR_kwDOJ0Z1Ps6F9uvT
| 8,194
|
Mxyng/next llama
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-21T01:00:23
| 2025-01-10T19:30:24
| 2025-01-10T19:30:24
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8194",
"html_url": "https://github.com/ollama/ollama/pull/8194",
"diff_url": "https://github.com/ollama/ollama/pull/8194.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8194.patch",
"merged_at": "2025-01-10T19:30:24"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8194/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8194/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6673
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6673/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6673/comments
|
https://api.github.com/repos/ollama/ollama/issues/6673/events
|
https://github.com/ollama/ollama/issues/6673
| 2,509,963,694
|
I_kwDOJ0Z1Ps6VmwGu
| 6,673
|
Ollama-rocm on Kubernetes with shared AMD GPU seems to have problems allocating vram
|
{
"login": "kubax",
"id": 1083100,
"node_id": "MDQ6VXNlcjEwODMxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1083100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kubax",
"html_url": "https://github.com/kubax",
"followers_url": "https://api.github.com/users/kubax/followers",
"following_url": "https://api.github.com/users/kubax/following{/other_user}",
"gists_url": "https://api.github.com/users/kubax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kubax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kubax/subscriptions",
"organizations_url": "https://api.github.com/users/kubax/orgs",
"repos_url": "https://api.github.com/users/kubax/repos",
"events_url": "https://api.github.com/users/kubax/events{/privacy}",
"received_events_url": "https://api.github.com/users/kubax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-09-06T09:19:56
| 2024-09-06T10:58:54
| 2024-09-06T10:58:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
I'm pretty new to Ollama, and recently replaced my RX580 with an RX7600 to be able to use Ollama in Kubernetes with ROCm.
When I run Ollama on Arch directly with ROCm support, everything works great and is really snappy.
But when I run Ollama through Kubernetes with the AMD plugin to share GPUs, it seems unable to allocate VRAM.
Here is my Kubernetes deployment:
```
apiVersion: apps/v1
kind: Deployment
metadata:
labels:
app: ollama
name: ollama
namespace: ollama
spec:
replicas: 1
selector:
matchLabels:
app: ollama
strategy:
type: Recreate
template:
metadata:
labels:
app: ollama
spec:
containers:
- env:
- name: OLLAMA_HOST
value: 0.0.0.0
- name: OLLAMA_KEEP_ALIVE
value: 24h
- name: HSA_OVERRIDE_GFX_VERSION
value: "10.3.0"
- name: AMD_SERIALIZE_KERNEL
value: "3"
- name: OLLAMA_LLM_LIBRARY
value: "rocm_v60002"
# image: ollama/ollama:latest
image: ollama/ollama:rocm
# image: bergutman/ollama-rocm
name: ollama
ports:
- containerPort: 11434
hostPort: 7869
protocol: TCP
tty: true
volumeMounts:
- mountPath: /code
name: ollama-claim0
- mountPath: /root/.ollama
name: ollama-claim1
resources:
limits:
amd.com/gpu: 1 # requesting a GPU
restartPolicy: Always
volumes:
- name: ollama-claim0
persistentVolumeClaim:
claimName: ollama-claim0
- name: ollama-claim1
persistentVolumeClaim:
claimName: ollama-claim1
```
When Ollama tries to load the model into VRAM, nvtop reports 0 bytes of VRAM usage and Ollama crashes with a segfault.
Here are the last log lines, which seem interesting to me.
```
time=2024-09-06T10:51:05.040+02:00 level=INFO source=server.go:625 msg="waiting for server to become available" status="llm server loading model"
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 ROCm devices:
Device 0: AMD Radeon™ RX 7600 XT, compute capability 10.3, VMM: no
llm_load_tensors: ggml ctx size = 0.27 MiB
time=2024-09-06T10:51:46.399+02:00 level=INFO source=server.go:625 msg="waiting for server to become available" status="llm server not responding"
time=2024-09-06T10:51:47.553+02:00 level=ERROR source=sched.go:456 msg="error loading llama server" error="llama runner process has terminated: signal: segmentation fault (core dumped)"
[GIN] 2024/09/06 - 10:51:47 | 500 | 43.035446402s | 127.0.0.1 | POST "/api/chat"
time=2024-09-06T10:51:52.553+02:00 level=WARN source=sched.go:647 msg="gpu VRAM usage didn't recover within timeout" seconds=5.000619052 model=/root/.ollama/models/blobs/sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246
time=2024-09-06T10:51:52.803+02:00 level=WARN source=sched.go:647 msg="gpu VRAM usage didn't recover within timeout" seconds=5.250422078 model=/root/.ollama/models/blobs/sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246
time=2024-09-06T10:51:53.054+02:00 level=WARN source=sched.go:647 msg="gpu VRAM usage didn't recover within timeout" seconds=5.500667507 model=/root/.ollama/models/blobs/sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246
[GIN] 2024/09/06 - 11:13:11 | 200 | 46.378µs | 127.0.0.1 | GET "/api/version"
```
Is there something I can try to fix this?
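One thing I'm considering as a workaround is bypassing the device plugin and mounting the ROCm device nodes directly into the pod, the same way the Ollama Docker instructions pass `--device /dev/kfd --device /dev/dri`. A sketch of what that might look like in the deployment (illustrative only — whether `privileged` or a narrower `securityContext` is needed depends on the cluster):

```yaml
# Illustrative workaround: expose the ROCm device nodes via hostPath
# instead of relying on the AMD device plugin's amd.com/gpu resource.
spec:
  containers:
    - name: ollama
      securityContext:
        privileged: true   # broad; a tighter device-access policy may suffice
      volumeMounts:
        - mountPath: /dev/kfd
          name: kfd
        - mountPath: /dev/dri
          name: dri
  volumes:
    - name: kfd
      hostPath:
        path: /dev/kfd
    - name: dri
      hostPath:
        path: /dev/dri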
### OS
Linux, Docker
### GPU
AMD
### CPU
AMD
### Ollama version
0.3.9
|
{
"login": "kubax",
"id": 1083100,
"node_id": "MDQ6VXNlcjEwODMxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1083100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kubax",
"html_url": "https://github.com/kubax",
"followers_url": "https://api.github.com/users/kubax/followers",
"following_url": "https://api.github.com/users/kubax/following{/other_user}",
"gists_url": "https://api.github.com/users/kubax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kubax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kubax/subscriptions",
"organizations_url": "https://api.github.com/users/kubax/orgs",
"repos_url": "https://api.github.com/users/kubax/repos",
"events_url": "https://api.github.com/users/kubax/events{/privacy}",
"received_events_url": "https://api.github.com/users/kubax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6673/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6673/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1697
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1697/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1697/comments
|
https://api.github.com/repos/ollama/ollama/issues/1697/events
|
https://github.com/ollama/ollama/pull/1697
| 2,055,165,058
|
PR_kwDOJ0Z1Ps5ityR_
| 1,697
|
Add windows native build instructions
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-24T17:05:29
| 2024-01-06T03:34:24
| 2024-01-06T03:34:21
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1697",
"html_url": "https://github.com/ollama/ollama/pull/1697",
"diff_url": "https://github.com/ollama/ollama/pull/1697.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1697.patch",
"merged_at": "2024-01-06T03:34:21"
}
|
Fixes #1694
Note: the resulting native Windows binary isn't particularly user-friendly right now, as it requires setting your PATH deep into the source tree to pick up the dependent DLLs. I'm working on another change that will address this. I'll keep this PR as a draft until that's ready.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1697/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1697/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4570
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4570/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4570/comments
|
https://api.github.com/repos/ollama/ollama/issues/4570/events
|
https://github.com/ollama/ollama/pull/4570
| 2,309,546,545
|
PR_kwDOJ0Z1Ps5wJoQz
| 4,570
|
lint some of the things
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-22T04:31:33
| 2024-06-04T20:27:06
| 2024-06-04T20:27:05
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4570",
"html_url": "https://github.com/ollama/ollama/pull/4570",
"diff_url": "https://github.com/ollama/ollama/pull/4570.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4570.patch",
"merged_at": "2024-06-04T20:27:05"
}
|
now that ollama uses go1.22, `x/exp/slices` can be replaced with the standard library `slices`
this also enables some useful linters:
- intrange is a 1.22 feature which simplifies `for i := 0; i < n; i++ { }` to `for i := range n { }`
- testifylint to find bad testify assertions
- unconvert to find unnecessary type conversions
- ~usestdlibvars to find values that can be replaced with stdlib vars, e.g. `OPTIONS` with `http.MethodOptions`~
- wastedassign to find unnecessary assignments
- whitespace to find unnecessary blank lines
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4570/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4570/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7703
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7703/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7703/comments
|
https://api.github.com/repos/ollama/ollama/issues/7703/events
|
https://github.com/ollama/ollama/issues/7703
| 2,664,794,748
|
I_kwDOJ0Z1Ps6e1Yp8
| 7,703
|
Clarify JSONL as the Returned Format for Streaming JSON Objects
|
{
"login": "gwpl",
"id": 221403,
"node_id": "MDQ6VXNlcjIyMTQwMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/221403?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gwpl",
"html_url": "https://github.com/gwpl",
"followers_url": "https://api.github.com/users/gwpl/followers",
"following_url": "https://api.github.com/users/gwpl/following{/other_user}",
"gists_url": "https://api.github.com/users/gwpl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gwpl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gwpl/subscriptions",
"organizations_url": "https://api.github.com/users/gwpl/orgs",
"repos_url": "https://api.github.com/users/gwpl/repos",
"events_url": "https://api.github.com/users/gwpl/events{/privacy}",
"received_events_url": "https://api.github.com/users/gwpl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
}
] |
open
| false
| null |
[] | null | 3
| 2024-11-16T19:01:43
| 2024-11-20T13:41:59
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
**Current Documentation**:
[API documentation](https://github.com/ollama/ollama/blob/4759d879f2376ffb9b82f296e442ec8ef137f27b/docs/api.md?plain=1#L79) states:
> A stream of JSON objects is returned.
**Proposal**:
Specify the format explicitly as:
> A stream of JSON objects in [JSON Lines (JSONL)] format is returned.
**Reasoning**:
By constraining the format to JSONL (i.e., each JSON object is serialized on a separate line as per [jsonlines.org](https://jsonlines.org/)), parsing implementations become simpler. Instead of relying on complex JSON parsing to determine object boundaries—especially when JSON content may include brackets—developers can leverage line-buffered iterators to process incoming chunks more efficiently.
This clarification aligns the documentation with common parsing practices for streaming JSON data and reduces potential ambiguity.
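To illustrate the simplification: once the format is pinned to JSONL, a client only needs a line-buffered loop and `json.loads`, with no bracket-aware incremental parsing. A minimal sketch, assuming chunk boundaries fall on newlines (the sample chunks below mimic what `/api/generate` streams, but the exact field values are illustrative):

```python
import json

def parse_jsonl_stream(lines):
    """Yield one decoded object per non-empty line of a JSONL stream."""
    for line in lines:
        line = line.strip()
        if line:  # tolerate blank keep-alive lines, if any
            yield json.loads(line)

# Example input: two streamed response chunks, one JSON object per line
raw = '{"response": "Hel", "done": false}\n{"response": "lo", "done": true}\n'
chunks = list(parse_jsonl_stream(raw.splitlines()))
```

The same loop works unchanged over an HTTP response iterated line by line, which is exactly why pinning the format to JSONL in the docs is useful.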
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7703/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7703/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5890
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5890/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5890/comments
|
https://api.github.com/repos/ollama/ollama/issues/5890/events
|
https://github.com/ollama/ollama/issues/5890
| 2,426,109,102
|
I_kwDOJ0Z1Ps6Qm3yu
| 5,890
|
Assistant doesn't continue from its last message on 0.2.8
|
{
"login": "josegtmonteiro",
"id": 169712316,
"node_id": "U_kgDOCh2avA",
"avatar_url": "https://avatars.githubusercontent.com/u/169712316?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/josegtmonteiro",
"html_url": "https://github.com/josegtmonteiro",
"followers_url": "https://api.github.com/users/josegtmonteiro/followers",
"following_url": "https://api.github.com/users/josegtmonteiro/following{/other_user}",
"gists_url": "https://api.github.com/users/josegtmonteiro/gists{/gist_id}",
"starred_url": "https://api.github.com/users/josegtmonteiro/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/josegtmonteiro/subscriptions",
"organizations_url": "https://api.github.com/users/josegtmonteiro/orgs",
"repos_url": "https://api.github.com/users/josegtmonteiro/repos",
"events_url": "https://api.github.com/users/josegtmonteiro/events{/privacy}",
"received_events_url": "https://api.github.com/users/josegtmonteiro/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 8
| 2024-07-23T20:53:56
| 2024-07-25T01:14:29
| 2024-07-25T01:14:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
@jmorganca , thanks for the quick fix on the https://github.com/ollama/ollama/issues/5775
However, testing here with 0.2.8, I'm still not able to continue the message.
With the same example I mentioned before, using OLLAMA_DEBUG I'm able to see the final prompt on the console; it is:
prompt="<|start_header_id|>system<|end_header_id|>\n\nYou are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>\n\nHello, how are you today?<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\nThanks for asking! I'm <|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
The messages I'm passing to the chat endpoint are:
history = [ {"role": "system", "content": 'You are a helpful assistant.'}, {"role": "user", "content": "Hello, how are you today?"}, {"role": "assistant", "content": "Thanks for asking! I'm "}, ]
Not sure if it makes any difference, but I'm testing with the "llama3-groq-tool-use:8b-q8_0" model. Was the fix made only for a specific model, or should it apply to all of them?
Please let me know if more info/tests are needed from my side.
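Restated as a runnable sketch of the request body I'm sending (payload only, with the HTTP call omitted so the structure is clear; the model name and message contents are the same as in my report above):

```python
import json

# The messages list ends with an assistant message; my expectation is that
# the model continues this text rather than starting a fresh assistant turn.
payload = {
    "model": "llama3-groq-tool-use:8b-q8_0",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello, how are you today?"},
        {"role": "assistant", "content": "Thanks for asking! I'm "},
    ],
}
body = json.dumps(payload)  # POSTed to /api/chat
```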
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.2.8
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5890/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5890/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7468
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7468/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7468/comments
|
https://api.github.com/repos/ollama/ollama/issues/7468/events
|
https://github.com/ollama/ollama/pull/7468
| 2,630,254,140
|
PR_kwDOJ0Z1Ps6AsIEs
| 7,468
|
Add a command to clear the screen
|
{
"login": "cootshk",
"id": 83678457,
"node_id": "MDQ6VXNlcjgzNjc4NDU3",
"avatar_url": "https://avatars.githubusercontent.com/u/83678457?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cootshk",
"html_url": "https://github.com/cootshk",
"followers_url": "https://api.github.com/users/cootshk/followers",
"following_url": "https://api.github.com/users/cootshk/following{/other_user}",
"gists_url": "https://api.github.com/users/cootshk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cootshk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cootshk/subscriptions",
"organizations_url": "https://api.github.com/users/cootshk/orgs",
"repos_url": "https://api.github.com/users/cootshk/repos",
"events_url": "https://api.github.com/users/cootshk/events{/privacy}",
"received_events_url": "https://api.github.com/users/cootshk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-11-02T06:18:36
| 2024-11-12T00:43:40
| 2024-11-12T00:43:40
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7468",
"html_url": "https://github.com/ollama/ollama/pull/7468",
"diff_url": "https://github.com/ollama/ollama/pull/7468.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7468.patch",
"merged_at": null
}
|
`/clearscreen` clears the screen, kind of like Ctrl+L.
I use [ANSI escape codes](https://gist.github.com/fnky/458719343aabd01cfb17a3a4f7296797) because I don't want to deal with the buffer.
I made this because I have Ctrl+L bound to something else and wanted a quick slash command similar to `/clear`.
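For reference, the sequence involved is the standard "erase display" control followed by "cursor home" — sketched here outside the Go codebase just to show the bytes a terminal interprets:

```python
# ED (erase in display, mode 2 = whole screen) followed by
# CUP with no arguments (move cursor to row 1, column 1)
CLEAR_SCREEN = "\x1b[2J"
CURSOR_HOME = "\x1b[H"

def clear_sequence():
    """Return the escape sequence a terminal reads as 'clear and go home'."""
    return CLEAR_SCREEN + CURSOR_HOME
```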
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7468/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7468/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5564
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5564/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5564/comments
|
https://api.github.com/repos/ollama/ollama/issues/5564/events
|
https://github.com/ollama/ollama/issues/5564
| 2,397,372,084
|
I_kwDOJ0Z1Ps6O5P60
| 5,564
|
token/second after ollama finish request
|
{
"login": "zinwelzl",
"id": 113045180,
"node_id": "U_kgDOBrzuvA",
"avatar_url": "https://avatars.githubusercontent.com/u/113045180?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zinwelzl",
"html_url": "https://github.com/zinwelzl",
"followers_url": "https://api.github.com/users/zinwelzl/followers",
"following_url": "https://api.github.com/users/zinwelzl/following{/other_user}",
"gists_url": "https://api.github.com/users/zinwelzl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zinwelzl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zinwelzl/subscriptions",
"organizations_url": "https://api.github.com/users/zinwelzl/orgs",
"repos_url": "https://api.github.com/users/zinwelzl/repos",
"events_url": "https://api.github.com/users/zinwelzl/events{/privacy}",
"received_events_url": "https://api.github.com/users/zinwelzl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-07-09T07:25:31
| 2024-07-09T07:29:10
| 2024-07-09T07:29:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Can you add a tokens/second figure at the end, after Ollama finishes a request?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5564/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5564/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6208
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6208/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6208/comments
|
https://api.github.com/repos/ollama/ollama/issues/6208/events
|
https://github.com/ollama/ollama/pull/6208
| 2,451,472,956
|
PR_kwDOJ0Z1Ps53mm3r
| 6,208
|
update llama.cpp submodule to `1e6f6554`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-08-06T18:35:14
| 2024-08-06T19:11:47
| 2024-08-06T19:11:46
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6208",
"html_url": "https://github.com/ollama/ollama/pull/6208",
"diff_url": "https://github.com/ollama/ollama/pull/6208.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6208.patch",
"merged_at": "2024-08-06T19:11:45"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6208/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6208/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6625
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6625/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6625/comments
|
https://api.github.com/repos/ollama/ollama/issues/6625/events
|
https://github.com/ollama/ollama/issues/6625
| 2,504,427,710
|
I_kwDOJ0Z1Ps6VRoi-
| 6,625
|
Support for HuatuoGPT-Vision-7B
|
{
"login": "Chuyun-Shen",
"id": 59833738,
"node_id": "MDQ6VXNlcjU5ODMzNzM4",
"avatar_url": "https://avatars.githubusercontent.com/u/59833738?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Chuyun-Shen",
"html_url": "https://github.com/Chuyun-Shen",
"followers_url": "https://api.github.com/users/Chuyun-Shen/followers",
"following_url": "https://api.github.com/users/Chuyun-Shen/following{/other_user}",
"gists_url": "https://api.github.com/users/Chuyun-Shen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Chuyun-Shen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Chuyun-Shen/subscriptions",
"organizations_url": "https://api.github.com/users/Chuyun-Shen/orgs",
"repos_url": "https://api.github.com/users/Chuyun-Shen/repos",
"events_url": "https://api.github.com/users/Chuyun-Shen/events{/privacy}",
"received_events_url": "https://api.github.com/users/Chuyun-Shen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-09-04T06:32:21
| 2024-09-04T06:32:21
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Can you support the HuatuoGPT-Vision-7B model, or do you have any advice on how I can deploy it on a GPU?
Model: [FreedomIntelligence/HuatuoGPT-Vision-7B](https://huggingface.co/FreedomIntelligence/HuatuoGPT-Vision-7B)
This model is built with Llava and Qwen2, and their CLI code is here: https://github.com/FreedomIntelligence/HuatuoGPT-Vision/blob/main/cli.py
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6625/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6625/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2607
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2607/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2607/comments
|
https://api.github.com/repos/ollama/ollama/issues/2607/events
|
https://github.com/ollama/ollama/issues/2607
| 2,143,655,528
|
I_kwDOJ0Z1Ps5_xZZo
| 2,607
|
Does not work on Mac? Causing System Crashes building and running
|
{
"login": "kuro337",
"id": 65412787,
"node_id": "MDQ6VXNlcjY1NDEyNzg3",
"avatar_url": "https://avatars.githubusercontent.com/u/65412787?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kuro337",
"html_url": "https://github.com/kuro337",
"followers_url": "https://api.github.com/users/kuro337/followers",
"following_url": "https://api.github.com/users/kuro337/following{/other_user}",
"gists_url": "https://api.github.com/users/kuro337/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kuro337/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kuro337/subscriptions",
"organizations_url": "https://api.github.com/users/kuro337/orgs",
"repos_url": "https://api.github.com/users/kuro337/repos",
"events_url": "https://api.github.com/users/kuro337/events{/privacy}",
"received_events_url": "https://api.github.com/users/kuro337/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-02-20T06:47:31
| 2024-03-12T21:32:06
| 2024-03-12T21:32:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is Ollama not meant to be run on ARM macs?
I followed these steps
```bash
git clone git@github.com:ollama/ollama.git
cd ollama
go generate ./...
go build .
./ollama
# First time running
[1] 1651 killed ./ollama
# After running again
./ollama
# hangs indefinitely
```
Then it hangs indefinitely; I am not able to terminate it, and even using `kill` does not work
```bash
./ollama
^C^C^C^C
# or any combination of cancels/sigterms
```
Deleting it for now, will try to run on my Ubuntu with some clarification
Is this the way to run and serve a Model over HTTP?
```bash
# steps to run the REST API?
./ollama serve
./ollama run mixtral:8x7b-instruct-v0.1-q5_1
curl http://localhost:11434/api/generate -d '{
"model": "mixtral",
"messages": [
{ "role": "system", "content": "Explain using Async in Scala?" }
]
}'
```
Thank you, I would appreciate any pointers.
I have the latest version of Go, running on a MacBook with 128 GB of memory.
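For reference, Ollama's REST API separates the two request shapes: `/api/generate` takes a `prompt` string, while `/api/chat` takes a `messages` list, and the curl example above sends `messages` to `/api/generate`. A minimal sketch of building each payload; the helper names are illustrative, not part of Ollama:

```python
import json

# Hypothetical helpers: build request bodies for Ollama's two
# generation endpoints. Per the Ollama API docs, /api/generate
# expects a "prompt" string and /api/chat expects a "messages" list.
def generate_payload(model: str, prompt: str) -> dict:
    return {"model": model, "prompt": prompt}

def chat_payload(model: str, messages: list) -> dict:
    return {"model": model, "messages": messages}

body = chat_payload(
    "mixtral:8x7b-instruct-v0.1-q5_1",
    [{"role": "user", "content": "Explain using Async in Scala?"}])
print(json.dumps(body))
```

Either payload would then be POSTed to `http://localhost:11434/api/chat` or `/api/generate` respectively.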
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2607/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2607/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7901
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7901/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7901/comments
|
https://api.github.com/repos/ollama/ollama/issues/7901/events
|
https://github.com/ollama/ollama/issues/7901
| 2,708,360,256
|
I_kwDOJ0Z1Ps6hbkxA
| 7,901
|
Error: max retries exceeded: unexpected EOF
|
{
"login": "szzhh",
"id": 78521539,
"node_id": "MDQ6VXNlcjc4NTIxNTM5",
"avatar_url": "https://avatars.githubusercontent.com/u/78521539?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/szzhh",
"html_url": "https://github.com/szzhh",
"followers_url": "https://api.github.com/users/szzhh/followers",
"following_url": "https://api.github.com/users/szzhh/following{/other_user}",
"gists_url": "https://api.github.com/users/szzhh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/szzhh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/szzhh/subscriptions",
"organizations_url": "https://api.github.com/users/szzhh/orgs",
"repos_url": "https://api.github.com/users/szzhh/repos",
"events_url": "https://api.github.com/users/szzhh/events{/privacy}",
"received_events_url": "https://api.github.com/users/szzhh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-12-01T03:04:43
| 2024-12-02T11:42:56
| 2024-12-02T11:42:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I pull the llama3.1:405b model, 'Error: max retries exceeded: unexpected EOF' often appears. Usually I can resume the download instead of re-downloading, but yesterday, after downloading more than 200 GB of the model, the error forced me to start the pull from scratch, and the model's ID and size had also changed. I have encountered this problem before, as shown in the screenshots below. Why does this happen, and is there any way to resume the download?



### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.4.6
|
{
"login": "szzhh",
"id": 78521539,
"node_id": "MDQ6VXNlcjc4NTIxNTM5",
"avatar_url": "https://avatars.githubusercontent.com/u/78521539?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/szzhh",
"html_url": "https://github.com/szzhh",
"followers_url": "https://api.github.com/users/szzhh/followers",
"following_url": "https://api.github.com/users/szzhh/following{/other_user}",
"gists_url": "https://api.github.com/users/szzhh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/szzhh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/szzhh/subscriptions",
"organizations_url": "https://api.github.com/users/szzhh/orgs",
"repos_url": "https://api.github.com/users/szzhh/repos",
"events_url": "https://api.github.com/users/szzhh/events{/privacy}",
"received_events_url": "https://api.github.com/users/szzhh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7901/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7901/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1934
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1934/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1934/comments
|
https://api.github.com/repos/ollama/ollama/issues/1934/events
|
https://github.com/ollama/ollama/pull/1934
| 2,077,702,149
|
PR_kwDOJ0Z1Ps5j3UZt
| 1,934
|
fix build and lint
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-11T22:20:28
| 2024-01-11T22:36:21
| 2024-01-11T22:36:21
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1934",
"html_url": "https://github.com/ollama/ollama/pull/1934",
"diff_url": "https://github.com/ollama/ollama/pull/1934.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1934.patch",
"merged_at": "2024-01-11T22:36:21"
}
|
x/exp/slices is compatible with Go 1.20 while the standard library `slices` package (added in Go 1.21) is not.
Also fix llm/llm.go where `fmt` is used but not imported.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1934/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1934/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3797
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3797/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3797/comments
|
https://api.github.com/repos/ollama/ollama/issues/3797/events
|
https://github.com/ollama/ollama/issues/3797
| 2,255,066,913
|
I_kwDOJ0Z1Ps6GaZch
| 3,797
|
Hope to add more embedding models
|
{
"login": "zhangzhongpeng02",
"id": 130722043,
"node_id": "U_kgDOB8qo-w",
"avatar_url": "https://avatars.githubusercontent.com/u/130722043?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhangzhongpeng02",
"html_url": "https://github.com/zhangzhongpeng02",
"followers_url": "https://api.github.com/users/zhangzhongpeng02/followers",
"following_url": "https://api.github.com/users/zhangzhongpeng02/following{/other_user}",
"gists_url": "https://api.github.com/users/zhangzhongpeng02/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhangzhongpeng02/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhangzhongpeng02/subscriptions",
"organizations_url": "https://api.github.com/users/zhangzhongpeng02/orgs",
"repos_url": "https://api.github.com/users/zhangzhongpeng02/repos",
"events_url": "https://api.github.com/users/zhangzhongpeng02/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhangzhongpeng02/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-04-21T13:11:37
| 2024-06-04T02:28:57
| 2024-06-04T02:28:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I hope to add more embedding models, such as "bge-large-zh-v1.5", "bce-embedding-base", and "gte-large".
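If and when such models are published, Ollama's `/api/embeddings` endpoint takes a model name and a prompt. A minimal payload sketch; the model tag here mirrors the request and may not exist in the Ollama library:

```python
import json

# Build a request body for Ollama's /api/embeddings endpoint.
# The model tag is illustrative (taken from the request above).
def embeddings_payload(model: str, prompt: str) -> dict:
    return {"model": model, "prompt": prompt}

print(json.dumps(embeddings_payload("bge-large-zh-v1.5", "hello world")))
```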
|
{
"login": "zhangzhongpeng02",
"id": 130722043,
"node_id": "U_kgDOB8qo-w",
"avatar_url": "https://avatars.githubusercontent.com/u/130722043?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhangzhongpeng02",
"html_url": "https://github.com/zhangzhongpeng02",
"followers_url": "https://api.github.com/users/zhangzhongpeng02/followers",
"following_url": "https://api.github.com/users/zhangzhongpeng02/following{/other_user}",
"gists_url": "https://api.github.com/users/zhangzhongpeng02/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhangzhongpeng02/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhangzhongpeng02/subscriptions",
"organizations_url": "https://api.github.com/users/zhangzhongpeng02/orgs",
"repos_url": "https://api.github.com/users/zhangzhongpeng02/repos",
"events_url": "https://api.github.com/users/zhangzhongpeng02/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhangzhongpeng02/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3797/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3797/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8259
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8259/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8259/comments
|
https://api.github.com/repos/ollama/ollama/issues/8259/events
|
https://github.com/ollama/ollama/pull/8259
| 2,761,205,050
|
PR_kwDOJ0Z1Ps6GVI1G
| 8,259
|
create a default, non-root user for the container image
|
{
"login": "chgl",
"id": 5307555,
"node_id": "MDQ6VXNlcjUzMDc1NTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/5307555?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chgl",
"html_url": "https://github.com/chgl",
"followers_url": "https://api.github.com/users/chgl/followers",
"following_url": "https://api.github.com/users/chgl/following{/other_user}",
"gists_url": "https://api.github.com/users/chgl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chgl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chgl/subscriptions",
"organizations_url": "https://api.github.com/users/chgl/orgs",
"repos_url": "https://api.github.com/users/chgl/repos",
"events_url": "https://api.github.com/users/chgl/events{/privacy}",
"received_events_url": "https://api.github.com/users/chgl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 6
| 2024-12-27T19:24:33
| 2025-01-20T15:56:52
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8259",
"html_url": "https://github.com/ollama/ollama/pull/8259",
"diff_url": "https://github.com/ollama/ollama/pull/8259.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8259.patch",
"merged_at": null
}
|
Closes #5986
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8259/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8259/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8677
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8677/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8677/comments
|
https://api.github.com/repos/ollama/ollama/issues/8677/events
|
https://github.com/ollama/ollama/issues/8677
| 2,819,603,374
|
I_kwDOJ0Z1Ps6oD7uu
| 8,677
|
Wrote scripts to import gguf files/folder
|
{
"login": "gl2007",
"id": 4097227,
"node_id": "MDQ6VXNlcjQwOTcyMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/4097227?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gl2007",
"html_url": "https://github.com/gl2007",
"followers_url": "https://api.github.com/users/gl2007/followers",
"following_url": "https://api.github.com/users/gl2007/following{/other_user}",
"gists_url": "https://api.github.com/users/gl2007/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gl2007/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gl2007/subscriptions",
"organizations_url": "https://api.github.com/users/gl2007/orgs",
"repos_url": "https://api.github.com/users/gl2007/repos",
"events_url": "https://api.github.com/users/gl2007/events{/privacy}",
"received_events_url": "https://api.github.com/users/gl2007/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2025-01-30T00:09:02
| 2025-01-30T00:09:02
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I don't see a "Discussions" tab like I see for other repos, so I am just creating an issue.
I had a bunch of GGUFs in a folder, so I wrote two scripts (Windows and shell) to import a single GGUF and all GGUFs in a given folder.
I don't know how to get a PR in, but I can attach them here if any of you think they are useful.
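For context, stock Ollama can already import a local GGUF through a Modelfile (`FROM <path>` plus `ollama create`). A hedged Python sketch of what such a folder-import script can do; names and paths are illustrative:

```python
import pathlib
import subprocess

def modelfile_text(gguf: pathlib.Path) -> str:
    # A one-line Modelfile pointing at the local GGUF file.
    return f"FROM {gguf.resolve()}\n"

def import_folder(folder: str) -> None:
    # For each .gguf, write a Modelfile next to it and register the
    # model under the file's stem via `ollama create`.
    for gguf in pathlib.Path(folder).glob("*.gguf"):
        mf = gguf.with_suffix(".Modelfile")
        mf.write_text(modelfile_text(gguf))
        subprocess.run(["ollama", "create", gguf.stem, "-f", str(mf)])

# Example usage (requires a local ollama install):
# import_folder("/path/to/ggufs")
```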
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8677/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8677/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5796
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5796/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5796/comments
|
https://api.github.com/repos/ollama/ollama/issues/5796/events
|
https://github.com/ollama/ollama/issues/5796
| 2,419,278,016
|
I_kwDOJ0Z1Ps6QM0DA
| 5,796
|
Streaming for tool calls is unsupported
|
{
"login": "vertrue",
"id": 30557724,
"node_id": "MDQ6VXNlcjMwNTU3NzI0",
"avatar_url": "https://avatars.githubusercontent.com/u/30557724?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vertrue",
"html_url": "https://github.com/vertrue",
"followers_url": "https://api.github.com/users/vertrue/followers",
"following_url": "https://api.github.com/users/vertrue/following{/other_user}",
"gists_url": "https://api.github.com/users/vertrue/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vertrue/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vertrue/subscriptions",
"organizations_url": "https://api.github.com/users/vertrue/orgs",
"repos_url": "https://api.github.com/users/vertrue/repos",
"events_url": "https://api.github.com/users/vertrue/events{/privacy}",
"received_events_url": "https://api.github.com/users/vertrue/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 38
| 2024-07-19T16:09:55
| 2024-11-30T06:41:17
| 2024-11-28T02:40:46
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi everyone!
I am trying to use tools in requests to `llama3-groq-tool-use:70b`. Here is simple code in Python using langchain==0.2.9:
```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langchain.prompts import (
ChatPromptTemplate,
MessagesPlaceholder,
)
from langchain.agents import AgentExecutor, create_openai_tools_agent
@tool
def function_1(a: int, b: int) -> int:
"""uses function function_1 for arguments a and b."""
return a % b + 2
@tool
def function_2(a: int, b: int) -> int:
"""uses function function_2 for arguments a and b."""
return a * b + 1
tools = [function_1, function_2]
llm = ChatOpenAI(
model="llama3-groq-tool-use:70b",
temperature=0,
)
default_prompt = ChatPromptTemplate.from_messages(
[
("system", "You are a helpful AI assistant."),
MessagesPlaceholder("chat_history", optional=True),
("human", "{input}"),
MessagesPlaceholder("agent_scratchpad"),
]
)
input_message = "What is function_1(10, 11)? Also what is function_2(10, 11)?"
agent = create_openai_tools_agent(
llm=llm,
tools=tools,
prompt=default_prompt
)
agent_executor = AgentExecutor(
agent=agent,
tools=tools,
verbose=True,
return_intermediate_steps=False,
)
res = agent_executor.invoke({"input": input_message})
print(res)
```
The result is the following:
```
> Entering new AgentExecutor chain...
<tool_call>
{"id": 0, "name": "function_1", "arguments": {"a": 10, "b": 11}}
</tool_call>
<tool_call>
{"id": 1, "name": "function_2", "arguments": {"a": 10, "b": 11}}
</tool_call>
> Finished chain.
{
'input': 'What is function_1(10, 11)? Also what is function_2(10, 11)?',
'output': '<tool_call>\n{"id": 0, "name": "function_1", "arguments": {"a": 10, "b": 11}}\n</tool_call>\n<tool_call>\n{"id": 1, "name": "function_2", "arguments": {"a": 10, "b": 11}}\n</tool_call>'
}
```
If I use `langchain_community.chat_models.ollama.ChatOllama`, it outputs the same.
But if I use the same model (`llama3-groq-70b-8192-tool-use-preview`) with Groq's OpenAI-compatible API, it uses the tools and invokes the functions; output below:
```
> Entering new AgentExecutor chain...
Invoking: `function_1` with `{'a': 10, 'b': 11}`
12
Invoking: `function_2` with `{'a': 10, 'b': 11}`
111The result of function_1(10, 11) is 12, and the result of function_2(10, 11) is 111.
> Finished chain.
{
'input': 'What is function_1(10, 11)? Also what is function_2(10, 11)?',
'output': 'The result of function_1(10, 11) is 12, and the result of function_2(10, 11) is 111.'
}
```
Is this expected behaviour, or is support for this still in progress?
Many thanks
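For reference, Ollama's native tool-calling request shape is `/api/chat` with a `tools` array; at the time of this report, tool calls required `"stream": false`, which is why streaming clients like the LangChain agent above fell back to raw text. A minimal payload sketch (helper names are illustrative):

```python
import json

def tool_spec(name: str, description: str, parameters: dict) -> dict:
    # OpenAI-style function tool description accepted by /api/chat.
    return {"type": "function",
            "function": {"name": name,
                         "description": description,
                         "parameters": parameters}}

def chat_with_tools(model: str, messages: list, tools: list) -> dict:
    # Streaming is disabled because tool calls did not stream yet.
    return {"model": model, "messages": messages,
            "tools": tools, "stream": False}

payload = chat_with_tools(
    "llama3-groq-tool-use:70b",
    [{"role": "user", "content": "What is function_1(10, 11)?"}],
    [tool_spec("function_1",
               "uses function function_1 for arguments a and b.",
               {"type": "object",
                "properties": {"a": {"type": "integer"},
                               "b": {"type": "integer"}},
                "required": ["a", "b"]})])
print(json.dumps(payload)[:80])
```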
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.2.7
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5796/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5796/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7661
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7661/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7661/comments
|
https://api.github.com/repos/ollama/ollama/issues/7661/events
|
https://github.com/ollama/ollama/issues/7661
| 2,657,344,510
|
I_kwDOJ0Z1Ps6eY9v-
| 7,661
|
How does Ollama support long text input?
|
{
"login": "smileyboy2019",
"id": 59221294,
"node_id": "MDQ6VXNlcjU5MjIxMjk0",
"avatar_url": "https://avatars.githubusercontent.com/u/59221294?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/smileyboy2019",
"html_url": "https://github.com/smileyboy2019",
"followers_url": "https://api.github.com/users/smileyboy2019/followers",
"following_url": "https://api.github.com/users/smileyboy2019/following{/other_user}",
"gists_url": "https://api.github.com/users/smileyboy2019/gists{/gist_id}",
"starred_url": "https://api.github.com/users/smileyboy2019/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/smileyboy2019/subscriptions",
"organizations_url": "https://api.github.com/users/smileyboy2019/orgs",
"repos_url": "https://api.github.com/users/smileyboy2019/repos",
"events_url": "https://api.github.com/users/smileyboy2019/events{/privacy}",
"received_events_url": "https://api.github.com/users/smileyboy2019/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-14T02:29:40
| 2024-11-14T22:54:18
| 2024-11-14T22:54:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The large model supports 128k input, which is equivalent to hundreds of thousands of words. What is the maximum input length that the Ollama endpoint can accept? Can I input hundreds of thousands of words?
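For reference, the limit is measured in tokens, not words, and is controlled per request via `options.num_ctx`; Ollama's default context is much smaller than a model's 128k maximum, so it must be raised explicitly. A sketch of such a request body (model tag is illustrative):

```python
import json

# Build a /api/generate body that raises the context window.
# 131072 tokens corresponds to a 128k-context model; the model
# itself must support a window this large.
def generate_payload(model: str, prompt: str, num_ctx: int) -> dict:
    return {"model": model, "prompt": prompt,
            "options": {"num_ctx": num_ctx}}

print(json.dumps(generate_payload("llama3.1", "long document ...", 131072)))
```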
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7661/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7661/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/768
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/768/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/768/comments
|
https://api.github.com/repos/ollama/ollama/issues/768/events
|
https://github.com/ollama/ollama/pull/768
| 1,940,368,166
|
PR_kwDOJ0Z1Ps5cp4TZ
| 768
|
fix memory check
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-12T16:34:57
| 2023-10-16T19:42:42
| 2023-10-16T19:42:41
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/768",
"html_url": "https://github.com/ollama/ollama/pull/768",
"diff_url": "https://github.com/ollama/ollama/pull/768.diff",
"patch_url": "https://github.com/ollama/ollama/pull/768.patch",
"merged_at": "2023-10-16T19:42:41"
}
|
Only do a system memory check on macOS, which has unified memory. On other platforms, rely on VRAM offloading.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/768/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/768/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/982
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/982/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/982/comments
|
https://api.github.com/repos/ollama/ollama/issues/982/events
|
https://github.com/ollama/ollama/pull/982
| 1,975,233,953
|
PR_kwDOJ0Z1Ps5efF58
| 982
|
Set `NumKeep` to `4` by default
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-03T00:14:31
| 2023-11-03T00:26:12
| 2023-11-03T00:26:12
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/982",
"html_url": "https://github.com/ollama/ollama/pull/982",
"diff_url": "https://github.com/ollama/ollama/pull/982.diff",
"patch_url": "https://github.com/ollama/ollama/pull/982.patch",
"merged_at": "2023-11-03T00:26:11"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/982/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/982/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1738
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1738/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1738/comments
|
https://api.github.com/repos/ollama/ollama/issues/1738/events
|
https://github.com/ollama/ollama/issues/1738
| 2,060,387,918
|
I_kwDOJ0Z1Ps56zwZO
| 1,738
|
Scope of Ollama,
|
{
"login": "Luxadevi",
"id": 116653852,
"node_id": "U_kgDOBvP_HA",
"avatar_url": "https://avatars.githubusercontent.com/u/116653852?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Luxadevi",
"html_url": "https://github.com/Luxadevi",
"followers_url": "https://api.github.com/users/Luxadevi/followers",
"following_url": "https://api.github.com/users/Luxadevi/following{/other_user}",
"gists_url": "https://api.github.com/users/Luxadevi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Luxadevi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Luxadevi/subscriptions",
"organizations_url": "https://api.github.com/users/Luxadevi/orgs",
"repos_url": "https://api.github.com/users/Luxadevi/repos",
"events_url": "https://api.github.com/users/Luxadevi/events{/privacy}",
"received_events_url": "https://api.github.com/users/Luxadevi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 6
| 2023-12-29T20:37:02
| 2024-11-19T17:56:47
| 2024-01-25T22:58:28
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Could you tell me more about the scope of Ollama? You built it around the llama.cpp stack and added an API and other tools on top of that.
GGUF is pretty stable, but there are some other formats on the horizon.
I would like to add EXL2 formatting to my app, but since it is a companion app for Ollama, I was questioning where to add this functionality. It would be pretty easy to just add support for it on my end with some local code.
But to be honest, I wouldn't mind building this within Ollama.
So my question is: would you be interested if I added transformers/c-transformers to Ollama?
I'm very interested in your answer and what your long-term goals are for Ollama.
Happy holidays!
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1738/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1738/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3529
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3529/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3529/comments
|
https://api.github.com/repos/ollama/ollama/issues/3529/events
|
https://github.com/ollama/ollama/pull/3529
| 2,230,010,991
|
PR_kwDOJ0Z1Ps5r8lpI
| 3,529
|
Add metrics endpoint and request metrics
|
{
"login": "amila-ku",
"id": 12775690,
"node_id": "MDQ6VXNlcjEyNzc1Njkw",
"avatar_url": "https://avatars.githubusercontent.com/u/12775690?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amila-ku",
"html_url": "https://github.com/amila-ku",
"followers_url": "https://api.github.com/users/amila-ku/followers",
"following_url": "https://api.github.com/users/amila-ku/following{/other_user}",
"gists_url": "https://api.github.com/users/amila-ku/gists{/gist_id}",
"starred_url": "https://api.github.com/users/amila-ku/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amila-ku/subscriptions",
"organizations_url": "https://api.github.com/users/amila-ku/orgs",
"repos_url": "https://api.github.com/users/amila-ku/repos",
"events_url": "https://api.github.com/users/amila-ku/events{/privacy}",
"received_events_url": "https://api.github.com/users/amila-ku/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2024-04-07T23:39:42
| 2024-09-30T19:52:23
| 2024-09-30T19:52:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3529",
"html_url": "https://github.com/ollama/ollama/pull/3529",
"diff_url": "https://github.com/ollama/ollama/pull/3529.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3529.patch",
"merged_at": null
}
|
Resolves https://github.com/ollama/ollama/issues/3144
This pull request adds a /metrics endpoint and commonly used metrics.
It exposes default Prometheus metrics as well as custom metrics for the request endpoints.
This PR does not try to cover all metrics, to keep it simple. If this looks good, I could add a few more that would be useful.
How to test:
Once the Ollama server is running, pull a model and list it (or perform any other Ollama actions), then:
```
curl http://127.0.0.1:11434/metrics
```
Example custom metrics (not all are shown, since I tried only a few commands):
```
curl http://127.0.0.1:11434/metrics | grep -i ollama
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 6664 0 6664 0 0 519k 0 --:--:-- --:--:-- --:--:-- 542k
# HELP ollama_model_list_requests_total The total number of model list requets that have been attempted.
# TYPE ollama_model_list_requests_total counter
ollama_model_list_requests_total{action="list",status="OK",status_code="200"} 1
# HELP ollama_model_pull_requests_total The total number of model pulls that have been attempted.
# TYPE ollama_model_pull_requests_total counter
ollama_model_pull_requests_total{action="pull",status="OK",status_code="200"} 1
# HELP ollama_model_requests_total The total number of requests on all endpoints.
# TYPE ollama_model_requests_total counter
ollama_model_requests_total{action="all",status="OK",status_code="200"} 6
```
Full output:
```
# curl http://127.0.0.1:11434/metrics
# HELP go_gc_duration_seconds A summary of the pause duration of garbage collection cycles.
# TYPE go_gc_duration_seconds summary
go_gc_duration_seconds{quantile="0"} 1.0458e-05
go_gc_duration_seconds{quantile="0.25"} 5.4042e-05
go_gc_duration_seconds{quantile="0.5"} 6.9792e-05
go_gc_duration_seconds{quantile="0.75"} 0.000121417
go_gc_duration_seconds{quantile="1"} 0.00423525
go_gc_duration_seconds_sum 0.019693795
go_gc_duration_seconds_count 42
# HELP go_goroutines Number of goroutines that currently exist.
# TYPE go_goroutines gauge
go_goroutines 9
# HELP go_info Information about the Go environment.
# TYPE go_info gauge
go_info{version="go1.22.1"} 1
# HELP go_memstats_alloc_bytes Number of bytes allocated and still in use.
# TYPE go_memstats_alloc_bytes gauge
go_memstats_alloc_bytes 2.429592e+06
# HELP go_memstats_alloc_bytes_total Total number of bytes allocated, even if freed.
# TYPE go_memstats_alloc_bytes_total counter
go_memstats_alloc_bytes_total 9.359476e+07
# HELP go_memstats_buck_hash_sys_bytes Number of bytes used by the profiling bucket hash table.
# TYPE go_memstats_buck_hash_sys_bytes gauge
go_memstats_buck_hash_sys_bytes 11319
# HELP go_memstats_frees_total Total number of frees.
# TYPE go_memstats_frees_total counter
go_memstats_frees_total 545225
# HELP go_memstats_gc_sys_bytes Number of bytes used for garbage collection system metadata.
# TYPE go_memstats_gc_sys_bytes gauge
go_memstats_gc_sys_bytes 3.465616e+06
# HELP go_memstats_heap_alloc_bytes Number of heap bytes allocated and still in use.
# TYPE go_memstats_heap_alloc_bytes gauge
go_memstats_heap_alloc_bytes 2.429592e+06
# HELP go_memstats_heap_idle_bytes Number of heap bytes waiting to be used.
# TYPE go_memstats_heap_idle_bytes gauge
go_memstats_heap_idle_bytes 6.660096e+06
# HELP go_memstats_heap_inuse_bytes Number of heap bytes that are in use.
# TYPE go_memstats_heap_inuse_bytes gauge
go_memstats_heap_inuse_bytes 5.136384e+06
# HELP go_memstats_heap_objects Number of allocated objects.
# TYPE go_memstats_heap_objects gauge
go_memstats_heap_objects 12324
# HELP go_memstats_heap_released_bytes Number of heap bytes released to OS.
# TYPE go_memstats_heap_released_bytes gauge
go_memstats_heap_released_bytes 6.201344e+06
# HELP go_memstats_heap_sys_bytes Number of heap bytes obtained from system.
# TYPE go_memstats_heap_sys_bytes gauge
go_memstats_heap_sys_bytes 1.179648e+07
# HELP go_memstats_last_gc_time_seconds Number of seconds since 1970 of last garbage collection.
# TYPE go_memstats_last_gc_time_seconds gauge
go_memstats_last_gc_time_seconds 1.7125328358982313e+09
# HELP go_memstats_lookups_total Total number of pointer lookups.
# TYPE go_memstats_lookups_total counter
go_memstats_lookups_total 0
# HELP go_memstats_mallocs_total Total number of mallocs.
# TYPE go_memstats_mallocs_total counter
go_memstats_mallocs_total 557549
# HELP go_memstats_mcache_inuse_bytes Number of bytes in use by mcache structures.
# TYPE go_memstats_mcache_inuse_bytes gauge
go_memstats_mcache_inuse_bytes 4800
# HELP go_memstats_mcache_sys_bytes Number of bytes used for mcache structures obtained from system.
# TYPE go_memstats_mcache_sys_bytes gauge
go_memstats_mcache_sys_bytes 15600
# HELP go_memstats_mspan_inuse_bytes Number of bytes in use by mspan structures.
# TYPE go_memstats_mspan_inuse_bytes gauge
go_memstats_mspan_inuse_bytes 123840
# HELP go_memstats_mspan_sys_bytes Number of bytes used for mspan structures obtained from system.
# TYPE go_memstats_mspan_sys_bytes gauge
go_memstats_mspan_sys_bytes 179520
# HELP go_memstats_next_gc_bytes Number of heap bytes when next garbage collection will take place.
# TYPE go_memstats_next_gc_bytes gauge
go_memstats_next_gc_bytes 5.554272e+06
# HELP go_memstats_other_sys_bytes Number of bytes used for other system allocations.
# TYPE go_memstats_other_sys_bytes gauge
go_memstats_other_sys_bytes 1.004569e+06
# HELP go_memstats_stack_inuse_bytes Number of bytes in use by the stack allocator.
# TYPE go_memstats_stack_inuse_bytes gauge
go_memstats_stack_inuse_bytes 786432
# HELP go_memstats_stack_sys_bytes Number of bytes obtained from system for stack allocator.
# TYPE go_memstats_stack_sys_bytes gauge
go_memstats_stack_sys_bytes 786432
# HELP go_memstats_sys_bytes Number of bytes obtained from system.
# TYPE go_memstats_sys_bytes gauge
go_memstats_sys_bytes 1.7259536e+07
# HELP go_threads Number of OS threads created.
# TYPE go_threads gauge
go_threads 15
# HELP ollama_model_list_requests_total The total number of model list requets that have been attempted.
# TYPE ollama_model_list_requests_total counter
ollama_model_list_requests_total{action="list",status="OK",status_code="200"} 1
# HELP ollama_model_pull_requests_total The total number of model pulls that have been attempted.
# TYPE ollama_model_pull_requests_total counter
ollama_model_pull_requests_total{action="pull",status="OK",status_code="200"} 1
# HELP ollama_model_requests_total The total number of requests on all endpoints.
# TYPE ollama_model_requests_total counter
ollama_model_requests_total{action="all",status="OK",status_code="200"} 7
# HELP process_cpu_seconds_total Total user and system CPU time spent in seconds.
# TYPE process_cpu_seconds_total counter
process_cpu_seconds_total 73.31
# HELP process_max_fds Maximum number of open file descriptors.
# TYPE process_max_fds gauge
process_max_fds 1.048576e+06
# HELP process_open_fds Number of open file descriptors.
# TYPE process_open_fds gauge
process_open_fds 9
# HELP process_resident_memory_bytes Resident memory size in bytes.
# TYPE process_resident_memory_bytes gauge
process_resident_memory_bytes 2.942976e+08
# HELP process_start_time_seconds Start time of the process since unix epoch in seconds.
# TYPE process_start_time_seconds gauge
process_start_time_seconds 1.71253222787e+09
# HELP process_virtual_memory_bytes Virtual memory size in bytes.
# TYPE process_virtual_memory_bytes gauge
process_virtual_memory_bytes 2.536882176e+09
# HELP process_virtual_memory_max_bytes Maximum amount of virtual memory available in bytes.
# TYPE process_virtual_memory_max_bytes gauge
process_virtual_memory_max_bytes 1.8446744073709552e+19
# HELP promhttp_metric_handler_requests_in_flight Current number of scrapes being served.
# TYPE promhttp_metric_handler_requests_in_flight gauge
promhttp_metric_handler_requests_in_flight 1
# HELP promhttp_metric_handler_requests_total Total number of scrapes by HTTP status code.
# TYPE promhttp_metric_handler_requests_total counter
promhttp_metric_handler_requests_total{code="200"} 3
promhttp_metric_handler_requests_total{code="500"} 0
promhttp_metric_handler_requests_total{code="503"} 0
```
|
{
"login": "amila-ku",
"id": 12775690,
"node_id": "MDQ6VXNlcjEyNzc1Njkw",
"avatar_url": "https://avatars.githubusercontent.com/u/12775690?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amila-ku",
"html_url": "https://github.com/amila-ku",
"followers_url": "https://api.github.com/users/amila-ku/followers",
"following_url": "https://api.github.com/users/amila-ku/following{/other_user}",
"gists_url": "https://api.github.com/users/amila-ku/gists{/gist_id}",
"starred_url": "https://api.github.com/users/amila-ku/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amila-ku/subscriptions",
"organizations_url": "https://api.github.com/users/amila-ku/orgs",
"repos_url": "https://api.github.com/users/amila-ku/repos",
"events_url": "https://api.github.com/users/amila-ku/events{/privacy}",
"received_events_url": "https://api.github.com/users/amila-ku/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3529/reactions",
"total_count": 11,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 5,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3529/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/283
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/283/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/283/comments
|
https://api.github.com/repos/ollama/ollama/issues/283/events
|
https://github.com/ollama/ollama/issues/283
| 1,836,913,021
|
I_kwDOJ0Z1Ps5tfRF9
| 283
|
Do not prompt to install CLI if already on `$PATH`
|
{
"login": "justinmayer",
"id": 1503700,
"node_id": "MDQ6VXNlcjE1MDM3MDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1503700?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/justinmayer",
"html_url": "https://github.com/justinmayer",
"followers_url": "https://api.github.com/users/justinmayer/followers",
"following_url": "https://api.github.com/users/justinmayer/following{/other_user}",
"gists_url": "https://api.github.com/users/justinmayer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/justinmayer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/justinmayer/subscriptions",
"organizations_url": "https://api.github.com/users/justinmayer/orgs",
"repos_url": "https://api.github.com/users/justinmayer/repos",
"events_url": "https://api.github.com/users/justinmayer/events{/privacy}",
"received_events_url": "https://api.github.com/users/justinmayer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5667396210,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2acg",
"url": "https://api.github.com/repos/ollama/ollama/labels/good%20first%20issue",
"name": "good first issue",
"color": "7057ff",
"default": true,
"description": "Good for newcomers"
},
{
"id": 6677279472,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjf8y8A",
"url": "https://api.github.com/repos/ollama/ollama/labels/macos",
"name": "macos",
"color": "E2DBC0",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2023-08-04T14:59:07
| 2024-12-23T00:50:48
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When launching the Ollama application, a dialog window will appear and prompt you for administrative access in order to “install” the command line executable, which in practice means symlinking `/Applications/Ollama.app/Contents/Resources/ollama` to `/usr/local/bin/ollama`.
## Observed Behavior
This dialog window appears even when `ollama` is already available elsewhere on `$PATH`. Presumably the check naively looks for `/usr/local/bin/ollama`, and if it is not found at that specific location, the aforementioned dialog window appears on each and every launch of the application.
## Expected Behavior
If `ollama` is already available on `$PATH`, regardless of its specific location, the aforementioned dialog window should _not_ appear.
## Rationale
When installing new applications, I prefer not to grant such administrative permissions, particularly when the only purpose is to create a symlink into a protected directory. I therefore prefer to manage my `$PATH` myself and manually create such symlinks into places such as `~/.local/bin/` that do not require admin permissions. Others might even prefer to invoke the full path to `/Applications/Ollama.app/Contents/Resources/ollama` instead.
But as the situation currently stands, there is no way to stop this dialog from appearing, even when its intended purpose is entirely unnecessary.
## Proposed Remediation
I suggest all three of the following be implemented posthaste:
1. Do not show this dialog window if `ollama` is already available on `$PATH`.
2. Put a button on the dialog window to skip the CLI symlink step.
3. Put a checkbox on this dialog that says "Do not ask again" and record the result, so future application launches will not spawn the CLI symlink prompt.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/283/reactions",
"total_count": 17,
"+1": 17,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/283/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/8064
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8064/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8064/comments
|
https://api.github.com/repos/ollama/ollama/issues/8064/events
|
https://github.com/ollama/ollama/issues/8064
| 2,735,001,249
|
I_kwDOJ0Z1Ps6jBM6h
| 8,064
|
How can I specify the GPU for running the LLM?
|
{
"login": "NilsHellwig",
"id": 44339207,
"node_id": "MDQ6VXNlcjQ0MzM5MjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/44339207?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NilsHellwig",
"html_url": "https://github.com/NilsHellwig",
"followers_url": "https://api.github.com/users/NilsHellwig/followers",
"following_url": "https://api.github.com/users/NilsHellwig/following{/other_user}",
"gists_url": "https://api.github.com/users/NilsHellwig/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NilsHellwig/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NilsHellwig/subscriptions",
"organizations_url": "https://api.github.com/users/NilsHellwig/orgs",
"repos_url": "https://api.github.com/users/NilsHellwig/repos",
"events_url": "https://api.github.com/users/NilsHellwig/events{/privacy}",
"received_events_url": "https://api.github.com/users/NilsHellwig/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-12T06:38:20
| 2024-12-23T08:11:41
| 2024-12-23T08:11:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The `num_gpu` parameter doesn't seem to work as expected. How can I ensure the model runs on a specific GPU? I have two A5000 GPUs available.
I'm not using Docker; I just installed Ollama using `curl -fsSL https://ollama.com/install.sh | sh`.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.1
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8064/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8064/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4692
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4692/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4692/comments
|
https://api.github.com/repos/ollama/ollama/issues/4692/events
|
https://github.com/ollama/ollama/issues/4692
| 2,322,261,119
|
I_kwDOJ0Z1Ps6KauR_
| 4,692
|
About Deepseek - V2
|
{
"login": "DirtyKnightForVi",
"id": 116725810,
"node_id": "U_kgDOBvUYMg",
"avatar_url": "https://avatars.githubusercontent.com/u/116725810?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DirtyKnightForVi",
"html_url": "https://github.com/DirtyKnightForVi",
"followers_url": "https://api.github.com/users/DirtyKnightForVi/followers",
"following_url": "https://api.github.com/users/DirtyKnightForVi/following{/other_user}",
"gists_url": "https://api.github.com/users/DirtyKnightForVi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DirtyKnightForVi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DirtyKnightForVi/subscriptions",
"organizations_url": "https://api.github.com/users/DirtyKnightForVi/orgs",
"repos_url": "https://api.github.com/users/DirtyKnightForVi/repos",
"events_url": "https://api.github.com/users/DirtyKnightForVi/events{/privacy}",
"received_events_url": "https://api.github.com/users/DirtyKnightForVi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-05-29T03:09:25
| 2024-06-04T14:24:51
| 2024-06-04T14:24:51
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
llama.cpp already supports deepseek v2. Will Ollama follow up with support?
https://github.com/ggerganov/llama.cpp/pull/7519
|
{
"login": "DirtyKnightForVi",
"id": 116725810,
"node_id": "U_kgDOBvUYMg",
"avatar_url": "https://avatars.githubusercontent.com/u/116725810?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DirtyKnightForVi",
"html_url": "https://github.com/DirtyKnightForVi",
"followers_url": "https://api.github.com/users/DirtyKnightForVi/followers",
"following_url": "https://api.github.com/users/DirtyKnightForVi/following{/other_user}",
"gists_url": "https://api.github.com/users/DirtyKnightForVi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DirtyKnightForVi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DirtyKnightForVi/subscriptions",
"organizations_url": "https://api.github.com/users/DirtyKnightForVi/orgs",
"repos_url": "https://api.github.com/users/DirtyKnightForVi/repos",
"events_url": "https://api.github.com/users/DirtyKnightForVi/events{/privacy}",
"received_events_url": "https://api.github.com/users/DirtyKnightForVi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4692/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4692/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1156
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1156/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1156/comments
|
https://api.github.com/repos/ollama/ollama/issues/1156/events
|
https://github.com/ollama/ollama/pull/1156
| 1,997,651,867
|
PR_kwDOJ0Z1Ps5frDb7
| 1,156
|
fix push for model inheriting from other models
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-16T19:52:56
| 2023-11-16T21:33:31
| 2023-11-16T21:33:30
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1156",
"html_url": "https://github.com/ollama/ollama/pull/1156",
"diff_url": "https://github.com/ollama/ollama/pull/1156.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1156.patch",
"merged_at": "2023-11-16T21:33:30"
}
|
- fix auth scope: side effect of #1055 which changed the value of the scope parameter in the auth challenge
- fix cross repo mounts
resolves #1154
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1156/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1156/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6823
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6823/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6823/comments
|
https://api.github.com/repos/ollama/ollama/issues/6823/events
|
https://github.com/ollama/ollama/issues/6823
| 2,527,527,093
|
I_kwDOJ0Z1Ps6WpwC1
| 6,823
|
After converting a qwen2 fine-tune from HF to gguf and loading it with ollama, TEMPLATE must be specified or the output is garbled
|
{
"login": "czhcc",
"id": 4754730,
"node_id": "MDQ6VXNlcjQ3NTQ3MzA=",
"avatar_url": "https://avatars.githubusercontent.com/u/4754730?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/czhcc",
"html_url": "https://github.com/czhcc",
"followers_url": "https://api.github.com/users/czhcc/followers",
"following_url": "https://api.github.com/users/czhcc/following{/other_user}",
"gists_url": "https://api.github.com/users/czhcc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/czhcc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/czhcc/subscriptions",
"organizations_url": "https://api.github.com/users/czhcc/orgs",
"repos_url": "https://api.github.com/users/czhcc/repos",
"events_url": "https://api.github.com/users/czhcc/events{/privacy}",
"received_events_url": "https://api.github.com/users/czhcc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 9
| 2024-09-16T05:22:53
| 2024-12-16T01:41:43
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The same gguf works normally when loaded directly with other frameworks.
The original qwen2 model converted to gguf also loads fine without the TEMPLATE parameter. Only the gguf produced after fine-tuning needs TEMPLATE to avoid garbled output. Why?
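For reference, a Modelfile for a ChatML-style qwen2 fine-tune might look roughly like the sketch below. The gguf path is a placeholder, and the template shown is the generic ChatML layout; a specific fine-tune may need a different template.

```
FROM ./fine-tuned-qwen2.gguf
TEMPLATE """<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
PARAMETER stop "<|im_end|>"
```

Without a TEMPLATE like this, the raw prompt is passed to the model without the chat markers it was fine-tuned on, which typically produces garbled output.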
### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.10
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6823/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6823/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7739
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7739/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7739/comments
|
https://api.github.com/repos/ollama/ollama/issues/7739/events
|
https://github.com/ollama/ollama/pull/7739
| 2,671,429,551
|
PR_kwDOJ0Z1Ps6CXBYM
| 7,739
|
Better error suppression when getting terminal colours
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-19T09:24:56
| 2024-11-19T16:33:53
| 2024-11-19T16:33:53
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7739",
"html_url": "https://github.com/ollama/ollama/pull/7739",
"diff_url": "https://github.com/ollama/ollama/pull/7739.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7739.patch",
"merged_at": "2024-11-19T16:33:53"
}
|
Fixes #7737
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7739/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7739/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7426
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7426/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7426/comments
|
https://api.github.com/repos/ollama/ollama/issues/7426/events
|
https://github.com/ollama/ollama/issues/7426
| 2,624,822,219
|
I_kwDOJ0Z1Ps6cc5vL
| 7,426
|
x/llama3.2-vision on cli reports only "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" but works in ollama run
|
{
"login": "draeician",
"id": 177489421,
"node_id": "U_kgDOCpRGDQ",
"avatar_url": "https://avatars.githubusercontent.com/u/177489421?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/draeician",
"html_url": "https://github.com/draeician",
"followers_url": "https://api.github.com/users/draeician/followers",
"following_url": "https://api.github.com/users/draeician/following{/other_user}",
"gists_url": "https://api.github.com/users/draeician/gists{/gist_id}",
"starred_url": "https://api.github.com/users/draeician/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/draeician/subscriptions",
"organizations_url": "https://api.github.com/users/draeician/orgs",
"repos_url": "https://api.github.com/users/draeician/repos",
"events_url": "https://api.github.com/users/draeician/events{/privacy}",
"received_events_url": "https://api.github.com/users/draeician/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-10-30T17:31:10
| 2024-10-30T18:10:21
| 2024-10-30T18:10:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
draeician@nomnom ~/Downloads $ ollama run x/llama3.2-vision "Describe the image in detail: /home/draeician/Downloads/test.jpg"
Added image '/home/draeician/Downloads/test.jpg'
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
But if I run it through the interactive:
draeician@nomnom ~/Downloads $ ollama run x/llama3.2-vision
>>> Describe the image in detail: /home/draeician/Downloads/test.jpg
Added image '/home/draeician/Downloads/test.jpg'
The image shows a sketch of a female character from an anime or manga, with long hair and wearing a jacket. The purpose of the image is to showcase the artist's work.
* A sketch of a female character from an anime or manga:
+ The character has long hair that flows behind her.
+ She is wearing a jacket with a high collar.
+ Her facial expression is serious and determined.
* The character is wearing a jacket with a high collar:
+ The jacket is dark-colored and has a sleek design.
+ It covers most of the character's body, leaving only her arms and legs visible.
+ The high collar adds to the character's mysterious and intimidating appearance.
* She has long hair that flows behind her:
+ Her hair is dark brown and falls down her back in loose waves.
+ It frames her face and accentuates her features.
+ The flowing hair adds a sense of elegance and sophistication to the character's overall look.
Overall, the image showcases a well-drawn female character with a strong and mysterious presence. The artist has paid attention to detail in rendering the character's facial expression, clothing, and hairstyle, creating a compelling and engaging
visual representation.
Works fine. I tried png and jpg files, and every CLI invocation returned nothing but '!'.
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.4.0-rc5
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7426/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7426/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/3105
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3105/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3105/comments
|
https://api.github.com/repos/ollama/ollama/issues/3105/events
|
https://github.com/ollama/ollama/pull/3105
| 2,184,184,211
|
PR_kwDOJ0Z1Ps5phAJy
| 3,105
|
Improve usability with Bash completion for Ollama on Linux
|
{
"login": "aosan",
"id": 8534160,
"node_id": "MDQ6VXNlcjg1MzQxNjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8534160?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aosan",
"html_url": "https://github.com/aosan",
"followers_url": "https://api.github.com/users/aosan/followers",
"following_url": "https://api.github.com/users/aosan/following{/other_user}",
"gists_url": "https://api.github.com/users/aosan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aosan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aosan/subscriptions",
"organizations_url": "https://api.github.com/users/aosan/orgs",
"repos_url": "https://api.github.com/users/aosan/repos",
"events_url": "https://api.github.com/users/aosan/events{/privacy}",
"received_events_url": "https://api.github.com/users/aosan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2024-03-13T14:40:44
| 2024-08-26T11:17:45
| 2024-05-09T18:58:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3105",
"html_url": "https://github.com/ollama/ollama/pull/3105",
"diff_url": "https://github.com/ollama/ollama/pull/3105.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3105.patch",
"merged_at": null
}
|
Please review this PR for adding Bash completion to install.sh for Linux.
Currently works for all arguments and options, including autocomplete for long model names (yay!)
`ollama ls` works, but it's missing from the -h/--help and ollama listing. I'll open a separate issue for it.
It does not include autocomplete for `ollama rm`, as I believe easily triggered destructive actions should require user confirmation before execution. I'll also create an issue for that.
This version of install.sh has been tested successfully on Fedora 39, Ubuntu 23.10 and Debian 12.5.
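The completion described above could look roughly like this minimal sketch. The function name and subcommand list here are illustrative, not the PR's actual script, which also completes options and model names.

```shell
# Hypothetical minimal Bash completion sketch for the ollama CLI.
_ollama_complete() {
  local cur=${COMP_WORDS[COMP_CWORD]}
  local subcommands="serve create show run pull push list cp rm help"
  if [ "$COMP_CWORD" -eq 1 ]; then
    # Complete subcommand names for the first argument
    COMPREPLY=($(compgen -W "$subcommands" -- "$cur"))
  fi
}
complete -F _ollama_complete ollama
```

Sourcing a file like this from the user's shell profile (what the PR wires into install.sh) enables tab completion for `ollama` subcommands.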
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3105/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 3,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3105/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8162
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8162/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8162/comments
|
https://api.github.com/repos/ollama/ollama/issues/8162/events
|
https://github.com/ollama/ollama/issues/8162
| 2,748,654,398
|
I_kwDOJ0Z1Ps6j1SM-
| 8,162
|
StructuredOutputs Schema Missing in Prompt [Unlike OpenAI API Default Behavior]
|
{
"login": "ikot-humanoid",
"id": 190361581,
"node_id": "U_kgDOC1iv7Q",
"avatar_url": "https://avatars.githubusercontent.com/u/190361581?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ikot-humanoid",
"html_url": "https://github.com/ikot-humanoid",
"followers_url": "https://api.github.com/users/ikot-humanoid/followers",
"following_url": "https://api.github.com/users/ikot-humanoid/following{/other_user}",
"gists_url": "https://api.github.com/users/ikot-humanoid/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ikot-humanoid/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ikot-humanoid/subscriptions",
"organizations_url": "https://api.github.com/users/ikot-humanoid/orgs",
"repos_url": "https://api.github.com/users/ikot-humanoid/repos",
"events_url": "https://api.github.com/users/ikot-humanoid/events{/privacy}",
"received_events_url": "https://api.github.com/users/ikot-humanoid/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2024-12-18T20:02:55
| 2024-12-18T20:11:02
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When using StructuredOutputs, I noticed that the model's outputs were nonsensical and didn't align with expectations.
After debugging, I discovered that the output schema isn't included in the prompt, leaving the model unaware of its options and what it should generate. While developers could manually add the schema to the prompt, this isn't a common practice. For instance, OpenAI's API automatically includes the schema (easily verified by counting input prompt tokens), and I believe this behavior should be standard for all inference engines.
I tested Ollama in three different ways, and all of them exhibited the same behavior. Below is the code to reproduce the issue:
```python
import aiohttp
from pydantic import BaseModel, Field
from typing import Union, List, Literal
from typing import Annotated
class Action1(BaseModel):
type: Literal["action1"]
__doc__: str = "store in memory"
class Action2(BaseModel):
type: Literal["action2"]
__doc__: str = "call someone"
class Action3(BaseModel):
type: Literal["action3"]
__doc__: str = "move to a location"
class Response(BaseModel):
steps: Annotated[
List[Union[Action1, Action2, Action3]],
Field(..., description='Sequence of steps to perform', discriminator='type')
]
messages = [
{"role": "system", "content": "Decompose input request into sequence of steps. Possible steps are listed in the response schema."},
{"role": "user", "content": "I want to go to the park."},
]
payload = {
"model": 'qwen2.5-coder:7b',
"messages": messages,
"stream": False,
"options": {"temperature" : 0.0},
"format": Response.model_json_schema()
}
async def get_response(payload):
async with aiohttp.ClientSession() as session:
async with session.post('http://localhost:11434/api/chat', json=payload) as response:
json_response = await response.json()
raw_string = json_response['message']['content']
res = Response.model_validate_json(raw_string)
return json_response, res
json_response, res = await get_response(payload)
# json_response['prompt_eval_count'] shows 40 prompt tokens
# res.model_dump() shows `{'steps': [{'type': 'action1'}, {'type': 'action2'}]}`, which doesn't make any sense
# the correct answer for "I want to go to the park." is Action 3 ("move to a location"), not 1 or 2
messages_w_schema = [
{
"role": "system",
"content":
"Decompose input request into sequence of steps. Possible steps are listed in the response schema."
+ f'\nSCHEMA:\n{Response.model_json_schema()}'
},
{"role": "user", "content": "I want to go to the park."},
]
payload_new = {
"model": 'qwen2.5-coder:7b',
"messages": messages_w_schema,
"stream": False,
"options": {"temperature" : 0.0},
"format": Response.model_json_schema()
}
json_response_2, res_2 = await get_response(payload_new)
# json_response_2['prompt_eval_count'] shows 328 prompt tokens
# res_2.model_dump() correctly suggests action3
#============
from ollama import chat
chat_res_1 = chat(
model=payload['model'],
messages=messages,
format=Response.model_json_schema(),
options={'temperature': 0}
)
chat_res_2 = chat(
model=payload['model'],
messages=messages_w_schema,
format=Response.model_json_schema(),
options={'temperature': 0}
)
# chat_res_1.prompt_eval_count = 40, chat_res_2.prompt_eval_count = 328
# Response.model_validate_json(chat_res_1.message.content) again points to actions 1 & 2, not 3
#================
from openai import OpenAI
client = OpenAI(
base_url = 'http://localhost:11434/v1',
api_key='ollama',
)
oai_client_1 = client.beta.chat.completions.parse(
model=payload['model'],
messages=payload['messages'],
response_format=Response,
temperature=0
)
oai_client_2 = client.beta.chat.completions.parse(
model=payload_new['model'],
messages=payload_new['messages'],
response_format=Response,
temperature=0
)
# same here: oai_client_1.choices[0].message.parsed contains actions 1 & 2, not 3
# and input prompt token counts are the same
```
As you can see, we have 3 actions: memory, call, and navigation. We ask an LLM to decompose the input query into a sequence of actions. For this example (`I want to go to the park.`) we expect the model to output the 3rd action. There is no way for the model to know which action to choose other than reading that information from the output schema. Without it, the model just guesses randomly and cannot produce anything meaningful.
====
The biggest issue I see is that even though Ollama is OpenAI compatible, it doesn't have the same logic under the hood, which might surprise a lot of developers when switching from proprietary LLMs to local ones.
Proposal: by default, add the output schema to the inputs, and perhaps allow disabling that behaviour with a flag. Given that the functionality was added only recently, I believe it is important to fix this now, before the workaround of patching the system prompt becomes widespread.
===
My library versions:
- pip: ollama==0.4.4
- ollama CLI version is 0.5.1
- pip: openai==1.57.3
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8162/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8162/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4485
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4485/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4485/comments
|
https://api.github.com/repos/ollama/ollama/issues/4485/events
|
https://github.com/ollama/ollama/issues/4485
| 2,301,692,122
|
I_kwDOJ0Z1Ps6JMQja
| 4,485
|
Import a model:latest aborted (core dumped)
|
{
"login": "Anorid",
"id": 139095718,
"node_id": "U_kgDOCEpupg",
"avatar_url": "https://avatars.githubusercontent.com/u/139095718?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Anorid",
"html_url": "https://github.com/Anorid",
"followers_url": "https://api.github.com/users/Anorid/followers",
"following_url": "https://api.github.com/users/Anorid/following{/other_user}",
"gists_url": "https://api.github.com/users/Anorid/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Anorid/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Anorid/subscriptions",
"organizations_url": "https://api.github.com/users/Anorid/orgs",
"repos_url": "https://api.github.com/users/Anorid/repos",
"events_url": "https://api.github.com/users/Anorid/events{/privacy}",
"received_events_url": "https://api.github.com/users/Anorid/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 12
| 2024-05-17T02:33:20
| 2024-05-30T05:36:45
| 2024-05-30T05:36:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I carefully read the README documentation and tried the following:
root@autodl-container-36e51198ae-c4ed76b0:~/autodl-tmp/model# ollama create example -f Modelfile
transferring model data
using existing layer sha256:8c7d76a23837d1b07ca3c3aa497d90ffafdfc2fd417b93e4e06caeeabf4f1526
using existing layer sha256:dbc2ca980bfce0b44450f42033a51513616ac71f8b5881efbaa81d8f5e9b253e
using existing layer sha256:be7c61fea675f5a89b441192e604c0fcc8806a19e235421f17dda66e5fc67b2d
writing manifest
success
root@autodl-container-36e51198ae-c4ed76b0:~/autodl-tmp/model# ollama run example "What is your favourite condiment?"
Error: llama runner process has terminated: signal: aborted (core dumped)
root@autodl-container-36e51198ae-c4ed76b0:~/autodl-tmp/model# nivdia-smi
bash: nivdia-smi: command not found
root@autodl-container-36e51198ae-c4ed76b0:~/autodl-tmp/model# nvidia-smi
Fri May 17 10:02:03 2024
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 525.105.17 Driver Version: 525.105.17 CUDA Version: 12.0 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 NVIDIA A40 On | 00000000:C1:00.0 Off | Off |
| 0% 25C P8 20W / 300W | 2MiB / 49140MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| No running processes found |
+-----------------------------------------------------------------------------+
root@autodl-container-36e51198ae-c4ed76b0:~/autodl-tmp/model# ollama run example
Error: llama runner process has terminated: signal: aborted (core dumped)
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4485/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4485/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3236
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3236/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3236/comments
|
https://api.github.com/repos/ollama/ollama/issues/3236/events
|
https://github.com/ollama/ollama/issues/3236
| 2,194,215,654
|
I_kwDOJ0Z1Ps6CyRLm
| 3,236
|
Unable to run Falcon Models
|
{
"login": "mebinjoy77",
"id": 62318229,
"node_id": "MDQ6VXNlcjYyMzE4MjI5",
"avatar_url": "https://avatars.githubusercontent.com/u/62318229?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mebinjoy77",
"html_url": "https://github.com/mebinjoy77",
"followers_url": "https://api.github.com/users/mebinjoy77/followers",
"following_url": "https://api.github.com/users/mebinjoy77/following{/other_user}",
"gists_url": "https://api.github.com/users/mebinjoy77/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mebinjoy77/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mebinjoy77/subscriptions",
"organizations_url": "https://api.github.com/users/mebinjoy77/orgs",
"repos_url": "https://api.github.com/users/mebinjoy77/repos",
"events_url": "https://api.github.com/users/mebinjoy77/events{/privacy}",
"received_events_url": "https://api.github.com/users/mebinjoy77/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-03-19T07:18:07
| 2024-03-19T13:25:23
| 2024-03-19T13:25:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Unable to run falcon models through ollama; running a falcon model crashes the ollama service.
Here is the log :

### What did you expect to see?
ollama working fine with falcon.
### Steps to reproduce
Follow the commands :
ollama run falcon
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
amd64
### Platform
_No response_
### Ollama version
0.1.29
### GPU
Nvidia
### GPU info
NVIDIA A10G
Driver Version: 470.199.02 CUDA Version: 11.4
### CPU
Intel
### Other software
_No response_
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3236/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3236/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6503
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6503/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6503/comments
|
https://api.github.com/repos/ollama/ollama/issues/6503/events
|
https://github.com/ollama/ollama/pull/6503
| 2,485,426,225
|
PR_kwDOJ0Z1Ps55XSan
| 6,503
|
add integration: py-gpt
|
{
"login": "szczyglis-dev",
"id": 61396542,
"node_id": "MDQ6VXNlcjYxMzk2NTQy",
"avatar_url": "https://avatars.githubusercontent.com/u/61396542?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/szczyglis-dev",
"html_url": "https://github.com/szczyglis-dev",
"followers_url": "https://api.github.com/users/szczyglis-dev/followers",
"following_url": "https://api.github.com/users/szczyglis-dev/following{/other_user}",
"gists_url": "https://api.github.com/users/szczyglis-dev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/szczyglis-dev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/szczyglis-dev/subscriptions",
"organizations_url": "https://api.github.com/users/szczyglis-dev/orgs",
"repos_url": "https://api.github.com/users/szczyglis-dev/repos",
"events_url": "https://api.github.com/users/szczyglis-dev/events{/privacy}",
"received_events_url": "https://api.github.com/users/szczyglis-dev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-08-25T19:16:55
| 2024-11-21T09:54:40
| 2024-11-21T09:54:39
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6503",
"html_url": "https://github.com/ollama/ollama/pull/6503",
"diff_url": "https://github.com/ollama/ollama/pull/6503.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6503.patch",
"merged_at": "2024-11-21T09:54:39"
}
|
Add integration: PyGPT - AI desktop assistant for Linux, Windows, and Mac with support for models provided through Ollama.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6503/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6503/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2465
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2465/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2465/comments
|
https://api.github.com/repos/ollama/ollama/issues/2465/events
|
https://github.com/ollama/ollama/pull/2465
| 2,130,460,835
|
PR_kwDOJ0Z1Ps5mpzMk
| 2,465
|
Detect AMD GPU info via sysfs and block old cards
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-02-12T16:10:40
| 2024-02-12T20:41:46
| 2024-02-12T20:41:43
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2465",
"html_url": "https://github.com/ollama/ollama/pull/2465",
"diff_url": "https://github.com/ollama/ollama/pull/2465.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2465.patch",
"merged_at": "2024-02-12T20:41:43"
}
|
This wires up some new logic to start using sysfs to discover AMD GPU information and detect old cards we can't yet support, so we can fall back to CPU mode.
This also serves as an initial foundation where I believe we'll be able to move away from the AMD management library and query the sysfs files to discover the details we need with less complexity.
This will mitigate some cases of #2165
Tested on a `Radeon RX 580` and it correctly falls back to CPU.
```
time=2024-02-12T16:02:58.657Z level=INFO source=gpu.go:157 msg="AMD Driver: 6.2.4"
time=2024-02-12T16:02:58.657Z level=INFO source=gpu.go:162 msg="AMD GPU too old, falling back to CPU gfx803"
time=2024-02-12T16:02:58.657Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
```
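The sysfs-based discovery idea can be sketched roughly as follows; the kfd paths and property names here are illustrative assumptions, not this PR's actual code:

```shell
# Rough sketch of discovering AMD GPU details from sysfs (paths and
# property names are illustrative assumptions, not this PR's code).
found=0
for f in /sys/class/kfd/kfd/topology/nodes/*/properties; do
  [ -r "$f" ] || continue
  found=$((found + 1))
  # gfx_target_version encodes the gfx generation (e.g. gfx803),
  # which is what would let us block cards that are too old.
  grep -E '^(gfx_target_version|simd_count)' "$f"
done
echo "scanned $found kfd node(s)"
```

On machines without an AMD GPU (or without the amdgpu/kfd driver), the glob simply matches nothing and the loop falls through, mirroring the CPU-fallback behavior described above.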
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2465/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2465/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2448
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2448/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2448/comments
|
https://api.github.com/repos/ollama/ollama/issues/2448/events
|
https://github.com/ollama/ollama/issues/2448
| 2,129,086,352
|
I_kwDOJ0Z1Ps5-50eQ
| 2,448
|
Linux(WSL Ubuntu) installation curl command fails
|
{
"login": "UeberTimei",
"id": 45313665,
"node_id": "MDQ6VXNlcjQ1MzEzNjY1",
"avatar_url": "https://avatars.githubusercontent.com/u/45313665?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/UeberTimei",
"html_url": "https://github.com/UeberTimei",
"followers_url": "https://api.github.com/users/UeberTimei/followers",
"following_url": "https://api.github.com/users/UeberTimei/following{/other_user}",
"gists_url": "https://api.github.com/users/UeberTimei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/UeberTimei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/UeberTimei/subscriptions",
"organizations_url": "https://api.github.com/users/UeberTimei/orgs",
"repos_url": "https://api.github.com/users/UeberTimei/repos",
"events_url": "https://api.github.com/users/UeberTimei/events{/privacy}",
"received_events_url": "https://api.github.com/users/UeberTimei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 6677370291,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw",
"url": "https://api.github.com/repos/ollama/ollama/labels/networking",
"name": "networking",
"color": "0B5368",
"default": false,
"description": "Issues relating to ollama pull and push"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 17
| 2024-02-11T17:28:25
| 2024-03-28T20:51:57
| 2024-03-28T20:51:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
curl -fsSL https://ollama.com/install.sh | sh
This leads to:
curl: (35) OpenSSL SSL_connect: Connection reset by peer in connection to ollama.com:443
I tried everything. I reinstalled WSL and set Google DNS.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2448/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/2448/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6528
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6528/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6528/comments
|
https://api.github.com/repos/ollama/ollama/issues/6528/events
|
https://github.com/ollama/ollama/pull/6528
| 2,490,061,599
|
PR_kwDOJ0Z1Ps55nETw
| 6,528
|
Fix import image width
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-27T18:26:09
| 2024-08-27T21:19:49
| 2024-08-27T21:19:48
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6528",
"html_url": "https://github.com/ollama/ollama/pull/6528",
"diff_url": "https://github.com/ollama/ollama/pull/6528.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6528.patch",
"merged_at": "2024-08-27T21:19:48"
}
|
This gives more reasonable output for the images:
<img width="1183" alt="Screenshot 2024-08-27 at 14 10 33" src="https://github.com/user-attachments/assets/428cee86-ba62-4270-b308-2bacc07a1460">
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6528/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6528/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5530
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5530/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5530/comments
|
https://api.github.com/repos/ollama/ollama/issues/5530/events
|
https://github.com/ollama/ollama/pull/5530
| 2,394,116,561
|
PR_kwDOJ0Z1Ps50nbP4
| 5,530
|
Update llama.cpp submodule to `a8db2a9c`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-07T16:11:44
| 2024-07-07T17:03:11
| 2024-07-07T17:03:10
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5530",
"html_url": "https://github.com/ollama/ollama/pull/5530",
"diff_url": "https://github.com/ollama/ollama/pull/5530.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5530.patch",
"merged_at": "2024-07-07T17:03:10"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5530/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5530/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7581
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7581/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7581/comments
|
https://api.github.com/repos/ollama/ollama/issues/7581/events
|
https://github.com/ollama/ollama/issues/7581
| 2,645,396,415
|
I_kwDOJ0Z1Ps6drYu_
| 7,581
|
Support importing vision models from Safetensors in `ollama create`
|
{
"login": "chigkim",
"id": 22120994,
"node_id": "MDQ6VXNlcjIyMTIwOTk0",
"avatar_url": "https://avatars.githubusercontent.com/u/22120994?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chigkim",
"html_url": "https://github.com/chigkim",
"followers_url": "https://api.github.com/users/chigkim/followers",
"following_url": "https://api.github.com/users/chigkim/following{/other_user}",
"gists_url": "https://api.github.com/users/chigkim/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chigkim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chigkim/subscriptions",
"organizations_url": "https://api.github.com/users/chigkim/orgs",
"repos_url": "https://api.github.com/users/chigkim/repos",
"events_url": "https://api.github.com/users/chigkim/events{/privacy}",
"received_events_url": "https://api.github.com/users/chigkim/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6947643302,
"node_id": "LA_kwDOJ0Z1Ps8AAAABnhyfpg",
"url": "https://api.github.com/repos/ollama/ollama/labels/create",
"name": "create",
"color": "b60205",
"default": false,
"description": "Issues relating to ollama create"
}
] |
open
| false
| null |
[] | null | 5
| 2024-11-08T23:52:56
| 2024-12-29T20:09:23
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I tried to import a finetuned llama-3.2-11b-vision, but I got "Error: unsupported architecture."
To make sure my model is not the problem, I downloaded [meta-llama/Llama-3.2-11B-Vision-Instruct](https://huggingface.co/meta-llama/Llama-3.2-11B-Vision-Instruct) from Hugging Face.
I copied the modelfile from `ollama show llama3.2-vision --modelfile`.
Then I edited the modelfile and pointed `FROM` at the model downloaded from HF.
When I run `ollama create llama-vision -f llama-vision.modelfile`, I get this:
```bash
transferring model data 100%
converting model
Error: unsupported architecture
```
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.4.0
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7581/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7581/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1284
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1284/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1284/comments
|
https://api.github.com/repos/ollama/ollama/issues/1284/events
|
https://github.com/ollama/ollama/issues/1284
| 2,011,737,532
|
I_kwDOJ0Z1Ps536K28
| 1,284
|
Argument list too long
|
{
"login": "shubhammicrosoft1",
"id": 50182145,
"node_id": "MDQ6VXNlcjUwMTgyMTQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/50182145?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shubhammicrosoft1",
"html_url": "https://github.com/shubhammicrosoft1",
"followers_url": "https://api.github.com/users/shubhammicrosoft1/followers",
"following_url": "https://api.github.com/users/shubhammicrosoft1/following{/other_user}",
"gists_url": "https://api.github.com/users/shubhammicrosoft1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shubhammicrosoft1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shubhammicrosoft1/subscriptions",
"organizations_url": "https://api.github.com/users/shubhammicrosoft1/orgs",
"repos_url": "https://api.github.com/users/shubhammicrosoft1/repos",
"events_url": "https://api.github.com/users/shubhammicrosoft1/events{/privacy}",
"received_events_url": "https://api.github.com/users/shubhammicrosoft1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-11-27T08:16:49
| 2024-01-20T00:09:39
| 2024-01-20T00:09:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When I run a summarization with ollama on Linux, reading a 7 MB file and summarizing its data, it reports:
(bash: /usr/local/bin/ollama: Argument list too long)
Command used:
ollama run llama2 "$(cat data.txt)" please summarize this data
Is this an OS limitation, or is there some configuration we can update in Ollama?
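For context, the error comes from the shell, not ollama: `"$(cat data.txt)"` expands the whole 7 MB file into the argument list, which the kernel caps at ARG_MAX. A quick way to inspect the limit (the piped-stdin workaround in the comment is an assumption that ollama accepts stdin input):

```shell
# The kernel caps the combined size of argv + environment at ARG_MAX;
# a 7 MB file expanded via "$(cat data.txt)" blows past it, so exec
# fails before ollama even starts.
limit=$(getconf ARG_MAX)
echo "ARG_MAX on this system: $limit bytes"
# Possible workaround (assumes ollama reads piped stdin):
#   cat data.txt | ollama run llama2 "please summarize this data"
```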
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1284/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1284/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1622
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1622/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1622/comments
|
https://api.github.com/repos/ollama/ollama/issues/1622/events
|
https://github.com/ollama/ollama/pull/1622
| 2,049,896,518
|
PR_kwDOJ0Z1Ps5ib6GB
| 1,622
|
Update Readme Quickstart Ollama with Docker
|
{
"login": "Hidayathamir",
"id": 57469556,
"node_id": "MDQ6VXNlcjU3NDY5NTU2",
"avatar_url": "https://avatars.githubusercontent.com/u/57469556?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hidayathamir",
"html_url": "https://github.com/Hidayathamir",
"followers_url": "https://api.github.com/users/Hidayathamir/followers",
"following_url": "https://api.github.com/users/Hidayathamir/following{/other_user}",
"gists_url": "https://api.github.com/users/Hidayathamir/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Hidayathamir/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hidayathamir/subscriptions",
"organizations_url": "https://api.github.com/users/Hidayathamir/orgs",
"repos_url": "https://api.github.com/users/Hidayathamir/repos",
"events_url": "https://api.github.com/users/Hidayathamir/events{/privacy}",
"received_events_url": "https://api.github.com/users/Hidayathamir/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-12-20T06:04:12
| 2024-06-09T18:06:14
| 2024-06-09T18:06:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1622",
"html_url": "https://github.com/ollama/ollama/pull/1622",
"diff_url": "https://github.com/ollama/ollama/pull/1622.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1622.patch",
"merged_at": null
}
|
# Update Readme Quickstart Ollama with Docker
Upon initial exploration of the repository, leveraging Docker for getting started appears to be the most straightforward approach. Following the [Ollama documentation for initiating with Docker](https://github.com/jmorganca/ollama?tab=readme-ov-file#docker) led me to the [Ollama Docker Image](https://hub.docker.com/r/ollama/ollama).
Subsequently, I launched Ollama using the command:
```shell
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```
However, beyond this point there is no guidance on the subsequent steps.
This pull request aims to address this gap by incorporating instructions to guide users from spinning up the Docker image to utilizing the chat API.
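As a sketch, the "subsequent steps" being documented look roughly like this; the model name and prompt are examples, and the commands are only printed here since they require the running container from the `docker run` above:

```shell
# Follow-up steps after starting the container, printed rather than
# executed (they need the "ollama" container to be running).
next_steps='docker exec -it ollama ollama run llama2
curl http://localhost:11434/api/generate -d "{\"model\": \"llama2\", \"prompt\": \"Why is the sky blue?\"}"'
printf '%s\n' "$next_steps"
```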
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1622/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1622/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7014
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7014/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7014/comments
|
https://api.github.com/repos/ollama/ollama/issues/7014/events
|
https://github.com/ollama/ollama/issues/7014
| 2,553,949,038
|
I_kwDOJ0Z1Ps6YOitu
| 7,014
|
Better Tool Call parsing
|
{
"login": "zly2006",
"id": 66198935,
"node_id": "MDQ6VXNlcjY2MTk4OTM1",
"avatar_url": "https://avatars.githubusercontent.com/u/66198935?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zly2006",
"html_url": "https://github.com/zly2006",
"followers_url": "https://api.github.com/users/zly2006/followers",
"following_url": "https://api.github.com/users/zly2006/following{/other_user}",
"gists_url": "https://api.github.com/users/zly2006/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zly2006/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zly2006/subscriptions",
"organizations_url": "https://api.github.com/users/zly2006/orgs",
"repos_url": "https://api.github.com/users/zly2006/repos",
"events_url": "https://api.github.com/users/zly2006/events{/privacy}",
"received_events_url": "https://api.github.com/users/zly2006/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-09-28T01:55:36
| 2025-01-01T03:52:27
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently, tool call patterns are defined in Go templates. This is fine for cases like [the one in this comment](https://github.com/ollama/ollama/issues/6061#issuecomment-2257137350), but it is not ideal.
## Problems
1. Content loss
Say the model responds with this text:
```plaintext
Yes, I can help you compute 3+4 with python
<tool_call>
{"name":"python", "args": {"expr":"3+4"}}
</tool_call>
```
In [this line](https://github.com/ollama/ollama/blob/cd5c8f6471abf32965289f0226016a78f0c5c938/server/routes.go#L1480), all content is removed. So if the model provided useful information, like the first sentence above, it is discarded arbitrarily.
2. Streaming
Tool calls do NOT support streaming; parsing only happens once the full content has been received. However, a tool call can be detected incrementally, before all of the content is known.
3. Other format support
[Here](https://github.com/ollama/ollama/blob/d05da2991245cfa0cd8da0bda476c626e26caaec/server/model.go#L301), the function only supports JSON. Should we support other formats, for example XML, in the future?
## Solution
**TL;DR:** For 1. and 2., we can use the [Aho–Corasick algorithm](https://en.wikipedia.org/wiki/Aho%E2%80%93Corasick_algorithm) for parsing. Define a pattern such that `@@json{name, args}@@` matches `{"name":"python", "args": {"expr":"3+4"}}` (e.g. llama3.2), and the pattern `<tool_call>@@json{name,args}@@</tool_call>` could be used for the output above.
Even when not in stream mode, we can still process the output like a stream. Each time a token arrives, we check whether it could be part of the tool call syntax. This fundamentally resolves the first problem, because we only remove the parts of the output that we are sure belong to a tool call.
A state machine is a good fit for "guessing" whether text is a valid tool call. Keep in mind that the model may also output ordinary JSON (e.g. when the user asks it to process a JSON file), so never treat a JSON object as a tool call arbitrarily; we are only "guessing". When a new token arrives, we try to match it with the state machine. **If it matches, it could be part of a tool call, so we do not send the token to the client and hold it temporarily.** When we reach the end of the pattern (e.g. finally matching `</tool_call>`), it is a valid tool call and we can drop all held tokens. Otherwise, when the state machine fails to match, the JSON is not a tool call and should be sent to the client: we flush all held tokens and reset the state machine.
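A minimal sketch of the hold-and-flush idea. It assumes a single literal `<tool_call>`...`</tool_call>` pattern (not the full Aho–Corasick matcher) and, for simplicity, only detects the pattern at the start of the held text:

```python
OPEN, CLOSE = "<tool_call>", "</tool_call>"

class HoldBackMatcher:
    def __init__(self):
        self.buffer = ""       # tokens held back from the client
        self.inside = False    # True once OPEN has fully matched
        self.tool_call = None  # captured tool call body, if any

    def feed(self, token):
        """Consume one token; return text that is safe to stream to the client."""
        self.buffer += token
        if not self.inside:
            if self.buffer.startswith(OPEN):
                self.inside = True   # keep holding: body and CLOSE come next
                return ""
            if OPEN.startswith(self.buffer):
                return ""            # still a possible prefix: keep holding
            out, self.buffer = self.buffer, ""  # mismatch: flush held text
            return out
        if CLOSE in self.buffer:
            # Full pattern matched: capture the call and drop the held tokens.
            self.tool_call = self.buffer[len(OPEN):self.buffer.index(CLOSE)].strip()
            self.buffer, self.inside = "", False
        return ""
```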
Now let's talk about the pattern string. I currently design it like a regular expression: all characters not wrapped in `@@` are matched as-is (spaces, `\t`, `\r`, and `\n` are allowed everywhere in both the pattern and the matched string and are ignored). So `<tool_call>` and `</tool_call>` can be matched.
The model output is NOT reliable. When testing the qwen model, it sometimes does not output `<tool_call>` but a random token, even though the JSON is still valid. For these situations we can use `@<match as is>@?` in the pattern string to mark an optional literal, for example:
`@<tool_call>@? @@json{name,args}@@ @</tool_call>@?`
Now let's talk about how to ensure the JSON object is a valid tool call, rather than something the user asked the model to output. I'm designing it like this:
`@@json@@` matches any JSON. You can also specify the type of the object to validate it, e.g. `@@json{name}@@` means the `name` field of the JSON object must be neither `undefined` nor `null`, and `@@json{name:string}@@` further requires the field to be a string.
All supported types:
|pattern|example|
|---|---|
|`any`| matches any value |
|`string`| `"text"` |
|`number`| `114514` |
|`[type]`| `[number]` => `[1, 2, 3, 4]` |
|`{type of values}`| `{number}` => `{"name1": 1, "name2": 5}` |
|`{name of field: type}`| `{name:string,args:any}` |
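A sketch of the field/type validation the table describes. The spec encoding here is an assumption of mine (a `"__values__"` key marks the `{type of values}` form, since it can't otherwise be distinguished from a `{field: type}` spec); it only illustrates the checks, not the proposed pattern syntax:

```python
def matches_spec(value, spec):
    """Check a parsed JSON value against a type spec from the table above."""
    if spec == "any":
        return True
    if spec == "string":
        return isinstance(value, str)
    if spec == "number":
        # bool is a subclass of int in Python, so exclude it explicitly
        return isinstance(value, (int, float)) and not isinstance(value, bool)
    if isinstance(spec, list):  # [type]: homogeneous array
        return isinstance(value, list) and all(matches_spec(v, spec[0]) for v in value)
    if isinstance(spec, dict):
        if not isinstance(value, dict):
            return False
        if "__values__" in spec:  # {type of values}: all values share one type
            return all(matches_spec(v, spec["__values__"]) for v in value.values())
        # {field: type}: each named field must exist, be non-null, and match
        return all(
            k in value and value[k] is not None and matches_spec(value[k], t)
            for k, t in spec.items()
        )
    return False
```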
You're welcome to share your opinions here.
Related: #5796
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7014/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7014/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7202
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7202/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7202/comments
|
https://api.github.com/repos/ollama/ollama/issues/7202/events
|
https://github.com/ollama/ollama/pull/7202
| 2,587,203,254
|
PR_kwDOJ0Z1Ps5-ma6Z
| 7,202
|
Add AI Summary Helper to list of community integrations
|
{
"login": "philffm",
"id": 6079545,
"node_id": "MDQ6VXNlcjYwNzk1NDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6079545?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/philffm",
"html_url": "https://github.com/philffm",
"followers_url": "https://api.github.com/users/philffm/followers",
"following_url": "https://api.github.com/users/philffm/following{/other_user}",
"gists_url": "https://api.github.com/users/philffm/gists{/gist_id}",
"starred_url": "https://api.github.com/users/philffm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/philffm/subscriptions",
"organizations_url": "https://api.github.com/users/philffm/orgs",
"repos_url": "https://api.github.com/users/philffm/repos",
"events_url": "https://api.github.com/users/philffm/events{/privacy}",
"received_events_url": "https://api.github.com/users/philffm/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-14T22:39:43
| 2024-12-11T00:13:07
| 2024-12-11T00:13:06
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7202",
"html_url": "https://github.com/ollama/ollama/pull/7202",
"diff_url": "https://github.com/ollama/ollama/pull/7202.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7202.patch",
"merged_at": "2024-12-11T00:13:06"
}
|
Adding AI Summary Helper to the community integrations list. The plugin lets users generate custom summaries of web content with tailored prompts right in the browser/DOM, which makes it compatible with send-to-Kindle, printing articles, etc. It now supports Ollama/LLaMA models.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7202/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7202/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/874
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/874/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/874/comments
|
https://api.github.com/repos/ollama/ollama/issues/874/events
|
https://github.com/ollama/ollama/issues/874
| 1,955,745,228
|
I_kwDOJ0Z1Ps50kk3M
| 874
|
Add flag `--web-root` for serving UI (w/ code example)
|
{
"login": "coolaj86",
"id": 122831,
"node_id": "MDQ6VXNlcjEyMjgzMQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/122831?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coolaj86",
"html_url": "https://github.com/coolaj86",
"followers_url": "https://api.github.com/users/coolaj86/followers",
"following_url": "https://api.github.com/users/coolaj86/following{/other_user}",
"gists_url": "https://api.github.com/users/coolaj86/gists{/gist_id}",
"starred_url": "https://api.github.com/users/coolaj86/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coolaj86/subscriptions",
"organizations_url": "https://api.github.com/users/coolaj86/orgs",
"repos_url": "https://api.github.com/users/coolaj86/repos",
"events_url": "https://api.github.com/users/coolaj86/events{/privacy}",
"received_events_url": "https://api.github.com/users/coolaj86/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 8
| 2023-10-22T03:31:23
| 2023-10-26T05:36:54
| 2023-10-25T19:13:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
**edit**: removed potentially confusing language that was given as an example, not a fixed implementation detail
```sh
ollama serve --web-root ./ollama-webui/
```
1. Serve `/api/*` to the API
2. For all other requests, return results from the web server
3. If the `--web-root` flag is given, serve that directory for static files (index.html, etc)
4. If `--web-root` is not given, (optionally) serve an embedded filesystem (i.e. a help page)
Here's a tested, working example:
```sh
# serves ./documentation/
go run main.go
# serves ./customui/
go run main.go --web-root ./customui/
```
```text
main.go
webui/index.html
customui/index.html
```
```go
package main
import (
"embed"
"flag"
"fmt"
"io/fs"
"net/http"
"os"
)
//go:embed webui/*
var defaultWebRoot embed.FS
func main() {
var webRoot string
flag.StringVar(&webRoot, "web-root", "", "serve the given GPT API Web Client")
flag.Parse()
mux := http.NewServeMux()
apiHandler := func(w http.ResponseWriter, r *http.Request) {
fmt.Fprint(w, "This is the API route.")
}
mux.HandleFunc("/api/", apiHandler)
var webRootHandler http.Handler
if len(webRoot) > 0 {
webRootFs := http.Dir(webRoot)
webRootHandler = http.FileServer(webRootFs)
} else {
webRootFs, err := fs.Sub(defaultWebRoot, "webui")
if err != nil {
// panic only because this should be impossible
panic(err)
}
webRootHttpFs := http.FS(webRootFs)
webRootHandler = http.FileServer(webRootHttpFs)
}
mux.Handle("/", webRootHandler)
port := os.Getenv("PORT")
if len(port) == 0 {
port = "8080"
}
addr := "0.0.0.0:" + port
fmt.Println("Serving on ", addr)
http.ListenAndServe(addr, mux)
}
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/874/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/874/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/6577
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6577/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6577/comments
|
https://api.github.com/repos/ollama/ollama/issues/6577/events
|
https://github.com/ollama/ollama/pull/6577
| 2,498,850,998
|
PR_kwDOJ0Z1Ps56DZ3Y
| 6,577
|
Update documentation: Change .bin to .gguf in GGUF file and adapter examples
|
{
"login": "rayfiyo",
"id": 108730891,
"node_id": "U_kgDOBnsaCw",
"avatar_url": "https://avatars.githubusercontent.com/u/108730891?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rayfiyo",
"html_url": "https://github.com/rayfiyo",
"followers_url": "https://api.github.com/users/rayfiyo/followers",
"following_url": "https://api.github.com/users/rayfiyo/following{/other_user}",
"gists_url": "https://api.github.com/users/rayfiyo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rayfiyo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rayfiyo/subscriptions",
"organizations_url": "https://api.github.com/users/rayfiyo/orgs",
"repos_url": "https://api.github.com/users/rayfiyo/repos",
"events_url": "https://api.github.com/users/rayfiyo/events{/privacy}",
"received_events_url": "https://api.github.com/users/rayfiyo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-31T13:39:52
| 2024-09-01T02:34:25
| 2024-09-01T02:34:25
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6577",
"html_url": "https://github.com/ollama/ollama/pull/6577",
"diff_url": "https://github.com/ollama/ollama/pull/6577.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6577.patch",
"merged_at": "2024-09-01T02:34:25"
}
|
This pull request updates the documentation to reflect the change from GGML to GGUF format.
Changes made:
- In the "Build from a GGUF file" section, updated the example Modelfile to use the .gguf extension instead of .bin
- Modified the explanatory text to refer to "GGUF file" instead of "GGUF bin file"
- In the "GGUF adapter" section, updated the example Modelfile to use the .gguf extension for the adapter file
(This is my first contribution to OSS, so I'm excited about it. Thank you.)
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6577/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6577/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4265
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4265/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4265/comments
|
https://api.github.com/repos/ollama/ollama/issues/4265/events
|
https://github.com/ollama/ollama/pull/4265
| 2,286,315,770
|
PR_kwDOJ0Z1Ps5u6zM7
| 4,265
|
routes: fix show llava models
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-08T19:43:19
| 2024-05-08T19:51:22
| 2024-05-08T19:51:21
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4265",
"html_url": "https://github.com/ollama/ollama/pull/4265",
"diff_url": "https://github.com/ollama/ollama/pull/4265.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4265.patch",
"merged_at": "2024-05-08T19:51:21"
}
|
The show model file isn't showing the projector because it's set to the name `projector` instead of `model`.
Also change the order so adapters/projectors appear ahead of template/system, grouping them with the language model.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4265/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4265/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7336
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7336/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7336/comments
|
https://api.github.com/repos/ollama/ollama/issues/7336/events
|
https://github.com/ollama/ollama/pull/7336
| 2,609,963,618
|
PR_kwDOJ0Z1Ps5_r9kJ
| 7,336
|
Update install.sh to support multiple init systems
|
{
"login": "Sachin-Bhat",
"id": 25080916,
"node_id": "MDQ6VXNlcjI1MDgwOTE2",
"avatar_url": "https://avatars.githubusercontent.com/u/25080916?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sachin-Bhat",
"html_url": "https://github.com/Sachin-Bhat",
"followers_url": "https://api.github.com/users/Sachin-Bhat/followers",
"following_url": "https://api.github.com/users/Sachin-Bhat/following{/other_user}",
"gists_url": "https://api.github.com/users/Sachin-Bhat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Sachin-Bhat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Sachin-Bhat/subscriptions",
"organizations_url": "https://api.github.com/users/Sachin-Bhat/orgs",
"repos_url": "https://api.github.com/users/Sachin-Bhat/repos",
"events_url": "https://api.github.com/users/Sachin-Bhat/events{/privacy}",
"received_events_url": "https://api.github.com/users/Sachin-Bhat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-10-23T22:28:33
| 2024-11-21T18:58:42
| 2024-11-21T18:58:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7336",
"html_url": "https://github.com/ollama/ollama/pull/7336",
"diff_url": "https://github.com/ollama/ollama/pull/7336.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7336.patch",
"merged_at": null
}
|
Hey folks,
Made a few additions to the script as follows:
- Support for Runit (tested), OpenRC and S6 (need help with testing)
The runit support works as expected; I need help testing OpenRC and s6. Please let me know if you face any challenges.
If you aren't able to start the service with the commands shown in the warning, consider using `sudo`.
Merging this closes #7332.
Cheers,
Sachin
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7336/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7336/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6155
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6155/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6155/comments
|
https://api.github.com/repos/ollama/ollama/issues/6155/events
|
https://github.com/ollama/ollama/issues/6155
| 2,446,639,425
|
I_kwDOJ0Z1Ps6R1MFB
| 6,155
|
Support Nested Parameters for Tools
|
{
"login": "kirel",
"id": 9124,
"node_id": "MDQ6VXNlcjkxMjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/9124?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kirel",
"html_url": "https://github.com/kirel",
"followers_url": "https://api.github.com/users/kirel/followers",
"following_url": "https://api.github.com/users/kirel/following{/other_user}",
"gists_url": "https://api.github.com/users/kirel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kirel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kirel/subscriptions",
"organizations_url": "https://api.github.com/users/kirel/orgs",
"repos_url": "https://api.github.com/users/kirel/repos",
"events_url": "https://api.github.com/users/kirel/events{/privacy}",
"received_events_url": "https://api.github.com/users/kirel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 7
| 2024-08-03T22:42:19
| 2024-11-06T00:54:07
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have been trying to get Ollama tool use to work with https://github.com/jekalmin/extended_openai_conversation but was getting errors. In the response from Ollama, `tool_calls.0.function.arguments` is not an object but a string containing the (sometimes correct) object with the params. I tried to debug where it's coming from, and it seems to be an issue with Ollama rather than the models (the behavior is consistent across the several models I tried).
I put together a minimal example:
```python
import requests
import json
# Define the API key and endpoint
api_key = 'ollama'
api_endpoint = 'http://localhost:11434/v1/chat/completions'
# Define the headers
headers = {
'Content-Type': 'application/json',
'Authorization': f'Bearer {api_key}',
}
tools = {
"tools": [
{
"type": "function",
"function": {
"name": "execute_services",
"description": "Use this function to execute service of devices in Home Assistant.",
"parameters": {
"type": "object",
"properties": {
"list": {
"type": "array",
"items": {
"type": "object",
"properties": {
"domain": {
"type": "string",
"description": "The domain of the service",
},
"service": {
"type": "string",
"description": "The service to be called",
},
"service_data": {
"type": "object",
"description": "The service data object to indicate what to control.",
"properties": {
"entity_id": {
"type": "string",
"description": "The entity_id retrieved from available devices. It must start with domain, followed by dot character.",
}
},
"required": ["entity_id"],
},
},
"required": ["domain", "service", "service_data"],
},
}
},
},
},
}
],
"tool_choice": "auto"
}
messages = [
{
"role": "system",
"content": "Ich möchte, dass du als Smart Home Manager von Home Assistant agierst. Ich werde Informationen über das Smart Home zusammen mit einer Frage bereitstellen. Du wirst wahrheitsgemäß Korrekturen vornehmen oder die Frage in einem Satz in Alltagssprache anhand der bereitgestellten Informationen beantworten.\n\nAktuelle Zeit: 2024-08-03 22:11:35.920461+02:00\n\nVerfügbare Geräte:\n```csv\nentity_id,name,state,aliases\nlight.buro_deckenlampe_2,Kinderzimmer Deckenlampe,off\nDer aktuelle Zustand der Geräte ist in den verfügbaren Geräten angegeben. Verwende die execute_services-Funktion nur für angeforderte Aktionen, nicht für aktuelle Zustände. Benötigst keine Bestätigung für tool calls. Wiederhole oder lobe nicht, was der Benutzer sagt, sondern stelle eine kurze Anfrage.",
},
{
"role": "user",
"content": "Mach das Licht aus.",
"name": "Daniel Kirsch",
},
]
# Define the request payload (the `|` dict merge below requires Python 3.9+)
payload = {
"model": "llama3.1:8b",
"messages": messages,
"max_tokens": 100,
"temperature": 0.5
} | tools
# Make the POST request
response = requests.post(api_endpoint, headers=headers, data=json.dumps(payload))
print(response.json())
print(type(response.json()['choices'][0]['message']['tool_calls'][0]['function']['arguments']))
```
I've tried
- llama3.1:8b
- llama3.1:70b
- mistral-nemo
- llama3-groq-tool-use
and they all suffer from the problem.
### OS
macOS, Windows
### GPU
Nvidia, Apple
### CPU
AMD, Apple
### Ollama version
0.3.3
|
{
"login": "kirel",
"id": 9124,
"node_id": "MDQ6VXNlcjkxMjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/9124?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kirel",
"html_url": "https://github.com/kirel",
"followers_url": "https://api.github.com/users/kirel/followers",
"following_url": "https://api.github.com/users/kirel/following{/other_user}",
"gists_url": "https://api.github.com/users/kirel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kirel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kirel/subscriptions",
"organizations_url": "https://api.github.com/users/kirel/orgs",
"repos_url": "https://api.github.com/users/kirel/repos",
"events_url": "https://api.github.com/users/kirel/events{/privacy}",
"received_events_url": "https://api.github.com/users/kirel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6155/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6155/timeline
| null |
reopened
| false
|
https://api.github.com/repos/ollama/ollama/issues/1802
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1802/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1802/comments
|
https://api.github.com/repos/ollama/ollama/issues/1802/events
|
https://github.com/ollama/ollama/pull/1802
| 2,066,741,855
|
PR_kwDOJ0Z1Ps5jR8mr
| 1,802
|
gpu: read memory info from all cuda devices
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-05T05:14:35
| 2024-01-05T16:25:59
| 2024-01-05T16:25:58
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1802",
"html_url": "https://github.com/ollama/ollama/pull/1802",
"diff_url": "https://github.com/ollama/ollama/pull/1802.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1802.patch",
"merged_at": "2024-01-05T16:25:58"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1802/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1802/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4939
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4939/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4939/comments
|
https://api.github.com/repos/ollama/ollama/issues/4939/events
|
https://github.com/ollama/ollama/issues/4939
| 2,341,899,466
|
I_kwDOJ0Z1Ps6LlozK
| 4,939
|
qwen2 fails on MacOS
|
{
"login": "MikeyBeez",
"id": 14264000,
"node_id": "MDQ6VXNlcjE0MjY0MDAw",
"avatar_url": "https://avatars.githubusercontent.com/u/14264000?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MikeyBeez",
"html_url": "https://github.com/MikeyBeez",
"followers_url": "https://api.github.com/users/MikeyBeez/followers",
"following_url": "https://api.github.com/users/MikeyBeez/following{/other_user}",
"gists_url": "https://api.github.com/users/MikeyBeez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MikeyBeez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MikeyBeez/subscriptions",
"organizations_url": "https://api.github.com/users/MikeyBeez/orgs",
"repos_url": "https://api.github.com/users/MikeyBeez/repos",
"events_url": "https://api.github.com/users/MikeyBeez/events{/privacy}",
"received_events_url": "https://api.github.com/users/MikeyBeez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-08T22:56:19
| 2024-06-08T22:57:48
| 2024-06-08T22:57:47
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ollama run qwen2
Error: llama runner process has terminated: signal: abort trap error:error loading model vocabulary: unknown pre-tokenizer type: 'qwen2'
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
ollama --version ollama version is 0.1.38
|
{
"login": "MikeyBeez",
"id": 14264000,
"node_id": "MDQ6VXNlcjE0MjY0MDAw",
"avatar_url": "https://avatars.githubusercontent.com/u/14264000?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MikeyBeez",
"html_url": "https://github.com/MikeyBeez",
"followers_url": "https://api.github.com/users/MikeyBeez/followers",
"following_url": "https://api.github.com/users/MikeyBeez/following{/other_user}",
"gists_url": "https://api.github.com/users/MikeyBeez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MikeyBeez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MikeyBeez/subscriptions",
"organizations_url": "https://api.github.com/users/MikeyBeez/orgs",
"repos_url": "https://api.github.com/users/MikeyBeez/repos",
"events_url": "https://api.github.com/users/MikeyBeez/events{/privacy}",
"received_events_url": "https://api.github.com/users/MikeyBeez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4939/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4939/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3655
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3655/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3655/comments
|
https://api.github.com/repos/ollama/ollama/issues/3655/events
|
https://github.com/ollama/ollama/pull/3655
| 2,244,141,005
|
PR_kwDOJ0Z1Ps5ss54O
| 3,655
|
Add simple rag-chatbot to community integrations
|
{
"login": "datvodinh",
"id": 90944231,
"node_id": "MDQ6VXNlcjkwOTQ0MjMx",
"avatar_url": "https://avatars.githubusercontent.com/u/90944231?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/datvodinh",
"html_url": "https://github.com/datvodinh",
"followers_url": "https://api.github.com/users/datvodinh/followers",
"following_url": "https://api.github.com/users/datvodinh/following{/other_user}",
"gists_url": "https://api.github.com/users/datvodinh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/datvodinh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/datvodinh/subscriptions",
"organizations_url": "https://api.github.com/users/datvodinh/orgs",
"repos_url": "https://api.github.com/users/datvodinh/repos",
"events_url": "https://api.github.com/users/datvodinh/events{/privacy}",
"received_events_url": "https://api.github.com/users/datvodinh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-15T16:38:47
| 2024-04-23T00:16:55
| 2024-04-23T00:16:55
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3655",
"html_url": "https://github.com/ollama/ollama/pull/3655",
"diff_url": "https://github.com/ollama/ollama/pull/3655.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3655.patch",
"merged_at": "2024-04-23T00:16:55"
}
|
- Hi there! I've been using Ollama for some time now, and I've been really pleased with it. Thanks so much for developing and keeping up with this project. It's been a great help for students like me to effortlessly run llm models locally.
- I have created a simple chatbot app with Ollama, with a simple interface so you can use and chat with multiple PDFs. I hope my contribution is useful for our community! I would be very happy if this project were included in the community integrations.
- My Repo: https://github.com/datvodinh/rag-chatbot.git
- Demo: https://github.com/datvodinh/rag-chatbot/blob/master/README.md
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3655/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3655/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7298
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7298/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7298/comments
|
https://api.github.com/repos/ollama/ollama/issues/7298/events
|
https://github.com/ollama/ollama/issues/7298
| 2,602,860,027
|
I_kwDOJ0Z1Ps6bJH37
| 7,298
|
llama3.1 llama3.2 Chat Template Typo
|
{
"login": "DexterLeung",
"id": 34372429,
"node_id": "MDQ6VXNlcjM0MzcyNDI5",
"avatar_url": "https://avatars.githubusercontent.com/u/34372429?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DexterLeung",
"html_url": "https://github.com/DexterLeung",
"followers_url": "https://api.github.com/users/DexterLeung/followers",
"following_url": "https://api.github.com/users/DexterLeung/following{/other_user}",
"gists_url": "https://api.github.com/users/DexterLeung/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DexterLeung/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DexterLeung/subscriptions",
"organizations_url": "https://api.github.com/users/DexterLeung/orgs",
"repos_url": "https://api.github.com/users/DexterLeung/repos",
"events_url": "https://api.github.com/users/DexterLeung/events{/privacy}",
"received_events_url": "https://api.github.com/users/DexterLeung/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-10-21T15:10:56
| 2024-10-21T16:40:09
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
It seems there is a typo in the following sentence of the chat template:
"When you receive a tool call response, use the output to format an answer to the **orginal** user question."
llama3.1: [948af2743fc7](https://ollama.com/library/llama3.1/blobs/948af2743fc7)
llama3.2: [966de95ca8a6](https://ollama.com/library/llama3.2/blobs/966de95ca8a6)
The word "original" is misspelled as "orginal".
Although LLMs typically map typos to their intended words, the misspelling may still have subtle, hard-to-measure effects on generation.
Interestingly, the typo appears to have originated from the official example of llama3.1 model card and has been widely used across many web resources.
https://www.llama.com/docs/model-cards-and-prompt-formats/llama3_1/
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7298/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7298/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/173
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/173/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/173/comments
|
https://api.github.com/repos/ollama/ollama/issues/173/events
|
https://github.com/ollama/ollama/pull/173
| 1,816,635,034
|
PR_kwDOJ0Z1Ps5WJEvb
| 173
|
change error handler behavior and fix error when a model isn't found
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-22T05:46:16
| 2023-07-22T06:02:12
| 2023-07-22T06:02:12
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/173",
"html_url": "https://github.com/ollama/ollama/pull/173",
"diff_url": "https://github.com/ollama/ollama/pull/173.diff",
"patch_url": "https://github.com/ollama/ollama/pull/173.patch",
"merged_at": "2023-07-22T06:02:12"
}
| null |
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/173/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/173/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/227
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/227/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/227/comments
|
https://api.github.com/repos/ollama/ollama/issues/227/events
|
https://github.com/ollama/ollama/issues/227
| 1,824,957,156
|
I_kwDOJ0Z1Ps5sxqLk
| 227
|
maximum upload/download speed not reached
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2023-07-27T18:33:52
| 2023-10-11T00:17:53
| 2023-10-11T00:17:44
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When running `ollama pull`, in some cases the download rate is lower than downloading with `wget` or the browser
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/227/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/227/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/881
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/881/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/881/comments
|
https://api.github.com/repos/ollama/ollama/issues/881/events
|
https://github.com/ollama/ollama/pull/881
| 1,957,572,962
|
PR_kwDOJ0Z1Ps5djVYw
| 881
|
ggufv3
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-10-23T16:39:33
| 2023-10-23T17:50:46
| 2023-10-23T17:50:45
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/881",
"html_url": "https://github.com/ollama/ollama/pull/881",
"diff_url": "https://github.com/ollama/ollama/pull/881.diff",
"patch_url": "https://github.com/ollama/ollama/pull/881.patch",
"merged_at": "2023-10-23T17:50:45"
}
|
GGUF v3 adds support for big endianness, mainly for the s390x architecture. While that's not currently supported by Ollama, the change is simple.
This also loosens the version check to be more forward compatible: unless specified otherwise, GGUF versions other than v1 will be decoded as v2.
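The endianness detection this relies on can be sketched as follows. This is an illustrative reconstruction, not the PR's actual Go code, assuming the GGUF header layout of a 4-byte `GGUF` magic followed by a `uint32` version: a big-endian file keeps the same magic bytes, so a little-endian read of a small version number comes out byte-swapped and implausibly large.

```python
import struct

def read_gguf_version(data: bytes):
    """Sketch of a forward-compatible GGUF header check (assumed layout:
    4-byte magic 'GGUF' + uint32 version). If a little-endian read yields
    an implausibly large version, re-read it big-endian."""
    magic, version = struct.unpack_from("<4sI", data, 0)
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    big_endian = False
    if version > 0xFFFF:  # e.g. version 3 read little-endian as 0x03000000
        version = struct.unpack_from(">I", data, 4)[0]
        big_endian = True
    return version, big_endian
```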
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/881/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/881/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6781
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6781/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6781/comments
|
https://api.github.com/repos/ollama/ollama/issues/6781/events
|
https://github.com/ollama/ollama/issues/6781
| 2,523,697,790
|
I_kwDOJ0Z1Ps6WbJJ-
| 6,781
|
ollama minicpm-v refused to deal with images
|
{
"login": "colin4k",
"id": 10140389,
"node_id": "MDQ6VXNlcjEwMTQwMzg5",
"avatar_url": "https://avatars.githubusercontent.com/u/10140389?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/colin4k",
"html_url": "https://github.com/colin4k",
"followers_url": "https://api.github.com/users/colin4k/followers",
"following_url": "https://api.github.com/users/colin4k/following{/other_user}",
"gists_url": "https://api.github.com/users/colin4k/gists{/gist_id}",
"starred_url": "https://api.github.com/users/colin4k/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/colin4k/subscriptions",
"organizations_url": "https://api.github.com/users/colin4k/orgs",
"repos_url": "https://api.github.com/users/colin4k/repos",
"events_url": "https://api.github.com/users/colin4k/events{/privacy}",
"received_events_url": "https://api.github.com/users/colin4k/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-09-13T01:40:44
| 2024-09-13T01:42:05
| 2024-09-13T01:42:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
> ollama run minicpm-v:latest "extract all java code from the image:~/Downloads/1.png"
> I'm sorry, but I am not able to view or access images. Can you please provide me with a textual description of what is in the image? Then I can try my best to help you with your question.
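The refusal here is the model answering as if it saw only text: the path embedded in the prompt string never reaches the vision encoder. Per the Ollama REST API, images are passed explicitly in an `images` field as base64 rather than in the prompt. A minimal sketch of building such a request body (model name and prompt are just examples from this issue):

```python
import base64
import json

def build_generate_payload(model: str, prompt: str, image_bytes: bytes) -> str:
    """Build a JSON body for Ollama's /api/generate endpoint, passing the
    image base64-encoded in the `images` field instead of embedding a
    file path in the prompt text."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    })

# POST this body to http://localhost:11434/api/generate
# payload = build_generate_payload(
#     "minicpm-v:latest", "extract all java code from the image", open("1.png", "rb").read())
```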
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.10
|
{
"login": "colin4k",
"id": 10140389,
"node_id": "MDQ6VXNlcjEwMTQwMzg5",
"avatar_url": "https://avatars.githubusercontent.com/u/10140389?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/colin4k",
"html_url": "https://github.com/colin4k",
"followers_url": "https://api.github.com/users/colin4k/followers",
"following_url": "https://api.github.com/users/colin4k/following{/other_user}",
"gists_url": "https://api.github.com/users/colin4k/gists{/gist_id}",
"starred_url": "https://api.github.com/users/colin4k/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/colin4k/subscriptions",
"organizations_url": "https://api.github.com/users/colin4k/orgs",
"repos_url": "https://api.github.com/users/colin4k/repos",
"events_url": "https://api.github.com/users/colin4k/events{/privacy}",
"received_events_url": "https://api.github.com/users/colin4k/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6781/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6781/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6443
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6443/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6443/comments
|
https://api.github.com/repos/ollama/ollama/issues/6443/events
|
https://github.com/ollama/ollama/issues/6443
| 2,475,788,298
|
I_kwDOJ0Z1Ps6TkYgK
| 6,443
|
Error: llama runner process no longer running: -1
|
{
"login": "ZINE-KHER",
"id": 56302539,
"node_id": "MDQ6VXNlcjU2MzAyNTM5",
"avatar_url": "https://avatars.githubusercontent.com/u/56302539?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZINE-KHER",
"html_url": "https://github.com/ZINE-KHER",
"followers_url": "https://api.github.com/users/ZINE-KHER/followers",
"following_url": "https://api.github.com/users/ZINE-KHER/following{/other_user}",
"gists_url": "https://api.github.com/users/ZINE-KHER/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZINE-KHER/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZINE-KHER/subscriptions",
"organizations_url": "https://api.github.com/users/ZINE-KHER/orgs",
"repos_url": "https://api.github.com/users/ZINE-KHER/repos",
"events_url": "https://api.github.com/users/ZINE-KHER/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZINE-KHER/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-08-20T14:20:25
| 2024-08-22T05:55:05
| 2024-08-21T13:26:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
I am facing the below error when trying to run ollama models (both llama3.1:8b-instruct-q4_1 and llama3.1:8b-instruct-fp16):
**Error: llama runner process no longer running: -1**
After checking **syslog** file, I found the following issue:
**ollama.listener llama_model_load: error loading model ... wrong number of tensors; expected 292, got 291**
I have **ollama==0.3.1** installed using pip. I also tried installing latest **ollama-linux-amd64** version **0.3.6** using binaries (this version is not, to the best of my knowledge, available using pip), but I got same errors.
These are my specs:
**OS:** ubuntu 22.04.4 LTS
**GPU:** Nvidia
**CPU:** Intel
**CUDA:** 11.5.119
Do you have any suggestions?
Thank you.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.1, 0.3.6
|
{
"login": "ZINE-KHER",
"id": 56302539,
"node_id": "MDQ6VXNlcjU2MzAyNTM5",
"avatar_url": "https://avatars.githubusercontent.com/u/56302539?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZINE-KHER",
"html_url": "https://github.com/ZINE-KHER",
"followers_url": "https://api.github.com/users/ZINE-KHER/followers",
"following_url": "https://api.github.com/users/ZINE-KHER/following{/other_user}",
"gists_url": "https://api.github.com/users/ZINE-KHER/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZINE-KHER/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZINE-KHER/subscriptions",
"organizations_url": "https://api.github.com/users/ZINE-KHER/orgs",
"repos_url": "https://api.github.com/users/ZINE-KHER/repos",
"events_url": "https://api.github.com/users/ZINE-KHER/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZINE-KHER/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6443/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6443/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6017
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6017/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6017/comments
|
https://api.github.com/repos/ollama/ollama/issues/6017/events
|
https://github.com/ollama/ollama/pull/6017
| 2,433,441,323
|
PR_kwDOJ0Z1Ps52pKyE
| 6,017
|
Updated Ollama4j link
|
{
"login": "amithkoujalgi",
"id": 1876165,
"node_id": "MDQ6VXNlcjE4NzYxNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/1876165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amithkoujalgi",
"html_url": "https://github.com/amithkoujalgi",
"followers_url": "https://api.github.com/users/amithkoujalgi/followers",
"following_url": "https://api.github.com/users/amithkoujalgi/following{/other_user}",
"gists_url": "https://api.github.com/users/amithkoujalgi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/amithkoujalgi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amithkoujalgi/subscriptions",
"organizations_url": "https://api.github.com/users/amithkoujalgi/orgs",
"repos_url": "https://api.github.com/users/amithkoujalgi/repos",
"events_url": "https://api.github.com/users/amithkoujalgi/events{/privacy}",
"received_events_url": "https://api.github.com/users/amithkoujalgi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-07-27T11:39:28
| 2024-09-03T17:13:28
| 2024-09-03T17:02:48
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6017",
"html_url": "https://github.com/ollama/ollama/pull/6017",
"diff_url": "https://github.com/ollama/ollama/pull/6017.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6017.patch",
"merged_at": null
}
|
Updated Ollama4j link and added link to Ollama4j Web UI tool.
|
{
"login": "amithkoujalgi",
"id": 1876165,
"node_id": "MDQ6VXNlcjE4NzYxNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/1876165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amithkoujalgi",
"html_url": "https://github.com/amithkoujalgi",
"followers_url": "https://api.github.com/users/amithkoujalgi/followers",
"following_url": "https://api.github.com/users/amithkoujalgi/following{/other_user}",
"gists_url": "https://api.github.com/users/amithkoujalgi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/amithkoujalgi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amithkoujalgi/subscriptions",
"organizations_url": "https://api.github.com/users/amithkoujalgi/orgs",
"repos_url": "https://api.github.com/users/amithkoujalgi/repos",
"events_url": "https://api.github.com/users/amithkoujalgi/events{/privacy}",
"received_events_url": "https://api.github.com/users/amithkoujalgi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6017/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6017/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3082
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3082/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3082/comments
|
https://api.github.com/repos/ollama/ollama/issues/3082/events
|
https://github.com/ollama/ollama/issues/3082
| 2,182,410,158
|
I_kwDOJ0Z1Ps6CFO-u
| 3,082
|
OpenRC init support for install.sh
|
{
"login": "ElevatedEuphoria",
"id": 50528556,
"node_id": "MDQ6VXNlcjUwNTI4NTU2",
"avatar_url": "https://avatars.githubusercontent.com/u/50528556?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ElevatedEuphoria",
"html_url": "https://github.com/ElevatedEuphoria",
"followers_url": "https://api.github.com/users/ElevatedEuphoria/followers",
"following_url": "https://api.github.com/users/ElevatedEuphoria/following{/other_user}",
"gists_url": "https://api.github.com/users/ElevatedEuphoria/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ElevatedEuphoria/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ElevatedEuphoria/subscriptions",
"organizations_url": "https://api.github.com/users/ElevatedEuphoria/orgs",
"repos_url": "https://api.github.com/users/ElevatedEuphoria/repos",
"events_url": "https://api.github.com/users/ElevatedEuphoria/events{/privacy}",
"received_events_url": "https://api.github.com/users/ElevatedEuphoria/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 6
| 2024-03-12T18:30:36
| 2024-07-05T08:08:10
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
It would be really nice if the install.sh script had an auto-detect feature to identify the currently running init system on a Linux machine during installation and then installed Ollama accordingly.
OS:
Gentoo Linux
Kernel: 6.7.6-gentoo-x86_64
Init System: OpenRC
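A minimal sketch of how install.sh could detect the init system (the function name and detection heuristics are assumptions, not part of the current script):

```shell
#!/bin/sh
# Hypothetical helper for install.sh: report which init system is active.
# Heuristics: systemd leaves /run/systemd/system on a running system;
# OpenRC ships the rc-service command. Fall back to "unknown".
detect_init() {
    if [ -d /run/systemd/system ]; then
        echo systemd
    elif command -v rc-service >/dev/null 2>&1; then
        echo openrc
    else
        echo unknown
    fi
}

detect_init
```

The installer could then branch on this value to install either a systemd unit or an OpenRC service script.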
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3082/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3082/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/146
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/146/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/146/comments
|
https://api.github.com/repos/ollama/ollama/issues/146/events
|
https://github.com/ollama/ollama/pull/146
| 1,814,620,295
|
PR_kwDOJ0Z1Ps5WCTFr
| 146
|
windows: fix model pulling
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-20T18:54:29
| 2023-07-20T20:41:59
| 2023-07-20T20:41:54
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/146",
"html_url": "https://github.com/ollama/ollama/pull/146",
"diff_url": "https://github.com/ollama/ollama/pull/146.diff",
"patch_url": "https://github.com/ollama/ollama/pull/146.patch",
"merged_at": "2023-07-20T20:41:54"
}
|
There are two issues preventing pull from working as expected in Windows.
1. Windows dislikes `os.Rename` when the file is still open. One approach is to close the file before calling rename. The approach taken in this PR is to call `os.Symlink` instead
2. Windows errors when file paths contain `:` so replace the `:` in the digest name with `-`, e.g. `sha256:0123456789abcdef...` with `sha256-0123456789abcdef...`. This is done _only_ for the blob file path. Non-file path instances of this string are unchanged
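The digest rewrite in point 2 can be sketched as follows (the function name is illustrative, not the actual identifier in this PR):

```go
package main

import (
	"fmt"
	"strings"
)

// blobFileName converts a digest such as "sha256:0123..." into a
// filesystem-safe name by replacing ":" (illegal in Windows paths)
// with "-". Per the PR description, only the blob file path uses this
// form; non-file-path occurrences of the digest keep the ":".
func blobFileName(digest string) string {
	return strings.ReplaceAll(digest, ":", "-")
}

func main() {
	fmt.Println(blobFileName("sha256:0123456789abcdef"))
	// → sha256-0123456789abcdef
}
```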
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/146/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/146/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3552
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3552/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3552/comments
|
https://api.github.com/repos/ollama/ollama/issues/3552/events
|
https://github.com/ollama/ollama/issues/3552
| 2,232,734,324
|
I_kwDOJ0Z1Ps6FFNJ0
| 3,552
|
/api/generate gets hung that can be steadily reproduced
|
{
"login": "peter-gz",
"id": 40975524,
"node_id": "MDQ6VXNlcjQwOTc1NTI0",
"avatar_url": "https://avatars.githubusercontent.com/u/40975524?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/peter-gz",
"html_url": "https://github.com/peter-gz",
"followers_url": "https://api.github.com/users/peter-gz/followers",
"following_url": "https://api.github.com/users/peter-gz/following{/other_user}",
"gists_url": "https://api.github.com/users/peter-gz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/peter-gz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/peter-gz/subscriptions",
"organizations_url": "https://api.github.com/users/peter-gz/orgs",
"repos_url": "https://api.github.com/users/peter-gz/repos",
"events_url": "https://api.github.com/users/peter-gz/events{/privacy}",
"received_events_url": "https://api.github.com/users/peter-gz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-04-09T06:56:05
| 2024-10-30T20:22:55
| 2024-10-30T20:22:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I keep running into Ollama getting hung from time to time when running `codellama:13b` for code completion. I notice a few issues reporting that Ollama gets hung, as mentioned in #1863, #1901, #2225, etc., but they haven't been fixed.
Now I have a test case that can steadily reproduce the issue, when `num_predict` is set to 160 or above (e.g. 200) with my prompt. Hope this helps the maintainers debug the problem.
I am running ollama `0.1.30` with a v100 GPU on Linux.
### What did you expect to see?
_No response_
### Steps to reproduce
run command:
`curl -s localhost:11434/api/generate -d @test_hung.json`
Put the following text into a test_hung.json file. Here I also hardcoded the seed and temperature to make it more reproducible.
```
{
"model": "codellama:13b",
"options": {
"num_predict": 160,
"stop": [
"<END>",
"<EOD>",
"<EOT>"
],
"seed":999,
"temperature": 0.0
},
"prompt": "<PRE> # Language: Shell\n# Path: /Users/panwh24/work/code/xxxxx-xxxxx-xxxxx/gateway/monitor.sh\n# a script to keep sending health check requests to a local http server with curl\n# if the healtch check timed-out in 10 seconds, kill the process\n\n#!/bin/bash\n\nrequest_body='{\"model\":\"codellama:13b\",\"messages\":[{\"role\":\"system\",\"content\":\"You are a helpful assistant. You can help me by answering my questions. You can also ask me questions.\"},{\"role\":\"user\",\"content\":\"test\"}]}'\n\nwhile true; do\n echo \"checking...\"\n curl --max-time 10 localhost:11434/api/chat -d request_body > /dev/null\n if [ $? != 0 ]; then\n cpu_util=$(ps -p `pgrep ollama` -o pcpu | grep -v CPU)\n echo \"server is down. cpu util is $cpu_util\"\n # kill the process\n # pkill -9 ollama\n # ./start.sh &\n fi\n sleep 30\ndoneFlask==3.0.3\nFlask_Cors==4.0.0\nRequests==2.31.0\nulid_py==1.1.0\n# Language: Python\n# Path: /Users/panwh24/work/code/xxxxx-copilot-vscode/gateway/gateway.py\n# -----------------------------------------------------------------------------\n# An API gateway written in Python3.\n# It proxies chat/fim API request to Ollama server, and records all requests/response into log files and provide metrics for debugging purpose.\n# Command arguments include 1) port to listen on 2) target server's hostname:port\n#\n# Author: panwh24@xxxxx.com\n# -----------------------------------------------------------------------------\n\nimport sys\nimport json\nimport io\nfrom flask import Flask, request, Response, stream_with_context, g\nfrom flask_cors import CORS\nimport requests\nimport time\nimport logging\nfrom logging.handlers import TimedRotatingFileHandler\nimport ulid\n\napp = Flask(__name__)\nCORS(app)\n\n# -----------------------------------------------------------------------------\n# Set up access log file\naccess_log = logging.getLogger('werkzeug')\naccess_log.setLevel(logging.INFO)\naccess_log_file = 'access.log'\naccess_log_handler = 
TimedRotatingFileHandler('access.log', when='midnight')\n# access_log_formatter = logging.Formatter('%(asctime)s %(levelname)s: %(message)s [in %(pathname)s:%(lineno)d]')\n# access_log_handler.setFormatter(access_log_formatter)\naccess_log.addHandler(access_log_handler)\n\n# Set up gateway log files\nchat_log = logging.getLogger('chat')\nchat_log.setLevel(logging.INFO)\nchat_log_handler = TimedRotatingFileHandler('chat.log', when='midnight')\nchat_log_formatter = logging.Formatter('[%(asctime)s] [%(levelname)s] %(message)s')\nchat_log_handler.setFormatter(chat_log_formatter)\nchat_log.addHandler(chat_log_handler)\n\ngenerate_log = logging.getLogger('generate')\ngenerate_log.setLevel(logging.INFO)\ngenerate_log_handler = TimedRotatingFileHandler('generate.log', when='midnight')\ngenerate_log_formatter = logging.Formatter('[%(asctime)s] [%(levelname)s] %(message)s')\ngenerate_log_handler.setFormatter(generate_log_formatter)\ngenerate_log.addHandler(generate_log_handler)\n\n# -----------------------------------------------------------------------------\n\n@app.route('/', methods=['GET'])\ndef index():\n response = requests.get('http://{}/'.format(target))\n return Response(response=response.text, status=response.status_code)\n\n@app.route('/api/tags', methods=['GET'])\ndef tags():\n response = requests.get('http://{}/api/tags'.format(target))\n return Response(response=response.text, status=response.status_code)\n\n@app.route('/api/chat', methods=['POST'])\ndef chat_api():\n # fields for logging\n remote = f\"{request.remote_addr}:{request.environ['REMOTE_PORT']}\"\n g.remote = remote\n request_id = request.headers.get('X-Request-Id', ulid.new())\n g.request_id = request_id\n g.api = 'chat'\n g.start_time = time.time()\n g.data = []\n\n chat_log.info('[{}] [{}] > Request size:{}\\n{}'.\n format(remote, request_id, request.content_length, request.data.decode('utf-8')))\n response = requests.post('http://{}/api/chat'.format(target), json=request.json, 
headers={'Content-Type': 'application/json'}, stream=True)\n\n def generate():\n # read the response in chunks. when hits a newline char, yield\n for line in response.iter_lines():\n if line:\n # record the content\n try:\n linedata = json.loads(line)\n except json.decoder.JSONDecodeError as e:\n print('decode error: ' + line)\n continue\n g.data.append(linedata['message']['content'])\n if linedata['done']:\n g.stats = line.decode()\n # return in stream\n yield line + b'\\n'\n\n # end of stream\n \n return stream_with_context(generate())\n\n@app.route('/api/generate', methods=['POST'])\ndef generate_api():\n # fields for logging\n remote = f\"{request.remote_addr}:{request.environ['REMOTE_PORT']}\"\n request_id = request.headers.get('X-Request-Id', ulid.new())\n client_info = request.headers.get('X-Client-Info', '')\n\n g.remote = remote\n g.request_id = request_id\n g.api = 'generate'\n g.start_time = time.time()\n g.data = []\n\n generate_log.info('[{}] [{}] > Request size:{}\\n> Client info: {}\\n> Request:\\n{}'.\n format(remote, request_id, request.content_length, client_info, request.data.decode('utf-8')))\n response = requests.post('http://{}/api/generate'.format(target), json=request.json, headers={'Content-Type': 'application/json'}, stream=True)\n\n def generate():\n # read the response in chunks. 
when hits a newline char, yield\n for line in response.iter_lines():\n if line:\n # record the content\n # print('>', time.time(), line.decode())\n try:\n linedata = json.loads(line)\n except json.decoder.JSONDecodeError as e:\n print('decode error: ' + line)\n continue\n g.data.append(linedata['response'])\n\n if linedata['done']:\n g.stats = line.decode()\n # return in stream\n yield line + b'\\n'\n\n # end of stream\n # print('>>> end of stream')\n # end = time.time()\n # result = buf.getvalue()\n # buf.close()\n # generate_log.info('[{}] [{}] > Response cost:{}ms, tokens:{}, size:{}, status:{}\\n> Result:\\n{}\\n> Stats: {}'.\n # format(remote, request_id, int((end - start)*1000), count, len(result), response.status_code, result, stats))\n \n # 打印请求的body\n print('>>> body', request.data)\n <SUF> \n\n return stream_with_context(generate())\n\n@app.route('/api/complete', methods=['POST'])\ndef complete_api():\n # fields for logging\n remote = f\"{request.remote_addr}:{request.environ['REMOTE_PORT']}\"\n request_id = request.headers.get('X-Request-Id', ulid.new())\n client_info = request.headers.get('X-Client-Info', '')\n \n \n return stream_with_context(generate())\n\n@app.teardown_request\ndef log_result(exception=None):\n if not hasattr(g, 'request_id') or not hasattr(g, 'api') or not hasattr(g, 'data'):\n return\n \n print('>>> end of request', g.request_id, \"api=\"+g.api, \"exception=\"+str(exception))\n\n result = ''.join(g.data)\n if g.api == 'chat':\n logger = chat_log\n elif g.api == 'generate':\n logger = generate_log\n else:\n return\n \n start = g.start_time\n end = time.time()\n logger.info('[{}] [{}] > Response cost:{}ms, tokens:{}, size:{}\\n> Result:\\n{}\\n> Stats: {}'.\n format(g.remote, g.request_id, int((end - start)*1000), len(g.data), len(result), result, g.get('stats', 'none')))\n\n\n# 上报提示成功\n@app.route(\"/prompt\", methods=['GET'])\ndef prompt():\n # fields for logging\n remote = 
f\"{request.remote_addr}:{request.environ['REMOTE_PORT']}\"\n request_id = request.headers.get('X-Request-Id', ulid.new())\n client_info = request.headers.get('X-Client-Info', '')\n\n generate_log.info('[{}] [{}] inline completion prompted\\n> Client info: {}'.format(remote, request_id, client_info))\n return 'ok'\n\n# 上报用户接受提示\n@app.route(\"/accept\", methods=['GET'])\ndef accept():\n # fields for logging\n remote = f\"{request.remote_addr}:{request.environ['REMOTE_PORT']}\"\n request_id = request.headers.get('X-Request-Id', ulid.new())\n client_info = request.headers.get('X-Client-Info', '')\n\n generate_log.info('[{}] [{}] [{}] user accepted\\n> Client info: {}'.format(remote, request_id, client_info))\n return 'ok'\n\n# -----------------------------------------------------------------------------\n\nif __name__ == '__main__':\n if len(sys.argv) != 3:\n print('Usage: python gateway.py <port> <target hostname:port>')\n sys.exit()\n\n port = int(sys.argv[1])\n target = sys.argv[2]\n app.run(host='0.0.0.0', port=port, debug=True)\n\n # http://10.16.112.219:8001/\n\n # TODO: add graceful shutdown with SIGTERM signal handler\n # TODO: add metrics for request/response time, error rate, etc.\n # TODO: add authentication and authorization for requests to /chat/fim\n # TODO: add logging for all requests and responses, including error messages\n # TODO: add support for multiple target servers with load balancing\n # TODO: add support for request throttling\n # TODO: add support for response caching\n # TODO: add support for metrics and monitoring\n # TODO: add support for tracing and debugging\n <MID>",
"raw": true
}
```
The output stream gets stuck here and I have to `pkill -9 ollama` to recover.

When stuck, CPU utilization of the ollama process is 100%, while GPU usage is 0%.

Everything works fine if I change `num_predict` to 150 in the request.
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
amd64
### Platform
_No response_
### Ollama version
0.1.30
### GPU
Nvidia
### GPU info
v100
### CPU
_No response_
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3552/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3552/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1531
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1531/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1531/comments
|
https://api.github.com/repos/ollama/ollama/issues/1531/events
|
https://github.com/ollama/ollama/issues/1531
| 2,042,633,609
|
I_kwDOJ0Z1Ps55wB2J
| 1,531
|
ollama run llava --verbose empty
|
{
"login": "ivanfioravanti",
"id": 1069210,
"node_id": "MDQ6VXNlcjEwNjkyMTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1069210?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivanfioravanti",
"html_url": "https://github.com/ivanfioravanti",
"followers_url": "https://api.github.com/users/ivanfioravanti/followers",
"following_url": "https://api.github.com/users/ivanfioravanti/following{/other_user}",
"gists_url": "https://api.github.com/users/ivanfioravanti/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivanfioravanti/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivanfioravanti/subscriptions",
"organizations_url": "https://api.github.com/users/ivanfioravanti/orgs",
"repos_url": "https://api.github.com/users/ivanfioravanti/repos",
"events_url": "https://api.github.com/users/ivanfioravanti/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivanfioravanti/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-12-14T23:03:33
| 2023-12-17T07:12:53
| 2023-12-17T07:12:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Verbose mode does not always return a response and correct results.
See video
https://github.com/jmorganca/ollama/assets/1069210/f28d74d3-86cd-4320-88ca-18115c04a099
|
{
"login": "ivanfioravanti",
"id": 1069210,
"node_id": "MDQ6VXNlcjEwNjkyMTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1069210?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ivanfioravanti",
"html_url": "https://github.com/ivanfioravanti",
"followers_url": "https://api.github.com/users/ivanfioravanti/followers",
"following_url": "https://api.github.com/users/ivanfioravanti/following{/other_user}",
"gists_url": "https://api.github.com/users/ivanfioravanti/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ivanfioravanti/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ivanfioravanti/subscriptions",
"organizations_url": "https://api.github.com/users/ivanfioravanti/orgs",
"repos_url": "https://api.github.com/users/ivanfioravanti/repos",
"events_url": "https://api.github.com/users/ivanfioravanti/events{/privacy}",
"received_events_url": "https://api.github.com/users/ivanfioravanti/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1531/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1531/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/5989
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5989/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5989/comments
|
https://api.github.com/repos/ollama/ollama/issues/5989/events
|
https://github.com/ollama/ollama/issues/5989
| 2,432,514,897
|
I_kwDOJ0Z1Ps6Q_TtR
| 5,989
|
Tools should support streaming=true
|
{
"login": "drazdra",
"id": 133811709,
"node_id": "U_kgDOB_nN_Q",
"avatar_url": "https://avatars.githubusercontent.com/u/133811709?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/drazdra",
"html_url": "https://github.com/drazdra",
"followers_url": "https://api.github.com/users/drazdra/followers",
"following_url": "https://api.github.com/users/drazdra/following{/other_user}",
"gists_url": "https://api.github.com/users/drazdra/gists{/gist_id}",
"starred_url": "https://api.github.com/users/drazdra/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/drazdra/subscriptions",
"organizations_url": "https://api.github.com/users/drazdra/orgs",
"repos_url": "https://api.github.com/users/drazdra/repos",
"events_url": "https://api.github.com/users/drazdra/events{/privacy}",
"received_events_url": "https://api.github.com/users/drazdra/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-07-26T15:51:44
| 2024-09-04T04:23:18
| 2024-09-04T04:23:18
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When stream=true, Ollama doesn't return the tool request in the final "done" message; instead it returns it part by part as if it were a regular reply.
Moreover, we have no way to determine that it was a tool request, because Ollama doesn't change the role to "tool" — it stays "assistant". Because of that we cannot hide the tool-call code from users: we can't recognize that it's a tool request rather than a normal reply until we get the final "done" message saying "done_reason='stop'", which is also not a good way to detect it.
In addition, the final "done" message doesn't carry the tool request field when stream=true, as it does when stream=false. The field is simply absent.
Both of these obviously make no sense. It would make sense to:
1. In a streaming reply, use a special role in the message that carries a tool request, e.g. role='tool'.
2. In any case, include the tool request in the final "done" message, so it can be taken from there just as it can when stream=false today.
I would suggest the best approach is to drop the streaming flag internally when returning a tool request, since streaming is totally unneeded here: we need users to see content live, but we never show a function call to users and cannot execute it partially.
So you just need to return the tool call in a single "done" object when the model produces a tool request.
More context:
The reason we cannot simply disable streaming when using tools is that tools are supposed to be called spontaneously during the chat, whenever the _model_ decides it needs a tool and requests it.
That means this happens during a regular chat, and we cannot predict when the model will issue a tool request.
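An illustrative client-side sketch (no network; field names follow the API shapes discussed above, but the buffering policy is my own suggestion): buffer streamed chunks and only surface them once the final "done" chunk confirms whether the reply was a tool request or plain content.

```python
def consume_stream(chunks):
    """Buffer streamed chat chunks; decide at the final "done" chunk
    whether the reply was a tool call or ordinary assistant content.

    `chunks` is an iterable of dicts shaped like streamed /api/chat
    responses: {"message": {"content": ...}, "done": bool}.
    """
    buffered = []
    for chunk in chunks:
        if chunk.get("done"):
            message = chunk.get("message", {})
            # Proposed behavior: the final chunk carries tool_calls,
            # as it does today when stream=false.
            if "tool_calls" in message:
                return {"tool_calls": message["tool_calls"]}
            return {"content": "".join(buffered)}
        buffered.append(chunk["message"]["content"])
    return {"content": "".join(buffered)}
```

With this shape a client never has to guess mid-stream whether tokens belong to a tool call, which is exactly the ambiguity this issue describes.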
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5989/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5989/timeline
| null |
completed
| false
|