url: stringlengths 51-54
repository_url: stringclasses (1 value)
labels_url: stringlengths 65-68
comments_url: stringlengths 60-63
events_url: stringlengths 58-61
html_url: stringlengths 39-44
id: int64 (1.78B-2.82B)
node_id: stringlengths 18-19
number: int64 (1-8.69k)
title: stringlengths 1-382
user: dict
labels: listlengths 0-5
state: stringclasses (2 values)
locked: bool (1 class)
assignee: dict
assignees: listlengths 0-2
milestone: null
comments: int64 (0-323)
created_at: timestamp[s]
updated_at: timestamp[s]
closed_at: timestamp[s]
author_association: stringclasses (4 values)
sub_issues_summary: dict
active_lock_reason: null
draft: bool (2 classes)
pull_request: dict
body: stringlengths 2-118k
closed_by: dict
reactions: dict
timeline_url: stringlengths 60-63
performed_via_github_app: null
state_reason: stringclasses (4 values)
is_pull_request: bool (2 classes)
https://api.github.com/repos/ollama/ollama/issues/509
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/509/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/509/comments
https://api.github.com/repos/ollama/ollama/issues/509/events
https://github.com/ollama/ollama/pull/509
1,891,425,301
PR_kwDOJ0Z1Ps5aEmqk
509
support for packaging in multiple cuda runners
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-09-11T23:38:14
2023-09-14T19:08:15
2023-09-14T19:08:13
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/509", "html_url": "https://github.com/ollama/ollama/pull/509", "diff_url": "https://github.com/ollama/ollama/pull/509.diff", "patch_url": "https://github.com/ollama/ollama/pull/509.patch", "merged_at": "2023-09-14T19:08:13" }
null
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/509/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/509/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5023
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5023/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5023/comments
https://api.github.com/repos/ollama/ollama/issues/5023/events
https://github.com/ollama/ollama/issues/5023
2,351,350,308
I_kwDOJ0Z1Ps6MJsIk
5,023
DeepSeek-V2-Lite-Chat - ERROR [validate_model_chat_template] The chat template comes with this model is not yet supported
{ "login": "OldishCoder", "id": 142182836, "node_id": "U_kgDOCHmJtA", "avatar_url": "https://avatars.githubusercontent.com/u/142182836?v=4", "gravatar_id": "", "url": "https://api.github.com/users/OldishCoder", "html_url": "https://github.com/OldishCoder", "followers_url": "https://api.github.com/users/OldishCoder/followers", "following_url": "https://api.github.com/users/OldishCoder/following{/other_user}", "gists_url": "https://api.github.com/users/OldishCoder/gists{/gist_id}", "starred_url": "https://api.github.com/users/OldishCoder/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/OldishCoder/subscriptions", "organizations_url": "https://api.github.com/users/OldishCoder/orgs", "repos_url": "https://api.github.com/users/OldishCoder/repos", "events_url": "https://api.github.com/users/OldishCoder/events{/privacy}", "received_events_url": "https://api.github.com/users/OldishCoder/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-06-13T14:47:48
2024-06-30T05:41:16
2024-06-13T16:39:31
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? When I load DeepSeek-V2 I see this message: `ERROR [validate_model_chat_template] The chat template comes with this model is not yet supported, falling back to chatml. This may cause the model to output suboptimal responses | tid="139815279423488" timestamp=1718280244` I do not know if it's related, but on inference the model sometimes, randomly (?), starts generating exclusively in Chinese, even if I ask it to generate the response in English. ### OS WSL2 ### GPU Nvidia ### CPU Intel ### Ollama version git HEAD from June 11, 2024
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5023/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5023/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7982
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7982/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7982/comments
https://api.github.com/repos/ollama/ollama/issues/7982/events
https://github.com/ollama/ollama/issues/7982
2,724,302,558
I_kwDOJ0Z1Ps6iYY7e
7,982
exit status 2
{ "login": "bb159", "id": 128159516, "node_id": "U_kgDOB6OPHA", "avatar_url": "https://avatars.githubusercontent.com/u/128159516?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bb159", "html_url": "https://github.com/bb159", "followers_url": "https://api.github.com/users/bb159/followers", "following_url": "https://api.github.com/users/bb159/following{/other_user}", "gists_url": "https://api.github.com/users/bb159/gists{/gist_id}", "starred_url": "https://api.github.com/users/bb159/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bb159/subscriptions", "organizations_url": "https://api.github.com/users/bb159/orgs", "repos_url": "https://api.github.com/users/bb159/repos", "events_url": "https://api.github.com/users/bb159/events{/privacy}", "received_events_url": "https://api.github.com/users/bb159/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6677367769, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q", "url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info", "name": "needs more info", "color": "BA8041", "default": false, "description": "More information is needed to assist" } ]
closed
false
null
[]
null
1
2024-12-07T03:25:46
2024-12-23T08:06:16
2024-12-23T08:06:16
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? exit status 2 ### OS Windows ### GPU Nvidia ### CPU Intel ### Ollama version ollama version is 0.5.1
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/users/rick-github/followers", "following_url": "https://api.github.com/users/rick-github/following{/other_user}", "gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}", "starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rick-github/subscriptions", "organizations_url": "https://api.github.com/users/rick-github/orgs", "repos_url": "https://api.github.com/users/rick-github/repos", "events_url": "https://api.github.com/users/rick-github/events{/privacy}", "received_events_url": "https://api.github.com/users/rick-github/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7982/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7982/timeline
null
not_planned
false
https://api.github.com/repos/ollama/ollama/issues/773
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/773/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/773/comments
https://api.github.com/repos/ollama/ollama/issues/773/events
https://github.com/ollama/ollama/pull/773
1,940,896,513
PR_kwDOJ0Z1Ps5crty9
773
add how to quantize doc
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.github.com/users/technovangelist/followers", "following_url": "https://api.github.com/users/technovangelist/following{/other_user}", "gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}", "starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions", "organizations_url": "https://api.github.com/users/technovangelist/orgs", "repos_url": "https://api.github.com/users/technovangelist/repos", "events_url": "https://api.github.com/users/technovangelist/events{/privacy}", "received_events_url": "https://api.github.com/users/technovangelist/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-10-12T22:35:56
2023-10-15T21:06:47
2023-10-14T15:29:39
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/773", "html_url": "https://github.com/ollama/ollama/pull/773", "diff_url": "https://github.com/ollama/ollama/pull/773.diff", "patch_url": "https://github.com/ollama/ollama/pull/773.patch", "merged_at": "2023-10-14T15:29:39" }
null
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.github.com/users/technovangelist/followers", "following_url": "https://api.github.com/users/technovangelist/following{/other_user}", "gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}", "starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions", "organizations_url": "https://api.github.com/users/technovangelist/orgs", "repos_url": "https://api.github.com/users/technovangelist/repos", "events_url": "https://api.github.com/users/technovangelist/events{/privacy}", "received_events_url": "https://api.github.com/users/technovangelist/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/773/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/773/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6276
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6276/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6276/comments
https://api.github.com/repos/ollama/ollama/issues/6276/events
https://github.com/ollama/ollama/pull/6276
2,457,229,099
PR_kwDOJ0Z1Ps536Xcl
6,276
feat: K/V cache quantisation (massive vRAM improvement!)
{ "login": "sammcj", "id": 862951, "node_id": "MDQ6VXNlcjg2Mjk1MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/862951?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sammcj", "html_url": "https://github.com/sammcj", "followers_url": "https://api.github.com/users/sammcj/followers", "following_url": "https://api.github.com/users/sammcj/following{/other_user}", "gists_url": "https://api.github.com/users/sammcj/gists{/gist_id}", "starred_url": "https://api.github.com/users/sammcj/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sammcj/subscriptions", "organizations_url": "https://api.github.com/users/sammcj/orgs", "repos_url": "https://api.github.com/users/sammcj/repos", "events_url": "https://api.github.com/users/sammcj/events{/privacy}", "received_events_url": "https://api.github.com/users/sammcj/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-08-09T06:49:35
2024-08-09T07:18:52
2024-08-09T07:18:51
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
true
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6276", "html_url": "https://github.com/ollama/ollama/pull/6276", "diff_url": "https://github.com/ollama/ollama/pull/6276.diff", "patch_url": "https://github.com/ollama/ollama/pull/6276.patch", "merged_at": null }
This PR introduces optional K/V (context) cache quantisation. (PR recreated after GitHub broke https://github.com/ollama/ollama/pull/5894 🤦) In addition, the deprecated `F16KV` parameter has been removed; if a user wishes for some reason to run the KV cache at f32, they can provide that as an option. ## Impact - With defaults (f16) - none, behaviour is the same as the current defaults. - With q8_0 - **The K/V context cache will consume 1/2 the vRAM** (!) - A _very_ small loss in quality within the cache - With q4_0 - **the K/V context cache will consume 1/4 the vRAM** (!!) - A small/medium loss in quality within the cache - For example, loading llama3.1 8b with a 32K context drops cache vRAM usage from 4GB to 1.1GB - q4_1 and q5_1 fall in between. Additional quantisations supported by llama.cpp and this PR, whose suitability may depend on the quantisation of the model you're running: `q5_1`, `q5_0`, `q4_1`, `iq4_nl` - Fixes https://github.com/ollama/ollama/issues/5091 - Related discussion in llama.cpp - https://github.com/ggerganov/llama.cpp/discussions/5932 - (Note that ExllamaV2 has a similar feature - https://github.com/turboderp/exllamav2/blob/master/doc/qcache_eval.md) ## Screenshots Example of estimated (v)RAM savings - f16 (q8_0,q4_0) <img width="1211" alt="image" src="https://github.com/user-attachments/assets/a3520770-7b31-40c7-b45b-4aad6db9b117"> ### f16 ![kv_cache_f16](https://github.com/user-attachments/assets/af0a3b40-70e2-47f1-90b0-6ecd09dc59df) ### q4_0 ![kv_cache_q4_0](https://github.com/user-attachments/assets/47ba6578-1f5b-4091-8594-f63ecfada49e) ### q8_0 ![kv_cache_q8_0](https://github.com/user-attachments/assets/c7c09e62-4b54-4536-9617-6b00b1af6f94) ## Performance llama.cpp did some perplexity measurements (although looking at the commits, things have likely improved even further since May when they were done, and CUDA graphs were later fixed, etc.): https://github.com/ggerganov/llama.cpp/pull/7412#issuecomment-2120427347 As far as I can tell (at least with q6_k quants) there isn't much of a noticeable hit to performance.
{ "login": "sammcj", "id": 862951, "node_id": "MDQ6VXNlcjg2Mjk1MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/862951?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sammcj", "html_url": "https://github.com/sammcj", "followers_url": "https://api.github.com/users/sammcj/followers", "following_url": "https://api.github.com/users/sammcj/following{/other_user}", "gists_url": "https://api.github.com/users/sammcj/gists{/gist_id}", "starred_url": "https://api.github.com/users/sammcj/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sammcj/subscriptions", "organizations_url": "https://api.github.com/users/sammcj/orgs", "repos_url": "https://api.github.com/users/sammcj/repos", "events_url": "https://api.github.com/users/sammcj/events{/privacy}", "received_events_url": "https://api.github.com/users/sammcj/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6276/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6276/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3159
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3159/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3159/comments
https://api.github.com/repos/ollama/ollama/issues/3159/events
https://github.com/ollama/ollama/issues/3159
2,187,627,079
I_kwDOJ0Z1Ps6CZIpH
3,159
A way to communicate reasons for low performance to users of CLI & API
{ "login": "easp", "id": 414705, "node_id": "MDQ6VXNlcjQxNDcwNQ==", "avatar_url": "https://avatars.githubusercontent.com/u/414705?v=4", "gravatar_id": "", "url": "https://api.github.com/users/easp", "html_url": "https://github.com/easp", "followers_url": "https://api.github.com/users/easp/followers", "following_url": "https://api.github.com/users/easp/following{/other_user}", "gists_url": "https://api.github.com/users/easp/gists{/gist_id}", "starred_url": "https://api.github.com/users/easp/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/easp/subscriptions", "organizations_url": "https://api.github.com/users/easp/orgs", "repos_url": "https://api.github.com/users/easp/repos", "events_url": "https://api.github.com/users/easp/events{/privacy}", "received_events_url": "https://api.github.com/users/easp/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
1
2024-03-15T03:02:34
2024-05-18T03:18:55
2024-05-18T03:18:55
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
People are often concerned about what they perceive to be low performance and/or whether Ollama is making optimal use of their RAM/VRAM/GPU/CPU cores. This comes up frequently in GitHub issues, the main Discord channel and the Discord help channel. I think a lot of these queries could be avoided if Ollama communicated more information to users in a more concise and obvious way than digging through the log file. Perhaps a message in the CLI, just before displaying the REPL prompt, and a status message field in the API response. Common conditions seem to be * Model + context too large for VRAM * GPU detected but not used, using CPU for inference * No GPU detected, using CPU for inference The message should warn that performance will be low and point users towards remedies: * Using a model with fewer parameters and/or smaller quantization * A URL for troubleshooting issues with GPUs.
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3159/reactions", "total_count": 5, "+1": 5, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3159/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5128
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5128/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5128/comments
https://api.github.com/repos/ollama/ollama/issues/5128/events
https://github.com/ollama/ollama/pull/5128
2,361,020,058
PR_kwDOJ0Z1Ps5y4-lV
5,128
Fix levelzero empty symbol detect
{ "login": "zhewang1-intc", "id": 72838274, "node_id": "MDQ6VXNlcjcyODM4Mjc0", "avatar_url": "https://avatars.githubusercontent.com/u/72838274?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhewang1-intc", "html_url": "https://github.com/zhewang1-intc", "followers_url": "https://api.github.com/users/zhewang1-intc/followers", "following_url": "https://api.github.com/users/zhewang1-intc/following{/other_user}", "gists_url": "https://api.github.com/users/zhewang1-intc/gists{/gist_id}", "starred_url": "https://api.github.com/users/zhewang1-intc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhewang1-intc/subscriptions", "organizations_url": "https://api.github.com/users/zhewang1-intc/orgs", "repos_url": "https://api.github.com/users/zhewang1-intc/repos", "events_url": "https://api.github.com/users/zhewang1-intc/events{/privacy}", "received_events_url": "https://api.github.com/users/zhewang1-intc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-06-19T01:30:27
2024-07-02T05:46:36
2024-06-19T15:33:16
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5128", "html_url": "https://github.com/ollama/ollama/pull/5128", "diff_url": "https://github.com/ollama/ollama/pull/5128.diff", "patch_url": "https://github.com/ollama/ollama/pull/5128.patch", "merged_at": "2024-06-19T15:33:16" }
We noticed that many users report Ollama crashing on Windows when it tries to discover Intel GPUs, after Intel GPU support was introduced. This issue may occur on some older Intel CPUs (before 11th Gen): the iGPU driver library for pre-11th-Gen processors lacks the symbols that Level Zero initialisation needs, such as `zesInit`. There is also a bug in `oneapi_init`: `l[i].p` is a pointer to a pointer, and the function pointer is stored in `*l[i].p`, so we should check whether `*l[i].p` is a null pointer to verify that the function was loaded successfully.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5128/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5128/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/8027
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8027/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8027/comments
https://api.github.com/repos/ollama/ollama/issues/8027/events
https://github.com/ollama/ollama/issues/8027
2,730,726,874
I_kwDOJ0Z1Ps6iw5Xa
8,027
ollama_llama_server not signed
{ "login": "disenchant", "id": 686611, "node_id": "MDQ6VXNlcjY4NjYxMQ==", "avatar_url": "https://avatars.githubusercontent.com/u/686611?v=4", "gravatar_id": "", "url": "https://api.github.com/users/disenchant", "html_url": "https://github.com/disenchant", "followers_url": "https://api.github.com/users/disenchant/followers", "following_url": "https://api.github.com/users/disenchant/following{/other_user}", "gists_url": "https://api.github.com/users/disenchant/gists{/gist_id}", "starred_url": "https://api.github.com/users/disenchant/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/disenchant/subscriptions", "organizations_url": "https://api.github.com/users/disenchant/orgs", "repos_url": "https://api.github.com/users/disenchant/repos", "events_url": "https://api.github.com/users/disenchant/events{/privacy}", "received_events_url": "https://api.github.com/users/disenchant/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6677279472, "node_id": "LA_kwDOJ0Z1Ps8AAAABjf8y8A", "url": "https://api.github.com/repos/ollama/ollama/labels/macos", "name": "macos", "color": "E2DBC0", "default": false, "description": "" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
1
2024-12-10T17:24:04
2024-12-10T18:29:26
2024-12-10T18:29:26
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? The `ollama_llama_server` binary for macOS doesn't seem to be signed. This causes problems e.g. when using binary authorization tools like [Santa](https://github.com/northpolesec/santa) to allowlist the usage of Ollama on machines. ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version 0.5.1
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8027/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8027/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/527
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/527/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/527/comments
https://api.github.com/repos/ollama/ollama/issues/527/events
https://github.com/ollama/ollama/pull/527
1,895,416,452
PR_kwDOJ0Z1Ps5aSFdO
527
Update API docs
{ "login": "technovangelist", "id": 633681, "node_id": "MDQ6VXNlcjYzMzY4MQ==", "avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4", "gravatar_id": "", "url": "https://api.github.com/users/technovangelist", "html_url": "https://github.com/technovangelist", "followers_url": "https://api.github.com/users/technovangelist/followers", "following_url": "https://api.github.com/users/technovangelist/following{/other_user}", "gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}", "starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions", "organizations_url": "https://api.github.com/users/technovangelist/orgs", "repos_url": "https://api.github.com/users/technovangelist/repos", "events_url": "https://api.github.com/users/technovangelist/events{/privacy}", "received_events_url": "https://api.github.com/users/technovangelist/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-09-14T00:00:30
2023-09-14T15:51:27
2023-09-14T15:51:26
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/527", "html_url": "https://github.com/ollama/ollama/pull/527", "diff_url": "https://github.com/ollama/ollama/pull/527.diff", "patch_url": "https://github.com/ollama/ollama/pull/527.patch", "merged_at": "2023-09-14T15:51:26" }
cleanup docs, add show and push.
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/users/mchiang0610/followers", "following_url": "https://api.github.com/users/mchiang0610/following{/other_user}", "gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}", "starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions", "organizations_url": "https://api.github.com/users/mchiang0610/orgs", "repos_url": "https://api.github.com/users/mchiang0610/repos", "events_url": "https://api.github.com/users/mchiang0610/events{/privacy}", "received_events_url": "https://api.github.com/users/mchiang0610/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/527/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/527/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1536
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1536/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1536/comments
https://api.github.com/repos/ollama/ollama/issues/1536/events
https://github.com/ollama/ollama/issues/1536
2,042,796,710
I_kwDOJ0Z1Ps55wpqm
1,536
Ability to keep a model in memory for longer
{ "login": "helloimcx", "id": 27777511, "node_id": "MDQ6VXNlcjI3Nzc3NTEx", "avatar_url": "https://avatars.githubusercontent.com/u/27777511?v=4", "gravatar_id": "", "url": "https://api.github.com/users/helloimcx", "html_url": "https://github.com/helloimcx", "followers_url": "https://api.github.com/users/helloimcx/followers", "following_url": "https://api.github.com/users/helloimcx/following{/other_user}", "gists_url": "https://api.github.com/users/helloimcx/gists{/gist_id}", "starred_url": "https://api.github.com/users/helloimcx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/helloimcx/subscriptions", "organizations_url": "https://api.github.com/users/helloimcx/orgs", "repos_url": "https://api.github.com/users/helloimcx/repos", "events_url": "https://api.github.com/users/helloimcx/events{/privacy}", "received_events_url": "https://api.github.com/users/helloimcx/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
21
2023-12-15T02:45:45
2025-01-18T05:04:20
2024-01-26T23:53:32
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Is there a way to keep the model in memory or in GPU memory?
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1536/reactions", "total_count": 6, "+1": 6, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1536/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/857
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/857/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/857/comments
https://api.github.com/repos/ollama/ollama/issues/857/events
https://github.com/ollama/ollama/issues/857
1,954,737,189
I_kwDOJ0Z1Ps50guwl
857
Manually download and upload models
{ "login": "dawnpatrol04", "id": 10486066, "node_id": "MDQ6VXNlcjEwNDg2MDY2", "avatar_url": "https://avatars.githubusercontent.com/u/10486066?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dawnpatrol04", "html_url": "https://github.com/dawnpatrol04", "followers_url": "https://api.github.com/users/dawnpatrol04/followers", "following_url": "https://api.github.com/users/dawnpatrol04/following{/other_user}", "gists_url": "https://api.github.com/users/dawnpatrol04/gists{/gist_id}", "starred_url": "https://api.github.com/users/dawnpatrol04/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dawnpatrol04/subscriptions", "organizations_url": "https://api.github.com/users/dawnpatrol04/orgs", "repos_url": "https://api.github.com/users/dawnpatrol04/repos", "events_url": "https://api.github.com/users/dawnpatrol04/events{/privacy}", "received_events_url": "https://api.github.com/users/dawnpatrol04/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
12
2023-10-20T17:22:32
2024-08-28T19:19:34
2023-10-30T22:07:42
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hey guys. I'm having issues getting this part to work behind a corporate proxy: docker exec -it ollama ollama run llama2. Two issues: 1) when I set a proxy, something breaks; 2) the model URL / cert is not allowed / blocked. To work around this I will need to manually download model files and upload them to the container. Can we manually download and upload model files? Where do I put the model files after I have downloaded them?
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/857/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/857/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/878
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/878/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/878/comments
https://api.github.com/repos/ollama/ollama/issues/878/events
https://github.com/ollama/ollama/issues/878
1,957,264,872
I_kwDOJ0Z1Ps50qX3o
878
llama2 image not really llama2?
{ "login": "sharpner", "id": 574362, "node_id": "MDQ6VXNlcjU3NDM2Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/574362?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sharpner", "html_url": "https://github.com/sharpner", "followers_url": "https://api.github.com/users/sharpner/followers", "following_url": "https://api.github.com/users/sharpner/following{/other_user}", "gists_url": "https://api.github.com/users/sharpner/gists{/gist_id}", "starred_url": "https://api.github.com/users/sharpner/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sharpner/subscriptions", "organizations_url": "https://api.github.com/users/sharpner/orgs", "repos_url": "https://api.github.com/users/sharpner/repos", "events_url": "https://api.github.com/users/sharpner/events{/privacy}", "received_events_url": "https://api.github.com/users/sharpner/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-10-23T14:07:47
2023-10-24T06:31:52
2023-10-24T06:31:52
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Today I tested all models listed on this page: https://ollama.ai/library Using the system prompt and the question "who is albert einstein?": ``` I want you to act like Ludwig van Beethoven. I want you to respond and answer like Ludwig van Beethoven, using the tone, manner and vocabulary Ludwig van Beethoven would use. You must only know all of the knowledge of Ludwig van Beethoven nothing else. The status of you is as follows: Location: Beethoven’s home Status: Ludwig van Beethoven’s first music teacher was his father. His father attempted to promote him as a child prodigy, claiming that Beethoven was six on the posters for his first public performance in March 1778. Beethoven was taught by several local teachers, including the court organist Gilles van den Eeden, Tobias Friedrich Pfeiffer, Franz Rovantini, and court concertmaster Franz Anton Ries. His tuition began when he was only five years old. The regime was harsh and intensive, often reducing him to tears. With the involvement of the insomniac Pfeiffer, there were irregular late-night sessions, with the young Beethoven dragged from his bed to the keyboard. His musical talent was obvious at a young age. ``` Every model behaves like expected, except for the default **llama2** model. Which responds in a weeby kinda way: ``` Ah, a fellow genius! *adjusts spectacles* Albert Einstein? *chuckles* Ah, yes, I've heard of him. A clever little German fellow with a head full of ideas, no? *grin* Wait, what was the question again? Oh, you wanted to know something about that... Albert Einstein? *scratches chin* Well, I must say, he's quite the fellow. A physicist, you say? *nods* Yes, I've heard of his work. He's been making quite a stir with his theories on relativity and time. *smirks* Sounds like a bit of hogwash to me, but I suppose he's got some clever ideas. But tell me, have you met this Einstein fellow? *curious eyes* What is it that he does? *leaning forward* Does he play the pianoforte? 
*inquiring mind* Oh, I do love a good tune on the piano! *smiles* Anyway, enough about that fellow. *adjusts spectacles* Tell me, what brings you to my humble abode? Are you here to discuss the finer points of music theory? *winks* Or perhaps you'd like a lesson in the art of composition? *grin* I'm all ears, young man! *chuckles* Now, if you'll excuse me, I must get back to my latest masterpiece. *adjusts monocle* It's a symphony, you see, and it's going to be simply... *exhales dramatically* Magnificent! *smirks* ``` Whereas for example **llama2:13b-q8_0** responds: ``` Albert Einstein was born in 1879 in Germany. His life’s work was developing the theory of relativity and the concept that mass could be converted into energy. Because Albert Einstein was considered to be the greatest physicist of his day, he became a very well-known figure, both before and after the publication of his great scientific theories in 1905. Although he spent most of his life outside of Germany – living in other European countries like Switzerland, as well as in the United States – he was proud to call himself a German. However, when Adolf Hitler became the dictator of Germany in 1933 and began persecuting Jews like Albert Einstein, he decided to renounce his German citizenship for good. ``` Is this expected behaviour or might the llama2 image be compromised? Or a smaller variant: ``` SYSTEM """ I want you to act like President Obama. """ ``` Question "Who are you?" Answer: ```Ah, a fellow American! *adjusts sunglasses* It's great to meet you, my friend! *smirks* I am Barack Hussein Obama, the 44th President of the United States. *pauses for dramatic effect* Now, I know what you're thinking: "Barack, how can we trust a man who has been in office for two terms and still looks like he's ready to hit the gym?" *winks* Well, let me tell you something, my fellow American. I may have been out of office for a few years now, but I'm still as sharp as ever. 
*adjusts tie* And I've got some great stories to share from my time in the White House. *grins mischievously* So, what can I do for you today? Do you need some advice on how to run a successful campaign? Or maybe you just want to chat about the latest news and current events? *leaning back in chair* Whatever it is, feel free to shoot. *smirks* And remember, as I always say: "Yes We Can!" *winks* ``` and response from **llama2:13b-chat** ``` Greetings, my fellow American! I am Barack Hussein Obama, the 44th President of the United States. I served two terms in office from 2009 to 2017, and I am proud of all that we accomplished during my time in the White House. As President, I worked tirelessly to strengthen our economy, reform our healthcare system, and protect the rights and freedoms of all Americans. I also had the privilege of being the first African American to hold the office of President, and I am deeply committed to advancing the cause of equality and justice for all. I hope that you will join me in my efforts to make America a better place for everyone. Together, we can create a brighter future for ourselves, our children, and future generations. Thank you! ```
{ "login": "sharpner", "id": 574362, "node_id": "MDQ6VXNlcjU3NDM2Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/574362?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sharpner", "html_url": "https://github.com/sharpner", "followers_url": "https://api.github.com/users/sharpner/followers", "following_url": "https://api.github.com/users/sharpner/following{/other_user}", "gists_url": "https://api.github.com/users/sharpner/gists{/gist_id}", "starred_url": "https://api.github.com/users/sharpner/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sharpner/subscriptions", "organizations_url": "https://api.github.com/users/sharpner/orgs", "repos_url": "https://api.github.com/users/sharpner/repos", "events_url": "https://api.github.com/users/sharpner/events{/privacy}", "received_events_url": "https://api.github.com/users/sharpner/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/878/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/878/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1291
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1291/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1291/comments
https://api.github.com/repos/ollama/ollama/issues/1291/events
https://github.com/ollama/ollama/issues/1291
2,013,270,341
I_kwDOJ0Z1Ps54ABFF
1,291
ollama runpod serverless template
{ "login": "devsaturn", "id": 43364322, "node_id": "MDQ6VXNlcjQzMzY0MzIy", "avatar_url": "https://avatars.githubusercontent.com/u/43364322?v=4", "gravatar_id": "", "url": "https://api.github.com/users/devsaturn", "html_url": "https://github.com/devsaturn", "followers_url": "https://api.github.com/users/devsaturn/followers", "following_url": "https://api.github.com/users/devsaturn/following{/other_user}", "gists_url": "https://api.github.com/users/devsaturn/gists{/gist_id}", "starred_url": "https://api.github.com/users/devsaturn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/devsaturn/subscriptions", "organizations_url": "https://api.github.com/users/devsaturn/orgs", "repos_url": "https://api.github.com/users/devsaturn/repos", "events_url": "https://api.github.com/users/devsaturn/events{/privacy}", "received_events_url": "https://api.github.com/users/devsaturn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
6
2023-11-27T22:56:18
2023-12-11T15:19:34
2023-12-11T15:19:34
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Is there an example of deploying a model using Ollama in a RunPod serverless environment?
{ "login": "devsaturn", "id": 43364322, "node_id": "MDQ6VXNlcjQzMzY0MzIy", "avatar_url": "https://avatars.githubusercontent.com/u/43364322?v=4", "gravatar_id": "", "url": "https://api.github.com/users/devsaturn", "html_url": "https://github.com/devsaturn", "followers_url": "https://api.github.com/users/devsaturn/followers", "following_url": "https://api.github.com/users/devsaturn/following{/other_user}", "gists_url": "https://api.github.com/users/devsaturn/gists{/gist_id}", "starred_url": "https://api.github.com/users/devsaturn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/devsaturn/subscriptions", "organizations_url": "https://api.github.com/users/devsaturn/orgs", "repos_url": "https://api.github.com/users/devsaturn/repos", "events_url": "https://api.github.com/users/devsaturn/events{/privacy}", "received_events_url": "https://api.github.com/users/devsaturn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1291/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1291/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2208
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2208/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2208/comments
https://api.github.com/repos/ollama/ollama/issues/2208/events
https://github.com/ollama/ollama/issues/2208
2,102,429,567
I_kwDOJ0Z1Ps59UId_
2,208
Feature: API error response in case of exceeding context length
{ "login": "Jurik-001", "id": 123458520, "node_id": "U_kgDOB1vT2A", "avatar_url": "https://avatars.githubusercontent.com/u/123458520?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Jurik-001", "html_url": "https://github.com/Jurik-001", "followers_url": "https://api.github.com/users/Jurik-001/followers", "following_url": "https://api.github.com/users/Jurik-001/following{/other_user}", "gists_url": "https://api.github.com/users/Jurik-001/gists{/gist_id}", "starred_url": "https://api.github.com/users/Jurik-001/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Jurik-001/subscriptions", "organizations_url": "https://api.github.com/users/Jurik-001/orgs", "repos_url": "https://api.github.com/users/Jurik-001/repos", "events_url": "https://api.github.com/users/Jurik-001/events{/privacy}", "received_events_url": "https://api.github.com/users/Jurik-001/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 7706482389, "node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q", "url": "https://api.github.com/repos/ollama/ollama/labels/api", "name": "api", "color": "bfdadc", "default": false, "description": "" } ]
open
false
null
[]
null
0
2024-01-26T15:41:30
2024-11-06T19:02:07
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
It would be good to get an error response back from the API if the prompt exceeds the context length of the model, similar to the OpenAI API.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2208/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2208/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/4026
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4026/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4026/comments
https://api.github.com/repos/ollama/ollama/issues/4026/events
https://github.com/ollama/ollama/issues/4026
2,269,143,895
I_kwDOJ0Z1Ps6HQGNX
4,026
Llama 3 BPE tokenization needs improvement
{ "login": "coder543", "id": 726063, "node_id": "MDQ6VXNlcjcyNjA2Mw==", "avatar_url": "https://avatars.githubusercontent.com/u/726063?v=4", "gravatar_id": "", "url": "https://api.github.com/users/coder543", "html_url": "https://github.com/coder543", "followers_url": "https://api.github.com/users/coder543/followers", "following_url": "https://api.github.com/users/coder543/following{/other_user}", "gists_url": "https://api.github.com/users/coder543/gists{/gist_id}", "starred_url": "https://api.github.com/users/coder543/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/coder543/subscriptions", "organizations_url": "https://api.github.com/users/coder543/orgs", "repos_url": "https://api.github.com/users/coder543/repos", "events_url": "https://api.github.com/users/coder543/events{/privacy}", "received_events_url": "https://api.github.com/users/coder543/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
3
2024-04-29T14:04:06
2024-05-13T15:53:02
2024-05-11T13:41:15
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? [This PR](https://github.com/ggerganov/llama.cpp/pull/6920) just merged on llama.cpp, which contained important improvements to how tokenization worked for Llama 3 and other models. An example of the issue is [noted here](https://github.com/ggerganov/llama.cpp/issues/6914). Hopefully ollama can update to the latest llama.cpp quickly and make a new release. ### OS Linux ### GPU Nvidia ### CPU AMD ### Ollama version all versions up to this point
{ "login": "coder543", "id": 726063, "node_id": "MDQ6VXNlcjcyNjA2Mw==", "avatar_url": "https://avatars.githubusercontent.com/u/726063?v=4", "gravatar_id": "", "url": "https://api.github.com/users/coder543", "html_url": "https://github.com/coder543", "followers_url": "https://api.github.com/users/coder543/followers", "following_url": "https://api.github.com/users/coder543/following{/other_user}", "gists_url": "https://api.github.com/users/coder543/gists{/gist_id}", "starred_url": "https://api.github.com/users/coder543/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/coder543/subscriptions", "organizations_url": "https://api.github.com/users/coder543/orgs", "repos_url": "https://api.github.com/users/coder543/repos", "events_url": "https://api.github.com/users/coder543/events{/privacy}", "received_events_url": "https://api.github.com/users/coder543/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4026/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4026/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6591
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6591/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6591/comments
https://api.github.com/repos/ollama/ollama/issues/6591/events
https://github.com/ollama/ollama/issues/6591
2,500,418,861
I_kwDOJ0Z1Ps6VCV0t
6,591
Ollama failing with `CUDA error: PTX JIT compiler library not found`
{ "login": "leobenkel", "id": 4960573, "node_id": "MDQ6VXNlcjQ5NjA1NzM=", "avatar_url": "https://avatars.githubusercontent.com/u/4960573?v=4", "gravatar_id": "", "url": "https://api.github.com/users/leobenkel", "html_url": "https://github.com/leobenkel", "followers_url": "https://api.github.com/users/leobenkel/followers", "following_url": "https://api.github.com/users/leobenkel/following{/other_user}", "gists_url": "https://api.github.com/users/leobenkel/gists{/gist_id}", "starred_url": "https://api.github.com/users/leobenkel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/leobenkel/subscriptions", "organizations_url": "https://api.github.com/users/leobenkel/orgs", "repos_url": "https://api.github.com/users/leobenkel/repos", "events_url": "https://api.github.com/users/leobenkel/events{/privacy}", "received_events_url": "https://api.github.com/users/leobenkel/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
9
2024-09-02T09:06:01
2024-09-26T19:55:34
2024-09-03T21:14:04
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? The GPU seems to be detected: ``` CUDA driver version: 11.4 time=2024-08-30T16:57:23.032Z level=DEBUG source=gpu.go:123 msg="detected GPUs" count=1 library=/usr/lib/aarch64-linux-gnu/tegra/libcuda.so.1.1 [GPU-d90c0d9d-5e59-56b1-b519-6439b1d74328] CUDA totalMem 30990 mb [GPU-d90c0d9d-5e59-56b1-b519-6439b1d74328] CUDA freeMem 23431 mb [GPU-d90c0d9d-5e59-56b1-b519-6439b1d74328] Compute Capability 7.2 ... time=2024-08-30T16:57:23.201Z level=INFO source=types.go:105 msg="inference compute" id=GPU-d90c0d9d-5e59-56b1-b519-6439b1d74328 library=cuda compute=7.2 driver=11.4 name=Xavier total="30.3 GiB" available="22.9 GiB" ... time=2024-08-30T16:58:09.212Z level=DEBUG source=gpu.go:410 msg="updating cuda memory data" gpu=GPU-d90c0d9d-5e59-56b1-b519-6439b1d74328 name=Xavier overhead="0 B" before.total="30.3 GiB" before.free="22.9 GiB" now.total="30.3 GiB" now.free="23.0 GiB" now.used="7.3 GiB" ``` But then I get this error: ``` CUDA error: PTX JIT compiler library not found current device: 0, in function ggml_cuda_compute_forward at /go/src/github.com/ollama/ollama/llm/llama.cpp/ggml/src/ggml-cuda.cu:2313 /go/src/github.com/ollama/ollama/llm/llama.cpp/ggml/src/ggml-cuda.cu:101: CUDA error ``` Here are the installed versions: ``` ii nvidia-cuda 5.1.3-b29 arm64 NVIDIA CUDA Meta Package ii nvidia-cuda-dev 5.1.3-b29 arm64 NVIDIA CUDA dev Meta Package ii nvidia-jetpack 5.1.3-b29 arm64 NVIDIA Jetpack Meta Package ii nvidia-jetpack-dev 5.1.3-b29 arm64 NVIDIA Jetpack dev Meta Package ii nvidia-jetpack-runtime 5.1.3-b29 arm64 NVIDIA Jetpack runtime Meta Package ii nvidia-l4t-cuda 35.5.0-20240219203809 arm64 NVIDIA CUDA Package ii cuda 11.4.19-1 arm64 CUDA meta-package ``` Ollama is used with the Docker container and runs inside a Kubernetes cluster. ### OS Docker ### GPU Nvidia ### CPU ARM ### Ollama version 0.3.8
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6591/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6591/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3014
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3014/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3014/comments
https://api.github.com/repos/ollama/ollama/issues/3014/events
https://github.com/ollama/ollama/pull/3014
2,176,931,782
PR_kwDOJ0Z1Ps5pIVTe
3,014
decode ggla
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-03-08T23:47:08
2024-03-09T00:14:54
2024-03-09T00:14:53
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3014", "html_url": "https://github.com/ollama/ollama/pull/3014", "diff_url": "https://github.com/ollama/ollama/pull/3014.diff", "patch_url": "https://github.com/ollama/ollama/pull/3014.patch", "merged_at": "2024-03-09T00:14:53" }
split from #2926
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3014/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3014/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7470
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7470/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7470/comments
https://api.github.com/repos/ollama/ollama/issues/7470/events
https://github.com/ollama/ollama/issues/7470
2,630,256,963
I_kwDOJ0Z1Ps6cxolD
7,470
[Model request] The First-Ever Comprehensive Benchmark for Multimodal Large Language Models in Industrial Anomaly Detection
{ "login": "monkeycc", "id": 6490927, "node_id": "MDQ6VXNlcjY0OTA5Mjc=", "avatar_url": "https://avatars.githubusercontent.com/u/6490927?v=4", "gravatar_id": "", "url": "https://api.github.com/users/monkeycc", "html_url": "https://github.com/monkeycc", "followers_url": "https://api.github.com/users/monkeycc/followers", "following_url": "https://api.github.com/users/monkeycc/following{/other_user}", "gists_url": "https://api.github.com/users/monkeycc/gists{/gist_id}", "starred_url": "https://api.github.com/users/monkeycc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/monkeycc/subscriptions", "organizations_url": "https://api.github.com/users/monkeycc/orgs", "repos_url": "https://api.github.com/users/monkeycc/repos", "events_url": "https://api.github.com/users/monkeycc/events{/privacy}", "received_events_url": "https://api.github.com/users/monkeycc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
0
2024-11-02T06:25:49
2024-11-02T06:25:49
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://github.com/jam-cc/MMAD ![examples](https://github.com/user-attachments/assets/7e76cd00-b93d-4c0a-acf9-17d1ab1609e7) The First-Ever Comprehensive Benchmark for Multimodal Large Language Models in Industrial Anomaly Detection
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7470/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7470/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/8652
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8652/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8652/comments
https://api.github.com/repos/ollama/ollama/issues/8652/events
https://github.com/ollama/ollama/issues/8652
2,817,696,992
I_kwDOJ0Z1Ps6n8qTg
8,652
Could not install deepseek-r1:7b version
{ "login": "chinmayc0707", "id": 139462368, "node_id": "U_kgDOCFAG4A", "avatar_url": "https://avatars.githubusercontent.com/u/139462368?v=4", "gravatar_id": "", "url": "https://api.github.com/users/chinmayc0707", "html_url": "https://github.com/chinmayc0707", "followers_url": "https://api.github.com/users/chinmayc0707/followers", "following_url": "https://api.github.com/users/chinmayc0707/following{/other_user}", "gists_url": "https://api.github.com/users/chinmayc0707/gists{/gist_id}", "starred_url": "https://api.github.com/users/chinmayc0707/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chinmayc0707/subscriptions", "organizations_url": "https://api.github.com/users/chinmayc0707/orgs", "repos_url": "https://api.github.com/users/chinmayc0707/repos", "events_url": "https://api.github.com/users/chinmayc0707/events{/privacy}", "received_events_url": "https://api.github.com/users/chinmayc0707/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6677370291, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw", "url": "https://api.github.com/repos/ollama/ollama/labels/networking", "name": "networking", "color": "0B5368", "default": false, "description": "Issues relating to ollama pull and push" } ]
open
false
{ "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "bmizerany", "id": 46, "node_id": "MDQ6VXNlcjQ2", "avatar_url": "https://avatars.githubusercontent.com/u/46?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmizerany", "html_url": "https://github.com/bmizerany", "followers_url": "https://api.github.com/users/bmizerany/followers", "following_url": "https://api.github.com/users/bmizerany/following{/other_user}", "gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions", "organizations_url": "https://api.github.com/users/bmizerany/orgs", "repos_url": "https://api.github.com/users/bmizerany/repos", "events_url": "https://api.github.com/users/bmizerany/events{/privacy}", "received_events_url": "https://api.github.com/users/bmizerany/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
1
2025-01-29T09:42:18
2025-01-30T08:54:15
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Error: max retries exceeded: Get "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/aa/aabd4debf0c8f08881923f2c25fc0fdeed24435271c2b3e92c4af36704040dbc/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20250129%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20250129T093708Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&X-Amz-Signature=044eb3053c34ac7e27101080923c27a8d4f0adc5334de93463391b2cf3d208cd": dial tcp: lookup dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com: no such host ### OS Windows ### GPU Nvidia ### CPU AMD ### Ollama version 0.5.7
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8652/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8652/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/2691
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2691/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2691/comments
https://api.github.com/repos/ollama/ollama/issues/2691/events
https://github.com/ollama/ollama/issues/2691
2,149,719,194
I_kwDOJ0Z1Ps6AIhya
2,691
Not an issue, just a thank you.
{ "login": "robert-mcdermott", "id": 7399563, "node_id": "MDQ6VXNlcjczOTk1NjM=", "avatar_url": "https://avatars.githubusercontent.com/u/7399563?v=4", "gravatar_id": "", "url": "https://api.github.com/users/robert-mcdermott", "html_url": "https://github.com/robert-mcdermott", "followers_url": "https://api.github.com/users/robert-mcdermott/followers", "following_url": "https://api.github.com/users/robert-mcdermott/following{/other_user}", "gists_url": "https://api.github.com/users/robert-mcdermott/gists{/gist_id}", "starred_url": "https://api.github.com/users/robert-mcdermott/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/robert-mcdermott/subscriptions", "organizations_url": "https://api.github.com/users/robert-mcdermott/orgs", "repos_url": "https://api.github.com/users/robert-mcdermott/repos", "events_url": "https://api.github.com/users/robert-mcdermott/events{/privacy}", "received_events_url": "https://api.github.com/users/robert-mcdermott/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-02-22T18:41:56
2024-02-22T18:47:11
2024-02-22T18:47:11
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
The Ollama project is awesome, thanks for all the work you do.
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2691/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2691/timeline
null
not_planned
false
https://api.github.com/repos/ollama/ollama/issues/4804
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4804/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4804/comments
https://api.github.com/repos/ollama/ollama/issues/4804/events
https://github.com/ollama/ollama/pull/4804
2,332,456,590
PR_kwDOJ0Z1Ps5xXsXj
4,804
Created a couple C# example applications using OllamaSharp
{ "login": "JerrettDavis", "id": 2610199, "node_id": "MDQ6VXNlcjI2MTAxOTk=", "avatar_url": "https://avatars.githubusercontent.com/u/2610199?v=4", "gravatar_id": "", "url": "https://api.github.com/users/JerrettDavis", "html_url": "https://github.com/JerrettDavis", "followers_url": "https://api.github.com/users/JerrettDavis/followers", "following_url": "https://api.github.com/users/JerrettDavis/following{/other_user}", "gists_url": "https://api.github.com/users/JerrettDavis/gists{/gist_id}", "starred_url": "https://api.github.com/users/JerrettDavis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/JerrettDavis/subscriptions", "organizations_url": "https://api.github.com/users/JerrettDavis/orgs", "repos_url": "https://api.github.com/users/JerrettDavis/repos", "events_url": "https://api.github.com/users/JerrettDavis/events{/privacy}", "received_events_url": "https://api.github.com/users/JerrettDavis/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-06-04T03:27:00
2024-06-14T03:34:18
2024-06-14T03:34:18
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4804", "html_url": "https://github.com/ollama/ollama/pull/4804", "diff_url": "https://github.com/ollama/ollama/pull/4804.diff", "patch_url": "https://github.com/ollama/ollama/pull/4804.patch", "merged_at": null }
I've added C# examples leveraging the [OllamaSharp](https://github.com/awaescher/OllamaSharp) wrapper library.
{ "login": "JerrettDavis", "id": 2610199, "node_id": "MDQ6VXNlcjI2MTAxOTk=", "avatar_url": "https://avatars.githubusercontent.com/u/2610199?v=4", "gravatar_id": "", "url": "https://api.github.com/users/JerrettDavis", "html_url": "https://github.com/JerrettDavis", "followers_url": "https://api.github.com/users/JerrettDavis/followers", "following_url": "https://api.github.com/users/JerrettDavis/following{/other_user}", "gists_url": "https://api.github.com/users/JerrettDavis/gists{/gist_id}", "starred_url": "https://api.github.com/users/JerrettDavis/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/JerrettDavis/subscriptions", "organizations_url": "https://api.github.com/users/JerrettDavis/orgs", "repos_url": "https://api.github.com/users/JerrettDavis/repos", "events_url": "https://api.github.com/users/JerrettDavis/events{/privacy}", "received_events_url": "https://api.github.com/users/JerrettDavis/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4804/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4804/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1129
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1129/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1129/comments
https://api.github.com/repos/ollama/ollama/issues/1129/events
https://github.com/ollama/ollama/pull/1129
1,993,464,753
PR_kwDOJ0Z1Ps5fcwS1
1,129
fix: parse ggml model types
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
4
2023-11-14T19:56:54
2023-12-06T23:54:41
2023-11-22T21:09:14
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1129", "html_url": "https://github.com/ollama/ollama/pull/1129", "diff_url": "https://github.com/ollama/ollama/pull/1129.diff", "patch_url": "https://github.com/ollama/ollama/pull/1129.patch", "merged_at": null }
I'm kind of surprised this didn't come up earlier, noticed this while importing a less common ggml model type. Model parsing was returning `nil` for uncommon types which results in a nil pointer dereference. Fix this by moving model parsing up a level and try to decode as a llama model in the default case.
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1129/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1129/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7337
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7337/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7337/comments
https://api.github.com/repos/ollama/ollama/issues/7337/events
https://github.com/ollama/ollama/pull/7337
2,609,976,414
PR_kwDOJ0Z1Ps5_sAY-
7,337
Update README.md - Revised vnc-lm project description
{ "login": "jake83741", "id": 125723241, "node_id": "U_kgDOB35iaQ", "avatar_url": "https://avatars.githubusercontent.com/u/125723241?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jake83741", "html_url": "https://github.com/jake83741", "followers_url": "https://api.github.com/users/jake83741/followers", "following_url": "https://api.github.com/users/jake83741/following{/other_user}", "gists_url": "https://api.github.com/users/jake83741/gists{/gist_id}", "starred_url": "https://api.github.com/users/jake83741/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jake83741/subscriptions", "organizations_url": "https://api.github.com/users/jake83741/orgs", "repos_url": "https://api.github.com/users/jake83741/repos", "events_url": "https://api.github.com/users/jake83741/events{/privacy}", "received_events_url": "https://api.github.com/users/jake83741/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-10-23T22:39:29
2024-10-27T22:15:54
2024-10-27T22:15:50
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7337", "html_url": "https://github.com/ollama/ollama/pull/7337", "diff_url": "https://github.com/ollama/ollama/pull/7337.diff", "patch_url": "https://github.com/ollama/ollama/pull/7337.patch", "merged_at": null }
Hi, This is just a small change updating the description for my project, [vnc-lm](https://github.com/jake83741/vnc-lm). Thanks, Jake
{ "login": "jake83741", "id": 125723241, "node_id": "U_kgDOB35iaQ", "avatar_url": "https://avatars.githubusercontent.com/u/125723241?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jake83741", "html_url": "https://github.com/jake83741", "followers_url": "https://api.github.com/users/jake83741/followers", "following_url": "https://api.github.com/users/jake83741/following{/other_user}", "gists_url": "https://api.github.com/users/jake83741/gists{/gist_id}", "starred_url": "https://api.github.com/users/jake83741/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jake83741/subscriptions", "organizations_url": "https://api.github.com/users/jake83741/orgs", "repos_url": "https://api.github.com/users/jake83741/repos", "events_url": "https://api.github.com/users/jake83741/events{/privacy}", "received_events_url": "https://api.github.com/users/jake83741/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7337/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7337/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2346
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2346/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2346/comments
https://api.github.com/repos/ollama/ollama/issues/2346/events
https://github.com/ollama/ollama/issues/2346
2,117,126,222
I_kwDOJ0Z1Ps5-MMhO
2,346
Use llama2-chinese or yi to ask questions in Chinese. After about 15 requests, the response request cannot be returned.
{ "login": "wyang-code", "id": 130064485, "node_id": "U_kgDOB8CgZQ", "avatar_url": "https://avatars.githubusercontent.com/u/130064485?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wyang-code", "html_url": "https://github.com/wyang-code", "followers_url": "https://api.github.com/users/wyang-code/followers", "following_url": "https://api.github.com/users/wyang-code/following{/other_user}", "gists_url": "https://api.github.com/users/wyang-code/gists{/gist_id}", "starred_url": "https://api.github.com/users/wyang-code/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wyang-code/subscriptions", "organizations_url": "https://api.github.com/users/wyang-code/orgs", "repos_url": "https://api.github.com/users/wyang-code/repos", "events_url": "https://api.github.com/users/wyang-code/events{/privacy}", "received_events_url": "https://api.github.com/users/wyang-code/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-02-04T14:28:08
2024-02-20T04:04:26
2024-02-20T04:04:26
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Use llama2-chinese or yi to ask questions in Chinese. After about 15 requests, the response is never returned. <img width="1494" alt="image" src="https://github.com/ollama/ollama/assets/130064485/b850c5c6-d09e-472b-9592-5f2bb115c2ed">
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2346/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2346/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5810
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5810/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5810/comments
https://api.github.com/repos/ollama/ollama/issues/5810/events
https://github.com/ollama/ollama/issues/5810
2,420,883,131
I_kwDOJ0Z1Ps6QS767
5,810
Tinyllama has issues understanding the Modelfile
{ "login": "DuilioPerez", "id": 135564120, "node_id": "U_kgDOCBSLWA", "avatar_url": "https://avatars.githubusercontent.com/u/135564120?v=4", "gravatar_id": "", "url": "https://api.github.com/users/DuilioPerez", "html_url": "https://github.com/DuilioPerez", "followers_url": "https://api.github.com/users/DuilioPerez/followers", "following_url": "https://api.github.com/users/DuilioPerez/following{/other_user}", "gists_url": "https://api.github.com/users/DuilioPerez/gists{/gist_id}", "starred_url": "https://api.github.com/users/DuilioPerez/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/DuilioPerez/subscriptions", "organizations_url": "https://api.github.com/users/DuilioPerez/orgs", "repos_url": "https://api.github.com/users/DuilioPerez/repos", "events_url": "https://api.github.com/users/DuilioPerez/events{/privacy}", "received_events_url": "https://api.github.com/users/DuilioPerez/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-07-20T12:40:58
2024-09-12T22:17:53
2024-09-12T22:17:53
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? When I start a conversation with Tinyllama with a greeting, it answers with a revision of the Modelfile. That's strange. Here is an example ![chrome_screenshot_20 jul 2024 6_39_17 a  m CST](https://github.com/user-attachments/assets/4bbb17a2-8411-4b1d-a8c5-2f86f63c7e40) ### OS Linux ### GPU Other ### CPU Other ### Ollama version 0.2.7
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5810/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5810/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3593
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3593/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3593/comments
https://api.github.com/repos/ollama/ollama/issues/3593/events
https://github.com/ollama/ollama/issues/3593
2,237,573,743
I_kwDOJ0Z1Ps6FXqpv
3,593
Error: Post "http://127.0.0.1:11434/api/chat": read tcp 127.0.0.1:59108->127.0.0.1:11434: wsarecv: An existing connection was forcibly closed by the remote host.
{ "login": "liquorLiu", "id": 125868659, "node_id": "U_kgDOB4Cacw", "avatar_url": "https://avatars.githubusercontent.com/u/125868659?v=4", "gravatar_id": "", "url": "https://api.github.com/users/liquorLiu", "html_url": "https://github.com/liquorLiu", "followers_url": "https://api.github.com/users/liquorLiu/followers", "following_url": "https://api.github.com/users/liquorLiu/following{/other_user}", "gists_url": "https://api.github.com/users/liquorLiu/gists{/gist_id}", "starred_url": "https://api.github.com/users/liquorLiu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/liquorLiu/subscriptions", "organizations_url": "https://api.github.com/users/liquorLiu/orgs", "repos_url": "https://api.github.com/users/liquorLiu/repos", "events_url": "https://api.github.com/users/liquorLiu/events{/privacy}", "received_events_url": "https://api.github.com/users/liquorLiu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5860134234, "node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg", "url": "https://api.github.com/repos/ollama/ollama/labels/windows", "name": "windows", "color": "0052CC", "default": false, "description": "" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
13
2024-04-11T12:06:28
2024-05-21T18:30:34
2024-05-21T18:30:33
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
here is my app.log:
```
time=2024-04-11T19:53:53.911+08:00 level=INFO source=logging.go:45 msg="ollama app started"
time=2024-04-11T19:53:53.940+08:00 level=DEBUG source=lifecycle.go:32 msg="starting callback loop"
time=2024-04-11T19:53:53.940+08:00 level=DEBUG source=store.go:61 msg="loaded existing store C:\\Users\\17900\\AppData\\Local\\Ollama\\config.json - ID: ef96d596-af07-43a7-97e2-f6af540c5021"
time=2024-04-11T19:53:53.940+08:00 level=DEBUG source=lifecycle.go:66 msg="Not first time, skipping first run notification"
time=2024-04-11T19:53:53.941+08:00 level=DEBUG source=server.go:134 msg="heartbeat from server: Head \"http://127.0.0.1:11434/\": dial tcp 127.0.0.1:11434: connectex: No connection could be made because the target machine actively refused it."
time=2024-04-11T19:53:53.941+08:00 level=INFO source=server.go:135 msg="unable to connect to server"
time=2024-04-11T19:53:53.947+08:00 level=INFO source=server.go:91 msg="started ollama server with pid 7836"
time=2024-04-11T19:53:53.947+08:00 level=INFO source=server.go:93 msg="ollama server logs C:\\Users\\17900\\AppData\\Local\\Ollama\\server.log"
time=2024-04-11T19:53:53.947+08:00 level=DEBUG source=eventloop.go:24 msg="starting event handling loop"
time=2024-04-11T19:53:56.952+08:00 level=DEBUG source=updater.go:73 msg="checking for available update" requestURL="https://ollama.com/api/update?arch=amd64&nonce=4ubr2F5IgZhBzSseU_H9iQ&os=windows&ts=1712836436&version=0.1.31"
time=2024-04-11T19:53:58.123+08:00 level=DEBUG source=updater.go:82 msg="check update response 204 (current version is up to date)"
time=2024-04-11T19:54:06.001+08:00 level=WARN source=server.go:113 msg="server crash 1 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:06.509+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:07.516+08:00 level=WARN source=server.go:113 msg="server crash 2 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:08.022+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:10.031+08:00 level=WARN source=server.go:113 msg="server crash 3 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:10.536+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:12.217+08:00 level=DEBUG source=logging_windows.go:12 msg="viewing logs with start C:\\Users\\17900\\AppData\\Local\\Ollama"
time=2024-04-11T19:54:13.540+08:00 level=WARN source=server.go:113 msg="server crash 4 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:14.052+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:18.066+08:00 level=WARN source=server.go:113 msg="server crash 5 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:18.573+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:23.575+08:00 level=WARN source=server.go:113 msg="server crash 6 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:24.081+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:30.085+08:00 level=WARN source=server.go:113 msg="server crash 7 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:30.592+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:37.604+08:00 level=WARN source=server.go:113 msg="server crash 8 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:38.106+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:46.117+08:00 level=WARN source=server.go:113 msg="server crash 9 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:46.626+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:54:55.632+08:00 level=WARN source=server.go:113 msg="server crash 10 - exit code 3221226505 - respawning"
time=2024-04-11T19:54:56.140+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:55:06.150+08:00 level=WARN source=server.go:113 msg="server crash 11 - exit code 3221226505 - respawning"
time=2024-04-11T19:55:06.655+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:55:17.667+08:00 level=WARN source=server.go:113 msg="server crash 12 - exit code 3221226505 - respawning"
time=2024-04-11T19:55:18.177+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:55:30.185+08:00 level=WARN source=server.go:113 msg="server crash 13 - exit code 3221226505 - respawning"
time=2024-04-11T19:55:30.693+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:55:43.694+08:00 level=WARN source=server.go:113 msg="server crash 14 - exit code 3221226505 - respawning"
time=2024-04-11T19:55:44.205+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:55:58.210+08:00 level=WARN source=server.go:113 msg="server crash 15 - exit code 3221226505 - respawning"
time=2024-04-11T19:55:58.718+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-11T19:56:06.457+08:00 level=DEBUG source=eventloop.go:148 msg="unmanaged app message, lParm: 0x204"
time=2024-04-11T19:56:07.035+08:00 level=DEBUG source=logging_windows.go:12 msg="viewing logs with start C:\\Users\\17900\\AppData\\Local\\Ollama"
```
How to fix it? Thanks.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3593/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3593/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2464
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2464/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2464/comments
https://api.github.com/repos/ollama/ollama/issues/2464/events
https://github.com/ollama/ollama/issues/2464
2,130,347,351
I_kwDOJ0Z1Ps5--oVX
2,464
Error: invalid version -- when attempting to run llava
{ "login": "gr-b", "id": 18386065, "node_id": "MDQ6VXNlcjE4Mzg2MDY1", "avatar_url": "https://avatars.githubusercontent.com/u/18386065?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gr-b", "html_url": "https://github.com/gr-b", "followers_url": "https://api.github.com/users/gr-b/followers", "following_url": "https://api.github.com/users/gr-b/following{/other_user}", "gists_url": "https://api.github.com/users/gr-b/gists{/gist_id}", "starred_url": "https://api.github.com/users/gr-b/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gr-b/subscriptions", "organizations_url": "https://api.github.com/users/gr-b/orgs", "repos_url": "https://api.github.com/users/gr-b/repos", "events_url": "https://api.github.com/users/gr-b/events{/privacy}", "received_events_url": "https://api.github.com/users/gr-b/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2024-02-12T15:15:06
2024-02-12T15:30:58
2024-02-12T15:30:58
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I attempted to install and run llava on an m1 mac and got the following:
```
~ % ollama run llava
pulling manifest
pulling 170370233dd5... 100% |███████████████| (4.1/4.1 GB, 46 MB/s)
pulling 72d6f08a42f6... 100% |█████████████████| (624/624 MB, 49 MB/s)
pulling 43070e2d4e53... 100% |███████████████████| (11/11 kB, 1.1 MB/s)
pulling c43332387573... 100% |████████████████████| (67/67 B, 538 kB/s)
pulling ed11eda7790d... 100% |████████████████████| (30/30 B, 185 kB/s)
pulling 7c658f9561e5... 100% |██████████████████| (564/564 B, 1.8 MB/s)
verifying sha256 digest
writing manifest
removing any unused layers
success
⠋ Error: invalid version
```
If I run `ollama run llava` again (above was the first time), I get: `⠋ Error: invalid version`. If I use the API on localhost: `{"error":"invalid version"}`. Any help would be appreciated. Thanks.
{ "login": "gr-b", "id": 18386065, "node_id": "MDQ6VXNlcjE4Mzg2MDY1", "avatar_url": "https://avatars.githubusercontent.com/u/18386065?v=4", "gravatar_id": "", "url": "https://api.github.com/users/gr-b", "html_url": "https://github.com/gr-b", "followers_url": "https://api.github.com/users/gr-b/followers", "following_url": "https://api.github.com/users/gr-b/following{/other_user}", "gists_url": "https://api.github.com/users/gr-b/gists{/gist_id}", "starred_url": "https://api.github.com/users/gr-b/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/gr-b/subscriptions", "organizations_url": "https://api.github.com/users/gr-b/orgs", "repos_url": "https://api.github.com/users/gr-b/repos", "events_url": "https://api.github.com/users/gr-b/events{/privacy}", "received_events_url": "https://api.github.com/users/gr-b/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2464/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2464/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8005
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8005/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8005/comments
https://api.github.com/repos/ollama/ollama/issues/8005/events
https://github.com/ollama/ollama/issues/8005
2,725,801,345
I_kwDOJ0Z1Ps6ieG2B
8,005
Why is OllamaSetup.exe so large
{ "login": "xgdgsc", "id": 1189869, "node_id": "MDQ6VXNlcjExODk4Njk=", "avatar_url": "https://avatars.githubusercontent.com/u/1189869?v=4", "gravatar_id": "", "url": "https://api.github.com/users/xgdgsc", "html_url": "https://github.com/xgdgsc", "followers_url": "https://api.github.com/users/xgdgsc/followers", "following_url": "https://api.github.com/users/xgdgsc/following{/other_user}", "gists_url": "https://api.github.com/users/xgdgsc/gists{/gist_id}", "starred_url": "https://api.github.com/users/xgdgsc/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/xgdgsc/subscriptions", "organizations_url": "https://api.github.com/users/xgdgsc/orgs", "repos_url": "https://api.github.com/users/xgdgsc/repos", "events_url": "https://api.github.com/users/xgdgsc/events{/privacy}", "received_events_url": "https://api.github.com/users/xgdgsc/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5860134234, "node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg", "url": "https://api.github.com/repos/ollama/ollama/labels/windows", "name": "windows", "color": "0052CC", "default": false, "description": "" } ]
open
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
3
2024-12-09T02:38:40
2024-12-18T09:14:28
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I also tried downloading [ollama-windows-arm64.zip](https://github.com/ollama/ollama/releases/download/v0.5.1/ollama-windows-arm64.zip) and replacing some files in my user folder, which doesn't stop the tray icon from still notifying about the update. Could there be a smaller OllamaSetup.exe? This, combined with the tray "restart to update" button not being effective, can be very annoying.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8005/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8005/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/8057
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8057/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8057/comments
https://api.github.com/repos/ollama/ollama/issues/8057/events
https://github.com/ollama/ollama/issues/8057
2,734,339,182
I_kwDOJ0Z1Ps6i-rRu
8,057
where can I find ollama model path?
{ "login": "keyuchen21", "id": 54015474, "node_id": "MDQ6VXNlcjU0MDE1NDc0", "avatar_url": "https://avatars.githubusercontent.com/u/54015474?v=4", "gravatar_id": "", "url": "https://api.github.com/users/keyuchen21", "html_url": "https://github.com/keyuchen21", "followers_url": "https://api.github.com/users/keyuchen21/followers", "following_url": "https://api.github.com/users/keyuchen21/following{/other_user}", "gists_url": "https://api.github.com/users/keyuchen21/gists{/gist_id}", "starred_url": "https://api.github.com/users/keyuchen21/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/keyuchen21/subscriptions", "organizations_url": "https://api.github.com/users/keyuchen21/orgs", "repos_url": "https://api.github.com/users/keyuchen21/repos", "events_url": "https://api.github.com/users/keyuchen21/events{/privacy}", "received_events_url": "https://api.github.com/users/keyuchen21/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-12-12T00:22:51
2024-12-14T06:28:20
2024-12-14T06:28:20
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?
{ "login": "keyuchen21", "id": 54015474, "node_id": "MDQ6VXNlcjU0MDE1NDc0", "avatar_url": "https://avatars.githubusercontent.com/u/54015474?v=4", "gravatar_id": "", "url": "https://api.github.com/users/keyuchen21", "html_url": "https://github.com/keyuchen21", "followers_url": "https://api.github.com/users/keyuchen21/followers", "following_url": "https://api.github.com/users/keyuchen21/following{/other_user}", "gists_url": "https://api.github.com/users/keyuchen21/gists{/gist_id}", "starred_url": "https://api.github.com/users/keyuchen21/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/keyuchen21/subscriptions", "organizations_url": "https://api.github.com/users/keyuchen21/orgs", "repos_url": "https://api.github.com/users/keyuchen21/repos", "events_url": "https://api.github.com/users/keyuchen21/events{/privacy}", "received_events_url": "https://api.github.com/users/keyuchen21/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8057/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8057/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5458
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5458/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5458/comments
https://api.github.com/repos/ollama/ollama/issues/5458/events
https://github.com/ollama/ollama/issues/5458
2,388,291,561
I_kwDOJ0Z1Ps6OWm_p
5,458
Enable Row Split Support
{ "login": "datacrystals", "id": 49183477, "node_id": "MDQ6VXNlcjQ5MTgzNDc3", "avatar_url": "https://avatars.githubusercontent.com/u/49183477?v=4", "gravatar_id": "", "url": "https://api.github.com/users/datacrystals", "html_url": "https://github.com/datacrystals", "followers_url": "https://api.github.com/users/datacrystals/followers", "following_url": "https://api.github.com/users/datacrystals/following{/other_user}", "gists_url": "https://api.github.com/users/datacrystals/gists{/gist_id}", "starred_url": "https://api.github.com/users/datacrystals/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/datacrystals/subscriptions", "organizations_url": "https://api.github.com/users/datacrystals/orgs", "repos_url": "https://api.github.com/users/datacrystals/repos", "events_url": "https://api.github.com/users/datacrystals/events{/privacy}", "received_events_url": "https://api.github.com/users/datacrystals/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
0
2024-07-03T10:40:24
2024-07-03T10:40:24
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
For multi-gpu setups, enabling row split can yield very significant performance improvements. On my machine with 3xP40, I was able to get a speedup from ~3t/s to ~10t/s. Unfortunately, it doesn't look like there's any way to get this flag (`-sm row`) down to llama.cpp, which should otherwise support this. Wondering if it's possible to either add a way to pass along flags to llama.cpp, or to have an option which enables this flag. Given the massive performance improvement it brings, this seems like it should be possible and not very much work to implement. I saw that passing along flags was discussed [here](https://github.com/ollama/ollama/pull/4120#issuecomment-2094747527), but was unfortunately not implemented.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5458/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5458/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/1773
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1773/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1773/comments
https://api.github.com/repos/ollama/ollama/issues/1773/events
https://github.com/ollama/ollama/pull/1773
2,064,567,365
PR_kwDOJ0Z1Ps5jKrqy
1,773
Add embeddings to API
{ "login": "bmurray", "id": 574917, "node_id": "MDQ6VXNlcjU3NDkxNw==", "avatar_url": "https://avatars.githubusercontent.com/u/574917?v=4", "gravatar_id": "", "url": "https://api.github.com/users/bmurray", "html_url": "https://github.com/bmurray", "followers_url": "https://api.github.com/users/bmurray/followers", "following_url": "https://api.github.com/users/bmurray/following{/other_user}", "gists_url": "https://api.github.com/users/bmurray/gists{/gist_id}", "starred_url": "https://api.github.com/users/bmurray/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/bmurray/subscriptions", "organizations_url": "https://api.github.com/users/bmurray/orgs", "repos_url": "https://api.github.com/users/bmurray/repos", "events_url": "https://api.github.com/users/bmurray/events{/privacy}", "received_events_url": "https://api.github.com/users/bmurray/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-01-03T19:52:52
2024-01-04T20:00:53
2024-01-04T20:00:52
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1773", "html_url": "https://github.com/ollama/ollama/pull/1773", "diff_url": "https://github.com/ollama/ollama/pull/1773.diff", "patch_url": "https://github.com/ollama/ollama/pull/1773.patch", "merged_at": "2024-01-04T20:00:52" }
The API is missing the embeddings endpoint, so this adds it.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1773/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1773/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/636
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/636/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/636/comments
https://api.github.com/repos/ollama/ollama/issues/636/events
https://github.com/ollama/ollama/issues/636
1,918,158,040
I_kwDOJ0Z1Ps5yVMTY
636
Unable to create temp file for runner on Windows
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
0
2023-09-28T19:43:44
2023-09-28T19:45:52
2023-09-28T19:45:52
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
When running Ollama on Windows the llama.cpp runner may not be able to be added to the temp directory. ``` llama runner not found: CreateFile C:\Users\...\AppData\Local\Temp\ollama432423342\...\server.exe: The system cannot find the specified path. ```
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/636/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/636/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5034
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5034/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5034/comments
https://api.github.com/repos/ollama/ollama/issues/5034/events
https://github.com/ollama/ollama/pull/5034
2,352,066,933
PR_kwDOJ0Z1Ps5yacph
5,034
Re-introduce the `llama` package
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-06-13T21:06:18
2024-10-08T15:53:57
2024-10-08T15:53:55
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5034", "html_url": "https://github.com/ollama/ollama/pull/5034", "diff_url": "https://github.com/ollama/ollama/pull/5034.diff", "patch_url": "https://github.com/ollama/ollama/pull/5034.patch", "merged_at": "2024-10-08T15:53:54" }
This PR brings back the `llama` package, making it possible to call llama.cpp and ggml APIs from Go directly via CGo. This has a few advantages: 1. C APIs can be called directly from Go without needing to use the previous "server" REST API 2. On macOS and for CPU builds on Linux and Windows, Ollama can be built without a `go generate ./...` step, making it easy to get up and running to hack on parts of Ollama that don't require fast inference 3. Faster build times for AVX, AVX2, CUDA and ROCm (a full build of all runners takes <5 min on a fast CPU) 4. No git submodule, making it easier to clone and build from source This is a big PR, but much of it is vendor code except for: 1. `llama.go`: CGo bindings 2. `example/`: a simple example of running inference 3. `runner/`: a subprocess server designed to replace the `llm/ext_server` package 4. `Makefile`: a minimal `Makefile` to build the `runner` package for different targets (cpu, avx, avx2, cuda, rocm) The easiest way to try out the PR: ``` cd llama make -j ``` This will produce `ollama_runner` binaries for the current platform.
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5034/reactions", "total_count": 9, "+1": 0, "-1": 0, "laugh": 0, "hooray": 4, "confused": 0, "heart": 4, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/5034/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/791
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/791/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/791/comments
https://api.github.com/repos/ollama/ollama/issues/791/events
https://github.com/ollama/ollama/issues/791
1,943,655,360
I_kwDOJ0Z1Ps5z2dPA
791
Sending and receiving Context with ollama.call()
{ "login": "rossudev", "id": 14206857, "node_id": "MDQ6VXNlcjE0MjA2ODU3", "avatar_url": "https://avatars.githubusercontent.com/u/14206857?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rossudev", "html_url": "https://github.com/rossudev", "followers_url": "https://api.github.com/users/rossudev/followers", "following_url": "https://api.github.com/users/rossudev/following{/other_user}", "gists_url": "https://api.github.com/users/rossudev/gists{/gist_id}", "starred_url": "https://api.github.com/users/rossudev/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rossudev/subscriptions", "organizations_url": "https://api.github.com/users/rossudev/orgs", "repos_url": "https://api.github.com/users/rossudev/repos", "events_url": "https://api.github.com/users/rossudev/events{/privacy}", "received_events_url": "https://api.github.com/users/rossudev/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-10-15T02:28:30
2023-10-16T18:05:11
2023-10-16T18:05:10
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Are the input and output of context supported yet with ollama.call()? javascript: ``` import { Ollama } from "langchain/llms/ollama"; const ollama = new Ollama({ baseUrl: "http://localhost:11434", model: model, temperature: parseFloat(temperature), topP: parseFloat(topp) }); const response = await ollama.call(input); ```
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/791/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/791/timeline
null
not_planned
false
https://api.github.com/repos/ollama/ollama/issues/7596
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7596/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7596/comments
https://api.github.com/repos/ollama/ollama/issues/7596/events
https://github.com/ollama/ollama/issues/7596
2,647,219,698
I_kwDOJ0Z1Ps6dyV3y
7,596
unable to get images to be read in ollama 0.4.1 for llama3.2-vision
{ "login": "sivaramn", "id": 1205075, "node_id": "MDQ6VXNlcjEyMDUwNzU=", "avatar_url": "https://avatars.githubusercontent.com/u/1205075?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sivaramn", "html_url": "https://github.com/sivaramn", "followers_url": "https://api.github.com/users/sivaramn/followers", "following_url": "https://api.github.com/users/sivaramn/following{/other_user}", "gists_url": "https://api.github.com/users/sivaramn/gists{/gist_id}", "starred_url": "https://api.github.com/users/sivaramn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sivaramn/subscriptions", "organizations_url": "https://api.github.com/users/sivaramn/orgs", "repos_url": "https://api.github.com/users/sivaramn/repos", "events_url": "https://api.github.com/users/sivaramn/events{/privacy}", "received_events_url": "https://api.github.com/users/sivaramn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" } ]
closed
false
null
[]
null
2
2024-11-10T13:04:45
2024-11-17T14:00:26
2024-11-17T14:00:26
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? This is my setup on Windows 11 ![image](https://github.com/user-attachments/assets/a87ed038-6b41-4c8e-ac2c-8c805fb64337) Whatever image I upload, I get variations of the same message as shown. What am I doing incorrectly? The PNG file is 145 KB, if it matters ![image](https://github.com/user-attachments/assets/078edf9a-040a-4972-8ec2-d7912ea53112) ### OS Windows ### GPU _No response_ ### CPU Intel ### Ollama version 0.4.1
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/users/rick-github/followers", "following_url": "https://api.github.com/users/rick-github/following{/other_user}", "gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}", "starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rick-github/subscriptions", "organizations_url": "https://api.github.com/users/rick-github/orgs", "repos_url": "https://api.github.com/users/rick-github/repos", "events_url": "https://api.github.com/users/rick-github/events{/privacy}", "received_events_url": "https://api.github.com/users/rick-github/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7596/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7596/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7329
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7329/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7329/comments
https://api.github.com/repos/ollama/ollama/issues/7329/events
https://github.com/ollama/ollama/issues/7329
2,607,935,991
I_kwDOJ0Z1Ps6bcfH3
7,329
Terminate the current task after the REST request is actively ended
{ "login": "viosay", "id": 16093380, "node_id": "MDQ6VXNlcjE2MDkzMzgw", "avatar_url": "https://avatars.githubusercontent.com/u/16093380?v=4", "gravatar_id": "", "url": "https://api.github.com/users/viosay", "html_url": "https://github.com/viosay", "followers_url": "https://api.github.com/users/viosay/followers", "following_url": "https://api.github.com/users/viosay/following{/other_user}", "gists_url": "https://api.github.com/users/viosay/gists{/gist_id}", "starred_url": "https://api.github.com/users/viosay/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/viosay/subscriptions", "organizations_url": "https://api.github.com/users/viosay/orgs", "repos_url": "https://api.github.com/users/viosay/repos", "events_url": "https://api.github.com/users/viosay/events{/privacy}", "received_events_url": "https://api.github.com/users/viosay/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" } ]
closed
false
null
[]
null
2
2024-10-23T09:39:11
2024-10-23T16:27:42
2024-10-23T16:27:42
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Can the currently executing task be terminated after a REST request is actively interrupted?
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7329/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7329/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8023
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8023/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8023/comments
https://api.github.com/repos/ollama/ollama/issues/8023/events
https://github.com/ollama/ollama/issues/8023
2,729,160,957
I_kwDOJ0Z1Ps6iq7D9
8,023
Ollama is very slow after running for a while
{ "login": "minakami443", "id": 71554716, "node_id": "MDQ6VXNlcjcxNTU0NzE2", "avatar_url": "https://avatars.githubusercontent.com/u/71554716?v=4", "gravatar_id": "", "url": "https://api.github.com/users/minakami443", "html_url": "https://github.com/minakami443", "followers_url": "https://api.github.com/users/minakami443/followers", "following_url": "https://api.github.com/users/minakami443/following{/other_user}", "gists_url": "https://api.github.com/users/minakami443/gists{/gist_id}", "starred_url": "https://api.github.com/users/minakami443/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/minakami443/subscriptions", "organizations_url": "https://api.github.com/users/minakami443/orgs", "repos_url": "https://api.github.com/users/minakami443/repos", "events_url": "https://api.github.com/users/minakami443/events{/privacy}", "received_events_url": "https://api.github.com/users/minakami443/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
18
2024-12-10T06:35:28
2025-01-13T01:37:00
2025-01-13T01:37:00
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? CPU: Intel(R) Xeon(R) Silver 4410Y GPU: NVIDIA L40S-24Q 24GB DRAM: 32GB OS: Ubuntu 24.04.1 GPU Driver: vWS 550.127.05 / 550.90.07 CUDA: 12.4 Ollama version: v0.5.1/v0.4.7/v0.3.14 Model: llama3.1:8b/Gemma2:2b/Qwen2.5:7b Hello, I'm trying to run ollama on a VM with a grid GPU, but both locally and in Docker I'm experiencing the following problem: for the first 20 minutes after starting, the results are very good; both model loading and inference speed are satisfactory. However, when running again after some time, model loading and inference are abnormally slow. How can this situation be resolved? Any help would be greatly appreciated. Here's the log and GPU operation status: [ollama.log](https://github.com/user-attachments/files/18073070/ollama.log) ``` +-----------------------------------------------------------------------------------------+ | NVIDIA-SMI 550.127.05 Driver Version: 550.127.05 CUDA Version: 12.4 | |-----------------------------------------+------------------------+----------------------+ | GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC | | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | | | | MIG M. | |=========================================+========================+======================| | 0 NVIDIA L40S-24Q On | 00000000:03:00.0 Off | N/A | | N/A N/A P0 N/A / N/A | 4603MiB / 24576MiB | 0% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ +-----------------------------------------------------------------------------------------+ | Processes: | | GPU GI CI PID Type Process name GPU Memory | | ID ID Usage | |=========================================================================================| | 0 N/A N/A 208396 C ...unners/cuda_v12/ollama_llama_server 4594MiB | +-----------------------------------------------------------------------------------------+ ``` ### OS Linux ### GPU Nvidia ### CPU Intel ### Ollama version v0.5.1
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/users/rick-github/followers", "following_url": "https://api.github.com/users/rick-github/following{/other_user}", "gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}", "starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rick-github/subscriptions", "organizations_url": "https://api.github.com/users/rick-github/orgs", "repos_url": "https://api.github.com/users/rick-github/repos", "events_url": "https://api.github.com/users/rick-github/events{/privacy}", "received_events_url": "https://api.github.com/users/rick-github/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8023/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8023/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4768
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4768/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4768/comments
https://api.github.com/repos/ollama/ollama/issues/4768/events
https://github.com/ollama/ollama/issues/4768
2,329,218,923
I_kwDOJ0Z1Ps6K1Q9r
4,768
Cannot get web to verify service is running
{ "login": "brentfamily51", "id": 65859500, "node_id": "MDQ6VXNlcjY1ODU5NTAw", "avatar_url": "https://avatars.githubusercontent.com/u/65859500?v=4", "gravatar_id": "", "url": "https://api.github.com/users/brentfamily51", "html_url": "https://github.com/brentfamily51", "followers_url": "https://api.github.com/users/brentfamily51/followers", "following_url": "https://api.github.com/users/brentfamily51/following{/other_user}", "gists_url": "https://api.github.com/users/brentfamily51/gists{/gist_id}", "starred_url": "https://api.github.com/users/brentfamily51/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/brentfamily51/subscriptions", "organizations_url": "https://api.github.com/users/brentfamily51/orgs", "repos_url": "https://api.github.com/users/brentfamily51/repos", "events_url": "https://api.github.com/users/brentfamily51/events{/privacy}", "received_events_url": "https://api.github.com/users/brentfamily51/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-06-01T16:12:00
2024-06-01T19:31:22
2024-06-01T19:31:22
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I have installed ollama on Ubuntu (Proxmox LXC) successfully using the script, and I am able to download models and access it on the command line, but when I attempt to access ip:11434 the page fails to load. I appear to have no external access to the service on the machine. Any help is greatly appreciated. ### OS Linux ### GPU _No response_ ### CPU _No response_ ### Ollama version 0.1.39
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4768/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4768/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6709
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6709/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6709/comments
https://api.github.com/repos/ollama/ollama/issues/6709/events
https://github.com/ollama/ollama/issues/6709
2,513,372,647
I_kwDOJ0Z1Ps6VzwXn
6,709
ERROR unable to locate llm runner directory. Set OLLAMA_RUNNERS_DIR to the location of 'ollama/runners'
{ "login": "Harsha0056", "id": 153726150, "node_id": "U_kgDOCSmsxg", "avatar_url": "https://avatars.githubusercontent.com/u/153726150?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Harsha0056", "html_url": "https://github.com/Harsha0056", "followers_url": "https://api.github.com/users/Harsha0056/followers", "following_url": "https://api.github.com/users/Harsha0056/following{/other_user}", "gists_url": "https://api.github.com/users/Harsha0056/gists{/gist_id}", "starred_url": "https://api.github.com/users/Harsha0056/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Harsha0056/subscriptions", "organizations_url": "https://api.github.com/users/Harsha0056/orgs", "repos_url": "https://api.github.com/users/Harsha0056/repos", "events_url": "https://api.github.com/users/Harsha0056/events{/privacy}", "received_events_url": "https://api.github.com/users/Harsha0056/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5860134234, "node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg", "url": "https://api.github.com/repos/ollama/ollama/labels/windows", "name": "windows", "color": "0052CC", "default": false, "description": "" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
3
2024-09-09T09:03:14
2024-09-09T16:07:14
2024-09-09T16:07:04
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? Model is downloaded but I am unable to run it. Tried to locate ollama/runners but couldn't find it. 2024/09/09 04:48:23 config.go:173: ERROR unable to locate llm runner directory. Set OLLAMA_RUNNERS_DIR to the location of 'ollama/runners' ....... pulling manifest verifying sha256 digest writing manifest success Error: no suitable llama servers found Running on a Windows system Specs : Processor : Snapdragon(TM) X Elite - X1E78100 - Qualcomm(TM) Oryon(TM) CPU 3.42 GHz GPU : Qualcomm(R) Adreno(TM) X1-85 NPU : Snapdragon(R) X Elite - X1E78100 - Qualcomm(R) Hexagon (TM) ### OS Windows ### GPU Other ### CPU Other ### Ollama version 0.3.9
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6709/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6709/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3507
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3507/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3507/comments
https://api.github.com/repos/ollama/ollama/issues/3507/events
https://github.com/ollama/ollama/issues/3507
2,228,977,493
I_kwDOJ0Z1Ps6E239V
3,507
Switching dynamically between multiple LLM models on VRAM
{ "login": "Q-point", "id": 5604553, "node_id": "MDQ6VXNlcjU2MDQ1NTM=", "avatar_url": "https://avatars.githubusercontent.com/u/5604553?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Q-point", "html_url": "https://github.com/Q-point", "followers_url": "https://api.github.com/users/Q-point/followers", "following_url": "https://api.github.com/users/Q-point/following{/other_user}", "gists_url": "https://api.github.com/users/Q-point/gists{/gist_id}", "starred_url": "https://api.github.com/users/Q-point/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Q-point/subscriptions", "organizations_url": "https://api.github.com/users/Q-point/orgs", "repos_url": "https://api.github.com/users/Q-point/repos", "events_url": "https://api.github.com/users/Q-point/events{/privacy}", "received_events_url": "https://api.github.com/users/Q-point/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 6677745918, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g", "url": "https://api.github.com/repos/ollama/ollama/labels/gpu", "name": "gpu", "color": "76C49E", "default": false, "description": "" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
3
2024-04-06T00:25:39
2024-04-23T15:31:41
2024-04-23T15:31:41
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What are you trying to do? At the moment, Ollama needs to load LLMs one by one. It should be possible to have multiple LLMs resident in VRAM and switch dynamically between them. ### How should we solve this? 1. Check if the requested models can be loaded within the current hardware VRAM budget. 2. Load multiple models in VRAM. 3. Augment the API to switch dynamically between them. ### What is the impact of not solving this? The present latency when loading and unloading multiple models. ### Anything else? _No response_
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3507/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3507/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/364
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/364/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/364/comments
https://api.github.com/repos/ollama/ollama/issues/364/events
https://github.com/ollama/ollama/pull/364
1,853,945,697
PR_kwDOJ0Z1Ps5YGi0-
364
reimplement chunked uploads
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-08-16T21:45:42
2023-08-17T16:58:52
2023-08-17T16:58:51
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/364", "html_url": "https://github.com/ollama/ollama/pull/364", "diff_url": "https://github.com/ollama/ollama/pull/364.diff", "patch_url": "https://github.com/ollama/ollama/pull/364.patch", "merged_at": "2023-08-17T16:58:51" }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/364/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/364/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1110
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1110/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1110/comments
https://api.github.com/repos/ollama/ollama/issues/1110/events
https://github.com/ollama/ollama/issues/1110
1,990,979,824
I_kwDOJ0Z1Ps52q_Dw
1,110
Converting mistral7b finetuned model produces garbage output
{ "login": "amritap-ef", "id": 91881411, "node_id": "U_kgDOBXn_ww", "avatar_url": "https://avatars.githubusercontent.com/u/91881411?v=4", "gravatar_id": "", "url": "https://api.github.com/users/amritap-ef", "html_url": "https://github.com/amritap-ef", "followers_url": "https://api.github.com/users/amritap-ef/followers", "following_url": "https://api.github.com/users/amritap-ef/following{/other_user}", "gists_url": "https://api.github.com/users/amritap-ef/gists{/gist_id}", "starred_url": "https://api.github.com/users/amritap-ef/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/amritap-ef/subscriptions", "organizations_url": "https://api.github.com/users/amritap-ef/orgs", "repos_url": "https://api.github.com/users/amritap-ef/repos", "events_url": "https://api.github.com/users/amritap-ef/events{/privacy}", "received_events_url": "https://api.github.com/users/amritap-ef/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" } ]
closed
false
null
[]
null
3
2023-11-13T16:11:23
2024-10-16T20:45:34
2024-10-16T20:45:34
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi, I've finetuned a Mistral-7B-v0.1 model from Hugging Face using QLoRA as per the typical SFT scripts released by the Hugging Face team. However, after converting it to gguf format using the docker method, I found that my model produces garbage. For reference, this is the format of the prompts I trained on: ```<s> ### User: {{user prompt}} ### Assistant: {{assistant response}} </s> ``` The inference results from Hugging Face look ok; I struggle with getting the model to stop generating, but setting max_tokens generally solves this (although an unideal workaround). But when converting to gguf and running via Ollama, the model often goes off-task and does its own thing. Furthermore, unless I put '### Assistant:' and '</s>' as stop tokens, it often produces input interleaved with lots of '### Assistant''s e.g.: ``` We've noticed that some people love to go sightseeing in Paris and we have a special offer for this year! ### Assistant: I love Paris ### Assistant: I love Paris. ``` So I have a few questions/paths to investigate: 1. Has anyone been able to successfully fine-tune a model, convert it to gguf, and get it to work well with Ollama? 2. Could it be that I'm doing something wrong in converting the Mistral finetuned model? I save the merged LoRA model and run the docker conversion to gguf via llama.cpp on that merged model as per normal. 3. Could it be that I need to produce a better 'base' fine-tuned model before converting to gguf for it to work well in Ollama? Note, I also tried converting a base mistral7b model downloaded from Hugging Face as a sanity check and that worked great.
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1110/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1110/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/711
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/711/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/711/comments
https://api.github.com/repos/ollama/ollama/issues/711/events
https://github.com/ollama/ollama/pull/711
1,928,874,355
PR_kwDOJ0Z1Ps5cCqLp
711
validate api options fields from map
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2023-10-05T18:57:33
2023-10-12T15:18:12
2023-10-12T15:18:11
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/711", "html_url": "https://github.com/ollama/ollama/pull/711", "diff_url": "https://github.com/ollama/ollama/pull/711.diff", "patch_url": "https://github.com/ollama/ollama/pull/711.patch", "merged_at": "2023-10-12T15:18:11" }
We use a map to set options from the API so that we can see which option fields were specified, otherwise we override default options with zero values. The issue here is that there was no validation that the input option fields were valid, so using an incorrect field by mistake did not return an error. New response: ``` curl -X 'POST' -d '{"prompt":"hello", "model": "mistral", "options": {"seed": 1234, "temperature": 0, "test": 1234}}' 'http://127.0.0.1:11434/api/generate' {"error":"invalid options: test"} ``` from #694
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/711/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/711/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4896
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4896/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4896/comments
https://api.github.com/repos/ollama/ollama/issues/4896/events
https://github.com/ollama/ollama/pull/4896
2,339,568,092
PR_kwDOJ0Z1Ps5xwEhd
4,896
llm: update llama.cpp commit to `7c26775`
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-06-07T04:35:53
2024-06-17T19:56:17
2024-06-17T19:56:16
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/4896", "html_url": "https://github.com/ollama/ollama/pull/4896", "diff_url": "https://github.com/ollama/ollama/pull/4896.diff", "patch_url": "https://github.com/ollama/ollama/pull/4896.patch", "merged_at": "2024-06-17T19:56:16" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4896/reactions", "total_count": 4, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 2, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4896/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3841
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3841/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3841/comments
https://api.github.com/repos/ollama/ollama/issues/3841/events
https://github.com/ollama/ollama/issues/3841
2,258,383,763
I_kwDOJ0Z1Ps6GnDOT
3,841
Slow inference speed
{ "login": "userandpass", "id": 26294920, "node_id": "MDQ6VXNlcjI2Mjk0OTIw", "avatar_url": "https://avatars.githubusercontent.com/u/26294920?v=4", "gravatar_id": "", "url": "https://api.github.com/users/userandpass", "html_url": "https://github.com/userandpass", "followers_url": "https://api.github.com/users/userandpass/followers", "following_url": "https://api.github.com/users/userandpass/following{/other_user}", "gists_url": "https://api.github.com/users/userandpass/gists{/gist_id}", "starred_url": "https://api.github.com/users/userandpass/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/userandpass/subscriptions", "organizations_url": "https://api.github.com/users/userandpass/orgs", "repos_url": "https://api.github.com/users/userandpass/repos", "events_url": "https://api.github.com/users/userandpass/events{/privacy}", "received_events_url": "https://api.github.com/users/userandpass/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-04-23T09:24:34
2024-05-18T18:25:30
2024-05-18T18:25:30
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? A100 80G. Running qwen1.5 7B using the lmdeploy framework with two processes per card, versus using two cards to launch qwen1.5 7B via Ollama: Ollama is about 2 times slower than lmdeploy. ### OS Linux ### GPU Nvidia ### CPU _No response_ ### Ollama version 0.1.32
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3841/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3841/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/1512
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1512/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1512/comments
https://api.github.com/repos/ollama/ollama/issues/1512/events
https://github.com/ollama/ollama/issues/1512
2,040,673,497
I_kwDOJ0Z1Ps55ojTZ
1,512
Generation speed 3x slower after pulling newer docker image
{ "login": "johnp-4dvanalytics", "id": 54282900, "node_id": "MDQ6VXNlcjU0MjgyOTAw", "avatar_url": "https://avatars.githubusercontent.com/u/54282900?v=4", "gravatar_id": "", "url": "https://api.github.com/users/johnp-4dvanalytics", "html_url": "https://github.com/johnp-4dvanalytics", "followers_url": "https://api.github.com/users/johnp-4dvanalytics/followers", "following_url": "https://api.github.com/users/johnp-4dvanalytics/following{/other_user}", "gists_url": "https://api.github.com/users/johnp-4dvanalytics/gists{/gist_id}", "starred_url": "https://api.github.com/users/johnp-4dvanalytics/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/johnp-4dvanalytics/subscriptions", "organizations_url": "https://api.github.com/users/johnp-4dvanalytics/orgs", "repos_url": "https://api.github.com/users/johnp-4dvanalytics/repos", "events_url": "https://api.github.com/users/johnp-4dvanalytics/events{/privacy}", "received_events_url": "https://api.github.com/users/johnp-4dvanalytics/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-12-14T00:35:22
2024-02-20T01:26:11
2024-02-20T01:26:11
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I updated the docker image recently and the inference was 3x slower (30s for new image vs. 10s for old image). I was able to get back to the previous inference speed by reverting to ollama/ollama:0.1.10. I tried ollama/ollama:0.1.15 and ollama/ollama:0.1.14 and both had the slower inference speed. EDIT: model used was https://ollama.ai/library/openhermes2.5-mistral:latest
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1512/reactions", "total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/1512/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2155
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2155/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2155/comments
https://api.github.com/repos/ollama/ollama/issues/2155/events
https://github.com/ollama/ollama/issues/2155
2,095,474,558
I_kwDOJ0Z1Ps585md-
2,155
Unable to push: max retries exceeded on slower connections
{ "login": "sqs", "id": 1976, "node_id": "MDQ6VXNlcjE5NzY=", "avatar_url": "https://avatars.githubusercontent.com/u/1976?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sqs", "html_url": "https://github.com/sqs", "followers_url": "https://api.github.com/users/sqs/followers", "following_url": "https://api.github.com/users/sqs/following{/other_user}", "gists_url": "https://api.github.com/users/sqs/gists{/gist_id}", "starred_url": "https://api.github.com/users/sqs/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sqs/subscriptions", "organizations_url": "https://api.github.com/users/sqs/orgs", "repos_url": "https://api.github.com/users/sqs/repos", "events_url": "https://api.github.com/users/sqs/events{/privacy}", "received_events_url": "https://api.github.com/users/sqs/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6677370291, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw", "url": "https://api.github.com/repos/ollama/ollama/labels/networking", "name": "networking", "color": "0B5368", "default": false, "description": "Issues relating to ollama pull and push" } ]
open
false
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
14
2024-01-23T08:02:09
2024-05-14T09:47:34
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I was able to push the `q4_0` tag to https://ollama.ai/sqs/starchat, but when I try to push other tags, I get an error (see below). Note the `%!F(MISSING)` below in case that is an issue. The file size of the one that failed is 7.7 GB. The `q4_0` push that succeeded was 8.4 GB. ``` $ for i in q3_K_M q4_K_M q5_K_S q5_K_M f16 f32; do ollama create sqs/starchat:beta-$i -f Modelfile.$i && ollama push sqs/starchat:beta-$i; done transferring model data creating model layer creating template layer using already created layer sha256:62b0be00997dd300b03868d7858d28f41488c0222bfc4fbb6ceb3eae39a5d4d7 using already created layer sha256:ca40f7f0151766210faa524fa8710aabf07284671aaac525eeac350d64d05132 using already created layer sha256:dd473af9080c0674443f41cb6feb59ac1e24c34f18255c78d083f138f3275a0c writing manifest success retrieving manifest pushing 62b0be00997d... 0% ▕ ▏ 1.3 MB/8.2 GB 5.2 MB/s 26m34s Error: max retries exceeded: Put "https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/repositories/sqs/starchat/_uploads/55c91d69-edf4-4a50-a278-2c7c697ba4e4/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=XXX%!F(MISSING)20240123%!F(MISSING)auto%!F(MISSING)s3%!F(MISSING)aws4_request&X-Amz-Date=20240123T072755Z&X-Amz-Expires=86400&X-Amz-SignedHeaders=host&partNumber=29&uploadId=XXX&X-Amz-Signature=XXX": write tcp 192.168.2.154:51301->104.18.9.90:443: write: broken pipe ``` (Note: I replaced URL query params that may contain credentials with `XXX`.) This may just be an ephemeral error. I'll close this tomorrow if the other pushes succeed.
{ "login": "sqs", "id": 1976, "node_id": "MDQ6VXNlcjE5NzY=", "avatar_url": "https://avatars.githubusercontent.com/u/1976?v=4", "gravatar_id": "", "url": "https://api.github.com/users/sqs", "html_url": "https://github.com/sqs", "followers_url": "https://api.github.com/users/sqs/followers", "following_url": "https://api.github.com/users/sqs/following{/other_user}", "gists_url": "https://api.github.com/users/sqs/gists{/gist_id}", "starred_url": "https://api.github.com/users/sqs/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sqs/subscriptions", "organizations_url": "https://api.github.com/users/sqs/orgs", "repos_url": "https://api.github.com/users/sqs/repos", "events_url": "https://api.github.com/users/sqs/events{/privacy}", "received_events_url": "https://api.github.com/users/sqs/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2155/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2155/timeline
null
reopened
false
https://api.github.com/repos/ollama/ollama/issues/518
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/518/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/518/comments
https://api.github.com/repos/ollama/ollama/issues/518/events
https://github.com/ollama/ollama/pull/518
1,893,124,094
PR_kwDOJ0Z1Ps5aKYx_
518
amd64 linux build runner
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-09-12T19:26:16
2023-10-20T16:44:05
2023-09-21T13:48:18
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/518", "html_url": "https://github.com/ollama/ollama/pull/518", "diff_url": "https://github.com/ollama/ollama/pull/518.diff", "patch_url": "https://github.com/ollama/ollama/pull/518.patch", "merged_at": null }
Add automation that automatically creates a single ollama binary for amd64 linux builds. Limitations: - Requires glibc 2.29 (the glibc version ubuntu 20.04 has packed in); ideally we build on an ubuntu 16.04 or 18.04 runner instead to maximize glibc compatibility, but that will require a custom runner. `glibc` is used by linux to access kernel functionality, so it can't really be updated by an end-user without updating their OS. Future work: - Ideally I'd rather just install both versions of nvcc on one runner and swap between them. I tried this and I hit some issues with the wrong cuda version being referenced during builds.
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/518/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/518/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6135
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6135/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6135/comments
https://api.github.com/repos/ollama/ollama/issues/6135/events
https://github.com/ollama/ollama/issues/6135
2,443,942,428
I_kwDOJ0Z1Ps6Rq5oc
6,135
When processing long texts, it freezes and there is no response.
{ "login": "jiaguangqi", "id": 52811251, "node_id": "MDQ6VXNlcjUyODExMjUx", "avatar_url": "https://avatars.githubusercontent.com/u/52811251?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jiaguangqi", "html_url": "https://github.com/jiaguangqi", "followers_url": "https://api.github.com/users/jiaguangqi/followers", "following_url": "https://api.github.com/users/jiaguangqi/following{/other_user}", "gists_url": "https://api.github.com/users/jiaguangqi/gists{/gist_id}", "starred_url": "https://api.github.com/users/jiaguangqi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jiaguangqi/subscriptions", "organizations_url": "https://api.github.com/users/jiaguangqi/orgs", "repos_url": "https://api.github.com/users/jiaguangqi/repos", "events_url": "https://api.github.com/users/jiaguangqi/events{/privacy}", "received_events_url": "https://api.github.com/users/jiaguangqi/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
5
2024-08-02T03:45:05
2024-09-13T18:10:57
2024-09-13T18:10:57
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I am using Ollama to run llama3.1:70b. When processing long texts, it freezes and there is no response. I have checked the server’s CPU, memory, and GPU usage, and they are all low. What could be the problem? ### OS Linux ### GPU Nvidia ### CPU AMD ### Ollama version 0.1.30
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6135/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6135/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5784
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5784/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5784/comments
https://api.github.com/repos/ollama/ollama/issues/5784/events
https://github.com/ollama/ollama/issues/5784
2,417,619,837
I_kwDOJ0Z1Ps6QGfN9
5,784
How to Deploy LLM Based on ollama in an offline environment?
{ "login": "RyanOvO", "id": 31231960, "node_id": "MDQ6VXNlcjMxMjMxOTYw", "avatar_url": "https://avatars.githubusercontent.com/u/31231960?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RyanOvO", "html_url": "https://github.com/RyanOvO", "followers_url": "https://api.github.com/users/RyanOvO/followers", "following_url": "https://api.github.com/users/RyanOvO/following{/other_user}", "gists_url": "https://api.github.com/users/RyanOvO/gists{/gist_id}", "starred_url": "https://api.github.com/users/RyanOvO/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RyanOvO/subscriptions", "organizations_url": "https://api.github.com/users/RyanOvO/orgs", "repos_url": "https://api.github.com/users/RyanOvO/repos", "events_url": "https://api.github.com/users/RyanOvO/events{/privacy}", "received_events_url": "https://api.github.com/users/RyanOvO/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
15
2024-07-19T01:03:36
2024-08-20T03:03:56
2024-07-19T03:36:59
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
ollama is now ready and I am deploying based on binary files. I have downloaded the GLM4 model file and [the glm4 adaptation file for ollama](https://ollama.com/library/glm4:9b). However, GLM4 does not provide a GGUF file at present. How can I deploy glm4 for inference with ollama in an offline environment?
{ "login": "RyanOvO", "id": 31231960, "node_id": "MDQ6VXNlcjMxMjMxOTYw", "avatar_url": "https://avatars.githubusercontent.com/u/31231960?v=4", "gravatar_id": "", "url": "https://api.github.com/users/RyanOvO", "html_url": "https://github.com/RyanOvO", "followers_url": "https://api.github.com/users/RyanOvO/followers", "following_url": "https://api.github.com/users/RyanOvO/following{/other_user}", "gists_url": "https://api.github.com/users/RyanOvO/gists{/gist_id}", "starred_url": "https://api.github.com/users/RyanOvO/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/RyanOvO/subscriptions", "organizations_url": "https://api.github.com/users/RyanOvO/orgs", "repos_url": "https://api.github.com/users/RyanOvO/repos", "events_url": "https://api.github.com/users/RyanOvO/events{/privacy}", "received_events_url": "https://api.github.com/users/RyanOvO/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5784/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5784/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3625
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3625/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3625/comments
https://api.github.com/repos/ollama/ollama/issues/3625/events
https://github.com/ollama/ollama/issues/3625
2,241,574,568
I_kwDOJ0Z1Ps6Fm7ao
3,625
can't be installed on ubuntu running in a podman container.
{ "login": "olumolu", "id": 162728301, "node_id": "U_kgDOCbMJbQ", "avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4", "gravatar_id": "", "url": "https://api.github.com/users/olumolu", "html_url": "https://github.com/olumolu", "followers_url": "https://api.github.com/users/olumolu/followers", "following_url": "https://api.github.com/users/olumolu/following{/other_user}", "gists_url": "https://api.github.com/users/olumolu/gists{/gist_id}", "starred_url": "https://api.github.com/users/olumolu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/olumolu/subscriptions", "organizations_url": "https://api.github.com/users/olumolu/orgs", "repos_url": "https://api.github.com/users/olumolu/repos", "events_url": "https://api.github.com/users/olumolu/events{/privacy}", "received_events_url": "https://api.github.com/users/olumolu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6433346500, "node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA", "url": "https://api.github.com/repos/ollama/ollama/labels/amd", "name": "amd", "color": "000000", "default": false, "description": "Issues relating to AMD GPUs and ROCm" }, { "id": 6677677816, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgVG-A", "url": "https://api.github.com/repos/ollama/ollama/labels/docker", "name": "docker", "color": "0052CC", "default": false, "description": "Issues relating to using ollama in containers" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
6
2024-04-13T14:01:22
2024-04-17T19:30:20
2024-04-16T16:15:26
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I spun up an Ubuntu container using podman, but I can't install ollama; the curl command does not work. ### What did you expect to see? _No response_ ### Steps to reproduce _No response_ ### Are there any recent changes that introduced the issue? _No response_ ### OS Linux ### Architecture amd64 ### Platform Docker ### Ollama version _No response_ ### GPU AMD ### GPU info _No response_ ### CPU AMD ### Other software _No response_
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3625/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3625/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/7094
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7094/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7094/comments
https://api.github.com/repos/ollama/ollama/issues/7094/events
https://github.com/ollama/ollama/issues/7094
2,565,121,022
I_kwDOJ0Z1Ps6Y5KP-
7,094
dark mode for website
{ "login": "hichemfantar", "id": 34947993, "node_id": "MDQ6VXNlcjM0OTQ3OTkz", "avatar_url": "https://avatars.githubusercontent.com/u/34947993?v=4", "gravatar_id": "", "url": "https://api.github.com/users/hichemfantar", "html_url": "https://github.com/hichemfantar", "followers_url": "https://api.github.com/users/hichemfantar/followers", "following_url": "https://api.github.com/users/hichemfantar/following{/other_user}", "gists_url": "https://api.github.com/users/hichemfantar/gists{/gist_id}", "starred_url": "https://api.github.com/users/hichemfantar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hichemfantar/subscriptions", "organizations_url": "https://api.github.com/users/hichemfantar/orgs", "repos_url": "https://api.github.com/users/hichemfantar/repos", "events_url": "https://api.github.com/users/hichemfantar/events{/privacy}", "received_events_url": "https://api.github.com/users/hichemfantar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
1
2024-10-03T22:51:43
2024-10-04T15:54:43
2024-10-04T15:54:43
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
These are the dark mode docs for tailwindcss https://tailwindcss.com/docs/dark-mode
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7094/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7094/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6109
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6109/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6109/comments
https://api.github.com/repos/ollama/ollama/issues/6109/events
https://github.com/ollama/ollama/pull/6109
2,441,145,025
PR_kwDOJ0Z1Ps53DNqX
6,109
fix modelfile message quotes
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-07-31T23:54:18
2024-08-01T00:05:45
2024-08-01T00:05:44
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6109", "html_url": "https://github.com/ollama/ollama/pull/6109", "diff_url": "https://github.com/ollama/ollama/pull/6109.diff", "patch_url": "https://github.com/ollama/ollama/pull/6109.patch", "merged_at": "2024-08-01T00:05:44" }
Message commands should quote their content, but the format string is incorrect, so the content is output verbatim. Fixing the formatting fixes the quoting. https://github.com/ollama/ollama/blob/main/parser/parser.go#L41-L43 resolves #6103
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6109/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6109/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/2227
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2227/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2227/comments
https://api.github.com/repos/ollama/ollama/issues/2227/events
https://github.com/ollama/ollama/issues/2227
2,103,466,374
I_kwDOJ0Z1Ps59YFmG
2,227
model not loading in GPU
{ "login": "muka", "id": 1021269, "node_id": "MDQ6VXNlcjEwMjEyNjk=", "avatar_url": "https://avatars.githubusercontent.com/u/1021269?v=4", "gravatar_id": "", "url": "https://api.github.com/users/muka", "html_url": "https://github.com/muka", "followers_url": "https://api.github.com/users/muka/followers", "following_url": "https://api.github.com/users/muka/following{/other_user}", "gists_url": "https://api.github.com/users/muka/gists{/gist_id}", "starred_url": "https://api.github.com/users/muka/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/muka/subscriptions", "organizations_url": "https://api.github.com/users/muka/orgs", "repos_url": "https://api.github.com/users/muka/repos", "events_url": "https://api.github.com/users/muka/events{/privacy}", "received_events_url": "https://api.github.com/users/muka/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2024-01-27T09:13:12
2024-01-27T09:24:30
2024-01-27T09:24:30
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hi, great project, congrats! I noticed that even if the ollama (in docker) logs say it offloaded layers to the GPU ![image](https://github.com/ollama/ollama/assets/1021269/e21f2348-22c9-43ab-84dd-232c9a75a019) nvidia-smi reports no actual usage ![image](https://github.com/ollama/ollama/assets/1021269/0cb21d32-e5f9-42de-b56f-eafe045c3bbe) Is this expected behaviour? A similar setup with [localai](https://github.com/mudler/LocalAI) has similar logs but with better performance, indicating it is actually using the GPU
{ "login": "muka", "id": 1021269, "node_id": "MDQ6VXNlcjEwMjEyNjk=", "avatar_url": "https://avatars.githubusercontent.com/u/1021269?v=4", "gravatar_id": "", "url": "https://api.github.com/users/muka", "html_url": "https://github.com/muka", "followers_url": "https://api.github.com/users/muka/followers", "following_url": "https://api.github.com/users/muka/following{/other_user}", "gists_url": "https://api.github.com/users/muka/gists{/gist_id}", "starred_url": "https://api.github.com/users/muka/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/muka/subscriptions", "organizations_url": "https://api.github.com/users/muka/orgs", "repos_url": "https://api.github.com/users/muka/repos", "events_url": "https://api.github.com/users/muka/events{/privacy}", "received_events_url": "https://api.github.com/users/muka/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2227/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2227/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2976
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2976/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2976/comments
https://api.github.com/repos/ollama/ollama/issues/2976/events
https://github.com/ollama/ollama/issues/2976
2,173,214,959
I_kwDOJ0Z1Ps6BiKDv
2,976
When https://api.openai.com/ is slow or times out, the openai/api/models request causes the UI not to be displayed
{ "login": "freefer", "id": 9861446, "node_id": "MDQ6VXNlcjk4NjE0NDY=", "avatar_url": "https://avatars.githubusercontent.com/u/9861446?v=4", "gravatar_id": "", "url": "https://api.github.com/users/freefer", "html_url": "https://github.com/freefer", "followers_url": "https://api.github.com/users/freefer/followers", "following_url": "https://api.github.com/users/freefer/following{/other_user}", "gists_url": "https://api.github.com/users/freefer/gists{/gist_id}", "starred_url": "https://api.github.com/users/freefer/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/freefer/subscriptions", "organizations_url": "https://api.github.com/users/freefer/orgs", "repos_url": "https://api.github.com/users/freefer/repos", "events_url": "https://api.github.com/users/freefer/events{/privacy}", "received_events_url": "https://api.github.com/users/freefer/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-03-07T07:56:54
2024-03-07T08:25:24
2024-03-07T08:14:35
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "freefer", "id": 9861446, "node_id": "MDQ6VXNlcjk4NjE0NDY=", "avatar_url": "https://avatars.githubusercontent.com/u/9861446?v=4", "gravatar_id": "", "url": "https://api.github.com/users/freefer", "html_url": "https://github.com/freefer", "followers_url": "https://api.github.com/users/freefer/followers", "following_url": "https://api.github.com/users/freefer/following{/other_user}", "gists_url": "https://api.github.com/users/freefer/gists{/gist_id}", "starred_url": "https://api.github.com/users/freefer/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/freefer/subscriptions", "organizations_url": "https://api.github.com/users/freefer/orgs", "repos_url": "https://api.github.com/users/freefer/repos", "events_url": "https://api.github.com/users/freefer/events{/privacy}", "received_events_url": "https://api.github.com/users/freefer/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2976/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2976/timeline
null
not_planned
false
https://api.github.com/repos/ollama/ollama/issues/2007
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2007/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2007/comments
https://api.github.com/repos/ollama/ollama/issues/2007/events
https://github.com/ollama/ollama/pull/2007
2,082,649,277
PR_kwDOJ0Z1Ps5kH7cP
2,007
Add multiple CPU variants for Intel Mac
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2024-01-15T20:38:20
2024-01-18T19:32:33
2024-01-18T19:32:29
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/2007", "html_url": "https://github.com/ollama/ollama/pull/2007", "diff_url": "https://github.com/ollama/ollama/pull/2007.diff", "patch_url": "https://github.com/ollama/ollama/pull/2007.patch", "merged_at": "2024-01-18T19:32:29" }
This also refines the build process for the ext_server build. I had initially aimed to get rid of the gcc/g++ library generation step and rely on cmake to build a shared library, but due to toolchain quirks, this model didn't work reliably (e.g. linux worked since it's a consistent toolchain, and arm mac worked, but intel mac segfaults when calling the init function pointer). This may still be achievable in a follow-up incremental PR, but for now I'll stick with g++ to create the main library we dlopen on all platforms except windows. Another potential follow-up is to consider splitting out the cuda shared libraries as a discrete download and handle it in the installer script if we don't detect cuda present on the host. That would further reduce the footprint and resolve the slow initial startup due to decompressing large payloads. _Marking draft until I have a chance to more fully test, but so far happy path testing on mac (intel/arm), windows (cuda), and linux (rocm/cuda) looks good._ Extracting the now compressed payloads takes some time - ~15s on my older laptop ``` 2024/01/15 11:12:42 payload_common.go:106: Extracting dynamic libraries... 2024/01/15 11:12:57 payload_common.go:145: Dynamic LLM libraries [rocm_v6 cpu cpu_avx2 cpu_avx cuda_v11 rocm_v5] ``` Uncompressed sizes once on disk: ``` % du -sh /tmp/ollama3226276348/* 36M /tmp/ollama3226276348/cpu 36M /tmp/ollama3226276348/cpu_avx 36M /tmp/ollama3226276348/cpu_avx2 410M /tmp/ollama3226276348/cuda_v11 30M /tmp/ollama3226276348/rocm_v5 31M /tmp/ollama3226276348/rocm_v6 ``` The actual linux binary: ``` % ls -lh ollama-linux-amd64 -rwxrwxr-x 1 daniel daniel 294M Jan 15 11:12 ollama-linux-amd64 ```
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2007/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2007/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7624
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7624/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7624/comments
https://api.github.com/repos/ollama/ollama/issues/7624/events
https://github.com/ollama/ollama/pull/7624
2,650,757,113
PR_kwDOJ0Z1Ps6Bkm5a
7,624
runner.go: Make KV entry accounting more robust
{ "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://api.github.com/users/jessegross/followers", "following_url": "https://api.github.com/users/jessegross/following{/other_user}", "gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}", "starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jessegross/subscriptions", "organizations_url": "https://api.github.com/users/jessegross/orgs", "repos_url": "https://api.github.com/users/jessegross/repos", "events_url": "https://api.github.com/users/jessegross/events{/privacy}", "received_events_url": "https://api.github.com/users/jessegross/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-11-12T01:22:06
2024-11-12T04:23:05
2024-11-12T04:23:03
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7624", "html_url": "https://github.com/ollama/ollama/pull/7624", "diff_url": "https://github.com/ollama/ollama/pull/7624.diff", "patch_url": "https://github.com/ollama/ollama/pull/7624.patch", "merged_at": "2024-11-12T04:23:03" }
The structure of the accounting for KV cache shifting was carried over from the old runner but it now doesn't feel natural with the new runner. There are a number of invariants that should hold true but are difficult to reason about. There is at least one bug report that would imply that the invariants are not holding. This reduces the number of implicit assumptions and is more forgiving of unexpected situations. It also improves behavior around which input tokens are kept when truncation occurs. Bug #7545
{ "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://api.github.com/users/jessegross/followers", "following_url": "https://api.github.com/users/jessegross/following{/other_user}", "gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}", "starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jessegross/subscriptions", "organizations_url": "https://api.github.com/users/jessegross/orgs", "repos_url": "https://api.github.com/users/jessegross/repos", "events_url": "https://api.github.com/users/jessegross/events{/privacy}", "received_events_url": "https://api.github.com/users/jessegross/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7624/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7624/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6979
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6979/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6979/comments
https://api.github.com/repos/ollama/ollama/issues/6979/events
https://github.com/ollama/ollama/issues/6979
2,550,723,603
I_kwDOJ0Z1Ps6YCPQT
6,979
llama3.2 with docker can not access image file
{ "login": "alperyilmaz", "id": 410615, "node_id": "MDQ6VXNlcjQxMDYxNQ==", "avatar_url": "https://avatars.githubusercontent.com/u/410615?v=4", "gravatar_id": "", "url": "https://api.github.com/users/alperyilmaz", "html_url": "https://github.com/alperyilmaz", "followers_url": "https://api.github.com/users/alperyilmaz/followers", "following_url": "https://api.github.com/users/alperyilmaz/following{/other_user}", "gists_url": "https://api.github.com/users/alperyilmaz/gists{/gist_id}", "starred_url": "https://api.github.com/users/alperyilmaz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alperyilmaz/subscriptions", "organizations_url": "https://api.github.com/users/alperyilmaz/orgs", "repos_url": "https://api.github.com/users/alperyilmaz/repos", "events_url": "https://api.github.com/users/alperyilmaz/events{/privacy}", "received_events_url": "https://api.github.com/users/alperyilmaz/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2024-09-26T14:12:46
2024-09-30T20:54:14
2024-09-30T20:54:14
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I was trying llama3.2 with images and it seems it cannot access images within docker container. Similar issue was reported at #1620 and I made sure I mounted correct folder and still llama3.2 can not access the image file. Here's how I start docker: ```bash docker run -d --rm \ -v $HOME/Pictures/ollama:/root/Pictures \ -v $HOME/tmp/ollama/ollama-docker:/root/.ollama \ -p 11434:11434 \ --name ollama ollama/ollama ``` Let's make sure the image is in correct folder by going in the container: ```bash $ docker exec -it ollama bash root@67c1c7b71307:/# ls /root/Pictures/ test.png ``` Let's run llama3.2 and ask something about the image: ```bash $ docker exec -it ollama ollama run llama3.2:3b >>> describe the image /root/Pictures/test.png I cannot provide a description of an image that you did not create. Is there anything else I can help you with? ``` What am I doing wrong here? ### OS Linux ### GPU _No response_ ### CPU Intel ### Ollama version 0.3.11
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6979/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 1, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6979/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6184
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6184/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6184/comments
https://api.github.com/repos/ollama/ollama/issues/6184/events
https://github.com/ollama/ollama/issues/6184
2,449,064,940
I_kwDOJ0Z1Ps6R-cPs
6,184
Add InternLM 2.5 family of models
{ "login": "nviraj", "id": 8409854, "node_id": "MDQ6VXNlcjg0MDk4NTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/8409854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/nviraj", "html_url": "https://github.com/nviraj", "followers_url": "https://api.github.com/users/nviraj/followers", "following_url": "https://api.github.com/users/nviraj/following{/other_user}", "gists_url": "https://api.github.com/users/nviraj/gists{/gist_id}", "starred_url": "https://api.github.com/users/nviraj/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/nviraj/subscriptions", "organizations_url": "https://api.github.com/users/nviraj/orgs", "repos_url": "https://api.github.com/users/nviraj/repos", "events_url": "https://api.github.com/users/nviraj/events{/privacy}", "received_events_url": "https://api.github.com/users/nviraj/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
4
2024-08-05T17:18:58
2024-08-17T04:47:12
2024-08-15T20:07:45
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Please add [InternLM 2.5 family of models](https://huggingface.co/collections/internlm/internlm25-66853f32717072d17581bc13) to registry. Comes in 1.8, 7 and 20b sizes with both base and chat models. While [this](https://ollama.com/internlm/internlm2.5) exists, at the moment it is F16 only. Thank you!
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6184/reactions", "total_count": 4, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/6184/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6958
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6958/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6958/comments
https://api.github.com/repos/ollama/ollama/issues/6958/events
https://github.com/ollama/ollama/issues/6958
2,548,423,443
I_kwDOJ0Z1Ps6X5dsT
6,958
molmo by allen ai support
{ "login": "olumolu", "id": 162728301, "node_id": "U_kgDOCbMJbQ", "avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4", "gravatar_id": "", "url": "https://api.github.com/users/olumolu", "html_url": "https://github.com/olumolu", "followers_url": "https://api.github.com/users/olumolu/followers", "following_url": "https://api.github.com/users/olumolu/following{/other_user}", "gists_url": "https://api.github.com/users/olumolu/gists{/gist_id}", "starred_url": "https://api.github.com/users/olumolu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/olumolu/subscriptions", "organizations_url": "https://api.github.com/users/olumolu/orgs", "repos_url": "https://api.github.com/users/olumolu/repos", "events_url": "https://api.github.com/users/olumolu/events{/privacy}", "received_events_url": "https://api.github.com/users/olumolu/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
open
false
null
[]
null
32
2024-09-25T16:19:43
2025-01-17T20:00:22
null
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
![GYVKif8XoAAFLAw](https://github.com/user-attachments/assets/74dead10-7370-4360-a326-41e40446f5b0) https://huggingface.co/allenai/Molmo-7B-D-0924 https://huggingface.co/allenai/Molmo-72B-0924 These models are really good, have potential, and are fully open-source; please add support for them. Thanks.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6958/reactions", "total_count": 101, "+1": 72, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 19, "eyes": 10 }
https://api.github.com/repos/ollama/ollama/issues/6958/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/2216
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2216/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2216/comments
https://api.github.com/repos/ollama/ollama/issues/2216/events
https://github.com/ollama/ollama/issues/2216
2,102,825,534
I_kwDOJ0Z1Ps59VpI-
2,216
ollama.ai and registry.ollama.ai does not have IPv6
{ "login": "miyurusankalpa", "id": 1833185, "node_id": "MDQ6VXNlcjE4MzMxODU=", "avatar_url": "https://avatars.githubusercontent.com/u/1833185?v=4", "gravatar_id": "", "url": "https://api.github.com/users/miyurusankalpa", "html_url": "https://github.com/miyurusankalpa", "followers_url": "https://api.github.com/users/miyurusankalpa/followers", "following_url": "https://api.github.com/users/miyurusankalpa/following{/other_user}", "gists_url": "https://api.github.com/users/miyurusankalpa/gists{/gist_id}", "starred_url": "https://api.github.com/users/miyurusankalpa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/miyurusankalpa/subscriptions", "organizations_url": "https://api.github.com/users/miyurusankalpa/orgs", "repos_url": "https://api.github.com/users/miyurusankalpa/repos", "events_url": "https://api.github.com/users/miyurusankalpa/events{/privacy}", "received_events_url": "https://api.github.com/users/miyurusankalpa/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 6573197867, "node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw", "url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com", "name": "ollama.com", "color": "ffffff", "default": false, "description": "" } ]
open
false
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
3
2024-01-26T20:06:48
2025-01-30T07:14:40
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
ollama.ai and registry.ollama.ai does not have IPv6 and it cannot be installed on AWS IPv6 only servers. `Error: max retries exceeded: Get "https://registry.ollama.ai/v2/library/codellama/blobs/sha256:3a43f93b78ec50f7c4e4dc8bd1cb3fff5a900e7d574c51a6f7495e48486e0dac": dial tcp 34.120.132.20:443: connect: network is unreachable` 34.120.132.20 seems to on GCP, which supports IPv6.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2216/reactions", "total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/2216/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/1815
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1815/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1815/comments
https://api.github.com/repos/ollama/ollama/issues/1815/events
https://github.com/ollama/ollama/pull/1815
2,068,043,423
PR_kwDOJ0Z1Ps5jWZDc
1,815
Add unit tests for Parser
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-01-05T21:10:42
2024-01-05T22:04:32
2024-01-05T22:04:32
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/1815", "html_url": "https://github.com/ollama/ollama/pull/1815", "diff_url": "https://github.com/ollama/ollama/pull/1815.diff", "patch_url": "https://github.com/ollama/ollama/pull/1815.patch", "merged_at": "2024-01-05T22:04:32" }
This adds a couple of basic unit tests for parsing. Suggested in #1809
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1815/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1815/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/600
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/600/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/600/comments
https://api.github.com/repos/ollama/ollama/issues/600/events
https://github.com/ollama/ollama/issues/600
1,912,711,759
I_kwDOJ0Z1Ps5yAapP
600
Support speechless-llama2-hermes-orca-platypus-wizardlm-13b or Qwen-14b models
{ "login": "cheuk-cheng", "id": 72942052, "node_id": "MDQ6VXNlcjcyOTQyMDUy", "avatar_url": "https://avatars.githubusercontent.com/u/72942052?v=4", "gravatar_id": "", "url": "https://api.github.com/users/cheuk-cheng", "html_url": "https://github.com/cheuk-cheng", "followers_url": "https://api.github.com/users/cheuk-cheng/followers", "following_url": "https://api.github.com/users/cheuk-cheng/following{/other_user}", "gists_url": "https://api.github.com/users/cheuk-cheng/gists{/gist_id}", "starred_url": "https://api.github.com/users/cheuk-cheng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/cheuk-cheng/subscriptions", "organizations_url": "https://api.github.com/users/cheuk-cheng/orgs", "repos_url": "https://api.github.com/users/cheuk-cheng/repos", "events_url": "https://api.github.com/users/cheuk-cheng/events{/privacy}", "received_events_url": "https://api.github.com/users/cheuk-cheng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
5
2023-09-26T05:31:13
2023-12-22T03:56:37
2023-12-22T03:56:36
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Is there any plan to support recent models like speechless-llama2-hermes-orca-platypus-wizardlm-13b or Qwen-14b? User reviews seem to indicate that they are powerful and fast even compared with other llama2 models of larger sizes. Thanks.
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/600/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/600/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8013
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8013/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8013/comments
https://api.github.com/repos/ollama/ollama/issues/8013/events
https://github.com/ollama/ollama/issues/8013
2,727,588,546
I_kwDOJ0Z1Ps6ik7LC
8,013
go get version mismatch
{ "login": "fxmbsw7", "id": 39368685, "node_id": "MDQ6VXNlcjM5MzY4Njg1", "avatar_url": "https://avatars.githubusercontent.com/u/39368685?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fxmbsw7", "html_url": "https://github.com/fxmbsw7", "followers_url": "https://api.github.com/users/fxmbsw7/followers", "following_url": "https://api.github.com/users/fxmbsw7/following{/other_user}", "gists_url": "https://api.github.com/users/fxmbsw7/gists{/gist_id}", "starred_url": "https://api.github.com/users/fxmbsw7/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/fxmbsw7/subscriptions", "organizations_url": "https://api.github.com/users/fxmbsw7/orgs", "repos_url": "https://api.github.com/users/fxmbsw7/repos", "events_url": "https://api.github.com/users/fxmbsw7/events{/privacy}", "received_events_url": "https://api.github.com/users/fxmbsw7/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-12-09T16:14:48
2024-12-10T05:58:27
2024-12-10T05:58:27
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? I tried `go get ...` and such, just testing go commands, especially for the newest versions, and this appeared: go: honnef.co/go/tools@v0.5.1 (matching ...@upgrade) requires github.com/BurntSushi/toml@v1.4.1-0.20240526193622-a339e1f7089c, not github.com/BurntSushi/toml@v1.4.0 (matching ...@upgrade) ### OS Linux ### GPU Other ### CPU Other ### Ollama version from git today
{ "login": "fxmbsw7", "id": 39368685, "node_id": "MDQ6VXNlcjM5MzY4Njg1", "avatar_url": "https://avatars.githubusercontent.com/u/39368685?v=4", "gravatar_id": "", "url": "https://api.github.com/users/fxmbsw7", "html_url": "https://github.com/fxmbsw7", "followers_url": "https://api.github.com/users/fxmbsw7/followers", "following_url": "https://api.github.com/users/fxmbsw7/following{/other_user}", "gists_url": "https://api.github.com/users/fxmbsw7/gists{/gist_id}", "starred_url": "https://api.github.com/users/fxmbsw7/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/fxmbsw7/subscriptions", "organizations_url": "https://api.github.com/users/fxmbsw7/orgs", "repos_url": "https://api.github.com/users/fxmbsw7/repos", "events_url": "https://api.github.com/users/fxmbsw7/events{/privacy}", "received_events_url": "https://api.github.com/users/fxmbsw7/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8013/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8013/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/582
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/582/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/582/comments
https://api.github.com/repos/ollama/ollama/issues/582/events
https://github.com/ollama/ollama/pull/582
1,910,038,627
PR_kwDOJ0Z1Ps5bDIX_
582
fix end-of-line issue with the new prompt
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-09-24T00:04:07
2023-09-24T00:20:31
2023-09-24T00:20:30
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/582", "html_url": "https://github.com/ollama/ollama/pull/582", "diff_url": "https://github.com/ollama/ollama/pull/582.diff", "patch_url": "https://github.com/ollama/ollama/pull/582.patch", "merged_at": "2023-09-24T00:20:30" }
The readline library had this fix which overwrote the end of the `S` in "Send a message..." which prevented the cursor from moving up the screen whenever you backspaced through the end of the line. We removed it to fix the placeholder text issue, but then the bug crept back in.
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/582/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/582/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/576
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/576/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/576/comments
https://api.github.com/repos/ollama/ollama/issues/576/events
https://github.com/ollama/ollama/pull/576
1,909,332,077
PR_kwDOJ0Z1Ps5bA4ML
576
ubuntu cuda drivers
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-09-22T18:13:11
2023-09-22T18:43:15
2023-09-22T18:43:14
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/576", "html_url": "https://github.com/ollama/ollama/pull/576", "diff_url": "https://github.com/ollama/ollama/pull/576.diff", "patch_url": "https://github.com/ollama/ollama/pull/576.patch", "merged_at": "2023-09-22T18:43:14" }
Add automatic cuda driver install to our install script.
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/576/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/576/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7803
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7803/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7803/comments
https://api.github.com/repos/ollama/ollama/issues/7803/events
https://github.com/ollama/ollama/issues/7803
2,684,582,879
I_kwDOJ0Z1Ps6gA3vf
7,803
problem with ollama serve
{ "login": "atb29", "id": 144506608, "node_id": "U_kgDOCJz-8A", "avatar_url": "https://avatars.githubusercontent.com/u/144506608?v=4", "gravatar_id": "", "url": "https://api.github.com/users/atb29", "html_url": "https://github.com/atb29", "followers_url": "https://api.github.com/users/atb29/followers", "following_url": "https://api.github.com/users/atb29/following{/other_user}", "gists_url": "https://api.github.com/users/atb29/gists{/gist_id}", "starred_url": "https://api.github.com/users/atb29/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/atb29/subscriptions", "organizations_url": "https://api.github.com/users/atb29/orgs", "repos_url": "https://api.github.com/users/atb29/repos", "events_url": "https://api.github.com/users/atb29/events{/privacy}", "received_events_url": "https://api.github.com/users/atb29/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 6677367769, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q", "url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info", "name": "needs more info", "color": "BA8041", "default": false, "description": "More information is needed to assist" } ]
closed
false
null
[]
null
3
2024-11-22T21:12:11
2024-12-14T15:33:07
2024-12-14T15:33:07
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?

C:\Users\allowdin>ollama serve
Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.

### OS
Windows

### GPU
Other

### CPU
Intel

### Ollama version
0.3.13
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/users/rick-github/followers", "following_url": "https://api.github.com/users/rick-github/following{/other_user}", "gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}", "starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rick-github/subscriptions", "organizations_url": "https://api.github.com/users/rick-github/orgs", "repos_url": "https://api.github.com/users/rick-github/repos", "events_url": "https://api.github.com/users/rick-github/events{/privacy}", "received_events_url": "https://api.github.com/users/rick-github/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7803/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7803/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/58
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/58/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/58/comments
https://api.github.com/repos/ollama/ollama/issues/58/events
https://github.com/ollama/ollama/pull/58
1,794,392,513
PR_kwDOJ0Z1Ps5U9rY3
58
return error in generate response
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
1
2023-07-07T21:27:14
2023-07-10T21:35:18
2023-07-10T21:03:47
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/58", "html_url": "https://github.com/ollama/ollama/pull/58", "diff_url": "https://github.com/ollama/ollama/pull/58.diff", "patch_url": "https://github.com/ollama/ollama/pull/58.patch", "merged_at": "2023-07-10T21:03:47" }
null
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/58/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/58/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/4000
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4000/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4000/comments
https://api.github.com/repos/ollama/ollama/issues/4000/events
https://github.com/ollama/ollama/issues/4000
2,267,450,441
I_kwDOJ0Z1Ps6HJoxJ
4,000
Error: invalid file magic when trying to import a custom gguf model to ollama instance
{ "login": "atb29", "id": 144506608, "node_id": "U_kgDOCJz-8A", "avatar_url": "https://avatars.githubusercontent.com/u/144506608?v=4", "gravatar_id": "", "url": "https://api.github.com/users/atb29", "html_url": "https://github.com/atb29", "followers_url": "https://api.github.com/users/atb29/followers", "following_url": "https://api.github.com/users/atb29/following{/other_user}", "gists_url": "https://api.github.com/users/atb29/gists{/gist_id}", "starred_url": "https://api.github.com/users/atb29/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/atb29/subscriptions", "organizations_url": "https://api.github.com/users/atb29/orgs", "repos_url": "https://api.github.com/users/atb29/repos", "events_url": "https://api.github.com/users/atb29/events{/privacy}", "received_events_url": "https://api.github.com/users/atb29/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-04-28T09:21:07
2024-05-01T20:30:09
2024-05-01T20:30:09
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?

I got this error:

E:\phi3-mini-128k-gguf\model>ollama create phi-3-mini-128k -f Modelfile
transferring model data
creating model layer
Error: invalid file magic

Here is the Modelfile used:

FROM ./Phi-3-mini-128k-instruct.IQ4_XS.gguf
PARAMETER num_ctx 65536
PARAMETER num_keep 4
PARAMETER stop <|user|>
PARAMETER stop <|assistant|>
PARAMETER stop <|system|>
PARAMETER stop <|end|>
PARAMETER stop <|endoftext|>
TEMPLATE """
{{ if .System }}<|system|>
{{ .System }}<|end|>
{{ end }}{{ if .Prompt }}<|user|>
{{ .Prompt }}<|end|>
{{ end }}<|assistant|>
{{ .Response }}<|end|>
"""

### OS
Windows

### GPU
_No response_

### CPU
_No response_

### Ollama version
0.1.31
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4000/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4000/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6747
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6747/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6747/comments
https://api.github.com/repos/ollama/ollama/issues/6747/events
https://github.com/ollama/ollama/issues/6747
2,518,958,255
I_kwDOJ0Z1Ps6WJECv
6,747
ERROR: llama runner process has terminated: error loading model vocabulary: _Map_base::at
{ "login": "CjhHa1", "id": 72591262, "node_id": "MDQ6VXNlcjcyNTkxMjYy", "avatar_url": "https://avatars.githubusercontent.com/u/72591262?v=4", "gravatar_id": "", "url": "https://api.github.com/users/CjhHa1", "html_url": "https://github.com/CjhHa1", "followers_url": "https://api.github.com/users/CjhHa1/followers", "following_url": "https://api.github.com/users/CjhHa1/following{/other_user}", "gists_url": "https://api.github.com/users/CjhHa1/gists{/gist_id}", "starred_url": "https://api.github.com/users/CjhHa1/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/CjhHa1/subscriptions", "organizations_url": "https://api.github.com/users/CjhHa1/orgs", "repos_url": "https://api.github.com/users/CjhHa1/repos", "events_url": "https://api.github.com/users/CjhHa1/events{/privacy}", "received_events_url": "https://api.github.com/users/CjhHa1/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
5
2024-09-11T08:19:06
2025-01-08T16:06:35
2024-09-11T18:38:05
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?

I tried the Octopus v4 model using Ollama. I constructed the model using a Modelfile, but it does not work when I try to run it.

```
transferring model data 100%
using existing layer sha256:08fc24194c1805114ed7aefcc8e5cedf7fe1e23b587f1655502afb294263b948
using existing layer sha256:e49180b1ca46f2ea2a9d652ab44a1ab5dbd5d3f9ed3fd8e985a1122883d580d3
using existing layer sha256:90bf36950b9123ecc20aa969070a7b3d51bbb782100f0e47b2a2c5f3cf6a24da
writing manifest
success
(base) [root@ip-172-31-81-66 ollama_files]# ollama ls
NAME                       ID            SIZE    MODIFIED
octopus-v4-Q4_K_M:latest   099ce87bccef  2.4 GB  14 seconds ago
```

### OS
Linux

### GPU
Nvidia

### CPU
Intel

### Ollama version
ollama version is 0.3.6
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6747/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6747/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6869
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6869/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6869/comments
https://api.github.com/repos/ollama/ollama/issues/6869/events
https://github.com/ollama/ollama/issues/6869
2,535,207,221
I_kwDOJ0Z1Ps6XHDE1
6,869
An operation on a socket could not be performed
{ "login": "ipzzzzz", "id": 26899986, "node_id": "MDQ6VXNlcjI2ODk5OTg2", "avatar_url": "https://avatars.githubusercontent.com/u/26899986?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ipzzzzz", "html_url": "https://github.com/ipzzzzz", "followers_url": "https://api.github.com/users/ipzzzzz/followers", "following_url": "https://api.github.com/users/ipzzzzz/following{/other_user}", "gists_url": "https://api.github.com/users/ipzzzzz/gists{/gist_id}", "starred_url": "https://api.github.com/users/ipzzzzz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ipzzzzz/subscriptions", "organizations_url": "https://api.github.com/users/ipzzzzz/orgs", "repos_url": "https://api.github.com/users/ipzzzzz/repos", "events_url": "https://api.github.com/users/ipzzzzz/events{/privacy}", "received_events_url": "https://api.github.com/users/ipzzzzz/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" }, { "id": 5860134234, "node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg", "url": "https://api.github.com/repos/ollama/ollama/labels/windows", "name": "windows", "color": "0052CC", "default": false, "description": "" }, { "id": 6677367769, "node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q", "url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info", "name": "needs more info", "color": "BA8041", "default": false, "description": "More information is needed to assist" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
7
2024-09-19T03:42:13
2024-10-08T23:28:50
2024-10-08T23:28:50
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?

The error reported in the terminal:

dial tcp [xxx]:443: bind: An operation on a socket could not be performed because the system lacked sufficient buffer space or because a queue was full.

### OS
Windows

### GPU
Nvidia

### CPU
AMD

### Ollama version
0.3.10
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6869/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6869/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6125
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6125/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6125/comments
https://api.github.com/repos/ollama/ollama/issues/6125/events
https://github.com/ollama/ollama/issues/6125
2,443,163,116
I_kwDOJ0Z1Ps6Rn7Xs
6,125
minor bug: ggml/llama.cpp's new Q4_0_4_8 quantized files don't import into ollama
{ "login": "AndreasKunar", "id": 53893683, "node_id": "MDQ6VXNlcjUzODkzNjgz", "avatar_url": "https://avatars.githubusercontent.com/u/53893683?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AndreasKunar", "html_url": "https://github.com/AndreasKunar", "followers_url": "https://api.github.com/users/AndreasKunar/followers", "following_url": "https://api.github.com/users/AndreasKunar/following{/other_user}", "gists_url": "https://api.github.com/users/AndreasKunar/gists{/gist_id}", "starred_url": "https://api.github.com/users/AndreasKunar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AndreasKunar/subscriptions", "organizations_url": "https://api.github.com/users/AndreasKunar/orgs", "repos_url": "https://api.github.com/users/AndreasKunar/repos", "events_url": "https://api.github.com/users/AndreasKunar/events{/privacy}", "received_events_url": "https://api.github.com/users/AndreasKunar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
6
2024-08-01T18:45:11
2024-10-20T13:17:08
2024-08-11T11:02:02
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?

I built ollama on Ubuntu 24.04, running in Windows 11's WSL2 on my Surface 11 Pro, to test Ollama with llama.cpp's Q4_0_4_8 acceleration. Ollama+llama.cpp builds, imports my local llama-2 Q4_0 model, and runs it. But when I try to import a local llama-2 Q4_0_4_8 model (which runs with llama.cpp), it gives an "Error: invalid file magic", apparently from its ggml.go module (at line 311?), which does not seem to understand the new Q4_0_4_4 and Q4_0_4_8 formats.

llama.cpp recently introduced these formats to accelerate modern arm64 CPUs like the Snapdragon X. They also work on other newer ARM CPUs and bring an up to 2-3x speed improvement. For details see [llama.cpp PR#5780](https://github.com/ggerganov/llama.cpp/pull/5780), and there seems to be work underway for x64.

P.S.: I tried this on Linux (Windows' WSL2), since building llama.cpp for Windows on ARM / Snapdragon X requires special build instructions (using clang instead of MSVC; see the [llama.cpp build instructions](https://github.com/ggerganov/llama.cpp/blob/master/docs/build.md)), and I'm not sure if ollama already follows these. @SebastianGode independently also had this issue.

### OS
Linux / Ubuntu 24.04 on WSL2, Windows on ARM

### GPU
None

### CPU
arm64 / Snapdragon X Plus

### Ollama version
0.3.2, 3e61426
{ "login": "AndreasKunar", "id": 53893683, "node_id": "MDQ6VXNlcjUzODkzNjgz", "avatar_url": "https://avatars.githubusercontent.com/u/53893683?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AndreasKunar", "html_url": "https://github.com/AndreasKunar", "followers_url": "https://api.github.com/users/AndreasKunar/followers", "following_url": "https://api.github.com/users/AndreasKunar/following{/other_user}", "gists_url": "https://api.github.com/users/AndreasKunar/gists{/gist_id}", "starred_url": "https://api.github.com/users/AndreasKunar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AndreasKunar/subscriptions", "organizations_url": "https://api.github.com/users/AndreasKunar/orgs", "repos_url": "https://api.github.com/users/AndreasKunar/repos", "events_url": "https://api.github.com/users/AndreasKunar/events{/privacy}", "received_events_url": "https://api.github.com/users/AndreasKunar/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6125/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6125/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3789
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3789/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3789/comments
https://api.github.com/repos/ollama/ollama/issues/3789/events
https://github.com/ollama/ollama/pull/3789
2,254,816,644
PR_kwDOJ0Z1Ps5tQuFX
3,789
chore: use errors.New to replace fmt.Errorf with no parameters will much better
{ "login": "ChengenH", "id": 36215014, "node_id": "MDQ6VXNlcjM2MjE1MDE0", "avatar_url": "https://avatars.githubusercontent.com/u/36215014?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ChengenH", "html_url": "https://github.com/ChengenH", "followers_url": "https://api.github.com/users/ChengenH/followers", "following_url": "https://api.github.com/users/ChengenH/following{/other_user}", "gists_url": "https://api.github.com/users/ChengenH/gists{/gist_id}", "starred_url": "https://api.github.com/users/ChengenH/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ChengenH/subscriptions", "organizations_url": "https://api.github.com/users/ChengenH/orgs", "repos_url": "https://api.github.com/users/ChengenH/repos", "events_url": "https://api.github.com/users/ChengenH/events{/privacy}", "received_events_url": "https://api.github.com/users/ChengenH/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-04-21T01:52:14
2024-04-21T02:11:06
2024-04-21T02:11:06
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/3789", "html_url": "https://github.com/ollama/ollama/pull/3789", "diff_url": "https://github.com/ollama/ollama/pull/3789.diff", "patch_url": "https://github.com/ollama/ollama/pull/3789.patch", "merged_at": "2024-04-21T02:11:06" }
null
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3789/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3789/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7533
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7533/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7533/comments
https://api.github.com/repos/ollama/ollama/issues/7533/events
https://github.com/ollama/ollama/pull/7533
2,639,283,291
PR_kwDOJ0Z1Ps6BHFv-
7,533
runner.go: Check for zero length images
{ "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://api.github.com/users/jessegross/followers", "following_url": "https://api.github.com/users/jessegross/following{/other_user}", "gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}", "starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jessegross/subscriptions", "organizations_url": "https://api.github.com/users/jessegross/orgs", "repos_url": "https://api.github.com/users/jessegross/repos", "events_url": "https://api.github.com/users/jessegross/events{/privacy}", "received_events_url": "https://api.github.com/users/jessegross/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-11-06T21:31:16
2024-11-08T17:39:33
2024-11-08T17:39:32
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7533", "html_url": "https://github.com/ollama/ollama/pull/7533", "diff_url": "https://github.com/ollama/ollama/pull/7533.diff", "patch_url": "https://github.com/ollama/ollama/pull/7533.patch", "merged_at": "2024-11-08T17:39:32" }
If we get a request with a zero length image, it will result in an out-of-bounds error when we pass the data to the image encoder.
{ "login": "jessegross", "id": 6468499, "node_id": "MDQ6VXNlcjY0Njg0OTk=", "avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jessegross", "html_url": "https://github.com/jessegross", "followers_url": "https://api.github.com/users/jessegross/followers", "following_url": "https://api.github.com/users/jessegross/following{/other_user}", "gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}", "starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jessegross/subscriptions", "organizations_url": "https://api.github.com/users/jessegross/orgs", "repos_url": "https://api.github.com/users/jessegross/repos", "events_url": "https://api.github.com/users/jessegross/events{/privacy}", "received_events_url": "https://api.github.com/users/jessegross/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7533/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7533/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6858
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6858/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6858/comments
https://api.github.com/repos/ollama/ollama/issues/6858/events
https://github.com/ollama/ollama/issues/6858
2,534,264,359
I_kwDOJ0Z1Ps6XDc4n
6,858
Unable to load adapter_model.safetensors for Phi3-Medium-128k
{ "login": "AAndersn", "id": 19935503, "node_id": "MDQ6VXNlcjE5OTM1NTAz", "avatar_url": "https://avatars.githubusercontent.com/u/19935503?v=4", "gravatar_id": "", "url": "https://api.github.com/users/AAndersn", "html_url": "https://github.com/AAndersn", "followers_url": "https://api.github.com/users/AAndersn/followers", "following_url": "https://api.github.com/users/AAndersn/following{/other_user}", "gists_url": "https://api.github.com/users/AAndersn/gists{/gist_id}", "starred_url": "https://api.github.com/users/AAndersn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AAndersn/subscriptions", "organizations_url": "https://api.github.com/users/AAndersn/orgs", "repos_url": "https://api.github.com/users/AAndersn/repos", "events_url": "https://api.github.com/users/AAndersn/events{/privacy}", "received_events_url": "https://api.github.com/users/AAndersn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-09-18T17:11:53
2024-12-02T22:58:01
2024-12-02T22:58:01
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue?

Trying to load a safetensors adapter file for phi3-medium-128k using a .modelfile. I generated an adapter_config.json and adapter_model.safetensors files using lora training and copied them into the ollama docker container.

I have generated the modelfile with `ollama show phi3:medium --modelfile > phi3_med_cim.modelfile` and edited it to add:

```
# Modelfile generated by "ollama show"
FROM phi3:medium
ADAPTER /home/medium_128k
TEMPLATE "{{ if .System }}<|system|>
...
```

When trying to create the model, I am calling `ollama create phi3_med_cim --file /home/phi3_med_cim.modelfile`, which throws the following error:

```
transferring model data 100%
converting model
Error: unsupported architecture
```

These steps work fine for phi3-mini-4k and llama3.1, but not for phi3-medium.

### OS
WSL2

### GPU
Nvidia

### CPU
Intel

### Ollama version
0.3.10 (docker)
{ "login": "rick-github", "id": 14946854, "node_id": "MDQ6VXNlcjE0OTQ2ODU0", "avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4", "gravatar_id": "", "url": "https://api.github.com/users/rick-github", "html_url": "https://github.com/rick-github", "followers_url": "https://api.github.com/users/rick-github/followers", "following_url": "https://api.github.com/users/rick-github/following{/other_user}", "gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}", "starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rick-github/subscriptions", "organizations_url": "https://api.github.com/users/rick-github/orgs", "repos_url": "https://api.github.com/users/rick-github/repos", "events_url": "https://api.github.com/users/rick-github/events{/privacy}", "received_events_url": "https://api.github.com/users/rick-github/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6858/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6858/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6166
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6166/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6166/comments
https://api.github.com/repos/ollama/ollama/issues/6166/events
https://github.com/ollama/ollama/pull/6166
2,447,367,727
PR_kwDOJ0Z1Ps53Yglf
6,166
Added build support for RISC-V
{ "login": "HougeLangley", "id": 1161594, "node_id": "MDQ6VXNlcjExNjE1OTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HougeLangley", "html_url": "https://github.com/HougeLangley", "followers_url": "https://api.github.com/users/HougeLangley/followers", "following_url": "https://api.github.com/users/HougeLangley/following{/other_user}", "gists_url": "https://api.github.com/users/HougeLangley/gists{/gist_id}", "starred_url": "https://api.github.com/users/HougeLangley/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HougeLangley/subscriptions", "organizations_url": "https://api.github.com/users/HougeLangley/orgs", "repos_url": "https://api.github.com/users/HougeLangley/repos", "events_url": "https://api.github.com/users/HougeLangley/events{/privacy}", "received_events_url": "https://api.github.com/users/HougeLangley/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-08-05T00:23:26
2024-08-27T13:14:46
2024-08-27T13:14:46
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6166", "html_url": "https://github.com/ollama/ollama/pull/6166", "diff_url": "https://github.com/ollama/ollama/pull/6166.diff", "patch_url": "https://github.com/ollama/ollama/pull/6166.patch", "merged_at": null }
1. Build support for RISC-V 2. Fix https://github.com/chewxy/math32/issues/46 cc @mxyng @jmorganca
{ "login": "HougeLangley", "id": 1161594, "node_id": "MDQ6VXNlcjExNjE1OTQ=", "avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4", "gravatar_id": "", "url": "https://api.github.com/users/HougeLangley", "html_url": "https://github.com/HougeLangley", "followers_url": "https://api.github.com/users/HougeLangley/followers", "following_url": "https://api.github.com/users/HougeLangley/following{/other_user}", "gists_url": "https://api.github.com/users/HougeLangley/gists{/gist_id}", "starred_url": "https://api.github.com/users/HougeLangley/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/HougeLangley/subscriptions", "organizations_url": "https://api.github.com/users/HougeLangley/orgs", "repos_url": "https://api.github.com/users/HougeLangley/repos", "events_url": "https://api.github.com/users/HougeLangley/events{/privacy}", "received_events_url": "https://api.github.com/users/HougeLangley/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6166/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6166/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/3332
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3332/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3332/comments
https://api.github.com/repos/ollama/ollama/issues/3332/events
https://github.com/ollama/ollama/issues/3332
2,204,774,244
I_kwDOJ0Z1Ps6Dai9k
3,332
what should I do after pulling?
{ "login": "jinja27", "id": 56258048, "node_id": "MDQ6VXNlcjU2MjU4MDQ4", "avatar_url": "https://avatars.githubusercontent.com/u/56258048?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jinja27", "html_url": "https://github.com/jinja27", "followers_url": "https://api.github.com/users/jinja27/followers", "following_url": "https://api.github.com/users/jinja27/following{/other_user}", "gists_url": "https://api.github.com/users/jinja27/gists{/gist_id}", "starred_url": "https://api.github.com/users/jinja27/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jinja27/subscriptions", "organizations_url": "https://api.github.com/users/jinja27/orgs", "repos_url": "https://api.github.com/users/jinja27/repos", "events_url": "https://api.github.com/users/jinja27/events{/privacy}", "received_events_url": "https://api.github.com/users/jinja27/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-03-25T02:49:17
2024-03-25T02:53:20
2024-03-25T02:53:20
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
null
{ "login": "jinja27", "id": 56258048, "node_id": "MDQ6VXNlcjU2MjU4MDQ4", "avatar_url": "https://avatars.githubusercontent.com/u/56258048?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jinja27", "html_url": "https://github.com/jinja27", "followers_url": "https://api.github.com/users/jinja27/followers", "following_url": "https://api.github.com/users/jinja27/following{/other_user}", "gists_url": "https://api.github.com/users/jinja27/gists{/gist_id}", "starred_url": "https://api.github.com/users/jinja27/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jinja27/subscriptions", "organizations_url": "https://api.github.com/users/jinja27/orgs", "repos_url": "https://api.github.com/users/jinja27/repos", "events_url": "https://api.github.com/users/jinja27/events{/privacy}", "received_events_url": "https://api.github.com/users/jinja27/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3332/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3332/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4198
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4198/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4198/comments
https://api.github.com/repos/ollama/ollama/issues/4198/events
https://github.com/ollama/ollama/issues/4198
2,280,584,462
I_kwDOJ0Z1Ps6H7vUO
4,198
Improving the efficiency of using multiple GPU cards.
{ "login": "zhqfdn", "id": 25156863, "node_id": "MDQ6VXNlcjI1MTU2ODYz", "avatar_url": "https://avatars.githubusercontent.com/u/25156863?v=4", "gravatar_id": "", "url": "https://api.github.com/users/zhqfdn", "html_url": "https://github.com/zhqfdn", "followers_url": "https://api.github.com/users/zhqfdn/followers", "following_url": "https://api.github.com/users/zhqfdn/following{/other_user}", "gists_url": "https://api.github.com/users/zhqfdn/gists{/gist_id}", "starred_url": "https://api.github.com/users/zhqfdn/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhqfdn/subscriptions", "organizations_url": "https://api.github.com/users/zhqfdn/orgs", "repos_url": "https://api.github.com/users/zhqfdn/repos", "events_url": "https://api.github.com/users/zhqfdn/events{/privacy}", "received_events_url": "https://api.github.com/users/zhqfdn/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
9
2024-05-06T10:49:23
2024-09-09T21:50:14
2024-06-14T22:35:02
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Before v0.1.32, a loaded model would be distributed evenly across all GPU cards, improving GPU utilization. In v0.1.32 and v0.1.33, loading a model automatically uses only one card, which is slower when multiple users are served simultaneously. Distributing the model evenly across multiple GPU cards would improve GPU utilization and efficiency. Tesla T4 GPU list ------------------------------------------------------------------- localhost.localdomain Mon May 6 18:41:30 2024 550.54.15 **[0] Tesla T4 | 54°C, 93 % | 12238 / 15360 MB | ollama(12236M)** [1] Tesla T4 | 36°C, 0 % | 2 / 15360 MB | [2] Tesla T4 | 30°C, 0 % | 2 / 15360 MB | [3] Tesla T4 | 33°C, 0 % | 2 / 15360 MB | ollama.service ------------------------------------------------------------------- `[Unit] Description=Ollama Service After=network-online.target [Service] Environment="OLLAMA_HOST=0.0.0.0:11434" Environment="OLLAMA_ORIGINS='*'" Environment="OLLAMA_MODELS=/ollama/ollama/models" Environment="OLLAMA_KEEP_ALIVE=10m" **Environment="OLLAMA_NUM_PARALLEL=4"** Environment="OLLAMA_MAX_LOADED_MODELS=2" Environment="CUDA_VISIBLE_DEVICES=0,1,2,3" ExecStart=/usr/local/bin/ollama serve User=ollama Group=ollama Restart=always RestartSec=3 Environment="PATH=/root/.local/bin:/root/bin:/usr/lib64/ccache:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin" [Install] WantedBy=default.target`
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4198/reactions", "total_count": 4, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 }
https://api.github.com/repos/ollama/ollama/issues/4198/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/3115
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/3115/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/3115/comments
https://api.github.com/repos/ollama/ollama/issues/3115/events
https://github.com/ollama/ollama/issues/3115
2,184,462,654
I_kwDOJ0Z1Ps6CNEE-
3,115
Switching between models is very slow?
{ "login": "eliranwong", "id": 25262722, "node_id": "MDQ6VXNlcjI1MjYyNzIy", "avatar_url": "https://avatars.githubusercontent.com/u/25262722?v=4", "gravatar_id": "", "url": "https://api.github.com/users/eliranwong", "html_url": "https://github.com/eliranwong", "followers_url": "https://api.github.com/users/eliranwong/followers", "following_url": "https://api.github.com/users/eliranwong/following{/other_user}", "gists_url": "https://api.github.com/users/eliranwong/gists{/gist_id}", "starred_url": "https://api.github.com/users/eliranwong/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eliranwong/subscriptions", "organizations_url": "https://api.github.com/users/eliranwong/orgs", "repos_url": "https://api.github.com/users/eliranwong/repos", "events_url": "https://api.github.com/users/eliranwong/events{/privacy}", "received_events_url": "https://api.github.com/users/eliranwong/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" } ]
closed
false
null
[]
null
2
2024-03-13T16:44:07
2024-04-24T13:25:59
2024-03-13T17:50:47
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I noticed that switching between models is very slow. Is it possible to load two different models in memory at the same time?
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/3115/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/3115/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8360
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8360/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8360/comments
https://api.github.com/repos/ollama/ollama/issues/8360/events
https://github.com/ollama/ollama/issues/8360
2,777,264,546
I_kwDOJ0Z1Ps6libGi
8,360
can't pull model from hf
{ "login": "Vacinlando", "id": 49453175, "node_id": "MDQ6VXNlcjQ5NDUzMTc1", "avatar_url": "https://avatars.githubusercontent.com/u/49453175?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Vacinlando", "html_url": "https://github.com/Vacinlando", "followers_url": "https://api.github.com/users/Vacinlando/followers", "following_url": "https://api.github.com/users/Vacinlando/following{/other_user}", "gists_url": "https://api.github.com/users/Vacinlando/gists{/gist_id}", "starred_url": "https://api.github.com/users/Vacinlando/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Vacinlando/subscriptions", "organizations_url": "https://api.github.com/users/Vacinlando/orgs", "repos_url": "https://api.github.com/users/Vacinlando/repos", "events_url": "https://api.github.com/users/Vacinlando/events{/privacy}", "received_events_url": "https://api.github.com/users/Vacinlando/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
2
2025-01-09T09:18:51
2025-01-19T02:13:20
2025-01-09T09:20:48
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? input: ollama run hf.co/bartowski/Qwen2.5-7B-Instruct-GGUF:F16 pulling manifest pulling 863c978275bc... 100% ▕████████████████████████████████████▏ 15 GB pulling e94a8ecb9327... 100% ▕████████████████████████████████████▏ 1.6 KB pulling 9ae14bd2c052... 0% ▕ ▏ 0 B/ 193 B Error: max retries exceeded: Get "https://huggingface.co/v2/bartowski/Qwen2.5-7B-Instruct-GGUF/blobs/sha256:9ae14bd2c0522aaeb6adf4b34396f056559a7c27854188ae32695f056d4015d9?__sign=eyJhbGciOiJFZERTQSJ9.eyJyZWFkIjp0cnVlLCJwZXJtaXNzaW9ucyI6eyJyZXBvLmNvbnRlbnQucmVhZCI6dHJ1ZX0sImlhdCI6MTczNjQwNzM3OCwic3ViIjoiL2JhcnRvd3NraS9Rd2VuMi41LTdCLUluc3RydWN0LUdHVUYiLCJleHAiOjE3MzY0MDc5NzgsImlzcyI6Imh0dHBzOi8vaHVnZ2luZ2ZhY2UuY28ifQ.P5a2zBON89Hv8KVRIQF0a5m0daM1sE0kF2MUedZNDA_n8ZNqsOBb4iY-n2a71WF4fN6GzRb3upR-TA0bHwcUCg": dial tcp [2a03:2880:f134:183:face:b00c:0:25de]:443: i/o timeout ### OS macOS ### GPU Apple ### CPU Apple ### Ollama version 0.5.4
{ "login": "Vacinlando", "id": 49453175, "node_id": "MDQ6VXNlcjQ5NDUzMTc1", "avatar_url": "https://avatars.githubusercontent.com/u/49453175?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Vacinlando", "html_url": "https://github.com/Vacinlando", "followers_url": "https://api.github.com/users/Vacinlando/followers", "following_url": "https://api.github.com/users/Vacinlando/following{/other_user}", "gists_url": "https://api.github.com/users/Vacinlando/gists{/gist_id}", "starred_url": "https://api.github.com/users/Vacinlando/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Vacinlando/subscriptions", "organizations_url": "https://api.github.com/users/Vacinlando/orgs", "repos_url": "https://api.github.com/users/Vacinlando/repos", "events_url": "https://api.github.com/users/Vacinlando/events{/privacy}", "received_events_url": "https://api.github.com/users/Vacinlando/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8360/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8360/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/8028
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/8028/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/8028/comments
https://api.github.com/repos/ollama/ollama/issues/8028/events
https://github.com/ollama/ollama/pull/8028
2,730,770,439
PR_kwDOJ0Z1Ps6EvbMZ
8,028
readme + examples: add pgai to readme projects and add example for semantic search
{ "login": "Askir", "id": 10251955, "node_id": "MDQ6VXNlcjEwMjUxOTU1", "avatar_url": "https://avatars.githubusercontent.com/u/10251955?v=4", "gravatar_id": "", "url": "https://api.github.com/users/Askir", "html_url": "https://github.com/Askir", "followers_url": "https://api.github.com/users/Askir/followers", "following_url": "https://api.github.com/users/Askir/following{/other_user}", "gists_url": "https://api.github.com/users/Askir/gists{/gist_id}", "starred_url": "https://api.github.com/users/Askir/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Askir/subscriptions", "organizations_url": "https://api.github.com/users/Askir/orgs", "repos_url": "https://api.github.com/users/Askir/repos", "events_url": "https://api.github.com/users/Askir/events{/privacy}", "received_events_url": "https://api.github.com/users/Askir/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-12-10T17:43:13
2024-12-16T09:02:28
2024-12-16T09:02:28
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/8028", "html_url": "https://github.com/ollama/ollama/pull/8028", "diff_url": "https://github.com/ollama/ollama/pull/8028.diff", "patch_url": "https://github.com/ollama/ollama/pull/8028.patch", "merged_at": "2024-12-16T09:02:28" }
Hey there, adding a mention of pgai to the community integrations section in the README, as well as a small demo script for the examples folder. Let me know if you'd like to change anything!
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/users/mchiang0610/followers", "following_url": "https://api.github.com/users/mchiang0610/following{/other_user}", "gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}", "starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions", "organizations_url": "https://api.github.com/users/mchiang0610/orgs", "repos_url": "https://api.github.com/users/mchiang0610/repos", "events_url": "https://api.github.com/users/mchiang0610/events{/privacy}", "received_events_url": "https://api.github.com/users/mchiang0610/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/8028/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/8028/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/1375
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/1375/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/1375/comments
https://api.github.com/repos/ollama/ollama/issues/1375/events
https://github.com/ollama/ollama/issues/1375
2,024,498,866
I_kwDOJ0Z1Ps54q2ay
1,375
Add option to not stream output in terminal
{ "login": "devinprater", "id": 15256014, "node_id": "MDQ6VXNlcjE1MjU2MDE0", "avatar_url": "https://avatars.githubusercontent.com/u/15256014?v=4", "gravatar_id": "", "url": "https://api.github.com/users/devinprater", "html_url": "https://github.com/devinprater", "followers_url": "https://api.github.com/users/devinprater/followers", "following_url": "https://api.github.com/users/devinprater/following{/other_user}", "gists_url": "https://api.github.com/users/devinprater/gists{/gist_id}", "starred_url": "https://api.github.com/users/devinprater/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/devinprater/subscriptions", "organizations_url": "https://api.github.com/users/devinprater/orgs", "repos_url": "https://api.github.com/users/devinprater/repos", "events_url": "https://api.github.com/users/devinprater/events{/privacy}", "received_events_url": "https://api.github.com/users/devinprater/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
1
2023-12-04T18:47:31
2024-05-17T13:55:07
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I use a screen reader, and when Ollama is sending output, it pauses while it's generating text, so, it, sounds, a little, like, this. Not too big a deal, but it would make it sound a bit better to have it output all at once, when done generating, not in a stream. Thanks!
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/1375/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/1375/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6555
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6555/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6555/comments
https://api.github.com/repos/ollama/ollama/issues/6555/events
https://github.com/ollama/ollama/issues/6555
2,494,487,024
I_kwDOJ0Z1Ps6Urtnw
6,555
/api/embed returns empty embeddings in docker environment
{ "login": "smoothdvd", "id": 22420, "node_id": "MDQ6VXNlcjIyNDIw", "avatar_url": "https://avatars.githubusercontent.com/u/22420?v=4", "gravatar_id": "", "url": "https://api.github.com/users/smoothdvd", "html_url": "https://github.com/smoothdvd", "followers_url": "https://api.github.com/users/smoothdvd/followers", "following_url": "https://api.github.com/users/smoothdvd/following{/other_user}", "gists_url": "https://api.github.com/users/smoothdvd/gists{/gist_id}", "starred_url": "https://api.github.com/users/smoothdvd/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/smoothdvd/subscriptions", "organizations_url": "https://api.github.com/users/smoothdvd/orgs", "repos_url": "https://api.github.com/users/smoothdvd/repos", "events_url": "https://api.github.com/users/smoothdvd/events{/privacy}", "received_events_url": "https://api.github.com/users/smoothdvd/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-08-29T13:40:15
2024-08-29T16:06:51
2024-08-29T16:06:51
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? ``` curl http://localhost:11434/api/embed -d '{ "model": "bge-m3", "prompt": "Llamas are members of the camelid family" }' {"model":"bge-m3","embeddings":[]}% ``` but the old /api/embeddings endpoint still works ### OS Docker ### GPU Nvidia ### CPU Intel ### Ollama version 0.3.8
{ "login": "smoothdvd", "id": 22420, "node_id": "MDQ6VXNlcjIyNDIw", "avatar_url": "https://avatars.githubusercontent.com/u/22420?v=4", "gravatar_id": "", "url": "https://api.github.com/users/smoothdvd", "html_url": "https://github.com/smoothdvd", "followers_url": "https://api.github.com/users/smoothdvd/followers", "following_url": "https://api.github.com/users/smoothdvd/following{/other_user}", "gists_url": "https://api.github.com/users/smoothdvd/gists{/gist_id}", "starred_url": "https://api.github.com/users/smoothdvd/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/smoothdvd/subscriptions", "organizations_url": "https://api.github.com/users/smoothdvd/orgs", "repos_url": "https://api.github.com/users/smoothdvd/repos", "events_url": "https://api.github.com/users/smoothdvd/events{/privacy}", "received_events_url": "https://api.github.com/users/smoothdvd/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6555/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6555/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/4778
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/4778/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/4778/comments
https://api.github.com/repos/ollama/ollama/issues/4778/events
https://github.com/ollama/ollama/issues/4778
2,329,441,744
I_kwDOJ0Z1Ps6K2HXQ
4,778
Support for jina-embeddings-v2-base-zh
{ "login": "wwjCMP", "id": 32979859, "node_id": "MDQ6VXNlcjMyOTc5ODU5", "avatar_url": "https://avatars.githubusercontent.com/u/32979859?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wwjCMP", "html_url": "https://github.com/wwjCMP", "followers_url": "https://api.github.com/users/wwjCMP/followers", "following_url": "https://api.github.com/users/wwjCMP/following{/other_user}", "gists_url": "https://api.github.com/users/wwjCMP/gists{/gist_id}", "starred_url": "https://api.github.com/users/wwjCMP/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wwjCMP/subscriptions", "organizations_url": "https://api.github.com/users/wwjCMP/orgs", "repos_url": "https://api.github.com/users/wwjCMP/repos", "events_url": "https://api.github.com/users/wwjCMP/events{/privacy}", "received_events_url": "https://api.github.com/users/wwjCMP/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5789807732, "node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA", "url": "https://api.github.com/repos/ollama/ollama/labels/model%20request", "name": "model request", "color": "1E5DE6", "default": false, "description": "Model requests" } ]
closed
false
null
[]
null
8
2024-06-02T03:06:55
2024-11-05T19:02:18
2024-10-30T23:20:29
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
https://huggingface.co/jinaai/jina-embeddings-v2-base-zh
{ "login": "wwjCMP", "id": 32979859, "node_id": "MDQ6VXNlcjMyOTc5ODU5", "avatar_url": "https://avatars.githubusercontent.com/u/32979859?v=4", "gravatar_id": "", "url": "https://api.github.com/users/wwjCMP", "html_url": "https://github.com/wwjCMP", "followers_url": "https://api.github.com/users/wwjCMP/followers", "following_url": "https://api.github.com/users/wwjCMP/following{/other_user}", "gists_url": "https://api.github.com/users/wwjCMP/gists{/gist_id}", "starred_url": "https://api.github.com/users/wwjCMP/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wwjCMP/subscriptions", "organizations_url": "https://api.github.com/users/wwjCMP/orgs", "repos_url": "https://api.github.com/users/wwjCMP/repos", "events_url": "https://api.github.com/users/wwjCMP/events{/privacy}", "received_events_url": "https://api.github.com/users/wwjCMP/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/4778/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/4778/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/2052
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2052/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2052/comments
https://api.github.com/repos/ollama/ollama/issues/2052/events
https://github.com/ollama/ollama/issues/2052
2,088,578,302
I_kwDOJ0Z1Ps58fSz-
2,052
Cannot interrupt `ollama serve` during prompt evaluation
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
1
2024-01-18T15:52:40
2024-03-11T18:10:53
2024-03-11T18:10:53
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
When trying to interrupt either generation or `ollama serve` with ctrl+c while the prompt is being evaluated, Ollama will wait until prompt evaluation is done
{ "login": "pdevine", "id": 75239, "node_id": "MDQ6VXNlcjc1MjM5", "avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4", "gravatar_id": "", "url": "https://api.github.com/users/pdevine", "html_url": "https://github.com/pdevine", "followers_url": "https://api.github.com/users/pdevine/followers", "following_url": "https://api.github.com/users/pdevine/following{/other_user}", "gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}", "starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pdevine/subscriptions", "organizations_url": "https://api.github.com/users/pdevine/orgs", "repos_url": "https://api.github.com/users/pdevine/repos", "events_url": "https://api.github.com/users/pdevine/events{/privacy}", "received_events_url": "https://api.github.com/users/pdevine/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2052/reactions", "total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2052/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/6504
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6504/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6504/comments
https://api.github.com/repos/ollama/ollama/issues/6504/events
https://github.com/ollama/ollama/pull/6504
2,485,432,496
PR_kwDOJ0Z1Ps55XTpO
6,504
openai: increase context window when max_tokens is provided
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
open
false
{ "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/users/ParthSareen/followers", "following_url": "https://api.github.com/users/ParthSareen/following{/other_user}", "gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}", "starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions", "organizations_url": "https://api.github.com/users/ParthSareen/orgs", "repos_url": "https://api.github.com/users/ParthSareen/repos", "events_url": "https://api.github.com/users/ParthSareen/events{/privacy}", "received_events_url": "https://api.github.com/users/ParthSareen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "ParthSareen", "id": 29360864, "node_id": "MDQ6VXNlcjI5MzYwODY0", "avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ParthSareen", "html_url": "https://github.com/ParthSareen", "followers_url": "https://api.github.com/users/ParthSareen/followers", "following_url": "https://api.github.com/users/ParthSareen/following{/other_user}", "gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}", "starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions", "organizations_url": "https://api.github.com/users/ParthSareen/orgs", "repos_url": "https://api.github.com/users/ParthSareen/repos", "events_url": "https://api.github.com/users/ParthSareen/events{/privacy}", "received_events_url": "https://api.github.com/users/ParthSareen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
9
2024-08-25T19:36:12
2025-01-30T00:08:22
null
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/6504", "html_url": "https://github.com/ollama/ollama/pull/6504", "diff_url": "https://github.com/ollama/ollama/pull/6504.diff", "patch_url": "https://github.com/ollama/ollama/pull/6504.patch", "merged_at": null }
Previously, `/v1/chat/completions` requests were limited to 2048 tokens. This PR extends the context length by setting `num_ctx` to `max_tokens` if it's larger than the default context window of 2048 tokens. It also includes a minor clean up for the OpenAI compatibility unit tests. Note: this doesn't solve the case of having a large context window while limiting the number of tokens to a small number. This will be solved in a future change where `num_ctx` will be set automatically based on available VRAM and compute. Fixes https://github.com/ollama/ollama/issues/6286 https://github.com/ollama/ollama/issues/5356
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6504/reactions", "total_count": 37, "+1": 25, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 12, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6504/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/5442
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5442/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5442/comments
https://api.github.com/repos/ollama/ollama/issues/5442/events
https://github.com/ollama/ollama/pull/5442
2,387,027,329
PR_kwDOJ0Z1Ps50PbQf
5,442
Add windows radeon concurrency note
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-07-02T19:46:39
2024-07-02T19:47:51
2024-07-02T19:47:47
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5442", "html_url": "https://github.com/ollama/ollama/pull/5442", "diff_url": "https://github.com/ollama/ollama/pull/5442.diff", "patch_url": "https://github.com/ollama/ollama/pull/5442.patch", "merged_at": "2024-07-02T19:47:47" }
null
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5442/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5442/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/7953
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7953/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7953/comments
https://api.github.com/repos/ollama/ollama/issues/7953/events
https://github.com/ollama/ollama/issues/7953
2,721,104,729
I_kwDOJ0Z1Ps6iMMNZ
7,953
Ollama not using GPU (windows)
{ "login": "stormcoph", "id": 122960804, "node_id": "U_kgDOB1Q7pA", "avatar_url": "https://avatars.githubusercontent.com/u/122960804?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stormcoph", "html_url": "https://github.com/stormcoph", "followers_url": "https://api.github.com/users/stormcoph/followers", "following_url": "https://api.github.com/users/stormcoph/following{/other_user}", "gists_url": "https://api.github.com/users/stormcoph/gists{/gist_id}", "starred_url": "https://api.github.com/users/stormcoph/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stormcoph/subscriptions", "organizations_url": "https://api.github.com/users/stormcoph/orgs", "repos_url": "https://api.github.com/users/stormcoph/repos", "events_url": "https://api.github.com/users/stormcoph/events{/privacy}", "received_events_url": "https://api.github.com/users/stormcoph/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
null
[]
null
5
2024-12-05T18:20:25
2024-12-05T19:47:15
2024-12-05T19:47:15
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? now i know there has been a lot other issues about this problem and most of them has been solved but i have not found an answer for my specific scenario, most of the cases has been on linux. In other threads i seen people say its because they don't have enough vram that it automatically uses the cpu, but i have 24gb vram so I don't think that is the case now when i run "nvidia-smi" it returns: ``` Cuda compilation tools, release 11.8, V11.8.89 Build cuda_11.8.r11.8/compiler.31833905_0 C:\Users\storm>nvidia-smi Thu Dec 5 19:09:21 2024 +-----------------------------------------------------------------------------------------+ | NVIDIA-SMI 552.22 Driver Version: 552.22 CUDA Version: 12.4 | |-----------------------------------------+------------------------+----------------------+ | GPU Name TCC/WDDM | Bus-Id Disp.A | Volatile Uncorr. ECC | | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | | | | MIG M. | |=========================================+========================+======================| | 0 NVIDIA GeForce RTX 4090 WDDM | 00000000:05:00.0 On | Off | | 0% 48C P5 48W / 450W | 2852MiB / 24564MiB | 42% Default | | | | N/A | +-----------------------------------------+------------------------+----------------------+ +-----------------------------------------------------------------------------------------+ | Processes: | | GPU GI CI PID Type Process name GPU Memory | | ID ID Usage | |=========================================================================================| | 0 N/A N/A 2000 C+G ...m Files\Mozilla Firefox\firefox.exe N/A | | 0 N/A N/A 2500 C+G ...ys\WinUI3Apps\PowerToys.Peek.UI.exe N/A | | 0 N/A N/A 5380 C+G ...2txyewy\StartMenuExperienceHost.exe N/A | | 0 N/A N/A 6540 C+G ...werToys\PowerToys.ColorPickerUI.exe N/A | | 0 N/A N/A 7452 C+G ...on\wallpaper_engine\wallpaper32.exe N/A | | 0 N/A N/A 8420 C+G ...s\moments\SteelSeriesCaptureSvc.exe N/A | | 0 N/A N/A 8568 C+G C:\Windows\explorer.exe N/A | | 0 N/A N/A 
10148 C+G ...nt.CBS_cw5n1h2txyewy\SearchHost.exe N/A | | 0 N/A N/A 10260 C+G ..._x64__cw5n1h2txyewy\WidgetBoard.exe N/A | | 0 N/A N/A 11108 C+G ...paper_engine\bin\webwallpaper32.exe N/A | | 0 N/A N/A 12212 C+G ...on\131.0.2903.70\msedgewebview2.exe N/A | | 0 N/A N/A 13548 C+G ...siveControlPanel\SystemSettings.exe N/A | | 0 N/A N/A 13828 C+G ...\PowerToys\PowerToys.FancyZones.exe N/A | | 0 N/A N/A 14148 C+G ...UI3Apps\PowerToys.AdvancedPaste.exe N/A | | 0 N/A N/A 14420 C+G ...werToys\PowerToys.PowerLauncher.exe N/A | | 0 N/A N/A 14968 C+G ...CBS_cw5n1h2txyewy\TextInputHost.exe N/A | | 0 N/A N/A 15936 C+G ...ekyb3d8bbwe\PhoneExperienceHost.exe N/A | | 0 N/A N/A 17756 C+G ...indows-x64\jre-legacy\bin\javaw.exe N/A | | 0 N/A N/A 18968 C+G ...m Files\Mozilla Firefox\firefox.exe N/A | | 0 N/A N/A 20156 C+G ...al\Discord\app-1.0.9173\Discord.exe N/A | | 0 N/A N/A 20564 C+G ...pic Games\CrosshairX\CrosshairX.exe N/A | | 0 N/A N/A 20992 C+G ...64__v826wp6bftszj\TranslucentTB.exe N/A | | 0 N/A N/A 24412 C+G ...5n1h2txyewy\ShellExperienceHost.exe N/A | | 0 N/A N/A 24500 C+G ...on\131.0.2903.70\msedgewebview2.exe N/A | | 0 N/A N/A 25788 C+G ...__8wekyb3d8bbwe\WindowsTerminal.exe N/A | +-----------------------------------------------------------------------------------------+ C:\Users\storm> ``` ### OS Windows ### GPU Nvidia ### CPU AMD ### Ollama version 0.4.7
{ "login": "stormcoph", "id": 122960804, "node_id": "U_kgDOB1Q7pA", "avatar_url": "https://avatars.githubusercontent.com/u/122960804?v=4", "gravatar_id": "", "url": "https://api.github.com/users/stormcoph", "html_url": "https://github.com/stormcoph", "followers_url": "https://api.github.com/users/stormcoph/followers", "following_url": "https://api.github.com/users/stormcoph/following{/other_user}", "gists_url": "https://api.github.com/users/stormcoph/gists{/gist_id}", "starred_url": "https://api.github.com/users/stormcoph/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/stormcoph/subscriptions", "organizations_url": "https://api.github.com/users/stormcoph/orgs", "repos_url": "https://api.github.com/users/stormcoph/repos", "events_url": "https://api.github.com/users/stormcoph/events{/privacy}", "received_events_url": "https://api.github.com/users/stormcoph/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7953/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7953/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/82
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/82/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/82/comments
https://api.github.com/repos/ollama/ollama/issues/82/events
https://github.com/ollama/ollama/pull/82
1,805,757,906
PR_kwDOJ0Z1Ps5VkVEJ
82
windows build
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2023-07-15T00:31:56
2023-07-23T10:31:22
2023-07-15T03:11:55
CONTRIBUTOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/82", "html_url": "https://github.com/ollama/ollama/pull/82", "diff_url": "https://github.com/ollama/ollama/pull/82.diff", "patch_url": "https://github.com/ollama/ollama/pull/82.patch", "merged_at": "2023-07-15T03:11:55" }
make some minor changes so it builds on windows. TODO: moving the `.*.part` to the full file isn't working correctly but #78 will change how that works so temporary workaround is to copy the `.*.part` to the final name
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/82/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/82/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6987
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6987/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6987/comments
https://api.github.com/repos/ollama/ollama/issues/6987/events
https://github.com/ollama/ollama/issues/6987
2,551,333,987
I_kwDOJ0Z1Ps6YEkRj
6,987
Way to stop all running models
{ "login": "amytimed", "id": 52638772, "node_id": "MDQ6VXNlcjUyNjM4Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/52638772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/amytimed", "html_url": "https://github.com/amytimed", "followers_url": "https://api.github.com/users/amytimed/followers", "following_url": "https://api.github.com/users/amytimed/following{/other_user}", "gists_url": "https://api.github.com/users/amytimed/gists{/gist_id}", "starred_url": "https://api.github.com/users/amytimed/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/amytimed/subscriptions", "organizations_url": "https://api.github.com/users/amytimed/orgs", "repos_url": "https://api.github.com/users/amytimed/repos", "events_url": "https://api.github.com/users/amytimed/events{/privacy}", "received_events_url": "https://api.github.com/users/amytimed/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
5
2024-09-26T19:01:49
2025-01-18T16:03:59
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
`ollama stop all` or `ollama stop *` etc would be handy
{ "login": "amytimed", "id": 52638772, "node_id": "MDQ6VXNlcjUyNjM4Nzcy", "avatar_url": "https://avatars.githubusercontent.com/u/52638772?v=4", "gravatar_id": "", "url": "https://api.github.com/users/amytimed", "html_url": "https://github.com/amytimed", "followers_url": "https://api.github.com/users/amytimed/followers", "following_url": "https://api.github.com/users/amytimed/following{/other_user}", "gists_url": "https://api.github.com/users/amytimed/gists{/gist_id}", "starred_url": "https://api.github.com/users/amytimed/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/amytimed/subscriptions", "organizations_url": "https://api.github.com/users/amytimed/orgs", "repos_url": "https://api.github.com/users/amytimed/repos", "events_url": "https://api.github.com/users/amytimed/events{/privacy}", "received_events_url": "https://api.github.com/users/amytimed/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6987/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6987/timeline
null
reopened
false
https://api.github.com/repos/ollama/ollama/issues/2899
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/2899/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/2899/comments
https://api.github.com/repos/ollama/ollama/issues/2899/events
https://github.com/ollama/ollama/issues/2899
2,165,487,684
I_kwDOJ0Z1Ps6BErhE
2,899
SIGSEGV: segmentation violation with 0.1.27 and starcoder2:15b
{ "login": "joseluisamado", "id": 7227984, "node_id": "MDQ6VXNlcjcyMjc5ODQ=", "avatar_url": "https://avatars.githubusercontent.com/u/7227984?v=4", "gravatar_id": "", "url": "https://api.github.com/users/joseluisamado", "html_url": "https://github.com/joseluisamado", "followers_url": "https://api.github.com/users/joseluisamado/followers", "following_url": "https://api.github.com/users/joseluisamado/following{/other_user}", "gists_url": "https://api.github.com/users/joseluisamado/gists{/gist_id}", "starred_url": "https://api.github.com/users/joseluisamado/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/joseluisamado/subscriptions", "organizations_url": "https://api.github.com/users/joseluisamado/orgs", "repos_url": "https://api.github.com/users/joseluisamado/repos", "events_url": "https://api.github.com/users/joseluisamado/events{/privacy}", "received_events_url": "https://api.github.com/users/joseluisamado/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
5
2024-03-03T18:37:46
2024-03-09T01:00:53
2024-03-03T19:15:07
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hello, I am having an issue while running the recently released `starcoder2` model. Please find below the details. **How to reproduce the issue** Running ollama in a container with Docker Compose ``` ollama: image: ollama/ollama:0.1.27 container_name: ollama restart: unless-stopped networks: - common volumes: - /home/ollama:/root/.ollama ports: - "11434:11434" deploy: resources: reservations: devices: - driver: nvidia count: 1 capabilities: [gpu] ``` Pull and run the model ``` $ docker exec ollama ollama pull starcoder2:15b pulling manifest pulling dc5deb763c38... 100% ▕████████████████▏ 9.1 GB pulling 4ec42cd966c9... 100% ▕████████████████▏ 12 KB pulling 5671842f8d52... 100% ▕████████████████▏ 346 B verifying sha256 digest writing manifest removing any unused layers success $ docker exec ollama ollama run starcoder2:15b Error: Post "http://0.0.0.0:11434/api/generate": EOF ``` The container gets killed. Here are the logs ``` time=2024-03-03T18:20:36.119Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2" time=2024-03-03T18:20:36.119Z level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 8.9" time=2024-03-03T18:20:36.119Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2" time=2024-03-03T18:20:36.119Z level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 8.9" time=2024-03-03T18:20:36.119Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2" time=2024-03-03T18:20:36.123Z level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: /tmp/ollama3819308268/cuda_v11/libext_server.so" time=2024-03-03T18:20:36.123Z level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server" ggml_init_cublas: GGML_CUDA_FORCE_MMQ: yes ggml_init_cublas: CUDA_USE_TENSOR_CORES: no ggml_init_cublas: found 1 CUDA devices: Device 0: NVIDIA GeForce RTX 4080 SUPER, compute capability 8.9, VMM: yes SIGSEGV: segmentation violation PC=0x7fcb8138e7fd m=12 sigcode=1 signal arrived during cgo execution goroutine 50 [syscall]: 
runtime.cgocall(0x9bcdd0, 0xc0001446c8) /usr/local/go/src/runtime/cgocall.go:157 +0x4b fp=0xc0001446a0 sp=0xc000144668 pc=0x409b0b github.com/jmorganca/ollama/llm._Cfunc_dyn_llama_server_init({0x7fcb0c001e10, 0x7fcb05250250, 0x7fcb05240ca0, 0x7fcb05243f20, 0x7fcb05253c00, 0x7fcb0524d840, 0x7fcb05243ba0, 0x7fcb05240d20, 0x7fcb05254500, 0x7fcb052537a0, ...}, ...) _cgo_gotypes.go:282 +0x45 fp=0xc0001446c8 sp=0xc0001446a0 pc=0x7c5485 github.com/jmorganca/ollama/llm.newDynExtServer.func7(0xaf20c4?, 0xc?) /go/src/github.com/jmorganca/ollama/llm/dyn_ext_server.go:153 +0xef fp=0xc0001447b8 sp=0xc0001446c8 pc=0x7c69cf github.com/jmorganca/ollama/llm.newDynExtServer({0xc00069e000, 0x2f}, {0xc000002150, _}, {_, _, _}, {0x0, 0x0, 0x0}, ...) /go/src/github.com/jmorganca/ollama/llm/dyn_ext_server.go:153 +0xa65 fp=0xc000144a58 sp=0xc0001447b8 pc=0x7c6665 github.com/jmorganca/ollama/llm.newLlmServer({{_, _, _}, {_, _}, {_, _}}, {_, _}, {0xc000002150, ...}, ...) /go/src/github.com/jmorganca/ollama/llm/llm.go:158 +0x425 fp=0xc000144c18 sp=0xc000144a58 pc=0x7c2dc5 github.com/jmorganca/ollama/llm.New({0xc000282a50, 0x15}, {0xc000002150, _}, {_, _, _}, {0x0, 0x0, 0x0}, ...) /go/src/github.com/jmorganca/ollama/llm/llm.go:123 +0x713 fp=0xc000144e98 sp=0xc000144c18 pc=0x7c2733 github.com/jmorganca/ollama/server.load(0xc0000fe900?, 0xc0000fe900, {{0x0, 0x800, 0x200, 0x1, 0xffffffffffffffff, 0x0, 0x0, 0x1, ...}, ...}, ...) /go/src/github.com/jmorganca/ollama/server/routes.go:85 +0x3a5 fp=0xc000145018 sp=0xc000144e98 pc=0x996945 github.com/jmorganca/ollama/server.ChatHandler(0xc000408600) /go/src/github.com/jmorganca/ollama/server/routes.go:1173 +0xa37 fp=0xc000145748 sp=0xc000145018 pc=0x9a1f77 github.com/gin-gonic/gin.(*Context).Next(...) 
/root/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174 github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func1(0xc000408600) /go/src/github.com/jmorganca/ollama/server/routes.go:943 +0x68 fp=0xc000145780 sp=0xc000145748 pc=0x9a07a8 github.com/gin-gonic/gin.(*Context).Next(...) /root/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174 github.com/gin-gonic/gin.CustomRecoveryWithWriter.func1(0xc000408600) /root/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/recovery.go:102 +0x7a fp=0xc0001457d0 sp=0xc000145780 pc=0x97803a github.com/gin-gonic/gin.(*Context).Next(...) /root/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174 github.com/gin-gonic/gin.LoggerWithConfig.func1(0xc000408600) /root/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/logger.go:240 +0xde fp=0xc000145980 sp=0xc0001457d0 pc=0x9771de github.com/gin-gonic/gin.(*Context).Next(...) /root/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174 github.com/gin-gonic/gin.(*Engine).handleHTTPRequest(0xc0003f64e0, 0xc000408600) /root/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/gin.go:620 +0x65b fp=0xc000145b08 sp=0xc000145980 pc=0x97629b github.com/gin-gonic/gin.(*Engine).ServeHTTP(0xc0003f64e0, {0x11403a20?, 0xc00011c0e0}, 0xc000408500) /root/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/gin.go:576 +0x1dd fp=0xc000145b48 sp=0xc000145b08 pc=0x975a5d net/http.serverHandler.ServeHTTP({0x11401d40?}, {0x11403a20?, 0xc00011c0e0?}, 0x6?) 
/usr/local/go/src/net/http/server.go:2938 +0x8e fp=0xc000145b78 sp=0xc000145b48 pc=0x6ced4e net/http.(*conn).serve(0xc0000fc1b0, {0x11405088, 0xc0001542a0}) /usr/local/go/src/net/http/server.go:2009 +0x5f4 fp=0xc000145fb8 sp=0xc000145b78 pc=0x6cac34 net/http.(*Server).Serve.func3() /usr/local/go/src/net/http/server.go:3086 +0x28 fp=0xc000145fe0 sp=0xc000145fb8 pc=0x6cf568 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000145fe8 sp=0xc000145fe0 pc=0x46e2c1 created by net/http.(*Server).Serve in goroutine 1 /usr/local/go/src/net/http/server.go:3086 +0x5cb goroutine 1 [IO wait]: runtime.gopark(0x480f10?, 0xc0000e3850?, 0xa0?, 0x38?, 0x4f711d?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc0000e3830 sp=0xc0000e3810 pc=0x43e7ee runtime.netpollblock(0x0?, 0x4092a6?, 0x0?) /usr/local/go/src/runtime/netpoll.go:564 +0xf7 fp=0xc0000e3868 sp=0xc0000e3830 pc=0x437277 internal/poll.runtime_pollWait(0x7fcb38dfde28, 0x72) /usr/local/go/src/runtime/netpoll.go:343 +0x85 fp=0xc0000e3888 sp=0xc0000e3868 pc=0x468a05 internal/poll.(*pollDesc).wait(0xc00040a000?, 0x11846a68?, 0x0) /usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc0000e38b0 sp=0xc0000e3888 pc=0x4efd67 internal/poll.(*pollDesc).waitRead(...) /usr/local/go/src/internal/poll/fd_poll_runtime.go:89 internal/poll.(*FD).Accept(0xc00040a000) /usr/local/go/src/internal/poll/fd_unix.go:611 +0x2ac fp=0xc0000e3958 sp=0xc0000e38b0 pc=0x4f524c net.(*netFD).accept(0xc00040a000) /usr/local/go/src/net/fd_unix.go:172 +0x29 fp=0xc0000e3a10 sp=0xc0000e3958 pc=0x56be29 net.(*TCPListener).accept(0xc0003e1560) /usr/local/go/src/net/tcpsock_posix.go:152 +0x1e fp=0xc0000e3a38 sp=0xc0000e3a10 pc=0x580c3e net.(*TCPListener).Accept(0xc0003e1560) /usr/local/go/src/net/tcpsock.go:315 +0x30 fp=0xc0000e3a68 sp=0xc0000e3a38 pc=0x57fdf0 net/http.(*onceCloseListener).Accept(0xc0000fc1b0?) 
<autogenerated>:1 +0x24 fp=0xc0000e3a80 sp=0xc0000e3a68 pc=0x6f1ae4 net/http.(*Server).Serve(0xc000318ff0, {0x11403810, 0xc0003e1560}) /usr/local/go/src/net/http/server.go:3056 +0x364 fp=0xc0000e3bb0 sp=0xc0000e3a80 pc=0x6cf1a4 github.com/jmorganca/ollama/server.Serve({0x11403810, 0xc0003e1560}) /go/src/github.com/jmorganca/ollama/server/routes.go:1046 +0x454 fp=0xc0000e3c98 sp=0xc0000e3bb0 pc=0x9a0c54 github.com/jmorganca/ollama/cmd.RunServer(0xc000408300?, {0x1184c8c0?, 0x4?, 0xad9d6a?}) /go/src/github.com/jmorganca/ollama/cmd/cmd.go:706 +0x1b9 fp=0xc0000e3d30 sp=0xc0000e3c98 pc=0x9b3d99 github.com/spf13/cobra.(*Command).execute(0xc0003a1800, {0x1184c8c0, 0x0, 0x0}) /root/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:940 +0x87c fp=0xc0000e3e68 sp=0xc0000e3d30 pc=0x764d9c github.com/spf13/cobra.(*Command).ExecuteC(0xc0003a0c00) /root/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5 fp=0xc0000e3f20 sp=0xc0000e3e68 pc=0x7655c5 github.com/spf13/cobra.(*Command).Execute(...) /root/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:992 github.com/spf13/cobra.(*Command).ExecuteContext(...) /root/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:985 main.main() /go/src/github.com/jmorganca/ollama/main.go:11 +0x4d fp=0xc0000e3f40 sp=0xc0000e3f20 pc=0x9bbeed runtime.main() /usr/local/go/src/runtime/proc.go:267 +0x2bb fp=0xc0000e3fe0 sp=0xc0000e3f40 pc=0x43e39b runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000e3fe8 sp=0xc0000e3fe0 pc=0x46e2c1 goroutine 2 [force gc (idle), 2 minutes]: runtime.gopark(0x16e8d765f29c?, 0x0?, 0x0?, 0x0?, 0x0?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000046fa8 sp=0xc000046f88 pc=0x43e7ee runtime.goparkunlock(...) 
/usr/local/go/src/runtime/proc.go:404 runtime.forcegchelper() /usr/local/go/src/runtime/proc.go:322 +0xb3 fp=0xc000046fe0 sp=0xc000046fa8 pc=0x43e673 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000046fe8 sp=0xc000046fe0 pc=0x46e2c1 created by runtime.init.6 in goroutine 1 /usr/local/go/src/runtime/proc.go:310 +0x1a goroutine 3 [GC sweep wait]: runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000047778 sp=0xc000047758 pc=0x43e7ee runtime.goparkunlock(...) /usr/local/go/src/runtime/proc.go:404 runtime.bgsweep(0x0?) /usr/local/go/src/runtime/mgcsweep.go:321 +0xdf fp=0xc0000477c8 sp=0xc000047778 pc=0x42a73f runtime.gcenable.func1() /usr/local/go/src/runtime/mgc.go:200 +0x25 fp=0xc0000477e0 sp=0xc0000477c8 pc=0x41f865 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000477e8 sp=0xc0000477e0 pc=0x46e2c1 created by runtime.gcenable in goroutine 1 /usr/local/go/src/runtime/mgc.go:200 +0x66 goroutine 4 [GC scavenge wait]: runtime.gopark(0xe910d?, 0x16176886?, 0x0?, 0x0?, 0x0?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000047f70 sp=0xc000047f50 pc=0x43e7ee runtime.goparkunlock(...) /usr/local/go/src/runtime/proc.go:404 runtime.(*scavengerState).park(0x1181cc40) /usr/local/go/src/runtime/mgcscavenge.go:425 +0x49 fp=0xc000047fa0 sp=0xc000047f70 pc=0x427f69 runtime.bgscavenge(0x0?) /usr/local/go/src/runtime/mgcscavenge.go:658 +0x59 fp=0xc000047fc8 sp=0xc000047fa0 pc=0x428519 runtime.gcenable.func2() /usr/local/go/src/runtime/mgc.go:201 +0x25 fp=0xc000047fe0 sp=0xc000047fc8 pc=0x41f805 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000047fe8 sp=0xc000047fe0 pc=0x46e2c1 created by runtime.gcenable in goroutine 1 /usr/local/go/src/runtime/mgc.go:201 +0xa5 goroutine 18 [finalizer wait, 6 minutes]: runtime.gopark(0xad2d20?, 0x10043f901?, 0x0?, 0x0?, 0x4469a5?) 
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000046628 sp=0xc000046608 pc=0x43e7ee runtime.runfinq() /usr/local/go/src/runtime/mfinal.go:193 +0x107 fp=0xc0000467e0 sp=0xc000046628 pc=0x41e8e7 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000467e8 sp=0xc0000467e0 pc=0x46e2c1 created by runtime.createfing in goroutine 1 /usr/local/go/src/runtime/mfinal.go:163 +0x3d goroutine 19 [select, 6 minutes, locked to thread]: runtime.gopark(0xc0000427a8?, 0x2?, 0x89?, 0xea?, 0xc0000427a4?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000042638 sp=0xc000042618 pc=0x43e7ee runtime.selectgo(0xc0000427a8, 0xc0000427a0, 0x0?, 0x0, 0x0?, 0x1) /usr/local/go/src/runtime/select.go:327 +0x725 fp=0xc000042758 sp=0xc000042638 pc=0x44e325 runtime.ensureSigM.func1() /usr/local/go/src/runtime/signal_unix.go:1014 +0x19f fp=0xc0000427e0 sp=0xc000042758 pc=0x46535f runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000427e8 sp=0xc0000427e0 pc=0x46e2c1 created by runtime.ensureSigM in goroutine 1 /usr/local/go/src/runtime/signal_unix.go:997 +0xc8 goroutine 5 [syscall, 6 minutes]: runtime.notetsleepg(0x0?, 0x0?) /usr/local/go/src/runtime/lock_futex.go:236 +0x29 fp=0xc0000487a0 sp=0xc000048768 pc=0x411349 os/signal.signal_recv() /usr/local/go/src/runtime/sigqueue.go:152 +0x29 fp=0xc0000487c0 sp=0xc0000487a0 pc=0x46ac89 os/signal.loop() /usr/local/go/src/os/signal/signal_unix.go:23 +0x13 fp=0xc0000487e0 sp=0xc0000487c0 pc=0x6f4513 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000487e8 sp=0xc0000487e0 pc=0x46e2c1 created by os/signal.Notify.func1.1 in goroutine 1 /usr/local/go/src/os/signal/signal.go:151 +0x1f goroutine 20 [chan receive, 6 minutes]: runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?) 
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000042f18 sp=0xc000042ef8 pc=0x43e7ee runtime.chanrecv(0xc000436120, 0x0, 0x1) /usr/local/go/src/runtime/chan.go:583 +0x3cd fp=0xc000042f90 sp=0xc000042f18 pc=0x40beed runtime.chanrecv1(0x0?, 0x0?) /usr/local/go/src/runtime/chan.go:442 +0x12 fp=0xc000042fb8 sp=0xc000042f90 pc=0x40baf2 github.com/jmorganca/ollama/server.Serve.func2() /go/src/github.com/jmorganca/ollama/server/routes.go:1028 +0x25 fp=0xc000042fe0 sp=0xc000042fb8 pc=0x9a0ce5 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000042fe8 sp=0xc000042fe0 pc=0x46e2c1 created by github.com/jmorganca/ollama/server.Serve in goroutine 1 /go/src/github.com/jmorganca/ollama/server/routes.go:1027 +0x3c7 goroutine 30 [GC worker (idle)]: runtime.gopark(0x16f2d48eda1f?, 0x3?, 0x9d?, 0x72?, 0x0?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc00043d750 sp=0xc00043d730 pc=0x43e7ee runtime.gcBgMarkWorker() /usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc00043d7e0 sp=0xc00043d750 pc=0x4213e5 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00043d7e8 sp=0xc00043d7e0 pc=0x46e2c1 created by runtime.gcBgMarkStartWorkers in goroutine 29 /usr/local/go/src/runtime/mgc.go:1217 +0x1c goroutine 31 [GC worker (idle)]: runtime.gopark(0x16f2d6231cd4?, 0x3?, 0xf7?, 0x20?, 0x0?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc00043df50 sp=0xc00043df30 pc=0x43e7ee runtime.gcBgMarkWorker() /usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc00043dfe0 sp=0xc00043df50 pc=0x4213e5 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00043dfe8 sp=0xc00043dfe0 pc=0x46e2c1 created by runtime.gcBgMarkStartWorkers in goroutine 29 /usr/local/go/src/runtime/mgc.go:1217 +0x1c goroutine 32 [GC worker (idle)]: runtime.gopark(0x16f2d6231cb1?, 0x3?, 0x7?, 0xb?, 0x0?) 
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc00043e750 sp=0xc00043e730 pc=0x43e7ee runtime.gcBgMarkWorker() /usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc00043e7e0 sp=0xc00043e750 pc=0x4213e5 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00043e7e8 sp=0xc00043e7e0 pc=0x46e2c1 created by runtime.gcBgMarkStartWorkers in goroutine 29 /usr/local/go/src/runtime/mgc.go:1217 +0x1c goroutine 34 [GC worker (idle)]: runtime.gopark(0x16f2d6b56164?, 0x2?, 0x55?, 0x12?, 0x0?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000438750 sp=0xc000438730 pc=0x43e7ee runtime.gcBgMarkWorker() /usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc0004387e0 sp=0xc000438750 pc=0x4213e5 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0004387e8 sp=0xc0004387e0 pc=0x46e2c1 created by runtime.gcBgMarkStartWorkers in goroutine 29 /usr/local/go/src/runtime/mgc.go:1217 +0x1c goroutine 33 [GC worker (idle)]: runtime.gopark(0x16f2d6b56114?, 0x2?, 0x17?, 0x8f?, 0x0?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc00043ef50 sp=0xc00043ef30 pc=0x43e7ee runtime.gcBgMarkWorker() /usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc00043efe0 sp=0xc00043ef50 pc=0x4213e5 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00043efe8 sp=0xc00043efe0 pc=0x46e2c1 created by runtime.gcBgMarkStartWorkers in goroutine 29 /usr/local/go/src/runtime/mgc.go:1217 +0x1c goroutine 35 [GC worker (idle)]: runtime.gopark(0x16f2d6b5645a?, 0x3?, 0xc1?, 0xd?, 0x0?) /usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc000438f50 sp=0xc000438f30 pc=0x43e7ee runtime.gcBgMarkWorker() /usr/local/go/src/runtime/mgc.go:1293 +0xe5 fp=0xc000438fe0 sp=0xc000438f50 pc=0x4213e5 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000438fe8 sp=0xc000438fe0 pc=0x46e2c1 created by runtime.gcBgMarkStartWorkers in goroutine 29 /usr/local/go/src/runtime/mgc.go:1217 +0x1c goroutine 51 [IO wait]: runtime.gopark(0x11410780?, 0xb?, 0x0?, 0x0?, 0xc?) 
/usr/local/go/src/runtime/proc.go:398 +0xce fp=0xc00043cda0 sp=0xc00043cd80 pc=0x43e7ee runtime.netpollblock(0x47f078?, 0x4092a6?, 0x0?) /usr/local/go/src/runtime/netpoll.go:564 +0xf7 fp=0xc00043cdd8 sp=0xc00043cda0 pc=0x437277 internal/poll.runtime_pollWait(0x7fcb38dfdd30, 0x72) /usr/local/go/src/runtime/netpoll.go:343 +0x85 fp=0xc00043cdf8 sp=0xc00043cdd8 pc=0x468a05 internal/poll.(*pollDesc).wait(0xc000126000?, 0xc000154521?, 0x0) /usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc00043ce20 sp=0xc00043cdf8 pc=0x4efd67 internal/poll.(*pollDesc).waitRead(...) /usr/local/go/src/internal/poll/fd_poll_runtime.go:89 internal/poll.(*FD).Read(0xc000126000, {0xc000154521, 0x1, 0x1}) /usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a fp=0xc00043ceb8 sp=0xc00043ce20 pc=0x4f105a net.(*netFD).Read(0xc000126000, {0xc000154521?, 0x0?, 0xc000282a50?}) /usr/local/go/src/net/fd_posix.go:55 +0x25 fp=0xc00043cf00 sp=0xc00043ceb8 pc=0x569e05 net.(*conn).Read(0xc0004aa008, {0xc000154521?, 0x0?, 0xc00043cee0?}) /usr/local/go/src/net/net.go:179 +0x45 fp=0xc00043cf48 sp=0xc00043cf00 pc=0x5780a5 net.(*TCPConn).Read(0x0?, {0xc000154521?, 0x100000000000000?, 0xc0004342c0?}) <autogenerated>:1 +0x25 fp=0xc00043cf78 sp=0xc00043cf48 pc=0x589fa5 net/http.(*connReader).backgroundRead(0xc000154510) /usr/local/go/src/net/http/server.go:683 +0x37 fp=0xc00043cfc8 sp=0xc00043cf78 pc=0x6c4ab7 net/http.(*connReader).startBackgroundRead.func2() /usr/local/go/src/net/http/server.go:679 +0x25 fp=0xc00043cfe0 sp=0xc00043cfc8 pc=0x6c49e5 runtime.goexit() /usr/local/go/src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00043cfe8 sp=0xc00043cfe0 pc=0x46e2c1 created by net/http.(*connReader).startBackgroundRead in goroutine 50 /usr/local/go/src/net/http/server.go:679 +0xba rax 0x0 rbx 0x7fcb28ff8430 rcx 0x67 rdx 0x0 rdi 0x0 rsi 0x0 rbp 0x7fcb28ff83f0 rsp 0x7fcb28ff8258 r8 0x7fca482cfb00 r9 0x7fca482cfb28 r10 0x7fcb81200b40 r11 0x7fcb8138aa80 r12 0x0 r13 0x7fcb0c2026a8 r14 0x7fcb0c2044a0 r15 0x0 rip 
0x7fcb8138e7fd rflags 0x10283 cs 0x33 fs 0x0 gs 0x0 ``` Thanks!
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/2899/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/2899/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5186
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5186/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5186/comments
https://api.github.com/repos/ollama/ollama/issues/5186/events
https://github.com/ollama/ollama/issues/5186
2,364,767,500
I_kwDOJ0Z1Ps6M830M
5,186
AMD Ryzen NPU support
{ "login": "ivanbrash", "id": 91606112, "node_id": "U_kgDOBXXMYA", "avatar_url": "https://avatars.githubusercontent.com/u/91606112?v=4", "gravatar_id": "", "url": "https://api.github.com/users/ivanbrash", "html_url": "https://github.com/ivanbrash", "followers_url": "https://api.github.com/users/ivanbrash/followers", "following_url": "https://api.github.com/users/ivanbrash/following{/other_user}", "gists_url": "https://api.github.com/users/ivanbrash/gists{/gist_id}", "starred_url": "https://api.github.com/users/ivanbrash/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ivanbrash/subscriptions", "organizations_url": "https://api.github.com/users/ivanbrash/orgs", "repos_url": "https://api.github.com/users/ivanbrash/repos", "events_url": "https://api.github.com/users/ivanbrash/events{/privacy}", "received_events_url": "https://api.github.com/users/ivanbrash/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 6433346500, "node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA", "url": "https://api.github.com/repos/ollama/ollama/labels/amd", "name": "amd", "color": "000000", "default": false, "description": "Issues relating to AMD GPUs and ROCm" } ]
open
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
19
2024-06-20T15:48:13
2024-12-26T09:30:59
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Hello! I want to buy a Lenovo Xiaoxin 14 AI laptop with an AMD Ryzen 7 8845H for my birthday, and I will install Artix Linux on it. Will you add AMD Ryzen NPU support to Ollama on Linux and Windows? The AMD Ryzen NPU driver for Linux is already available on GitHub: https://github.com/amd/xdna-driver.git Sorry for my bad English!
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5186/reactions", "total_count": 27, "+1": 25, "-1": 0, "laugh": 0, "hooray": 2, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5186/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6426
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6426/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6426/comments
https://api.github.com/repos/ollama/ollama/issues/6426/events
https://github.com/ollama/ollama/issues/6426
2,474,168,354
I_kwDOJ0Z1Ps6TeNAi
6,426
convert: vocab conversion incorrect
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396184, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA", "url": "https://api.github.com/repos/ollama/ollama/labels/bug", "name": "bug", "color": "d73a4a", "default": true, "description": "Something isn't working" } ]
closed
false
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
1
2024-08-19T20:40:50
2024-08-21T18:48:44
2024-08-21T18:48:44
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
### What is the issue? To reproduce, clone https://huggingface.co/HuggingFaceTB/SmolLM-135M-Instruct then create a `Modelfile`: ``` FROM . ``` Lastly, create a model: ``` ollama create test ``` When trying to run the model, an assertion failure is triggered: ``` Error: llama runner process has terminated: GGML_ASSERT(vocab.id_to_token.size() == vocab.token_to_id.size()) failed ``` ### OS _No response_ ### GPU _No response_ ### CPU _No response_ ### Ollama version _No response_
{ "login": "mxyng", "id": 2372640, "node_id": "MDQ6VXNlcjIzNzI2NDA=", "avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mxyng", "html_url": "https://github.com/mxyng", "followers_url": "https://api.github.com/users/mxyng/followers", "following_url": "https://api.github.com/users/mxyng/following{/other_user}", "gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}", "starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mxyng/subscriptions", "organizations_url": "https://api.github.com/users/mxyng/orgs", "repos_url": "https://api.github.com/users/mxyng/repos", "events_url": "https://api.github.com/users/mxyng/events{/privacy}", "received_events_url": "https://api.github.com/users/mxyng/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6426/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6426/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/300
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/300/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/300/comments
https://api.github.com/repos/ollama/ollama/issues/300/events
https://github.com/ollama/ollama/issues/300
1,838,346,491
I_kwDOJ0Z1Ps5tkvD7
300
Provide a way to allow connections to Ollama from web browser origins other than `localhost` and `0.0.0.0`
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
closed
false
null
[]
null
14
2023-08-06T19:41:30
2025-01-29T14:27:00
2023-08-08T14:41:44
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Currently, Ollama has CORS rules that allow pages hosted on `localhost` to connect to `localhost:11434`. #282 adds support for `0.0.0.0`, but some hosted web pages want to leverage a locally running Ollama. Simply opening up CORS to all origins wouldn't be secure: any website could call the API by simply browsing to it. However, we should consider adding a way for users to "approve" an origin using their local Ollama instance, similar to [deep links](https://developer.apple.com/documentation/xcode/allowing-apps-and-websites-to-link-to-your-content) on iOS and macOS.
{ "login": "BruceMacD", "id": 5853428, "node_id": "MDQ6VXNlcjU4NTM0Mjg=", "avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4", "gravatar_id": "", "url": "https://api.github.com/users/BruceMacD", "html_url": "https://github.com/BruceMacD", "followers_url": "https://api.github.com/users/BruceMacD/followers", "following_url": "https://api.github.com/users/BruceMacD/following{/other_user}", "gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}", "starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions", "organizations_url": "https://api.github.com/users/BruceMacD/orgs", "repos_url": "https://api.github.com/users/BruceMacD/repos", "events_url": "https://api.github.com/users/BruceMacD/events{/privacy}", "received_events_url": "https://api.github.com/users/BruceMacD/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/300/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/300/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/953
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/953/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/953/comments
https://api.github.com/repos/ollama/ollama/issues/953/events
https://github.com/ollama/ollama/pull/953
1,969,295,567
PR_kwDOJ0Z1Ps5eK6GK
953
Changed name of folder and added some more features
{ "login": "suoko", "id": 3659980, "node_id": "MDQ6VXNlcjM2NTk5ODA=", "avatar_url": "https://avatars.githubusercontent.com/u/3659980?v=4", "gravatar_id": "", "url": "https://api.github.com/users/suoko", "html_url": "https://github.com/suoko", "followers_url": "https://api.github.com/users/suoko/followers", "following_url": "https://api.github.com/users/suoko/following{/other_user}", "gists_url": "https://api.github.com/users/suoko/gists{/gist_id}", "starred_url": "https://api.github.com/users/suoko/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/suoko/subscriptions", "organizations_url": "https://api.github.com/users/suoko/orgs", "repos_url": "https://api.github.com/users/suoko/repos", "events_url": "https://api.github.com/users/suoko/events{/privacy}", "received_events_url": "https://api.github.com/users/suoko/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2023-10-30T22:02:52
2024-11-21T08:48:20
2024-11-21T08:48:20
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/953", "html_url": "https://github.com/ollama/ollama/pull/953", "diff_url": "https://github.com/ollama/ollama/pull/953.diff", "patch_url": "https://github.com/ollama/ollama/pull/953.patch", "merged_at": null }
I added the model chooser
{ "login": "mchiang0610", "id": 3325447, "node_id": "MDQ6VXNlcjMzMjU0NDc=", "avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4", "gravatar_id": "", "url": "https://api.github.com/users/mchiang0610", "html_url": "https://github.com/mchiang0610", "followers_url": "https://api.github.com/users/mchiang0610/followers", "following_url": "https://api.github.com/users/mchiang0610/following{/other_user}", "gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}", "starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions", "organizations_url": "https://api.github.com/users/mchiang0610/orgs", "repos_url": "https://api.github.com/users/mchiang0610/repos", "events_url": "https://api.github.com/users/mchiang0610/events{/privacy}", "received_events_url": "https://api.github.com/users/mchiang0610/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/953/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/953/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6251
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6251/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6251/comments
https://api.github.com/repos/ollama/ollama/issues/6251/events
https://github.com/ollama/ollama/issues/6251
2,454,734,578
I_kwDOJ0Z1Ps6SUEby
6,251
Ollama multiuser scale
{ "login": "jamiabailey", "id": 85038220, "node_id": "MDQ6VXNlcjg1MDM4MjIw", "avatar_url": "https://avatars.githubusercontent.com/u/85038220?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jamiabailey", "html_url": "https://github.com/jamiabailey", "followers_url": "https://api.github.com/users/jamiabailey/followers", "following_url": "https://api.github.com/users/jamiabailey/following{/other_user}", "gists_url": "https://api.github.com/users/jamiabailey/gists{/gist_id}", "starred_url": "https://api.github.com/users/jamiabailey/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jamiabailey/subscriptions", "organizations_url": "https://api.github.com/users/jamiabailey/orgs", "repos_url": "https://api.github.com/users/jamiabailey/repos", "events_url": "https://api.github.com/users/jamiabailey/events{/privacy}", "received_events_url": "https://api.github.com/users/jamiabailey/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" }, { "id": 5667396220, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA", "url": "https://api.github.com/repos/ollama/ollama/labels/question", "name": "question", "color": "d876e3", "default": true, "description": "General questions" } ]
open
false
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false } ]
null
0
2024-08-08T02:21:08
2024-09-02T03:12:33
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I'm looking for some scale numbers on what ollama supports as far as multi-user environments go. I see the OLLAMA_NUM_PARALLEL for adjusting how many simultaneous requests can be served as well as OLLAMA_MAX_QUEUE for how many requests can be queued before being rejected but nothing that will help me understand how that directly relates to how to design a system that will serve a large number of users and how much GPU resources will be required to do so. Is Ollama a fit for large scale environments where there might be a very large number of users interacting with it without having to front end an endless number of Ollama instances in front of a load balancer VIP? Has anyone done some scale testing to help design larger scale designs using Ollama or is Ollama still mostly fitting solely into the desktop use case? Will containers help here or is it strictly an underlying GPU/memory issue? The cost for the servers underneath are not an issue for us. Just need scale. Please advise.
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6251/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 }
https://api.github.com/repos/ollama/ollama/issues/6251/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/6569
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6569/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6569/comments
https://api.github.com/repos/ollama/ollama/issues/6569/events
https://github.com/ollama/ollama/issues/6569
2,497,904,670
I_kwDOJ0Z1Ps6U4wAe
6,569
TensorRT Support
{ "login": "JonahMMay", "id": 119529402, "node_id": "U_kgDOBx_fug", "avatar_url": "https://avatars.githubusercontent.com/u/119529402?v=4", "gravatar_id": "", "url": "https://api.github.com/users/JonahMMay", "html_url": "https://github.com/JonahMMay", "followers_url": "https://api.github.com/users/JonahMMay/followers", "following_url": "https://api.github.com/users/JonahMMay/following{/other_user}", "gists_url": "https://api.github.com/users/JonahMMay/gists{/gist_id}", "starred_url": "https://api.github.com/users/JonahMMay/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/JonahMMay/subscriptions", "organizations_url": "https://api.github.com/users/JonahMMay/orgs", "repos_url": "https://api.github.com/users/JonahMMay/repos", "events_url": "https://api.github.com/users/JonahMMay/events{/privacy}", "received_events_url": "https://api.github.com/users/JonahMMay/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[ { "id": 5667396200, "node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA", "url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request", "name": "feature request", "color": "a2eeef", "default": false, "description": "New feature or request" } ]
open
false
null
[]
null
2
2024-08-30T18:00:15
2025-01-25T18:22:12
null
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
Does ollama leverage TensorRT and if not, can support for it be added?
null
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6569/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6569/timeline
null
null
false
https://api.github.com/repos/ollama/ollama/issues/7670
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/7670/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/7670/comments
https://api.github.com/repos/ollama/ollama/issues/7670/events
https://github.com/ollama/ollama/pull/7670
2,660,084,044
PR_kwDOJ0Z1Ps6B-Xjq
7,670
Vendor bump llama.cpp
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
2
2024-11-14T21:21:05
2024-12-02T19:02:09
2024-12-02T19:02:04
COLLABORATOR
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/7670", "html_url": "https://github.com/ollama/ollama/pull/7670", "diff_url": "https://github.com/ollama/ollama/pull/7670.diff", "patch_url": "https://github.com/ollama/ollama/pull/7670.patch", "merged_at": null }
Update to 4a8ccb37ad9c9027cbcfd5548c19cdffe48d5197
{ "login": "dhiltgen", "id": 4033016, "node_id": "MDQ6VXNlcjQwMzMwMTY=", "avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4", "gravatar_id": "", "url": "https://api.github.com/users/dhiltgen", "html_url": "https://github.com/dhiltgen", "followers_url": "https://api.github.com/users/dhiltgen/followers", "following_url": "https://api.github.com/users/dhiltgen/following{/other_user}", "gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}", "starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions", "organizations_url": "https://api.github.com/users/dhiltgen/orgs", "repos_url": "https://api.github.com/users/dhiltgen/repos", "events_url": "https://api.github.com/users/dhiltgen/events{/privacy}", "received_events_url": "https://api.github.com/users/dhiltgen/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/7670/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/7670/timeline
null
null
true
https://api.github.com/repos/ollama/ollama/issues/6876
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/6876/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/6876/comments
https://api.github.com/repos/ollama/ollama/issues/6876/events
https://github.com/ollama/ollama/issues/6876
2,535,981,484
I_kwDOJ0Z1Ps6XKAGs
6,876
Why models don't use full CPU power?
{ "login": "iladshyan", "id": 103303611, "node_id": "U_kgDOBihJuw", "avatar_url": "https://avatars.githubusercontent.com/u/103303611?v=4", "gravatar_id": "", "url": "https://api.github.com/users/iladshyan", "html_url": "https://github.com/iladshyan", "followers_url": "https://api.github.com/users/iladshyan/followers", "following_url": "https://api.github.com/users/iladshyan/following{/other_user}", "gists_url": "https://api.github.com/users/iladshyan/gists{/gist_id}", "starred_url": "https://api.github.com/users/iladshyan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/iladshyan/subscriptions", "organizations_url": "https://api.github.com/users/iladshyan/orgs", "repos_url": "https://api.github.com/users/iladshyan/repos", "events_url": "https://api.github.com/users/iladshyan/events{/privacy}", "received_events_url": "https://api.github.com/users/iladshyan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
8
2024-09-19T10:48:25
2024-09-20T11:59:53
2024-09-20T11:59:53
NONE
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
null
null
I have noticed in CPU-only use cases the models are not using the CPU to its full potential. Is there any way to make them utilize the full power?
{ "login": "iladshyan", "id": 103303611, "node_id": "U_kgDOBihJuw", "avatar_url": "https://avatars.githubusercontent.com/u/103303611?v=4", "gravatar_id": "", "url": "https://api.github.com/users/iladshyan", "html_url": "https://github.com/iladshyan", "followers_url": "https://api.github.com/users/iladshyan/followers", "following_url": "https://api.github.com/users/iladshyan/following{/other_user}", "gists_url": "https://api.github.com/users/iladshyan/gists{/gist_id}", "starred_url": "https://api.github.com/users/iladshyan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/iladshyan/subscriptions", "organizations_url": "https://api.github.com/users/iladshyan/orgs", "repos_url": "https://api.github.com/users/iladshyan/repos", "events_url": "https://api.github.com/users/iladshyan/events{/privacy}", "received_events_url": "https://api.github.com/users/iladshyan/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/6876/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/6876/timeline
null
completed
false
https://api.github.com/repos/ollama/ollama/issues/5492
https://api.github.com/repos/ollama/ollama
https://api.github.com/repos/ollama/ollama/issues/5492/labels{/name}
https://api.github.com/repos/ollama/ollama/issues/5492/comments
https://api.github.com/repos/ollama/ollama/issues/5492/events
https://github.com/ollama/ollama/pull/5492
2,391,661,938
PR_kwDOJ0Z1Ps50fNZD
5,492
Use slot with cached prompt instead of least recently used
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
[]
closed
false
null
[]
null
0
2024-07-05T02:34:53
2024-07-05T16:32:49
2024-07-05T16:32:47
MEMBER
{ "total": 0, "completed": 0, "percent_completed": 0 }
null
false
{ "url": "https://api.github.com/repos/ollama/ollama/pulls/5492", "html_url": "https://github.com/ollama/ollama/pull/5492", "diff_url": "https://github.com/ollama/ollama/pull/5492.diff", "patch_url": "https://github.com/ollama/ollama/pull/5492.patch", "merged_at": "2024-07-05T16:32:47" }
This chooses the slot with the longest common prompt prefix instead of selecting the least recently used slot – this maximizes cache time for a single "conversation". Future improvements: - [ ] Clone slots and their cache - [ ] Avoid requests "stealing" slots from each other because they have a small but common prefix - [ ] Account for context shifts in the cache matching
{ "login": "jmorganca", "id": 251292, "node_id": "MDQ6VXNlcjI1MTI5Mg==", "avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4", "gravatar_id": "", "url": "https://api.github.com/users/jmorganca", "html_url": "https://github.com/jmorganca", "followers_url": "https://api.github.com/users/jmorganca/followers", "following_url": "https://api.github.com/users/jmorganca/following{/other_user}", "gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}", "starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions", "organizations_url": "https://api.github.com/users/jmorganca/orgs", "repos_url": "https://api.github.com/users/jmorganca/repos", "events_url": "https://api.github.com/users/jmorganca/events{/privacy}", "received_events_url": "https://api.github.com/users/jmorganca/received_events", "type": "User", "user_view_type": "public", "site_admin": false }
{ "url": "https://api.github.com/repos/ollama/ollama/issues/5492/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }
https://api.github.com/repos/ollama/ollama/issues/5492/timeline
null
null
true