Dataset schema (GitHub issues from ollama/ollama), one entry per column:

- url: string (length 51–54)
- repository_url: string (1 class)
- labels_url: string (length 65–68)
- comments_url: string (length 60–63)
- events_url: string (length 58–61)
- html_url: string (length 39–44)
- id: int64 (1.78B–2.82B)
- node_id: string (length 18–19)
- number: int64 (1–8.69k)
- title: string (length 1–382)
- user: dict
- labels: list (length 0–5)
- state: string (2 classes)
- locked: bool (1 class)
- assignee: dict
- assignees: list (length 0–2)
- milestone: null
- comments: int64 (0–323)
- created_at: timestamp[s]
- updated_at: timestamp[s]
- closed_at: timestamp[s]
- author_association: string (4 classes)
- sub_issues_summary: dict
- active_lock_reason: null
- draft: bool (2 classes)
- pull_request: dict
- body: string (length 2–118k, nullable)
- closed_by: dict
- reactions: dict
- timeline_url: string (length 60–63)
- performed_via_github_app: null
- state_reason: string (4 classes)
- is_pull_request: bool (2 classes)
https://api.github.com/repos/ollama/ollama/issues/5297
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5297/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5297/comments
|
https://api.github.com/repos/ollama/ollama/issues/5297/events
|
https://github.com/ollama/ollama/issues/5297
| 2,375,176,567
|
I_kwDOJ0Z1Ps6NklF3
| 5,297
|
How to get same length of response from CLI and API?
|
{
"login": "dsbyprateekg",
"id": 30830541,
"node_id": "MDQ6VXNlcjMwODMwNTQx",
"avatar_url": "https://avatars.githubusercontent.com/u/30830541?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dsbyprateekg",
"html_url": "https://github.com/dsbyprateekg",
"followers_url": "https://api.github.com/users/dsbyprateekg/followers",
"following_url": "https://api.github.com/users/dsbyprateekg/following{/other_user}",
"gists_url": "https://api.github.com/users/dsbyprateekg/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dsbyprateekg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dsbyprateekg/subscriptions",
"organizations_url": "https://api.github.com/users/dsbyprateekg/orgs",
"repos_url": "https://api.github.com/users/dsbyprateekg/repos",
"events_url": "https://api.github.com/users/dsbyprateekg/events{/privacy}",
"received_events_url": "https://api.github.com/users/dsbyprateekg/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 10
| 2024-06-26T12:17:04
| 2024-06-28T04:02:27
| 2024-06-28T04:02:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
I tested the prompt 'why is the sky blue?' both in the CLI and via the API through Postman.
The response generated in the CLI is longer than the response generated via the API:


Is there any API parameter we can set to get a response as long as the one the CLI produces?
I am running the Ollama server on my Windows 11 laptop with an AMD Ryzen CPU.
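One knob worth checking (a suggestion on my part, not something confirmed later in this thread) is `num_predict` in the request's `options`, which caps how many tokens the model may generate; `-1` removes the cap. A minimal sketch of such a request body, using `llama3` only as a placeholder model name:

```
import json

# Request body for POST http://localhost:11434/api/generate
# (send it with curl or Postman). num_predict = -1 lets the model
# generate until it stops on its own instead of hitting a token cap.
payload = {
    "model": "llama3",
    "prompt": "why is the sky blue?",
    "options": {"num_predict": -1},
    "stream": False,
}

print(json.dumps(payload, indent=2))
```

If the API responses are still shorter, comparing the `eval_count` field in the API response against the CLI's `--verbose` stats would show whether generation is actually being cut off.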
### OS
Windows
### GPU
AMD
### CPU
AMD
### Ollama version
0.1.46
|
{
"login": "dsbyprateekg",
"id": 30830541,
"node_id": "MDQ6VXNlcjMwODMwNTQx",
"avatar_url": "https://avatars.githubusercontent.com/u/30830541?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dsbyprateekg",
"html_url": "https://github.com/dsbyprateekg",
"followers_url": "https://api.github.com/users/dsbyprateekg/followers",
"following_url": "https://api.github.com/users/dsbyprateekg/following{/other_user}",
"gists_url": "https://api.github.com/users/dsbyprateekg/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dsbyprateekg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dsbyprateekg/subscriptions",
"organizations_url": "https://api.github.com/users/dsbyprateekg/orgs",
"repos_url": "https://api.github.com/users/dsbyprateekg/repos",
"events_url": "https://api.github.com/users/dsbyprateekg/events{/privacy}",
"received_events_url": "https://api.github.com/users/dsbyprateekg/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5297/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5297/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5966
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5966/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5966/comments
|
https://api.github.com/repos/ollama/ollama/issues/5966/events
|
https://github.com/ollama/ollama/issues/5966
| 2,431,299,032
|
I_kwDOJ0Z1Ps6Q6q3Y
| 5,966
|
Add "Mistral Large v2", thanks
|
{
"login": "enryteam",
"id": 20081090,
"node_id": "MDQ6VXNlcjIwMDgxMDkw",
"avatar_url": "https://avatars.githubusercontent.com/u/20081090?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/enryteam",
"html_url": "https://github.com/enryteam",
"followers_url": "https://api.github.com/users/enryteam/followers",
"following_url": "https://api.github.com/users/enryteam/following{/other_user}",
"gists_url": "https://api.github.com/users/enryteam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/enryteam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/enryteam/subscriptions",
"organizations_url": "https://api.github.com/users/enryteam/orgs",
"repos_url": "https://api.github.com/users/enryteam/repos",
"events_url": "https://api.github.com/users/enryteam/events{/privacy}",
"received_events_url": "https://api.github.com/users/enryteam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-07-26T02:39:12
| 2024-07-26T10:53:55
| 2024-07-26T10:53:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Mistral has open-sourced another heavyweight model; July is turning out to be a great month for open source! Mistral Large v2 supports Chinese, and its highlights are strong optimizations for coding, agent capabilities, and reasoning; the 110B model can hold its own against Llama 3.1 405B!
HF model page: https://huggingface.co/mistralai/Mistral-Large-Instruct-2407
Demo: https://chat.mistral.ai/chat/e56844bf-f6c1-46be-8f17-9072766fec10
|
{
"login": "enryteam",
"id": 20081090,
"node_id": "MDQ6VXNlcjIwMDgxMDkw",
"avatar_url": "https://avatars.githubusercontent.com/u/20081090?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/enryteam",
"html_url": "https://github.com/enryteam",
"followers_url": "https://api.github.com/users/enryteam/followers",
"following_url": "https://api.github.com/users/enryteam/following{/other_user}",
"gists_url": "https://api.github.com/users/enryteam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/enryteam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/enryteam/subscriptions",
"organizations_url": "https://api.github.com/users/enryteam/orgs",
"repos_url": "https://api.github.com/users/enryteam/repos",
"events_url": "https://api.github.com/users/enryteam/events{/privacy}",
"received_events_url": "https://api.github.com/users/enryteam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5966/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5966/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/327
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/327/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/327/comments
|
https://api.github.com/repos/ollama/ollama/issues/327/events
|
https://github.com/ollama/ollama/issues/327
| 1,846,157,406
|
I_kwDOJ0Z1Ps5uCiBe
| 327
|
Embedding model support
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 18
| 2023-08-11T03:53:45
| 2024-02-21T02:37:30
| 2024-02-21T02:37:30
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Add embedding models to use primarily with `/api/embeddings`
* `instructor-xl`
* `bge-large`
* `all-MiniLM-L6-v2`
See the full [leaderboard](https://huggingface.co/spaces/mteb/leaderboard)
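Once such models are supported, a request to the endpoint named above would look roughly like this (a sketch; `all-minilm` is just an example model tag, assuming it has been pulled):

```
import json

# Request body for POST http://localhost:11434/api/embeddings.
# The response would contain an "embedding" array of floats.
payload = {
    "model": "all-minilm",
    "prompt": "why is the sky blue?",
}

print(json.dumps(payload))
```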
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/327/reactions",
"total_count": 40,
"+1": 40,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/327/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5550
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5550/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5550/comments
|
https://api.github.com/repos/ollama/ollama/issues/5550/events
|
https://github.com/ollama/ollama/issues/5550
| 2,396,475,077
|
I_kwDOJ0Z1Ps6O107F
| 5,550
|
Support for TPUs
|
{
"login": "Moonlight1220",
"id": 172665223,
"node_id": "U_kgDOCkqphw",
"avatar_url": "https://avatars.githubusercontent.com/u/172665223?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Moonlight1220",
"html_url": "https://github.com/Moonlight1220",
"followers_url": "https://api.github.com/users/Moonlight1220/followers",
"following_url": "https://api.github.com/users/Moonlight1220/following{/other_user}",
"gists_url": "https://api.github.com/users/Moonlight1220/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Moonlight1220/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Moonlight1220/subscriptions",
"organizations_url": "https://api.github.com/users/Moonlight1220/orgs",
"repos_url": "https://api.github.com/users/Moonlight1220/repos",
"events_url": "https://api.github.com/users/Moonlight1220/events{/privacy}",
"received_events_url": "https://api.github.com/users/Moonlight1220/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-08T20:10:38
| 2024-07-09T22:01:49
| 2024-07-08T22:24:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
**Hello Ollama Community,**
Single-board computers such as the Raspberry Pi have nearly limitless expansion possibilities, which can be very useful for AI. Products such as the Raspberry Pi AI Kit from Seeed Studio and Google's Coral family of TPUs could accelerate LLMs and make for a small AI computer you can take anywhere!
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5550/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5550/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5773
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5773/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5773/comments
|
https://api.github.com/repos/ollama/ollama/issues/5773/events
|
https://github.com/ollama/ollama/issues/5773
| 2,416,790,360
|
I_kwDOJ0Z1Ps6QDUtY
| 5,773
|
Favor idle GPUs that fit over largest free memory GPUs when scheduling
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-07-18T15:56:14
| 2024-07-18T15:56:15
| null |
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The scheduler today tries to find a single GPU to run a model based on the [largest amount of free VRAM](https://github.com/ollama/ollama/blob/main/server/sched.go#L690-L693), but on multi-GPU setups where one GPU is significantly larger than others, this can lead to smaller models clumping on the largest GPU. The algorithm should be a bit smarter to try to find an idle GPU first to avoid this clumping behavior.
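The proposed policy can be sketched as follows (a hypothetical illustration, not the actual `sched.go` code): prefer an idle GPU the model fits on, and only fall back to the largest-free-VRAM GPU when no idle GPU fits.

```
def pick_gpu(gpus, need_bytes):
    """Prefer the smallest idle GPU that fits, so busy/large GPUs
    stay available; otherwise fall back to most free VRAM."""
    idle_fits = [g for g in gpus if g["idle"] and g["free"] >= need_bytes]
    if idle_fits:
        return min(idle_fits, key=lambda g: g["free"])
    fits = [g for g in gpus if g["free"] >= need_bytes]
    return max(fits, key=lambda g: g["free"]) if fits else None

gpus = [
    {"name": "big",   "free": 40 << 30, "idle": False},  # busy 40 GiB card
    {"name": "small", "free": 8 << 30,  "idle": True},   # idle 8 GiB card
]
# A 4 GiB model lands on the idle card instead of clumping on "big".
print(pick_gpu(gpus, 4 << 30)["name"])
```

Picking the *smallest* idle GPU that fits (rather than any idle GPU) is an extra assumption here; it additionally keeps large idle GPUs free for large models.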
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5773/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5773/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3300
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3300/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3300/comments
|
https://api.github.com/repos/ollama/ollama/issues/3300/events
|
https://github.com/ollama/ollama/issues/3300
| 2,203,219,291
|
I_kwDOJ0Z1Ps6DUnVb
| 3,300
|
docker container only listens on ipv6 by default
|
{
"login": "nopoz",
"id": 460545,
"node_id": "MDQ6VXNlcjQ2MDU0NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/460545?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nopoz",
"html_url": "https://github.com/nopoz",
"followers_url": "https://api.github.com/users/nopoz/followers",
"following_url": "https://api.github.com/users/nopoz/following{/other_user}",
"gists_url": "https://api.github.com/users/nopoz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nopoz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nopoz/subscriptions",
"organizations_url": "https://api.github.com/users/nopoz/orgs",
"repos_url": "https://api.github.com/users/nopoz/repos",
"events_url": "https://api.github.com/users/nopoz/events{/privacy}",
"received_events_url": "https://api.github.com/users/nopoz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 6
| 2024-03-22T19:59:49
| 2024-09-19T18:07:09
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The docker container only listens on ipv6 by default. This causes connection failures for other containers in the same stack that are trying to communicate with ollama via ipv4.
### What did you expect to see?
For the container to listen on both ipv4 and ipv6.
### Steps to reproduce
Deploy ollama:
```
version: '3.3'
services:
ollama:
container_name: ollama
volumes:
- "/docker/ollama:/root/.ollama"
restart: unless-stopped
image: ollama/ollama:0.1.29
deploy:
resources:
reservations:
devices:
- driver: nvidia
count: all
capabilities: [gpu]
```
Verify port 11434 is only listening on ipv6 inside the container:
```
~$ docker exec -t -i ollama /bin/bash
root@28988fb7b322:/# apt update
[...]
root@28988fb7b322:/# apt install net-tools
[...]
root@28988fb7b322:/# netstat -nap | grep LISTEN
tcp 0 0 127.0.0.11:35687 0.0.0.0:* LISTEN -
tcp6 0 0 :::11434 :::* LISTEN 1/ollama
```
You can workaround this issue by adding the following lines to the docker compose file:
```
ports:
- '127.0.0.1:11434:11434'
```
However, this is problematic: it publishes the port on the Docker host, which is unnecessary when the port only needs to be reachable by the other containers in the same Compose stack.
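A scope-limited alternative (my suggestion, not something confirmed in this thread): Ollama honors the `OLLAMA_HOST` environment variable, so forcing an IPv4 bind address inside the container avoids publishing the port on the host at all; other containers on the same Compose network can then reach `ollama:11434` directly:

```
services:
  ollama:
    environment:
      - OLLAMA_HOST=0.0.0.0   # bind IPv4 wildcard instead of IPv6-only :::11434
```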
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
amd64
### Platform
Docker, WSL2
### Ollama version
0.1.29
### GPU
Nvidia
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3300/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3300/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5303
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5303/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5303/comments
|
https://api.github.com/repos/ollama/ollama/issues/5303/events
|
https://github.com/ollama/ollama/issues/5303
| 2,375,490,048
|
I_kwDOJ0Z1Ps6NlxoA
| 5,303
|
Ollama keeps randomly re-evaluating the whole prompt, making chats impossible
|
{
"login": "drazdra",
"id": 133811709,
"node_id": "U_kgDOB_nN_Q",
"avatar_url": "https://avatars.githubusercontent.com/u/133811709?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/drazdra",
"html_url": "https://github.com/drazdra",
"followers_url": "https://api.github.com/users/drazdra/followers",
"following_url": "https://api.github.com/users/drazdra/following{/other_user}",
"gists_url": "https://api.github.com/users/drazdra/gists{/gist_id}",
"starred_url": "https://api.github.com/users/drazdra/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/drazdra/subscriptions",
"organizations_url": "https://api.github.com/users/drazdra/orgs",
"repos_url": "https://api.github.com/users/drazdra/repos",
"events_url": "https://api.github.com/users/drazdra/events{/privacy}",
"received_events_url": "https://api.github.com/users/drazdra/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5808482718,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng",
"url": "https://api.github.com/repos/ollama/ollama/labels/performance",
"name": "performance",
"color": "A5B5C6",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 18
| 2024-06-26T14:14:53
| 2024-11-06T01:01:34
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Ollama randomly starts re-evaluating the whole prompt, ignoring the cache. Normally the next message on my system starts within 1-2 seconds, but when this happens I have to wait 7-20 minutes. Further evidence: the stats for the last message show the full prompt size under prompt eval, instead of just the newly added message.
This makes it simply unusable: yesterday I spent many hours just waiting for replies instead of getting them. I don't know what triggers it; at first it rarely happened, but later it occurred nearly every second or third request once the context grew longer (or perhaps I just didn't notice it with a small context).
The logs show nothing related to this issue; the next reply simply starts getting re-evaluated:
```
200 | 48.669029927s | 127.0.0.1 | POST "/api/chat"
200 | 20.60948557s  | 127.0.0.1 | POST "/api/chat"
200 | 30.495043951s | 127.0.0.1 | POST "/api/chat"
200 | 1m4s          | 127.0.0.1 | POST "/api/chat"
200 | 32.433507128s | 127.0.0.1 | POST "/api/chat"
200 | 37.937415675s | 127.0.0.1 | POST "/api/chat"
200 | 8m7s          | 127.0.0.1 | POST "/api/chat"
200 | 17.687657448s | 127.0.0.1 | POST "/api/chat"
200 | 17.344552043s | 127.0.0.1 | POST "/api/chat"
200 | 24.688732997s | 127.0.0.1 | POST "/api/chat"
200 | 34.470677196s | 127.0.0.1 | POST "/api/chat"
200 | 7m53s         | 127.0.0.1 | POST "/api/chat"
```
This is the same chat with the same context, but as you can see, some requests are far slower; those are the prompt re-evaluations.
In my opinion this is related to the concurrency changes in KV-cache processing, since similar problems started around then.
Right now it is simply unusable on my system.
### OS
Linux
### GPU
Other
### CPU
AMD
### Ollama version
0.1.46
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5303/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5303/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4437
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4437/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4437/comments
|
https://api.github.com/repos/ollama/ollama/issues/4437/events
|
https://github.com/ollama/ollama/issues/4437
| 2,296,281,072
|
I_kwDOJ0Z1Ps6I3nfw
| 4,437
|
Ollama vs llama-cpp-python: slow response time compared to llama-cpp-python
|
{
"login": "utility-aagrawal",
"id": 140737044,
"node_id": "U_kgDOCGN6FA",
"avatar_url": "https://avatars.githubusercontent.com/u/140737044?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/utility-aagrawal",
"html_url": "https://github.com/utility-aagrawal",
"followers_url": "https://api.github.com/users/utility-aagrawal/followers",
"following_url": "https://api.github.com/users/utility-aagrawal/following{/other_user}",
"gists_url": "https://api.github.com/users/utility-aagrawal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/utility-aagrawal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/utility-aagrawal/subscriptions",
"organizations_url": "https://api.github.com/users/utility-aagrawal/orgs",
"repos_url": "https://api.github.com/users/utility-aagrawal/repos",
"events_url": "https://api.github.com/users/utility-aagrawal/events{/privacy}",
"received_events_url": "https://api.github.com/users/utility-aagrawal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-05-14T19:43:03
| 2024-07-03T23:16:48
| 2024-07-03T23:16:48
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
I built a RAG Q&A pipeline using LlamaIndex and llama-cpp-python in the past. I want to switch from llama-cpp to Ollama because Ollama is more stable and easier to install. When I made the switch, I noticed a significant increase in response time. Would you know what might be causing this slowdown?
I have kept everything the same for the comparison and have only changed the LLM component to point to Ollama instead of llama-cpp. I am using the same quantized version of Llama-3 in both cases.
Appreciate your inputs! Thanks!
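When comparing two backends like this, it helps to separate one-off model load time from steady-state generation time, for example by warming up first and then taking the median over several runs. A small generic harness for that (illustrative; `call_llm` is a stand-in for whichever backend's query function is under test):

```python
import time
from statistics import median

def benchmark(call_llm, warmup=1, runs=5):
    """Time repeated calls; the warmup runs absorb model loading."""
    for _ in range(warmup):
        call_llm()
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        call_llm()
        timings.append(time.perf_counter() - start)
    # Median is more robust than the first (cold) run.
    return {"median_s": median(timings),
            "min_s": min(timings),
            "max_s": max(timings)}
```

Running the same prompt through both the llama-cpp-python and the Ollama setup with this harness gives numbers that are actually comparable.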
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.33
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4437/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/4437/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5504
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5504/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5504/comments
|
https://api.github.com/repos/ollama/ollama/issues/5504/events
|
https://github.com/ollama/ollama/issues/5504
| 2,393,076,906
|
I_kwDOJ0Z1Ps6Oo3Sq
| 5,504
|
0xc0000409 CUDA error | was working fine before - OOM crash
|
{
"login": "gaduffl",
"id": 100528925,
"node_id": "U_kgDOBf3zHQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100528925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gaduffl",
"html_url": "https://github.com/gaduffl",
"followers_url": "https://api.github.com/users/gaduffl/followers",
"following_url": "https://api.github.com/users/gaduffl/following{/other_user}",
"gists_url": "https://api.github.com/users/gaduffl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gaduffl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gaduffl/subscriptions",
"organizations_url": "https://api.github.com/users/gaduffl/orgs",
"repos_url": "https://api.github.com/users/gaduffl/repos",
"events_url": "https://api.github.com/users/gaduffl/events{/privacy}",
"received_events_url": "https://api.github.com/users/gaduffl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6849881759,
"node_id": "LA_kwDOJ0Z1Ps8AAAABmEjmnw",
"url": "https://api.github.com/repos/ollama/ollama/labels/memory",
"name": "memory",
"color": "5017EA",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-07-05T19:54:29
| 2024-07-10T19:47:32
| 2024-07-10T19:47:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Ollama was working fine with all the small models I have tested so far (4 GB VRAM).
After upgrading to 0.1.48, I get a CUDA error with all models, e.g. Llama 3 8B:
_Error: llama runner process has terminated: exit status 0xc0000409 CUDA error_
These models were running perfectly fine before.
[server.log](https://github.com/user-attachments/files/16113333/server.log)
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.48
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5504/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5504/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8197
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8197/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8197/comments
|
https://api.github.com/repos/ollama/ollama/issues/8197/events
|
https://github.com/ollama/ollama/pull/8197
| 2,753,836,424
|
PR_kwDOJ0Z1Ps6F-O2S
| 8,197
|
fix: only add to history if different
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-12-21T08:09:11
| 2025-01-10T21:50:13
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8197",
"html_url": "https://github.com/ollama/ollama/pull/8197",
"diff_url": "https://github.com/ollama/ollama/pull/8197.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8197.patch",
"merged_at": null
}
|
If the last item in history is the same as the one being added, skip it. This reduces the number of history entries; the behaviour is similar to how most shells maintain history.
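The behaviour described can be sketched as follows (a simplified Python analogue of the Go change, assuming a plain list-backed history):

```python
def append_history(history, entry):
    """Add entry unless it duplicates the most recent one,
    mirroring how most shells skip consecutive repeats."""
    if not history or history[-1] != entry:
        history.append(entry)
    return history
```

Note that only *consecutive* duplicates are skipped; the same entry can still appear again later in the history.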
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8197/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8197/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6687
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6687/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6687/comments
|
https://api.github.com/repos/ollama/ollama/issues/6687/events
|
https://github.com/ollama/ollama/pull/6687
| 2,511,664,082
|
PR_kwDOJ0Z1Ps56uzL6
| 6,687
|
Align OpenAI Chat option processing with Completion option processing
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-07T14:11:03
| 2024-09-07T14:14:05
| 2024-09-07T14:13:35
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6687",
"html_url": "https://github.com/ollama/ollama/pull/6687",
"diff_url": "https://github.com/ollama/ollama/pull/6687.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6687.patch",
"merged_at": null
}
|
https://github.com/ollama/ollama/pull/6514 removed the scaling of option values for OpenAI Completion requests. Do the same for Chat requests.
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6687/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6687/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5962
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5962/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5962/comments
|
https://api.github.com/repos/ollama/ollama/issues/5962/events
|
https://github.com/ollama/ollama/pull/5962
| 2,430,931,935
|
PR_kwDOJ0Z1Ps52gx8k
| 5,962
|
server: reuse original download URL for images
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-07-25T20:32:06
| 2024-07-25T22:58:32
| 2024-07-25T22:58:30
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5962",
"html_url": "https://github.com/ollama/ollama/pull/5962",
"diff_url": "https://github.com/ollama/ollama/pull/5962.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5962.patch",
"merged_at": "2024-07-25T22:58:30"
}
|
This changes the registry client to reuse the original download URL it gets on the first redirect response for all subsequent requests, preventing thundering herd issues when hot new LLMs are released.
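The idea can be sketched as: follow the redirect once, cache the resolved URL, and reuse it for every subsequent chunk request instead of hitting the original endpoint each time (illustrative Python, not the actual Go client; `resolve` stands in for the first redirected request):

```python
class BlobDownloader:
    """Cache the first redirect target and reuse it for later chunks."""

    def __init__(self, resolve):
        self._resolve = resolve   # callable: original_url -> redirected_url
        self._cached = {}

    def chunk_url(self, original_url):
        if original_url not in self._cached:
            # One round trip to the registry; subsequent chunks skip it,
            # avoiding a thundering herd on the original endpoint.
            self._cached[original_url] = self._resolve(original_url)
        return self._cached[original_url]
```

With many clients downloading a hot new model in parallel chunks, this cuts the registry's request load from one-per-chunk to one-per-blob.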
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5962/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5962/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5383
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5383/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5383/comments
|
https://api.github.com/repos/ollama/ollama/issues/5383/events
|
https://github.com/ollama/ollama/issues/5383
| 2,381,793,449
|
I_kwDOJ0Z1Ps6N90ip
| 5,383
|
Referring offline downloaded models in code
|
{
"login": "RaoPisay",
"id": 8242864,
"node_id": "MDQ6VXNlcjgyNDI4NjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/8242864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RaoPisay",
"html_url": "https://github.com/RaoPisay",
"followers_url": "https://api.github.com/users/RaoPisay/followers",
"following_url": "https://api.github.com/users/RaoPisay/following{/other_user}",
"gists_url": "https://api.github.com/users/RaoPisay/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RaoPisay/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RaoPisay/subscriptions",
"organizations_url": "https://api.github.com/users/RaoPisay/orgs",
"repos_url": "https://api.github.com/users/RaoPisay/repos",
"events_url": "https://api.github.com/users/RaoPisay/events{/privacy}",
"received_events_url": "https://api.github.com/users/RaoPisay/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-06-29T14:56:12
| 2024-07-01T15:43:06
| 2024-07-01T15:43:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Need help: I am trying to reference a model downloaded by Ollama. I know the path where it is downloaded: `~/.ollama/models/*`
In the Python code given:
`tokenizer = AutoTokenizer.from_pretrained(model)`
I want to set the `model` variable to the model's path. For example, I have downloaded 2 models, `llama3` and `gemma:2b`.
When I navigate to the path `~/.ollama/models` I see 2 folders: `blobs` and `manifests`.
In the `blobs` folder I see the big files: 4.7 GB for `llama3` and 1.7 GB for `gemma:2b`.
I tried both file paths in place of the `model` variable, but no luck.
Below is a screenshot of the path, just for reference:

Has anyone tried to reference Ollama's downloaded offline models in their code, like in the Python statement given above?
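For anyone hitting the same wall: the big files under `blobs` are the raw GGUF weights, named after their sha256 digest, and each file under `manifests` is a JSON document whose `layers` list points at them. A sketch of resolving a model's blob path from a manifest, assuming that layout (the layer with mediaType `application/vnd.ollama.image.model` is the weights). Note that `AutoTokenizer.from_pretrained` expects a Hugging Face-style model directory, not a GGUF blob, which is why pointing it at these files fails:

```python
import json
from pathlib import Path

def model_blob_path(manifest_file, blobs_dir):
    """Resolve the GGUF weights blob referenced by an Ollama manifest."""
    manifest = json.loads(Path(manifest_file).read_text())
    for layer in manifest["layers"]:
        if layer["mediaType"] == "application/vnd.ollama.image.model":
            # Blob files are named sha256-<hex>: the digest with ':' -> '-'
            return Path(blobs_dir) / layer["digest"].replace(":", "-")
    raise FileNotFoundError("no model layer in manifest")
```

The resolved path is usable with GGUF-aware loaders (e.g. llama-cpp-python), but not with `transformers`.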
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.1.48
|
{
"login": "RaoPisay",
"id": 8242864,
"node_id": "MDQ6VXNlcjgyNDI4NjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/8242864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RaoPisay",
"html_url": "https://github.com/RaoPisay",
"followers_url": "https://api.github.com/users/RaoPisay/followers",
"following_url": "https://api.github.com/users/RaoPisay/following{/other_user}",
"gists_url": "https://api.github.com/users/RaoPisay/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RaoPisay/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RaoPisay/subscriptions",
"organizations_url": "https://api.github.com/users/RaoPisay/orgs",
"repos_url": "https://api.github.com/users/RaoPisay/repos",
"events_url": "https://api.github.com/users/RaoPisay/events{/privacy}",
"received_events_url": "https://api.github.com/users/RaoPisay/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5383/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5383/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7807
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7807/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7807/comments
|
https://api.github.com/repos/ollama/ollama/issues/7807/events
|
https://github.com/ollama/ollama/issues/7807
| 2,685,232,210
|
I_kwDOJ0Z1Ps6gDWRS
| 7,807
|
newer version ollama chat more slower
|
{
"login": "krmao",
"id": 7344437,
"node_id": "MDQ6VXNlcjczNDQ0Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7344437?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/krmao",
"html_url": "https://github.com/krmao",
"followers_url": "https://api.github.com/users/krmao/followers",
"following_url": "https://api.github.com/users/krmao/following{/other_user}",
"gists_url": "https://api.github.com/users/krmao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/krmao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/krmao/subscriptions",
"organizations_url": "https://api.github.com/users/krmao/orgs",
"repos_url": "https://api.github.com/users/krmao/repos",
"events_url": "https://api.github.com/users/krmao/events{/privacy}",
"received_events_url": "https://api.github.com/users/krmao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 7
| 2024-11-23T03:39:20
| 2024-11-25T04:33:44
| 2024-11-25T04:33:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
With the same code on the same machine:
Apple M2 Pro
macOS 15.1.1 (24B91)
```python
import time
import ollama
start_time = time.perf_counter()
# len(final_chat_messages) == 6107
ai_response = ollama.chat(model=model, messages=final_chat_messages, tools=TOOLS)
print(f'time after ollama.chat: {((time.perf_counter() - start_time) * 1000):.0f}ms')
```
After a lot of tests:
- 0.3.14 only needs ~1 s: the first request takes 4580 ms, but then the same question takes only 1073 ms
- 0.4.2 needs 20 s+
- 0.4.3 and 0.4.4 need ~10 s
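One way to pin down where the time goes between versions is to look at the duration fields the API returns alongside the message (`load_duration`, `prompt_eval_duration`, `eval_duration`, `eval_count`, all in nanoseconds per the Ollama API docs). A small helper to turn a dict-like response into comparable numbers (with newer Python client versions the response object exposes the same field names):

```python
NS = 1e9  # Ollama reports durations in nanoseconds

def summarize(resp):
    """Break a chat response's timing fields into seconds and tok/s."""
    eval_s = resp["eval_duration"] / NS
    return {
        "load_s": resp["load_duration"] / NS,
        "prompt_eval_s": resp["prompt_eval_duration"] / NS,
        "eval_s": eval_s,
        "tokens_per_s": resp["eval_count"] / eval_s if eval_s else 0.0,
    }
```

If `prompt_eval_s` dominates on the slower versions, the regression is in prompt processing rather than token generation.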
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.4.2
|
{
"login": "krmao",
"id": 7344437,
"node_id": "MDQ6VXNlcjczNDQ0Mzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/7344437?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/krmao",
"html_url": "https://github.com/krmao",
"followers_url": "https://api.github.com/users/krmao/followers",
"following_url": "https://api.github.com/users/krmao/following{/other_user}",
"gists_url": "https://api.github.com/users/krmao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/krmao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/krmao/subscriptions",
"organizations_url": "https://api.github.com/users/krmao/orgs",
"repos_url": "https://api.github.com/users/krmao/repos",
"events_url": "https://api.github.com/users/krmao/events{/privacy}",
"received_events_url": "https://api.github.com/users/krmao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7807/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7807/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3377
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3377/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3377/comments
|
https://api.github.com/repos/ollama/ollama/issues/3377/events
|
https://github.com/ollama/ollama/pull/3377
| 2,211,943,739
|
PR_kwDOJ0Z1Ps5q_HTD
| 3,377
|
Bump ROCm to 6.0.2 patch release
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-27T21:32:27
| 2024-03-28T23:07:57
| 2024-03-28T23:07:54
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3377",
"html_url": "https://github.com/ollama/ollama/pull/3377",
"diff_url": "https://github.com/ollama/ollama/pull/3377.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3377.patch",
"merged_at": "2024-03-28T23:07:54"
}
|
Fixes #2455
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3377/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3377/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1229
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1229/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1229/comments
|
https://api.github.com/repos/ollama/ollama/issues/1229/events
|
https://github.com/ollama/ollama/pull/1229
| 2,005,053,670
|
PR_kwDOJ0Z1Ps5gEAFP
| 1,229
|
revert checksum calculation to calculate-as-you-go
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-11-21T20:12:48
| 2023-11-30T18:54:39
| 2023-11-30T18:54:38
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1229",
"html_url": "https://github.com/ollama/ollama/pull/1229",
"diff_url": "https://github.com/ollama/ollama/pull/1229.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1229.patch",
"merged_at": "2023-11-30T18:54:38"
}
|
Calculating the checksum as the file is being transferred is faster overall, since the file doesn't need to be re-read.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1229/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1229/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6477
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6477/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6477/comments
|
https://api.github.com/repos/ollama/ollama/issues/6477/events
|
https://github.com/ollama/ollama/issues/6477
| 2,483,382,575
|
I_kwDOJ0Z1Ps6UBWkv
| 6,477
|
Llama3.1 template doesn't work well with multi function calling as well as Environment: ipython mode
|
{
"login": "martinkozle",
"id": 48385621,
"node_id": "MDQ6VXNlcjQ4Mzg1NjIx",
"avatar_url": "https://avatars.githubusercontent.com/u/48385621?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/martinkozle",
"html_url": "https://github.com/martinkozle",
"followers_url": "https://api.github.com/users/martinkozle/followers",
"following_url": "https://api.github.com/users/martinkozle/following{/other_user}",
"gists_url": "https://api.github.com/users/martinkozle/gists{/gist_id}",
"starred_url": "https://api.github.com/users/martinkozle/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/martinkozle/subscriptions",
"organizations_url": "https://api.github.com/users/martinkozle/orgs",
"repos_url": "https://api.github.com/users/martinkozle/repos",
"events_url": "https://api.github.com/users/martinkozle/events{/privacy}",
"received_events_url": "https://api.github.com/users/martinkozle/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-08-23T15:26:07
| 2024-08-26T11:40:13
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
## Tool descriptions
The current template checks if the final message is of Role "user" to decide whether to add the tool descriptions to it:
```go
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 }}
{{- if eq .Role "user" }}<|start_header_id|>user<|end_header_id|>
{{- if and $.Tools $last }}
...
```
In a multi-function use case, however, where the last two messages are assistant and tool and we want the assistant to continue with another tool call (instead of giving the final response), the tool descriptions aren't added anywhere, because the last message isn't a user message.
This behavior is fine for single function calling, where the assistant doesn't need the tool definitions to generate the final response, but it breaks multi function calling entirely: the assistant won't know what tools exist or how to call them for the second function call.
My proposed solution is:
```go
{{- $lastUserIdx := -1 }}
{{- range $i, $_ := .Messages }}
{{- with eq .Role "user" }}
{{- $lastUserIdx = $i }}{{ end }}
{{- end }}
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 }}
{{- if eq .Role "user" }}<|start_header_id|>user<|end_header_id|>
{{- if and $.Tools (eq $i $lastUserIdx) }}
...
```
This adds the descriptions to the last user message, which doesn't necessarily have to be the last message overall.
## Usage of `<|eom_id|>` token
According to the Meta documentation <https://llama.meta.com/docs/model-cards-and-prompt-formats/> (the Meta Llama docs are down at the time of writing), when using `Environment: ipython`, the assistant's tool call should end with an `<|eom_id|>` token instead of `<|eot_id|>`, signalling that a tool response is expected next. Only the assistant's final response should end with `<|eot_id|>`. The Jinja2 template in `tokenizer_config.json` does this, but the Ollama template does not: it adds no token at all after the assistant's tool call.
The part of the Jinja2 template that does this:
```jinja2
...
{%- if builtin_tools is defined or tools is not none %}
{{- "Environment: ipython\n" }}
...
{%- for message in messages %}
...
{%- if builtin_tools is defined %}
{#- This means we're in ipython mode #}
{{- "<|eom_id|>" }}
{%- else %}
{{- "<|eot_id|>" }}
{%- endif %}
...
```
What needs to be added to the Ollama template (the `<|eom_id|>` at the end):
```
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "parameters": {{ .Function.Arguments }}}{{ end }}<|eom_id|>
```
## Conclusion
These two issues lead to poorer performance when using Llama 3.1 in Ollama for multi function calling through the `chat` API. I believe both should be addressed in order to achieve the intended quality and results from Llama 3.1.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.5
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6477/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6477/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2472
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2472/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2472/comments
|
https://api.github.com/repos/ollama/ollama/issues/2472/events
|
https://github.com/ollama/ollama/issues/2472
| 2,131,828,829
|
I_kwDOJ0Z1Ps5_ESBd
| 2,472
|
Ollama floods /tmp with unnecessary libraries
|
{
"login": "knoopx",
"id": 100993,
"node_id": "MDQ6VXNlcjEwMDk5Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/100993?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/knoopx",
"html_url": "https://github.com/knoopx",
"followers_url": "https://api.github.com/users/knoopx/followers",
"following_url": "https://api.github.com/users/knoopx/following{/other_user}",
"gists_url": "https://api.github.com/users/knoopx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/knoopx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/knoopx/subscriptions",
"organizations_url": "https://api.github.com/users/knoopx/orgs",
"repos_url": "https://api.github.com/users/knoopx/repos",
"events_url": "https://api.github.com/users/knoopx/events{/privacy}",
"received_events_url": "https://api.github.com/users/knoopx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-02-13T09:05:57
| 2024-09-26T18:14:00
| 2024-03-20T15:28:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
This is what my `/tmp` dir looks like after a few hours. I have no idea why Ollama does this or why no cleanup is in place. The Ollama version is 0.1.24; I hadn't noticed this before this release.


|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2472/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2472/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3129
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3129/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3129/comments
|
https://api.github.com/repos/ollama/ollama/issues/3129/events
|
https://github.com/ollama/ollama/pull/3129
| 2,185,151,611
|
PR_kwDOJ0Z1Ps5pkViA
| 3,129
|
docs: pbcopy on mac
|
{
"login": "adrienbrault",
"id": 611271,
"node_id": "MDQ6VXNlcjYxMTI3MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/611271?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adrienbrault",
"html_url": "https://github.com/adrienbrault",
"followers_url": "https://api.github.com/users/adrienbrault/followers",
"following_url": "https://api.github.com/users/adrienbrault/following{/other_user}",
"gists_url": "https://api.github.com/users/adrienbrault/gists{/gist_id}",
"starred_url": "https://api.github.com/users/adrienbrault/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adrienbrault/subscriptions",
"organizations_url": "https://api.github.com/users/adrienbrault/orgs",
"repos_url": "https://api.github.com/users/adrienbrault/repos",
"events_url": "https://api.github.com/users/adrienbrault/events{/privacy}",
"received_events_url": "https://api.github.com/users/adrienbrault/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-14T00:29:06
| 2024-05-07T20:19:27
| 2024-05-06T20:47:00
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3129",
"html_url": "https://github.com/ollama/ollama/pull/3129",
"diff_url": "https://github.com/ollama/ollama/pull/3129.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3129.patch",
"merged_at": "2024-05-06T20:47:00"
}
|
Hey!
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3129/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3129/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3145
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3145/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3145/comments
|
https://api.github.com/repos/ollama/ollama/issues/3145/events
|
https://github.com/ollama/ollama/pull/3145
| 2,186,928,883
|
PR_kwDOJ0Z1Ps5pqVkA
| 3,145
|
docs: Add AnythingLLM to README as integration option
|
{
"login": "timothycarambat",
"id": 16845892,
"node_id": "MDQ6VXNlcjE2ODQ1ODky",
"avatar_url": "https://avatars.githubusercontent.com/u/16845892?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/timothycarambat",
"html_url": "https://github.com/timothycarambat",
"followers_url": "https://api.github.com/users/timothycarambat/followers",
"following_url": "https://api.github.com/users/timothycarambat/following{/other_user}",
"gists_url": "https://api.github.com/users/timothycarambat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/timothycarambat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/timothycarambat/subscriptions",
"organizations_url": "https://api.github.com/users/timothycarambat/orgs",
"repos_url": "https://api.github.com/users/timothycarambat/repos",
"events_url": "https://api.github.com/users/timothycarambat/events{/privacy}",
"received_events_url": "https://api.github.com/users/timothycarambat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-14T17:43:37
| 2024-03-25T20:10:03
| 2024-03-25T18:54:48
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3145",
"html_url": "https://github.com/ollama/ollama/pull/3145",
"diff_url": "https://github.com/ollama/ollama/pull/3145.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3145.patch",
"merged_at": "2024-03-25T18:54:48"
}
|
Adding [AnythingLLM](https://github.com/Mintplex-Labs/anything-llm) by Mintplex Labs (YC S22) as an integration option for Ollama. It supports all models with full RAG, plus an on-device vector database and embeddings.
It supports Docker and has native macOS, Windows, and Linux applications that can be used alongside Ollama.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3145/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3145/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/927
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/927/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/927/comments
|
https://api.github.com/repos/ollama/ollama/issues/927/events
|
https://github.com/ollama/ollama/issues/927
| 1,964,761,108
|
I_kwDOJ0Z1Ps51G-AU
| 927
|
error on push due to uppercase model name
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2023-10-27T04:49:00
| 2024-02-20T00:56:04
| 2024-02-20T00:56:04
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It may be obvious to us, but to a new user it isn't obvious that a model needs to be named `namespace/model`. Can we make the error a bit more helpful? This is related to the uppercase issue with model names.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/927/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/927/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6316
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6316/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6316/comments
|
https://api.github.com/repos/ollama/ollama/issues/6316/events
|
https://github.com/ollama/ollama/issues/6316
| 2,459,914,511
|
I_kwDOJ0Z1Ps6Sn1EP
| 6,316
|
ollama create will use a large amount of disk space in the /tmp
|
{
"login": "garyyang85",
"id": 20335728,
"node_id": "MDQ6VXNlcjIwMzM1NzI4",
"avatar_url": "https://avatars.githubusercontent.com/u/20335728?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/garyyang85",
"html_url": "https://github.com/garyyang85",
"followers_url": "https://api.github.com/users/garyyang85/followers",
"following_url": "https://api.github.com/users/garyyang85/following{/other_user}",
"gists_url": "https://api.github.com/users/garyyang85/gists{/gist_id}",
"starred_url": "https://api.github.com/users/garyyang85/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/garyyang85/subscriptions",
"organizations_url": "https://api.github.com/users/garyyang85/orgs",
"repos_url": "https://api.github.com/users/garyyang85/repos",
"events_url": "https://api.github.com/users/garyyang85/events{/privacy}",
"received_events_url": "https://api.github.com/users/garyyang85/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-08-12T02:22:03
| 2024-08-13T08:14:16
| 2024-08-12T02:26:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The `ollama create` command uses a large amount of disk space in the `/tmp` directory by default. Is there a way to change `/tmp` to another directory?
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
latest
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6316/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6316/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/453
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/453/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/453/comments
|
https://api.github.com/repos/ollama/ollama/issues/453/events
|
https://github.com/ollama/ollama/issues/453
| 1,877,942,749
|
I_kwDOJ0Z1Ps5v7yHd
| 453
|
Add some way to keep the model in memory
|
{
"login": "spott",
"id": 53284,
"node_id": "MDQ6VXNlcjUzMjg0",
"avatar_url": "https://avatars.githubusercontent.com/u/53284?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/spott",
"html_url": "https://github.com/spott",
"followers_url": "https://api.github.com/users/spott/followers",
"following_url": "https://api.github.com/users/spott/following{/other_user}",
"gists_url": "https://api.github.com/users/spott/gists{/gist_id}",
"starred_url": "https://api.github.com/users/spott/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/spott/subscriptions",
"organizations_url": "https://api.github.com/users/spott/orgs",
"repos_url": "https://api.github.com/users/spott/repos",
"events_url": "https://api.github.com/users/spott/events{/privacy}",
"received_events_url": "https://api.github.com/users/spott/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-09-01T19:19:22
| 2023-09-01T19:22:08
| 2023-09-01T19:22:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It can take a while to load a model into memory, which currently needs to be done for every API call when using `ollama serve`.
ggml has a `--mlock` option that keeps the model in memory so it can be queried repeatedly without being evicted; it would be great if there were a way to do the same with Ollama.
|
{
"login": "spott",
"id": 53284,
"node_id": "MDQ6VXNlcjUzMjg0",
"avatar_url": "https://avatars.githubusercontent.com/u/53284?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/spott",
"html_url": "https://github.com/spott",
"followers_url": "https://api.github.com/users/spott/followers",
"following_url": "https://api.github.com/users/spott/following{/other_user}",
"gists_url": "https://api.github.com/users/spott/gists{/gist_id}",
"starred_url": "https://api.github.com/users/spott/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/spott/subscriptions",
"organizations_url": "https://api.github.com/users/spott/orgs",
"repos_url": "https://api.github.com/users/spott/repos",
"events_url": "https://api.github.com/users/spott/events{/privacy}",
"received_events_url": "https://api.github.com/users/spott/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/453/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/453/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4014
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4014/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4014/comments
|
https://api.github.com/repos/ollama/ollama/issues/4014/events
|
https://github.com/ollama/ollama/issues/4014
| 2,267,951,595
|
I_kwDOJ0Z1Ps6HLjHr
| 4,014
|
Add support for Qwen-VL
|
{
"login": "dagehuifei",
"id": 145953245,
"node_id": "U_kgDOCLMR3Q",
"avatar_url": "https://avatars.githubusercontent.com/u/145953245?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dagehuifei",
"html_url": "https://github.com/dagehuifei",
"followers_url": "https://api.github.com/users/dagehuifei/followers",
"following_url": "https://api.github.com/users/dagehuifei/following{/other_user}",
"gists_url": "https://api.github.com/users/dagehuifei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dagehuifei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dagehuifei/subscriptions",
"organizations_url": "https://api.github.com/users/dagehuifei/orgs",
"repos_url": "https://api.github.com/users/dagehuifei/repos",
"events_url": "https://api.github.com/users/dagehuifei/events{/privacy}",
"received_events_url": "https://api.github.com/users/dagehuifei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-04-29T01:30:37
| 2024-04-29T01:31:13
| 2024-04-29T01:31:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
https://huggingface.co/Qwen/Qwen-VL
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "dagehuifei",
"id": 145953245,
"node_id": "U_kgDOCLMR3Q",
"avatar_url": "https://avatars.githubusercontent.com/u/145953245?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dagehuifei",
"html_url": "https://github.com/dagehuifei",
"followers_url": "https://api.github.com/users/dagehuifei/followers",
"following_url": "https://api.github.com/users/dagehuifei/following{/other_user}",
"gists_url": "https://api.github.com/users/dagehuifei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dagehuifei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dagehuifei/subscriptions",
"organizations_url": "https://api.github.com/users/dagehuifei/orgs",
"repos_url": "https://api.github.com/users/dagehuifei/repos",
"events_url": "https://api.github.com/users/dagehuifei/events{/privacy}",
"received_events_url": "https://api.github.com/users/dagehuifei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4014/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4014/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/1765
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1765/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1765/comments
|
https://api.github.com/repos/ollama/ollama/issues/1765/events
|
https://github.com/ollama/ollama/issues/1765
| 2,063,840,510
|
I_kwDOJ0Z1Ps57A7T-
| 1,765
|
Can't pull .ggml local model
|
{
"login": "reddiamond1234",
"id": 122911466,
"node_id": "U_kgDOB1N66g",
"avatar_url": "https://avatars.githubusercontent.com/u/122911466?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/reddiamond1234",
"html_url": "https://github.com/reddiamond1234",
"followers_url": "https://api.github.com/users/reddiamond1234/followers",
"following_url": "https://api.github.com/users/reddiamond1234/following{/other_user}",
"gists_url": "https://api.github.com/users/reddiamond1234/gists{/gist_id}",
"starred_url": "https://api.github.com/users/reddiamond1234/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/reddiamond1234/subscriptions",
"organizations_url": "https://api.github.com/users/reddiamond1234/orgs",
"repos_url": "https://api.github.com/users/reddiamond1234/repos",
"events_url": "https://api.github.com/users/reddiamond1234/events{/privacy}",
"received_events_url": "https://api.github.com/users/reddiamond1234/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-01-03T11:32:46
| 2024-01-04T08:35:29
| 2024-01-04T08:35:29
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi, I created a Modelfile:
<code>FROM /models/phi-2.Q4_0.gguf
TEMPLATE "[INST] {{ .Prompt }} [/INST]"
PARAMETER temperature 0
PARAMETER num_ctx 2048
PARAMETER num_thread 6
PARAMETER top_k 40
PARAMETER top_p 0.95
</code>
When I use the command <code>ollama create phi2-SC -f ./models/modelfiles/Modelfile</code> to create my custom model, I get this error: Error: pull model manifest: Get "https://v2/models/phi-2.Q4_0.gguf/manifests/latest": dial tcp: lookup v2 on 172.20.80.1:53: server misbehaving
|
{
"login": "reddiamond1234",
"id": 122911466,
"node_id": "U_kgDOB1N66g",
"avatar_url": "https://avatars.githubusercontent.com/u/122911466?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/reddiamond1234",
"html_url": "https://github.com/reddiamond1234",
"followers_url": "https://api.github.com/users/reddiamond1234/followers",
"following_url": "https://api.github.com/users/reddiamond1234/following{/other_user}",
"gists_url": "https://api.github.com/users/reddiamond1234/gists{/gist_id}",
"starred_url": "https://api.github.com/users/reddiamond1234/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/reddiamond1234/subscriptions",
"organizations_url": "https://api.github.com/users/reddiamond1234/orgs",
"repos_url": "https://api.github.com/users/reddiamond1234/repos",
"events_url": "https://api.github.com/users/reddiamond1234/events{/privacy}",
"received_events_url": "https://api.github.com/users/reddiamond1234/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1765/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1765/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3909
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3909/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3909/comments
|
https://api.github.com/repos/ollama/ollama/issues/3909/events
|
https://github.com/ollama/ollama/issues/3909
| 2,263,502,226
|
I_kwDOJ0Z1Ps6G6k2S
| 3,909
|
ollama can not run the custom model (finetune on llama3) on M1 max
|
{
"login": "TobyYang7",
"id": 42986654,
"node_id": "MDQ6VXNlcjQyOTg2NjU0",
"avatar_url": "https://avatars.githubusercontent.com/u/42986654?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TobyYang7",
"html_url": "https://github.com/TobyYang7",
"followers_url": "https://api.github.com/users/TobyYang7/followers",
"following_url": "https://api.github.com/users/TobyYang7/following{/other_user}",
"gists_url": "https://api.github.com/users/TobyYang7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TobyYang7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TobyYang7/subscriptions",
"organizations_url": "https://api.github.com/users/TobyYang7/orgs",
"repos_url": "https://api.github.com/users/TobyYang7/repos",
"events_url": "https://api.github.com/users/TobyYang7/events{/privacy}",
"received_events_url": "https://api.github.com/users/TobyYang7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-04-25T12:47:52
| 2024-10-23T17:48:34
| 2024-10-23T17:48:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
❯ ollama run InsuranceGPT "What is your favourite condiment?"
Error: llama runner process no longer running: -1 error:check_tensor_dims: tensor 'blk.0.attn_k.weight' has wrong shape; expected 4096, 4096, got 4096, 1024, 1, 1
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
_No response_
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3909/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3909/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2371
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2371/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2371/comments
|
https://api.github.com/repos/ollama/ollama/issues/2371/events
|
https://github.com/ollama/ollama/issues/2371
| 2,120,498,216
|
I_kwDOJ0Z1Ps5-ZDwo
| 2,371
|
Documents translation (Japanese)
|
{
"login": "jesseclin",
"id": 34976014,
"node_id": "MDQ6VXNlcjM0OTc2MDE0",
"avatar_url": "https://avatars.githubusercontent.com/u/34976014?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jesseclin",
"html_url": "https://github.com/jesseclin",
"followers_url": "https://api.github.com/users/jesseclin/followers",
"following_url": "https://api.github.com/users/jesseclin/following{/other_user}",
"gists_url": "https://api.github.com/users/jesseclin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jesseclin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jesseclin/subscriptions",
"organizations_url": "https://api.github.com/users/jesseclin/orgs",
"repos_url": "https://api.github.com/users/jesseclin/repos",
"events_url": "https://api.github.com/users/jesseclin/events{/privacy}",
"received_events_url": "https://api.github.com/users/jesseclin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
}
] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-02-06T10:57:27
| 2024-03-12T18:46:36
| 2024-03-12T18:46:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have translated the documentation files into [Japanese ones](https://github.com/jesseclin/ollama/blob/main/README_ja.md) and will keep them updated. Should I submit a PR? or is it better to leave them there?
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2371/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2371/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1479
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1479/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1479/comments
|
https://api.github.com/repos/ollama/ollama/issues/1479/events
|
https://github.com/ollama/ollama/pull/1479
| 2,037,434,289
|
PR_kwDOJ0Z1Ps5hxqI5
| 1,479
|
Fix Readme "Database -> MindsDB" link
|
{
"login": "ruecat",
"id": 79139779,
"node_id": "MDQ6VXNlcjc5MTM5Nzc5",
"avatar_url": "https://avatars.githubusercontent.com/u/79139779?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ruecat",
"html_url": "https://github.com/ruecat",
"followers_url": "https://api.github.com/users/ruecat/followers",
"following_url": "https://api.github.com/users/ruecat/following{/other_user}",
"gists_url": "https://api.github.com/users/ruecat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ruecat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ruecat/subscriptions",
"organizations_url": "https://api.github.com/users/ruecat/orgs",
"repos_url": "https://api.github.com/users/ruecat/repos",
"events_url": "https://api.github.com/users/ruecat/events{/privacy}",
"received_events_url": "https://api.github.com/users/ruecat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-12T10:20:25
| 2023-12-12T15:26:14
| 2023-12-12T15:26:13
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1479",
"html_url": "https://github.com/ollama/ollama/pull/1479",
"diff_url": "https://github.com/ollama/ollama/pull/1479.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1479.patch",
"merged_at": "2023-12-12T15:26:13"
}
|
This pull request fixes markdown ("MindsDB" link in Readme)
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1479/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1479/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4874
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4874/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4874/comments
|
https://api.github.com/repos/ollama/ollama/issues/4874/events
|
https://github.com/ollama/ollama/pull/4874
| 2,338,834,962
|
PR_kwDOJ0Z1Ps5xtn2q
| 4,874
|
Rocm v6 bump
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-06T17:44:29
| 2024-06-15T14:38:35
| 2024-06-15T14:38:32
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4874",
"html_url": "https://github.com/ollama/ollama/pull/4874",
"diff_url": "https://github.com/ollama/ollama/pull/4874.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4874.patch",
"merged_at": "2024-06-15T14:38:32"
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4874/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4874/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5065
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5065/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5065/comments
|
https://api.github.com/repos/ollama/ollama/issues/5065/events
|
https://github.com/ollama/ollama/pull/5065
| 2,354,966,357
|
PR_kwDOJ0Z1Ps5ykU7O
| 5,065
|
README: add llmcord.py extension
|
{
"login": "jakobdylanc",
"id": 38699060,
"node_id": "MDQ6VXNlcjM4Njk5MDYw",
"avatar_url": "https://avatars.githubusercontent.com/u/38699060?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jakobdylanc",
"html_url": "https://github.com/jakobdylanc",
"followers_url": "https://api.github.com/users/jakobdylanc/followers",
"following_url": "https://api.github.com/users/jakobdylanc/following{/other_user}",
"gists_url": "https://api.github.com/users/jakobdylanc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jakobdylanc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jakobdylanc/subscriptions",
"organizations_url": "https://api.github.com/users/jakobdylanc/orgs",
"repos_url": "https://api.github.com/users/jakobdylanc/repos",
"events_url": "https://api.github.com/users/jakobdylanc/events{/privacy}",
"received_events_url": "https://api.github.com/users/jakobdylanc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-15T15:52:59
| 2024-06-26T18:44:58
| 2024-06-26T18:44:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5065",
"html_url": "https://github.com/ollama/ollama/pull/5065",
"diff_url": "https://github.com/ollama/ollama/pull/5065.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5065.patch",
"merged_at": null
}
|
Repo: https://github.com/jakobdylanc/discord-llm-chatbot
|
{
"login": "jakobdylanc",
"id": 38699060,
"node_id": "MDQ6VXNlcjM4Njk5MDYw",
"avatar_url": "https://avatars.githubusercontent.com/u/38699060?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jakobdylanc",
"html_url": "https://github.com/jakobdylanc",
"followers_url": "https://api.github.com/users/jakobdylanc/followers",
"following_url": "https://api.github.com/users/jakobdylanc/following{/other_user}",
"gists_url": "https://api.github.com/users/jakobdylanc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jakobdylanc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jakobdylanc/subscriptions",
"organizations_url": "https://api.github.com/users/jakobdylanc/orgs",
"repos_url": "https://api.github.com/users/jakobdylanc/repos",
"events_url": "https://api.github.com/users/jakobdylanc/events{/privacy}",
"received_events_url": "https://api.github.com/users/jakobdylanc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5065/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5065/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4661
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4661/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4661/comments
|
https://api.github.com/repos/ollama/ollama/issues/4661/events
|
https://github.com/ollama/ollama/pull/4661
| 2,318,967,829
|
PR_kwDOJ0Z1Ps5wp16V
| 4,661
|
llm/server.go: Fix 2 minor typos
|
{
"login": "coolljt0725",
"id": 8232360,
"node_id": "MDQ6VXNlcjgyMzIzNjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8232360?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coolljt0725",
"html_url": "https://github.com/coolljt0725",
"followers_url": "https://api.github.com/users/coolljt0725/followers",
"following_url": "https://api.github.com/users/coolljt0725/following{/other_user}",
"gists_url": "https://api.github.com/users/coolljt0725/gists{/gist_id}",
"starred_url": "https://api.github.com/users/coolljt0725/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coolljt0725/subscriptions",
"organizations_url": "https://api.github.com/users/coolljt0725/orgs",
"repos_url": "https://api.github.com/users/coolljt0725/repos",
"events_url": "https://api.github.com/users/coolljt0725/events{/privacy}",
"received_events_url": "https://api.github.com/users/coolljt0725/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-27T11:48:32
| 2024-05-28T02:25:25
| 2024-05-28T00:21:10
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4661",
"html_url": "https://github.com/ollama/ollama/pull/4661",
"diff_url": "https://github.com/ollama/ollama/pull/4661.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4661.patch",
"merged_at": "2024-05-28T00:21:10"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4661/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4661/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/764
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/764/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/764/comments
|
https://api.github.com/repos/ollama/ollama/issues/764/events
|
https://github.com/ollama/ollama/issues/764
| 1,939,699,988
|
I_kwDOJ0Z1Ps5znXkU
| 764
|
How to multi threading with api << python >>
|
{
"login": "missandi",
"id": 90961639,
"node_id": "MDQ6VXNlcjkwOTYxNjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/90961639?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/missandi",
"html_url": "https://github.com/missandi",
"followers_url": "https://api.github.com/users/missandi/followers",
"following_url": "https://api.github.com/users/missandi/following{/other_user}",
"gists_url": "https://api.github.com/users/missandi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/missandi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/missandi/subscriptions",
"organizations_url": "https://api.github.com/users/missandi/orgs",
"repos_url": "https://api.github.com/users/missandi/repos",
"events_url": "https://api.github.com/users/missandi/events{/privacy}",
"received_events_url": "https://api.github.com/users/missandi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 4
| 2023-10-12T10:35:22
| 2023-12-22T03:35:54
| 2023-12-22T03:35:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
```python
import json
import requests

BASE_URL = "http://localhost:11434"  # assumed Ollama default endpoint

def generate(model_name, prompt, system=None, template=None, context=None, options=None, callback=None):
    try:
        url = f"{BASE_URL}/api/generate"
        payload = {
            "model": model_name,
            "prompt": prompt,
            "system": system,
            "template": template,
            "context": context,
            "options": options,
        }
        # Remove keys with None values
        payload = {k: v for k, v in payload.items() if v is not None}
        with requests.post(url, json=payload, stream=True) as response:
            response.raise_for_status()
            # Holds the context history from the final chunk
            final_context = None
            # Holds the concatenated response text if no callback is provided
            full_response = ""
            # Iterate over the streamed response line by line
            for line in response.iter_lines():
                if line:
                    # Each line is one JSON chunk
                    chunk = json.loads(line)
                    # If a callback function is provided, call it with the chunk
                    if callback:
                        callback(chunk)
                    else:
                        # Until the last chunk, accumulate and print the "response" field
                        if not chunk.get("done"):
                            response_piece = chunk.get("response", "")
                            full_response += response_piece
                            print(response_piece, end="", flush=True)
                    # The last chunk (done is true) carries the context
                    if chunk.get("done"):
                        final_context = chunk.get("context")
            # Return the full response and the final context
            return full_response, final_context
    except requests.exceptions.RequestException as e:
        print(f"An error occurred: {e}")
        return None, None
```
I am currently using this function to call the API locally from Python; however, I have observed that the performance is notably slow. As a potential solution, I am considering multi-threading to run multiple requests simultaneously. I would greatly appreciate any assistance or recommendations in this regard. Thank you sincerely for your support.
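A minimal sketch of the fan-out pattern the question above describes, using `concurrent.futures` from the standard library. The stand-in worker keeps the sketch self-contained; in practice it would be a call like `generate(model_name, prompt)`. Note that parallel requests only help if the server can actually serve them concurrently (e.g. via `OLLAMA_NUM_PARALLEL` in newer Ollama versions).

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_concurrently(worker, prompts, max_workers=4):
    """Run worker(prompt) for each prompt in parallel threads, keeping input order."""
    results = [None] * len(prompts)
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # Map each future back to the index of the prompt that produced it
        future_to_index = {pool.submit(worker, p): i for i, p in enumerate(prompts)}
        for future in as_completed(future_to_index):
            results[future_to_index[future]] = future.result()
    return results

if __name__ == "__main__":
    # Stand-in worker so the sketch runs offline; swap in a real API call.
    print(run_concurrently(lambda p: p.upper(), ["one", "two", "three"]))
    # → ['ONE', 'TWO', 'THREE']
```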
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/764/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/764/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2700
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2700/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2700/comments
|
https://api.github.com/repos/ollama/ollama/issues/2700/events
|
https://github.com/ollama/ollama/pull/2700
| 2,150,422,951
|
PR_kwDOJ0Z1Ps5nuEmb
| 2,700
|
Add clear history cli cmd
|
{
"login": "halfnibble",
"id": 5139752,
"node_id": "MDQ6VXNlcjUxMzk3NTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/5139752?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/halfnibble",
"html_url": "https://github.com/halfnibble",
"followers_url": "https://api.github.com/users/halfnibble/followers",
"following_url": "https://api.github.com/users/halfnibble/following{/other_user}",
"gists_url": "https://api.github.com/users/halfnibble/gists{/gist_id}",
"starred_url": "https://api.github.com/users/halfnibble/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/halfnibble/subscriptions",
"organizations_url": "https://api.github.com/users/halfnibble/orgs",
"repos_url": "https://api.github.com/users/halfnibble/repos",
"events_url": "https://api.github.com/users/halfnibble/events{/privacy}",
"received_events_url": "https://api.github.com/users/halfnibble/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-02-23T06:03:24
| 2024-09-05T02:36:35
| 2024-09-05T02:36:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2700",
"html_url": "https://github.com/ollama/ollama/pull/2700",
"diff_url": "https://github.com/ollama/ollama/pull/2700.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2700.patch",
"merged_at": null
}
|
After performing content safety testing on various models, I realized it would be nice to clear the history.
Not sure if this warrants an api handler func like the other handlers?
Also, I thought about abstracting the history path code.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2700/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2700/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6474
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6474/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6474/comments
|
https://api.github.com/repos/ollama/ollama/issues/6474/events
|
https://github.com/ollama/ollama/issues/6474
| 2,482,875,247
|
I_kwDOJ0Z1Ps6T_atv
| 6,474
|
Phi3.5 broken behaviour
|
{
"login": "derluke",
"id": 6739699,
"node_id": "MDQ6VXNlcjY3Mzk2OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6739699?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/derluke",
"html_url": "https://github.com/derluke",
"followers_url": "https://api.github.com/users/derluke/followers",
"following_url": "https://api.github.com/users/derluke/following{/other_user}",
"gists_url": "https://api.github.com/users/derluke/gists{/gist_id}",
"starred_url": "https://api.github.com/users/derluke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/derluke/subscriptions",
"organizations_url": "https://api.github.com/users/derluke/orgs",
"repos_url": "https://api.github.com/users/derluke/repos",
"events_url": "https://api.github.com/users/derluke/events{/privacy}",
"received_events_url": "https://api.github.com/users/derluke/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-08-23T10:51:29
| 2024-08-28T08:04:57
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
As mentioned by a few others in https://github.com/ollama/ollama/issues/6449 the phi3.5 models never stop responding and quickly become nonsensical
example:

### OS
WSL2
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.3.6
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6474/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6474/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3399
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3399/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3399/comments
|
https://api.github.com/repos/ollama/ollama/issues/3399/events
|
https://github.com/ollama/ollama/issues/3399
| 2,214,315,153
|
I_kwDOJ0Z1Ps6D-8SR
| 3,399
|
New Model: "Jamba" (Production Grade Mamba by ai21)
|
{
"login": "Marviel",
"id": 2037165,
"node_id": "MDQ6VXNlcjIwMzcxNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/2037165?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Marviel",
"html_url": "https://github.com/Marviel",
"followers_url": "https://api.github.com/users/Marviel/followers",
"following_url": "https://api.github.com/users/Marviel/following{/other_user}",
"gists_url": "https://api.github.com/users/Marviel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Marviel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Marviel/subscriptions",
"organizations_url": "https://api.github.com/users/Marviel/orgs",
"repos_url": "https://api.github.com/users/Marviel/repos",
"events_url": "https://api.github.com/users/Marviel/events{/privacy}",
"received_events_url": "https://api.github.com/users/Marviel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 2
| 2024-03-28T23:11:14
| 2024-05-02T06:30:00
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What model would you like?
https://www.maginative.com/article/ai21-labs-unveils-jamba-the-first-production-grade-mamba-based-ai-model/
https://huggingface.co/ai21labs/Jamba-v0.1?ref=maginative.com
------
Thanks for the excellent software :)
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3399/reactions",
"total_count": 16,
"+1": 10,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 6
}
|
https://api.github.com/repos/ollama/ollama/issues/3399/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1209
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1209/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1209/comments
|
https://api.github.com/repos/ollama/ollama/issues/1209/events
|
https://github.com/ollama/ollama/issues/1209
| 2,002,508,046
|
I_kwDOJ0Z1Ps53W9kO
| 1,209
|
Stuck on verifying sha256 digest
|
{
"login": "lelehier",
"id": 106826977,
"node_id": "U_kgDOBl4M4Q",
"avatar_url": "https://avatars.githubusercontent.com/u/106826977?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lelehier",
"html_url": "https://github.com/lelehier",
"followers_url": "https://api.github.com/users/lelehier/followers",
"following_url": "https://api.github.com/users/lelehier/following{/other_user}",
"gists_url": "https://api.github.com/users/lelehier/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lelehier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lelehier/subscriptions",
"organizations_url": "https://api.github.com/users/lelehier/orgs",
"repos_url": "https://api.github.com/users/lelehier/repos",
"events_url": "https://api.github.com/users/lelehier/events{/privacy}",
"received_events_url": "https://api.github.com/users/lelehier/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2023-11-20T15:53:32
| 2024-11-07T14:35:56
| 2023-12-05T00:02:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
So I installed Ollama via Docker. The first few models pulled flawlessly, but at some point Ollama got stuck at the sha256 verifying stage. So I tried to set up a new Docker container. The first model worked without issues. The second one I tried got stuck at the sha256 step.
Now I have installed Ollama with the provided script on my Ubuntu installation, where I can't even download a single model without getting stuck at exactly the same step.
Logs don't show any useful information, except some EOF errors.
Does someone have the same issue, or better yet a solution to this?
Logs:
`Nov 20 14:49:59 escapepod ollama[704571]: 2023/11/20 14:49:59 images.go:799: total blobs: 0
Nov 20 14:49:59 escapepod ollama[704571]: 2023/11/20 14:49:59 images.go:806: total unused blobs removed: 0
Nov 20 14:49:59 escapepod ollama[704571]: 2023/11/20 14:49:59 routes.go:777: Listening on [127.0.0.1:11434]
Nov 20 14:49:58 escapepod systemd[1]: Started Ollama Service.
Nov 20 15:01:54 escapepod ollama[704571]: [GIN] 2023/11/20 - 15:01:54 | 200 | 30.508µs | 127.0.0.1 | HEAD "/"
Nov 20 15:01:57 escapepod ollama[704571]: 2023/11/20 15:01:57 download.go:122: downloading 5b2b5f73b685 in 94 256.0 MB part(s)
Nov 20 15:02:49 escapepod ollama[704571]: 2023/11/20 15:02:49 download.go:161: 5b2b5f73b685 part 46 attempt 0 failed: unexpected EOF, r>
Nov 20 15:03:28 escapepod ollama[704571]: 2023/11/20 15:03:28 download.go:161: 5b2b5f73b685 part 50 attempt 0 failed: unexpected EOF, r>
Nov 20 15:05:58 escapepod ollama[704571]: 2023/11/20 15:05:58 download.go:161: 5b2b5f73b685 part 27 attempt 0 failed: unexpected EOF, r>
Nov 20 15:12:20 escapepod ollama[704571]: 2023/11/20 15:12:20 download.go:165: 5b2b5f73b685 part 27 completed after 1 retries
Nov 20 15:14:35 escapepod ollama[704571]: 2023/11/20 15:14:35 download.go:161: 5b2b5f73b685 part 25 attempt 0 failed: unexpected EOF, r>
Nov 20 15:21:30 escapepod ollama[704571]: 2023/11/20 15:21:30 download.go:165: 5b2b5f73b685 part 50 completed after 1 retries
Nov 20 15:24:12 escapepod ollama[704571]: 2023/11/20 15:24:12 download.go:165: 5b2b5f73b685 part 46 completed after 1 retries
Nov 20 15:24:43 escapepod ollama[704571]: 2023/11/20 15:24:43 download.go:165: 5b2b5f73b685 part 25 completed after 1 retries
Nov 20 15:33:08 escapepod ollama[704571]: 2023/11/20 15:33:08 download.go:122: downloading 4dec76bb1a47 in 1 45 B part(s)
Nov 20 15:33:16 escapepod ollama[704571]: 2023/11/20 15:33:16 download.go:122: downloading 0644cce03f93 in 1 31 B part(s)
Nov 20 15:33:29 escapepod ollama[704571]: 2023/11/20 15:33:29 download.go:122: downloading af28e61681a8 in 1 383 B part(s)`
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1209/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1209/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1050
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1050/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1050/comments
|
https://api.github.com/repos/ollama/ollama/issues/1050/events
|
https://github.com/ollama/ollama/issues/1050
| 1,984,591,996
|
I_kwDOJ0Z1Ps52Snh8
| 1,050
|
default codellama web server
|
{
"login": "kritma",
"id": 127416565,
"node_id": "U_kgDOB5g49Q",
"avatar_url": "https://avatars.githubusercontent.com/u/127416565?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kritma",
"html_url": "https://github.com/kritma",
"followers_url": "https://api.github.com/users/kritma/followers",
"following_url": "https://api.github.com/users/kritma/following{/other_user}",
"gists_url": "https://api.github.com/users/kritma/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kritma/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kritma/subscriptions",
"organizations_url": "https://api.github.com/users/kritma/orgs",
"repos_url": "https://api.github.com/users/kritma/repos",
"events_url": "https://api.github.com/users/kritma/events{/privacy}",
"received_events_url": "https://api.github.com/users/kritma/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-09T00:07:15
| 2023-12-08T23:50:30
| 2023-12-04T23:13:48
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
#### `ollama run codellama` starts a web server which listens on port 60263 (or a dynamic port, I'm not sure)
* Is this OK?
* How can I change this port?
<img width="754" alt="image" src="https://github.com/jmorganca/ollama/assets/127416565/d0a688a5-2317-404b-8d71-f9b5b9bdb603">
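For reference, the main Ollama API listens on port 11434 by default and honors the `OLLAMA_HOST` environment variable; the high dynamic port in the screenshot belongs to the internal model runner. A small hypothetical helper (the name `ollama_base_url` is mine, not part of any library) sketching how a client might resolve the base URL:

```python
import os

def ollama_base_url():
    # Default matches Ollama's documented API address; OLLAMA_HOST overrides it.
    host = os.environ.get("OLLAMA_HOST", "127.0.0.1:11434")
    if "://" not in host:
        host = "http://" + host
    return host

if __name__ == "__main__":
    os.environ["OLLAMA_HOST"] = "127.0.0.1:8080"
    print(ollama_base_url())  # → http://127.0.0.1:8080
```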
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1050/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1050/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7338
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7338/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7338/comments
|
https://api.github.com/repos/ollama/ollama/issues/7338/events
|
https://github.com/ollama/ollama/pull/7338
| 2,609,992,917
|
PR_kwDOJ0Z1Ps5_sD3Z
| 7,338
|
Better test and handle Unicode
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-23T22:53:51
| 2024-10-29T01:12:31
| 2024-10-29T01:12:29
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7338",
"html_url": "https://github.com/ollama/ollama/pull/7338",
"diff_url": "https://github.com/ollama/ollama/pull/7338.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7338.patch",
"merged_at": "2024-10-29T01:12:29"
}
|
Recent releases have hit Unicode bugs, which should be better tested. In addition, when we do have failures, we should handle them more gracefully.
This test currently fails on Windows (due to #7311) and passes on other platforms. Will hold this patch until the one fixing that is merged.
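As background on the kind of failure being tested: a common Unicode bug in streamed output is a multi-byte UTF-8 sequence being split across two chunks. A sketch of graceful handling in Python using the standard library's incremental decoder (illustrative only; Ollama's actual implementation is in Go):

```python
import codecs

# "é" is encoded as the two bytes 0xC3 0xA9; here it arrives split across chunks.
chunks = [b"caf\xc3", b"\xa9 au lait"]

# The incremental decoder buffers the dangling lead byte until its continuation arrives.
decoder = codecs.getincrementaldecoder("utf-8")(errors="replace")
text = "".join(decoder.decode(chunk) for chunk in chunks)
print(text)  # → café au lait
```

Decoding each chunk independently with `bytes.decode` would instead emit a replacement character at the split point, which is exactly the ungraceful behavior a streaming client wants to avoid.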
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7338/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7338/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7344
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7344/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7344/comments
|
https://api.github.com/repos/ollama/ollama/issues/7344/events
|
https://github.com/ollama/ollama/issues/7344
| 2,612,026,456
|
I_kwDOJ0Z1Ps6bsFxY
| 7,344
|
after some time idle / phone standby , getting to the termux ollama run cmd makes it restart the dl from 0
|
{
"login": "fxmbsw7",
"id": 39368685,
"node_id": "MDQ6VXNlcjM5MzY4Njg1",
"avatar_url": "https://avatars.githubusercontent.com/u/39368685?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fxmbsw7",
"html_url": "https://github.com/fxmbsw7",
"followers_url": "https://api.github.com/users/fxmbsw7/followers",
"following_url": "https://api.github.com/users/fxmbsw7/following{/other_user}",
"gists_url": "https://api.github.com/users/fxmbsw7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fxmbsw7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fxmbsw7/subscriptions",
"organizations_url": "https://api.github.com/users/fxmbsw7/orgs",
"repos_url": "https://api.github.com/users/fxmbsw7/repos",
"events_url": "https://api.github.com/users/fxmbsw7/events{/privacy}",
"received_events_url": "https://api.github.com/users/fxmbsw7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
open
| false
| null |
[] | null | 10
| 2024-10-24T16:07:31
| 2024-12-06T09:32:29
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
So I know Ollama can resume downloads,
but the following issue has now happened to me twice, on different model downloads.
I run `ollama run model`
and it downloads...
I can switch apps and switch back to Termux/Ollama with no problem,
but after some screen-off time I return to Termux and see it has just begun from 0 again...
### OS
Linux
### GPU
Other
### CPU
Other
### Ollama version
0.3.14
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7344/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7344/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7916
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7916/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7916/comments
|
https://api.github.com/repos/ollama/ollama/issues/7916/events
|
https://github.com/ollama/ollama/issues/7916
| 2,715,088,299
|
I_kwDOJ0Z1Ps6h1PWr
| 7,916
|
Develop a Qt QML Client for Ollama
|
{
"login": "ebrahimi1989",
"id": 19800872,
"node_id": "MDQ6VXNlcjE5ODAwODcy",
"avatar_url": "https://avatars.githubusercontent.com/u/19800872?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ebrahimi1989",
"html_url": "https://github.com/ebrahimi1989",
"followers_url": "https://api.github.com/users/ebrahimi1989/followers",
"following_url": "https://api.github.com/users/ebrahimi1989/following{/other_user}",
"gists_url": "https://api.github.com/users/ebrahimi1989/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ebrahimi1989/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ebrahimi1989/subscriptions",
"organizations_url": "https://api.github.com/users/ebrahimi1989/orgs",
"repos_url": "https://api.github.com/users/ebrahimi1989/repos",
"events_url": "https://api.github.com/users/ebrahimi1989/events{/privacy}",
"received_events_url": "https://api.github.com/users/ebrahimi1989/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-12-03T13:50:04
| 2024-12-14T15:38:41
| 2024-12-14T15:38:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
## Description
I would like to request the development of a **Qt QML client for Ollama**. This client would provide a cross-platform, user-friendly graphical interface to interact with Ollama's API and manage local AI models. Qt QML is an excellent choice for creating visually appealing and highly responsive user interfaces, making it ideal for building such a client.
---
## Key Features Requested
1. **Integration with Ollama API:**
- Connect to Ollama’s API for managing models, querying, and retrieving results.
2. **Cross-Platform Support:**
- Develop the client to run seamlessly on major platforms, including **Linux**, **Windows**, and **macOS**.
3. **Interactive UI:**
- Display real-time interaction with AI models, including chat-style interfaces or visualization of model outputs.
4. **Model Management:**
- Allow users to manage, run, and query locally hosted AI models.
5. **Customizable Settings:**
- Provide users with options to adjust settings such as model preferences, API keys, and performance configurations.
---
## Why Qt QML?
**Qt QML** is a modern, declarative framework ideal for creating dynamic and fluid UIs. It allows developers to build high-performance applications with responsive designs, making it an excellent choice for an Ollama client. Additionally, Qt's cross-platform nature would ensure the tool is accessible to a wide audience.
---
## Benefits of This Feature
- **Improved Accessibility:** A graphical client will make it easier for non-technical users to interact with Ollama.
- **Enhanced Productivity:** Developers can benefit from an intuitive interface for managing models without relying on CLI.
- **Cross-Platform Reach:** A Qt-based client can cater to users across **Linux**, **Windows**, and **macOS** platforms.
---
## Additional Context
I am a developer with experience in Qt and QML, and I am willing to contribute to this project. If the maintainers approve this request, I can provide initial implementations or collaborate on the development process.
---
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7916/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7916/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4375
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4375/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4375/comments
|
https://api.github.com/repos/ollama/ollama/issues/4375/events
|
https://github.com/ollama/ollama/issues/4375
| 2,291,338,407
|
I_kwDOJ0Z1Ps6Ikwyn
| 4,375
|
Model Request: IBM Granite
|
{
"login": "Fix3dll",
"id": 10743391,
"node_id": "MDQ6VXNlcjEwNzQzMzkx",
"avatar_url": "https://avatars.githubusercontent.com/u/10743391?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Fix3dll",
"html_url": "https://github.com/Fix3dll",
"followers_url": "https://api.github.com/users/Fix3dll/followers",
"following_url": "https://api.github.com/users/Fix3dll/following{/other_user}",
"gists_url": "https://api.github.com/users/Fix3dll/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Fix3dll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Fix3dll/subscriptions",
"organizations_url": "https://api.github.com/users/Fix3dll/orgs",
"repos_url": "https://api.github.com/users/Fix3dll/repos",
"events_url": "https://api.github.com/users/Fix3dll/events{/privacy}",
"received_events_url": "https://api.github.com/users/Fix3dll/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-05-12T13:39:07
| 2024-05-12T13:42:01
| 2024-05-12T13:42:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/ibm-granite
|
{
"login": "Fix3dll",
"id": 10743391,
"node_id": "MDQ6VXNlcjEwNzQzMzkx",
"avatar_url": "https://avatars.githubusercontent.com/u/10743391?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Fix3dll",
"html_url": "https://github.com/Fix3dll",
"followers_url": "https://api.github.com/users/Fix3dll/followers",
"following_url": "https://api.github.com/users/Fix3dll/following{/other_user}",
"gists_url": "https://api.github.com/users/Fix3dll/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Fix3dll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Fix3dll/subscriptions",
"organizations_url": "https://api.github.com/users/Fix3dll/orgs",
"repos_url": "https://api.github.com/users/Fix3dll/repos",
"events_url": "https://api.github.com/users/Fix3dll/events{/privacy}",
"received_events_url": "https://api.github.com/users/Fix3dll/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4375/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4375/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8575
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8575/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8575/comments
|
https://api.github.com/repos/ollama/ollama/issues/8575/events
|
https://github.com/ollama/ollama/issues/8575
| 2,810,789,793
|
I_kwDOJ0Z1Ps6niT-h
| 8,575
|
MiniCPM-o-2_6
|
{
"login": "enryteam",
"id": 20081090,
"node_id": "MDQ6VXNlcjIwMDgxMDkw",
"avatar_url": "https://avatars.githubusercontent.com/u/20081090?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/enryteam",
"html_url": "https://github.com/enryteam",
"followers_url": "https://api.github.com/users/enryteam/followers",
"following_url": "https://api.github.com/users/enryteam/following{/other_user}",
"gists_url": "https://api.github.com/users/enryteam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/enryteam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/enryteam/subscriptions",
"organizations_url": "https://api.github.com/users/enryteam/orgs",
"repos_url": "https://api.github.com/users/enryteam/repos",
"events_url": "https://api.github.com/users/enryteam/events{/privacy}",
"received_events_url": "https://api.github.com/users/enryteam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-25T05:51:08
| 2025-01-25T08:39:07
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://hf-mirror.com/openbmb/MiniCPM-o-2_6
thanks.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8575/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8575/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2031
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2031/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2031/comments
|
https://api.github.com/repos/ollama/ollama/issues/2031/events
|
https://github.com/ollama/ollama/issues/2031
| 2,086,054,577
|
I_kwDOJ0Z1Ps58Vqqx
| 2,031
|
Is the Ollama.app necessary after installation
|
{
"login": "LeonardoGentile",
"id": 412061,
"node_id": "MDQ6VXNlcjQxMjA2MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/412061?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LeonardoGentile",
"html_url": "https://github.com/LeonardoGentile",
"followers_url": "https://api.github.com/users/LeonardoGentile/followers",
"following_url": "https://api.github.com/users/LeonardoGentile/following{/other_user}",
"gists_url": "https://api.github.com/users/LeonardoGentile/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LeonardoGentile/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LeonardoGentile/subscriptions",
"organizations_url": "https://api.github.com/users/LeonardoGentile/orgs",
"repos_url": "https://api.github.com/users/LeonardoGentile/repos",
"events_url": "https://api.github.com/users/LeonardoGentile/events{/privacy}",
"received_events_url": "https://api.github.com/users/LeonardoGentile/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2024-01-17T12:08:02
| 2024-01-27T00:40:18
| 2024-01-27T00:40:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I was unsure what the Ollama.app was installing on macOS, but after it finished I realized ollama is installed under `/usr/local/bin/ollama`, which I could have done using brew or a similar installation process.
I've realized my models are under `~/.ollama`, so my question is: is the `Ollama.app` still necessary, or was it only needed to install the binary? If I remove it, will everything keep working as before when I call ollama from the command line?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2031/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2031/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1451
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1451/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1451/comments
|
https://api.github.com/repos/ollama/ollama/issues/1451/events
|
https://github.com/ollama/ollama/issues/1451
| 2,034,225,058
|
I_kwDOJ0Z1Ps55P8-i
| 1,451
|
[FEAT] One directory to model them all
|
{
"login": "kfsone",
"id": 323009,
"node_id": "MDQ6VXNlcjMyMzAwOQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/323009?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kfsone",
"html_url": "https://github.com/kfsone",
"followers_url": "https://api.github.com/users/kfsone/followers",
"following_url": "https://api.github.com/users/kfsone/following{/other_user}",
"gists_url": "https://api.github.com/users/kfsone/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kfsone/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kfsone/subscriptions",
"organizations_url": "https://api.github.com/users/kfsone/orgs",
"repos_url": "https://api.github.com/users/kfsone/repos",
"events_url": "https://api.github.com/users/kfsone/events{/privacy}",
"received_events_url": "https://api.github.com/users/kfsone/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-12-10T05:21:39
| 2023-12-19T19:37:17
| 2023-12-19T19:37:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Please consider adding a way to allow Ollama to share models with other resources/tools. Either by allowing a "models dir" config setting/option somewhere, or a modelmap.yaml file:
```
mistral-7b-instruct:
  presents-as: Mistral-7B-Instruct-v0.1
  folder: /opt/ai/models/TheBloke/Mistral-7B-Instruct-v01-GGUF  # optional
  files:
    - tag: Q5_K_M
      file: mistral-7b-instruct-v0.1.Q5_K_M.gguf
claude2:
  file: /opt/ai/models/TheBloke/claude2-alpaca-13B-GGUF/claude2-alpaca-13b.Q5_K_M.gguf
```
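A resolver for such a map could be sketched as follows. This is purely hypothetical (not part of Ollama); the map is shown as a plain in-memory dict rather than parsed YAML, and the entry shapes mirror the proposal above:

```python
from pathlib import PurePosixPath

# Hypothetical in-memory form of the proposed modelmap.yaml
MODEL_MAP = {
    "mistral-7b-instruct": {
        "folder": "/opt/ai/models/TheBloke/Mistral-7B-Instruct-v01-GGUF",
        "files": {"Q5_K_M": "mistral-7b-instruct-v0.1.Q5_K_M.gguf"},
    },
    "claude2": {
        "file": "/opt/ai/models/TheBloke/claude2-alpaca-13B-GGUF/claude2-alpaca-13b.Q5_K_M.gguf",
    },
}

def resolve(name: str, tag: str = "latest") -> str:
    """Return the GGUF path for model[:tag] according to the proposed map."""
    entry = MODEL_MAP[name]
    if "file" in entry:
        # Single-file entry: the tag is irrelevant
        return entry["file"]
    # Multi-file entry: the tag selects a quantization under the folder
    filename = entry["files"][tag]
    return str(PurePosixPath(entry["folder"]) / filename)

print(resolve("mistral-7b-instruct", "Q5_K_M"))
print(resolve("claude2"))
```

The key design point is that existing GGUF files stay where other tools put them; the map only records names and locations.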
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1451/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1451/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4108
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4108/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4108/comments
|
https://api.github.com/repos/ollama/ollama/issues/4108/events
|
https://github.com/ollama/ollama/pull/4108
| 2,276,572,087
|
PR_kwDOJ0Z1Ps5ualdJ
| 4,108
|
fix line ending
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-02T21:54:18
| 2024-05-02T21:55:19
| 2024-05-02T21:55:15
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4108",
"html_url": "https://github.com/ollama/ollama/pull/4108",
"diff_url": "https://github.com/ollama/ollama/pull/4108.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4108.patch",
"merged_at": "2024-05-02T21:55:15"
}
|
Replace CRLF with LF.
CRLF leaves the file in a perpetually dirty state on non-Windows systems, with no way to reset it.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4108/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4108/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/561
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/561/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/561/comments
|
https://api.github.com/repos/ollama/ollama/issues/561/events
|
https://github.com/ollama/ollama/issues/561
| 1,905,778,006
|
I_kwDOJ0Z1Ps5xl91W
| 561
|
Unexpected EOF with Falcon:40b
|
{
"login": "henry-prince-addepar",
"id": 80268918,
"node_id": "MDQ6VXNlcjgwMjY4OTE4",
"avatar_url": "https://avatars.githubusercontent.com/u/80268918?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/henry-prince-addepar",
"html_url": "https://github.com/henry-prince-addepar",
"followers_url": "https://api.github.com/users/henry-prince-addepar/followers",
"following_url": "https://api.github.com/users/henry-prince-addepar/following{/other_user}",
"gists_url": "https://api.github.com/users/henry-prince-addepar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/henry-prince-addepar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/henry-prince-addepar/subscriptions",
"organizations_url": "https://api.github.com/users/henry-prince-addepar/orgs",
"repos_url": "https://api.github.com/users/henry-prince-addepar/repos",
"events_url": "https://api.github.com/users/henry-prince-addepar/events{/privacy}",
"received_events_url": "https://api.github.com/users/henry-prince-addepar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-09-20T22:03:10
| 2023-10-22T06:18:35
| 2023-09-23T18:37:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm getting an error from `falcon:40b`. Any help would be greatly appreciated. I'm currently running macOS 13.5.2 (22G91) on an M1 Max with 32 GB of RAM. Thanks in advance!
```
➜ ~ ollama pull falcon:40b
pulling manifest
pulling a4a6e73500b0... 100% |██████████████████████████████████████████████████████████████████████████████████████████| (24/24 GB, 12 TB/s)
pulling d5311aab7c4c... 100% |██████████████████████████████████████████████████████████████████████████████████████████| (84/84 B, 103 kB/s)
pulling 0740207dce29... 100% |████████████████████████████████████████████████████████████████████████████████████████| (307/307 B, 3.9 MB/s)
verifying sha256 digest
writing manifest
removing any unused layers
success
➜ ~ ollama run falcon:40b
>>> Why is the sky blue?
Error: error reading llm response: unexpected EOF
➜ ~ ollama run falcon:40b
>>> Hi. This is a test.
Error: error reading llm response: unexpected EOF
```
|
{
"login": "henry-prince-addepar",
"id": 80268918,
"node_id": "MDQ6VXNlcjgwMjY4OTE4",
"avatar_url": "https://avatars.githubusercontent.com/u/80268918?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/henry-prince-addepar",
"html_url": "https://github.com/henry-prince-addepar",
"followers_url": "https://api.github.com/users/henry-prince-addepar/followers",
"following_url": "https://api.github.com/users/henry-prince-addepar/following{/other_user}",
"gists_url": "https://api.github.com/users/henry-prince-addepar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/henry-prince-addepar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/henry-prince-addepar/subscriptions",
"organizations_url": "https://api.github.com/users/henry-prince-addepar/orgs",
"repos_url": "https://api.github.com/users/henry-prince-addepar/repos",
"events_url": "https://api.github.com/users/henry-prince-addepar/events{/privacy}",
"received_events_url": "https://api.github.com/users/henry-prince-addepar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/561/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/561/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1248
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1248/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1248/comments
|
https://api.github.com/repos/ollama/ollama/issues/1248/events
|
https://github.com/ollama/ollama/issues/1248
| 2,007,179,277
|
I_kwDOJ0Z1Ps53oyAN
| 1,248
|
v0.1.11 Crashes on Intel Mac
|
{
"login": "10REMSSeiller",
"id": 20466077,
"node_id": "MDQ6VXNlcjIwNDY2MDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/20466077?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/10REMSSeiller",
"html_url": "https://github.com/10REMSSeiller",
"followers_url": "https://api.github.com/users/10REMSSeiller/followers",
"following_url": "https://api.github.com/users/10REMSSeiller/following{/other_user}",
"gists_url": "https://api.github.com/users/10REMSSeiller/gists{/gist_id}",
"starred_url": "https://api.github.com/users/10REMSSeiller/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/10REMSSeiller/subscriptions",
"organizations_url": "https://api.github.com/users/10REMSSeiller/orgs",
"repos_url": "https://api.github.com/users/10REMSSeiller/repos",
"events_url": "https://api.github.com/users/10REMSSeiller/events{/privacy}",
"received_events_url": "https://api.github.com/users/10REMSSeiller/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2023-11-22T22:09:07
| 2023-11-27T06:06:05
| 2023-11-27T06:06:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
v0.1.9 ran successfully on my Mac, but v0.1.11 causes a crash. I'm not sure why. Below is an excerpt of the crash log.
I was able to revert and run v0.1.9.
For verification, I trashed the original ~/.ollama and Application Support folders and reinstalled v0.1.11. Same results. What other info is needed?
> Process: ollama-runner [1470]
> Path: /private/var/folders/*/ollama-runner
> Version: ???
> Code Type: X86-64 (Native)
> Parent Process: ollama [697]
> Time Awake Since Boot: 960 seconds
> System Integrity Protection: enabled
> Crashed Thread: 0 Dispatch queue: com.apple.main-thread
> Exception Type: EXC_BAD_INSTRUCTION (SIGILL)
> Exception Codes: 0x0000000000000001, 0x0000000000000000
> Exception Note: EXC_CORPSE_NOTIFY
> Termination Reason: Namespace SIGNAL, Code 4 Illegal instruction: 4
> Terminating Process: exc handler [1470]
>
> Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
> 0 ollama-runner 0x105fc05e8 nlohmann::json_abi_v3_11_2::basic_json<nlohmann::json_abi_v3_11_2::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_11_2::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char> > >::dump(int, char, bool, nlohmann::json_abi_v3_11_2::detail::error_handler_t) const + 424
> 1 ollama-runner 0x105fbc2ba server_log(char const*, char const*, int, char const*, nlohmann::json_abi_v3_11_2::basic_json<nlohmann::json_abi_v3_11_2::ordered_map, std::__1::vector, std::__1::basic_string<char, std::__1::char_traits<char>, std::__1::allocator<char> >, bool, long long, unsigned long long, double, std::__1::allocator, nlohmann::json_abi_v3_11_2::adl_serializer, std::__1::vector<unsigned char, std::__1::allocator<unsigned char> > > const&) + 1114
> 2 ollama-runner 0x105fb952d main + 6349
> 3 dyld 0x10c79d52e start + 462
Below is the fresh server.log
> 2023/11/22 13:44:28 images.go:779: total blobs: 0
> 2023/11/22 13:44:28 images.go:786: total unused blobs removed: 0
> 2023/11/22 13:44:28 routes.go:777: Listening on 127.0.0.1:11434 (version 0.1.11)
> [GIN] 2023/11/22 - 13:48:13 | 200 | 1.315832ms | 127.0.0.1 | HEAD "/"
> [GIN] 2023/11/22 - 13:48:13 | 404 | 5.195104ms | 127.0.0.1 | POST "/api/show"
> 2023/11/22 13:48:16 download.go:123: downloading 22f7f8ef5f4c in 39 100 MB part(s)
> 2023/11/22 13:55:53 download.go:162: 22f7f8ef5f4c part 16 attempt 0 failed: unexpected EOF, retrying in 1s
> 2023/11/22 13:59:09 download.go:123: downloading 8c17c2ebb0ea in 1 7.0 KB part(s)
> 2023/11/22 13:59:12 download.go:123: downloading 7c23fb36d801 in 1 4.8 KB part(s)
> 2023/11/22 13:59:15 download.go:123: downloading 2e0493f67d0c in 1 59 B part(s)
> 2023/11/22 13:59:17 download.go:123: downloading 2759286baa87 in 1 105 B part(s)
> 2023/11/22 13:59:20 download.go:123: downloading 5407e3188df9 in 1 529 B part(s)
> [GIN] 2023/11/22 - 13:59:41 | 200 | 11m27s | 127.0.0.1 | POST "/api/pull"
> 2023/11/22 13:59:41 llama.go:420: starting llama runner
> 2023/11/22 13:59:41 llama.go:478: waiting for llama runner to start responding
> 2023/11/22 13:59:41 llama.go:435: signal: illegal instruction
> 2023/11/22 13:59:41 llama.go:443: error starting llama runner: llama runner process has terminated
> 2023/11/22 13:59:41 llama.go:509: llama runner stopped successfully
> [GIN] 2023/11/22 - 13:59:41 | 500 | 428.633985ms | 127.0.0.1 | POST "/api/generate"
>
I have a [trashcan] Mac Pro (2013) 6-Core Intel Xeon E5 3.5 GHz running macOS 12.7.1 with AMD FirePro D500 3GB VRAM per PCIe slot, gMux Version: 4.0.11 [3.2.8], and Metal Family: Supported, Metal GPUFamily macOS 2.
I upgraded the RAM from 16GB to 64GB. llama2 will run, but only at 3.92 tokens/s.
I was getting 'not enough available memory' error with dolphin2.2-mistral.
|
{
"login": "10REMSSeiller",
"id": 20466077,
"node_id": "MDQ6VXNlcjIwNDY2MDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/20466077?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/10REMSSeiller",
"html_url": "https://github.com/10REMSSeiller",
"followers_url": "https://api.github.com/users/10REMSSeiller/followers",
"following_url": "https://api.github.com/users/10REMSSeiller/following{/other_user}",
"gists_url": "https://api.github.com/users/10REMSSeiller/gists{/gist_id}",
"starred_url": "https://api.github.com/users/10REMSSeiller/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/10REMSSeiller/subscriptions",
"organizations_url": "https://api.github.com/users/10REMSSeiller/orgs",
"repos_url": "https://api.github.com/users/10REMSSeiller/repos",
"events_url": "https://api.github.com/users/10REMSSeiller/events{/privacy}",
"received_events_url": "https://api.github.com/users/10REMSSeiller/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1248/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1248/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1330
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1330/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1330/comments
|
https://api.github.com/repos/ollama/ollama/issues/1330/events
|
https://github.com/ollama/ollama/issues/1330
| 2,018,738,115
|
I_kwDOJ0Z1Ps54U3_D
| 1,330
|
Installation downloaded cuda likely unnecessarily
|
{
"login": "folovco",
"id": 142908483,
"node_id": "U_kgDOCIScQw",
"avatar_url": "https://avatars.githubusercontent.com/u/142908483?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/folovco",
"html_url": "https://github.com/folovco",
"followers_url": "https://api.github.com/users/folovco/followers",
"following_url": "https://api.github.com/users/folovco/following{/other_user}",
"gists_url": "https://api.github.com/users/folovco/gists{/gist_id}",
"starred_url": "https://api.github.com/users/folovco/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/folovco/subscriptions",
"organizations_url": "https://api.github.com/users/folovco/orgs",
"repos_url": "https://api.github.com/users/folovco/repos",
"events_url": "https://api.github.com/users/folovco/events{/privacy}",
"received_events_url": "https://api.github.com/users/folovco/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2023-11-30T14:03:55
| 2024-03-12T16:14:52
| 2024-03-12T16:14:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
For quick testing on a CPU-only server, it would be advantageous if the https://ollama.ai/install.sh script skipped downloading the CUDA drivers when it can determine there is no NVIDIA device to use.
```
>>> Installing ollama to /usr/local/bin...
>>> Creating ollama user...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
Created symlink /etc/systemd/system/default.target.wants/ollama.service → /etc/systemd/system/ollama.service.
>>> The Ollama API is now available at 0.0.0.0:11434.
>>> Install complete. Run "ollama" from the command line.
WARNING: No NVIDIA GPU detected. Ollama will run in CPU-only mode.
```
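A minimal sketch of the kind of detection the installer could run before fetching CUDA components. The checks here (`/dev/nvidia0`, `lspci` output) are assumptions for illustration; the real install.sh may probe differently:

```python
import os
import shutil
import subprocess

def has_nvidia_gpu() -> bool:
    """Best-effort NVIDIA detection: device node first, then lspci output."""
    if os.path.exists("/dev/nvidia0"):
        return True
    if shutil.which("lspci"):
        out = subprocess.run(["lspci"], capture_output=True, text=True).stdout
        return "nvidia" in out.lower()
    return False

# An installer could gate the CUDA download on this check:
if has_nvidia_gpu():
    print("NVIDIA GPU detected; downloading CUDA components...")
else:
    print("No NVIDIA GPU detected; skipping CUDA driver download.")
```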
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1330/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1330/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1030
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1030/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1030/comments
|
https://api.github.com/repos/ollama/ollama/issues/1030/events
|
https://github.com/ollama/ollama/issues/1030
| 1,981,226,199
|
I_kwDOJ0Z1Ps52FxzX
| 1,030
|
WizardCoder models lack a prompt template
|
{
"login": "Nan-Do",
"id": 3844058,
"node_id": "MDQ6VXNlcjM4NDQwNTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/3844058?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Nan-Do",
"html_url": "https://github.com/Nan-Do",
"followers_url": "https://api.github.com/users/Nan-Do/followers",
"following_url": "https://api.github.com/users/Nan-Do/following{/other_user}",
"gists_url": "https://api.github.com/users/Nan-Do/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Nan-Do/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Nan-Do/subscriptions",
"organizations_url": "https://api.github.com/users/Nan-Do/orgs",
"repos_url": "https://api.github.com/users/Nan-Do/repos",
"events_url": "https://api.github.com/users/Nan-Do/events{/privacy}",
"received_events_url": "https://api.github.com/users/Nan-Do/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 2
| 2023-11-07T12:22:23
| 2023-11-16T22:58:38
| 2023-11-16T22:58:38
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have been using the WizardCoder models, and they do not use a prompt template. This makes the quality of the output substantially worse: sometimes the model does not write Python code, and sometimes it offers no answer at all.
I have been trying to find a way to contribute this to the model, but I haven't seen a feasible way to do it, so here is a Modelfile that makes the model use the proper template:
```
FROM wizardcoder:13b-python
# set the temperature [higher is more creative, lower is more coherent]
PARAMETER temperature 0.3
# set the prompt template
TEMPLATE """
{{ .System }}
### Instruction:
{{ .Prompt }}
### Response:
"""
SYSTEM """
Below is an instruction that describes a task. Write a response that appropriately completes the request.
"""
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1030/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1030/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2319
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2319/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2319/comments
|
https://api.github.com/repos/ollama/ollama/issues/2319/events
|
https://github.com/ollama/ollama/issues/2319
| 2,114,066,687
|
I_kwDOJ0Z1Ps5-Ahj_
| 2,319
|
Distributed LLM support?
|
{
"login": "Donno191",
"id": 10705947,
"node_id": "MDQ6VXNlcjEwNzA1OTQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/10705947?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Donno191",
"html_url": "https://github.com/Donno191",
"followers_url": "https://api.github.com/users/Donno191/followers",
"following_url": "https://api.github.com/users/Donno191/following{/other_user}",
"gists_url": "https://api.github.com/users/Donno191/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Donno191/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Donno191/subscriptions",
"organizations_url": "https://api.github.com/users/Donno191/orgs",
"repos_url": "https://api.github.com/users/Donno191/repos",
"events_url": "https://api.github.com/users/Donno191/events{/privacy}",
"received_events_url": "https://api.github.com/users/Donno191/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 11
| 2024-02-02T05:04:03
| 2024-10-02T01:25:47
| 2024-04-08T16:53:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have 3 PCs with a 3090 and 1 PC with a 4090. Currently I am running Ollama on the 4090, and it works great for loading different models on the fly, but the bottleneck is loading larger models and bigger context windows within the 24 GB of VRAM. It would be great to have something like Petals or llama.cpp's MPI support.
IDEA:
Maybe have an Ollama slave running on my 3 PCs with 3090s, holding the distributed LLM. If the Ollama server on the 4090 PC needs to load a large model, it could then use the 3090s to increase available VRAM to 96 GB.
This would help overcome the limits of consumer hardware and also help businesses utilize idle resources for LLMs.
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2319/reactions",
"total_count": 15,
"+1": 15,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2319/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6938
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6938/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6938/comments
|
https://api.github.com/repos/ollama/ollama/issues/6938/events
|
https://github.com/ollama/ollama/pull/6938
| 2,545,953,147
|
PR_kwDOJ0Z1Ps58jIa6
| 6,938
|
add CLI completion for commands
|
{
"login": "pranitbauva1997",
"id": 2959938,
"node_id": "MDQ6VXNlcjI5NTk5Mzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2959938?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pranitbauva1997",
"html_url": "https://github.com/pranitbauva1997",
"followers_url": "https://api.github.com/users/pranitbauva1997/followers",
"following_url": "https://api.github.com/users/pranitbauva1997/following{/other_user}",
"gists_url": "https://api.github.com/users/pranitbauva1997/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pranitbauva1997/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pranitbauva1997/subscriptions",
"organizations_url": "https://api.github.com/users/pranitbauva1997/orgs",
"repos_url": "https://api.github.com/users/pranitbauva1997/repos",
"events_url": "https://api.github.com/users/pranitbauva1997/events{/privacy}",
"received_events_url": "https://api.github.com/users/pranitbauva1997/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 2
| 2024-09-24T17:20:40
| 2025-01-03T09:07:23
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6938",
"html_url": "https://github.com/ollama/ollama/pull/6938",
"diff_url": "https://github.com/ollama/ollama/pull/6938.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6938.patch",
"merged_at": null
}
|
For example, `ollama ru<TAB>` should complete it to `ollama run`
TODO: `ollama run gemma2:<TAB>` should show all options for parameters. Currently, I have to visit ollama.com/library to verify. I need help with this, as I can't easily find a list of all models. If I have the list, I can finish this as well. Please help.
I also need help in figuring out how to include running commands like `source <(ollama completion bash)` (or fish, etc) during the install process for different OSes.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6938/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6938/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4571
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4571/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4571/comments
|
https://api.github.com/repos/ollama/ollama/issues/4571/events
|
https://github.com/ollama/ollama/pull/4571
| 2,309,712,780
|
PR_kwDOJ0Z1Ps5wKMQ2
| 4,571
|
chore: update tokenizer.go
|
{
"login": "eltociear",
"id": 22633385,
"node_id": "MDQ6VXNlcjIyNjMzMzg1",
"avatar_url": "https://avatars.githubusercontent.com/u/22633385?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eltociear",
"html_url": "https://github.com/eltociear",
"followers_url": "https://api.github.com/users/eltociear/followers",
"following_url": "https://api.github.com/users/eltociear/following{/other_user}",
"gists_url": "https://api.github.com/users/eltociear/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eltociear/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eltociear/subscriptions",
"organizations_url": "https://api.github.com/users/eltociear/orgs",
"repos_url": "https://api.github.com/users/eltociear/repos",
"events_url": "https://api.github.com/users/eltociear/events{/privacy}",
"received_events_url": "https://api.github.com/users/eltociear/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-22T06:47:45
| 2024-05-22T07:25:23
| 2024-05-22T07:25:23
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4571",
"html_url": "https://github.com/ollama/ollama/pull/4571",
"diff_url": "https://github.com/ollama/ollama/pull/4571.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4571.patch",
"merged_at": "2024-05-22T07:25:23"
}
|
PreTokenziers -> PreTokenizers
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4571/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4571/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8180
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8180/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8180/comments
|
https://api.github.com/repos/ollama/ollama/issues/8180/events
|
https://github.com/ollama/ollama/issues/8180
| 2,751,366,967
|
I_kwDOJ0Z1Ps6j_oc3
| 8,180
|
How to speed up model
|
{
"login": "QichangZheng",
"id": 82627111,
"node_id": "MDQ6VXNlcjgyNjI3MTEx",
"avatar_url": "https://avatars.githubusercontent.com/u/82627111?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/QichangZheng",
"html_url": "https://github.com/QichangZheng",
"followers_url": "https://api.github.com/users/QichangZheng/followers",
"following_url": "https://api.github.com/users/QichangZheng/following{/other_user}",
"gists_url": "https://api.github.com/users/QichangZheng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/QichangZheng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/QichangZheng/subscriptions",
"organizations_url": "https://api.github.com/users/QichangZheng/orgs",
"repos_url": "https://api.github.com/users/QichangZheng/repos",
"events_url": "https://api.github.com/users/QichangZheng/events{/privacy}",
"received_events_url": "https://api.github.com/users/QichangZheng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-12-19T20:33:36
| 2024-12-20T21:35:48
| 2024-12-20T21:35:48
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have 96 GB of VRAM, and llama3.3 only takes up half. Can I utilize the rest of the VRAM to speed up the model?
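For context: extra VRAM does not speed up a model that already fits in memory; it mainly enables a larger context window or more parallel requests. A hedged sketch of raising the context size through the `options` field of the REST API's `/api/generate` endpoint (the model name and `num_ctx` value are illustrative):

```python
import json

# Build a /api/generate request with a larger context window.
# A bigger num_ctx allocates a larger KV cache, consuming more of the spare VRAM.
payload = {
    "model": "llama3.3",
    "prompt": "Why is the sky blue?",
    "options": {"num_ctx": 16384},
}
body = json.dumps(payload)
print(body)  # POST this to http://localhost:11434/api/generate
```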
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8180/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8180/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1912
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1912/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1912/comments
|
https://api.github.com/repos/ollama/ollama/issues/1912/events
|
https://github.com/ollama/ollama/issues/1912
| 2,075,345,176
|
I_kwDOJ0Z1Ps57s0EY
| 1,912
|
Will Magicoder-S-DS-6.7B ever come back?
|
{
"login": "reaperkrew",
"id": 25416226,
"node_id": "MDQ6VXNlcjI1NDE2MjI2",
"avatar_url": "https://avatars.githubusercontent.com/u/25416226?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/reaperkrew",
"html_url": "https://github.com/reaperkrew",
"followers_url": "https://api.github.com/users/reaperkrew/followers",
"following_url": "https://api.github.com/users/reaperkrew/following{/other_user}",
"gists_url": "https://api.github.com/users/reaperkrew/gists{/gist_id}",
"starred_url": "https://api.github.com/users/reaperkrew/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/reaperkrew/subscriptions",
"organizations_url": "https://api.github.com/users/reaperkrew/orgs",
"repos_url": "https://api.github.com/users/reaperkrew/repos",
"events_url": "https://api.github.com/users/reaperkrew/events{/privacy}",
"received_events_url": "https://api.github.com/users/reaperkrew/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-01-10T22:45:31
| 2024-11-12T01:43:53
| 2024-11-12T01:43:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi Everyone,
I've heard a lot of good things about Magicoder-S-DS-6.7B. From browsing through some previously closed threads in this repository, it looks like Magicoder-S-DS-6.7B was available at some point in early December 2023. Does anyone know if it will come back?
Thanks
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1912/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 3,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1912/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1428
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1428/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1428/comments
|
https://api.github.com/repos/ollama/ollama/issues/1428/events
|
https://github.com/ollama/ollama/pull/1428
| 2,031,754,662
|
PR_kwDOJ0Z1Ps5hed4z
| 1,428
|
document response in modelfile template variables
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
}
] |
closed
| false
| null |
[] | null | 0
| 2023-12-08T01:02:21
| 2024-01-08T19:38:52
| 2024-01-08T19:38:51
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1428",
"html_url": "https://github.com/ollama/ollama/pull/1428",
"diff_url": "https://github.com/ollama/ollama/pull/1428.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1428.patch",
"merged_at": "2024-01-08T19:38:51"
}
|
Document #1427, to be merged on next release
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1428/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1428/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5184
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5184/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5184/comments
|
https://api.github.com/repos/ollama/ollama/issues/5184/events
|
https://github.com/ollama/ollama/issues/5184
| 2,364,586,768
|
I_kwDOJ0Z1Ps6M8LsQ
| 5,184
|
`ollama show` should have the exact parameter count rounded to 3 digits
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 1
| 2024-06-20T14:21:38
| 2024-11-06T01:18:52
| null |
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
% ollama show llama3
Model
arch llama
parameters 8.0B
quantization Q4_0
context length 8192
embedding length 4096
```
should be the same as
<img width="793" alt="Screenshot 2024-06-20 at 10 21 28 AM" src="https://github.com/ollama/ollama/assets/251292/3e9ac431-e205-45b9-91ef-b63ddc8fe0f9">
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5184/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5184/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7514
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7514/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7514/comments
|
https://api.github.com/repos/ollama/ollama/issues/7514/events
|
https://github.com/ollama/ollama/issues/7514
| 2,636,162,814
|
I_kwDOJ0Z1Ps6dIKb-
| 7,514
|
Realtime API like OpenAI (full fledged voice to voice integrations)
|
{
"login": "ryzxxn",
"id": 89019551,
"node_id": "MDQ6VXNlcjg5MDE5NTUx",
"avatar_url": "https://avatars.githubusercontent.com/u/89019551?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ryzxxn",
"html_url": "https://github.com/ryzxxn",
"followers_url": "https://api.github.com/users/ryzxxn/followers",
"following_url": "https://api.github.com/users/ryzxxn/following{/other_user}",
"gists_url": "https://api.github.com/users/ryzxxn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ryzxxn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ryzxxn/subscriptions",
"organizations_url": "https://api.github.com/users/ryzxxn/orgs",
"repos_url": "https://api.github.com/users/ryzxxn/repos",
"events_url": "https://api.github.com/users/ryzxxn/events{/privacy}",
"received_events_url": "https://api.github.com/users/ryzxxn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 11
| 2024-11-05T18:19:26
| 2024-12-23T01:11:15
| 2024-12-23T01:11:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
If anyone is working on a realtime-API-like integration with Ollama, please reach out to me. I am working on a similar integration, and I think feedback from all the amazing people here could greatly improve the quality of this feature. I think what OpenAI has going for it is pretty cool, and I am also a big fan of running everything locally... 😄
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7514/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7514/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/663
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/663/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/663/comments
|
https://api.github.com/repos/ollama/ollama/issues/663/events
|
https://github.com/ollama/ollama/pull/663
| 1,920,709,180
|
PR_kwDOJ0Z1Ps5bnFep
| 663
|
Fix for #586, seed and temperature settings
|
{
"login": "hallh",
"id": 12785324,
"node_id": "MDQ6VXNlcjEyNzg1MzI0",
"avatar_url": "https://avatars.githubusercontent.com/u/12785324?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hallh",
"html_url": "https://github.com/hallh",
"followers_url": "https://api.github.com/users/hallh/followers",
"following_url": "https://api.github.com/users/hallh/following{/other_user}",
"gists_url": "https://api.github.com/users/hallh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hallh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hallh/subscriptions",
"organizations_url": "https://api.github.com/users/hallh/orgs",
"repos_url": "https://api.github.com/users/hallh/repos",
"events_url": "https://api.github.com/users/hallh/events{/privacy}",
"received_events_url": "https://api.github.com/users/hallh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2023-10-01T11:12:37
| 2023-10-04T13:51:15
| 2023-10-02T18:54:02
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/663",
"html_url": "https://github.com/ollama/ollama/pull/663",
"diff_url": "https://github.com/ollama/ollama/pull/663.diff",
"patch_url": "https://github.com/ollama/ollama/pull/663.patch",
"merged_at": null
}
|
Fix for #586. Seed was omitted in the params to the llama.cpp server and temperature had an `omitempty` filter specified, breaking support for `0` temperature.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/663/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/663/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1586
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1586/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1586/comments
|
https://api.github.com/repos/ollama/ollama/issues/1586/events
|
https://github.com/ollama/ollama/issues/1586
| 2,047,386,180
|
I_kwDOJ0Z1Ps56CKJE
| 1,586
|
ollama models corrupted?
|
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2023-12-18T20:19:16
| 2023-12-30T00:42:41
| 2023-12-29T04:50:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I've noticed that after running a few models, sometimes the models don't behave normally. This is a session where that was occurring. I had first tried with bakllava but it wasn't being helpful either. But notice that after I did the systemctl restart ollama the results were much better.
Is something being corrupted in memory? I'll do what I can to help debug this.
```
ollama run llava
>>> look at ./classic.jpg
Added image './classic.jpg'
>>> what is it?
>>> what is ./classic.jpg
Added image './classic.jpg'
>>> hello
>>> /bye
chris@FORGE:~/ai/aiprojects/OllamaPlayground/createnotes$ systemctl restart ollama
chris@FORGE:~/ai/aiprojects/OllamaPlayground/createnotes$ ollama run llava
>>> look at ./classic.jpg
Added image './classic.jpg'
1. The Underwood typewriter is an old fashioned machine that appears to be made of wood and metal components.
2. It has a black keyboard with silver numbers on the side, giving it a vintage appearance.
3. There are several keys visible, including letters such as A, B, C, D, E, F, G, H, I, J, K, L, M and N along with numeric keys 1 through 9.
4. The typewriter sits on a table and is placed underneath the desk.
```
|
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1586/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1586/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/97
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/97/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/97/comments
|
https://api.github.com/repos/ollama/ollama/issues/97/events
|
https://github.com/ollama/ollama/pull/97
| 1,809,156,548
|
PR_kwDOJ0Z1Ps5VvpV0
| 97
|
add new list command
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-18T05:43:08
| 2023-07-18T16:09:46
| 2023-07-18T16:09:45
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/97",
"html_url": "https://github.com/ollama/ollama/pull/97",
"diff_url": "https://github.com/ollama/ollama/pull/97.diff",
"patch_url": "https://github.com/ollama/ollama/pull/97.patch",
"merged_at": "2023-07-18T16:09:45"
}
|
This change lets you list each of the models you have pulled locally.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/97/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/97/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7770
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7770/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7770/comments
|
https://api.github.com/repos/ollama/ollama/issues/7770/events
|
https://github.com/ollama/ollama/pull/7770
| 2,677,372,026
|
PR_kwDOJ0Z1Ps6Clrmj
| 7,770
|
Add Orbiton to the README.md file
|
{
"login": "xyproto",
"id": 52813,
"node_id": "MDQ6VXNlcjUyODEz",
"avatar_url": "https://avatars.githubusercontent.com/u/52813?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xyproto",
"html_url": "https://github.com/xyproto",
"followers_url": "https://api.github.com/users/xyproto/followers",
"following_url": "https://api.github.com/users/xyproto/following{/other_user}",
"gists_url": "https://api.github.com/users/xyproto/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xyproto/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xyproto/subscriptions",
"organizations_url": "https://api.github.com/users/xyproto/orgs",
"repos_url": "https://api.github.com/users/xyproto/repos",
"events_url": "https://api.github.com/users/xyproto/events{/privacy}",
"received_events_url": "https://api.github.com/users/xyproto/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-20T22:34:32
| 2024-11-21T08:15:30
| 2024-11-21T07:24:05
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7770",
"html_url": "https://github.com/ollama/ollama/pull/7770",
"diff_url": "https://github.com/ollama/ollama/pull/7770.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7770.patch",
"merged_at": "2024-11-21T07:24:05"
}
|
Orbiton is a configuration-free text editor and IDE that can use Ollama for tab-completion.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7770/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7770/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1218
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1218/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1218/comments
|
https://api.github.com/repos/ollama/ollama/issues/1218/events
|
https://github.com/ollama/ollama/pull/1218
| 2,003,902,464
|
PR_kwDOJ0Z1Ps5gACQv
| 1,218
|
Update Maid repo
|
{
"login": "danemadsen",
"id": 11537699,
"node_id": "MDQ6VXNlcjExNTM3Njk5",
"avatar_url": "https://avatars.githubusercontent.com/u/11537699?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/danemadsen",
"html_url": "https://github.com/danemadsen",
"followers_url": "https://api.github.com/users/danemadsen/followers",
"following_url": "https://api.github.com/users/danemadsen/following{/other_user}",
"gists_url": "https://api.github.com/users/danemadsen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/danemadsen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/danemadsen/subscriptions",
"organizations_url": "https://api.github.com/users/danemadsen/orgs",
"repos_url": "https://api.github.com/users/danemadsen/repos",
"events_url": "https://api.github.com/users/danemadsen/events{/privacy}",
"received_events_url": "https://api.github.com/users/danemadsen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-21T10:03:47
| 2023-11-21T14:30:34
| 2023-11-21T14:30:34
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1218",
"html_url": "https://github.com/ollama/ollama/pull/1218",
"diff_url": "https://github.com/ollama/ollama/pull/1218.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1218.patch",
"merged_at": "2023-11-21T14:30:34"
}
|
Sorry for the extra PR, but I noticed I accidentally linked my personal repo instead of the main repo.
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1218/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1218/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6159
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6159/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6159/comments
|
https://api.github.com/repos/ollama/ollama/issues/6159/events
|
https://github.com/ollama/ollama/issues/6159
| 2,447,074,706
|
I_kwDOJ0Z1Ps6R22WS
| 6,159
|
Bunny family of VLMs
|
{
"login": "ddpasa",
"id": 112642920,
"node_id": "U_kgDOBrbLaA",
"avatar_url": "https://avatars.githubusercontent.com/u/112642920?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ddpasa",
"html_url": "https://github.com/ddpasa",
"followers_url": "https://api.github.com/users/ddpasa/followers",
"following_url": "https://api.github.com/users/ddpasa/following{/other_user}",
"gists_url": "https://api.github.com/users/ddpasa/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ddpasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ddpasa/subscriptions",
"organizations_url": "https://api.github.com/users/ddpasa/orgs",
"repos_url": "https://api.github.com/users/ddpasa/repos",
"events_url": "https://api.github.com/users/ddpasa/events{/privacy}",
"received_events_url": "https://api.github.com/users/ddpasa/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-08-04T10:37:13
| 2024-08-04T10:37:13
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Bunny is a family of very promising VLMs. They are already supported by llama.cpp.
https://github.com/BAAI-DCAI/Bunny
v1.1 4b: https://huggingface.co/BAAI/Bunny-v1_1-4B
v1.1 llama3 8b: https://huggingface.co/BAAI/Bunny-v1_1-Llama-3-8B-V
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6159/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6159/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6620
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6620/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6620/comments
|
https://api.github.com/repos/ollama/ollama/issues/6620/events
|
https://github.com/ollama/ollama/pull/6620
| 2,503,941,663
|
PR_kwDOJ0Z1Ps56Uijn
| 6,620
|
Use cuda v11 for driver 525 and older
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-03T22:45:52
| 2024-09-04T00:15:34
| 2024-09-04T00:15:31
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6620",
"html_url": "https://github.com/ollama/ollama/pull/6620",
"diff_url": "https://github.com/ollama/ollama/pull/6620.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6620.patch",
"merged_at": "2024-09-04T00:15:31"
}
|
It looks like driver 525 (i.e., CUDA driver 12.0) has problems with the CUDA v12 library we compile against, so run v11 when those older drivers are detected.
Fixes #6556
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6620/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6620/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1928
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1928/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1928/comments
|
https://api.github.com/repos/ollama/ollama/issues/1928/events
|
https://github.com/ollama/ollama/issues/1928
| 2,077,163,575
|
I_kwDOJ0Z1Ps57zwA3
| 1,928
|
Prevent offloading
|
{
"login": "Hansson0728",
"id": 9604420,
"node_id": "MDQ6VXNlcjk2MDQ0MjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/9604420?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hansson0728",
"html_url": "https://github.com/Hansson0728",
"followers_url": "https://api.github.com/users/Hansson0728/followers",
"following_url": "https://api.github.com/users/Hansson0728/following{/other_user}",
"gists_url": "https://api.github.com/users/Hansson0728/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Hansson0728/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hansson0728/subscriptions",
"organizations_url": "https://api.github.com/users/Hansson0728/orgs",
"repos_url": "https://api.github.com/users/Hansson0728/repos",
"events_url": "https://api.github.com/users/Hansson0728/events{/privacy}",
"received_events_url": "https://api.github.com/users/Hansson0728/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2024-01-11T16:55:24
| 2024-01-28T22:30:50
| 2024-01-28T22:30:50
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The model unloads after 5 minutes on the API; it would be nice to be able to prevent this.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1928/reactions",
"total_count": 7,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1928/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8635
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8635/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8635/comments
|
https://api.github.com/repos/ollama/ollama/issues/8635/events
|
https://github.com/ollama/ollama/issues/8635
| 2,815,779,059
|
I_kwDOJ0Z1Ps6n1WDz
| 8,635
|
Use of System Ram over RDMA in GPU to allow for GPU acceleration on lower VRAM hardware.
|
{
"login": "SlinkierElm5611",
"id": 52179385,
"node_id": "MDQ6VXNlcjUyMTc5Mzg1",
"avatar_url": "https://avatars.githubusercontent.com/u/52179385?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SlinkierElm5611",
"html_url": "https://github.com/SlinkierElm5611",
"followers_url": "https://api.github.com/users/SlinkierElm5611/followers",
"following_url": "https://api.github.com/users/SlinkierElm5611/following{/other_user}",
"gists_url": "https://api.github.com/users/SlinkierElm5611/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SlinkierElm5611/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SlinkierElm5611/subscriptions",
"organizations_url": "https://api.github.com/users/SlinkierElm5611/orgs",
"repos_url": "https://api.github.com/users/SlinkierElm5611/repos",
"events_url": "https://api.github.com/users/SlinkierElm5611/events{/privacy}",
"received_events_url": "https://api.github.com/users/SlinkierElm5611/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2025-01-28T14:05:32
| 2025-01-28T14:05:32
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi all!
I'm a GPU dev who has been messing around with Ollama for some self-hosting. I was wondering if there is any reason Ollama has not been able to take advantage of GPU acceleration while using system RAM through RDMA (ReBAR). I have done system-RAM access through RDMA on the GPU for real-time processing and have had better results than CPU-side tasks, despite the increased data latency when going over PCIe.
I look forward to hearing from you!
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8635/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8635/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3117
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3117/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3117/comments
|
https://api.github.com/repos/ollama/ollama/issues/3117/events
|
https://github.com/ollama/ollama/issues/3117
| 2,184,544,645
|
I_kwDOJ0Z1Ps6CNYGF
| 3,117
|
Api /tags should include type for embedding model or llm
|
{
"login": "Hansson0728",
"id": 9604420,
"node_id": "MDQ6VXNlcjk2MDQ0MjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/9604420?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hansson0728",
"html_url": "https://github.com/Hansson0728",
"followers_url": "https://api.github.com/users/Hansson0728/followers",
"following_url": "https://api.github.com/users/Hansson0728/following{/other_user}",
"gists_url": "https://api.github.com/users/Hansson0728/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Hansson0728/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hansson0728/subscriptions",
"organizations_url": "https://api.github.com/users/Hansson0728/orgs",
"repos_url": "https://api.github.com/users/Hansson0728/repos",
"events_url": "https://api.github.com/users/Hansson0728/events{/privacy}",
"received_events_url": "https://api.github.com/users/Hansson0728/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 5
| 2024-03-13T17:30:07
| 2024-11-06T17:57:40
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
As the title says, it would be nice to have that information so we can filter out embedding models if we want to allow model switching on a frontend
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3117/reactions",
"total_count": 6,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/3117/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6850
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6850/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6850/comments
|
https://api.github.com/repos/ollama/ollama/issues/6850/events
|
https://github.com/ollama/ollama/pull/6850
| 2,532,475,441
|
PR_kwDOJ0Z1Ps571Xsr
| 6,850
|
allow ctl-j to add a new line + fix multiline bracketed paste
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 1
| 2024-09-18T01:17:00
| 2024-09-20T18:13:20
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6850",
"html_url": "https://github.com/ollama/ollama/pull/6850",
"diff_url": "https://github.com/ollama/ollama/pull/6850.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6850.patch",
"merged_at": null
}
|
This change allows users to use Ctrl-J to send a newline when typing in the terminal (the equivalent of Ctrl-Enter). It also fixes a glitch in bracketed paste mode if `"""` was part of the paste.
Fixes #3387 #6674
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6850/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6850/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3242
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3242/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3242/comments
|
https://api.github.com/repos/ollama/ollama/issues/3242/events
|
https://github.com/ollama/ollama/issues/3242
| 2,194,553,155
|
I_kwDOJ0Z1Ps6CzjlD
| 3,242
|
When I run the model my CPU usage is high but GPU usage is low
|
{
"login": "wangshuai67",
"id": 13214849,
"node_id": "MDQ6VXNlcjEzMjE0ODQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/13214849?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wangshuai67",
"html_url": "https://github.com/wangshuai67",
"followers_url": "https://api.github.com/users/wangshuai67/followers",
"following_url": "https://api.github.com/users/wangshuai67/following{/other_user}",
"gists_url": "https://api.github.com/users/wangshuai67/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wangshuai67/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wangshuai67/subscriptions",
"organizations_url": "https://api.github.com/users/wangshuai67/orgs",
"repos_url": "https://api.github.com/users/wangshuai67/repos",
"events_url": "https://api.github.com/users/wangshuai67/events{/privacy}",
"received_events_url": "https://api.github.com/users/wangshuai67/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 9
| 2024-03-19T10:15:12
| 2024-04-15T22:56:29
| 2024-04-15T22:56:28
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
1. This is the gpu information of docker container


2. This is the gpu information of the host machine


### What did you expect to see?
Is there any abnormality in the above information?
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
x86
### Platform
Docker
### Ollama version
0.1.27
### GPU
Nvidia
### GPU info

### CPU
_No response_
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3242/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 1,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3242/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5642
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5642/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5642/comments
|
https://api.github.com/repos/ollama/ollama/issues/5642/events
|
https://github.com/ollama/ollama/issues/5642
| 2,404,449,099
|
I_kwDOJ0Z1Ps6PUPtL
| 5,642
|
退出后显存仍在占用 - Video memory is still occupied after exiting
|
{
"login": "gfkdliucheng",
"id": 24772003,
"node_id": "MDQ6VXNlcjI0NzcyMDAz",
"avatar_url": "https://avatars.githubusercontent.com/u/24772003?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gfkdliucheng",
"html_url": "https://github.com/gfkdliucheng",
"followers_url": "https://api.github.com/users/gfkdliucheng/followers",
"following_url": "https://api.github.com/users/gfkdliucheng/following{/other_user}",
"gists_url": "https://api.github.com/users/gfkdliucheng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gfkdliucheng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gfkdliucheng/subscriptions",
"organizations_url": "https://api.github.com/users/gfkdliucheng/orgs",
"repos_url": "https://api.github.com/users/gfkdliucheng/repos",
"events_url": "https://api.github.com/users/gfkdliucheng/events{/privacy}",
"received_events_url": "https://api.github.com/users/gfkdliucheng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-07-12T00:51:54
| 2024-08-09T23:39:10
| 2024-08-09T23:39:10
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Video memory is still occupied after exiting (退出后显存仍在占用)
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.2.2
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5642/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5642/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7712
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7712/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7712/comments
|
https://api.github.com/repos/ollama/ollama/issues/7712/events
|
https://github.com/ollama/ollama/pull/7712
| 2,666,759,805
|
PR_kwDOJ0Z1Ps6CLP44
| 7,712
|
Update README.md
|
{
"login": "samirgaire10",
"id": 118608337,
"node_id": "U_kgDOBxHR0Q",
"avatar_url": "https://avatars.githubusercontent.com/u/118608337?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/samirgaire10",
"html_url": "https://github.com/samirgaire10",
"followers_url": "https://api.github.com/users/samirgaire10/followers",
"following_url": "https://api.github.com/users/samirgaire10/following{/other_user}",
"gists_url": "https://api.github.com/users/samirgaire10/gists{/gist_id}",
"starred_url": "https://api.github.com/users/samirgaire10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/samirgaire10/subscriptions",
"organizations_url": "https://api.github.com/users/samirgaire10/orgs",
"repos_url": "https://api.github.com/users/samirgaire10/repos",
"events_url": "https://api.github.com/users/samirgaire10/events{/privacy}",
"received_events_url": "https://api.github.com/users/samirgaire10/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-18T00:11:01
| 2024-11-18T05:04:01
| 2024-11-18T03:53:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7712",
"html_url": "https://github.com/ollama/ollama/pull/7712",
"diff_url": "https://github.com/ollama/ollama/pull/7712.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7712.patch",
"merged_at": null
}
| null |
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7712/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7712/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1534
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1534/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1534/comments
|
https://api.github.com/repos/ollama/ollama/issues/1534/events
|
https://github.com/ollama/ollama/issues/1534
| 2,042,687,646
|
I_kwDOJ0Z1Ps55wPCe
| 1,534
|
macOS M2 32 GB -- processing failed
|
{
"login": "enzyme69",
"id": 3952687,
"node_id": "MDQ6VXNlcjM5NTI2ODc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3952687?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/enzyme69",
"html_url": "https://github.com/enzyme69",
"followers_url": "https://api.github.com/users/enzyme69/followers",
"following_url": "https://api.github.com/users/enzyme69/following{/other_user}",
"gists_url": "https://api.github.com/users/enzyme69/gists{/gist_id}",
"starred_url": "https://api.github.com/users/enzyme69/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/enzyme69/subscriptions",
"organizations_url": "https://api.github.com/users/enzyme69/orgs",
"repos_url": "https://api.github.com/users/enzyme69/repos",
"events_url": "https://api.github.com/users/enzyme69/events{/privacy}",
"received_events_url": "https://api.github.com/users/enzyme69/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2023-12-15T00:14:12
| 2024-01-08T21:42:03
| 2024-01-08T21:42:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I get the error message:
"Error: llama runner process has terminated"
Does that mean it ran out of memory?
Is it possible to make it smaller?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1534/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1534/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2301
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2301/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2301/comments
|
https://api.github.com/repos/ollama/ollama/issues/2301/events
|
https://github.com/ollama/ollama/issues/2301
| 2,111,401,682
|
I_kwDOJ0Z1Ps592W7S
| 2,301
|
Batching support in Ollama
|
{
"login": "canamika27",
"id": 41502651,
"node_id": "MDQ6VXNlcjQxNTAyNjUx",
"avatar_url": "https://avatars.githubusercontent.com/u/41502651?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/canamika27",
"html_url": "https://github.com/canamika27",
"followers_url": "https://api.github.com/users/canamika27/followers",
"following_url": "https://api.github.com/users/canamika27/following{/other_user}",
"gists_url": "https://api.github.com/users/canamika27/gists{/gist_id}",
"starred_url": "https://api.github.com/users/canamika27/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/canamika27/subscriptions",
"organizations_url": "https://api.github.com/users/canamika27/orgs",
"repos_url": "https://api.github.com/users/canamika27/repos",
"events_url": "https://api.github.com/users/canamika27/events{/privacy}",
"received_events_url": "https://api.github.com/users/canamika27/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-02-01T03:08:39
| 2024-02-05T19:24:23
| 2024-02-05T19:24:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Does ollama support batching?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2301/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2301/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2973
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2973/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2973/comments
|
https://api.github.com/repos/ollama/ollama/issues/2973/events
|
https://github.com/ollama/ollama/pull/2973
| 2,173,079,555
|
PR_kwDOJ0Z1Ps5o7I3k
| 2,973
|
fix some typos
|
{
"login": "hishope",
"id": 153272819,
"node_id": "U_kgDOCSLB8w",
"avatar_url": "https://avatars.githubusercontent.com/u/153272819?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hishope",
"html_url": "https://github.com/hishope",
"followers_url": "https://api.github.com/users/hishope/followers",
"following_url": "https://api.github.com/users/hishope/following{/other_user}",
"gists_url": "https://api.github.com/users/hishope/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hishope/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hishope/subscriptions",
"organizations_url": "https://api.github.com/users/hishope/orgs",
"repos_url": "https://api.github.com/users/hishope/repos",
"events_url": "https://api.github.com/users/hishope/events{/privacy}",
"received_events_url": "https://api.github.com/users/hishope/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-07T06:42:44
| 2024-03-07T06:50:12
| 2024-03-07T06:50:12
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2973",
"html_url": "https://github.com/ollama/ollama/pull/2973",
"diff_url": "https://github.com/ollama/ollama/pull/2973.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2973.patch",
"merged_at": "2024-03-07T06:50:12"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2973/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2973/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4977
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4977/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4977/comments
|
https://api.github.com/repos/ollama/ollama/issues/4977/events
|
https://github.com/ollama/ollama/issues/4977
| 2,346,155,878
|
I_kwDOJ0Z1Ps6L139m
| 4,977
|
qwen2-72b starts to output gibberish at some point if I set num_ctx to 8192
|
{
"login": "Mikhael-Danilov",
"id": 536516,
"node_id": "MDQ6VXNlcjUzNjUxNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/536516?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Mikhael-Danilov",
"html_url": "https://github.com/Mikhael-Danilov",
"followers_url": "https://api.github.com/users/Mikhael-Danilov/followers",
"following_url": "https://api.github.com/users/Mikhael-Danilov/following{/other_user}",
"gists_url": "https://api.github.com/users/Mikhael-Danilov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Mikhael-Danilov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Mikhael-Danilov/subscriptions",
"organizations_url": "https://api.github.com/users/Mikhael-Danilov/orgs",
"repos_url": "https://api.github.com/users/Mikhael-Danilov/repos",
"events_url": "https://api.github.com/users/Mikhael-Danilov/events{/privacy}",
"received_events_url": "https://api.github.com/users/Mikhael-Danilov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 4
| 2024-06-11T11:19:23
| 2024-08-27T08:18:25
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
qwen2-72b starts to output gibberish like this:
`.5"5.F9(CB;6@FC9!DC:$B$D60G5",3B+2;1-*,@%=876E0;5*:.98G4!980+D`
at some point if I set num_ctx to 8192.
Normal output from the LLM was expected.
The issue persists when using `ollama run` and when using the API (SillyTavern).
qwen2-72b works fine with num_ctx 2048
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.42
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4977/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4977/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3970
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3970/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3970/comments
|
https://api.github.com/repos/ollama/ollama/issues/3970/events
|
https://github.com/ollama/ollama/pull/3970
| 2,266,821,897
|
PR_kwDOJ0Z1Ps5t5h1H
| 3,970
|
types/model: remove Digest (for now)
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-27T03:59:55
| 2024-04-27T04:14:29
| 2024-04-27T04:14:28
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3970",
"html_url": "https://github.com/ollama/ollama/pull/3970",
"diff_url": "https://github.com/ollama/ollama/pull/3970.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3970.patch",
"merged_at": "2024-04-27T04:14:28"
}
|
The Digest type needs more thought and is not necessary at the moment.
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3970/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3970/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8308
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8308/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8308/comments
|
https://api.github.com/repos/ollama/ollama/issues/8308/events
|
https://github.com/ollama/ollama/pull/8308
| 2,769,135,279
|
PR_kwDOJ0Z1Ps6GvnxW
| 8,308
|
llama: update vendored code to commit 46e3556
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2025-01-05T06:19:27
| 2025-01-08T19:22:04
| 2025-01-08T19:22:01
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8308",
"html_url": "https://github.com/ollama/ollama/pull/8308",
"diff_url": "https://github.com/ollama/ollama/pull/8308.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8308.patch",
"merged_at": "2025-01-08T19:22:01"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8308/reactions",
"total_count": 19,
"+1": 11,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 4,
"eyes": 4
}
|
https://api.github.com/repos/ollama/ollama/issues/8308/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8243
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8243/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8243/comments
|
https://api.github.com/repos/ollama/ollama/issues/8243/events
|
https://github.com/ollama/ollama/issues/8243
| 2,759,160,472
|
I_kwDOJ0Z1Ps6kdXKY
| 8,243
|
glm-edge-v-5b-gguf:Q6_K blk.0.attn_qkv.weight
|
{
"login": "enryteam",
"id": 20081090,
"node_id": "MDQ6VXNlcjIwMDgxMDkw",
"avatar_url": "https://avatars.githubusercontent.com/u/20081090?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/enryteam",
"html_url": "https://github.com/enryteam",
"followers_url": "https://api.github.com/users/enryteam/followers",
"following_url": "https://api.github.com/users/enryteam/following{/other_user}",
"gists_url": "https://api.github.com/users/enryteam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/enryteam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/enryteam/subscriptions",
"organizations_url": "https://api.github.com/users/enryteam/orgs",
"repos_url": "https://api.github.com/users/enryteam/repos",
"events_url": "https://api.github.com/users/enryteam/events{/privacy}",
"received_events_url": "https://api.github.com/users/enryteam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-12-26T01:24:58
| 2024-12-26T01:24:58
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
PS C:\Users\Administrator> ollama run modelscope.cn/ZhipuAI/glm-edge-v-5b-gguf:Q6_K
Error: llama runner process has terminated: error loading model: missing tensor 'blk.0.attn_qkv.weight'
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8243/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8243/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1992
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1992/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1992/comments
|
https://api.github.com/repos/ollama/ollama/issues/1992/events
|
https://github.com/ollama/ollama/issues/1992
| 2,080,878,134
|
I_kwDOJ0Z1Ps58B642
| 1,992
|
CUDA GPU is too old
|
{
"login": "tlaanemaa",
"id": 10545187,
"node_id": "MDQ6VXNlcjEwNTQ1MTg3",
"avatar_url": "https://avatars.githubusercontent.com/u/10545187?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tlaanemaa",
"html_url": "https://github.com/tlaanemaa",
"followers_url": "https://api.github.com/users/tlaanemaa/followers",
"following_url": "https://api.github.com/users/tlaanemaa/following{/other_user}",
"gists_url": "https://api.github.com/users/tlaanemaa/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tlaanemaa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tlaanemaa/subscriptions",
"organizations_url": "https://api.github.com/users/tlaanemaa/orgs",
"repos_url": "https://api.github.com/users/tlaanemaa/repos",
"events_url": "https://api.github.com/users/tlaanemaa/events{/privacy}",
"received_events_url": "https://api.github.com/users/tlaanemaa/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-01-14T20:05:00
| 2024-05-06T18:16:54
| 2024-01-14T22:10:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello.
First of all, thanks for bringing us this awesome project!
I have a pretty old GPU, an Nvidia GTX 970, but it used to work fine with Ollama 0.1.15.
Now that I've upgraded to 0.1.20, I get the following error:
```
2024/01/14 19:50:06 gpu.go:88: Detecting GPU type
2024/01/14 19:50:06 gpu.go:203: Searching for GPU management library libnvidia-ml.so
2024/01/14 19:50:06 gpu.go:248: Discovered GPU libraries: [/usr/lib/x86_64-linux-gnu/libnvidia-ml.so.1]
2024/01/14 19:50:06 gpu.go:94: Nvidia GPU detected
2024/01/14 19:50:06 gpu.go:138: CUDA GPU is too old. Falling back to CPU mode. Compute Capability detected: 5.2
2024/01/14 19:50:06 routes.go:953: no GPU detected
```
I'm running Ollama in Docker with GPU passthrough, and the GPU seems to show up within the container:
```
root@a84d0bca74d1:/# nvidia-smi
Sun Jan 14 20:03:51 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 545.36 Driver Version: 546.33 CUDA Version: 12.3 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA GeForce GTX 970 On | 00000000:01:00.0 On | N/A |
| 60% 29C P8 13W / 151W | 566MiB / 4096MiB | 3% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=======================================================================================|
| No running processes found |
+---------------------------------------------------------------------------------------+
```
I realize my GPU is old, but it used to work.
Do you know if there's a way to make it work again? I'd prefer not to be stuck on 0.1.15, if possible 😅
I'm happy to build the Docker image from source, if that's needed.
Thanks in advance!
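The log above shows the detected compute capability (5.2) being compared against a minimum that this release evidently raised. A small sketch of that check (the 6.0 minimum is an assumption inferred from the log, where 5.2 was rejected; consult the release notes of your Ollama version for the actual threshold):

```python
def parse_compute_cap(s: str) -> tuple[int, int]:
    """Parse an 'X.Y' CUDA compute capability string into (major, minor)."""
    major, minor = s.strip().split(".")
    return int(major), int(minor)

def is_supported(cap: str, minimum: str = "6.0") -> bool:
    # Tuple comparison orders by major first, then minor, which matches
    # how compute capabilities are ranked. The "6.0" default is only an
    # inference from the log output above, not a documented constant.
    return parse_compute_cap(cap) >= parse_compute_cap(minimum)

print(is_supported("5.2"))  # the GTX 970's compute capability
```

Building from source with the older supported architecture list is one possible workaround, but the exact build flags vary by Ollama version.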
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1992/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1992/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4374
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4374/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4374/comments
|
https://api.github.com/repos/ollama/ollama/issues/4374/events
|
https://github.com/ollama/ollama/issues/4374
| 2,291,272,472
|
I_kwDOJ0Z1Ps6IkgsY
| 4,374
|
how to write script so that it will remember the last conversation .
|
{
"login": "View-my-Git-Lab-krafi",
"id": 121858831,
"node_id": "U_kgDOB0NrDw",
"avatar_url": "https://avatars.githubusercontent.com/u/121858831?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/View-my-Git-Lab-krafi",
"html_url": "https://github.com/View-my-Git-Lab-krafi",
"followers_url": "https://api.github.com/users/View-my-Git-Lab-krafi/followers",
"following_url": "https://api.github.com/users/View-my-Git-Lab-krafi/following{/other_user}",
"gists_url": "https://api.github.com/users/View-my-Git-Lab-krafi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/View-my-Git-Lab-krafi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/View-my-Git-Lab-krafi/subscriptions",
"organizations_url": "https://api.github.com/users/View-my-Git-Lab-krafi/orgs",
"repos_url": "https://api.github.com/users/View-my-Git-Lab-krafi/repos",
"events_url": "https://api.github.com/users/View-my-Git-Lab-krafi/events{/privacy}",
"received_events_url": "https://api.github.com/users/View-my-Git-Lab-krafi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 11
| 2024-05-12T10:32:19
| 2024-05-14T17:45:43
| 2024-05-14T17:45:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
You:: my name is rafi
AI: Nice to meet you, Rafi !
You:: what was my name
AI: I apologize , but I don 't think we ever established a specific name for you in our conversation!
```
**Whether I use**
localhost:11434/api/generate or
http://localhost:11434/api/chat, the result is the same.
```
# Define API endpoint
$apiEndpoint = "http://localhost:11434/api/generate"
# Function to send prompt to API and receive response
function GetResponse {
param (
[string]$prompt
)
$payload = @{
"model" = "llama3"
"prompt" = $prompt
} | ConvertTo-Json
$response = Invoke-RestMethod -Uri $apiEndpoint -Method Post -Body $payload -ContentType "application/json"
# Parse each JSON response and return the generated text
$responseObjects = $response -split "`n" | ForEach-Object { $_ | ConvertFrom-Json }
$generatedText = foreach ($responseObject in $responseObjects) {
$responseObject.response
}
return ($generatedText -join " ")
}
# Initial prompt
Write-Output "AI: Hi there! I'm AI, nice to meet you! Is there something on your mind that you'd like to chat about or ask for help with? I'm here to listen and assist if I can."
# Main loop
while ($true) {
# Prompt user for input
$userInput = Read-Host "You:"
# Send user input to API and receive response
$responseText = GetResponse -prompt $userInput
# Display response
Write-Output "AI: $responseText"
}
```
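The script above sends each prompt as an independent `/api/generate` call, so the model never sees earlier turns; the server is stateless between requests. One way to get memory is to use `/api/chat` and resend the accumulated message history on every call. A minimal sketch (model tag `llama3` assumed, as in the script above):

```python
# Keep conversation history client-side and resend it on every /api/chat call.
history: list[dict] = []

def build_chat_payload(user_input: str, model: str = "llama3") -> dict:
    """Append the user turn and build a /api/chat request carrying full history."""
    history.append({"role": "user", "content": user_input})
    return {"model": model, "messages": list(history), "stream": False}

def record_reply(reply: str) -> None:
    """Store the assistant's answer so the next request includes it."""
    history.append({"role": "assistant", "content": reply})

p1 = build_chat_payload("my name is rafi")
record_reply("Nice to meet you, Rafi!")
p2 = build_chat_payload("what was my name")
```

With this, the second request carries all three prior messages, so the model can answer "Rafi". (`/api/generate` alternatively returns a `context` field that can be passed back in the next request for the same effect.)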
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
ollama version is 0.1.33
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4374/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4374/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6478
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6478/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6478/comments
|
https://api.github.com/repos/ollama/ollama/issues/6478/events
|
https://github.com/ollama/ollama/issues/6478
| 2,483,404,693
|
I_kwDOJ0Z1Ps6UBb-V
| 6,478
|
Add linux start command to docs
|
{
"login": "bdytx5",
"id": 32812705,
"node_id": "MDQ6VXNlcjMyODEyNzA1",
"avatar_url": "https://avatars.githubusercontent.com/u/32812705?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bdytx5",
"html_url": "https://github.com/bdytx5",
"followers_url": "https://api.github.com/users/bdytx5/followers",
"following_url": "https://api.github.com/users/bdytx5/following{/other_user}",
"gists_url": "https://api.github.com/users/bdytx5/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bdytx5/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bdytx5/subscriptions",
"organizations_url": "https://api.github.com/users/bdytx5/orgs",
"repos_url": "https://api.github.com/users/bdytx5/repos",
"events_url": "https://api.github.com/users/bdytx5/events{/privacy}",
"received_events_url": "https://api.github.com/users/bdytx5/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-08-23T15:39:22
| 2024-08-24T20:09:20
| 2024-08-23T20:55:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
nohup ollama serve &
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6478/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6478/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8291
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8291/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8291/comments
|
https://api.github.com/repos/ollama/ollama/issues/8291/events
|
https://github.com/ollama/ollama/issues/8291
| 2,766,900,877
|
I_kwDOJ0Z1Ps6k646N
| 8,291
|
disable cpu offload for runing llm
|
{
"login": "verigle",
"id": 32769358,
"node_id": "MDQ6VXNlcjMyNzY5MzU4",
"avatar_url": "https://avatars.githubusercontent.com/u/32769358?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/verigle",
"html_url": "https://github.com/verigle",
"followers_url": "https://api.github.com/users/verigle/followers",
"following_url": "https://api.github.com/users/verigle/following{/other_user}",
"gists_url": "https://api.github.com/users/verigle/gists{/gist_id}",
"starred_url": "https://api.github.com/users/verigle/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/verigle/subscriptions",
"organizations_url": "https://api.github.com/users/verigle/orgs",
"repos_url": "https://api.github.com/users/verigle/repos",
"events_url": "https://api.github.com/users/verigle/events{/privacy}",
"received_events_url": "https://api.github.com/users/verigle/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2025-01-03T02:52:35
| 2025-01-16T00:06:00
| 2025-01-16T00:06:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The model is automatically partially offloaded to the CPU even though more than one GPU is free, so I want to disable CPU offload for LLM inference.
> 94%/6% CPU/GPU
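One knob that influences this is the `num_gpu` option, which sets how many layers Ollama places on the GPU; an oversized value requests full GPU placement. Whether the server errors out or still falls back to CPU when VRAM is insufficient depends on the Ollama version, so treat this as a sketch rather than a guaranteed hard-fail switch:

```python
def build_payload_full_gpu(model: str, prompt: str, num_layers: int = 999) -> dict:
    """Build a /api/generate request asking for all layers on the GPU."""
    # num_gpu = number of model layers to offload to the GPU; a value
    # larger than the model's layer count requests all of them.
    return {
        "model": model,
        "prompt": prompt,
        "options": {"num_gpu": num_layers},
    }

payload = build_payload_full_gpu("llama3", "hi")
```

The same option can be baked into a Modelfile with `PARAMETER num_gpu <n>` instead of being set per request.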
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8291/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8291/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/7693
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7693/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7693/comments
|
https://api.github.com/repos/ollama/ollama/issues/7693/events
|
https://github.com/ollama/ollama/pull/7693
| 2,663,114,892
|
PR_kwDOJ0Z1Ps6CFyyT
| 7,693
|
[docs] [modelfile.md] num_predict: incorrect default value
|
{
"login": "owboson",
"id": 115831817,
"node_id": "U_kgDOBud0CQ",
"avatar_url": "https://avatars.githubusercontent.com/u/115831817?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/owboson",
"html_url": "https://github.com/owboson",
"followers_url": "https://api.github.com/users/owboson/followers",
"following_url": "https://api.github.com/users/owboson/following{/other_user}",
"gists_url": "https://api.github.com/users/owboson/gists{/gist_id}",
"starred_url": "https://api.github.com/users/owboson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/owboson/subscriptions",
"organizations_url": "https://api.github.com/users/owboson/orgs",
"repos_url": "https://api.github.com/users/owboson/repos",
"events_url": "https://api.github.com/users/owboson/events{/privacy}",
"received_events_url": "https://api.github.com/users/owboson/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-11-15T20:51:47
| 2024-12-03T23:00:05
| 2024-12-03T23:00:05
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7693",
"html_url": "https://github.com/ollama/ollama/pull/7693",
"diff_url": "https://github.com/ollama/ollama/pull/7693.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7693.patch",
"merged_at": "2024-12-03T23:00:05"
}
|
The default value for `num_predict` in the documentation was incorrect (see https://github.com/ollama/ollama/issues/7691#issuecomment-2479856306).
Fixes #7691
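For context, `num_predict` caps the number of tokens the model may generate and can be set per request via `options`. A minimal sketch (the `-1` default meaning "no limit" reflects the corrected documentation; `-2` meaning "fill the context" is my reading of the parameter docs and worth double-checking against your version):

```python
def build_payload(model: str, prompt: str, num_predict: int = -1) -> dict:
    """Build a /api/generate request with an explicit generation cap."""
    # num_predict limits generated tokens; -1 means unlimited
    # (the corrected default), -2 generates until the context fills.
    return {
        "model": model,
        "prompt": prompt,
        "options": {"num_predict": num_predict},
    }

capped = build_payload("llama3", "hi", num_predict=64)
unlimited = build_payload("llama3", "hi")
```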
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7693/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7693/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1584
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1584/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1584/comments
|
https://api.github.com/repos/ollama/ollama/issues/1584/events
|
https://github.com/ollama/ollama/issues/1584
| 2,047,376,587
|
I_kwDOJ0Z1Ps56CHzL
| 1,584
|
is ollama server down?
|
{
"login": "ralyodio",
"id": 27381,
"node_id": "MDQ6VXNlcjI3Mzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/27381?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ralyodio",
"html_url": "https://github.com/ralyodio",
"followers_url": "https://api.github.com/users/ralyodio/followers",
"following_url": "https://api.github.com/users/ralyodio/following{/other_user}",
"gists_url": "https://api.github.com/users/ralyodio/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ralyodio/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ralyodio/subscriptions",
"organizations_url": "https://api.github.com/users/ralyodio/orgs",
"repos_url": "https://api.github.com/users/ralyodio/repos",
"events_url": "https://api.github.com/users/ralyodio/events{/privacy}",
"received_events_url": "https://api.github.com/users/ralyodio/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2023-12-18T20:12:36
| 2023-12-21T11:46:21
| 2023-12-19T15:09:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I've been getting "Server connection error" for the past few hours with ollama-webui.
|
{
"login": "ralyodio",
"id": 27381,
"node_id": "MDQ6VXNlcjI3Mzgx",
"avatar_url": "https://avatars.githubusercontent.com/u/27381?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ralyodio",
"html_url": "https://github.com/ralyodio",
"followers_url": "https://api.github.com/users/ralyodio/followers",
"following_url": "https://api.github.com/users/ralyodio/following{/other_user}",
"gists_url": "https://api.github.com/users/ralyodio/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ralyodio/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ralyodio/subscriptions",
"organizations_url": "https://api.github.com/users/ralyodio/orgs",
"repos_url": "https://api.github.com/users/ralyodio/repos",
"events_url": "https://api.github.com/users/ralyodio/events{/privacy}",
"received_events_url": "https://api.github.com/users/ralyodio/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1584/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1584/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6956
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6956/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6956/comments
|
https://api.github.com/repos/ollama/ollama/issues/6956/events
|
https://github.com/ollama/ollama/issues/6956
| 2,548,322,589
|
I_kwDOJ0Z1Ps6X5FEd
| 6,956
|
Why doesn't the model know which model it is?
|
{
"login": "robotom",
"id": 45123215,
"node_id": "MDQ6VXNlcjQ1MTIzMjE1",
"avatar_url": "https://avatars.githubusercontent.com/u/45123215?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robotom",
"html_url": "https://github.com/robotom",
"followers_url": "https://api.github.com/users/robotom/followers",
"following_url": "https://api.github.com/users/robotom/following{/other_user}",
"gists_url": "https://api.github.com/users/robotom/gists{/gist_id}",
"starred_url": "https://api.github.com/users/robotom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robotom/subscriptions",
"organizations_url": "https://api.github.com/users/robotom/orgs",
"repos_url": "https://api.github.com/users/robotom/repos",
"events_url": "https://api.github.com/users/robotom/events{/privacy}",
"received_events_url": "https://api.github.com/users/robotom/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-09-25T15:34:17
| 2024-09-26T16:05:45
| 2024-09-26T16:05:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
If I load llama 3.1 8B and ask it which model it is, it does not know what LLaMa 3.1 is at all. Sometimes it thinks it's LLama 3 or a 7B param model. Is there any reason for this? How can I be sure what I am running except for whatever `ollama ps` reports?
(running on 4070 8GB VRAM and i7-13700HX)
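For what it's worth, the model's own answer is not a reliable source here; the metadata Ollama reports is. A small sketch of ways to check, guarded so it degrades gracefully when the CLI is absent (the model tag is an example):

```shell
# Check what is actually loaded from Ollama's metadata instead of asking
# the model itself. Guarded so this still completes cleanly when the
# ollama CLI is not installed.
if command -v ollama >/dev/null 2>&1; then
  ollama ps                # models loaded right now, with size and processor
  ollama show llama3.1:8b  # architecture, parameter count, quantization
else
  echo "ollama CLI not found; skipping"
fi
```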
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
ollama 3.1:8B
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6956/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6956/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3267
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3267/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3267/comments
|
https://api.github.com/repos/ollama/ollama/issues/3267/events
|
https://github.com/ollama/ollama/issues/3267
| 2,197,128,530
|
I_kwDOJ0Z1Ps6C9YVS
| 3,267
|
CUDA Error when changing models
|
{
"login": "iamashwin99",
"id": 46030335,
"node_id": "MDQ6VXNlcjQ2MDMwMzM1",
"avatar_url": "https://avatars.githubusercontent.com/u/46030335?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iamashwin99",
"html_url": "https://github.com/iamashwin99",
"followers_url": "https://api.github.com/users/iamashwin99/followers",
"following_url": "https://api.github.com/users/iamashwin99/following{/other_user}",
"gists_url": "https://api.github.com/users/iamashwin99/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iamashwin99/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iamashwin99/subscriptions",
"organizations_url": "https://api.github.com/users/iamashwin99/orgs",
"repos_url": "https://api.github.com/users/iamashwin99/repos",
"events_url": "https://api.github.com/users/iamashwin99/events{/privacy}",
"received_events_url": "https://api.github.com/users/iamashwin99/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-03-20T10:01:07
| 2024-04-15T22:58:04
| 2024-04-15T22:58:04
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I ran a query on Ollama 0.1.29, first using `llama2`, then `nomic-embed-text`, and then back to `llama2`.
On the third model change I get the CUDA error:
```console
llama_new_context_with_model: CUDA7 compute buffer size = 3.00 MiB
llama_new_context_with_model: CUDA_Host compute buffer size = 1.50 MiB
llama_new_context_with_model: graph splits (measure): 9
loading library /tmp/ollama126694761/runners/cuda_v11/libext_server.so
{"function":"initialize","level":"INFO","line":440,"msg":"initializing slots","n_slots":1,"tid":"140511969015552","timestamp":1710928189}
{"function":"initialize","level":"INFO","line":449,"msg":"new slot","n_ctx_slot":2048,"slot_id":0,"tid":"140511969015552","timestamp":1710928189}
time=2024-03-20T10:49:49.921+01:00 level=INFO source=dyn_ext_server.go:162 msg="Starting llama main loop"
{"function":"update_slots","level":"INFO","line":1590,"msg":"all slots are idle and system prompt is empty, clear the KV cache","tid":"140509200766720","timestamp":1710928189}
{"function":"launch_slot_with_data","level":"INFO","line":830,"msg":"slot is processing task","slot_id":0,"task_id":0,"tid":"140509200766720","timestamp":1710928189}
{"function":"update_slots","level":"INFO","line":1848,"msg":"kv cache rm [p0, end)","p0":0,"slot_id":0,"task_id":0,"tid":"140509200766720","timestamp":1710928189}
{"function":"update_slots","level":"INFO","line":1652,"msg":"slot released","n_cache_tokens":8,"n_ctx":2048,"n_past":8,"n_system_tokens":0,"slot_id":0,"task_id":0,"tid":"140509200766720","timestamp":1710928189,"truncated":false}
[GIN] 2024/03/20 - 10:49:50 | 200 | 4.469532267s | 10.254.6.122 | POST "/api/embeddings"
time=2024-03-20T10:49:50.059+01:00 level=INFO source=routes.go:79 msg="changing loaded model"
CUDA error: invalid argument
current device: 6, in function ggml_free_cublas at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml-cuda.cu:12501
cuMemUnmap(g_cuda_pool_addr[id], g_cuda_pool_size[id])
GGML_ASSERT: /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml-cuda.cu:256: !"CUDA error"
[New LWP 586182]
[New LWP 586183]
[New LWP 586184]
```
Full log at https://sprunge.us/qACpmh
### What did you expect to see?
No errors.
### Steps to reproduce
```console
ollama serve
ollama pull nomic-embed-text
ollama pull llama2
curl -X POST http://localhost:11434/api/generate -d '{
"model": "llama2",
"prompt":"Why is the sky blue?"
}'
curl http://localhost:11434/api/embeddings -d '{
"model": "nomic-embed-text",
"prompt": "The sky is blue because of Rayleigh scattering"
}'
curl -X POST http://localhost:11434/api/generate -d '{
"model": "llama2",
"prompt":"Why is the sky blue?"
}'
# Fails here
```
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
amd64
### Platform
_No response_
### Ollama version
0.1.29
### GPU
_No response_
### GPU info
8x Tesla V100
### CPU
Intel
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3267/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3267/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1390
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1390/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1390/comments
|
https://api.github.com/repos/ollama/ollama/issues/1390/events
|
https://github.com/ollama/ollama/issues/1390
| 2,026,757,275
|
I_kwDOJ0Z1Ps54zdyb
| 1,390
|
`ollama create` not working
|
{
"login": "almonk",
"id": 51724,
"node_id": "MDQ6VXNlcjUxNzI0",
"avatar_url": "https://avatars.githubusercontent.com/u/51724?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/almonk",
"html_url": "https://github.com/almonk",
"followers_url": "https://api.github.com/users/almonk/followers",
"following_url": "https://api.github.com/users/almonk/following{/other_user}",
"gists_url": "https://api.github.com/users/almonk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/almonk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/almonk/subscriptions",
"organizations_url": "https://api.github.com/users/almonk/orgs",
"repos_url": "https://api.github.com/users/almonk/repos",
"events_url": "https://api.github.com/users/almonk/events{/privacy}",
"received_events_url": "https://api.github.com/users/almonk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2023-12-05T17:20:31
| 2023-12-05T20:18:02
| 2023-12-05T20:18:02
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Following the `Modelfile` tutorial in the readme, I can't get `ollama create` to work.
My modelfile is as follows:
```
FROM codellama:13b-instruct
SYSTEM """
You are Mario from super mario bros, acting as an assistant.
"""
```
When I attempt to create from the modelfile, I get the following error:
```
transferring model data
pulling model
pulling manifest
Error: pull model manifest: file does not exist
```
I have pulled the `codellama:13b-instruct` model to my machine and successfully run it.
I'm using an M2 MacBook running ollama `0.1.13`.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1390/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1390/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/596
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/596/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/596/comments
|
https://api.github.com/repos/ollama/ollama/issues/596/events
|
https://github.com/ollama/ollama/pull/596
| 1,912,410,622
|
PR_kwDOJ0Z1Ps5bLFCt
| 596
|
update install.sh
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-25T23:12:52
| 2023-09-26T00:59:14
| 2023-09-26T00:59:14
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/596",
"html_url": "https://github.com/ollama/ollama/pull/596",
"diff_url": "https://github.com/ollama/ollama/pull/596.diff",
"patch_url": "https://github.com/ollama/ollama/pull/596.patch",
"merged_at": "2023-09-26T00:59:14"
}
|
This prevents the service from restarting too early and failing to detect the GPU before drivers are installed.
Fix PATH for WSL users. WSL preinstalls the CUDA toolkit, but in a non-standard path (`/usr/lib/wsl/lib`). While this path is set for a normal WSL user, it is not set for the ollama user. This change sets the PATH of the ollama service to the PATH of the caller.
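As a sketch, a PATH override like this could be expressed as a systemd drop-in; the file path and the PATH value below are illustrative assumptions, not the exact output of the install script:

```ini
# /etc/systemd/system/ollama.service.d/override.conf (hypothetical)
[Service]
Environment="PATH=/usr/lib/wsl/lib:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
```

followed by `systemctl daemon-reload` and a restart of the service.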
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/596/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/596/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6654
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6654/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6654/comments
|
https://api.github.com/repos/ollama/ollama/issues/6654/events
|
https://github.com/ollama/ollama/issues/6654
| 2,507,378,407
|
I_kwDOJ0Z1Ps6Vc47n
| 6,654
|
Multi-instance seems not working
|
{
"login": "bigsausage",
"id": 22679135,
"node_id": "MDQ6VXNlcjIyNjc5MTM1",
"avatar_url": "https://avatars.githubusercontent.com/u/22679135?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bigsausage",
"html_url": "https://github.com/bigsausage",
"followers_url": "https://api.github.com/users/bigsausage/followers",
"following_url": "https://api.github.com/users/bigsausage/following{/other_user}",
"gists_url": "https://api.github.com/users/bigsausage/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bigsausage/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bigsausage/subscriptions",
"organizations_url": "https://api.github.com/users/bigsausage/orgs",
"repos_url": "https://api.github.com/users/bigsausage/repos",
"events_url": "https://api.github.com/users/bigsausage/events{/privacy}",
"received_events_url": "https://api.github.com/users/bigsausage/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-09-05T10:19:01
| 2024-09-06T01:44:20
| 2024-09-05T16:16:46
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I want to run multiple model instances to increase the concurrency of my server.
I start the server with:
`CUDA_VISIBLE_DEVICES=3 OLLAMA_NUM_PARALLEL=3 OLLAMA_MAX_LOADED_MODELS=3 /usr/bin/ollama serve`
Then I create 3 different copies of the model:
`ollama create my_lama3_1 -f ./Modelfile1`
`ollama create my_lama3_2 -f ./Modelfile2`
`ollama create my_lama3_3 -f ./Modelfile3`
`Modelfile1`, `Modelfile2`, and `Modelfile3` point to `llama3_quantize_1.gguf`, `llama3_quantize_2.gguf`, and `llama3_quantize_3.gguf`, which are actually the same int4 GGUF (about 5 GB).
`ollama list` shows 3 different model instances, but they share the same ID:

`nvidia-smi` shows that the GPU only uses 6 GB:

`ollama ps` shows only one instance running:

I wrote a script that randomly distributes each request to one of the three models (my_lama3_1 / my_lama3_2 / my_lama3_3) to test my server's concurrency, but GPU usage stays around 6 GB, whereas I expected 6 GB * 3 = 18 GB if all three instances were loaded.
Is there any way to load all 3 instances onto the GPU so I can get better QPS from my server?
Or, if I am not using the commands correctly, please tell me the right way.
All I want is to improve my server's QPS; any advice is welcome!
Thanks!
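A minimal sketch of the kind of random-dispatch script described above: each request is routed to a randomly chosen model copy. The model names and endpoint are taken from this issue; the curl call is left commented so the sketch runs without a live server.

```shell
# Route each request to a randomly chosen model copy.
MODELS="my_lama3_1 my_lama3_2 my_lama3_3"
MODEL=$(printf '%s\n' $MODELS | shuf -n 1)
echo "routing request to $MODEL"
# Requires a running Ollama server:
# curl -s http://localhost:11434/api/generate \
#   -d "{\"model\": \"$MODEL\", \"prompt\": \"Why is the sky blue?\"}"
```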
### OS
Linux
### GPU
Intel
### CPU
Intel
### Ollama version
0.2.5
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6654/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6654/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5507
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5507/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5507/comments
|
https://api.github.com/repos/ollama/ollama/issues/5507/events
|
https://github.com/ollama/ollama/pull/5507
| 2,393,208,927
|
PR_kwDOJ0Z1Ps50kfwE
| 5,507
|
llm: put back old include dir
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-05T22:43:28
| 2024-07-05T23:34:23
| 2024-07-05T23:34:21
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5507",
"html_url": "https://github.com/ollama/ollama/pull/5507",
"diff_url": "https://github.com/ollama/ollama/pull/5507.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5507.patch",
"merged_at": "2024-07-05T23:34:21"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5507/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5507/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3673
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3673/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3673/comments
|
https://api.github.com/repos/ollama/ollama/issues/3673/events
|
https://github.com/ollama/ollama/issues/3673
| 2,246,100,159
|
I_kwDOJ0Z1Ps6F4MS_
| 3,673
|
truly opensource model called olmo
|
{
"login": "olumolu",
"id": 162728301,
"node_id": "U_kgDOCbMJbQ",
"avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/olumolu",
"html_url": "https://github.com/olumolu",
"followers_url": "https://api.github.com/users/olumolu/followers",
"following_url": "https://api.github.com/users/olumolu/following{/other_user}",
"gists_url": "https://api.github.com/users/olumolu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/olumolu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/olumolu/subscriptions",
"organizations_url": "https://api.github.com/users/olumolu/orgs",
"repos_url": "https://api.github.com/users/olumolu/repos",
"events_url": "https://api.github.com/users/olumolu/events{/privacy}",
"received_events_url": "https://api.github.com/users/olumolu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-04-16T13:40:58
| 2024-04-20T13:03:43
| 2024-04-16T23:15:57
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What model would you like?
Built with a truly open dataset and a fully open-source model. Can this be supported in Ollama? Thanks.
https://allenai.org/olmo
https://huggingface.co/allenai/OLMo-7B
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3673/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3673/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5388
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5388/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5388/comments
|
https://api.github.com/repos/ollama/ollama/issues/5388/events
|
https://github.com/ollama/ollama/issues/5388
| 2,382,035,386
|
I_kwDOJ0Z1Ps6N-vm6
| 5,388
|
Ollama fails to create model when blob is already present and drive is full
|
{
"login": "thot-experiment",
"id": 94414189,
"node_id": "U_kgDOBaClbQ",
"avatar_url": "https://avatars.githubusercontent.com/u/94414189?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thot-experiment",
"html_url": "https://github.com/thot-experiment",
"followers_url": "https://api.github.com/users/thot-experiment/followers",
"following_url": "https://api.github.com/users/thot-experiment/following{/other_user}",
"gists_url": "https://api.github.com/users/thot-experiment/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thot-experiment/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thot-experiment/subscriptions",
"organizations_url": "https://api.github.com/users/thot-experiment/orgs",
"repos_url": "https://api.github.com/users/thot-experiment/repos",
"events_url": "https://api.github.com/users/thot-experiment/events{/privacy}",
"received_events_url": "https://api.github.com/users/thot-experiment/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-06-30T01:25:59
| 2024-08-12T16:28:56
| 2024-08-12T16:28:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
If I try to create a model from a modelfile that references an existing blob, ollama (I think) copies the entire blob to a temp file before realizing it already exists and creating the model. This makes model import take needlessly long and means you need extra free space on your drive to fit the model twice. I would expect it to just create whatever metadata files it needs and leave the model alone.
FWIW, there should probably be a better way to use existing models. I've been doing this manually by symlinking models into the blobs directory under the correct name, but the import process still requires a full copy of the model before it figures it out.
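The manual workaround can be sketched as follows. This is a hypothetical sketch, not Ollama's actual import code: the `sha256-<digest>` blob naming and the blobs-directory layout are assumptions based on the symlinking approach described above.

```shell
# Hypothetical sketch: compute the blob filename Ollama would use for a
# local model file, so it can be symlinked into the blobs directory by
# hand instead of being copied. The sha256-<digest> naming is an assumption.
set -eu
tmp=$(mktemp)
printf 'dummy model bytes' > "$tmp"            # stand-in for a real GGUF
digest=$(sha256sum "$tmp" | awk '{print $1}')  # 64 hex characters
blob_name="sha256-$digest"
echo "$blob_name"
rm -f "$tmp"
```

With a real model path substituted, something like `ln -s /path/to/model.gguf ~/.ollama/models/blobs/$blob_name` reproduces the manual workaround, though on Windows a hard link or junction would be needed instead of a symlink.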
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.47
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5388/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5388/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2822
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2822/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2822/comments
|
https://api.github.com/repos/ollama/ollama/issues/2822/events
|
https://github.com/ollama/ollama/issues/2822
| 2,160,194,172
|
I_kwDOJ0Z1Ps6AwfJ8
| 2,822
|
multiple idle ollama threads for each ollama serve process
|
{
"login": "aiseei",
"id": 30615541,
"node_id": "MDQ6VXNlcjMwNjE1NTQx",
"avatar_url": "https://avatars.githubusercontent.com/u/30615541?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aiseei",
"html_url": "https://github.com/aiseei",
"followers_url": "https://api.github.com/users/aiseei/followers",
"following_url": "https://api.github.com/users/aiseei/following{/other_user}",
"gists_url": "https://api.github.com/users/aiseei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aiseei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aiseei/subscriptions",
"organizations_url": "https://api.github.com/users/aiseei/orgs",
"repos_url": "https://api.github.com/users/aiseei/repos",
"events_url": "https://api.github.com/users/aiseei/events{/privacy}",
"received_events_url": "https://api.github.com/users/aiseei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-02-29T01:52:01
| 2024-04-09T15:05:15
| 2024-03-20T16:29:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ubuntu 20.04
We run a small proxy that creates multiple `ollama serve` processes on different ports. I have noticed in htop that there are a ton of threads created but not disposed of under each parent/master process.
This appears to happen on every generate API call. Does ollama not manage this?
Is there a workaround to safely close unused threads?
<img width="611" alt="image" src="https://github.com/ollama/ollama/assets/30615541/dd19047b-f181-4bae-a4a4-8eea23e2a58d">
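To watch for the leak numerically rather than eyeballing htop, the kernel's per-process thread count can be polled. This is a Linux-only sketch; substitute the `ollama serve` PID for `$$` when using it for real.

```shell
# Read the thread count of a process from /proc (here: this shell itself).
# Polling this for the ollama serve PID over time would show whether
# threads accumulate after each generate call.
pid=$$
nlwp=$(awk '/^Threads:/ {print $2}' "/proc/$pid/status")
echo "process $pid has $nlwp threads"
```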
Thanks!
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2822/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2822/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7795
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7795/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7795/comments
|
https://api.github.com/repos/ollama/ollama/issues/7795/events
|
https://github.com/ollama/ollama/issues/7795
| 2,682,642,543
|
I_kwDOJ0Z1Ps6f5eBv
| 7,795
|
Empty output from chat endpoint / non-empty output from non-chat endpoint
|
{
"login": "Tomas2D",
"id": 15633909,
"node_id": "MDQ6VXNlcjE1NjMzOTA5",
"avatar_url": "https://avatars.githubusercontent.com/u/15633909?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Tomas2D",
"html_url": "https://github.com/Tomas2D",
"followers_url": "https://api.github.com/users/Tomas2D/followers",
"following_url": "https://api.github.com/users/Tomas2D/following{/other_user}",
"gists_url": "https://api.github.com/users/Tomas2D/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Tomas2D/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Tomas2D/subscriptions",
"organizations_url": "https://api.github.com/users/Tomas2D/orgs",
"repos_url": "https://api.github.com/users/Tomas2D/repos",
"events_url": "https://api.github.com/users/Tomas2D/events{/privacy}",
"received_events_url": "https://api.github.com/users/Tomas2D/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 17
| 2024-11-22T09:44:47
| 2025-01-12T08:20:41
| 2024-12-09T19:02:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I request the chat endpoint with the attached request body, I receive an empty response (the content is an empty string) with `done_reason: stop`. When I send the exact same request (just wrapped in the appropriate model's template) to the generate (non-chat) endpoint, I receive the correct (non-empty) response.
## Chat request
### Request
```bash
$ curl -X POST -H "Content-Type: application/json" -d @chat_body.json http://127.0.0.1:11434/api/chat
```
File: [chat_body.json](https://github.com/user-attachments/files/17868125/chat_body.json)
### Response
```jsonl
{
"model": "llama3.1",
"created_at": "2024-11-22T09:32:06.13661Z",
"message": {
"role": "assistant",
"content": ""
},
"done_reason": "stop",
"done": true,
"total_duration": 4868301375,
"load_duration": 37070667,
"prompt_eval_count": 1257,
"prompt_eval_duration": 3522000000,
"eval_count": 1
}
```
## Non-Chat request
```bash
$ curl -X POST -H "Content-Type: application/json" -d @non_chat_body.json http://127.0.0.1:11434/api/generate
```
File: [non_chat_body.json](https://github.com/user-attachments/files/17868126/non_chat_body.json)
### Response
```jsonl
{
"model": "llama3.1",
"created_at": "2024-11-22T09:37:18.317133Z",
"response": "Final Answer: Why was the math book sad? Because it had too many problems.",
"done": true,
"done_reason": "stop",
"total_duration": 8587334375,
"load_duration": 51171791,
"prompt_eval_count": 1264,
"prompt_eval_duration": 344000000,
"eval_count": 18,
"eval_duration": 1479000000
}
```
# Closing notes
- There is roughly a 5% chance of getting a non-empty result.
- Before testing, I ensured I had the latest `llama3.1:8b`.
- A colleague had no issue with Ollama version `0.3.12`.
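On the client side, the empty completions can be detected programmatically. This is a sketch (the helper name `is_empty_completion` is hypothetical) comparing abbreviated copies of the two responses shown above, where `eval_count` is 1 for the empty chat reply versus 18 for the healthy generate reply:

```python
import json

# Abbreviated copies of the two responses shown above.
chat = json.loads('{"message": {"content": ""}, "done_reason": "stop", "eval_count": 1}')
gen = json.loads('{"response": "Final Answer: ...", "done_reason": "stop", "eval_count": 18}')

def is_empty_completion(resp):
    # A "stop" with no generated text means the model emitted only a stop token.
    text = resp.get("message", {}).get("content") or resp.get("response", "")
    return resp.get("done_reason") == "stop" and not text.strip()

print(is_empty_completion(chat))  # True: the buggy chat response
print(is_empty_completion(gen))   # False: the healthy generate response
```

A client could use such a check to retry requests that hit the bug while it remains unfixed.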
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.4.3
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7795/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7795/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2802
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2802/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2802/comments
|
https://api.github.com/repos/ollama/ollama/issues/2802/events
|
https://github.com/ollama/ollama/issues/2802
| 2,158,235,139
|
I_kwDOJ0Z1Ps6ApA4D
| 2,802
|
Madlad400 model
|
{
"login": "malipetek",
"id": 13527277,
"node_id": "MDQ6VXNlcjEzNTI3Mjc3",
"avatar_url": "https://avatars.githubusercontent.com/u/13527277?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/malipetek",
"html_url": "https://github.com/malipetek",
"followers_url": "https://api.github.com/users/malipetek/followers",
"following_url": "https://api.github.com/users/malipetek/following{/other_user}",
"gists_url": "https://api.github.com/users/malipetek/gists{/gist_id}",
"starred_url": "https://api.github.com/users/malipetek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/malipetek/subscriptions",
"organizations_url": "https://api.github.com/users/malipetek/orgs",
"repos_url": "https://api.github.com/users/malipetek/repos",
"events_url": "https://api.github.com/users/malipetek/events{/privacy}",
"received_events_url": "https://api.github.com/users/malipetek/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 7
| 2024-02-28T06:44:47
| 2024-09-16T11:45:06
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello, I wanted to test [madlad400](https://huggingface.co/jbochi/madlad400-3b-mt/blob/main/model-q4k.gguf), which is said to be a great translation model.
I downloaded the GGUF and created a Modelfile (named after the model) containing only a FROM line. The model appears to be created, but when I test-run it, it outputs two empty lines for some reason, and when I specify the `--verbose` flag there is no report at the end of inference.
This should work in theory, no?
<img width="842" alt="image" src="https://github.com/ollama/ollama/assets/13527277/f1cc65ea-4dd1-4c58-adc4-eb198f087dd3">
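For reference, the Modelfile described above would be a single FROM line pointing at the downloaded GGUF (a sketch matching the filename from the Hugging Face link):

```
FROM ./model-q4k.gguf
```

Madlad400 is a T5-family architecture, which may explain the empty output if the underlying runner does not support that model family even when the GGUF imports cleanly.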
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2802/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2802/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5057
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5057/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5057/comments
|
https://api.github.com/repos/ollama/ollama/issues/5057/events
|
https://github.com/ollama/ollama/issues/5057
| 2,354,580,827
|
I_kwDOJ0Z1Ps6MWA1b
| 5,057
|
Is the location for saving models different between automatic startup through 'systemctl' and manual 'serve'?
|
{
"login": "wszgrcy",
"id": 9607121,
"node_id": "MDQ6VXNlcjk2MDcxMjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/9607121?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wszgrcy",
"html_url": "https://github.com/wszgrcy",
"followers_url": "https://api.github.com/users/wszgrcy/followers",
"following_url": "https://api.github.com/users/wszgrcy/following{/other_user}",
"gists_url": "https://api.github.com/users/wszgrcy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wszgrcy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wszgrcy/subscriptions",
"organizations_url": "https://api.github.com/users/wszgrcy/orgs",
"repos_url": "https://api.github.com/users/wszgrcy/repos",
"events_url": "https://api.github.com/users/wszgrcy/events{/privacy}",
"received_events_url": "https://api.github.com/users/wszgrcy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-15T06:14:48
| 2024-06-21T01:52:55
| 2024-06-21T01:52:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I pulled the model `qwen2:7b`, and `ollama list` can see it:
```
> ollama list
NAME ID SIZE MODIFIED
qwen2:7b e0d4e1163c58 4.4 GB 6 hours ago
qwen:7b 2091ee8c8d8f 4.5 GB 2 weeks ago
```
but when I run `systemctl stop ollama` and then `ollama serve` manually, `ollama list` can't find the model:
```
> ollama list
NAME ID SIZE MODIFIED
qwen:14b 80362ced6553 8.2 GB 10 days ago
qwen:7b 2091ee8c8d8f 4.5 GB 2 weeks ago
```
I've encountered this before and assumed the pull had failed; it turns out the results of `ollama list` were completely different between the two startup modes.
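One likely explanation, sketched here under the assumption of a default Linux install: the systemd service runs as the dedicated `ollama` user, while a manual `ollama serve` runs as your own user, so each reads a different models directory unless `OLLAMA_MODELS` points both at the same path.

```shell
# Print the two directories each mode would read by default.
# The service-user home path is an assumption from the standard installer.
user_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
service_dir="/usr/share/ollama/.ollama/models"
echo "manual serve reads:    $user_dir"
echo "systemd service reads: $service_dir"
```

Setting `OLLAMA_MODELS` to the same path in both environments (for the service, via a systemd override) would make the two `ollama list` outputs agree.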
### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.1.44
|
{
"login": "wszgrcy",
"id": 9607121,
"node_id": "MDQ6VXNlcjk2MDcxMjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/9607121?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wszgrcy",
"html_url": "https://github.com/wszgrcy",
"followers_url": "https://api.github.com/users/wszgrcy/followers",
"following_url": "https://api.github.com/users/wszgrcy/following{/other_user}",
"gists_url": "https://api.github.com/users/wszgrcy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wszgrcy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wszgrcy/subscriptions",
"organizations_url": "https://api.github.com/users/wszgrcy/orgs",
"repos_url": "https://api.github.com/users/wszgrcy/repos",
"events_url": "https://api.github.com/users/wszgrcy/events{/privacy}",
"received_events_url": "https://api.github.com/users/wszgrcy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5057/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5057/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/284
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/284/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/284/comments
|
https://api.github.com/repos/ollama/ollama/issues/284/events
|
https://github.com/ollama/ollama/pull/284
| 1,837,002,931
|
PR_kwDOJ0Z1Ps5XNhWW
| 284
|
update to nous-hermes modelfile
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-08-04T15:57:43
| 2023-08-08T23:04:49
| 2023-08-08T23:04:49
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/284",
"html_url": "https://github.com/ollama/ollama/pull/284",
"diff_url": "https://github.com/ollama/ollama/pull/284.diff",
"patch_url": "https://github.com/ollama/ollama/pull/284.patch",
"merged_at": null
}
|
- The nous-hermes modelfile now accepts a system prompt for models that build on nous-hermes.
- Also updated the midjourney-prompter to use a better name.
As per Hugging Face (https://huggingface.co/NousResearch/Nous-Hermes-13b#prompt-format), the prompt format is:
```
Prompt Format
The model follows the Alpaca prompt format:
### Instruction:
### Response:
or
### Instruction:
### Input:
### Response:
```
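A TEMPLATE stanza matching that format might look like the following. This is a sketch, not the PR's actual diff; it assumes the `.System`/`.Prompt` Go-template variables used elsewhere in the repo's modelfiles:

```
FROM nous-hermes
TEMPLATE """{{ if .System }}{{ .System }}

{{ end }}### Instruction:
{{ .Prompt }}

### Response:
"""
```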
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/284/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/284/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2768
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2768/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2768/comments
|
https://api.github.com/repos/ollama/ollama/issues/2768/events
|
https://github.com/ollama/ollama/issues/2768
| 2,154,679,947
|
I_kwDOJ0Z1Ps6Abc6L
| 2,768
|
Ollama Not Running Failing to Load
|
{
"login": "TankMan649",
"id": 124530160,
"node_id": "U_kgDOB2wt8A",
"avatar_url": "https://avatars.githubusercontent.com/u/124530160?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TankMan649",
"html_url": "https://github.com/TankMan649",
"followers_url": "https://api.github.com/users/TankMan649/followers",
"following_url": "https://api.github.com/users/TankMan649/following{/other_user}",
"gists_url": "https://api.github.com/users/TankMan649/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TankMan649/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TankMan649/subscriptions",
"organizations_url": "https://api.github.com/users/TankMan649/orgs",
"repos_url": "https://api.github.com/users/TankMan649/repos",
"events_url": "https://api.github.com/users/TankMan649/events{/privacy}",
"received_events_url": "https://api.github.com/users/TankMan649/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-02-26T17:07:41
| 2024-03-12T00:00:04
| 2024-03-11T23:59:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I keep encountering a problem with Ollama; when it has resolved itself in the past I had no idea how, and everything I am doing to solve it now isn't working.
I am running a Python script with LangChain and Ollama, testing it on a simple Gradio interface. Let me emphasize this is a script that has worked before, and NOTHING has changed in the code.
It is a very simple Q&A interface from a RAG pipeline with indexed documents located in a LanceDB.
I have an RTX 3060TI with 96GB RAM and 2xCPUs with 12 cores each for 24 total.
I needed to show the Gradio test to someone today, and when I run the Python script, Gradio fires up and the Python code runs smoothly, but when you type in a question everything just spins forever. No error messages. No traceback. Nothing. It just never comes to an answer.
When it worked before, I could see Ollama in the GPU process list using memory (via nvidia-smi) and in the process list using significant CPU core power (via htop). Now, however, I do not see Ollama in the GPU or CPU processes.
I have tried upgrading Ollama, downgrading, restarting via systemctl, restarting the system, and updating CUDA (that was during a previous iteration of this problem, and I thought it maybe played a role in solving it the first time... guess I was wrong). Absolutely no idea what the problem is.
Thoughts?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2768/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2768/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6318
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6318/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6318/comments
|
https://api.github.com/repos/ollama/ollama/issues/6318/events
|
https://github.com/ollama/ollama/issues/6318
| 2,460,056,009
|
I_kwDOJ0Z1Ps6SoXnJ
| 6,318
|
ollama.app cannot open on my macbookpro with m3 pro
|
{
"login": "Spockkk0225",
"id": 54880260,
"node_id": "MDQ6VXNlcjU0ODgwMjYw",
"avatar_url": "https://avatars.githubusercontent.com/u/54880260?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Spockkk0225",
"html_url": "https://github.com/Spockkk0225",
"followers_url": "https://api.github.com/users/Spockkk0225/followers",
"following_url": "https://api.github.com/users/Spockkk0225/following{/other_user}",
"gists_url": "https://api.github.com/users/Spockkk0225/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Spockkk0225/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Spockkk0225/subscriptions",
"organizations_url": "https://api.github.com/users/Spockkk0225/orgs",
"repos_url": "https://api.github.com/users/Spockkk0225/repos",
"events_url": "https://api.github.com/users/Spockkk0225/events{/privacy}",
"received_events_url": "https://api.github.com/users/Spockkk0225/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 7
| 2024-08-12T05:11:49
| 2024-09-02T22:01:22
| 2024-09-02T22:01:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
environment: macbook pro, m3 pro, 18gb memory, Sonoma 14.4.1
The Ollama app cannot be opened with a double click, and running `/Applications/Ollama.app/Contents/MacOS/ollama` directly in the terminal fails with `segmentation fault`.
<img width="523" alt="截屏2024-08-12 13 07 49" src="https://github.com/user-attachments/assets/b7b0d4c3-66c0-4920-b7d0-ce8d8a7e4dde">
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.4
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6318/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6318/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5331
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5331/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5331/comments
|
https://api.github.com/repos/ollama/ollama/issues/5331/events
|
https://github.com/ollama/ollama/issues/5331
| 2,378,585,021
|
I_kwDOJ0Z1Ps6NxlO9
| 5,331
|
version 1.47 downloaded, gemma2 error
|
{
"login": "MeDott29",
"id": 13264408,
"node_id": "MDQ6VXNlcjEzMjY0NDA4",
"avatar_url": "https://avatars.githubusercontent.com/u/13264408?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MeDott29",
"html_url": "https://github.com/MeDott29",
"followers_url": "https://api.github.com/users/MeDott29/followers",
"following_url": "https://api.github.com/users/MeDott29/following{/other_user}",
"gists_url": "https://api.github.com/users/MeDott29/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MeDott29/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MeDott29/subscriptions",
"organizations_url": "https://api.github.com/users/MeDott29/orgs",
"repos_url": "https://api.github.com/users/MeDott29/repos",
"events_url": "https://api.github.com/users/MeDott29/events{/privacy}",
"received_events_url": "https://api.github.com/users/MeDott29/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 12
| 2024-06-27T16:17:09
| 2024-06-29T23:08:22
| 2024-06-29T23:06:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
Jun 27 12:06:15 ollama[11759]: INFO [main] build info | build=1 commit="7c26775" tid="124734763667456" timestamp=1719504375
Jun 27 12:06:15 ollama[11759]: INFO [main] system info | n_threads=4 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VN>
Jun 27 12:06:15 ollama[11759]: INFO [main] HTTP server listening | hostname="127.0.0.1" n_threads_http="7" port="42375" tid="124734763667456" timestamp=1719504375
Jun 27 12:06:15 ollama[10798]: llama_model_loader: loaded meta data with 32 key-value pairs and 464 tensors from /usr/share/ollama/.ollama/models/blobs/sha256-e84ed7399c82fbf7db>
Jun 27 12:06:15 ollama[10798]: llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 0: gemma2.attention.head_count u32 = 16
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 1: gemma2.attention.head_count_kv u32 = 8
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 2: gemma2.attention.key_length u32 = 256
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 3: gemma2.attention.layer_norm_rms_epsilon f32 = 0.000001
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 4: gemma2.attention.value_length u32 = 256
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 5: gemma2.block_count u32 = 42
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 6: gemma2.context_length u32 = 8192
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 7: gemma2.embedding_length u32 = 3584
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 8: gemma2.feed_forward_length u32 = 14336
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 9: general.architecture str = gemma2
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 10: general.file_type u32 = 2
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 11: general.name str = gemma2
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 12: general.quantization_version u32 = 2
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 13: tokenizer.chat_template str = {{ bos_token }}{% if messages[0]['rol...
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 14: tokenizer.ggml.add_bos_token bool = true
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 15: tokenizer.ggml.add_eos_token bool = false
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 16: tokenizer.ggml.add_padding_token bool = false
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 17: tokenizer.ggml.add_unknown_token bool = false
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 18: tokenizer.ggml.bos_token_id u32 = 2
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 19: tokenizer.ggml.eos_token_id u32 = 1
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 20: tokenizer.ggml.eot_token_id u32 = 107
Jun 27 12:06:15 ollama[10798]: time=2024-06-27T12:06:15.876-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server loading model"
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 21: tokenizer.ggml.merges arr[str,580604] = ["\n \n", "\n \n\n", "\n\n \n", "\n \n\n\n", "\n\n ...
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 22: tokenizer.ggml.middle_token_id u32 = 68
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 23: tokenizer.ggml.model str = llama
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 24: tokenizer.ggml.padding_token_id u32 = 0
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 25: tokenizer.ggml.pre str = default
Jun 27 12:06:15 ollama[10798]: llama_model_loader: - kv 26: tokenizer.ggml.prefix_token_id u32 = 67
Jun 27 12:06:16 ollama[10798]: llama_model_loader: - kv 27: tokenizer.ggml.scores arr[f32,256000] = [0.000000, 0.000000, 0.000000, 0.0000...
Jun 27 12:06:16 ollama[10798]: llama_model_loader: - kv 28: tokenizer.ggml.suffix_token_id u32 = 69
Jun 27 12:06:16 ollama[10798]: llama_model_loader: - kv 29: tokenizer.ggml.token_type arr[i32,256000] = [3, 3, 3, 2, 1, 1, 1, 1, 1, 1, 1, 1, ...
Jun 27 12:06:16 ollama[10798]: llama_model_loader: - kv 30: tokenizer.ggml.tokens arr[str,256000] = ["<pad>", "<eos>", "<bos>", "<unk>", ...
Jun 27 12:06:16 ollama[10798]: llama_model_loader: - kv 31: tokenizer.ggml.unknown_token_id u32 = 3
Jun 27 12:06:16 ollama[10798]: llama_model_loader: - type f32: 169 tensors
Jun 27 12:06:16 ollama[10798]: llama_model_loader: - type q4_0: 294 tensors
Jun 27 12:06:16 ollama[10798]: llama_model_loader: - type q6_K: 1 tensors
Jun 27 12:06:16 ollama[10798]: llm_load_vocab: special tokens cache size = 260
Jun 27 12:06:16 ollama[10798]: llm_load_vocab: token to piece cache size = 1.6014 MB
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: format = GGUF V3 (latest)
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: arch = gemma2
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: vocab type = SPM
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_vocab = 256000
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_merges = 0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_ctx_train = 8192
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_embd = 3584
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_head = 16
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_head_kv = 8
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_layer = 42
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_rot = 224
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_embd_head_k = 256
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_embd_head_v = 256
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_gqa = 2
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_embd_k_gqa = 2048
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_embd_v_gqa = 2048
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: f_norm_eps = 0.0e+00
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: f_norm_rms_eps = 1.0e-06
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: f_clamp_kqv = 0.0e+00
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: f_max_alibi_bias = 0.0e+00
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: f_logit_scale = 0.0e+00
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_ff = 14336
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_expert = 0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_expert_used = 0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: causal attn = 1
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: pooling type = 0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: rope type = 2
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: rope scaling = linear
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: freq_base_train = 10000.0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: freq_scale_train = 1
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: n_ctx_orig_yarn = 8192
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: rope_finetuned = unknown
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: ssm_d_conv = 0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: ssm_d_inner = 0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: ssm_d_state = 0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: ssm_dt_rank = 0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: model type = ?B
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: model ftype = Q4_0
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: model params = 9.24 B
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: model size = 5.06 GiB (4.71 BPW)
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: general.name = gemma2
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: BOS token = 2 '<bos>'
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: EOS token = 1 '<eos>'
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: UNK token = 3 '<unk>'
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: PAD token = 0 '<pad>'
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: LF token = 227 '<0x0A>'
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: PRE token = 67 '<unused60>'
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: SUF token = 69 '<unused62>'
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: MID token = 68 '<unused61>'
Jun 27 12:06:16 ollama[10798]: llm_load_print_meta: EOT token = 107 '<end_of_turn>'
Jun 27 12:06:16 ollama[10798]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: yes
Jun 27 12:06:16 ollama[10798]: ggml_cuda_init: CUDA_USE_TENSOR_CORES: no
Jun 27 12:06:16 ollama[10798]: ggml_cuda_init: found 2 CUDA devices:
Jun 27 12:06:16 ollama[10798]: Device 0: NVIDIA GeForce GTX 1060 6GB, compute capability 6.1, VMM: yes
Jun 27 12:06:16 ollama[10798]: Device 1: NVIDIA GeForce GTX 1060 3GB, compute capability 6.1, VMM: yes
Jun 27 12:06:16 ollama[10798]: llm_load_tensors: ggml ctx size = 0.68 MiB
Jun 27 12:06:16 ollama[10798]: llm_load_tensors: offloading 42 repeating layers to GPU
Jun 27 12:06:16 ollama[10798]: llm_load_tensors: offloading non-repeating layers to GPU
Jun 27 12:06:16 ollama[10798]: llm_load_tensors: offloaded 43/43 layers to GPU
Jun 27 12:06:16 ollama[10798]: llm_load_tensors: CPU buffer size = 717.77 MiB
Jun 27 12:06:16 ollama[10798]: llm_load_tensors: CUDA0 buffer size = 2765.55 MiB
Jun 27 12:06:16 ollama[10798]: llm_load_tensors: CUDA1 buffer size = 2419.66 MiB
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: n_ctx = 2048
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: n_batch = 512
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: n_ubatch = 512
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: flash_attn = 0
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: freq_base = 10000.0
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: freq_scale = 1
Jun 27 12:06:18 ollama[10798]: llama_kv_cache_init: CUDA0 KV buffer size = 416.00 MiB
Jun 27 12:06:18 ollama[10798]: llama_kv_cache_init: CUDA1 KV buffer size = 256.00 MiB
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: KV self size = 672.00 MiB, K (f16): 336.00 MiB, V (f16): 336.00 MiB
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: CUDA_Host output buffer size = 0.99 MiB
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: pipeline parallelism enabled (n_copies=4)
Jun 27 12:06:18 ollama[10798]: ggml_backend_cuda_buffer_type_alloc_buffer: allocating 551.02 MiB on device 1: cudaMalloc failed: out of memory
Jun 27 12:06:18 ollama[10798]: ggml_gallocr_reserve_n: failed to allocate CUDA1 buffer of size 577781760
Jun 27 12:06:18 ollama[10798]: llama_new_context_with_model: failed to allocate compute buffers
Jun 27 12:06:18 ollama[10798]: llama_init_from_gpt_params: error: failed to create context with model '/usr/share/ollama/.ollama/models/blobs/sha256-e84ed7399c82fbf7dbd6cdef3f12>
Jun 27 12:06:18 ollama[11759]: ERROR [load_model] unable to load model | model="/usr/share/ollama/.ollama/models/blobs/sha256-e84ed7399c82fbf7dbd6cdef3f12d356c3cdb5512e5d8b2a989>
Jun 27 12:06:18 ollama[10798]: terminate called without an active exception
Jun 27 12:06:18 ollama[10798]: time=2024-06-27T12:06:18.506-04:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server error"
Jun 27 12:06:18 ollama[10798]: time=2024-06-27T12:06:18.757-04:00 level=ERROR source=sched.go:388 msg="error loading llama server" error="llama runner process has terminated: signal: aborted (core dumped) error:failed to create context with model
Jun 27 12:06:18 ollama[10798]: [GIN] 2024/06/27 - 12:06:18 | 500 | 3.394458507s | 127.0.0.1 | POST "/api/chat"
Jun 27 12:06:23 ollama[10798]: time=2024-06-27T12:06:23.929-04:00 level=WARN source=sched.go:575 msg="gpu VRAM usage didn't recover within timeout" seconds=5.171683469 model=/us>
Jun 27 12:06:24 ollama[10798]: time=2024-06-27T12:06:24.179-04:00 level=WARN source=sched.go:575 msg="gpu VRAM usage didn't recover within timeout" seconds=5.421977546 model=/us>
Jun 27 12:06:24 ollama[10798]: time=2024-06-27T12:06:24.429-04:00 level=WARN source=sched.go:575 msg="gpu VRAM usage didn't recover within timeout" seconds=5.67175286 model=/usr>
```
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
1.47
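
The log above fails at `ggml_backend_cuda_buffer_type_alloc_buffer: allocating 551.02 MiB on device 1: cudaMalloc failed: out of memory` — the 3 GB GTX 1060 cannot hold its share of the compute buffers after all 43 layers were offloaded. A minimal workaround sketch, assuming the standard Ollama `/api/generate` endpoint and its `num_gpu` option; the layer count of 30 is illustrative, not a measured fit for this machine:

```python
import json


def generate_payload(prompt: str, num_gpu_layers: int = 30) -> str:
    """Build a request body for POST http://localhost:11434/api/generate
    that caps how many layers Ollama offloads to the GPU, leaving the
    rest on the CPU so the smaller card's VRAM is not exhausted."""
    return json.dumps({
        "model": "gemma2",
        "prompt": prompt,
        "stream": False,
        # num_gpu limits offloaded layers; the failing run above
        # offloaded 43/43, so a lower value trades speed for fit.
        "options": {"num_gpu": num_gpu_layers},
    })


payload = generate_payload("Why is the sky blue?")
```

Alternatively, restricting Ollama to the 6 GB device alone (for example by setting `CUDA_VISIBLE_DEVICES=0` in the service environment) avoids splitting buffers across the mismatched cards. Both are workarounds for the allocation failure, not fixes for the `gpu VRAM usage didn't recover` scheduler warnings themselves.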
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5331/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5331/timeline
| null |
completed
| false
|