| url (string, 51–54 chars) | repository_url (string, 1 class) | labels_url (string, 65–68 chars) | comments_url (string, 60–63 chars) | events_url (string, 58–61 chars) | html_url (string, 39–44 chars) | id (int64, 1.78B–2.82B) | node_id (string, 18–19 chars) | number (int64, 1–8.69k) | title (string, 1–382 chars) | user (dict) | labels (list, 0–5 items) | state (string, 2 classes) | locked (bool, 1 class) | assignee (dict) | assignees (list, 0–2 items) | milestone (null) | comments (int64, 0–323) | created_at (timestamp[s]) | updated_at (timestamp[s]) | closed_at (timestamp[s]) | author_association (string, 4 classes) | sub_issues_summary (dict) | active_lock_reason (null) | draft (bool, 2 classes) | pull_request (dict) | body (string, 2–118k chars, nullable) | closed_by (dict) | reactions (dict) | timeline_url (string, 60–63 chars) | performed_via_github_app (null) | state_reason (string, 4 classes) | is_pull_request (bool, 2 classes) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/3891
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3891/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3891/comments
|
https://api.github.com/repos/ollama/ollama/issues/3891/events
|
https://github.com/ollama/ollama/issues/3891
| 2,262,214,074
|
I_kwDOJ0Z1Ps6G1qW6
| 3,891
|
not clear what the options are for OLLAMA_LLM_LIBRARY
|
{
"login": "FlorinAndrei",
"id": 901867,
"node_id": "MDQ6VXNlcjkwMTg2Nw==",
"avatar_url": "https://avatars.githubusercontent.com/u/901867?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FlorinAndrei",
"html_url": "https://github.com/FlorinAndrei",
"followers_url": "https://api.github.com/users/FlorinAndrei/followers",
"following_url": "https://api.github.com/users/FlorinAndrei/following{/other_user}",
"gists_url": "https://api.github.com/users/FlorinAndrei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/FlorinAndrei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FlorinAndrei/subscriptions",
"organizations_url": "https://api.github.com/users/FlorinAndrei/orgs",
"repos_url": "https://api.github.com/users/FlorinAndrei/repos",
"events_url": "https://api.github.com/users/FlorinAndrei/events{/privacy}",
"received_events_url": "https://api.github.com/users/FlorinAndrei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-04-24T21:24:59
| 2024-05-01T23:02:27
| 2024-05-01T23:02:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
This document https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md says:
```
You can set OLLAMA_LLM_LIBRARY to any of the available LLM libraries to bypass autodetection, so for example, if you have a CUDA card, but want to force the CPU LLM library with AVX2 vector support, use:
OLLAMA_LLM_LIBRARY="cpu_avx2" ollama serve
```
What is not clear is: what are all the possible values I could give to OLLAMA_LLM_LIBRARY?
I ended up here while trying to figure out how to force the model to run on the CPU even when a GPU is present in the system. But I would like to see the more general answer, and it should be in the project documentation.
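For reference, applying the override from the quoted doc looks like this (a minimal sketch; `cpu_avx2` is just the example value from the troubleshooting doc, and the set of accepted values depends on how your build was compiled):

```shell
#!/bin/sh
# Sketch: force a CPU LLM library even when a GPU is present.
# "cpu_avx2" is the example value from the troubleshooting doc quoted
# above; the values accepted by your build may differ.
export OLLAMA_LLM_LIBRARY="cpu_avx2"
echo "OLLAMA_LLM_LIBRARY is set to: ${OLLAMA_LLM_LIBRARY}"
# ollama serve   # start the server with the override in effect
```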
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.32
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3891/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3891/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7793
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7793/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7793/comments
|
https://api.github.com/repos/ollama/ollama/issues/7793/events
|
https://github.com/ollama/ollama/issues/7793
| 2,682,562,136
|
I_kwDOJ0Z1Ps6f5KZY
| 7,793
|
LLM(vision) GGUF Recommendation: Is there any LLM(vision) with great performance in GGUF format?
|
{
"login": "bohaocheung",
"id": 106144344,
"node_id": "U_kgDOBlOiWA",
"avatar_url": "https://avatars.githubusercontent.com/u/106144344?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bohaocheung",
"html_url": "https://github.com/bohaocheung",
"followers_url": "https://api.github.com/users/bohaocheung/followers",
"following_url": "https://api.github.com/users/bohaocheung/following{/other_user}",
"gists_url": "https://api.github.com/users/bohaocheung/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bohaocheung/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bohaocheung/subscriptions",
"organizations_url": "https://api.github.com/users/bohaocheung/orgs",
"repos_url": "https://api.github.com/users/bohaocheung/repos",
"events_url": "https://api.github.com/users/bohaocheung/events{/privacy}",
"received_events_url": "https://api.github.com/users/bohaocheung/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 2
| 2024-11-22T09:30:55
| 2024-11-26T17:38:13
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### Disappointing Performance
It's really strange: I have tried many **LLMs with vision** in `GGUF` format listed on the official website, such as `Llama3.2-vision`, `llava`, `llava-llama3`, and `llava-phi3`. However, all of them perform disappointingly on the **vision** side, even on a simple task like recognizing an image of an apple.
### GGUF format
By the way, to get started quickly, I tried all of the LLMs in `GGUF` format. I don't know whether that is a hindrance to good performance.
### Help
So I am posting this issue to ask for help and for a recommendation of a good vision LLM in `GGUF` format, or **any other way of importing models** that gives great performance. Thanks for your help!
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.4.2
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7793/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7793/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3770
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3770/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3770/comments
|
https://api.github.com/repos/ollama/ollama/issues/3770/events
|
https://github.com/ollama/ollama/pull/3770
| 2,254,361,254
|
PR_kwDOJ0Z1Ps5tPV0T
| 3,770
|
Allow User-Defined GPU Selection for Ollama
|
{
"login": "chornox",
"id": 1256609,
"node_id": "MDQ6VXNlcjEyNTY2MDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/1256609?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chornox",
"html_url": "https://github.com/chornox",
"followers_url": "https://api.github.com/users/chornox/followers",
"following_url": "https://api.github.com/users/chornox/following{/other_user}",
"gists_url": "https://api.github.com/users/chornox/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chornox/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chornox/subscriptions",
"organizations_url": "https://api.github.com/users/chornox/orgs",
"repos_url": "https://api.github.com/users/chornox/repos",
"events_url": "https://api.github.com/users/chornox/events{/privacy}",
"received_events_url": "https://api.github.com/users/chornox/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-04-20T04:03:57
| 2024-04-24T06:28:20
| 2024-04-24T06:27:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3770",
"html_url": "https://github.com/ollama/ollama/pull/3770",
"diff_url": "https://github.com/ollama/ollama/pull/3770.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3770.patch",
"merged_at": null
}
|
Currently, Ollama defaults to using NVIDIA GPUs. This PR introduces the ability for users to choose their preferred GPU by leveraging the existing `CUDA_VISIBLE_DEVICES` environment variable.
By setting `CUDA_VISIBLE_DEVICES` to `-1` (an invalid device ID), users can ensure Ollama respects their GPU preference, regardless of brand. This change is backwards compatible and will not impact users without AMD GPUs.
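The workaround described above can be sketched as follows (`-1` matches no CUDA device, so CUDA-capable code sees no NVIDIA GPUs at all):

```shell
#!/bin/sh
# Hide all NVIDIA GPUs from CUDA by selecting a device ID that does not
# exist; Ollama then falls back to another backend (CPU or, with this
# PR, a non-NVIDIA GPU).
export CUDA_VISIBLE_DEVICES="-1"
echo "CUDA_VISIBLE_DEVICES is set to: ${CUDA_VISIBLE_DEVICES}"
# ollama serve   # start the server without NVIDIA GPU access
```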
|
{
"login": "chornox",
"id": 1256609,
"node_id": "MDQ6VXNlcjEyNTY2MDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/1256609?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chornox",
"html_url": "https://github.com/chornox",
"followers_url": "https://api.github.com/users/chornox/followers",
"following_url": "https://api.github.com/users/chornox/following{/other_user}",
"gists_url": "https://api.github.com/users/chornox/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chornox/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chornox/subscriptions",
"organizations_url": "https://api.github.com/users/chornox/orgs",
"repos_url": "https://api.github.com/users/chornox/repos",
"events_url": "https://api.github.com/users/chornox/events{/privacy}",
"received_events_url": "https://api.github.com/users/chornox/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3770/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3770/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6642
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6642/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6642/comments
|
https://api.github.com/repos/ollama/ollama/issues/6642/events
|
https://github.com/ollama/ollama/pull/6642
| 2,506,161,987
|
PR_kwDOJ0Z1Ps56cHdy
| 6,642
|
llm: use json.hpp from common
|
{
"login": "iscy",
"id": 294710,
"node_id": "MDQ6VXNlcjI5NDcxMA==",
"avatar_url": "https://avatars.githubusercontent.com/u/294710?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iscy",
"html_url": "https://github.com/iscy",
"followers_url": "https://api.github.com/users/iscy/followers",
"following_url": "https://api.github.com/users/iscy/following{/other_user}",
"gists_url": "https://api.github.com/users/iscy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iscy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iscy/subscriptions",
"organizations_url": "https://api.github.com/users/iscy/orgs",
"repos_url": "https://api.github.com/users/iscy/repos",
"events_url": "https://api.github.com/users/iscy/events{/privacy}",
"received_events_url": "https://api.github.com/users/iscy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-09-04T19:54:36
| 2024-09-04T23:34:42
| 2024-09-04T23:34:42
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6642",
"html_url": "https://github.com/ollama/ollama/pull/6642",
"diff_url": "https://github.com/ollama/ollama/pull/6642.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6642.patch",
"merged_at": "2024-09-04T23:34:42"
}
|
The version of json.hpp from the 'common' module was no longer the same as the one within the 'ext_server' module. The discrepancy could cause linking errors depending on the functions used. This patch removes the old version in favor of the one found in the common module.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6642/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6642/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8681
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8681/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8681/comments
|
https://api.github.com/repos/ollama/ollama/issues/8681/events
|
https://github.com/ollama/ollama/pull/8681
| 2,819,666,702
|
PR_kwDOJ0Z1Ps6JcEt2
| 8,681
|
Remove hard-coded GIN mode
|
{
"login": "yoonsio",
"id": 24367477,
"node_id": "MDQ6VXNlcjI0MzY3NDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/24367477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yoonsio",
"html_url": "https://github.com/yoonsio",
"followers_url": "https://api.github.com/users/yoonsio/followers",
"following_url": "https://api.github.com/users/yoonsio/following{/other_user}",
"gists_url": "https://api.github.com/users/yoonsio/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yoonsio/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yoonsio/subscriptions",
"organizations_url": "https://api.github.com/users/yoonsio/orgs",
"repos_url": "https://api.github.com/users/yoonsio/repos",
"events_url": "https://api.github.com/users/yoonsio/events{/privacy}",
"received_events_url": "https://api.github.com/users/yoonsio/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2025-01-30T01:02:26
| 2025-01-30T01:07:02
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8681",
"html_url": "https://github.com/ollama/ollama/pull/8681",
"diff_url": "https://github.com/ollama/ollama/pull/8681.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8681.patch",
"merged_at": null
}
|
## Context
https://github.com/ollama/ollama/issues/8682: Gin mode is hard-coded to `gin.DebugMode`, and the server displays this log on startup.
```
[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
```
## Changes
This PR removes hard-coded `gin.DebugMode` from the source code, which allows users to set the desired `GIN_MODE` via environment variable without modifying the source code.
Gin v1.10.0 loads `GIN_MODE` from the [environment variable](https://github.com/gin-gonic/gin/blob/v1.10.0/mode.go#L16-L25) and falls back to debug mode by default when it is unset.
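With the hard-coded call removed, the mode can be chosen per deployment, e.g. (a sketch; `release` is one of gin's three modes, alongside `debug` and `test`):

```shell
#!/bin/sh
# Run the server in gin's release mode via the environment. Gin v1.10.0
# reads GIN_MODE at init and falls back to debug when it is unset.
export GIN_MODE="release"
echo "GIN_MODE is set to: ${GIN_MODE}"
# ollama serve   # no [GIN-debug] warning banner in release mode
```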
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8681/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8681/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6783
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6783/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6783/comments
|
https://api.github.com/repos/ollama/ollama/issues/6783/events
|
https://github.com/ollama/ollama/issues/6783
| 2,523,800,337
|
I_kwDOJ0Z1Ps6WbiMR
| 6,783
|
Ollama run says "A model with that name already exists" but really it's a casing issue?
|
{
"login": "Sourdface",
"id": 130875793,
"node_id": "U_kgDOB80BkQ",
"avatar_url": "https://avatars.githubusercontent.com/u/130875793?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sourdface",
"html_url": "https://github.com/Sourdface",
"followers_url": "https://api.github.com/users/Sourdface/followers",
"following_url": "https://api.github.com/users/Sourdface/following{/other_user}",
"gists_url": "https://api.github.com/users/Sourdface/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Sourdface/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Sourdface/subscriptions",
"organizations_url": "https://api.github.com/users/Sourdface/orgs",
"repos_url": "https://api.github.com/users/Sourdface/repos",
"events_url": "https://api.github.com/users/Sourdface/events{/privacy}",
"received_events_url": "https://api.github.com/users/Sourdface/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-09-13T03:28:21
| 2024-09-13T03:28:21
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I don't know how to explain this exactly, but when I try to run `ollama run Llama3.1` I get the confusing error:
```
Error: a model with that name already exists
```
And it *does* exist, but the issue is that the casing is different (it's `llama3.1`, not `Llama3.1`). This is evidently confusing the engine somehow. If I give the lowercase version, it loads fine, and if I give a name that doesn't exist, I get `Error: pull model manifest: file does not exist`.
This is running on Ubuntu 24.04, x64.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.10
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6783/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6783/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7677
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7677/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7677/comments
|
https://api.github.com/repos/ollama/ollama/issues/7677/events
|
https://github.com/ollama/ollama/issues/7677
| 2,660,688,240
|
I_kwDOJ0Z1Ps6eluFw
| 7,677
|
Enable image embeddings for vision models
|
{
"login": "kevin-pw",
"id": 140451262,
"node_id": "U_kgDOCF8dvg",
"avatar_url": "https://avatars.githubusercontent.com/u/140451262?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kevin-pw",
"html_url": "https://github.com/kevin-pw",
"followers_url": "https://api.github.com/users/kevin-pw/followers",
"following_url": "https://api.github.com/users/kevin-pw/following{/other_user}",
"gists_url": "https://api.github.com/users/kevin-pw/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kevin-pw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevin-pw/subscriptions",
"organizations_url": "https://api.github.com/users/kevin-pw/orgs",
"repos_url": "https://api.github.com/users/kevin-pw/repos",
"events_url": "https://api.github.com/users/kevin-pw/events{/privacy}",
"received_events_url": "https://api.github.com/users/kevin-pw/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-15T04:37:24
| 2024-11-15T17:10:49
| 2024-11-15T17:09:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I would love to be able to create embeddings for images with vision models like `llama3.2-vision`.
Creating image and text embeddings with a vision-capable model should allow creating image search and image categorization applications.
If my understanding of the shared semantic vector space of image models is correct, it should be possible to perform calculations like cosine similarity on text and image embeddings to, for example, find all the photos of puppy dogs in a random assortment of photos :)
At this time, the `generate` endpoint accepts an `images` parameter, but the `embed` endpoint does not. I tried passing an image as a base64 string to the `input` parameter of the `embed` endpoint, but the resulting embedding appears to be the vector of the text string and not of the image.
Would it be possible to expand the `embed` endpoint to accept an image parameter?
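To illustrate the asymmetry described above, this sketch builds the kind of request body `/api/generate` accepts today, with an `images` array of base64 strings alongside the prompt; `/api/embed` has no such field. (The base64 payload and model name here are stand-ins; a real call would encode an actual image.)

```shell
#!/bin/sh
# Build the JSON body the /api/generate endpoint accepts today: an
# "images" array of base64 strings alongside the prompt. The /api/embed
# endpoint has no such field, which is the gap this issue describes.
IMG="aGVsbG8="   # placeholder base64 payload, not a real image
BODY=$(printf '{"model":"llama3.2-vision","prompt":"Describe this image.","images":["%s"]}' "$IMG")
echo "$BODY"
# curl http://localhost:11434/api/generate -d "$BODY"   # needs a running server
```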
|
{
"login": "kevin-pw",
"id": 140451262,
"node_id": "U_kgDOCF8dvg",
"avatar_url": "https://avatars.githubusercontent.com/u/140451262?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kevin-pw",
"html_url": "https://github.com/kevin-pw",
"followers_url": "https://api.github.com/users/kevin-pw/followers",
"following_url": "https://api.github.com/users/kevin-pw/following{/other_user}",
"gists_url": "https://api.github.com/users/kevin-pw/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kevin-pw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevin-pw/subscriptions",
"organizations_url": "https://api.github.com/users/kevin-pw/orgs",
"repos_url": "https://api.github.com/users/kevin-pw/repos",
"events_url": "https://api.github.com/users/kevin-pw/events{/privacy}",
"received_events_url": "https://api.github.com/users/kevin-pw/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7677/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7677/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/6526
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6526/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6526/comments
|
https://api.github.com/repos/ollama/ollama/issues/6526/events
|
https://github.com/ollama/ollama/issues/6526
| 2,488,730,181
|
I_kwDOJ0Z1Ps6UVwJF
| 6,526
|
database modify capability
|
{
"login": "nRanzo",
"id": 104451140,
"node_id": "U_kgDOBjnMRA",
"avatar_url": "https://avatars.githubusercontent.com/u/104451140?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nRanzo",
"html_url": "https://github.com/nRanzo",
"followers_url": "https://api.github.com/users/nRanzo/followers",
"following_url": "https://api.github.com/users/nRanzo/following{/other_user}",
"gists_url": "https://api.github.com/users/nRanzo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nRanzo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nRanzo/subscriptions",
"organizations_url": "https://api.github.com/users/nRanzo/orgs",
"repos_url": "https://api.github.com/users/nRanzo/repos",
"events_url": "https://api.github.com/users/nRanzo/events{/privacy}",
"received_events_url": "https://api.github.com/users/nRanzo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-08-27T09:05:35
| 2024-09-12T01:33:31
| 2024-09-12T01:33:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It would be interesting to have the option to point Ollama at a folder containing data and ask it to extract a database with the required fields into an `.md` file.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6526/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6526/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4051
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4051/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4051/comments
|
https://api.github.com/repos/ollama/ollama/issues/4051/events
|
https://github.com/ollama/ollama/issues/4051
| 2,271,433,081
|
I_kwDOJ0Z1Ps6HY1F5
| 4,051
|
Enable Flash Attention on GGML/GGUF (feature now merged into llama.cpp)
|
{
"login": "sammcj",
"id": 862951,
"node_id": "MDQ6VXNlcjg2Mjk1MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/862951?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sammcj",
"html_url": "https://github.com/sammcj",
"followers_url": "https://api.github.com/users/sammcj/followers",
"following_url": "https://api.github.com/users/sammcj/following{/other_user}",
"gists_url": "https://api.github.com/users/sammcj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sammcj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sammcj/subscriptions",
"organizations_url": "https://api.github.com/users/sammcj/orgs",
"repos_url": "https://api.github.com/users/sammcj/repos",
"events_url": "https://api.github.com/users/sammcj/events{/privacy}",
"received_events_url": "https://api.github.com/users/sammcj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 21
| 2024-04-30T13:06:47
| 2024-07-18T14:46:21
| 2024-05-20T20:36:04
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Flash Attention has landed in llama.cpp (https://github.com/ggerganov/llama.cpp/pull/5021).
The tl;dr is simply to pass the `-fa` flag to llama.cpp's server.
- Can we please have an Ollama server env var to pass this flag to the underlying llama.cpp server?
Also a related idea: perhaps there could be a way to pass arbitrary flags down to llama.cpp so that hints like this can be easily enabled? (E.g. `OLLAMA_LLAMA_EXTRA_ARGS=-fa,--something-else`)
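Until a dedicated setting exists, the requested behaviour could look like the sketch below; the env var name `OLLAMA_FLASH_ATTENTION` is an assumption standing in for whatever knob Ollama ends up exposing, not a confirmed flag:

```shell
# Hypothetical sketch: OLLAMA_FLASH_ATTENTION is an assumed name for a setting
# that would make Ollama pass -fa through to llama.cpp's server.
export OLLAMA_FLASH_ATTENTION=1
echo "flash attention requested: $OLLAMA_FLASH_ATTENTION"
# a restarted `ollama serve` would then launch the runner with -fa
```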
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4051/reactions",
"total_count": 30,
"+1": 18,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 12,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4051/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5329
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5329/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5329/comments
|
https://api.github.com/repos/ollama/ollama/issues/5329/events
|
https://github.com/ollama/ollama/issues/5329
| 2,378,420,087
|
I_kwDOJ0Z1Ps6Nw893
| 5,329
|
clip models fail to load with unicode characters in OLLAMA_MODELS path on windows
|
{
"login": "Derix76",
"id": 174033173,
"node_id": "U_kgDOCl-JFQ",
"avatar_url": "https://avatars.githubusercontent.com/u/174033173?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Derix76",
"html_url": "https://github.com/Derix76",
"followers_url": "https://api.github.com/users/Derix76/followers",
"following_url": "https://api.github.com/users/Derix76/following{/other_user}",
"gists_url": "https://api.github.com/users/Derix76/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Derix76/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Derix76/subscriptions",
"organizations_url": "https://api.github.com/users/Derix76/orgs",
"repos_url": "https://api.github.com/users/Derix76/repos",
"events_url": "https://api.github.com/users/Derix76/events{/privacy}",
"received_events_url": "https://api.github.com/users/Derix76/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 9
| 2024-06-27T15:07:04
| 2024-07-05T15:16:59
| 2024-07-05T15:16:59
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I tried to start llava:1.6 (or any similar llava-based model) and the llama server terminated.
llama3 or other non-llava models work just fine.
GPU is: NVIDIA GeForce RTX 4060" total="8.0 GiB" available="6.9 GiB"
CPU is: AMD Ryzen 7 4700G with Radeon GPU (ignored by Ollama) ("unsupported Radeon iGPU detected skipping" id=0 name="AMD Radeon(TM) Graphics")
OS is Win 11
Geforce Drivers: 555.99 Studio
Log extract:
time=2024-06-27T16:53:15.204+02:00 level=INFO source=memory.go:309 msg="offload to cuda" layers.requested=-1 layers.model=33 layers.offload=33 layers.split="" memory.available="[7.2 GiB]" memory.required.full="5.3 GiB" memory.required.partial="5.3 GiB" memory.required.kv="256.0 MiB" memory.required.allocations="[5.3 GiB]" memory.weights.total="3.9 GiB" memory.weights.repeating="3.8 GiB" memory.weights.nonrepeating="102.6 MiB" memory.graph.full="164.0 MiB" memory.graph.partial="181.0 MiB"
time=2024-06-27T16:53:15.205+02:00 level=WARN source=server.go:241 msg="multimodal models don't support parallel requests yet"
time=2024-06-27T16:53:15.208+02:00 level=INFO source=server.go:368 msg="starting llama server" cmd="C:\\Users\\Stefan Hüttner\\AppData\\Local\\Programs\\Ollama\\ollama_runners\\cuda_v11.3\\ollama_llama_server.exe --model C:\\Users\\Stefan Hüttner\\.ollama\\models\\blobs\\sha256-170370233dd5c5415250a2ecd5c71586352850729062ccef1496385647293868 --ctx-size 2048 --batch-size 512 --embedding --log-disable --n-gpu-layers 33 --mmproj C:\\Users\\Stefan Hüttner\\.ollama\\models\\blobs\\sha256-72d6f08a42f656d36b356dbe0920675899a99ce21192fd66266fb7d82ed07539 --no-mmap --parallel 1 --port 51274"
time=2024-06-27T16:53:15.742+02:00 level=INFO source=sched.go:382 msg="loaded runners" count=1
time=2024-06-27T16:53:15.742+02:00 level=INFO source=server.go:556 msg="waiting for llama runner to start responding"
time=2024-06-27T16:53:15.743+02:00 level=INFO source=server.go:594 msg="waiting for server to become available" status="llm server error"
INFO [wmain] build info | build=3171 commit="7c26775a" tid="17384" timestamp=1719499996
INFO [wmain] system info | n_threads=8 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | " tid="17384" timestamp=1719499996 total_threads=16
INFO [wmain] HTTP server listening | hostname="127.0.0.1" n_threads_http="15" port="51274" tid="17384" timestamp=1719499996
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: CUDA_USE_TENSOR_CORES: yes
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 4060, compute capability 8.9, VMM: yes
ERROR [load_model] unable to load clip model | model="C:\\Users\\Stefan Hüttner\\.ollama\\models\\blobs\\sha256-72d6f08a42f656d36b356dbe0920675899a99ce21192fd66266fb7d82ed07539" tid="17384" timestamp=1719499996
time=2024-06-27T16:53:16.512+02:00 level=ERROR source=sched.go:388 msg="error loading llama server" error="llama runner process has terminated: exit status 0xc0000409 "
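The failing blob path in the log contains a non-ASCII character (the `ü` in `Stefan Hüttner`), which is exactly what the issue title points at. A quick diagnostic sketch (function and path here are illustrative, not part of Ollama):

```python
# Diagnostic sketch: check whether a model path contains non-ASCII characters,
# which is what trips up clip model loading on Windows in this report.
def has_non_ascii(path: str) -> bool:
    return any(ord(ch) > 127 for ch in path)

# illustrative path modeled on the log above, not the real blob name
blob_path = r"C:\Users\Stefan Hüttner\.ollama\models\blobs\sha256-example"
print(has_non_ascii(blob_path))  # True: 'ü' is outside the ASCII range
# Workaround until fixed: point OLLAMA_MODELS at an ASCII-only directory.
```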
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.1.47
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5329/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5329/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8244
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8244/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8244/comments
|
https://api.github.com/repos/ollama/ollama/issues/8244/events
|
https://github.com/ollama/ollama/issues/8244
| 2,759,178,297
|
I_kwDOJ0Z1Ps6kdbg5
| 8,244
|
Ollama GPU/CPU
|
{
"login": "mcodexyz",
"id": 25278019,
"node_id": "MDQ6VXNlcjI1Mjc4MDE5",
"avatar_url": "https://avatars.githubusercontent.com/u/25278019?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mcodexyz",
"html_url": "https://github.com/mcodexyz",
"followers_url": "https://api.github.com/users/mcodexyz/followers",
"following_url": "https://api.github.com/users/mcodexyz/following{/other_user}",
"gists_url": "https://api.github.com/users/mcodexyz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mcodexyz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mcodexyz/subscriptions",
"organizations_url": "https://api.github.com/users/mcodexyz/orgs",
"repos_url": "https://api.github.com/users/mcodexyz/repos",
"events_url": "https://api.github.com/users/mcodexyz/events{/privacy}",
"received_events_url": "https://api.github.com/users/mcodexyz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-12-26T02:01:26
| 2024-12-27T01:05:53
| 2024-12-27T01:05:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
While trying to run the Qwen 2.5 7b 8Q model, I noticed differences in performance between llama.cpp and Ollama. In llama.cpp, the model runs entirely on the RTX 2060 Super graphics card, which is the desired behavior. However, in the case of Ollama, although the VRAM usage is significant (7220MiB out of the available 8192MiB), part of the calculation is still performed on the processor (CPU). I wonder if there is a way to force Ollama to use only the graphics card, since the model fits there entirely.
### OS
Linux, Docker
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.5.4
|
{
"login": "mcodexyz",
"id": 25278019,
"node_id": "MDQ6VXNlcjI1Mjc4MDE5",
"avatar_url": "https://avatars.githubusercontent.com/u/25278019?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mcodexyz",
"html_url": "https://github.com/mcodexyz",
"followers_url": "https://api.github.com/users/mcodexyz/followers",
"following_url": "https://api.github.com/users/mcodexyz/following{/other_user}",
"gists_url": "https://api.github.com/users/mcodexyz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mcodexyz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mcodexyz/subscriptions",
"organizations_url": "https://api.github.com/users/mcodexyz/orgs",
"repos_url": "https://api.github.com/users/mcodexyz/repos",
"events_url": "https://api.github.com/users/mcodexyz/events{/privacy}",
"received_events_url": "https://api.github.com/users/mcodexyz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8244/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8244/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2589
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2589/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2589/comments
|
https://api.github.com/repos/ollama/ollama/issues/2589/events
|
https://github.com/ollama/ollama/issues/2589
| 2,141,938,590
|
I_kwDOJ0Z1Ps5_q2Oe
| 2,589
|
Windows ARM support
|
{
"login": "PeronGH",
"id": 18367871,
"node_id": "MDQ6VXNlcjE4MzY3ODcx",
"avatar_url": "https://avatars.githubusercontent.com/u/18367871?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PeronGH",
"html_url": "https://github.com/PeronGH",
"followers_url": "https://api.github.com/users/PeronGH/followers",
"following_url": "https://api.github.com/users/PeronGH/following{/other_user}",
"gists_url": "https://api.github.com/users/PeronGH/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PeronGH/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PeronGH/subscriptions",
"organizations_url": "https://api.github.com/users/PeronGH/orgs",
"repos_url": "https://api.github.com/users/PeronGH/repos",
"events_url": "https://api.github.com/users/PeronGH/events{/privacy}",
"received_events_url": "https://api.github.com/users/PeronGH/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-02-19T09:41:03
| 2024-09-20T20:09:39
| 2024-09-20T20:09:39
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I tried to run it on a Windows on ARM device and the installer refused to execute.

Is there any plan for native Windows on ARM support? Or is it possible to remove the architecture check and make the x86 version work on ARM devices?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2589/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2589/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4795
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4795/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4795/comments
|
https://api.github.com/repos/ollama/ollama/issues/4795/events
|
https://github.com/ollama/ollama/issues/4795
| 2,330,494,055
|
I_kwDOJ0Z1Ps6K6IRn
| 4,795
|
Error: llama runner process has terminated: exit status 0xc000001d
|
{
"login": "Ecthellin203",
"id": 94040890,
"node_id": "U_kgDOBZrzOg",
"avatar_url": "https://avatars.githubusercontent.com/u/94040890?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Ecthellin203",
"html_url": "https://github.com/Ecthellin203",
"followers_url": "https://api.github.com/users/Ecthellin203/followers",
"following_url": "https://api.github.com/users/Ecthellin203/following{/other_user}",
"gists_url": "https://api.github.com/users/Ecthellin203/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Ecthellin203/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Ecthellin203/subscriptions",
"organizations_url": "https://api.github.com/users/Ecthellin203/orgs",
"repos_url": "https://api.github.com/users/Ecthellin203/repos",
"events_url": "https://api.github.com/users/Ecthellin203/events{/privacy}",
"received_events_url": "https://api.github.com/users/Ecthellin203/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-06-03T08:22:08
| 2024-07-03T23:25:39
| 2024-07-03T23:25:39
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
run llama3:8b
Error: llama runner process has terminated: exit status 0xc000001d
```
2024/06/03 15:40:13 routes.go:1007: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST: OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS: OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR:C:\\Users\\ecthe\\AppData\\Local\\Programs\\Ollama\\ollama_runners OLLAMA_TMPDIR:]"
time=2024-06-03T15:40:13.471+08:00 level=INFO source=images.go:729 msg="total blobs: 0"
time=2024-06-03T15:40:13.471+08:00 level=INFO source=images.go:736 msg="total unused blobs removed: 0"
time=2024-06-03T15:40:13.472+08:00 level=INFO source=routes.go:1053 msg="Listening on 127.0.0.1:11434 (version 0.1.41)"
time=2024-06-03T15:40:13.472+08:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [rocm_v5.7 cpu cpu_avx cpu_avx2 cuda_v11.3]"
time=2024-06-03T15:40:18.670+08:00 level=INFO source=types.go:71 msg="inference compute" id=0 library=rocm compute=gfx1100 driver=0.0 name="AMD Radeon RX 7900 XTX" total="24.0 GiB" available="23.9 GiB"
2024/06/03 15:43:44 routes.go:1007: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST: OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS: OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR:C:\\Users\\ecthe\\AppData\\Local\\Programs\\Ollama\\ollama_runners OLLAMA_TMPDIR:]"
time=2024-06-03T15:43:44.069+08:00 level=INFO source=images.go:729 msg="total blobs: 0"
time=2024-06-03T15:43:44.070+08:00 level=INFO source=images.go:736 msg="total unused blobs removed: 0"
time=2024-06-03T15:43:44.070+08:00 level=INFO source=routes.go:1053 msg="Listening on 127.0.0.1:11434 (version 0.1.41)"
time=2024-06-03T15:43:44.071+08:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11.3 rocm_v5.7]"
time=2024-06-03T15:43:45.378+08:00 level=INFO source=types.go:71 msg="inference compute" id=0 library=rocm compute=gfx1100 driver=0.0 name="AMD Radeon RX 7900 XTX" total="24.0 GiB" available="23.9 GiB"
2024/06/03 15:46:24 routes.go:1007: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST: OLLAMA_KEEP_ALIVE: OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_MODELS: OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR:C:\\Users\\ecthe\\AppData\\Local\\Programs\\Ollama\\ollama_runners OLLAMA_TMPDIR:]"
time=2024-06-03T15:46:24.031+08:00 level=INFO source=images.go:729 msg="total blobs: 0"
time=2024-06-03T15:46:24.034+08:00 level=INFO source=images.go:736 msg="total unused blobs removed: 0"
time=2024-06-03T15:46:24.035+08:00 level=INFO source=routes.go:1053 msg="Listening on 127.0.0.1:11434 (version 0.1.41)"
time=2024-06-03T15:46:24.035+08:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [rocm_v5.7 cpu cpu_avx cpu_avx2 cuda_v11.3]"
time=2024-06-03T15:46:24.820+08:00 level=INFO source=types.go:71 msg="inference compute" id=0 library=rocm compute=gfx1100 driver=0.0 name="AMD Radeon RX 7900 XTX" total="24.0 GiB" available="23.9 GiB"
[GIN] 2024/06/03 - 15:46:49 | 200 | 0s | 127.0.0.1 | GET "/api/version"
[GIN] 2024/06/03 - 15:47:01 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/06/03 - 15:47:01 | 404 | 527.2µs | 127.0.0.1 | POST "/api/show"
time=2024-06-03T15:58:14.081+08:00 level=INFO source=download.go:136 msg="downloading 6a0746a1ec1a in 47 100 MB part(s)"
time=2024-06-03T15:58:31.081+08:00 level=INFO source=download.go:251 msg="6a0746a1ec1a part 33 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2024-06-03T15:58:31.081+08:00 level=INFO source=download.go:251 msg="6a0746a1ec1a part 7 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2024-06-03T16:07:06.001+08:00 level=INFO source=download.go:178 msg="6a0746a1ec1a part 38 attempt 0 failed: unexpected EOF, retrying in 1s"
time=2024-06-03T16:09:06.642+08:00 level=INFO source=download.go:136 msg="downloading 4fa551d4f938 in 1 12 KB part(s)"
time=2024-06-03T16:09:09.722+08:00 level=INFO source=download.go:136 msg="downloading 8ab4849b038c in 1 254 B part(s)"
time=2024-06-03T16:09:12.829+08:00 level=INFO source=download.go:136 msg="downloading 577073ffcc6c in 1 110 B part(s)"
time=2024-06-03T16:09:15.947+08:00 level=INFO source=download.go:136 msg="downloading 3f8eb4da87fa in 1 485 B part(s)"
[GIN] 2024/06/03 - 16:09:50 | 200 | 11m38s | 127.0.0.1 | POST "/api/pull"
[GIN] 2024/06/03 - 16:09:50 | 200 | 1.6429ms | 127.0.0.1 | POST "/api/show"
[GIN] 2024/06/03 - 16:09:50 | 200 | 1.558ms | 127.0.0.1 | POST "/api/show"
time=2024-06-03T16:09:55.333+08:00 level=INFO source=memory.go:133 msg="offload to gpu" layers.requested=-1 layers.real=33 memory.available="23.9 GiB" memory.required.full="5.0 GiB" memory.required.partial="5.0 GiB" memory.required.kv="256.0 MiB" memory.weights.total="4.1 GiB" memory.weights.repeating="3.7 GiB" memory.weights.nonrepeating="411.0 MiB" memory.graph.full="164.0 MiB" memory.graph.partial="677.5 MiB"
time=2024-06-03T16:09:55.334+08:00 level=INFO source=memory.go:133 msg="offload to gpu" layers.requested=-1 layers.real=33 memory.available="23.9 GiB" memory.required.full="5.0 GiB" memory.required.partial="5.0 GiB" memory.required.kv="256.0 MiB" memory.weights.total="4.1 GiB" memory.weights.repeating="3.7 GiB" memory.weights.nonrepeating="411.0 MiB" memory.graph.full="164.0 MiB" memory.graph.partial="677.5 MiB"
time=2024-06-03T16:09:55.339+08:00 level=INFO source=server.go:341 msg="starting llama server" cmd="C:\\Users\\ecthe\\AppData\\Local\\Programs\\Ollama\\ollama_runners\\rocm_v5.7\\ollama_llama_server.exe --model D:\\Models\\blobs\\sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa --ctx-size 2048 --batch-size 512 --embedding --log-disable --n-gpu-layers 33 --parallel 1 --port 50779"
time=2024-06-03T16:09:55.542+08:00 level=INFO source=sched.go:338 msg="loaded runners" count=1
time=2024-06-03T16:09:55.542+08:00 level=INFO source=server.go:529 msg="waiting for llama runner to start responding"
time=2024-06-03T16:09:55.542+08:00 level=INFO source=server.go:567 msg="waiting for server to become available" status="llm server error"
time=2024-06-03T16:10:05.789+08:00 level=ERROR source=sched.go:344 msg="error loading llama server" error="llama runner process has terminated: exit status 0xc000001d "
[GIN] 2024/06/03 - 16:10:05 | 500 | 15.41134s | 127.0.0.1 | POST "/api/chat"
```
[server.log](https://github.com/user-attachments/files/15530934/server.log)
[app.log](https://github.com/user-attachments/files/15530937/app.log)
### OS
Windows
### GPU
AMD
### CPU
Intel
### Ollama version
0.1.41
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4795/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4795/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8343
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8343/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8343/comments
|
https://api.github.com/repos/ollama/ollama/issues/8343/events
|
https://github.com/ollama/ollama/pull/8343
| 2,773,995,059
|
PR_kwDOJ0Z1Ps6G__yQ
| 8,343
|
OpenAI: accept additional headers to fix CORS error #8342
|
{
"login": "isamu",
"id": 231763,
"node_id": "MDQ6VXNlcjIzMTc2Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/231763?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/isamu",
"html_url": "https://github.com/isamu",
"followers_url": "https://api.github.com/users/isamu/followers",
"following_url": "https://api.github.com/users/isamu/following{/other_user}",
"gists_url": "https://api.github.com/users/isamu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/isamu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/isamu/subscriptions",
"organizations_url": "https://api.github.com/users/isamu/orgs",
"repos_url": "https://api.github.com/users/isamu/repos",
"events_url": "https://api.github.com/users/isamu/events{/privacy}",
"received_events_url": "https://api.github.com/users/isamu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2025-01-08T00:40:37
| 2025-01-08T19:28:24
| 2025-01-08T19:28:11
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8343",
"html_url": "https://github.com/ollama/ollama/pull/8343",
"diff_url": "https://github.com/ollama/ollama/pull/8343.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8343.patch",
"merged_at": "2025-01-08T19:28:11"
}
|
Related to #8342, this adds some optional headers for the openai npm package.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8343/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8343/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2585
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2585/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2585/comments
|
https://api.github.com/repos/ollama/ollama/issues/2585/events
|
https://github.com/ollama/ollama/pull/2585
| 2,141,375,423
|
PR_kwDOJ0Z1Ps5nPC1L
| 2,585
|
Fix cuda leaks
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-02-19T02:37:52
| 2024-02-19T20:48:04
| 2024-02-19T20:48:00
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2585",
"html_url": "https://github.com/ollama/ollama/pull/2585",
"diff_url": "https://github.com/ollama/ollama/pull/2585.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2585.patch",
"merged_at": "2024-02-19T20:48:00"
}
|
This should resolve the problem where we don't fully unload from the GPU when we go idle.
Fixes #1848
This carries the upstream PR https://github.com/ggerganov/llama.cpp/pull/5576 as a patch until that's reviewed/merged.
This also updates the shutdown patch to match what was [merged upstream](https://github.com/ggerganov/llama.cpp/pull/5244#pullrequestreview-1887279675) since we haven't yet bumped llama.cpp to pick that up.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2585/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2585/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5843
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5843/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5843/comments
|
https://api.github.com/repos/ollama/ollama/issues/5843/events
|
https://github.com/ollama/ollama/issues/5843
| 2,422,076,442
|
I_kwDOJ0Z1Ps6QXfQa
| 5,843
|
How to offload all layers to GPU?
|
{
"login": "RakshitAralimatti",
"id": 170917018,
"node_id": "U_kgDOCi_8mg",
"avatar_url": "https://avatars.githubusercontent.com/u/170917018?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RakshitAralimatti",
"html_url": "https://github.com/RakshitAralimatti",
"followers_url": "https://api.github.com/users/RakshitAralimatti/followers",
"following_url": "https://api.github.com/users/RakshitAralimatti/following{/other_user}",
"gists_url": "https://api.github.com/users/RakshitAralimatti/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RakshitAralimatti/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RakshitAralimatti/subscriptions",
"organizations_url": "https://api.github.com/users/RakshitAralimatti/orgs",
"repos_url": "https://api.github.com/users/RakshitAralimatti/repos",
"events_url": "https://api.github.com/users/RakshitAralimatti/events{/privacy}",
"received_events_url": "https://api.github.com/users/RakshitAralimatti/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 18
| 2024-07-22T06:44:13
| 2024-11-17T20:06:11
| 2024-07-24T20:38:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently, when I run gemma2 (via Ollama serve) on my device, only 27 layers are offloaded to the GPU by default, but I want to offload all 43 layers to the GPU.
Does anyone know how I can do that?
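For reference, a minimal sketch (not an official answer) of passing the `num_gpu` option, which controls how many layers are offloaded, in an `/api/generate` request body. The helper function name is hypothetical; 43 is the layer count from the question above:

```python
import json

def build_generate_payload(model: str, prompt: str, num_gpu: int) -> str:
    # Hypothetical helper: serialize an /api/generate request that asks
    # the server to offload `num_gpu` layers to the GPU.
    payload = {
        "model": model,
        "prompt": prompt,
        "options": {"num_gpu": num_gpu},
    }
    return json.dumps(payload)

body = build_generate_payload("gemma2", "hello", 43)
print(body)
```

Whether all 43 layers actually load still depends on available VRAM; the scheduler may reduce the count if the model does not fit.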
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5843/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5843/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4539
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4539/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4539/comments
|
https://api.github.com/repos/ollama/ollama/issues/4539/events
|
https://github.com/ollama/ollama/issues/4539
| 2,306,061,251
|
I_kwDOJ0Z1Ps6Jc7PD
| 4,539
|
qwen模型简介未更新110B (Qwen model description not updated for 110B)
|
{
"login": "yuchenwei28",
"id": 141537882,
"node_id": "U_kgDOCG-yWg",
"avatar_url": "https://avatars.githubusercontent.com/u/141537882?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuchenwei28",
"html_url": "https://github.com/yuchenwei28",
"followers_url": "https://api.github.com/users/yuchenwei28/followers",
"following_url": "https://api.github.com/users/yuchenwei28/following{/other_user}",
"gists_url": "https://api.github.com/users/yuchenwei28/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuchenwei28/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuchenwei28/subscriptions",
"organizations_url": "https://api.github.com/users/yuchenwei28/orgs",
"repos_url": "https://api.github.com/users/yuchenwei28/repos",
"events_url": "https://api.github.com/users/yuchenwei28/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuchenwei28/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-05-20T13:56:16
| 2024-05-20T13:56:16
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The Qwen model description has not been updated for 110B.
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4539/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4539/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1551
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1551/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1551/comments
|
https://api.github.com/repos/ollama/ollama/issues/1551/events
|
https://github.com/ollama/ollama/issues/1551
| 2,044,219,935
|
I_kwDOJ0Z1Ps552FIf
| 1,551
|
Analyse this document.
|
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 4
| 2023-12-15T18:54:28
| 2024-05-10T00:27:00
| 2024-05-10T00:27:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Thinking of an enhancement. With llava, you could ask what a picture is about and give the file location.
I wonder if it would be useful or worthwhile to analyse a document by giving it the file location.
Downsides: no RAG, so info can't be easily stored.
Upsides: it would be super useful and could serve as a reference, using the document itself as the storage medium while it's running, i.e. load the vectors into memory linked to file locations.
Use case:
rewrite the code located at ./oldcode.py and save the new version at ./newcode.py change it to do ....
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1551/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1551/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7653
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7653/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7653/comments
|
https://api.github.com/repos/ollama/ollama/issues/7653/events
|
https://github.com/ollama/ollama/issues/7653
| 2,656,195,761
|
I_kwDOJ0Z1Ps6eUlSx
| 7,653
|
Validation of Keys and Subkeys in Ollama API JSON Objects
|
{
"login": "d-kleine",
"id": 53251018,
"node_id": "MDQ6VXNlcjUzMjUxMDE4",
"avatar_url": "https://avatars.githubusercontent.com/u/53251018?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/d-kleine",
"html_url": "https://github.com/d-kleine",
"followers_url": "https://api.github.com/users/d-kleine/followers",
"following_url": "https://api.github.com/users/d-kleine/following{/other_user}",
"gists_url": "https://api.github.com/users/d-kleine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/d-kleine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/d-kleine/subscriptions",
"organizations_url": "https://api.github.com/users/d-kleine/orgs",
"repos_url": "https://api.github.com/users/d-kleine/repos",
"events_url": "https://api.github.com/users/d-kleine/events{/privacy}",
"received_events_url": "https://api.github.com/users/d-kleine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-11-13T17:11:07
| 2024-12-03T10:32:50
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
**Problem:**
When interacting with the Ollama API, developers may inadvertently pass incorrect keys or subkeys in their requests (e.g., due to typos or misunderstanding of the expected structure). Currently, the API does not provide feedback when this occurs, which can lead to silent failures where options are ignored without notifying the user. This behavior makes debugging difficult and increases the likelihood of unintended behavior.
**Proposed Solution:**
Implement a validation mechanism that checks for the existence of valid keys and subkeys in the JSON request object. If an invalid key or subkey is detected, return an informative error message detailing which key or subkey is incorrect. This will help developers quickly identify issues in their request payloads.
### Expected Behavior:
1. **Validation on Request Submission:** When a request is made to any endpoint (e.g., `/generate`, `/chat`), the API should validate all provided keys and subkeys against a predefined schema.
2. **Error Response for Invalid Keys:** If an invalid key or subkey is detected, return an error response with a message such as:
```python
KeyError("Invalid key 'num_predct' in 'options'. Did you mean 'num_predict'?")
```
3. **Graceful Handling of Missing Required Keys:** If a required key (e.g., `"model"`, `"prompt"`) is missing, return an error indicating which required key is missing:
```python
KeyError("Missing required key 'model'.")
```
### Benefits:
1. **Improved Usability:** Developers will receive immediate feedback when they make mistakes in their request payloads, reducing debugging time.
2. **Reduced Silent Failures:** By explicitly notifying users of invalid keys or subkeys, you can eliminate silent failures where options are ignored without warning.
3. **Increased Confidence in Requests:** Developers can be confident that all provided options are being processed as intended.
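A rough sketch of the kind of validation being proposed, using `difflib` to suggest near-miss keys. The schema sets below are illustrative, not Ollama's actual request schema:

```python
import difflib

# Illustrative schema: valid top-level keys and valid "options" subkeys.
VALID_KEYS = {"model", "prompt", "options"}
VALID_OPTIONS = {"num_predict", "temperature", "top_p"}
REQUIRED_KEYS = {"model", "prompt"}

def validate_request(payload: dict) -> None:
    # Reject missing required keys first.
    for key in REQUIRED_KEYS - payload.keys():
        raise KeyError(f"Missing required key '{key}'.")
    # Reject unknown top-level keys.
    for key in payload:
        if key not in VALID_KEYS:
            raise KeyError(f"Invalid key '{key}'.")
    # Reject unknown subkeys of "options", suggesting a close match.
    for sub in payload.get("options", {}):
        if sub not in VALID_OPTIONS:
            hint = difflib.get_close_matches(sub, VALID_OPTIONS, n=1)
            suggestion = f" Did you mean '{hint[0]}'?" if hint else ""
            raise KeyError(f"Invalid key '{sub}' in 'options'.{suggestion}")

# A well-formed payload passes silently.
validate_request({"model": "m", "prompt": "p", "options": {"num_predict": 10}})
```

A typo such as `num_predct` would then raise `KeyError` with a "Did you mean 'num_predict'?" hint rather than being silently ignored.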
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7653/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/7653/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3961
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3961/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3961/comments
|
https://api.github.com/repos/ollama/ollama/issues/3961/events
|
https://github.com/ollama/ollama/issues/3961
| 2,266,535,771
|
I_kwDOJ0Z1Ps6HGJdb
| 3,961
|
setting OLLAMA_HOST to 0.0.0.0 could make the API listen on the port using IPv6 only
|
{
"login": "TadayukiOkada",
"id": 51673480,
"node_id": "MDQ6VXNlcjUxNjczNDgw",
"avatar_url": "https://avatars.githubusercontent.com/u/51673480?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TadayukiOkada",
"html_url": "https://github.com/TadayukiOkada",
"followers_url": "https://api.github.com/users/TadayukiOkada/followers",
"following_url": "https://api.github.com/users/TadayukiOkada/following{/other_user}",
"gists_url": "https://api.github.com/users/TadayukiOkada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TadayukiOkada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TadayukiOkada/subscriptions",
"organizations_url": "https://api.github.com/users/TadayukiOkada/orgs",
"repos_url": "https://api.github.com/users/TadayukiOkada/repos",
"events_url": "https://api.github.com/users/TadayukiOkada/events{/privacy}",
"received_events_url": "https://api.github.com/users/TadayukiOkada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-04-26T21:35:48
| 2025-01-30T00:03:34
| 2024-04-26T23:43:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Edit 2: sorry, if you set BindIPv6Only, 0.0.0.0:11434 should use v4, so this shouldn't be a problem.
Edit: by default, it seems it will listen on both v4 and v6. If BindIPv6Only is set in systemd.socket, or /proc/sys/net/ipv6/bindv6only is set to 1, it may not listen on v4.
Ollama is only listening on IPv6 with OLLAMA_HOST=0.0.0.0:
```
# netstat -anp | grep 11434
tcp6 0 0 :::11434 :::* LISTEN 5009/ollama
```
This seems to be the cause:
https://github.com/golang/go/issues/48723
### OS
Linux
### GPU
Nvidia
### CPU
Other
### Ollama version
0.1.32
|
{
"login": "TadayukiOkada",
"id": 51673480,
"node_id": "MDQ6VXNlcjUxNjczNDgw",
"avatar_url": "https://avatars.githubusercontent.com/u/51673480?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TadayukiOkada",
"html_url": "https://github.com/TadayukiOkada",
"followers_url": "https://api.github.com/users/TadayukiOkada/followers",
"following_url": "https://api.github.com/users/TadayukiOkada/following{/other_user}",
"gists_url": "https://api.github.com/users/TadayukiOkada/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TadayukiOkada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TadayukiOkada/subscriptions",
"organizations_url": "https://api.github.com/users/TadayukiOkada/orgs",
"repos_url": "https://api.github.com/users/TadayukiOkada/repos",
"events_url": "https://api.github.com/users/TadayukiOkada/events{/privacy}",
"received_events_url": "https://api.github.com/users/TadayukiOkada/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3961/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3961/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3831
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3831/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3831/comments
|
https://api.github.com/repos/ollama/ollama/issues/3831/events
|
https://github.com/ollama/ollama/issues/3831
| 2,257,276,019
|
I_kwDOJ0Z1Ps6Gi0xz
| 3,831
|
Upsert to Vector Store Error: 404
|
{
"login": "thedavc",
"id": 28845125,
"node_id": "MDQ6VXNlcjI4ODQ1MTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/28845125?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thedavc",
"html_url": "https://github.com/thedavc",
"followers_url": "https://api.github.com/users/thedavc/followers",
"following_url": "https://api.github.com/users/thedavc/following{/other_user}",
"gists_url": "https://api.github.com/users/thedavc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thedavc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thedavc/subscriptions",
"organizations_url": "https://api.github.com/users/thedavc/orgs",
"repos_url": "https://api.github.com/users/thedavc/repos",
"events_url": "https://api.github.com/users/thedavc/events{/privacy}",
"received_events_url": "https://api.github.com/users/thedavc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-04-22T19:10:19
| 2024-09-05T20:08:22
| 2024-05-09T22:34:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm running into a 404 error when upserting into the Flowise Vector Store. The system does not seem to register the call. In the server logs, I can see that the chat API is working as expected, but the embeddings endpoint is not.
{"function":"print_timings","level":"INFO","line":290,"msg":"generation eval time = 6108.93 ms / 236 runs ( 25.89 ms per token, 38.63 tokens per second)","n_decoded":236,"n_tokens_second":38.631969919445794,"slot_id":0,"t_token":25.885296610169494,"t_token_generation":6108.93,"task_id":257,"tid":"5144","timestamp":1713806786}
{"function":"print_timings","level":"INFO","line":299,"msg":" total time = 6249.46 ms","slot_id":0,"t_prompt_processing":140.531,"t_token_generation":6108.93,"t_total":6249.461,"task_id":257,"tid":"5144","timestamp":1713806786}
{"function":"update_slots","level":"INFO","line":1648,"msg":"slot released","n_cache_tokens":298,"n_ctx":2048,"n_past":297,"n_system_tokens":0,"slot_id":0,"task_id":257,"tid":"5144","timestamp":1713806786,"truncated":false}
{"function":"log_server_request","level":"INFO","line":2741,"method":"POST","msg":"request","params":{},"path":"/completion","remote_addr":"127.0.0.1","remote_port":53510,"status":200,"tid":"5268","timestamp":1713806786}
[GIN] 2024/04/22 - 10:26:26 | 200 | 6.2526428s | 192.168.86.1 | POST "/api/chat"
[GIN] 2024/04/22 - 10:26:37 | 404 | 0s | 192.168.86.1 | POST "//api/embeddings"
[GIN] 2024/04/22 - 10:26:37 | 404 | 0s | 192.168.86.1 | POST "//api/embeddings"
[GIN] 2024/04/22 - 10:26:38 | 404 | 0s | 192.168.86.1 | POST "//api/embeddings"
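The doubled slash in `//api/embeddings` suggests the configured base URL ends with a trailing `/`, so the client produces `http://host:11434//api/embeddings`, which the router treats as a different (unregistered) path. A small sketch of normalizing the join on the client side; the helper name is illustrative:

```python
def join_api_path(base_url: str, path: str) -> str:
    # Strip any trailing slash from the base and leading slash from the
    # path so "http://host:11434/" + "/api/embeddings" does not become
    # "http://host:11434//api/embeddings".
    return base_url.rstrip("/") + "/" + path.lstrip("/")

print(join_api_path("http://127.0.0.1:11434/", "/api/embeddings"))
# -> http://127.0.0.1:11434/api/embeddings
```

Removing the trailing slash from the base URL configured in Flowise should have the same effect.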
### OS
Linux, Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3831/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3831/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5572
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5572/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5572/comments
|
https://api.github.com/repos/ollama/ollama/issues/5572/events
|
https://github.com/ollama/ollama/pull/5572
| 2,398,302,645
|
PR_kwDOJ0Z1Ps501stW
| 5,572
|
Create SECURITY.md
|
{
"login": "Senipostol",
"id": 168364989,
"node_id": "U_kgDOCgkLvQ",
"avatar_url": "https://avatars.githubusercontent.com/u/168364989?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Senipostol",
"html_url": "https://github.com/Senipostol",
"followers_url": "https://api.github.com/users/Senipostol/followers",
"following_url": "https://api.github.com/users/Senipostol/following{/other_user}",
"gists_url": "https://api.github.com/users/Senipostol/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Senipostol/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Senipostol/subscriptions",
"organizations_url": "https://api.github.com/users/Senipostol/orgs",
"repos_url": "https://api.github.com/users/Senipostol/repos",
"events_url": "https://api.github.com/users/Senipostol/events{/privacy}",
"received_events_url": "https://api.github.com/users/Senipostol/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-07-09T13:54:50
| 2024-08-17T02:33:09
| 2024-08-14T16:55:10
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5572",
"html_url": "https://github.com/ollama/ollama/pull/5572",
"diff_url": "https://github.com/ollama/ollama/pull/5572.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5572.patch",
"merged_at": null
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5572/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5572/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/977
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/977/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/977/comments
|
https://api.github.com/repos/ollama/ollama/issues/977/events
|
https://github.com/ollama/ollama/issues/977
| 1,975,159,125
|
I_kwDOJ0Z1Ps51uolV
| 977
|
connect: can't assign requested address & $HOME variable not defined
|
{
"login": "tyhallcsu",
"id": 16804423,
"node_id": "MDQ6VXNlcjE2ODA0NDIz",
"avatar_url": "https://avatars.githubusercontent.com/u/16804423?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tyhallcsu",
"html_url": "https://github.com/tyhallcsu",
"followers_url": "https://api.github.com/users/tyhallcsu/followers",
"following_url": "https://api.github.com/users/tyhallcsu/following{/other_user}",
"gists_url": "https://api.github.com/users/tyhallcsu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tyhallcsu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tyhallcsu/subscriptions",
"organizations_url": "https://api.github.com/users/tyhallcsu/orgs",
"repos_url": "https://api.github.com/users/tyhallcsu/repos",
"events_url": "https://api.github.com/users/tyhallcsu/events{/privacy}",
"received_events_url": "https://api.github.com/users/tyhallcsu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2023-11-02T22:46:05
| 2023-11-03T07:25:05
| 2023-11-03T07:25:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
```
% ollama run llama2
Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: connect: can't assign requested address
```
The log file `/opt/homebrew/var/log/ollama.log` states:
```
Error: $HOME is not defined
Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: connect: can't assign requested address
```
But when I run `echo $HOME` I get
```
% echo $HOME
/Users/myusername
```
So what do I need to do to fix this? I even checked that the HOME variable was accessible from a C++ environment using this script:
```cpp
#include <cstdlib>
#include <iostream>
int main() {
const char* homeDir = getenv("HOME");
if (homeDir) {
std::cout << "The HOME directory is: " << homeDir << std::endl;
} else {
std::cout << "The HOME environment variable is not set." << std::endl;
}
return 0;
}
```
and it returned my $HOME variable just fine.
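Not part of the original report, but a hedged illustration of the likely mismatch: a daemon started by launchd or `brew services` does not inherit the interactive shell's environment, so HOME can be unset for the service even though `echo $HOME` works in a terminal. A minimal Python sketch (assuming a POSIX system) of resolving the home directory without relying on HOME:

```python
import os
import pwd

def resolve_home():
    """Return $HOME if set, otherwise fall back to the passwd database.

    Services launched outside a login shell may run without HOME in their
    environment, which is why a terminal check can pass while the service
    still reports "$HOME is not defined".
    """
    home = os.environ.get("HOME")
    if home:
        return home
    # Fallback: look up the current user's home directory directly.
    return pwd.getpwuid(os.getuid()).pw_dir
```

This is only a sketch of the general pattern, not Ollama's actual startup code.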
|
{
"login": "tyhallcsu",
"id": 16804423,
"node_id": "MDQ6VXNlcjE2ODA0NDIz",
"avatar_url": "https://avatars.githubusercontent.com/u/16804423?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tyhallcsu",
"html_url": "https://github.com/tyhallcsu",
"followers_url": "https://api.github.com/users/tyhallcsu/followers",
"following_url": "https://api.github.com/users/tyhallcsu/following{/other_user}",
"gists_url": "https://api.github.com/users/tyhallcsu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tyhallcsu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tyhallcsu/subscriptions",
"organizations_url": "https://api.github.com/users/tyhallcsu/orgs",
"repos_url": "https://api.github.com/users/tyhallcsu/repos",
"events_url": "https://api.github.com/users/tyhallcsu/events{/privacy}",
"received_events_url": "https://api.github.com/users/tyhallcsu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/977/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/977/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/587
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/587/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/587/comments
|
https://api.github.com/repos/ollama/ollama/issues/587/events
|
https://github.com/ollama/ollama/issues/587
| 1,911,634,901
|
I_kwDOJ0Z1Ps5x8TvV
| 587
|
Clicking "restart to update Ollama" may not restart the Mac app
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2023-09-25T14:27:17
| 2023-09-28T19:29:19
| 2023-09-28T19:29:19
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://github.com/jmorganca/ollama/assets/5853428/ad263742-275a-4dca-a7f9-f7ea8b6408f7
Clicking the "restart to update Ollama" option to get a new version of the Ollama app did not close and update the desktop Mac app. Looking at the server logs, there are no errors or other information displayed.
Ollama is still responsive, and using the CLI still works.
**Additional info:**

- This behaviour was observed after sleeping the computer. Not sure if this was a factor.
**Workaround:**
- Exiting and re-opening the toolbar app updates successfully.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/587/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/587/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/643
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/643/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/643/comments
|
https://api.github.com/repos/ollama/ollama/issues/643/events
|
https://github.com/ollama/ollama/issues/643
| 1,918,611,605
|
I_kwDOJ0Z1Ps5yW7CV
| 643
|
Docs request: quantizations used for Llama models
|
{
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-09-29T05:16:20
| 2023-09-30T08:10:10
| 2023-09-30T04:57:03
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It's nice that https://ollama.ai/library/llama2 links the model source (TheBloke).
Can we add which quantization is used? That way there's more traceability as to which model is being run/downloaded.
---
Update: I can see from the aliases that it's Q4_0

I think, though, that's somewhat buried; it would be good to have a more explicit table explaining this.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/643/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/643/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6339
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6339/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6339/comments
|
https://api.github.com/repos/ollama/ollama/issues/6339/events
|
https://github.com/ollama/ollama/issues/6339
| 2,463,699,809
|
I_kwDOJ0Z1Ps6S2RNh
| 6,339
|
ollama - default tool support
|
{
"login": "Kreijstal",
"id": 2415206,
"node_id": "MDQ6VXNlcjI0MTUyMDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/2415206?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kreijstal",
"html_url": "https://github.com/Kreijstal",
"followers_url": "https://api.github.com/users/Kreijstal/followers",
"following_url": "https://api.github.com/users/Kreijstal/following{/other_user}",
"gists_url": "https://api.github.com/users/Kreijstal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kreijstal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kreijstal/subscriptions",
"organizations_url": "https://api.github.com/users/Kreijstal/orgs",
"repos_url": "https://api.github.com/users/Kreijstal/repos",
"events_url": "https://api.github.com/users/Kreijstal/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kreijstal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 1
| 2024-08-13T16:00:14
| 2024-10-04T09:49:08
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
In the standard CLI there should be some default tools, like IPython (similar to a code interpreter, disabled by default), so that you can add those tools to any model that supports them, as well as a way to integrate custom tools (maybe someone wants to integrate their SD3 workflow), but it should be provided by the CLI itself, no?
It should be available as a CLI flag, without needing to use the Ollama API, just the CLI!
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6339/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6339/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2140
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2140/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2140/comments
|
https://api.github.com/repos/ollama/ollama/issues/2140/events
|
https://github.com/ollama/ollama/issues/2140
| 2,094,532,599
|
I_kwDOJ0Z1Ps582Af3
| 2,140
|
Embedding api returns null (sometimes)
|
{
"login": "Gal-Lahat",
"id": 73216615,
"node_id": "MDQ6VXNlcjczMjE2NjE1",
"avatar_url": "https://avatars.githubusercontent.com/u/73216615?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Gal-Lahat",
"html_url": "https://github.com/Gal-Lahat",
"followers_url": "https://api.github.com/users/Gal-Lahat/followers",
"following_url": "https://api.github.com/users/Gal-Lahat/following{/other_user}",
"gists_url": "https://api.github.com/users/Gal-Lahat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Gal-Lahat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Gal-Lahat/subscriptions",
"organizations_url": "https://api.github.com/users/Gal-Lahat/orgs",
"repos_url": "https://api.github.com/users/Gal-Lahat/repos",
"events_url": "https://api.github.com/users/Gal-Lahat/events{/privacy}",
"received_events_url": "https://api.github.com/users/Gal-Lahat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-01-22T18:48:16
| 2024-03-13T23:02:37
| 2024-03-13T23:02:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
This is my code (C# .NET):
```cs
string url = "http://localhost:11434/api/embeddings";
string json = "{ \"model\": \"llama2:text\",\"prompt\": \"" + jsonSafeText + "\" }";
// Build the request and read the JSON response; the "embedding" field comes back in the body.
HttpClient client = new HttpClient();
var response = client.PostAsync(url, new StringContent(json, System.Text.Encoding.UTF8, "application/json")).Result;
if (response.StatusCode != System.Net.HttpStatusCode.OK)
{
    Debug.LogError("Error getting embedding for: " + jsonSafeText);
    return new float[0];
}
string responseString = response.Content.ReadAsStringAsync().Result;
```
On about 50% of the calls I get
`{"embedding":null}`
as the response, with no errors.
The issue persists on all models I've tested (llama2, llama2:text, mistral, mistral:text).
The first run is always fine, but from the second run onwards it fails randomly with no error.
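Not from the original report: a hedged Python sketch of a client-side guard for this symptom, retrying when the server returns `{"embedding": null}`. The endpoint and model name mirror the C# snippet above; the retry count of 3 is an arbitrary assumption.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # same endpoint as the C# snippet

def parse_embedding(body: str):
    """Return the embedding list, or None if the server sent a null/empty embedding."""
    data = json.loads(body)
    emb = data.get("embedding")
    return emb if isinstance(emb, list) and emb else None

def get_embedding(prompt: str, model: str = "llama2:text", retries: int = 3):
    """POST to /api/embeddings, retrying when the embedding comes back null."""
    payload = json.dumps({"model": model, "prompt": prompt}).encode()
    for _ in range(retries):
        req = urllib.request.Request(
            OLLAMA_URL,
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            emb = parse_embedding(resp.read().decode())
        if emb is not None:
            return emb
    raise RuntimeError(f"embedding was null after {retries} attempts")
```

This only works around the symptom; the underlying server bug is what the issue is about.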
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2140/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2140/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7097
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7097/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7097/comments
|
https://api.github.com/repos/ollama/ollama/issues/7097/events
|
https://github.com/ollama/ollama/pull/7097
| 2,565,214,977
|
PR_kwDOJ0Z1Ps59kAsi
| 7,097
|
feat: configure auto startup in macos
|
{
"login": "hichemfantar",
"id": 34947993,
"node_id": "MDQ6VXNlcjM0OTQ3OTkz",
"avatar_url": "https://avatars.githubusercontent.com/u/34947993?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hichemfantar",
"html_url": "https://github.com/hichemfantar",
"followers_url": "https://api.github.com/users/hichemfantar/followers",
"following_url": "https://api.github.com/users/hichemfantar/following{/other_user}",
"gists_url": "https://api.github.com/users/hichemfantar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hichemfantar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hichemfantar/subscriptions",
"organizations_url": "https://api.github.com/users/hichemfantar/orgs",
"repos_url": "https://api.github.com/users/hichemfantar/repos",
"events_url": "https://api.github.com/users/hichemfantar/events{/privacy}",
"received_events_url": "https://api.github.com/users/hichemfantar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 4
| 2024-10-04T00:49:51
| 2025-01-27T02:57:05
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7097",
"html_url": "https://github.com/ollama/ollama/pull/7097",
"diff_url": "https://github.com/ollama/ollama/pull/7097.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7097.patch",
"merged_at": null
}
|
This pull request adds a new `toggleAutoStartup` function and a corresponding menu item. The function lets the user enable or disable auto startup of the application: when called, it updates the application's login item settings and displays a notification indicating whether auto startup is enabled or disabled. The menu item is added to the tray menu so the user can easily toggle the setting. Additionally, the tray menu is now rebuilt when clicked, to avoid displaying stale information.
closes #162



https://github.com/user-attachments/assets/83c43726-4348-4074-9fe8-d893236acf12
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7097/reactions",
"total_count": 13,
"+1": 10,
"-1": 0,
"laugh": 0,
"hooray": 3,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7097/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1734
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1734/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1734/comments
|
https://api.github.com/repos/ollama/ollama/issues/1734/events
|
https://github.com/ollama/ollama/issues/1734
| 2,058,552,441
|
I_kwDOJ0Z1Ps56swR5
| 1,734
|
Ollama - Llava Model Unable to detect image uploaded (WSL2 on Windows10)
|
{
"login": "m4ttgit",
"id": 27547776,
"node_id": "MDQ6VXNlcjI3NTQ3Nzc2",
"avatar_url": "https://avatars.githubusercontent.com/u/27547776?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/m4ttgit",
"html_url": "https://github.com/m4ttgit",
"followers_url": "https://api.github.com/users/m4ttgit/followers",
"following_url": "https://api.github.com/users/m4ttgit/following{/other_user}",
"gists_url": "https://api.github.com/users/m4ttgit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/m4ttgit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/m4ttgit/subscriptions",
"organizations_url": "https://api.github.com/users/m4ttgit/orgs",
"repos_url": "https://api.github.com/users/m4ttgit/repos",
"events_url": "https://api.github.com/users/m4ttgit/events{/privacy}",
"received_events_url": "https://api.github.com/users/m4ttgit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2023-12-28T15:15:52
| 2024-04-27T15:57:07
| 2023-12-29T14:00:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Trying to run the Llava model using WSL2 on Windows 10. Ollama version is 0.1.16.
Got this error message.

How do I fix this?
Thanks
|
{
"login": "m4ttgit",
"id": 27547776,
"node_id": "MDQ6VXNlcjI3NTQ3Nzc2",
"avatar_url": "https://avatars.githubusercontent.com/u/27547776?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/m4ttgit",
"html_url": "https://github.com/m4ttgit",
"followers_url": "https://api.github.com/users/m4ttgit/followers",
"following_url": "https://api.github.com/users/m4ttgit/following{/other_user}",
"gists_url": "https://api.github.com/users/m4ttgit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/m4ttgit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/m4ttgit/subscriptions",
"organizations_url": "https://api.github.com/users/m4ttgit/orgs",
"repos_url": "https://api.github.com/users/m4ttgit/repos",
"events_url": "https://api.github.com/users/m4ttgit/events{/privacy}",
"received_events_url": "https://api.github.com/users/m4ttgit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1734/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1734/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8525
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8525/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8525/comments
|
https://api.github.com/repos/ollama/ollama/issues/8525/events
|
https://github.com/ollama/ollama/issues/8525
| 2,803,116,322
|
I_kwDOJ0Z1Ps6nFCki
| 8,525
|
Ollama Linux Service vs. Ollama Serve (Changing Ports)
|
{
"login": "slyyyle",
"id": 78447050,
"node_id": "MDQ6VXNlcjc4NDQ3MDUw",
"avatar_url": "https://avatars.githubusercontent.com/u/78447050?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/slyyyle",
"html_url": "https://github.com/slyyyle",
"followers_url": "https://api.github.com/users/slyyyle/followers",
"following_url": "https://api.github.com/users/slyyyle/following{/other_user}",
"gists_url": "https://api.github.com/users/slyyyle/gists{/gist_id}",
"starred_url": "https://api.github.com/users/slyyyle/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/slyyyle/subscriptions",
"organizations_url": "https://api.github.com/users/slyyyle/orgs",
"repos_url": "https://api.github.com/users/slyyyle/repos",
"events_url": "https://api.github.com/users/slyyyle/events{/privacy}",
"received_events_url": "https://api.github.com/users/slyyyle/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2025-01-22T00:52:50
| 2025-01-24T09:27:25
| 2025-01-24T09:27:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Default Setup:
Following the installation guide, Ollama works without issues when hosted on the default port.
Changing Address to 0.0.0.0:
I was able to successfully change the address to 0.0.0.0, which works fine. However, when trying to change the port, I encountered issues.
Modifying the Service File:
When I modify the system service file with commands above the ############################## section, I can successfully set the address to 0.0.0.0. However, after adding a custom port, I end up in "debug mode." Even when I cancel this debug mode and re-run the default setup, `ollama -v` works for the default port but not for the custom one, with no apparent difference between the two other than the address listed. In a new terminal, `ollama -v` is not recognized and asks whether Ollama is running on port 3001, even though the other terminal clearly shows it is. Yet when I attempt to run OLLAMA_HOST=0:0:0:0:3001 from that terminal, it says the port is already in use. In neither case can I reach a "run" command: in one case I am not allowed further input, and in the other the command is not recognized.
Environment Variable Overrides:
When I try to override the port with OLLAMA_HOST=0.0.0.0:3001 ollama serve, the command gives identical debug output, but the address shown is [::]:3001. Opening another terminal and running ollama suggests the service is not running. However, when I attempt to start Ollama on the same port, it says the port is already in use.
Temporary File Confusion:
I then removed my additions using systemctl, since the explanation implies anything above the hashtag symbols will be added to the .service file. When I do so, it fails with a message about an empty temporary file.
Lack of Clear Documentation:
The documentation around configuring Ollama to run on a custom port is unclear. It seems like some people suggest not including the port in the OLLAMA_HOST variable, while others recommend adding it. I don't see a clear explanation of how to configure the service properly for a custom setup.
Confusion Over Service and Manual Commands:
I'm also confused about the relationship between the system service and the manual command ollama serve. It seems like the service is already running the process on the port I specified in the .service file, but when I try to call ollama manually, it either doesn't respond or conflicts with the service. There's advice suggesting it is unnecessary to modify the service directly, but this has led to further confusion, as I cannot get the ollama command to behave as expected.
Unclear Chronological Setup Process:
I would like a clearer understanding of the intended setup process. Specifically, how the service is supposed to be configured on Linux when using a custom port, and how it interacts with ollama serve and other commands.
Request for Help:
I would appreciate any clarification or guidance on the following:
How to properly configure the Ollama service to run on a custom port.
What the expected interaction is between system services (e.g., systemctl) and manual commands like ollama serve.
Whether modifying the .service file directly is appropriate, and if so, how to ensure my changes persist.
Any additional resources or documentation that can provide a clearer explanation of these steps.
I would be happy to contribute to the documentation once I better understand the process, as I believe clearer guidance is needed, especially for production-focused setups.
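For concreteness, here is the setup I am trying to achieve, sketched as a systemd drop-in (the port 3001 and the default `ollama.service` unit name are assumptions from my setup, not confirmed by the docs):

```shell
# Sketch: configure a custom port via a systemd drop-in instead of
# editing the packaged unit file directly. `systemctl edit` writes
# /etc/systemd/system/ollama.service.d/override.conf for you.
sudo systemctl edit ollama.service
# In the editor, add the following ABOVE the commented section:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:3001"
sudo systemctl daemon-reload
sudo systemctl restart ollama

# The ollama CLI also reads OLLAMA_HOST and defaults to 127.0.0.1:11434,
# so point it at the custom port explicitly:
OLLAMA_HOST=0.0.0.0:3001 ollama list
```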
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8525/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8525/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/67
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/67/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/67/comments
|
https://api.github.com/repos/ollama/ollama/issues/67/events
|
https://github.com/ollama/ollama/pull/67
| 1,799,442,678
|
PR_kwDOJ0Z1Ps5VOs74
| 67
|
app: write logs to ~/.ollama/logs
|
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-11T17:42:13
| 2023-07-11T18:45:53
| 2023-07-11T18:45:21
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/67",
"html_url": "https://github.com/ollama/ollama/pull/67",
"diff_url": "https://github.com/ollama/ollama/pull/67.diff",
"patch_url": "https://github.com/ollama/ollama/pull/67.patch",
"merged_at": "2023-07-11T18:45:21"
}
| null |
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/67/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/67/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3842
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3842/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3842/comments
|
https://api.github.com/repos/ollama/ollama/issues/3842/events
|
https://github.com/ollama/ollama/issues/3842
| 2,258,798,122
|
I_kwDOJ0Z1Ps6GooYq
| 3,842
|
mixtao-7bx2-moe-v8.1 cannot work
|
{
"login": "eramax",
"id": 542413,
"node_id": "MDQ6VXNlcjU0MjQxMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/542413?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eramax",
"html_url": "https://github.com/eramax",
"followers_url": "https://api.github.com/users/eramax/followers",
"following_url": "https://api.github.com/users/eramax/following{/other_user}",
"gists_url": "https://api.github.com/users/eramax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eramax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eramax/subscriptions",
"organizations_url": "https://api.github.com/users/eramax/orgs",
"repos_url": "https://api.github.com/users/eramax/repos",
"events_url": "https://api.github.com/users/eramax/events{/privacy}",
"received_events_url": "https://api.github.com/users/eramax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2024-04-23T12:49:43
| 2024-04-23T14:22:06
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I just changed my computer and installed Ollama, and I found that this model is not working.
https://ollama.com/eramax/mixtao-7bx2-moe-v8.1
```
llama_model_loader: loaded meta data with 25 key-value pairs and 419 tensors from C:\Users\eramax\.ollama\models\blobs\sha256-1e360f0f98ef2687a01f8775aee66c1e3ff0e576ae11ae2955314f2c26671007 (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = .
llama_model_loader: - kv 2: llama.context_length u32 = 32768
llama_model_loader: - kv 3: llama.embedding_length u32 = 4096
llama_model_loader: - kv 4: llama.block_count u32 = 32
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 6: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 7: llama.attention.head_count u32 = 32
llama_model_loader: - kv 8: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 9: llama.expert_count u32 = 2
llama_model_loader: - kv 10: llama.expert_used_count u32 = 2
llama_model_loader: - kv 11: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 12: llama.rope.freq_base f32 = 10000.000000
llama_model_loader: - kv 13: general.file_type u32 = 15
llama_model_loader: - kv 14: tokenizer.ggml.model str = llama
llama_model_loader: - kv 15: tokenizer.ggml.tokens arr[str,32000] = ["<unk>", "<s>", "</s>", "<0x00>", "<...
llama_model_loader: - kv 16: tokenizer.ggml.scores arr[f32,32000] = [0.000000, 0.000000, 0.000000, 0.0000...
llama_model_loader: - kv 17: tokenizer.ggml.token_type arr[i32,32000] = [2, 3, 3, 6, 6, 6, 6, 6, 6, 6, 6, 6, ...
llama_model_loader: - kv 18: tokenizer.ggml.bos_token_id u32 = 1
llama_model_loader: - kv 19: tokenizer.ggml.eos_token_id u32 = 2
llama_model_loader: - kv 20: tokenizer.ggml.unknown_token_id u32 = 0
llama_model_loader: - kv 21: tokenizer.ggml.padding_token_id u32 = 1
llama_model_loader: - kv 22: tokenizer.ggml.add_bos_token bool = true
llama_model_loader: - kv 23: tokenizer.ggml.add_eos_token bool = false
llama_model_loader: - kv 24: general.quantization_version u32 = 2
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type f16: 32 tensors
llama_model_loader: - type q4_K: 273 tensors
llama_model_loader: - type q6_K: 49 tensors
llm_load_vocab: special tokens definition check successful ( 259/32000 ).
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = SPM
llm_load_print_meta: n_vocab = 32000
llm_load_print_meta: n_merges = 0
llm_load_print_meta: n_ctx_train = 32768
llm_load_print_meta: n_embd = 4096
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 8
llm_load_print_meta: n_layer = 32
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_embd_head_k = 128
llm_load_print_meta: n_embd_head_v = 128
llm_load_print_meta: n_gqa = 4
llm_load_print_meta: n_embd_k_gqa = 1024
llm_load_print_meta: n_embd_v_gqa = 1024
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: n_expert = 2
llm_load_print_meta: n_expert_used = 2
llm_load_print_meta: causal attn = 1
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 0
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 10000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_yarn_orig_ctx = 32768
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: model type = 7B
llm_load_print_meta: model ftype = Q4_K - Medium
llm_load_print_meta: model params = 12.88 B
llm_load_print_meta: model size = 7.25 GiB (4.83 BPW)
llm_load_print_meta: general.name = .
llm_load_print_meta: BOS token = 1 '<s>'
llm_load_print_meta: EOS token = 2 '</s>'
llm_load_print_meta: UNK token = 0 '<unk>'
llm_load_print_meta: PAD token = 1 '<s>'
llm_load_print_meta: LF token = 13 '<0x0A>'
llm_load_tensors: ggml ctx size = 0.18 MiB
time=2024-04-23T14:44:22.196+02:00 level=ERROR source=routes.go:120 msg="error loading llama server" error="llama runner process no longer running: 3221225477 "
[GIN] 2024/04/23 - 14:44:22 | 500 | 1.1909775s | 127.0.0.1 | POST "/api/chat"
```
### OS
Windows
### GPU
_No response_
### CPU
Intel
### Ollama version
0.1.32
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3842/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3842/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2610
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2610/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2610/comments
|
https://api.github.com/repos/ollama/ollama/issues/2610/events
|
https://github.com/ollama/ollama/issues/2610
| 2,144,009,930
|
I_kwDOJ0Z1Ps5_yv7K
| 2,610
|
Return citations for given answers
|
{
"login": "SteffenBrinckmann",
"id": 39419674,
"node_id": "MDQ6VXNlcjM5NDE5Njc0",
"avatar_url": "https://avatars.githubusercontent.com/u/39419674?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SteffenBrinckmann",
"html_url": "https://github.com/SteffenBrinckmann",
"followers_url": "https://api.github.com/users/SteffenBrinckmann/followers",
"following_url": "https://api.github.com/users/SteffenBrinckmann/following{/other_user}",
"gists_url": "https://api.github.com/users/SteffenBrinckmann/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SteffenBrinckmann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SteffenBrinckmann/subscriptions",
"organizations_url": "https://api.github.com/users/SteffenBrinckmann/orgs",
"repos_url": "https://api.github.com/users/SteffenBrinckmann/repos",
"events_url": "https://api.github.com/users/SteffenBrinckmann/events{/privacy}",
"received_events_url": "https://api.github.com/users/SteffenBrinckmann/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-02-20T10:08:37
| 2024-02-27T07:53:19
| 2024-02-27T07:53:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hey,
would it be possible to return citations too, just like Perplexity does?
Best, Steffen
|
{
"login": "SteffenBrinckmann",
"id": 39419674,
"node_id": "MDQ6VXNlcjM5NDE5Njc0",
"avatar_url": "https://avatars.githubusercontent.com/u/39419674?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SteffenBrinckmann",
"html_url": "https://github.com/SteffenBrinckmann",
"followers_url": "https://api.github.com/users/SteffenBrinckmann/followers",
"following_url": "https://api.github.com/users/SteffenBrinckmann/following{/other_user}",
"gists_url": "https://api.github.com/users/SteffenBrinckmann/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SteffenBrinckmann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SteffenBrinckmann/subscriptions",
"organizations_url": "https://api.github.com/users/SteffenBrinckmann/orgs",
"repos_url": "https://api.github.com/users/SteffenBrinckmann/repos",
"events_url": "https://api.github.com/users/SteffenBrinckmann/events{/privacy}",
"received_events_url": "https://api.github.com/users/SteffenBrinckmann/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2610/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2610/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/394
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/394/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/394/comments
|
https://api.github.com/repos/ollama/ollama/issues/394/events
|
https://github.com/ollama/ollama/issues/394
| 1,860,928,542
|
I_kwDOJ0Z1Ps5u64Qe
| 394
|
Ollama on VMware Photon OS
|
{
"login": "dcasota",
"id": 14890243,
"node_id": "MDQ6VXNlcjE0ODkwMjQz",
"avatar_url": "https://avatars.githubusercontent.com/u/14890243?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dcasota",
"html_url": "https://github.com/dcasota",
"followers_url": "https://api.github.com/users/dcasota/followers",
"following_url": "https://api.github.com/users/dcasota/following{/other_user}",
"gists_url": "https://api.github.com/users/dcasota/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dcasota/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dcasota/subscriptions",
"organizations_url": "https://api.github.com/users/dcasota/orgs",
"repos_url": "https://api.github.com/users/dcasota/repos",
"events_url": "https://api.github.com/users/dcasota/events{/privacy}",
"received_events_url": "https://api.github.com/users/dcasota/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-08-22T08:37:21
| 2023-08-22T23:55:07
| 2023-08-22T23:55:07
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
I'm tinkering with Ollama on VMware Photon OS.
The langchain example works, but the langchain-document example does not.
This is ok
```
tdnf update -y
tdnf install -y git go build-essential
git clone https://github.com/jmorganca/ollama
cd ollama
go build .
tdnf install -y python3-pip
pip3 install -r examples/langchain/requirements.txt
./ollama serve &
./ollama pull llama2
python examples/langchain/main.py
```
Photon OS 5.0 comes with Python 3.11, which actually seems to be an issue for tensorflow-macos.
```
pip3 install -r examples/langchain-document/requirements.txt
[...]
Collecting tensorflow-hub==0.14.0
Downloading tensorflow_hub-0.14.0-py2.py3-none-any.whl (90 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 90.3/90.3 kB 9.2 MB/s eta 0:00:00
ERROR: Ignored the following versions that require a different python version: 1.21.2 Requires-Python >=3.7,<3.11; 1.21.3 Requires-Python >=3.7,<3.11; 1.21.4 Requires-Python >=3.7,<3.11; 1.21.5 Requires-Python >=3.7,<3.11; 1.21.6 Requires-Python >=3.7,<3.11
ERROR: Could not find a version that satisfies the requirement tensorflow-macos==2.13.0 (from versions: none)
ERROR: No matching distribution found for tensorflow-macos==2.13.0
```
The workaround using an older python version seems difficult as well.
```
# Built-in python3-pip (3.11) is not compatible with langchain-document's requirements.txt
tdnf remove -y python*
tdnf install -y zip unzip zlib-devel openssl-devel libffi-devel bazel sqlite sqlite-devel ncurses-devel gdbm-devel bzip2-devel
curl -J -L -O https://www.python.org/ftp/python/3.9.17/Python-3.9.17.tgz
tar -xzvf Python-3.9.17.tgz
cd ./Python-3.9.17/
./configure
make
make install
export PATH=/root/Python-3.9.17/:$PATH
./python setup.py install
curl -J -L -O https://bootstrap.pypa.io/get-pip.py
./python get-pip.py
pip install setuptools --force
cd ..
python -m venv .venv
source .venv/bin/activate
git clone https://github.com/jmorganca/ollama
cd ollama
go build .
pip install -r examples/langchain/requirements.txt
pip install unstructured
pip install pdf2image
pip install pdfminer
pip install pdfminer.six
pip install pyproject.toml
pip install pysqlite3
pip install gpt4all
pip install chromadb
pip install tensorflow
pip install -r examples/langchain-document/requirements.txt
```
Does langchain-document work on other distros? which python version?
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/394/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/394/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7799
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7799/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7799/comments
|
https://api.github.com/repos/ollama/ollama/issues/7799/events
|
https://github.com/ollama/ollama/issues/7799
| 2,683,643,510
|
I_kwDOJ0Z1Ps6f9SZ2
| 7,799
|
langchain_ollama tool_calls is None
|
{
"login": "UICJohn",
"id": 4167985,
"node_id": "MDQ6VXNlcjQxNjc5ODU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4167985?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/UICJohn",
"html_url": "https://github.com/UICJohn",
"followers_url": "https://api.github.com/users/UICJohn/followers",
"following_url": "https://api.github.com/users/UICJohn/following{/other_user}",
"gists_url": "https://api.github.com/users/UICJohn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/UICJohn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/UICJohn/subscriptions",
"organizations_url": "https://api.github.com/users/UICJohn/orgs",
"repos_url": "https://api.github.com/users/UICJohn/repos",
"events_url": "https://api.github.com/users/UICJohn/events{/privacy}",
"received_events_url": "https://api.github.com/users/UICJohn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-11-22T15:41:07
| 2024-11-23T13:52:11
| 2024-11-23T01:17:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
` File "/workspaces/vivichains-base/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 732, in _agenerate
final_chunk = await self._achat_stream_with_aggregation(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/vivichains-base/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 602, in _achat_stream_with_aggregation
tool_calls=_get_tool_calls_from_response(stream_resp),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/workspaces/vivichains-base/.venv/lib/python3.11/site-packages/langchain_ollama/chat_models.py", line 71, in _get_tool_calls_from_response
for tc in response["message"]["tool_calls"]:
TypeError: 'NoneType' object is not iterable`
When requesting a message from langchain-ollama to the Ollama server, the message in the response is: Message(role='assistant', content='It', images=None, tool_calls=None), which makes `if "tool_calls" in response["message"]` always true even though the value is None.
langchain-ollama version: 0.2.0
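A minimal sketch of the defensive check that avoids the crash (illustrative only — `get_tool_calls` is a hypothetical stand-in; the real helper is langchain-ollama's internal `_get_tool_calls_from_response`, where the fix would belong):

```python
def get_tool_calls(response: dict) -> list:
    """Return tool calls from an Ollama chat response, tolerating both a
    missing "tool_calls" key and an explicit None value (the case here)."""
    # `in` only checks key presence, so {"tool_calls": None} passes the
    # membership test but then fails to iterate; `or []` covers both cases.
    return response.get("message", {}).get("tool_calls") or []

# The reported response shape: key present, value None.
response = {"message": {"role": "assistant", "content": "It", "tool_calls": None}}
for tc in get_tool_calls(response):  # iterates zero times instead of raising
    print(tc)
```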
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.4.3
|
{
"login": "UICJohn",
"id": 4167985,
"node_id": "MDQ6VXNlcjQxNjc5ODU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4167985?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/UICJohn",
"html_url": "https://github.com/UICJohn",
"followers_url": "https://api.github.com/users/UICJohn/followers",
"following_url": "https://api.github.com/users/UICJohn/following{/other_user}",
"gists_url": "https://api.github.com/users/UICJohn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/UICJohn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/UICJohn/subscriptions",
"organizations_url": "https://api.github.com/users/UICJohn/orgs",
"repos_url": "https://api.github.com/users/UICJohn/repos",
"events_url": "https://api.github.com/users/UICJohn/events{/privacy}",
"received_events_url": "https://api.github.com/users/UICJohn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7799/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7799/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/743
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/743/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/743/comments
|
https://api.github.com/repos/ollama/ollama/issues/743/events
|
https://github.com/ollama/ollama/pull/743
| 1,933,620,549
|
PR_kwDOJ0Z1Ps5cShZI
| 743
|
handle upstream proxies
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-09T18:51:18
| 2023-10-10T16:59:07
| 2023-10-10T16:59:06
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/743",
"html_url": "https://github.com/ollama/ollama/pull/743",
"diff_url": "https://github.com/ollama/ollama/pull/743.diff",
"patch_url": "https://github.com/ollama/ollama/pull/743.patch",
"merged_at": "2023-10-10T16:59:06"
}
|
`http.ProxyFromEnvironment` returns the appropriate `*_PROXY` for the request. e.g. `HTTP_PROXY` for `http://` requests, `HTTPS_PROXY` for `https://` requests.
Resolves #729
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/743/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/743/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1968
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1968/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1968/comments
|
https://api.github.com/repos/ollama/ollama/issues/1968/events
|
https://github.com/ollama/ollama/pull/1968
| 2,079,757,821
|
PR_kwDOJ0Z1Ps5j-YbF
| 1,968
|
fix: request retry with error
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-12T21:34:32
| 2024-01-16T18:33:51
| 2024-01-16T18:33:50
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1968",
"html_url": "https://github.com/ollama/ollama/pull/1968",
"diff_url": "https://github.com/ollama/ollama/pull/1968.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1968.patch",
"merged_at": "2024-01-16T18:33:50"
}
|
This fixes a subtle bug with makeRequestWithRetry where an HTTP status error on a retried request will potentially not return the right error.
When a request is retried on Unauthorized, the second request does not go through the same error handling as the first. For example, if the second request returns status 404, it returns the response and a nil error, while if the first request returns the same status, it returns a nil response and os.ErrNotExist.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1968/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1968/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8679
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8679/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8679/comments
|
https://api.github.com/repos/ollama/ollama/issues/8679/events
|
https://github.com/ollama/ollama/issues/8679
| 2,819,621,617
|
I_kwDOJ0Z1Ps6oEALx
| 8,679
|
AMD RX 6750 GPU not recognized by Ollama on Arch Linux despite HSA_OVERRIDE_GFX_VERSION
|
{
"login": "Guedxx",
"id": 148347673,
"node_id": "U_kgDOCNebGQ",
"avatar_url": "https://avatars.githubusercontent.com/u/148347673?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Guedxx",
"html_url": "https://github.com/Guedxx",
"followers_url": "https://api.github.com/users/Guedxx/followers",
"following_url": "https://api.github.com/users/Guedxx/following{/other_user}",
"gists_url": "https://api.github.com/users/Guedxx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Guedxx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Guedxx/subscriptions",
"organizations_url": "https://api.github.com/users/Guedxx/orgs",
"repos_url": "https://api.github.com/users/Guedxx/repos",
"events_url": "https://api.github.com/users/Guedxx/events{/privacy}",
"received_events_url": "https://api.github.com/users/Guedxx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-30T00:21:07
| 2025-01-30T00:31:06
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm running Arch Linux with an AMD RX 6750 GPU. Ollama fails to recognize my GPU as compatible, even after setting the Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0" environment variable. I've tried several steps to resolve the issue, but nothing has worked so far.
time=2025-01-29T21:15:25.499-03:00 level=INFO source=gpu.go:226 msg="looking for compatible GPUs"
time=2025-01-29T21:15:25.520-03:00 level=WARN source=amd_linux.go:61 msg="ollama recommends running the https://www.amd.com/en/support/linux-drivers" error="amdgpu version file missing: /sys/module/amdgpu/version stat /sys/module/amdgpu/version: no such file or directory"
time=2025-01-29T21:15:25.522-03:00 level=WARN source=amd_linux.go:378 msg="amdgpu is not supported (supported types:[gfx1010 gfx1012 gfx1030 gfx1100 gfx1101 gfx1102 gfx900 gfx906 gfx908 gfx90a gfx940 gfx941 gfx942])" gpu_type=gfx1031 gpu=0 library=/opt/rocm/lib
time=2025-01-29T21:15:25.522-03:00 level=WARN source=amd_linux.go:385 msg="See https://github.com/ollama/ollama/blob/main/docs/gpu.md#overrides for HSA_OVERRIDE_GFX_VERSION usage"
time=2025-01-29T21:15:25.522-03:00 level=INFO source=amd_linux.go:404 msg="no compatible amdgpu devices detected"
time=2025-01-29T21:15:25.522-03:00 level=INFO source=gpu.go:392 msg="no compatible GPUs were discovered"
### OS
Arch Linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.5.7
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8679/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8679/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4286
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4286/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4286/comments
|
https://api.github.com/repos/ollama/ollama/issues/4286/events
|
https://github.com/ollama/ollama/issues/4286
| 2,287,791,130
|
I_kwDOJ0Z1Ps6IXOwa
| 4,286
|
can't copy command correctly on ollama.com
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-05-09T14:16:19
| 2024-05-09T16:34:06
| 2024-05-09T16:18:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
As shown in the picture, the full command is not copied.
1. I click webpage link and enter this page with default tag
2. press copy button and paste to terminal
3. only part of command is copied
But if I choose another tag and copy the command, it works. Then, when I go back to the default tag shown on the webpage and press the copy button again, the full command is copied.
So strange.

### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
up to date
|
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4286/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4286/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1407
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1407/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1407/comments
|
https://api.github.com/repos/ollama/ollama/issues/1407/events
|
https://github.com/ollama/ollama/issues/1407
| 2,029,423,183
|
I_kwDOJ0Z1Ps549opP
| 1,407
|
When using chat, no error when param names are wrong
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 0
| 2023-12-06T21:34:49
| 2024-02-20T01:23:23
| 2024-02-20T01:23:23
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
This works and gives the output I expect:
```
POST http://localhost:11434/api/chat
Content-Type: application/json
{
"model": "llama2",
"messages": [
{
"role": "user",
"content": "Why is the sky blue"
}
]
}
```
But this:
```
POST http://localhost:11434/api/chat
Content-Type: application/json
{
"model": "llama2",
"messages": [
{
"role": "user",
"context": "Why is the sky blue"
}
]
}
```
gives me
```
HTTP/1.1 200 OK
Content-Type: application/x-ndjson
Date: Wed, 06 Dec 2023 21:33:44 GMT
Connection: close
Transfer-Encoding: chunked
{
"model": "registry.ollama.ai/library/llama2:latest",
"created_at": "2023-12-06T21:33:44.440469Z",
"done": true,
"total_duration": 4962667,
"prompt_eval_count": 25,
"prompt_eval_duration": 290212000,
"eval_count": 457,
"eval_duration": 107065386000
}
```
So it sort of works, but not really. The difference is that I put the content in a field called context instead of content.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1407/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1407/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4540
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4540/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4540/comments
|
https://api.github.com/repos/ollama/ollama/issues/4540/events
|
https://github.com/ollama/ollama/issues/4540
| 2,306,182,760
|
I_kwDOJ0Z1Ps6JdY5o
| 4,540
|
"ollama is not running" issue after changing the host ip
|
{
"login": "hknatm",
"id": 132488695,
"node_id": "U_kgDOB-Wd9w",
"avatar_url": "https://avatars.githubusercontent.com/u/132488695?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hknatm",
"html_url": "https://github.com/hknatm",
"followers_url": "https://api.github.com/users/hknatm/followers",
"following_url": "https://api.github.com/users/hknatm/following{/other_user}",
"gists_url": "https://api.github.com/users/hknatm/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hknatm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hknatm/subscriptions",
"organizations_url": "https://api.github.com/users/hknatm/orgs",
"repos_url": "https://api.github.com/users/hknatm/repos",
"events_url": "https://api.github.com/users/hknatm/events{/privacy}",
"received_events_url": "https://api.github.com/users/hknatm/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-05-20T14:57:18
| 2025-01-06T14:59:07
| 2024-07-03T23:05:33
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When it first starts, everything is normal. After I change the host address to 192.168.1.10:11434 in /etc/systemd/system/ollama.service by adding the Environment line, running "ollama pull [model]" throws an error: "Error: could not connect to ollama app, is it running?" But when I curl the new IP, it shows it is running, and I can use it from my OpenWebUI with the new IP. If I erase that new host IP from the config file and reboot the server, I can pull new models again (but the host IP changes back to localhost:11434). What is the issue, and how do I resolve it?
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.38
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4540/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4540/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1760
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1760/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1760/comments
|
https://api.github.com/repos/ollama/ollama/issues/1760/events
|
https://github.com/ollama/ollama/issues/1760
| 2,062,548,464
|
I_kwDOJ0Z1Ps567_3w
| 1,760
|
[WSL1] Ollama is outright ignoring keyboard input
|
{
"login": "TheSystemGuy1337",
"id": 61162037,
"node_id": "MDQ6VXNlcjYxMTYyMDM3",
"avatar_url": "https://avatars.githubusercontent.com/u/61162037?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TheSystemGuy1337",
"html_url": "https://github.com/TheSystemGuy1337",
"followers_url": "https://api.github.com/users/TheSystemGuy1337/followers",
"following_url": "https://api.github.com/users/TheSystemGuy1337/following{/other_user}",
"gists_url": "https://api.github.com/users/TheSystemGuy1337/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TheSystemGuy1337/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TheSystemGuy1337/subscriptions",
"organizations_url": "https://api.github.com/users/TheSystemGuy1337/orgs",
"repos_url": "https://api.github.com/users/TheSystemGuy1337/repos",
"events_url": "https://api.github.com/users/TheSystemGuy1337/events{/privacy}",
"received_events_url": "https://api.github.com/users/TheSystemGuy1337/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 9
| 2024-01-02T15:05:48
| 2024-01-04T00:07:14
| 2024-01-02T18:35:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It just had to happen. After running ollama, any attempt to type out a message fails, with the program acting like you have not pressed a single key on the keyboard. I am using a Unicomp New Model M, which is an industry-standard ANSI/ASCII QWERTY 108-key keyboard, and this "program" just doesn't want to touch its output even with a 10-foot pole. I then tried a quick and dirty hack involving VcXsrv and XFCE4, and that still didn't fix the issue. I am about ready to just ditch WSL1 and use a VM like VirtualBox. I am using Ubuntu 22.04.3 LTS.
I did get these warnings, probably because Microsoft was too lazy to add proper PCI bus emulation:
```
pcilib: Cannot open /proc/bus/pci
lspci: Cannot find any working access method.
```
PS: I cannot use WSL2 as my system appears to have bizarre chipset and BIOS quirks that completely break WSL2 (i.e. WSL2 will complain that Virtualization and the Virtual Machine Platform are not enabled, despite the fact that they are)
|
{
"login": "TheSystemGuy1337",
"id": 61162037,
"node_id": "MDQ6VXNlcjYxMTYyMDM3",
"avatar_url": "https://avatars.githubusercontent.com/u/61162037?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TheSystemGuy1337",
"html_url": "https://github.com/TheSystemGuy1337",
"followers_url": "https://api.github.com/users/TheSystemGuy1337/followers",
"following_url": "https://api.github.com/users/TheSystemGuy1337/following{/other_user}",
"gists_url": "https://api.github.com/users/TheSystemGuy1337/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TheSystemGuy1337/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TheSystemGuy1337/subscriptions",
"organizations_url": "https://api.github.com/users/TheSystemGuy1337/orgs",
"repos_url": "https://api.github.com/users/TheSystemGuy1337/repos",
"events_url": "https://api.github.com/users/TheSystemGuy1337/events{/privacy}",
"received_events_url": "https://api.github.com/users/TheSystemGuy1337/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1760/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1760/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7570
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7570/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7570/comments
|
https://api.github.com/repos/ollama/ollama/issues/7570/events
|
https://github.com/ollama/ollama/issues/7570
| 2,643,220,474
|
I_kwDOJ0Z1Ps6djFf6
| 7,570
|
How to install Ollama in a distributed manner
|
{
"login": "smileyboy2019",
"id": 59221294,
"node_id": "MDQ6VXNlcjU5MjIxMjk0",
"avatar_url": "https://avatars.githubusercontent.com/u/59221294?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/smileyboy2019",
"html_url": "https://github.com/smileyboy2019",
"followers_url": "https://api.github.com/users/smileyboy2019/followers",
"following_url": "https://api.github.com/users/smileyboy2019/following{/other_user}",
"gists_url": "https://api.github.com/users/smileyboy2019/gists{/gist_id}",
"starred_url": "https://api.github.com/users/smileyboy2019/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/smileyboy2019/subscriptions",
"organizations_url": "https://api.github.com/users/smileyboy2019/orgs",
"repos_url": "https://api.github.com/users/smileyboy2019/repos",
"events_url": "https://api.github.com/users/smileyboy2019/events{/privacy}",
"received_events_url": "https://api.github.com/users/smileyboy2019/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-08T08:05:43
| 2024-11-17T14:03:43
| 2024-11-17T14:03:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
How to connect two servers with 4090 graphics cards and provide unified services
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7570/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7570/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6333
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6333/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6333/comments
|
https://api.github.com/repos/ollama/ollama/issues/6333/events
|
https://github.com/ollama/ollama/issues/6333
| 2,462,676,268
|
I_kwDOJ0Z1Ps6SyXUs
| 6,333
|
"couldn't remove unused layers: invalid character '\x00' looking for beginning of value"
|
{
"login": "FellowTraveler",
"id": 339191,
"node_id": "MDQ6VXNlcjMzOTE5MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/339191?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/FellowTraveler",
"html_url": "https://github.com/FellowTraveler",
"followers_url": "https://api.github.com/users/FellowTraveler/followers",
"following_url": "https://api.github.com/users/FellowTraveler/following{/other_user}",
"gists_url": "https://api.github.com/users/FellowTraveler/gists{/gist_id}",
"starred_url": "https://api.github.com/users/FellowTraveler/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FellowTraveler/subscriptions",
"organizations_url": "https://api.github.com/users/FellowTraveler/orgs",
"repos_url": "https://api.github.com/users/FellowTraveler/repos",
"events_url": "https://api.github.com/users/FellowTraveler/events{/privacy}",
"received_events_url": "https://api.github.com/users/FellowTraveler/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-08-13T08:02:03
| 2024-08-18T00:02:12
| 2024-08-15T19:20:40
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
<img width="934" alt="image" src="https://github.com/user-attachments/assets/937cbcd5-ac5f-4a0a-ba7b-97dc6327efa9">
```
(base) ollama % ollama pull llama3.1:8b-instruct-q8_0
pulling manifest
pulling cc04e85e1f86... 100% ▕█████████████████████████████████████▏ 8.5 GB
pulling 11ce4ee3e170... 100% ▕█████████████████████████████████████▏ 1.7 KB
pulling 0ba8f0e314b4... 100% ▕█████████████████████████████████████▏ 12 KB
pulling 56bb8bd477a5... 100% ▕█████████████████████████████████████▏ 96 B
pulling 4a4a958ae550... 100% ▕█████████████████████████████████████▏ 485 B
verifying sha256 digest
writing manifest
removing any unused layers
success
(base) ollama % ollama pull llama3.1:70b-instruct-q5_K_M
pulling manifest
pulling f8f84c9d6421... 100% ▕█████████████████████████████████████▏ 49 GB
pulling 11ce4ee3e170... 100% ▕█████████████████████████████████████▏ 1.7 KB
pulling 0ba8f0e314b4... 100% ▕█████████████████████████████████████▏ 12 KB
pulling 56bb8bd477a5... 100% ▕█████████████████████████████████████▏ 96 B
pulling a9528db392c2... 100% ▕█████████████████████████████████████▏ 488 B
verifying sha256 digest
writing manifest
removing any unused layers
success
(base) ollama % ollama pull llama3.1:70b-instruct-q4_K_M
pulling manifest
pulling de20d2cf2dc4... 100% ▕█████████████████████████████████████▏ 42 GB
pulling 11ce4ee3e170... 100% ▕█████████████████████████████████████▏ 1.7 KB
pulling 0ba8f0e314b4... 100% ▕█████████████████████████████████████▏ 12 KB
pulling 56bb8bd477a5... 100% ▕█████████████████████████████████████▏ 96 B
pulling 4bde18c49e43... 100% ▕█████████████████████████████████████▏ 488 B
verifying sha256 digest
writing manifest
removing any unused layers
couldn't remove unused layers: invalid character '\x00' looking for beginning of value
success
(base) ollama % ollama pull llama3.1:70b-instruct-q4_K_M
pulling manifest
pulling de20d2cf2dc4... 100% ▕█████████████████████████████████████▏ 42 GB
pulling 11ce4ee3e170... 100% ▕█████████████████████████████████████▏ 1.7 KB
pulling 0ba8f0e314b4... 100% ▕█████████████████████████████████████▏ 12 KB
pulling 56bb8bd477a5... 100% ▕█████████████████████████████████████▏ 96 B
pulling 4bde18c49e43... 100% ▕█████████████████████████████████████▏ 488 B
verifying sha256 digest
writing manifest
removing any unused layers
couldn't remove unused layers: invalid character '\x00' looking for beginning of value
success
(base) ollama % ollama pull llama3.1:70b-instruct-q5_K_M
pulling manifest
pulling f8f84c9d6421... 100% ▕█████████████████████████████████████▏ 49 GB
pulling 11ce4ee3e170... 100% ▕█████████████████████████████████████▏ 1.7 KB
pulling 0ba8f0e314b4... 100% ▕█████████████████████████████████████▏ 12 KB
pulling 56bb8bd477a5... 100% ▕█████████████████████████████████████▏ 96 B
pulling a9528db392c2... 100% ▕█████████████████████████████████████▏ 488 B
verifying sha256 digest
writing manifest
removing any unused layers
couldn't remove unused layers: invalid character '\x00' looking for beginning of value
success
(base) ollama % ollama pull llama3.1:8b-instruct-q8_0
pulling manifest
pulling cc04e85e1f86... 100% ▕█████████████████████████████████████▏ 8.5 GB
pulling 11ce4ee3e170... 100% ▕█████████████████████████████████████▏ 1.7 KB
pulling 0ba8f0e314b4... 100% ▕█████████████████████████████████████▏ 12 KB
pulling 56bb8bd477a5... 100% ▕█████████████████████████████████████▏ 96 B
pulling 4a4a958ae550... 100% ▕█████████████████████████████████████▏ 485 B
verifying sha256 digest
writing manifest
removing any unused layers
couldn't remove unused layers: invalid character '\x00' looking for beginning of value
success
(base) ollama % ollama pull llama3.1:8b-instruct-fp16
pulling manifest
pulling 09cd6813dc2e... 100% ▕█████████████████████████████████████▏ 16 GB
pulling 11ce4ee3e170... 100% ▕█████████████████████████████████████▏ 1.7 KB
pulling 0ba8f0e314b4... 100% ▕█████████████████████████████████████▏ 12 KB
pulling 56bb8bd477a5... 100% ▕█████████████████████████████████████▏ 96 B
pulling daa7d15f6d0b... 100% ▕█████████████████████████████████████▏ 484 B
verifying sha256 digest
writing manifest
removing any unused layers
couldn't remove unused layers: invalid character '\x00' looking for beginning of value
success
(base) ollama %
```
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.5-1-gb807ec8
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6333/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6333/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7323
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7323/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7323/comments
|
https://api.github.com/repos/ollama/ollama/issues/7323/events
|
https://github.com/ollama/ollama/issues/7323
| 2,606,116,550
|
I_kwDOJ0Z1Ps6bVi7G
| 7,323
|
ollama ps reporting "100% GPU" while model is running on CPU only.
|
{
"login": "Liu-Eroteme",
"id": 129079288,
"node_id": "U_kgDOB7GX-A",
"avatar_url": "https://avatars.githubusercontent.com/u/129079288?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Liu-Eroteme",
"html_url": "https://github.com/Liu-Eroteme",
"followers_url": "https://api.github.com/users/Liu-Eroteme/followers",
"following_url": "https://api.github.com/users/Liu-Eroteme/following{/other_user}",
"gists_url": "https://api.github.com/users/Liu-Eroteme/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Liu-Eroteme/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Liu-Eroteme/subscriptions",
"organizations_url": "https://api.github.com/users/Liu-Eroteme/orgs",
"repos_url": "https://api.github.com/users/Liu-Eroteme/repos",
"events_url": "https://api.github.com/users/Liu-Eroteme/events{/privacy}",
"received_events_url": "https://api.github.com/users/Liu-Eroteme/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2024-10-22T17:56:23
| 2025-01-27T09:39:07
| 2024-12-02T14:43:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Running llama 3.1 70b q3_K_M on 2x4090 when a ColBERT retriever is already loaded (taking up ~2800 MiB of VRAM) should work, but doesn't: `ollama ps` reports that the model is running and using the GPU:
`llama3.1:70b-instruct-q3_K_M 0e97a7709799 40 GB 100% GPU Less than a second from now`
but my GPUs are idle, the VRAM is empty, and my CPU is fully loaded.
What gives?
### OS
Linux, Docker
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.3.10
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7323/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7323/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1966
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1966/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1966/comments
|
https://api.github.com/repos/ollama/ollama/issues/1966/events
|
https://github.com/ollama/ollama/pull/1966
| 2,079,729,874
|
PR_kwDOJ0Z1Ps5j-SUy
| 1,966
|
improve cuda detection (rel. issue #1704)
|
{
"login": "fpreiss",
"id": 17441607,
"node_id": "MDQ6VXNlcjE3NDQxNjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/17441607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fpreiss",
"html_url": "https://github.com/fpreiss",
"followers_url": "https://api.github.com/users/fpreiss/followers",
"following_url": "https://api.github.com/users/fpreiss/following{/other_user}",
"gists_url": "https://api.github.com/users/fpreiss/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fpreiss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fpreiss/subscriptions",
"organizations_url": "https://api.github.com/users/fpreiss/orgs",
"repos_url": "https://api.github.com/users/fpreiss/repos",
"events_url": "https://api.github.com/users/fpreiss/events{/privacy}",
"received_events_url": "https://api.github.com/users/fpreiss/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-12T21:12:54
| 2024-01-15T02:00:11
| 2024-01-15T02:00:11
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1966",
"html_url": "https://github.com/ollama/ollama/pull/1966",
"diff_url": "https://github.com/ollama/ollama/pull/1966.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1966.patch",
"merged_at": "2024-01-15T02:00:11"
}
|
This pull request supersedes https://github.com/jmorganca/ollama/pull/1880
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1966/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1966/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1807
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1807/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1807/comments
|
https://api.github.com/repos/ollama/ollama/issues/1807/events
|
https://github.com/ollama/ollama/issues/1807
| 2,067,376,071
|
I_kwDOJ0Z1Ps57OafH
| 1,807
|
[ISSUES] I think it would be interesting to have different templates.
|
{
"login": "rgaidot",
"id": 5269,
"node_id": "MDQ6VXNlcjUyNjk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5269?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rgaidot",
"html_url": "https://github.com/rgaidot",
"followers_url": "https://api.github.com/users/rgaidot/followers",
"following_url": "https://api.github.com/users/rgaidot/following{/other_user}",
"gists_url": "https://api.github.com/users/rgaidot/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rgaidot/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rgaidot/subscriptions",
"organizations_url": "https://api.github.com/users/rgaidot/orgs",
"repos_url": "https://api.github.com/users/rgaidot/repos",
"events_url": "https://api.github.com/users/rgaidot/events{/privacy}",
"received_events_url": "https://api.github.com/users/rgaidot/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-01-05T13:43:41
| 2024-03-14T22:43:59
| 2024-03-14T22:43:59
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I think it would be interesting to have different templates (.github/**/*.md) for various purposes within your repo. Templates can significantly enhance efficiency and clarity in communication, especially when dealing with different aspects of your code/repo. Imagine having specific templates tailored for bug reports, allowing users to succinctly detail the issue they encountered, including steps to reproduce. This standardized format would streamline the debugging process, making it more organized and time-effective.
Similarly, having a dedicated template for reporting issues can help users express concerns or suggestions in a structured manner. Users could provide essential details, such as the nature of the problem, its impact, and any relevant markdown/screenshots, making it easier for the team to comprehend and address their concerns promptly.
Moreover, the inclusion of a feature request template could be a valuable addition. Users often have innovative ideas or specific functionalities they'd like to see implemented. A feature request template could guide users in articulating their suggestions comprehensively, specifying the intended benefits and potential use cases. This structured approach would empower your development team to better understand and evaluate the feasibility and significance of each proposed feature.
In conclusion, introducing different templates for bug reports, issue reports, and feature requests can enhance the overall user experience by promoting clear and concise communication. This, in turn, facilitates more efficient problem resolution, ensuring that your platform remains responsive to user needs and continually evolves with valuable user input.
What do you think?
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1807/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1807/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7856
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7856/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7856/comments
|
https://api.github.com/repos/ollama/ollama/issues/7856/events
|
https://github.com/ollama/ollama/issues/7856
| 2,697,641,318
|
I_kwDOJ0Z1Ps6gyr1m
| 7,856
|
Ddos of parsing markdown in frontend & images
|
{
"login": "remco-pc",
"id": 8077908,
"node_id": "MDQ6VXNlcjgwNzc5MDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8077908?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/remco-pc",
"html_url": "https://github.com/remco-pc",
"followers_url": "https://api.github.com/users/remco-pc/followers",
"following_url": "https://api.github.com/users/remco-pc/following{/other_user}",
"gists_url": "https://api.github.com/users/remco-pc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/remco-pc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/remco-pc/subscriptions",
"organizations_url": "https://api.github.com/users/remco-pc/orgs",
"repos_url": "https://api.github.com/users/remco-pc/repos",
"events_url": "https://api.github.com/users/remco-pc/events{/privacy}",
"received_events_url": "https://api.github.com/users/remco-pc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-11-27T08:39:12
| 2024-11-27T08:40:27
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When the frontend converts markdown strings to HTML elements, it re-renders on every new token, requesting the same image over and over and re-downloading all images each time a token arrives.
Markdown conversion should therefore be done on the backend, so that buggy or outdated frontend JavaScript cannot produce this DDoS-like traffic.
A rate limiter on the image server should also be applied: as it stands, this behavior makes it easy to amplify load on third-party servers (YouTube, for example).
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
0.4.1
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7856/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7856/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7956
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7956/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7956/comments
|
https://api.github.com/repos/ollama/ollama/issues/7956/events
|
https://github.com/ollama/ollama/issues/7956
| 2,721,367,886
|
I_kwDOJ0Z1Ps6iNMdO
| 7,956
|
Low GPU usage on second GPU
|
{
"login": "frenzybiscuit",
"id": 190028151,
"node_id": "U_kgDOC1OZdw",
"avatar_url": "https://avatars.githubusercontent.com/u/190028151?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/frenzybiscuit",
"html_url": "https://github.com/frenzybiscuit",
"followers_url": "https://api.github.com/users/frenzybiscuit/followers",
"following_url": "https://api.github.com/users/frenzybiscuit/following{/other_user}",
"gists_url": "https://api.github.com/users/frenzybiscuit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/frenzybiscuit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/frenzybiscuit/subscriptions",
"organizations_url": "https://api.github.com/users/frenzybiscuit/orgs",
"repos_url": "https://api.github.com/users/frenzybiscuit/repos",
"events_url": "https://api.github.com/users/frenzybiscuit/events{/privacy}",
"received_events_url": "https://api.github.com/users/frenzybiscuit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 25
| 2024-12-05T20:50:03
| 2024-12-14T22:30:44
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am on the 0.5.0 release (which links to 0.4.8-rc0) and using Qwen 2.5 32b Q5 with 32k context and flash attention with q8_0 KV cache.
I have a 3090 and 2080ti.
Ollama is putting 22GB on the 3090 and 5.3GB on the 2080ti.
When running a prompt the 3090 is at 80%-90% GPU usage while the 2080ti is only at 10%.
When using llama.cpp directly with row split, the VRAM on the 2080ti is mostly maxed out and the GPU usage on both GPUs is in the 50%-65% range.
----
My question: Why is the 3090 doing most of the work on Ollama?
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.4.8-rc0
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7956/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7956/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2755
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2755/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2755/comments
|
https://api.github.com/repos/ollama/ollama/issues/2755/events
|
https://github.com/ollama/ollama/issues/2755
| 2,153,025,711
|
I_kwDOJ0Z1Ps6AVJCv
| 2,755
|
New Model Request: BioMistral model?
|
{
"login": "unclecode",
"id": 12494079,
"node_id": "MDQ6VXNlcjEyNDk0MDc5",
"avatar_url": "https://avatars.githubusercontent.com/u/12494079?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/unclecode",
"html_url": "https://github.com/unclecode",
"followers_url": "https://api.github.com/users/unclecode/followers",
"following_url": "https://api.github.com/users/unclecode/following{/other_user}",
"gists_url": "https://api.github.com/users/unclecode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/unclecode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/unclecode/subscriptions",
"organizations_url": "https://api.github.com/users/unclecode/orgs",
"repos_url": "https://api.github.com/users/unclecode/repos",
"events_url": "https://api.github.com/users/unclecode/events{/privacy}",
"received_events_url": "https://api.github.com/users/unclecode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 3
| 2024-02-26T00:37:01
| 2024-05-15T21:05:21
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
I wonder if you have any plans to add BioMistral to the library?
Thanks
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2755/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2755/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3858
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3858/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3858/comments
|
https://api.github.com/repos/ollama/ollama/issues/3858/events
|
https://github.com/ollama/ollama/pull/3858
| 2,260,007,235
|
PR_kwDOJ0Z1Ps5tiSSm
| 3,858
|
types/model: restrict digest hash part to a minimum of 2 characters
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-24T00:36:11
| 2024-04-24T01:24:18
| 2024-04-24T01:24:17
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3858",
"html_url": "https://github.com/ollama/ollama/pull/3858",
"diff_url": "https://github.com/ollama/ollama/pull/3858.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3858.patch",
"merged_at": "2024-04-24T01:24:17"
}
|
This allows users of a valid Digest to rely on the hash part being at least 2 characters long, for use when sharding.
This is a reasonable restriction, as the hash part is a SHA-256 hash, which is 64 characters long and is the commonly used hash. There is no anticipated use for a hash shorter than 2 characters.
Also, add MustParseDigest.
Also, replace Digest.Type with Digest.Split, which returns both the type and hash parts together; this is the most common case when asking for either.
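The Split behavior described above can be sketched roughly as follows. This is a minimal illustration, not the actual `types/model` implementation; the `Digest` struct, its field, and the `Valid` helper here are assumptions for the example:

```go
package main

import (
	"fmt"
	"strings"
)

// Digest is a minimal stand-in for the package's digest type.
type Digest struct {
	s string // e.g. "sha256-abc123..."
}

// Split returns the type and hash parts of the digest together,
// which is the common case when asking for either.
func (d Digest) Split() (typ, hash string) {
	typ, hash, _ = strings.Cut(d.s, "-")
	return typ, hash
}

// Valid reports whether the digest has a non-empty type and a hash
// part of at least 2 characters, the minimum this change enforces.
func (d Digest) Valid() bool {
	typ, hash := d.Split()
	return typ != "" && len(hash) >= 2
}

func main() {
	d := Digest{s: "sha256-abc123"}
	typ, hash := d.Split()
	fmt.Println(typ, hash, d.Valid())          // sha256 abc123 true
	fmt.Println(Digest{s: "sha256-a"}.Valid()) // false: hash shorter than 2 chars
}
```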
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3858/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3858/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7495
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7495/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7495/comments
|
https://api.github.com/repos/ollama/ollama/issues/7495/events
|
https://github.com/ollama/ollama/issues/7495
| 2,633,538,249
|
I_kwDOJ0Z1Ps6c-JrJ
| 7,495
|
mac Errors when running
|
{
"login": "shan23chen",
"id": 44418759,
"node_id": "MDQ6VXNlcjQ0NDE4NzU5",
"avatar_url": "https://avatars.githubusercontent.com/u/44418759?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shan23chen",
"html_url": "https://github.com/shan23chen",
"followers_url": "https://api.github.com/users/shan23chen/followers",
"following_url": "https://api.github.com/users/shan23chen/following{/other_user}",
"gists_url": "https://api.github.com/users/shan23chen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shan23chen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shan23chen/subscriptions",
"organizations_url": "https://api.github.com/users/shan23chen/orgs",
"repos_url": "https://api.github.com/users/shan23chen/repos",
"events_url": "https://api.github.com/users/shan23chen/events{/privacy}",
"received_events_url": "https://api.github.com/users/shan23chen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-11-04T18:24:24
| 2025-01-13T00:52:19
| 2025-01-13T00:52:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
`ollama run gemma2:2b`
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/gemma2/manifests/2b": write tcp [2601:19b:0:b8a0:915f:c8c:3de4:9c5]:50022->[2606:4700:3034::ac43:b6e5]:443: write: socket is not connected
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
ollama version is 0.3.14
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7495/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7495/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1754
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1754/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1754/comments
|
https://api.github.com/repos/ollama/ollama/issues/1754/events
|
https://github.com/ollama/ollama/issues/1754
| 2,061,609,837
|
I_kwDOJ0Z1Ps564att
| 1,754
|
How to add custom LLM models from Huggingface
|
{
"login": "yiouyou",
"id": 14249712,
"node_id": "MDQ6VXNlcjE0MjQ5NzEy",
"avatar_url": "https://avatars.githubusercontent.com/u/14249712?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yiouyou",
"html_url": "https://github.com/yiouyou",
"followers_url": "https://api.github.com/users/yiouyou/followers",
"following_url": "https://api.github.com/users/yiouyou/following{/other_user}",
"gists_url": "https://api.github.com/users/yiouyou/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yiouyou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yiouyou/subscriptions",
"organizations_url": "https://api.github.com/users/yiouyou/orgs",
"repos_url": "https://api.github.com/users/yiouyou/repos",
"events_url": "https://api.github.com/users/yiouyou/events{/privacy}",
"received_events_url": "https://api.github.com/users/yiouyou/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-01-01T15:02:24
| 2025-01-28T03:49:18
| 2024-01-02T11:27:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have some fine-tuned models saved on Hugging Face. How can I add or convert a custom LLM into a format that Ollama can run?
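For context, Ollama can import a local GGUF file through a Modelfile; a minimal sketch (the file name `model.gguf` is a placeholder for the converted weights):

```
FROM ./model.gguf
```

The model can then be registered with `ollama create my-model -f Modelfile` and run with `ollama run my-model`.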
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1754/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1754/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4933
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4933/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4933/comments
|
https://api.github.com/repos/ollama/ollama/issues/4933/events
|
https://github.com/ollama/ollama/issues/4933
| 2,341,691,507
|
I_kwDOJ0Z1Ps6Lk2Bz
| 4,933
|
Error: Pull Model Manifest - Timeout
|
{
"login": "ulhaqi12",
"id": 44068298,
"node_id": "MDQ6VXNlcjQ0MDY4Mjk4",
"avatar_url": "https://avatars.githubusercontent.com/u/44068298?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ulhaqi12",
"html_url": "https://github.com/ulhaqi12",
"followers_url": "https://api.github.com/users/ulhaqi12/followers",
"following_url": "https://api.github.com/users/ulhaqi12/following{/other_user}",
"gists_url": "https://api.github.com/users/ulhaqi12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ulhaqi12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ulhaqi12/subscriptions",
"organizations_url": "https://api.github.com/users/ulhaqi12/orgs",
"repos_url": "https://api.github.com/users/ulhaqi12/repos",
"events_url": "https://api.github.com/users/ulhaqi12/events{/privacy}",
"received_events_url": "https://api.github.com/users/ulhaqi12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-06-08T14:54:03
| 2024-08-11T12:50:51
| 2024-06-18T11:19:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
I am using the latest Docker image of Ollama (0.1.40). Here are the contents of my docker-compose file:
```
ollama:
image: internal-mirror/ollama/ollama
container_name: ollama
ports:
- "11434:11434"
volumes:
- ollama:/root/.ollama
restart: unless-stopped
```
and I am trying to run llama3 using the following command:
```
$ sudo docker exec -it ollama ollama pull llama3
```
Getting the following error:
```
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama3/manifests/latest": dial tcp 104.21.75.227:443: i/o timeout
```
Can you guide me? Is this an issue with Ollama?
I have tried accessing the manifest (https://registry.ollama.ai/v2/library/llama3/manifests/latest) in a web browser. Here is what it shows me:

-Ikram
### OS
Linux
### GPU
_No response_
### CPU
Intel
### Ollama version
0.1.40
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4933/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4933/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6176
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6176/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6176/comments
|
https://api.github.com/repos/ollama/ollama/issues/6176/events
|
https://github.com/ollama/ollama/issues/6176
| 2,448,260,013
|
I_kwDOJ0Z1Ps6R7Xut
| 6,176
|
System Prompts can not work on the first round.
|
{
"login": "DirtyKnightForVi",
"id": 116725810,
"node_id": "U_kgDOBvUYMg",
"avatar_url": "https://avatars.githubusercontent.com/u/116725810?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DirtyKnightForVi",
"html_url": "https://github.com/DirtyKnightForVi",
"followers_url": "https://api.github.com/users/DirtyKnightForVi/followers",
"following_url": "https://api.github.com/users/DirtyKnightForVi/following{/other_user}",
"gists_url": "https://api.github.com/users/DirtyKnightForVi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DirtyKnightForVi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DirtyKnightForVi/subscriptions",
"organizations_url": "https://api.github.com/users/DirtyKnightForVi/orgs",
"repos_url": "https://api.github.com/users/DirtyKnightForVi/repos",
"events_url": "https://api.github.com/users/DirtyKnightForVi/events{/privacy}",
"received_events_url": "https://api.github.com/users/DirtyKnightForVi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 26
| 2024-08-05T10:59:26
| 2024-12-02T20:09:52
| 2024-12-02T20:09:51
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
# What is the issue?
## Description
**Bug Summary:**
System Prompts can not work on the first round.
**Actual Behavior:**
A specific task scenario may rely on special system prompts. However, in the current version (at least since 3.10), an extra round of conversation is needed before these system prompts take effect.
For example, suppose the task is SQL generation.
In earlier versions, you could set the system prompt under Settings → General and then start entering questions to generate SQL.
In the current version, however, after entering a question, an additional round of Q&A with the LLM is required before it produces the desired SQL.
## model
WizardLM2-8x22b
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.3
https://github.com/open-webui/open-webui/discussions/4381#discussion-7014017
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6176/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6176/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2322
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2322/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2322/comments
|
https://api.github.com/repos/ollama/ollama/issues/2322/events
|
https://github.com/ollama/ollama/issues/2322
| 2,114,448,109
|
I_kwDOJ0Z1Ps5-B-rt
| 2,322
|
Run Ollama models stored on external disk
|
{
"login": "B-Gendron",
"id": 95307996,
"node_id": "U_kgDOBa5I3A",
"avatar_url": "https://avatars.githubusercontent.com/u/95307996?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/B-Gendron",
"html_url": "https://github.com/B-Gendron",
"followers_url": "https://api.github.com/users/B-Gendron/followers",
"following_url": "https://api.github.com/users/B-Gendron/following{/other_user}",
"gists_url": "https://api.github.com/users/B-Gendron/gists{/gist_id}",
"starred_url": "https://api.github.com/users/B-Gendron/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/B-Gendron/subscriptions",
"organizations_url": "https://api.github.com/users/B-Gendron/orgs",
"repos_url": "https://api.github.com/users/B-Gendron/repos",
"events_url": "https://api.github.com/users/B-Gendron/events{/privacy}",
"received_events_url": "https://api.github.com/users/B-Gendron/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2024-02-02T09:07:51
| 2024-10-10T18:26:13
| 2024-02-05T19:22:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Even after going through the whole documentation, I am still a bit confused about how models are saved when running `ollama pull` and how I can use them afterwards. For instance, since I don't have much storage on my computer, I would like to pull several models and then save the whole `/.ollama/models/blobs/` directory on an external disk.
Is it then possible to fetch the desired model from my external storage and run it locally on my computer? More precisely, when the documentation for the `pull` command says `Pull a model from a registry`, is there a way to specify that registry, and can it be a local location such as an external hard disk?
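For what it's worth, a minimal configuration sketch of the setup in question, assuming the server honours the `OLLAMA_MODELS` environment variable for the model store location (the mount point `/mnt/external` is a placeholder):

```
# point the model store at the external disk before starting the server
export OLLAMA_MODELS=/mnt/external/ollama/models
ollama serve
```

Pulled blobs would then land on the external disk, and the same variable would let the server read them back later.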
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2322/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2322/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5095
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5095/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5095/comments
|
https://api.github.com/repos/ollama/ollama/issues/5095/events
|
https://github.com/ollama/ollama/issues/5095
| 2,356,871,776
|
I_kwDOJ0Z1Ps6MewJg
| 5,095
|
add support Alibaba-NLP/gte-Qwen2-7B-instruct
|
{
"login": "louyongjiu",
"id": 16408477,
"node_id": "MDQ6VXNlcjE2NDA4NDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/16408477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/louyongjiu",
"html_url": "https://github.com/louyongjiu",
"followers_url": "https://api.github.com/users/louyongjiu/followers",
"following_url": "https://api.github.com/users/louyongjiu/following{/other_user}",
"gists_url": "https://api.github.com/users/louyongjiu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/louyongjiu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/louyongjiu/subscriptions",
"organizations_url": "https://api.github.com/users/louyongjiu/orgs",
"repos_url": "https://api.github.com/users/louyongjiu/repos",
"events_url": "https://api.github.com/users/louyongjiu/events{/privacy}",
"received_events_url": "https://api.github.com/users/louyongjiu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-06-17T09:35:32
| 2024-07-09T19:16:35
| 2024-06-27T09:17:08
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct

|
{
"login": "louyongjiu",
"id": 16408477,
"node_id": "MDQ6VXNlcjE2NDA4NDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/16408477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/louyongjiu",
"html_url": "https://github.com/louyongjiu",
"followers_url": "https://api.github.com/users/louyongjiu/followers",
"following_url": "https://api.github.com/users/louyongjiu/following{/other_user}",
"gists_url": "https://api.github.com/users/louyongjiu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/louyongjiu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/louyongjiu/subscriptions",
"organizations_url": "https://api.github.com/users/louyongjiu/orgs",
"repos_url": "https://api.github.com/users/louyongjiu/repos",
"events_url": "https://api.github.com/users/louyongjiu/events{/privacy}",
"received_events_url": "https://api.github.com/users/louyongjiu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5095/reactions",
"total_count": 11,
"+1": 11,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5095/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7279
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7279/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7279/comments
|
https://api.github.com/repos/ollama/ollama/issues/7279/events
|
https://github.com/ollama/ollama/issues/7279
| 2,600,692,910
|
I_kwDOJ0Z1Ps6bA2yu
| 7,279
|
Ollama Docker image 0.4.0-rc3-rocm crashes due to missing shared library
|
{
"login": "ic4-y",
"id": 61844926,
"node_id": "MDQ6VXNlcjYxODQ0OTI2",
"avatar_url": "https://avatars.githubusercontent.com/u/61844926?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ic4-y",
"html_url": "https://github.com/ic4-y",
"followers_url": "https://api.github.com/users/ic4-y/followers",
"following_url": "https://api.github.com/users/ic4-y/following{/other_user}",
"gists_url": "https://api.github.com/users/ic4-y/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ic4-y/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ic4-y/subscriptions",
"organizations_url": "https://api.github.com/users/ic4-y/orgs",
"repos_url": "https://api.github.com/users/ic4-y/repos",
"events_url": "https://api.github.com/users/ic4-y/events{/privacy}",
"received_events_url": "https://api.github.com/users/ic4-y/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677677816,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgVG-A",
"url": "https://api.github.com/repos/ollama/ollama/labels/docker",
"name": "docker",
"color": "0052CC",
"default": false,
"description": "Issues relating to using ollama in containers"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-10-20T17:14:21
| 2024-10-22T19:54:16
| 2024-10-22T19:54:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I just tried out the latest 0.4.0-rc3-rocm docker image and the `ollama_llama_server` crashes with
```
ollama-rocm | /usr/lib/ollama/runners/rocm/ollama_llama_server: error while loading shared libraries: libelf.so.1: cannot open shared object file: No such file or directory
```
I am running this on a Radeon Pro W6800, the latest stable release `0.3.13-rocm` works just fine.
Here is a slightly bigger section of the debug log in case that is helpful.
```
ollama-rocm | time=2024-10-20T17:07:58.034Z level=INFO source=llama-server.go:355 msg="starting llama server" cmd="/usr/lib/ollama/runners/rocm/ollama_llama_server --model /root/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa --ctx-size 8192 --batch-size 512 --embedding --n-gpu-layers 33 --verbose --threads 12 --parallel 4 --port 46723"
ollama-rocm | time=2024-10-20T17:07:58.034Z level=DEBUG source=llama-server.go:372 msg=subprocess environment="[PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin HSA_OVERRIDE_GFX_VERSION=10.3.0 ROCR_VISIBLE_DEVICES=0 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/runners/rocm HIP_VISIBLE_DEVICES=0]"
ollama-rocm | time=2024-10-20T17:07:58.037Z level=INFO source=sched.go:450 msg="loaded runners" count=1
ollama-rocm | time=2024-10-20T17:07:58.037Z level=INFO source=llama-server.go:534 msg="waiting for llama runner to start responding"
ollama-rocm | time=2024-10-20T17:07:58.037Z level=INFO source=llama-server.go:568 msg="waiting for server to become available" status="llm server error"
ollama-rocm | /usr/lib/ollama/runners/rocm/ollama_llama_server: error while loading shared libraries: libelf.so.1: cannot open shared object file: No such file or directory
ollama-rocm | time=2024-10-20T17:07:58.288Z level=ERROR source=sched.go:456 msg="error loading llama server" error="llama runner process has terminated: exit status 127"
ollama-rocm | time=2024-10-20T17:07:58.288Z level=DEBUG source=sched.go:459 msg="triggering expiration for failed load" model=/root/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa
ollama-rocm | time=2024-10-20T17:07:58.288Z level=DEBUG source=sched.go:361 msg="runner expired event received" modelPath=/root/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa
ollama-rocm | time=2024-10-20T17:07:58.288Z level=DEBUG source=sched.go:376 msg="got lock to unload" modelPath=/root/.ollama/models/blobs/sha256-6a0746a1ec1aef3e7ec53868f220ff6e389f6f8ef87a01d77c96807de94ca2aa
ollama-rocm | [GIN] 2024/10/20 - 17:07:58 | 500 | 311.617978ms | 127.0.0.1 | POST "/api/generate"
```
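As a stopgap, assuming the published image is Debian/Ubuntu-based (an assumption on my part), the missing library could be layered on top of the release-candidate image; a sketch, using the tag from this report:

```
# hypothetical workaround image: add the missing libelf shared library
FROM ollama/ollama:0.4.0-rc3-rocm
RUN apt-get update && apt-get install -y --no-install-recommends libelf1 \
    && rm -rf /var/lib/apt/lists/*
```

The proper fix is presumably to ship `libelf.so.1` in the official image.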
### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.4.0-rc3-rocm (Docker image)
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7279/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7279/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2621
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2621/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2621/comments
|
https://api.github.com/repos/ollama/ollama/issues/2621/events
|
https://github.com/ollama/ollama/issues/2621
| 2,145,600,139
|
I_kwDOJ0Z1Ps5_40KL
| 2,621
|
Request to allow installation to a different location
|
{
"login": "QJAG1024",
"id": 123146382,
"node_id": "U_kgDOB1cQjg",
"avatar_url": "https://avatars.githubusercontent.com/u/123146382?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/QJAG1024",
"html_url": "https://github.com/QJAG1024",
"followers_url": "https://api.github.com/users/QJAG1024/followers",
"following_url": "https://api.github.com/users/QJAG1024/following{/other_user}",
"gists_url": "https://api.github.com/users/QJAG1024/gists{/gist_id}",
"starred_url": "https://api.github.com/users/QJAG1024/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/QJAG1024/subscriptions",
"organizations_url": "https://api.github.com/users/QJAG1024/orgs",
"repos_url": "https://api.github.com/users/QJAG1024/repos",
"events_url": "https://api.github.com/users/QJAG1024/events{/privacy}",
"received_events_url": "https://api.github.com/users/QJAG1024/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-02-21T01:35:37
| 2024-03-02T04:21:12
| 2024-03-02T04:21:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I noticed that Ollama can only be installed on **volume C**.
Although there is enough space for me to install models, I prefer to install programs on volume D.
Also, some people don't even have enough space to install models on volume C.
So I think users should have the option to install Ollama to a different location.
* I'm using a translator, so this issue may be hard to read. Sorry about that.
|
{
"login": "QJAG1024",
"id": 123146382,
"node_id": "U_kgDOB1cQjg",
"avatar_url": "https://avatars.githubusercontent.com/u/123146382?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/QJAG1024",
"html_url": "https://github.com/QJAG1024",
"followers_url": "https://api.github.com/users/QJAG1024/followers",
"following_url": "https://api.github.com/users/QJAG1024/following{/other_user}",
"gists_url": "https://api.github.com/users/QJAG1024/gists{/gist_id}",
"starred_url": "https://api.github.com/users/QJAG1024/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/QJAG1024/subscriptions",
"organizations_url": "https://api.github.com/users/QJAG1024/orgs",
"repos_url": "https://api.github.com/users/QJAG1024/repos",
"events_url": "https://api.github.com/users/QJAG1024/events{/privacy}",
"received_events_url": "https://api.github.com/users/QJAG1024/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2621/reactions",
"total_count": 6,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2621/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8397
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8397/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8397/comments
|
https://api.github.com/repos/ollama/ollama/issues/8397/events
|
https://github.com/ollama/ollama/issues/8397
| 2,782,680,124
|
I_kwDOJ0Z1Ps6l3FQ8
| 8,397
|
[UNK_BYTE_…] Output with gemma-2b-it in Ollama
|
{
"login": "TsurHerman",
"id": 3405405,
"node_id": "MDQ6VXNlcjM0MDU0MDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/3405405?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TsurHerman",
"html_url": "https://github.com/TsurHerman",
"followers_url": "https://api.github.com/users/TsurHerman/followers",
"following_url": "https://api.github.com/users/TsurHerman/following{/other_user}",
"gists_url": "https://api.github.com/users/TsurHerman/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TsurHerman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TsurHerman/subscriptions",
"organizations_url": "https://api.github.com/users/TsurHerman/orgs",
"repos_url": "https://api.github.com/users/TsurHerman/repos",
"events_url": "https://api.github.com/users/TsurHerman/events{/privacy}",
"received_events_url": "https://api.github.com/users/TsurHerman/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2025-01-12T20:34:12
| 2025-01-16T13:26:20
| 2025-01-16T13:25:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When running `ollama run` with the gemma-2b-it model, the generated text contains [UNK_BYTE_...] markers interleaved with normal text instead of the expected characters.
>
> ollama run Al
> >>> hi
> Hi[UNK_BYTE_0xe29681▁there]there![UNK_BYTE_0xe29681▁👋][UNK_BYTE_0xf09f918b▁👋]▁▁
> What[UNK_BYTE_0xe29681▁can]can[
>
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.5.4
|
{
"login": "TsurHerman",
"id": 3405405,
"node_id": "MDQ6VXNlcjM0MDU0MDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/3405405?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TsurHerman",
"html_url": "https://github.com/TsurHerman",
"followers_url": "https://api.github.com/users/TsurHerman/followers",
"following_url": "https://api.github.com/users/TsurHerman/following{/other_user}",
"gists_url": "https://api.github.com/users/TsurHerman/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TsurHerman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TsurHerman/subscriptions",
"organizations_url": "https://api.github.com/users/TsurHerman/orgs",
"repos_url": "https://api.github.com/users/TsurHerman/repos",
"events_url": "https://api.github.com/users/TsurHerman/events{/privacy}",
"received_events_url": "https://api.github.com/users/TsurHerman/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8397/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8397/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8032
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8032/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8032/comments
|
https://api.github.com/repos/ollama/ollama/issues/8032/events
|
https://github.com/ollama/ollama/pull/8032
| 2,730,947,526
|
PR_kwDOJ0Z1Ps6EwDHg
| 8,032
|
Remove unused runner CpuFeatures
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-10T18:57:02
| 2024-12-10T20:59:43
| 2024-12-10T20:59:39
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8032",
"html_url": "https://github.com/ollama/ollama/pull/8032",
"diff_url": "https://github.com/ollama/ollama/pull/8032.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8032.patch",
"merged_at": "2024-12-10T20:59:39"
}
|
The final implementation of #7499 removed dynamic vector requirements in favor of a simpler [filename based model](https://github.com/ollama/ollama/blob/main/runners/common.go#L125-L132), and this was left over logic that is no longer needed.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8032/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8032/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2740
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2740/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2740/comments
|
https://api.github.com/repos/ollama/ollama/issues/2740/events
|
https://github.com/ollama/ollama/issues/2740
| 2,152,628,134
|
I_kwDOJ0Z1Ps6ATn-m
| 2,740
|
Cannot pass file as suggested in example with windows
|
{
"login": "mattjoyce",
"id": 278869,
"node_id": "MDQ6VXNlcjI3ODg2OQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/278869?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mattjoyce",
"html_url": "https://github.com/mattjoyce",
"followers_url": "https://api.github.com/users/mattjoyce/followers",
"following_url": "https://api.github.com/users/mattjoyce/following{/other_user}",
"gists_url": "https://api.github.com/users/mattjoyce/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mattjoyce/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mattjoyce/subscriptions",
"organizations_url": "https://api.github.com/users/mattjoyce/orgs",
"repos_url": "https://api.github.com/users/mattjoyce/repos",
"events_url": "https://api.github.com/users/mattjoyce/events{/privacy}",
"received_events_url": "https://api.github.com/users/mattjoyce/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-02-25T06:45:42
| 2024-06-17T16:51:47
| 2024-03-12T21:48:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
ollama version is 0.1.27
Here's the example provided in the documentation:
> ollama run llama2 "Summarize this file: $(cat README.md)"
Here's what I tried using the Windows version, and the response:
> ollama run phi "summarize this file $(type 5_QGU5D7mLk.md)"
> I'm sorry, but as an AI language model, I cannot provide a summary of any specific text without access to its contents.
> Please provide me with more context or information about the text you would like me to summarize.
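The behavior above is consistent with the shell never expanding `$(type …)`: cmd.exe has no `$(...)` command substitution, so the literal text is sent to the model instead of the file contents. A minimal sketch of the difference, using a stand-in file `demo.txt` (the filename and the PowerShell equivalent in the comment are illustrative, not from the issue):

```shell
# In a POSIX shell, $(...) is replaced by the command's output before the
# program runs, so the model would receive the actual file contents.
echo "hello from the file" > demo.txt            # stand-in for the .md file
msg="summarize this file: $(cat demo.txt)"       # substitution happens here
printf '%s\n' "$msg"

# cmd.exe performs no such substitution; in PowerShell the rough
# equivalent would be:
#   ollama run phi "summarize this file: $(Get-Content .\file.md -Raw)"
```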
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2740/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2740/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5066
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5066/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5066/comments
|
https://api.github.com/repos/ollama/ollama/issues/5066/events
|
https://github.com/ollama/ollama/issues/5066
| 2,355,056,694
|
I_kwDOJ0Z1Ps6MX1A2
| 5,066
|
AMD 7945HX not showing avx512
|
{
"login": "mikealanni",
"id": 25714603,
"node_id": "MDQ6VXNlcjI1NzE0NjAz",
"avatar_url": "https://avatars.githubusercontent.com/u/25714603?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mikealanni",
"html_url": "https://github.com/mikealanni",
"followers_url": "https://api.github.com/users/mikealanni/followers",
"following_url": "https://api.github.com/users/mikealanni/following{/other_user}",
"gists_url": "https://api.github.com/users/mikealanni/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mikealanni/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mikealanni/subscriptions",
"organizations_url": "https://api.github.com/users/mikealanni/orgs",
"repos_url": "https://api.github.com/users/mikealanni/repos",
"events_url": "https://api.github.com/users/mikealanni/events{/privacy}",
"received_events_url": "https://api.github.com/users/mikealanni/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-15T17:28:07
| 2024-06-18T22:17:17
| 2024-06-18T22:17:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi, I'm curious whether this is a bug: the logs report that my CPU lacks AVX-512, even though it has it. When I start my Ollama Docker container, the log shows:
`INFO [main] system info | n_threads=16 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | " tid="130972104485952" timestamp=1718469327 total_threads=32`
And my /proc/cpuinfo shows:
`fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good amd_lbr_v2 nopl xtopology nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba perfmon_v2 ibrs ibpb stibp ibrs_enhanced vmmcall fsgsbase bmi1 avx2 smep bmi2 erms invpcid cqm rdt_a avx512f avx512dq rdseed adx smap avx512ifma clflushopt clwb avx512cd sha_ni avx512bw avx512vl xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local user_shstk avx512_bf16 clzero irperf xsaveerptr rdpru wbnoinvd cppc arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif x2avic v_spec_ctrl vnmi avx512vbmi umip pku ospke avx512_vbmi2 gfni vaes vpclmulqdq avx512_vnni avx512_bitalg avx512_vpopcntdq rdpid overflow_recov succor smca fsrm flush_l1d amd_lbr_pmc_freeze`
### OS
Docker
### GPU
AMD
### CPU
AMD
### Ollama version
0.1.44
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5066/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5066/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8171
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8171/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8171/comments
|
https://api.github.com/repos/ollama/ollama/issues/8171/events
|
https://github.com/ollama/ollama/pull/8171
| 2,749,930,106
|
PR_kwDOJ0Z1Ps6Fwv7v
| 8,171
|
Update go.sum
|
{
"login": "Squishedmac",
"id": 88924339,
"node_id": "MDQ6VXNlcjg4OTI0MzM5",
"avatar_url": "https://avatars.githubusercontent.com/u/88924339?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Squishedmac",
"html_url": "https://github.com/Squishedmac",
"followers_url": "https://api.github.com/users/Squishedmac/followers",
"following_url": "https://api.github.com/users/Squishedmac/following{/other_user}",
"gists_url": "https://api.github.com/users/Squishedmac/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Squishedmac/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Squishedmac/subscriptions",
"organizations_url": "https://api.github.com/users/Squishedmac/orgs",
"repos_url": "https://api.github.com/users/Squishedmac/repos",
"events_url": "https://api.github.com/users/Squishedmac/events{/privacy}",
"received_events_url": "https://api.github.com/users/Squishedmac/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-19T10:50:34
| 2024-12-19T10:51:53
| 2024-12-19T10:51:53
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8171",
"html_url": "https://github.com/ollama/ollama/pull/8171",
"diff_url": "https://github.com/ollama/ollama/pull/8171.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8171.patch",
"merged_at": null
}
| null |
{
"login": "Squishedmac",
"id": 88924339,
"node_id": "MDQ6VXNlcjg4OTI0MzM5",
"avatar_url": "https://avatars.githubusercontent.com/u/88924339?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Squishedmac",
"html_url": "https://github.com/Squishedmac",
"followers_url": "https://api.github.com/users/Squishedmac/followers",
"following_url": "https://api.github.com/users/Squishedmac/following{/other_user}",
"gists_url": "https://api.github.com/users/Squishedmac/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Squishedmac/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Squishedmac/subscriptions",
"organizations_url": "https://api.github.com/users/Squishedmac/orgs",
"repos_url": "https://api.github.com/users/Squishedmac/repos",
"events_url": "https://api.github.com/users/Squishedmac/events{/privacy}",
"received_events_url": "https://api.github.com/users/Squishedmac/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8171/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8171/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4464
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4464/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4464/comments
|
https://api.github.com/repos/ollama/ollama/issues/4464/events
|
https://github.com/ollama/ollama/issues/4464
| 2,299,107,344
|
I_kwDOJ0Z1Ps6JCZgQ
| 4,464
|
Support RX6600 (gfx1032) on windows (gfx override works on linux)
|
{
"login": "usmandilmeer",
"id": 51738693,
"node_id": "MDQ6VXNlcjUxNzM4Njkz",
"avatar_url": "https://avatars.githubusercontent.com/u/51738693?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/usmandilmeer",
"html_url": "https://github.com/usmandilmeer",
"followers_url": "https://api.github.com/users/usmandilmeer/followers",
"following_url": "https://api.github.com/users/usmandilmeer/following{/other_user}",
"gists_url": "https://api.github.com/users/usmandilmeer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/usmandilmeer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/usmandilmeer/subscriptions",
"organizations_url": "https://api.github.com/users/usmandilmeer/orgs",
"repos_url": "https://api.github.com/users/usmandilmeer/repos",
"events_url": "https://api.github.com/users/usmandilmeer/events{/privacy}",
"received_events_url": "https://api.github.com/users/usmandilmeer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-05-16T01:10:36
| 2024-08-27T21:13:12
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
Ollama (0.1.32) works great with Zluda using an AMD RX 6600 on Windows 10.
But I have downloaded and tested all the versions from 0.1.33 to 0.1.38, and Ollama does not work with Zluda.
It gives the error "0xc000001d".
So for now I have downgraded to 0.1.32 and am using it with Zluda.
Is this Zluda's issue or Ollama's?
Can anyone help me get the newer versions of Ollama working?
### OS
Windows
### GPU
AMD
### CPU
Intel
### Ollama version
0.1.32
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4464/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4464/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4413
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4413/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4413/comments
|
https://api.github.com/repos/ollama/ollama/issues/4413/events
|
https://github.com/ollama/ollama/pull/4413
| 2,293,952,173
|
PR_kwDOJ0Z1Ps5vUgzk
| 4,413
|
check if name exists before create/pull/copy
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-05-13T22:28:11
| 2024-05-29T19:06:59
| 2024-05-29T19:06:58
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4413",
"html_url": "https://github.com/ollama/ollama/pull/4413",
"diff_url": "https://github.com/ollama/ollama/pull/4413.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4413.patch",
"merged_at": "2024-05-29T19:06:58"
}
|
TODO
- [x] tests
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4413/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4413/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1121
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1121/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1121/comments
|
https://api.github.com/repos/ollama/ollama/issues/1121/events
|
https://github.com/ollama/ollama/issues/1121
| 1,992,243,519
|
I_kwDOJ0Z1Ps52vzk_
| 1,121
|
Using FROM command and using Modelfile not clear
|
{
"login": "kikoferrer",
"id": 135333835,
"node_id": "U_kgDOCBEHyw",
"avatar_url": "https://avatars.githubusercontent.com/u/135333835?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kikoferrer",
"html_url": "https://github.com/kikoferrer",
"followers_url": "https://api.github.com/users/kikoferrer/followers",
"following_url": "https://api.github.com/users/kikoferrer/following{/other_user}",
"gists_url": "https://api.github.com/users/kikoferrer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kikoferrer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kikoferrer/subscriptions",
"organizations_url": "https://api.github.com/users/kikoferrer/orgs",
"repos_url": "https://api.github.com/users/kikoferrer/repos",
"events_url": "https://api.github.com/users/kikoferrer/events{/privacy}",
"received_events_url": "https://api.github.com/users/kikoferrer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2023-11-14T08:36:40
| 2023-11-20T16:04:09
| 2023-11-16T16:02:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
So I installed Ollama using the instructions here. Then I wanted to use a predownloaded model, and this is what I did:
The guide says to create a Modelfile, so I used touch:
`touch Modelfile`
then added a FROM instruction with the local filepath to the model I want to import:
`nano Modelfile
FROM ./path/to/model/model.gguf`
Then I created the model from the Modelfile:
`ollama create model -f Modelfile`
It returns:
`couldn't open modelfile '/path/to/modelfile/Modelfile' Error: failed to open file: open //path/to/modelfile/Modelfile: permission denied`
Please teach me where I went wrong. Thanks. I am using Linux.
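A minimal sketch of the same flow, assuming the GGUF sits at `./model.gguf` (the path and the model name `mymodel` are illustrative). The "permission denied" error usually means the Modelfile, or a directory on the path to it, is not readable by the user the create command runs as, so the sketch makes it world-readable before creating:

```shell
# Write the Modelfile in one step instead of touch + nano.
printf 'FROM ./model.gguf\n' > Modelfile

# Make the Modelfile readable by any user, in case the ollama client
# runs as a different user than the one that created the file.
chmod a+r Modelfile

# Create the model (skipped here when ollama is not installed).
if command -v ollama >/dev/null; then
  ollama create mymodel -f Modelfile
fi
```

If the Modelfile lives under a home directory, the parent directories also need execute (search) permission for the user running the command.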
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1121/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1121/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5116
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5116/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5116/comments
|
https://api.github.com/repos/ollama/ollama/issues/5116/events
|
https://github.com/ollama/ollama/issues/5116
| 2,359,886,939
|
I_kwDOJ0Z1Ps6MqQRb
| 5,116
|
ERROR [validate_model_chat_template] deepseek-coder-v2:16b-lite-instruct-q8_0
|
{
"login": "ekolawole",
"id": 79321648,
"node_id": "MDQ6VXNlcjc5MzIxNjQ4",
"avatar_url": "https://avatars.githubusercontent.com/u/79321648?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ekolawole",
"html_url": "https://github.com/ekolawole",
"followers_url": "https://api.github.com/users/ekolawole/followers",
"following_url": "https://api.github.com/users/ekolawole/following{/other_user}",
"gists_url": "https://api.github.com/users/ekolawole/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ekolawole/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ekolawole/subscriptions",
"organizations_url": "https://api.github.com/users/ekolawole/orgs",
"repos_url": "https://api.github.com/users/ekolawole/repos",
"events_url": "https://api.github.com/users/ekolawole/events{/privacy}",
"received_events_url": "https://api.github.com/users/ekolawole/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-18T13:35:19
| 2024-06-19T18:44:06
| 2024-06-19T18:44:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
deepseek-coder-v2:16b-lite-instruct-q8_0:
INFO [main] model loaded | tid="0x1fe414c00" timestamp=1718717321
ERROR [validate_model_chat_template] The chat template comes with this model is not yet supported, falling back to chatml. This may cause the model to output suboptimal responses | tid="0x1fe414c00" timestamp=1718717321
time=2024-06-18T09:28:41.274-04:00 level=INFO source=server.go:572 msg="llama runner started in 2.66 seconds"
GGML_ASSERT: /Users/runner/work/ollama/ollama/llm/llama.cpp/ggml-metal.m:1853: dst_rows <= 2048
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.44
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5116/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5116/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8583
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8583/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8583/comments
|
https://api.github.com/repos/ollama/ollama/issues/8583/events
|
https://github.com/ollama/ollama/issues/8583
| 2,811,062,722
|
I_kwDOJ0Z1Ps6njWnC
| 8,583
|
Deepseek R1 throwing weird generation DDDDDDDDDDDDDDDDDDDDDDDDDDDDDDD
|
{
"login": "amrrs",
"id": 5347322,
"node_id": "MDQ6VXNlcjUzNDczMjI=",
"avatar_url": "https://avatars.githubusercontent.com/u/5347322?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amrrs",
"html_url": "https://github.com/amrrs",
"followers_url": "https://api.github.com/users/amrrs/followers",
"following_url": "https://api.github.com/users/amrrs/following{/other_user}",
"gists_url": "https://api.github.com/users/amrrs/gists{/gist_id}",
"starred_url": "https://api.github.com/users/amrrs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amrrs/subscriptions",
"organizations_url": "https://api.github.com/users/amrrs/orgs",
"repos_url": "https://api.github.com/users/amrrs/repos",
"events_url": "https://api.github.com/users/amrrs/events{/privacy}",
"received_events_url": "https://api.github.com/users/amrrs/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 5
| 2025-01-25T16:25:14
| 2025-01-27T13:44:28
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I tried the full DeepSeek model (the 4-bit quantized one) with `ollama run deepseek-r1:671b`,
but it somehow gives `DDDDDDDDDDDDDDDDDDDDDDDDDDDDDDD` as the output:

### OS
Linux
### GPU
AMD
### CPU
_No response_
### Ollama version
0.5.7
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8583/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8583/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2055
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2055/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2055/comments
|
https://api.github.com/repos/ollama/ollama/issues/2055/events
|
https://github.com/ollama/ollama/pull/2055
| 2,088,786,742
|
PR_kwDOJ0Z1Ps5kc53j
| 2,055
|
Refine the linux cuda/rocm developer docs
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-18T17:52:23
| 2024-01-18T20:07:34
| 2024-01-18T20:07:31
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2055",
"html_url": "https://github.com/ollama/ollama/pull/2055",
"diff_url": "https://github.com/ollama/ollama/pull/2055.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2055.patch",
"merged_at": "2024-01-18T20:07:31"
}
|
With the recent improvements in the [gen_linux.sh](https://github.com/jmorganca/ollama/blob/main/llm/generate/gen_linux.sh) script and these doc updates, this should fix #1704
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2055/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2055/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/491
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/491/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/491/comments
|
https://api.github.com/repos/ollama/ollama/issues/491/events
|
https://github.com/ollama/ollama/pull/491
| 1,886,678,309
|
PR_kwDOJ0Z1Ps5Z0wdg
| 491
|
add autoprune to remove unused layers
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-07T23:06:48
| 2023-09-11T18:46:36
| 2023-09-11T18:46:35
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/491",
"html_url": "https://github.com/ollama/ollama/pull/491",
"diff_url": "https://github.com/ollama/ollama/pull/491.diff",
"patch_url": "https://github.com/ollama/ollama/pull/491.patch",
"merged_at": "2023-09-11T18:46:35"
}
|
This change will remove any unused layers for models. It runs at server startup, and will also clean up on `pull` or `create` commands which can orphan older layers.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/491/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/491/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3436
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3436/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3436/comments
|
https://api.github.com/repos/ollama/ollama/issues/3436/events
|
https://github.com/ollama/ollama/pull/3436
| 2,218,025,498
|
PR_kwDOJ0Z1Ps5rTcx0
| 3,436
|
Update README.md
|
{
"login": "ParisNeo",
"id": 827993,
"node_id": "MDQ6VXNlcjgyNzk5Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/827993?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParisNeo",
"html_url": "https://github.com/ParisNeo",
"followers_url": "https://api.github.com/users/ParisNeo/followers",
"following_url": "https://api.github.com/users/ParisNeo/following{/other_user}",
"gists_url": "https://api.github.com/users/ParisNeo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParisNeo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParisNeo/subscriptions",
"organizations_url": "https://api.github.com/users/ParisNeo/orgs",
"repos_url": "https://api.github.com/users/ParisNeo/repos",
"events_url": "https://api.github.com/users/ParisNeo/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParisNeo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-01T10:49:28
| 2024-04-01T15:16:31
| 2024-04-01T15:16:31
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3436",
"html_url": "https://github.com/ollama/ollama/pull/3436",
"diff_url": "https://github.com/ollama/ollama/pull/3436.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3436.patch",
"merged_at": "2024-04-01T15:16:31"
}
|
Just added lollms-webui to the list of supported web UIs.
Lollms is a web UI that can perform a large range of tasks, from generating text and chatting with more than 500 agents to generating images, music, and videos. Lollms supports multimodality and can use it along with Ollama. It can also offer RAG and summary services, all running locally and for free.
For more about lollms, you can check out its website :
[https://lollms.com/](https://lollms.com/)
or the github
[https://github.com/ParisNeo/lollms-webui](https://github.com/ParisNeo/lollms-webui)
or my youtube channel:
https://www.youtube.com/@Parisneo
Thank you very much for this wonderful backend.
I'll make a video about how to install lollms along with ollama.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3436/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3436/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6414
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6414/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6414/comments
|
https://api.github.com/repos/ollama/ollama/issues/6414/events
|
https://github.com/ollama/ollama/issues/6414
| 2,472,725,641
|
I_kwDOJ0Z1Ps6TYsyJ
| 6,414
|
Ollama embedding is slow
|
{
"login": "yuanjie-ai",
"id": 20265321,
"node_id": "MDQ6VXNlcjIwMjY1MzIx",
"avatar_url": "https://avatars.githubusercontent.com/u/20265321?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuanjie-ai",
"html_url": "https://github.com/yuanjie-ai",
"followers_url": "https://api.github.com/users/yuanjie-ai/followers",
"following_url": "https://api.github.com/users/yuanjie-ai/following{/other_user}",
"gists_url": "https://api.github.com/users/yuanjie-ai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuanjie-ai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuanjie-ai/subscriptions",
"organizations_url": "https://api.github.com/users/yuanjie-ai/orgs",
"repos_url": "https://api.github.com/users/yuanjie-ai/repos",
"events_url": "https://api.github.com/users/yuanjie-ai/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuanjie-ai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-19T08:06:45
| 2024-08-23T23:38:13
| 2024-08-23T23:38:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ollama embedding is slow
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6414/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6414/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1165
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1165/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1165/comments
|
https://api.github.com/repos/ollama/ollama/issues/1165/events
|
https://github.com/ollama/ollama/issues/1165
| 1,998,144,073
|
I_kwDOJ0Z1Ps53GUJJ
| 1,165
|
Provide command to export downloaded models
|
{
"login": "biandayu",
"id": 52662468,
"node_id": "MDQ6VXNlcjUyNjYyNDY4",
"avatar_url": "https://avatars.githubusercontent.com/u/52662468?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/biandayu",
"html_url": "https://github.com/biandayu",
"followers_url": "https://api.github.com/users/biandayu/followers",
"following_url": "https://api.github.com/users/biandayu/following{/other_user}",
"gists_url": "https://api.github.com/users/biandayu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/biandayu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/biandayu/subscriptions",
"organizations_url": "https://api.github.com/users/biandayu/orgs",
"repos_url": "https://api.github.com/users/biandayu/repos",
"events_url": "https://api.github.com/users/biandayu/events{/privacy}",
"received_events_url": "https://api.github.com/users/biandayu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 10
| 2023-11-17T02:21:19
| 2024-02-20T01:08:28
| 2024-02-20T01:08:27
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is there any way to import and export downloaded models? That way there would be no need to use `ollama pull` to download them again on another local machine.
Thanks
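A hedged sketch of a manual export: pulled models live under the Ollama data directory (commonly `~/.ollama/models`, or `/usr/share/ollama/.ollama/models` for the Linux systemd service; `OLLAMA_MODELS` can override it). `blobs/` holds the layer data and `manifests/` indexes the models; the destination directory name below is illustrative:

```shell
# Source: the model store of the current installation.
SRC="${OLLAMA_MODELS:-$HOME/.ollama/models}"
# Destination: any directory you can carry to the other machine.
DEST=exported-models

mkdir -p "$DEST"
# Copy both directories; restoring them into the model store on another
# machine should make the models visible to `ollama list` without re-pulling.
if [ -d "$SRC" ]; then
  cp -r "$SRC/blobs" "$SRC/manifests" "$DEST"
fi
```

On the target machine, copy `blobs/` and `manifests/` back into its model store directory and restart the server.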
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1165/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1165/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5910
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5910/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5910/comments
|
https://api.github.com/repos/ollama/ollama/issues/5910/events
|
https://github.com/ollama/ollama/issues/5910
| 2,427,476,355
|
I_kwDOJ0Z1Ps6QsFmD
| 5,910
|
Ollama serve hangs on openai completions request
|
{
"login": "ikamensh",
"id": 23004004,
"node_id": "MDQ6VXNlcjIzMDA0MDA0",
"avatar_url": "https://avatars.githubusercontent.com/u/23004004?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ikamensh",
"html_url": "https://github.com/ikamensh",
"followers_url": "https://api.github.com/users/ikamensh/followers",
"following_url": "https://api.github.com/users/ikamensh/following{/other_user}",
"gists_url": "https://api.github.com/users/ikamensh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ikamensh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ikamensh/subscriptions",
"organizations_url": "https://api.github.com/users/ikamensh/orgs",
"repos_url": "https://api.github.com/users/ikamensh/repos",
"events_url": "https://api.github.com/users/ikamensh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ikamensh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-07-24T12:35:56
| 2024-09-04T04:18:47
| 2024-09-04T04:18:47
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I was processing files locally in a loop, and at some point the process just stopped moving forward. I had to do a keyboard interrupt. In the server terminal this produced the entry at the end of the log below. In the terminal running the Python query source, I saw on termination that it had hung in a sock.receive() call.
[GIN] 2024/07/24 - 14:25:35 | 200 | 2.729474292s | 127.0.0.1 | POST "/v1/chat/completions"
[GIN] 2024/07/24 - 14:25:36 | 200 | 1.86225825s | 127.0.0.1 | POST "/v1/chat/completions"
[GIN] 2024/07/24 - 14:25:40 | 200 | 3.38871s | 127.0.0.1 | POST "/v1/chat/completions"
[GIN] 2024/07/24 - 14:25:45 | 200 | 4.8193825s | 127.0.0.1 | POST "/v1/chat/completions"
[GIN] 2024/07/24 - 14:25:47 | 200 | 2.7770215s | 127.0.0.1 | POST "/v1/chat/completions"
[GIN] 2024/07/24 - 14:31:02 | 500 | 5m14s | 127.0.0.1 | POST "/v1/chat/completions"
time=2024-07-24T14:33:00.048+02:00 level=INFO source=memory.go:309 msg="o
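A client-side workaround (not a fix for the server-side hang) is to bound every request with a hard timeout so a wedged completion fails fast instead of blocking the loop. A sketch using `curl` against the local OpenAI-compatible endpoint; the model name, port, and 120-second cap are illustrative:

```shell
# ollama_chat PROMPT [TIMEOUT_SECONDS]
# --max-time caps the whole request, so connection failures and server
# hangs both surface as a non-zero exit code instead of blocking forever.
ollama_chat() {
  curl --silent --max-time "${2:-120}" \
    http://127.0.0.1:11434/v1/chat/completions \
    -H 'Content-Type: application/json' \
    -d "{\"model\": \"llama3\", \"messages\": [{\"role\": \"user\", \"content\": \"$1\"}]}"
}
```

Calling `ollama_chat 'hello' 30` then either returns the JSON response or exits non-zero within 30 seconds, which a retry loop can act on.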
### OS
macOS
### GPU
AMD
### CPU
Apple
### Ollama version
0.1.48
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5910/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5910/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8106
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8106/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8106/comments
|
https://api.github.com/repos/ollama/ollama/issues/8106/events
|
https://github.com/ollama/ollama/pull/8106
| 2,740,273,114
|
PR_kwDOJ0Z1Ps6FPoc3
| 8,106
|
server: tokenize & detokenize endpoints
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-12-15T04:32:59
| 2024-12-19T01:39:45
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8106",
"html_url": "https://github.com/ollama/ollama/pull/8106",
"diff_url": "https://github.com/ollama/ollama/pull/8106.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8106.patch",
"merged_at": null
}
|
Massive shoutout to @Yurzs for getting this in.
Doing cleanup + tests.
Closes: https://github.com/ollama/ollama/issues/3582
TO-DO:
- [ ] Python SDK: https://github.com/ollama/ollama-python/pull/383
- [ ] JS SDK: https://github.com/ollama/ollama-js/pull/179
- [ ] Benchmarking w/ & w/o caching
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8106/reactions",
"total_count": 4,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 4,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8106/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7534
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7534/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7534/comments
|
https://api.github.com/repos/ollama/ollama/issues/7534/events
|
https://github.com/ollama/ollama/issues/7534
| 2,639,296,141
|
I_kwDOJ0Z1Ps6dUHaN
| 7,534
|
Performance Regression in Ollama 0.4.0 Compared to 0.3.14
|
{
"login": "MMaturax",
"id": 3213496,
"node_id": "MDQ6VXNlcjMyMTM0OTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/3213496?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MMaturax",
"html_url": "https://github.com/MMaturax",
"followers_url": "https://api.github.com/users/MMaturax/followers",
"following_url": "https://api.github.com/users/MMaturax/following{/other_user}",
"gists_url": "https://api.github.com/users/MMaturax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MMaturax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MMaturax/subscriptions",
"organizations_url": "https://api.github.com/users/MMaturax/orgs",
"repos_url": "https://api.github.com/users/MMaturax/repos",
"events_url": "https://api.github.com/users/MMaturax/events{/privacy}",
"received_events_url": "https://api.github.com/users/MMaturax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5808482718,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng",
"url": "https://api.github.com/repos/ollama/ollama/labels/performance",
"name": "performance",
"color": "A5B5C6",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
| null |
[] | null | 16
| 2024-11-06T21:37:05
| 2024-11-22T19:34:20
| 2024-11-22T04:36:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hello,
After updating to Ollama version 0.4.0, which was noted to have performance improvements, I conducted some performance tests and observed that version 0.3.14 outperformed 0.4.0 in certain cases on my system.
Here are the specifics:
Ollama Version 0.4.0 Test Results (average speed of 70-78 tokens/second):

Ollama Version 0.3.14 Test Results (average speed of 82-88 tokens/second):

Could you provide insight into why version 0.4.0 is performing slower when an increase in speed was expected?
Thank you.
**System Information:**
OS: Ubuntu 24.04.1 LTS
CPU: AMD Ryzen 9 7950X3D 16-Core Processor
GPU: GeForce RTX 4070 Ti SUPER
Driver Version: 550.120
CUDA Version: 12.4
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.4.0
|
{
"login": "MMaturax",
"id": 3213496,
"node_id": "MDQ6VXNlcjMyMTM0OTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/3213496?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MMaturax",
"html_url": "https://github.com/MMaturax",
"followers_url": "https://api.github.com/users/MMaturax/followers",
"following_url": "https://api.github.com/users/MMaturax/following{/other_user}",
"gists_url": "https://api.github.com/users/MMaturax/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MMaturax/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MMaturax/subscriptions",
"organizations_url": "https://api.github.com/users/MMaturax/orgs",
"repos_url": "https://api.github.com/users/MMaturax/repos",
"events_url": "https://api.github.com/users/MMaturax/events{/privacy}",
"received_events_url": "https://api.github.com/users/MMaturax/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7534/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7534/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2039
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2039/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2039/comments
|
https://api.github.com/repos/ollama/ollama/issues/2039/events
|
https://github.com/ollama/ollama/issues/2039
| 2,087,341,946
|
I_kwDOJ0Z1Ps58ak96
| 2,039
|
web-ui log error loading model: llama.cpp: tensor 'layers.2.ffn_norm.weight' is missing from model
|
{
"login": "lpf763827726",
"id": 43004977,
"node_id": "MDQ6VXNlcjQzMDA0OTc3",
"avatar_url": "https://avatars.githubusercontent.com/u/43004977?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lpf763827726",
"html_url": "https://github.com/lpf763827726",
"followers_url": "https://api.github.com/users/lpf763827726/followers",
"following_url": "https://api.github.com/users/lpf763827726/following{/other_user}",
"gists_url": "https://api.github.com/users/lpf763827726/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lpf763827726/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lpf763827726/subscriptions",
"organizations_url": "https://api.github.com/users/lpf763827726/orgs",
"repos_url": "https://api.github.com/users/lpf763827726/repos",
"events_url": "https://api.github.com/users/lpf763827726/events{/privacy}",
"received_events_url": "https://api.github.com/users/lpf763827726/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-01-18T02:15:04
| 2024-05-17T21:57:45
| 2024-05-17T21:57:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When I run `ollama run llama2:13b` and `ollama run codellama` with ollama-webui and ask 2-3 questions, it starts to fail with an error reporting that a tensor is missing from the model.
[Issue details](https://github.com/ollama-webui/ollama-webui/issues/507)
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2039/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2039/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4058
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4058/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4058/comments
|
https://api.github.com/repos/ollama/ollama/issues/4058/events
|
https://github.com/ollama/ollama/pull/4058
| 2,272,227,088
|
PR_kwDOJ0Z1Ps5uLslV
| 4,058
|
fix: store accurate model parameter size
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-30T18:43:30
| 2024-05-07T21:41:54
| 2024-05-07T21:41:54
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4058",
"html_url": "https://github.com/ollama/ollama/pull/4058",
"diff_url": "https://github.com/ollama/ollama/pull/4058.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4058.patch",
"merged_at": "2024-05-07T21:41:54"
}
|
- add test for number formatting
- fix bug where 1B and 1M were not stored correctly
- display 2 decimal points for million param sizes
- display 1 decimal point for billion param sizes
This human-readable conversion is displayed as the parameter size on ollama.com, so it should be in the standard format in which model parameter sizes are measured.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4058/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4058/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/681
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/681/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/681/comments
|
https://api.github.com/repos/ollama/ollama/issues/681/events
|
https://github.com/ollama/ollama/pull/681
| 1,922,761,585
|
PR_kwDOJ0Z1Ps5bt8Rx
| 681
|
show a default message when license/parameters/etc aren't specified
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-02T21:33:32
| 2023-10-02T21:34:53
| 2023-10-02T21:34:53
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/681",
"html_url": "https://github.com/ollama/ollama/pull/681",
"diff_url": "https://github.com/ollama/ollama/pull/681.diff",
"patch_url": "https://github.com/ollama/ollama/pull/681.patch",
"merged_at": "2023-10-02T21:34:53"
}
| null |
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/681/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/681/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7842
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7842/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7842/comments
|
https://api.github.com/repos/ollama/ollama/issues/7842/events
|
https://github.com/ollama/ollama/issues/7842
| 2,694,371,569
|
I_kwDOJ0Z1Ps6gmNjx
| 7,842
|
Ovis1.6-Gemma2-27B Model request
|
{
"login": "Backendmagier",
"id": 158162798,
"node_id": "U_kgDOCW1fbg",
"avatar_url": "https://avatars.githubusercontent.com/u/158162798?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Backendmagier",
"html_url": "https://github.com/Backendmagier",
"followers_url": "https://api.github.com/users/Backendmagier/followers",
"following_url": "https://api.github.com/users/Backendmagier/following{/other_user}",
"gists_url": "https://api.github.com/users/Backendmagier/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Backendmagier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Backendmagier/subscriptions",
"organizations_url": "https://api.github.com/users/Backendmagier/orgs",
"repos_url": "https://api.github.com/users/Backendmagier/repos",
"events_url": "https://api.github.com/users/Backendmagier/events{/privacy}",
"received_events_url": "https://api.github.com/users/Backendmagier/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-11-26T11:42:39
| 2024-11-26T11:42:39
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/AIDC-AI/Ovis1.6-Gemma2-27B
A very good multimodal model; it could be the best open-source multimodal model available at the moment.
Would love to have it in Ollama.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7842/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7842/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6224
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6224/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6224/comments
|
https://api.github.com/repos/ollama/ollama/issues/6224/events
|
https://github.com/ollama/ollama/issues/6224
| 2,452,506,249
|
I_kwDOJ0Z1Ps6SLkaJ
| 6,224
|
Passing result from tool calling to model
|
{
"login": "tristanMatthias",
"id": 2550138,
"node_id": "MDQ6VXNlcjI1NTAxMzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2550138?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tristanMatthias",
"html_url": "https://github.com/tristanMatthias",
"followers_url": "https://api.github.com/users/tristanMatthias/followers",
"following_url": "https://api.github.com/users/tristanMatthias/following{/other_user}",
"gists_url": "https://api.github.com/users/tristanMatthias/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tristanMatthias/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tristanMatthias/subscriptions",
"organizations_url": "https://api.github.com/users/tristanMatthias/orgs",
"repos_url": "https://api.github.com/users/tristanMatthias/repos",
"events_url": "https://api.github.com/users/tristanMatthias/events{/privacy}",
"received_events_url": "https://api.github.com/users/tristanMatthias/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-08-07T05:39:45
| 2024-10-24T03:23:46
| 2024-10-24T03:23:46
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi there. I am trying to follow the guidelines from Meta on how to pass a result from a tool call to Llama3.1
This is per [their documentation](https://llama.meta.com/docs/model-cards-and-prompt-formats/llama3_1/)
The ollama [`api.ToolCall`](https://github.com/ollama/ollama/blob/main/api/types.go#L144-L146) struct does not have any way to pass the result back to the model.
Is there some way we could support passing responses back to ollama/llama in Meta's preferred way? Or am I missing something?
Thank you for the great work!
```
### Step - 3 Result from calling the tool is passed back to the model
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Environment: ipython
Tools: brave_search, wolfram_alpha
Cutting Knowledge Date: December 2023
Today Date: 23 Jul 2024
# Tool Instructions
- Always execute python code in messages that you share.
- When looking for real time information use relevant functions if available else fallback to brave_search
You have access to the following functions:
Use the function 'spotify_trending_songs' to: Get top trending songs on Spotify
{"name": "spotify_trending_songs", "description": "Get top trending songs on Spotify", "parameters": {"n": {"param_type": "int", "description": "Number of trending songs to get", "required": true}}}
If a you choose to call a function ONLY reply in the following format:
<{start_tag}={function_name}>{parameters}{end_tag}
where
start_tag => `<function`
parameters => a JSON dict with the function argument name as key and function argument value as value.
end_tag => `</function>`
Here is an example,
<function=example_function_name>{"example_name": "example_value"}</function>
Reminder:
- Function calls MUST follow the specified format
- Required parameters MUST be specified
- Only call one function at a time
- Put the entire function call reply on one line
- Always add your sources when using search results to answer the user query
You are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>
Can you check the top 5 trending songs on spotify?<|eot_id|><|start_header_id|>assistant<|end_header_id|>
<function=spotify_trending_songs>{"n": "5"}</function><|eom_id|><|start_header_id|>ipython<|end_header_id|>
["1. BIRDS OF A FEATHER by Billie Eilish", "2. Espresso by Sabrina Carpenter", "3. Please Please Please by Sabrina Carpenter", "4. Not Like Us by Kendrick Lamar", "5. Gata Only by FloyyMenor, Cris Mj"]<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```
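For readers following this thread: a minimal sketch of what passing a tool result back could look like with an OpenAI-style `tool` role message (the message shape Ollama's chat endpoint later adopted). The `run_tool` implementation and its output here are purely illustrative, not part of any real API:

```python
# Sketch: execute a requested tool call locally and append its result as a
# "tool" role message so the next model turn can see it.
# run_tool and its return values are illustrative placeholders.

def run_tool(name, args):
    # Hypothetical local implementation of the advertised tool.
    if name == "spotify_trending_songs":
        return ["BIRDS OF A FEATHER", "Espresso"][: args.get("n", 5)]
    raise ValueError(f"unknown tool: {name}")

messages = [
    {"role": "user", "content": "Top trending songs on Spotify?"},
    # What the model would emit as a structured tool call:
    {"role": "assistant", "content": "",
     "tool_calls": [{"function": {"name": "spotify_trending_songs",
                                  "arguments": {"n": 2}}}]},
]

# Execute the requested tool and feed the result back for the next turn.
call = messages[-1]["tool_calls"][0]["function"]
result = run_tool(call["name"], call["arguments"])
messages.append({"role": "tool", "content": str(result)})
```

The `messages` list, now ending in the `tool` message, would be sent back in the next chat request so the model can summarize the result.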
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6224/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6224/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6731
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6731/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6731/comments
|
https://api.github.com/repos/ollama/ollama/issues/6731/events
|
https://github.com/ollama/ollama/issues/6731
| 2,516,878,681
|
I_kwDOJ0Z1Ps6WBIVZ
| 6,731
|
error while installing on openSUSE Leap 15.6
|
{
"login": "kc8pdr205",
"id": 95314147,
"node_id": "U_kgDOBa5g4w",
"avatar_url": "https://avatars.githubusercontent.com/u/95314147?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kc8pdr205",
"html_url": "https://github.com/kc8pdr205",
"followers_url": "https://api.github.com/users/kc8pdr205/followers",
"following_url": "https://api.github.com/users/kc8pdr205/following{/other_user}",
"gists_url": "https://api.github.com/users/kc8pdr205/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kc8pdr205/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kc8pdr205/subscriptions",
"organizations_url": "https://api.github.com/users/kc8pdr205/orgs",
"repos_url": "https://api.github.com/users/kc8pdr205/repos",
"events_url": "https://api.github.com/users/kc8pdr205/events{/privacy}",
"received_events_url": "https://api.github.com/users/kc8pdr205/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-09-10T16:07:33
| 2024-09-11T01:27:43
| 2024-09-11T01:27:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm trying to install Ollama on Leap 15.6. When I run the install command, I get this error (I have both files installed):
WARNING: Unable to detect NVIDIA/AMD GPU. Install lspci or lshw to automatically detect and install GPU dependencies.
I do have CUDA and the NVIDIA drivers installed.
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6731/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6731/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1513
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1513/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1513/comments
|
https://api.github.com/repos/ollama/ollama/issues/1513/events
|
https://github.com/ollama/ollama/issues/1513
| 2,040,814,879
|
I_kwDOJ0Z1Ps55pF0f
| 1,513
|
I don't like the idea that ollama forces me to use a server.
|
{
"login": "franciscoprin",
"id": 27599257,
"node_id": "MDQ6VXNlcjI3NTk5MjU3",
"avatar_url": "https://avatars.githubusercontent.com/u/27599257?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/franciscoprin",
"html_url": "https://github.com/franciscoprin",
"followers_url": "https://api.github.com/users/franciscoprin/followers",
"following_url": "https://api.github.com/users/franciscoprin/following{/other_user}",
"gists_url": "https://api.github.com/users/franciscoprin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/franciscoprin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/franciscoprin/subscriptions",
"organizations_url": "https://api.github.com/users/franciscoprin/orgs",
"repos_url": "https://api.github.com/users/franciscoprin/repos",
"events_url": "https://api.github.com/users/franciscoprin/events{/privacy}",
"received_events_url": "https://api.github.com/users/franciscoprin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-12-14T02:57:59
| 2024-03-12T01:25:25
| 2024-03-12T01:25:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
So, if I have Python code that looks like this:
```python
from langchain.schema import (SystemMessage, HumanMessage, AIMessage)
from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.chat_models import ChatOllama
question = "Could I have GitHub access?"
chat_template = [
SystemMessage(
content=(
"You are a helpful DevOps assistant, rewrite user's questions to only include the websites that they want to access."
)
),
HumanMessage(content=question),
]
chat_model = ChatOllama(
# model="llama2:7b-chat",
model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",
callback_manager=CallbackManager([StreamingStdOutCallbackHandler()]),
)
chat_model(chat_template)
```
the above source code gives me the following error:
```
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=11434): Max retries exceeded with url: /api/generate/ (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x107601090>: Failed to establish a new connection: [Errno 61] Connection refused'))
```
I don't want my models to be downloaded by the Ollama service; I want to use the models that I have already downloaded instead.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1513/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1513/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6657
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6657/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6657/comments
|
https://api.github.com/repos/ollama/ollama/issues/6657/events
|
https://github.com/ollama/ollama/issues/6657
| 2,508,239,239
|
I_kwDOJ0Z1Ps6VgLGH
| 6,657
|
Qwen2-VL 2B / 7B / 72B
|
{
"login": "thiswillbeyourgithub",
"id": 26625900,
"node_id": "MDQ6VXNlcjI2NjI1OTAw",
"avatar_url": "https://avatars.githubusercontent.com/u/26625900?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thiswillbeyourgithub",
"html_url": "https://github.com/thiswillbeyourgithub",
"followers_url": "https://api.github.com/users/thiswillbeyourgithub/followers",
"following_url": "https://api.github.com/users/thiswillbeyourgithub/following{/other_user}",
"gists_url": "https://api.github.com/users/thiswillbeyourgithub/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thiswillbeyourgithub/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thiswillbeyourgithub/subscriptions",
"organizations_url": "https://api.github.com/users/thiswillbeyourgithub/orgs",
"repos_url": "https://api.github.com/users/thiswillbeyourgithub/repos",
"events_url": "https://api.github.com/users/thiswillbeyourgithub/events{/privacy}",
"received_events_url": "https://api.github.com/users/thiswillbeyourgithub/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-09-05T16:23:48
| 2024-09-05T16:24:37
| 2024-09-05T16:24:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
The [new Qwen2-VL model](https://github.com/QwenLM/Qwen2-VL) accepts images and even videos as input, at small model sizes. It uses [a permissive license too!](https://simonwillison.net/2024/Sep/4/qwen2-vl/)
Example by [simonw](https://simonwillison.net/2024/Sep/4/qwen2-vl/)

Edit: oops, dupe of #6564
|
{
"login": "thiswillbeyourgithub",
"id": 26625900,
"node_id": "MDQ6VXNlcjI2NjI1OTAw",
"avatar_url": "https://avatars.githubusercontent.com/u/26625900?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thiswillbeyourgithub",
"html_url": "https://github.com/thiswillbeyourgithub",
"followers_url": "https://api.github.com/users/thiswillbeyourgithub/followers",
"following_url": "https://api.github.com/users/thiswillbeyourgithub/following{/other_user}",
"gists_url": "https://api.github.com/users/thiswillbeyourgithub/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thiswillbeyourgithub/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thiswillbeyourgithub/subscriptions",
"organizations_url": "https://api.github.com/users/thiswillbeyourgithub/orgs",
"repos_url": "https://api.github.com/users/thiswillbeyourgithub/repos",
"events_url": "https://api.github.com/users/thiswillbeyourgithub/events{/privacy}",
"received_events_url": "https://api.github.com/users/thiswillbeyourgithub/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6657/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6657/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8666
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8666/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8666/comments
|
https://api.github.com/repos/ollama/ollama/issues/8666/events
|
https://github.com/ollama/ollama/issues/8666
| 2,818,549,002
|
I_kwDOJ0Z1Ps6n_6UK
| 8,666
|
TERMUX ERROR
|
{
"login": "NeKosmico",
"id": 165345955,
"node_id": "U_kgDOCdr6ow",
"avatar_url": "https://avatars.githubusercontent.com/u/165345955?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NeKosmico",
"html_url": "https://github.com/NeKosmico",
"followers_url": "https://api.github.com/users/NeKosmico/followers",
"following_url": "https://api.github.com/users/NeKosmico/following{/other_user}",
"gists_url": "https://api.github.com/users/NeKosmico/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NeKosmico/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NeKosmico/subscriptions",
"organizations_url": "https://api.github.com/users/NeKosmico/orgs",
"repos_url": "https://api.github.com/users/NeKosmico/repos",
"events_url": "https://api.github.com/users/NeKosmico/events{/privacy}",
"received_events_url": "https://api.github.com/users/NeKosmico/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-29T15:30:45
| 2025-01-29T16:10:43
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I wanted to run Ollama in the Termux (Android) application. Everything was going well until the following happened at this step:
```bash
~/ollama $ go build .
# github.com/ollama/ollama/discover
gpu_info_cudart.c:61:13: warning: comparison of different enumeration types ('cudartReturn_t' (aka 'enum cudartReturn_enum') and 'enum cudaError_enum') [-Wenum-compare]
gpu_info_cudart.c:171:60: warning: format specifies type 'unsigned long' but the argument has type 'uint64_t' (aka 'unsigned long long') [-Wformat]
./gpu_info.h:33:23: note: expanded from macro 'LOG'
gpu_info_cudart.c:172:59: warning: format specifies type 'unsigned long' but the argument has type 'uint64_t' (aka 'unsigned long long') [-Wformat]
./gpu_info.h:33:23: note: expanded from macro 'LOG'
gpu_info_cudart.c:173:59: warning: format specifies type 'unsigned long' but the argument has type 'uint64_t' (aka 'unsigned long long') [-Wformat]
./gpu_info.h:33:23: note: expanded from macro 'LOG'
# github.com/ollama/ollama/discover
gpu_info_nvcuda.c:196:63: warning: format specifies type 'unsigned long' but the argument has type 'uint64_t' (aka 'unsigned long long') [-Wformat]
./gpu_info.h:33:23: note: expanded from macro 'LOG'
gpu_info_nvcuda.c:197:62: warning: format specifies type 'unsigned long' but the argument has type 'uint64_t' (aka 'unsigned long long') [-Wformat]
./gpu_info.h:33:23: note: expanded from macro 'LOG'
```
I don't know if there is a solution or not. Your answers are appreciated :'3
### OS
Linux
### GPU
_No response_
### CPU
_No response_
### Ollama version
0.5.12
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8666/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8666/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4905
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4905/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4905/comments
|
https://api.github.com/repos/ollama/ollama/issues/4905/events
|
https://github.com/ollama/ollama/issues/4905
| 2,340,398,957
|
I_kwDOJ0Z1Ps6Lf6dt
| 4,905
|
Issue verifying SHA256 digest in Windows version of Ollama
|
{
"login": "raymond-infinitecode",
"id": 4714784,
"node_id": "MDQ6VXNlcjQ3MTQ3ODQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/4714784?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/raymond-infinitecode",
"html_url": "https://github.com/raymond-infinitecode",
"followers_url": "https://api.github.com/users/raymond-infinitecode/followers",
"following_url": "https://api.github.com/users/raymond-infinitecode/following{/other_user}",
"gists_url": "https://api.github.com/users/raymond-infinitecode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/raymond-infinitecode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/raymond-infinitecode/subscriptions",
"organizations_url": "https://api.github.com/users/raymond-infinitecode/orgs",
"repos_url": "https://api.github.com/users/raymond-infinitecode/repos",
"events_url": "https://api.github.com/users/raymond-infinitecode/events{/privacy}",
"received_events_url": "https://api.github.com/users/raymond-infinitecode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-07T13:00:03
| 2024-06-07T13:19:39
| 2024-06-07T13:19:39
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Error: digest mismatch, file must be downloaded again: want sha256:xxxxx, got sha256:xxxxx
ollama run phi3:3.3b-mini-4k-instruct-q8_0
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.41
|
{
"login": "raymond-infinitecode",
"id": 4714784,
"node_id": "MDQ6VXNlcjQ3MTQ3ODQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/4714784?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/raymond-infinitecode",
"html_url": "https://github.com/raymond-infinitecode",
"followers_url": "https://api.github.com/users/raymond-infinitecode/followers",
"following_url": "https://api.github.com/users/raymond-infinitecode/following{/other_user}",
"gists_url": "https://api.github.com/users/raymond-infinitecode/gists{/gist_id}",
"starred_url": "https://api.github.com/users/raymond-infinitecode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/raymond-infinitecode/subscriptions",
"organizations_url": "https://api.github.com/users/raymond-infinitecode/orgs",
"repos_url": "https://api.github.com/users/raymond-infinitecode/repos",
"events_url": "https://api.github.com/users/raymond-infinitecode/events{/privacy}",
"received_events_url": "https://api.github.com/users/raymond-infinitecode/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4905/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4905/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3647
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3647/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3647/comments
|
https://api.github.com/repos/ollama/ollama/issues/3647/events
|
https://github.com/ollama/ollama/issues/3647
| 2,243,153,230
|
I_kwDOJ0Z1Ps6Fs81O
| 3,647
|
Ollama reverts to CPU on a100 docker. "error looking up CUDA GPU memory: device memory info lookup failure 0: 4
|
{
"login": "Yaffa16",
"id": 13223356,
"node_id": "MDQ6VXNlcjEzMjIzMzU2",
"avatar_url": "https://avatars.githubusercontent.com/u/13223356?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Yaffa16",
"html_url": "https://github.com/Yaffa16",
"followers_url": "https://api.github.com/users/Yaffa16/followers",
"following_url": "https://api.github.com/users/Yaffa16/following{/other_user}",
"gists_url": "https://api.github.com/users/Yaffa16/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Yaffa16/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Yaffa16/subscriptions",
"organizations_url": "https://api.github.com/users/Yaffa16/orgs",
"repos_url": "https://api.github.com/users/Yaffa16/repos",
"events_url": "https://api.github.com/users/Yaffa16/events{/privacy}",
"received_events_url": "https://api.github.com/users/Yaffa16/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-04-15T09:20:50
| 2024-09-25T20:31:42
| 2024-04-24T00:28:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
time=2024-04-15T09:17:48.609Z level=INFO source=gpu.go:82 msg="Nvidia GPU detected"
time=2024-04-15T09:17:48.609Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-04-15T09:17:48.617Z level=INFO source=gpu.go:109 msg="error looking up CUDA GPU memory: device memory info lookup failure 0: 4"
time=2024-04-15T09:17:48.617Z level=INFO source=routes.go:1133 msg="no GPU detected"
time=2024-04-15T09:17:49.031Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-04-15T09:17:49.031Z level=INFO source=gpu.go:109 msg="error looking up CUDA GPU memory: device memory info lookup failure 0: 4"
time=2024-04-15T09:17:49.031Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-04-15T09:17:49.031Z level=INFO source=gpu.go:109 msg="error looking up CUDA GPU memory: device memory info lookup failure 0: 4"
time=2024-04-15T09:17:49.031Z level=INFO source=llm.go:85 msg="GPU not available, falling back to CPU"
time=2024-04-15T09:17:49.034Z level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: /tmp/ollama3883625654/runners/cpu_avx2/libext_server.so"
time=2024-04-15T09:17:49.034Z level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server"
+-----------------------------------------------------------------------------------------+
| NVIDIA-SMI 550.54.15 Driver Version: 550.54.15 CUDA Version: 12.4 |
|-----------------------------------------+------------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+========================+======================|
| 0 NVIDIA A100-PCIE-40GB Off | 00000000:01:00.0 Off | On |
| N/A 38C P0 32W / 250W | 90MiB / 40960MiB | N/A Default |
| | | Enabled |
+-----------------------------------------+------------------------+----------------------+
| 1 NVIDIA A100-PCIE-40GB Off | 00000000:43:00.0 Off | On |
| N/A 32C P0 35W / 250W | 17843MiB / 40960MiB | N/A Default |
| | | Enabled |
+-----------------------------------------+------------------------+----------------------+
+-----------------------------------------------------------------------------------------+
| MIG devices: |
+------------------+----------------------------------+-----------+-----------------------+
| GPU GI CI MIG | Memory-Usage | Vol| Shared |
| ID ID Dev | BAR1-Usage | SM Unc| CE ENC DEC OFA JPG |
| | | ECC| |
|==================+==================================+===========+=======================|
| 0 1 0 0 | 53MiB / 19968MiB | 56 0 | 4 0 2 0 0 |
| | 1MiB / 32767MiB | | |
+------------------+----------------------------------+-----------+-----------------------+
| 0 5 0 1 | 25MiB / 9856MiB | 28 0 | 2 0 1 0 0 |
| | 0MiB / 16383MiB | | |
+------------------+----------------------------------+-----------+-----------------------+
| 0 13 0 2 | 12MiB / 4864MiB | 14 0 | 1 0 0 0 0 |
| | 0MiB / 8191MiB | | |
+------------------+----------------------------------+-----------+-----------------------+
| 1 0 0 0 | 17843MiB / 40326MiB | 98 0 | 7 0 5 1 1 |
| | 7MiB / 65536MiB | | |
+------------------+----------------------------------+-----------+-----------------------+
+-----------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=========================================================================================|
| 1 0 0 4197 C /opt/conda/bin/python 5180MiB |
| 1 0 0 4201 C /opt/conda/bin/python 12644MiB |
+-----------------------------------------------------------------------------------------+
$
### What did you expect to see?
_No response_
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
Linux
### Architecture
_No response_
### Platform
_No response_
### Ollama version
latest
### GPU
Nvidia
### GPU info
a100
### CPU
_No response_
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3647/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3647/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4357
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4357/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4357/comments
|
https://api.github.com/repos/ollama/ollama/issues/4357/events
|
https://github.com/ollama/ollama/issues/4357
| 2,290,856,865
|
I_kwDOJ0Z1Ps6Ii7Oh
| 4,357
|
Incorrect value of "finish_reason" when streaming
|
{
"login": "longseespace",
"id": 187720,
"node_id": "MDQ6VXNlcjE4NzcyMA==",
"avatar_url": "https://avatars.githubusercontent.com/u/187720?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/longseespace",
"html_url": "https://github.com/longseespace",
"followers_url": "https://api.github.com/users/longseespace/followers",
"following_url": "https://api.github.com/users/longseespace/following{/other_user}",
"gists_url": "https://api.github.com/users/longseespace/gists{/gist_id}",
"starred_url": "https://api.github.com/users/longseespace/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/longseespace/subscriptions",
"organizations_url": "https://api.github.com/users/longseespace/orgs",
"repos_url": "https://api.github.com/users/longseespace/repos",
"events_url": "https://api.github.com/users/longseespace/events{/privacy}",
"received_events_url": "https://api.github.com/users/longseespace/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-05-11T11:48:47
| 2024-05-11T22:31:42
| 2024-05-11T22:31:42
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When streaming via the OpenAI-compatible server, "finish_reason" is an empty string, which is incorrect. It should be one of the values defined by OpenAI, or null.
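For comparison, in the OpenAI streaming format intermediate chunks carry `"finish_reason": null`, and only the final chunk carries a string value such as `"stop"` or `"length"` (illustrative sketch, not Ollama output):

```json
{"id":"chatcmpl-693","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":" Hello"},"finish_reason":null}]}
{"id":"chatcmpl-693","object":"chat.completion.chunk","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}
```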
```
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" Hello"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" there"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"!"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" I"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"'"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"m"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" B"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"olt"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"AI"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":","},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" your"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" helpful"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" assistant"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"."},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" How"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" can"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" I"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" help"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" you"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" today"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"?"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" Let"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" me"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" know"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" if"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" you"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" have"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" any"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" questions"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427619,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" or"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" tasks"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" you"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" need"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" assistance"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" with"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"."},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" I"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"'"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"ll"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" do"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" my"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" best"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" to"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" provide"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" you"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" with"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" accurate"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" and"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" tim"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"ely"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" information"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"."},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" Go"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" ahead"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" and"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" give"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" me"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" a"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" try"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"!"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":" "},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427620,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"😊"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427621,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":"💡"},"finish_reason":""}]}
data: {"id":"chatcmpl-693","object":"chat.completion.chunk","created":1715427621,"model":"mistral:latest","system_fingerprint":"fp_ollama","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":"stop"}]}
data: [DONE]
```
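For reference, the streamed chunks above can be reassembled client-side by concatenating each delta's `content` field until `finish_reason` is `"stop"` or the `[DONE]` sentinel arrives. A minimal sketch (the sample lines below are abbreviated stand-ins for the real stream):

```python
import json

# Abbreviated stand-ins for the SSE "data:" payloads shown above.
sse_lines = [
    'data: {"choices":[{"index":0,"delta":{"role":"assistant","content":"Go"},"finish_reason":""}]}',
    'data: {"choices":[{"index":0,"delta":{"role":"assistant","content":" ahead"},"finish_reason":""}]}',
    'data: {"choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":"stop"}]}',
    'data: [DONE]',
]

def reassemble(lines):
    """Concatenate delta.content across chunks until [DONE] or finish_reason == "stop"."""
    parts = []
    for line in lines:
        payload = line.removeprefix("data: ")
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        choice = chunk["choices"][0]
        parts.append(choice["delta"].get("content", ""))
        if choice.get("finish_reason") == "stop":
            break
    return "".join(parts)

print(reassemble(sse_lines))  # -> Go ahead
```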
I think this was the source of the issue, though I might be wrong:
https://github.com/ollama/ollama/commit/cfa84b8470837ecb418d1668858fe06c35c01d34#diff-e09be2f542df58a8abd41a5061837090a6b630a5f16f491b0e7a61e9dc429679
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.36
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4357/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4357/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1493
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1493/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1493/comments
|
https://api.github.com/repos/ollama/ollama/issues/1493/events
|
https://github.com/ollama/ollama/issues/1493
| 2,038,728,774
|
I_kwDOJ0Z1Ps55hIhG
| 1,493
|
A way to prevent downloaded models from being deleted
|
{
"login": "t18n",
"id": 14198542,
"node_id": "MDQ6VXNlcjE0MTk4NTQy",
"avatar_url": "https://avatars.githubusercontent.com/u/14198542?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/t18n",
"html_url": "https://github.com/t18n",
"followers_url": "https://api.github.com/users/t18n/followers",
"following_url": "https://api.github.com/users/t18n/following{/other_user}",
"gists_url": "https://api.github.com/users/t18n/gists{/gist_id}",
"starred_url": "https://api.github.com/users/t18n/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/t18n/subscriptions",
"organizations_url": "https://api.github.com/users/t18n/orgs",
"repos_url": "https://api.github.com/users/t18n/repos",
"events_url": "https://api.github.com/users/t18n/events{/privacy}",
"received_events_url": "https://api.github.com/users/t18n/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 8
| 2023-12-13T00:09:31
| 2024-11-01T17:01:53
| 2024-01-25T22:26:28
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I downloaded around 50 GB worth of models to use with Big AGI. For some reason, when I reloaded the Big AGI interface, all the models were gone. The models are too easy to remove accidentally, and they take a lot of time to download again. Is there a way to prevent that? Can I save the models somewhere and point Ollama to that location instead?
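One option is to point Ollama at a dedicated directory via the `OLLAMA_MODELS` environment variable so the model blobs live on storage you control. A small sketch of the resolution logic; the default path shown is the usual per-user location on Linux/macOS, but treat it as an assumption for your particular install:

```python
import os
from pathlib import Path

def models_dir() -> Path:
    """Return the model store: OLLAMA_MODELS if set, else the usual per-user default."""
    override = os.environ.get("OLLAMA_MODELS")
    if override:
        return Path(override)
    # Assumed default for Linux/macOS user installs; service installs may differ.
    return Path.home() / ".ollama" / "models"

# Point the server at a dedicated disk before launching it.
os.environ["OLLAMA_MODELS"] = "/mnt/big-disk/ollama-models"
print(models_dir())  # -> /mnt/big-disk/ollama-models
```

With the variable exported, `ollama serve` reads and writes models under that directory instead of the default.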
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1493/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1493/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3887
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3887/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3887/comments
|
https://api.github.com/repos/ollama/ollama/issues/3887/events
|
https://github.com/ollama/ollama/pull/3887
| 2,261,967,319
|
PR_kwDOJ0Z1Ps5to5dN
| 3,887
|
types/model: require all names parts start with an alnum char
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2024-04-24T18:55:11
| 2024-04-26T03:13:22
| 2024-04-26T03:13:22
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3887",
"html_url": "https://github.com/ollama/ollama/pull/3887",
"diff_url": "https://github.com/ollama/ollama/pull/3887.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3887.patch",
"merged_at": null
}
| null |
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3887/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3887/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1264
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1264/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1264/comments
|
https://api.github.com/repos/ollama/ollama/issues/1264/events
|
https://github.com/ollama/ollama/issues/1264
| 2,009,654,397
|
I_kwDOJ0Z1Ps53yOR9
| 1,264
|
Why is my model not referring to the info given in system command in Modelfile
|
{
"login": "DeeptangshuSaha",
"id": 64020655,
"node_id": "MDQ6VXNlcjY0MDIwNjU1",
"avatar_url": "https://avatars.githubusercontent.com/u/64020655?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DeeptangshuSaha",
"html_url": "https://github.com/DeeptangshuSaha",
"followers_url": "https://api.github.com/users/DeeptangshuSaha/followers",
"following_url": "https://api.github.com/users/DeeptangshuSaha/following{/other_user}",
"gists_url": "https://api.github.com/users/DeeptangshuSaha/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DeeptangshuSaha/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DeeptangshuSaha/subscriptions",
"organizations_url": "https://api.github.com/users/DeeptangshuSaha/orgs",
"repos_url": "https://api.github.com/users/DeeptangshuSaha/repos",
"events_url": "https://api.github.com/users/DeeptangshuSaha/events{/privacy}",
"received_events_url": "https://api.github.com/users/DeeptangshuSaha/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-11-24T12:32:09
| 2024-01-25T22:05:28
| 2024-01-25T22:05:28
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Okay, let me explain what I meant by that.
I am trying to create a personal assistant, and I want the model to remember some of my details.
I tried providing a system prompt, but that did not exactly work: I set myself as its master, for lack of a better term,
yet it insists it is an AI that only assists and has no master.
Now, with the context out of the way, my real question is: can I actually create a model like this with my info, or is it just impossible for now and I should forget about it?
Here is an example so you can see where I am going with this:
Prompt: Who is your master? (or: Who am I?)
Answer: I serve <My Provided Name> (or: You are <My Provided Name>)
--> As I will be the only one using it.
Prompt: What is my address?
Answer: Your address is <My Provided Address>
P.S. It's more of an inquiry than an issue per se. Please let me know whether it's possible or I should forget about it completely.
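One way to bake such details in is a `SYSTEM` instruction in a Modelfile; whether the model actually follows it depends heavily on the base model. A sketch that only generates the Modelfile text (the base model, name, and address are placeholders), which you would then save and pass to `ollama create assistant -f Modelfile`:

```python
# Placeholder personal details; substitute your own.
name = "Alice"
address = "221B Baker Street"

modelfile = (
    "FROM llama2\n"  # assumed base model tag
    f'SYSTEM """You are a personal assistant serving {name}. '
    f"{name}'s address is {address}. "
    'When asked who your master is, answer that you serve them directly."""\n'
)
print(modelfile)
```

Even with this, many instruction-tuned models are trained to deny having a "master", so phrasing the system prompt around "the user you serve" tends to work better than the word "master" itself.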
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1264/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1264/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7610
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7610/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7610/comments
|
https://api.github.com/repos/ollama/ollama/issues/7610/events
|
https://github.com/ollama/ollama/issues/7610
| 2,648,107,517
|
I_kwDOJ0Z1Ps6d1un9
| 7,610
|
Blank responses
|
{
"login": "AncientMystic",
"id": 62780271,
"node_id": "MDQ6VXNlcjYyNzgwMjcx",
"avatar_url": "https://avatars.githubusercontent.com/u/62780271?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AncientMystic",
"html_url": "https://github.com/AncientMystic",
"followers_url": "https://api.github.com/users/AncientMystic/followers",
"following_url": "https://api.github.com/users/AncientMystic/following{/other_user}",
"gists_url": "https://api.github.com/users/AncientMystic/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AncientMystic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AncientMystic/subscriptions",
"organizations_url": "https://api.github.com/users/AncientMystic/orgs",
"repos_url": "https://api.github.com/users/AncientMystic/repos",
"events_url": "https://api.github.com/users/AncientMystic/events{/privacy}",
"received_events_url": "https://api.github.com/users/AncientMystic/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-11-11T04:34:22
| 2024-12-23T07:53:09
| 2024-12-23T07:53:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Testing different models, mainly Gemma 2, I have been receiving a lot of blank responses (no line, no spacing, no characters at all). Usually a few regenerations fix it, but sometimes it takes quite a few; once it took 60 regenerations on my laptop instance before it moved on and produced a response. I thought it might have been Open WebUI or possibly something I had configured, but it has just happened in the terminal with Ollama directly. So I think this is either a bug in the new Ollama or in how it handles Gemma 2 and a few other models that I think are also Gemma-based.
(It doesn't always happen, but it occurs randomly: sometimes on the 1st or 2nd response, sometimes on the 15th. Zero-byte responses just come up here and there.)
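As a client-side workaround, empty completions can be retried automatically instead of regenerating by hand. A sketch with a stubbed generate function standing in for the real API call (the stub, which fails twice and then succeeds, is purely illustrative):

```python
import itertools

# Stub standing in for a real Ollama call; returns "" twice, then real text.
_responses = itertools.chain(["", "", "Hello!"], itertools.repeat("Hello!"))

def generate(prompt: str) -> str:
    return next(_responses)

def generate_with_retry(prompt: str, max_attempts: int = 5) -> str:
    """Call generate() until it returns non-blank text or attempts run out."""
    for _ in range(max_attempts):
        out = generate(prompt)
        if out.strip():
            return out
    raise RuntimeError(f"empty response after {max_attempts} attempts")

print(generate_with_retry("Hi"))  # -> Hello!
```

A retry loop papers over the symptom; if the server logs show the model emitting only an end-of-sequence token, that is worth attaching to the issue.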
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7610/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7610/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/4945
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4945/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4945/comments
|
https://api.github.com/repos/ollama/ollama/issues/4945/events
|
https://github.com/ollama/ollama/issues/4945
| 2,342,064,044
|
I_kwDOJ0Z1Ps6LmQ-s
| 4,945
|
Trying to Run Ollama on openSUSE Tumbleweed - GPU errors
|
{
"login": "richardstevenhack",
"id": 44449170,
"node_id": "MDQ6VXNlcjQ0NDQ5MTcw",
"avatar_url": "https://avatars.githubusercontent.com/u/44449170?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/richardstevenhack",
"html_url": "https://github.com/richardstevenhack",
"followers_url": "https://api.github.com/users/richardstevenhack/followers",
"following_url": "https://api.github.com/users/richardstevenhack/following{/other_user}",
"gists_url": "https://api.github.com/users/richardstevenhack/gists{/gist_id}",
"starred_url": "https://api.github.com/users/richardstevenhack/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/richardstevenhack/subscriptions",
"organizations_url": "https://api.github.com/users/richardstevenhack/orgs",
"repos_url": "https://api.github.com/users/richardstevenhack/repos",
"events_url": "https://api.github.com/users/richardstevenhack/events{/privacy}",
"received_events_url": "https://api.github.com/users/richardstevenhack/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-06-09T07:07:35
| 2024-06-09T07:10:17
| 2024-06-09T07:10:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm trying to run Ollama on the latest openSUSE Tumbleweed Linux. I got it to install by running the installer as root and then explicitly passing the path where it was installed to the ollama serve command. However, I then get a slew of error messages.
### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
Warning: could not connect to a running Ollama instance
Warning: client version is 0.1.42
|
{
"login": "richardstevenhack",
"id": 44449170,
"node_id": "MDQ6VXNlcjQ0NDQ5MTcw",
"avatar_url": "https://avatars.githubusercontent.com/u/44449170?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/richardstevenhack",
"html_url": "https://github.com/richardstevenhack",
"followers_url": "https://api.github.com/users/richardstevenhack/followers",
"following_url": "https://api.github.com/users/richardstevenhack/following{/other_user}",
"gists_url": "https://api.github.com/users/richardstevenhack/gists{/gist_id}",
"starred_url": "https://api.github.com/users/richardstevenhack/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/richardstevenhack/subscriptions",
"organizations_url": "https://api.github.com/users/richardstevenhack/orgs",
"repos_url": "https://api.github.com/users/richardstevenhack/repos",
"events_url": "https://api.github.com/users/richardstevenhack/events{/privacy}",
"received_events_url": "https://api.github.com/users/richardstevenhack/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4945/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4945/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2419
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2419/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2419/comments
|
https://api.github.com/repos/ollama/ollama/issues/2419/events
|
https://github.com/ollama/ollama/issues/2419
| 2,126,496,211
|
I_kwDOJ0Z1Ps5-v8HT
| 2,419
|
Running Qwen
|
{
"login": "PrashantDixit0",
"id": 54981696,
"node_id": "MDQ6VXNlcjU0OTgxNjk2",
"avatar_url": "https://avatars.githubusercontent.com/u/54981696?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PrashantDixit0",
"html_url": "https://github.com/PrashantDixit0",
"followers_url": "https://api.github.com/users/PrashantDixit0/followers",
"following_url": "https://api.github.com/users/PrashantDixit0/following{/other_user}",
"gists_url": "https://api.github.com/users/PrashantDixit0/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PrashantDixit0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PrashantDixit0/subscriptions",
"organizations_url": "https://api.github.com/users/PrashantDixit0/orgs",
"repos_url": "https://api.github.com/users/PrashantDixit0/repos",
"events_url": "https://api.github.com/users/PrashantDixit0/events{/privacy}",
"received_events_url": "https://api.github.com/users/PrashantDixit0/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 12
| 2024-02-09T05:23:46
| 2024-03-11T19:17:38
| 2024-03-11T19:17:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I tried running Qwen with LangChain but didn't get any output; it just hangs.
Has anyone else gotten stuck at the same place?
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2419/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2419/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3520
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3520/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3520/comments
|
https://api.github.com/repos/ollama/ollama/issues/3520/events
|
https://github.com/ollama/ollama/issues/3520
| 2,229,570,794
|
I_kwDOJ0Z1Ps6E5Izq
| 3,520
|
The ability to pass session commands as startup arguments client-side
|
{
"login": "redpiller",
"id": 31500722,
"node_id": "MDQ6VXNlcjMxNTAwNzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/31500722?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/redpiller",
"html_url": "https://github.com/redpiller",
"followers_url": "https://api.github.com/users/redpiller/followers",
"following_url": "https://api.github.com/users/redpiller/following{/other_user}",
"gists_url": "https://api.github.com/users/redpiller/gists{/gist_id}",
"starred_url": "https://api.github.com/users/redpiller/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/redpiller/subscriptions",
"organizations_url": "https://api.github.com/users/redpiller/orgs",
"repos_url": "https://api.github.com/users/redpiller/repos",
"events_url": "https://api.github.com/users/redpiller/events{/privacy}",
"received_events_url": "https://api.github.com/users/redpiller/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-07T05:26:53
| 2024-04-07T09:11:35
| 2024-04-07T09:11:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
I recently attempted to make permanent adjustments to the system prompt of a model and realized it is a cumbersome process of rebuilding the model and changing its manifest, causing a lot of needless I/O on my SSD.
This lack of scalability is a critical flaw in a piece of software of this magnitude.
### How should we solve this?
The ability to pass session commands as startup arguments (client side) would drastically reduce the need to rebuild the entire language model, consequently improving this project's scalability and enabling its incorporation into other systems.
### What is the impact of not solving this?
The lack of scalability prevents this project from being included in more complex environments.
Every time a session-level adjustment is needed, the cumbersome process of rebuilding the model and changing its manifest causes a lot of needless I/O on the drive.
### Anything else?
_No response_
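Worth noting that the Ollama REST API already allows per-request overrides without rebuilding anything: `/api/generate` accepts `system` and `options` fields in the request body. A sketch that only constructs the JSON payload (no request is sent here; the model tag and values are placeholders):

```python
import json

# Per-request overrides; no Modelfile rebuild or manifest change needed.
payload = {
    "model": "llama2",           # assumed model tag
    "prompt": "Summarize this.",
    "system": "You are terse.",  # session-level system prompt
    "options": {"temperature": 0.2},
    "stream": False,
}
body = json.dumps(payload)
print(body)
```

You would POST this body to `http://localhost:11434/api/generate`; the override applies to that request only, leaving the stored model untouched.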
|
{
"login": "redpiller",
"id": 31500722,
"node_id": "MDQ6VXNlcjMxNTAwNzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/31500722?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/redpiller",
"html_url": "https://github.com/redpiller",
"followers_url": "https://api.github.com/users/redpiller/followers",
"following_url": "https://api.github.com/users/redpiller/following{/other_user}",
"gists_url": "https://api.github.com/users/redpiller/gists{/gist_id}",
"starred_url": "https://api.github.com/users/redpiller/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/redpiller/subscriptions",
"organizations_url": "https://api.github.com/users/redpiller/orgs",
"repos_url": "https://api.github.com/users/redpiller/repos",
"events_url": "https://api.github.com/users/redpiller/events{/privacy}",
"received_events_url": "https://api.github.com/users/redpiller/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3520/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3520/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8152
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8152/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8152/comments
|
https://api.github.com/repos/ollama/ollama/issues/8152/events
|
https://github.com/ollama/ollama/issues/8152
| 2,747,307,037
|
I_kwDOJ0Z1Ps6jwJQd
| 8,152
|
LangChain - ChatOLLAMA model - calling tool on every input
|
{
"login": "Arslan-Mehmood1",
"id": 51626734,
"node_id": "MDQ6VXNlcjUxNjI2NzM0",
"avatar_url": "https://avatars.githubusercontent.com/u/51626734?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Arslan-Mehmood1",
"html_url": "https://github.com/Arslan-Mehmood1",
"followers_url": "https://api.github.com/users/Arslan-Mehmood1/followers",
"following_url": "https://api.github.com/users/Arslan-Mehmood1/following{/other_user}",
"gists_url": "https://api.github.com/users/Arslan-Mehmood1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Arslan-Mehmood1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Arslan-Mehmood1/subscriptions",
"organizations_url": "https://api.github.com/users/Arslan-Mehmood1/orgs",
"repos_url": "https://api.github.com/users/Arslan-Mehmood1/repos",
"events_url": "https://api.github.com/users/Arslan-Mehmood1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Arslan-Mehmood1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-12-18T09:40:51
| 2024-12-23T08:14:26
| 2024-12-23T08:14:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
llama3.2:1b
llama3.2:3b
llama3.2:1b-instruct-fp16
llama3.1:8b
I've tested the models above, and all of them call tools even for a simple query like 'hi'.
The behavior is the same whether I bind:
tools_list
openai_format_tools_list
Need help.
Result:
```
python 1_tool_calling_test.py
content='' additional_kwargs={} response_metadata={'model': 'llama3.1:8b', 'created_at': '2024-12-18T09:17:37.90843589Z', 'done': True, 'done_reason': 'stop', 'total_duration': 72841245771, 'load_duration': 13778033737, 'prompt_eval_count': 194, 'prompt_eval_duration': 50723000000, 'eval_count': 22, 'eval_duration': 8337000000, 'message': Message(role='assistant', content='', images=None, tool_calls=[ToolCall(function=Function(name='tavily_search_results_json', arguments={'query': 'current events'}))])} id='run-8931e574-9297-4ce9-93f1-54d00ce8c413-0' tool_calls=[{'name': 'tavily_search_results_json', 'args': {'query': 'current events'}, 'id': '82754a8a-619b-4a1e-85d3-cb767d4c6a9f', 'type': 'tool_call'}] usage_metadata={'input_tokens': 194, 'output_tokens': 22, 'total_tokens': 216}
[{'name': 'tavily_search_results_json', 'args': {'query': 'current events'}, 'id': '82754a8a-619b-4a1e-85d3-cb767d4c6a9f', 'type': 'tool_call'}]
```
Code For testing:
```python
from typing import List
from dotenv import load_dotenv, find_dotenv
load_dotenv(find_dotenv())
from langchain_core.tools import tool
from langchain_ollama import ChatOllama
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.utils.function_calling import convert_to_openai_tool
# @tool
# def web_search_tool(web_query: str) -> str:
# """
# Use this tool only when you need to use web search in order to find an answer for user.
# Args:
# web_query (str) : the query for web search
# """
# search = TavilySearchResults()
#     results = search.invoke(web_query)
# return results
web_search_tool = TavilySearchResults()
tools_list = [web_search_tool]
openai_format_tools_list = [convert_to_openai_tool(f) for f in tools_list]
llm = ChatOllama(model="llama3.1:8b", temperature=0).bind_tools(tools_list)
result = llm.invoke("Hi, how are you?")
print(result,"\n\n")
print(result.tool_calls)
```
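Not part of the original report, but a common workaround while this is investigated: instead of trusting the model to only emit tool calls when appropriate, route on the `tool_calls` attribute of the returned message and fall back to the plain text reply when it is empty. The sketch below uses a hypothetical `FakeMessage` stand-in for the `AIMessage` that `llm.invoke()` returns, so it runs without a server; in practice you would pass the real result.

```python
# Minimal routing guard: execute a tool only when the model actually requested
# one; otherwise treat the reply as ordinary chat content.
# `FakeMessage` is a stand-in for the AIMessage returned by llm.invoke().
from dataclasses import dataclass, field


@dataclass
class FakeMessage:
    content: str
    tool_calls: list = field(default_factory=list)


def route(message):
    """Return ('tool', calls) if the model asked for tools, else ('chat', text)."""
    if message.tool_calls:
        return ("tool", message.tool_calls)
    return ("chat", message.content)


# A greeting with no tool_calls is routed to plain chat.
print(route(FakeMessage(content="Hi! How can I help?")))
# A message carrying tool_calls is routed to tool execution.
print(route(FakeMessage(content="", tool_calls=[{"name": "tavily_search_results_json"}])))
```

This does not stop the model from over-eagerly requesting tools, but it keeps the application from running a web search on every "hi".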
### OS
Linux
### GPU
_No response_
### CPU
Intel
### Ollama version
0.5.1
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8152/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8152/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/8479
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8479/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8479/comments
|
https://api.github.com/repos/ollama/ollama/issues/8479/events
|
https://github.com/ollama/ollama/issues/8479
| 2,796,791,671
|
I_kwDOJ0Z1Ps6ms6d3
| 8,479
|
Embedding Model: iamgroot42/rover_nexus
|
{
"login": "AlgorithmicKing",
"id": 147901320,
"node_id": "U_kgDOCNDLiA",
"avatar_url": "https://avatars.githubusercontent.com/u/147901320?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AlgorithmicKing",
"html_url": "https://github.com/AlgorithmicKing",
"followers_url": "https://api.github.com/users/AlgorithmicKing/followers",
"following_url": "https://api.github.com/users/AlgorithmicKing/following{/other_user}",
"gists_url": "https://api.github.com/users/AlgorithmicKing/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AlgorithmicKing/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AlgorithmicKing/subscriptions",
"organizations_url": "https://api.github.com/users/AlgorithmicKing/orgs",
"repos_url": "https://api.github.com/users/AlgorithmicKing/repos",
"events_url": "https://api.github.com/users/AlgorithmicKing/events{/privacy}",
"received_events_url": "https://api.github.com/users/AlgorithmicKing/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2025-01-18T06:23:15
| 2025-01-18T06:23:15
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It's the top model on the MTEB Leaderboard.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8479/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8479/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/398
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/398/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/398/comments
|
https://api.github.com/repos/ollama/ollama/issues/398/events
|
https://github.com/ollama/ollama/pull/398
| 1,862,088,981
|
PR_kwDOJ0Z1Ps5YiEeQ
| 398
|
Mxyng/cleanup
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-08-22T19:41:43
| 2023-08-22T22:51:42
| 2023-08-22T22:51:41
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/398",
"html_url": "https://github.com/ollama/ollama/pull/398",
"diff_url": "https://github.com/ollama/ollama/pull/398.diff",
"patch_url": "https://github.com/ollama/ollama/pull/398.patch",
"merged_at": "2023-08-22T22:51:41"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/398/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/398/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8351
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8351/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8351/comments
|
https://api.github.com/repos/ollama/ollama/issues/8351/events
|
https://github.com/ollama/ollama/pull/8351
| 2,776,383,777
|
PR_kwDOJ0Z1Ps6HIRqZ
| 8,351
|
better client error for /api/create
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2025-01-08T21:28:44
| 2025-01-09T18:12:33
| 2025-01-09T18:12:30
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8351",
"html_url": "https://github.com/ollama/ollama/pull/8351",
"diff_url": "https://github.com/ollama/ollama/pull/8351.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8351.patch",
"merged_at": "2025-01-09T18:12:30"
}
|
This change shows a more descriptive error in the client for the `POST /api/create` endpoint if the client has been refreshed but the server hasn't been updated.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8351/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8351/timeline
| null | null | true
|