| url | repository_url | labels_url | comments_url | events_url | html_url | id | node_id | number | title | user | labels | state | locked | assignee | assignees | milestone | comments | created_at | updated_at | closed_at | author_association | sub_issues_summary | active_lock_reason | draft | pull_request | body | closed_by | reactions | timeline_url | performed_via_github_app | state_reason | is_pull_request |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/7243
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7243/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7243/comments
|
https://api.github.com/repos/ollama/ollama/issues/7243/events
|
https://github.com/ollama/ollama/issues/7243
| 2,595,080,170
|
I_kwDOJ0Z1Ps6arcfq
| 7,243
|
add module/plug-in system to ollama
|
{
"login": "malv-c",
"id": 19170213,
"node_id": "MDQ6VXNlcjE5MTcwMjEz",
"avatar_url": "https://avatars.githubusercontent.com/u/19170213?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/malv-c",
"html_url": "https://github.com/malv-c",
"followers_url": "https://api.github.com/users/malv-c/followers",
"following_url": "https://api.github.com/users/malv-c/following{/other_user}",
"gists_url": "https://api.github.com/users/malv-c/gists{/gist_id}",
"starred_url": "https://api.github.com/users/malv-c/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/malv-c/subscriptions",
"organizations_url": "https://api.github.com/users/malv-c/orgs",
"repos_url": "https://api.github.com/users/malv-c/repos",
"events_url": "https://api.github.com/users/malv-c/events{/privacy}",
"received_events_url": "https://api.github.com/users/malv-c/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2024-10-17T15:31:56
| 2024-10-17T15:31:56
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Pipeline: LLM > Go code > Ollama add-on.
With a plug-in system there would be no need to recompile Ollama, allowing fast evolution without extra work for the maintainers...
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7243/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7243/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/986
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/986/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/986/comments
|
https://api.github.com/repos/ollama/ollama/issues/986/events
|
https://github.com/ollama/ollama/issues/986
| 1,976,025,776
|
I_kwDOJ0Z1Ps51x8Kw
| 986
|
Uninstall Ollama from Linux
|
{
"login": "sheecegardezi",
"id": 7543039,
"node_id": "MDQ6VXNlcjc1NDMwMzk=",
"avatar_url": "https://avatars.githubusercontent.com/u/7543039?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sheecegardezi",
"html_url": "https://github.com/sheecegardezi",
"followers_url": "https://api.github.com/users/sheecegardezi/followers",
"following_url": "https://api.github.com/users/sheecegardezi/following{/other_user}",
"gists_url": "https://api.github.com/users/sheecegardezi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sheecegardezi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sheecegardezi/subscriptions",
"organizations_url": "https://api.github.com/users/sheecegardezi/orgs",
"repos_url": "https://api.github.com/users/sheecegardezi/repos",
"events_url": "https://api.github.com/users/sheecegardezi/events{/privacy}",
"received_events_url": "https://api.github.com/users/sheecegardezi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 11
| 2023-11-03T11:54:38
| 2025-01-24T13:49:35
| 2023-11-03T14:52:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Here's a general guideline on how to uninstall it:
1. Delete the Ollama binary:
Use the `rm` command to remove the Ollama binary. For example:
```
sudo rm /usr/local/bin/ollama
```
2. If the script created a systemd service for Ollama, stop, disable, and remove it using the following commands:
```
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
sudo systemctl daemon-reload
```
3. Remove any created user and group (if applicable):
The script might have created a user and group named "ollama." You can remove them using the following commands:
```
sudo userdel ollama
sudo groupdel ollama
```
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/986/reactions",
"total_count": 53,
"+1": 52,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/986/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4336
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4336/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4336/comments
|
https://api.github.com/repos/ollama/ollama/issues/4336/events
|
https://github.com/ollama/ollama/issues/4336
| 2,290,623,441
|
I_kwDOJ0Z1Ps6IiCPR
| 4,336
|
improve search engine of ollama.com
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6573197867,
"node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw",
"url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com",
"name": "ollama.com",
"color": "ffffff",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-05-11T03:08:55
| 2024-10-28T21:43:11
| 2024-10-28T21:43:10
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
<img width="1217" alt="截屏2024-05-11 10 52 49" src="https://github.com/ollama/ollama/assets/146583103/422e0577-4721-4fa8-b063-ad248fe933a2">
Please improve the search engine so that it shows relevant results.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
1.35
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4336/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4336/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5846
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5846/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5846/comments
|
https://api.github.com/repos/ollama/ollama/issues/5846/events
|
https://github.com/ollama/ollama/issues/5846
| 2,422,356,664
|
I_kwDOJ0Z1Ps6QYjq4
| 5,846
|
[Very minor] When eGPU gets disconnected, Ollama falls back to using CPU and doesn't detect when eGPU is connected again.
|
{
"login": "kosmallafilip",
"id": 80066643,
"node_id": "MDQ6VXNlcjgwMDY2NjQz",
"avatar_url": "https://avatars.githubusercontent.com/u/80066643?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kosmallafilip",
"html_url": "https://github.com/kosmallafilip",
"followers_url": "https://api.github.com/users/kosmallafilip/followers",
"following_url": "https://api.github.com/users/kosmallafilip/following{/other_user}",
"gists_url": "https://api.github.com/users/kosmallafilip/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kosmallafilip/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kosmallafilip/subscriptions",
"organizations_url": "https://api.github.com/users/kosmallafilip/orgs",
"repos_url": "https://api.github.com/users/kosmallafilip/repos",
"events_url": "https://api.github.com/users/kosmallafilip/events{/privacy}",
"received_events_url": "https://api.github.com/users/kosmallafilip/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-22T09:10:34
| 2024-07-23T17:57:44
| 2024-07-23T17:57:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
This is quite a big issue, but affects probably less than 0.01% of users, so I am tagging this as minor.
I am using Ollama locally on Asus ROG Flow Z13 with XG Mobile eGPU - this is Asus's proprietary eGPU technology, it's quite different than connecting with Thunderbolt, so I can't really tell if it's going to affect other types of eGPUs as well.
If, for any reason, the eGPU disconnects, Ollama falls back to using a CPU for generating messages, which is great, but when I connect it back I'd expect Ollama to detect a new GPU being connected and use it instead. This doesn't happen and I was left wondering why Ollama uses so much CPU and lags the whole system.
Due to how I use Ollama, I am still on 0.1.45, but let me know if you want me to test this on the latest release.
I don't expect this to get fixed anytime soon, but I wanted to let you know it happens.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.45
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5846/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5846/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1880
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1880/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1880/comments
|
https://api.github.com/repos/ollama/ollama/issues/1880/events
|
https://github.com/ollama/ollama/pull/1880
| 2,073,336,091
|
PR_kwDOJ0Z1Ps5joTw2
| 1,880
|
improve cuda detection (rel. issue 1704)
|
{
"login": "fpreiss",
"id": 17441607,
"node_id": "MDQ6VXNlcjE3NDQxNjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/17441607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fpreiss",
"html_url": "https://github.com/fpreiss",
"followers_url": "https://api.github.com/users/fpreiss/followers",
"following_url": "https://api.github.com/users/fpreiss/following{/other_user}",
"gists_url": "https://api.github.com/users/fpreiss/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fpreiss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fpreiss/subscriptions",
"organizations_url": "https://api.github.com/users/fpreiss/orgs",
"repos_url": "https://api.github.com/users/fpreiss/repos",
"events_url": "https://api.github.com/users/fpreiss/events{/privacy}",
"received_events_url": "https://api.github.com/users/fpreiss/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-09T23:34:20
| 2024-01-12T21:13:09
| 2024-01-12T21:13:09
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1880",
"html_url": "https://github.com/ollama/ollama/pull/1880",
"diff_url": "https://github.com/ollama/ollama/pull/1880.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1880.patch",
"merged_at": null
}
|
Improve the CUDACXX and CUDA_LIB_DIR variable lookup in gen_linux.sh
|
{
"login": "fpreiss",
"id": 17441607,
"node_id": "MDQ6VXNlcjE3NDQxNjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/17441607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fpreiss",
"html_url": "https://github.com/fpreiss",
"followers_url": "https://api.github.com/users/fpreiss/followers",
"following_url": "https://api.github.com/users/fpreiss/following{/other_user}",
"gists_url": "https://api.github.com/users/fpreiss/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fpreiss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fpreiss/subscriptions",
"organizations_url": "https://api.github.com/users/fpreiss/orgs",
"repos_url": "https://api.github.com/users/fpreiss/repos",
"events_url": "https://api.github.com/users/fpreiss/events{/privacy}",
"received_events_url": "https://api.github.com/users/fpreiss/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1880/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1880/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2556
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2556/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2556/comments
|
https://api.github.com/repos/ollama/ollama/issues/2556/events
|
https://github.com/ollama/ollama/issues/2556
| 2,139,859,947
|
I_kwDOJ0Z1Ps5_i6vr
| 2,556
|
Phi-2-X
|
{
"login": "axrwl",
"id": 22836245,
"node_id": "MDQ6VXNlcjIyODM2MjQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/22836245?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/axrwl",
"html_url": "https://github.com/axrwl",
"followers_url": "https://api.github.com/users/axrwl/followers",
"following_url": "https://api.github.com/users/axrwl/following{/other_user}",
"gists_url": "https://api.github.com/users/axrwl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/axrwl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/axrwl/subscriptions",
"organizations_url": "https://api.github.com/users/axrwl/orgs",
"repos_url": "https://api.github.com/users/axrwl/repos",
"events_url": "https://api.github.com/users/axrwl/events{/privacy}",
"received_events_url": "https://api.github.com/users/axrwl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-02-17T06:35:37
| 2024-02-20T03:40:57
| 2024-02-20T03:40:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/axra/phi-2-x-0.1
A very high-performing finetune of phi-2.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2556/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2556/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8369
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8369/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8369/comments
|
https://api.github.com/repos/ollama/ollama/issues/8369/events
|
https://github.com/ollama/ollama/issues/8369
| 2,779,190,195
|
I_kwDOJ0Z1Ps6lpxOz
| 8,369
|
type:file_url not supported?
|
{
"login": "SDAIer",
"id": 174102361,
"node_id": "U_kgDOCmCXWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/174102361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SDAIer",
"html_url": "https://github.com/SDAIer",
"followers_url": "https://api.github.com/users/SDAIer/followers",
"following_url": "https://api.github.com/users/SDAIer/following{/other_user}",
"gists_url": "https://api.github.com/users/SDAIer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SDAIer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SDAIer/subscriptions",
"organizations_url": "https://api.github.com/users/SDAIer/orgs",
"repos_url": "https://api.github.com/users/SDAIer/repos",
"events_url": "https://api.github.com/users/SDAIer/events{/privacy}",
"received_events_url": "https://api.github.com/users/SDAIer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2025-01-10T03:24:59
| 2025-01-10T04:53:35
| 2025-01-10T04:53:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Thanks for the help. I want to know whether **"type": "file_url"** is supported; if not, how can I handle the file?
```
messages=[
{
"role": "user",
"content": [
{
"type": "file_url",
"name": "2222.docx",
"file_url": "https://xxx.com:4444/test/2222.docx"
},
{
"type": "text",
"text": "summarize the file "
}
]
}
]
```
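Since a `file_url` content part like the one above is not accepted, a common workaround is to fetch the document client-side, extract its text, and inline that text into an ordinary text message. A minimal sketch of building such a payload (the extraction step is an illustrative assumption, e.g. via a library like python-docx, and is not part of Ollama's API):

```python
def build_messages(file_name: str, file_text: str, instruction: str) -> list:
    """Inline the document's extracted text instead of a file_url part."""
    return [
        {
            "role": "user",
            "content": f"{instruction}\n\n--- {file_name} ---\n{file_text}",
        }
    ]

# Fetching/extraction would happen first, for example:
#   raw = urllib.request.urlopen("https://xxx.com:4444/test/2222.docx").read()
#   file_text = extract_docx_text(raw)  # hypothetical helper
messages = build_messages("2222.docx", "(extracted text)", "summarize the file")
```

The resulting `messages` list can then be sent to the chat endpoint as plain text content.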
|
{
"login": "SDAIer",
"id": 174102361,
"node_id": "U_kgDOCmCXWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/174102361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SDAIer",
"html_url": "https://github.com/SDAIer",
"followers_url": "https://api.github.com/users/SDAIer/followers",
"following_url": "https://api.github.com/users/SDAIer/following{/other_user}",
"gists_url": "https://api.github.com/users/SDAIer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SDAIer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SDAIer/subscriptions",
"organizations_url": "https://api.github.com/users/SDAIer/orgs",
"repos_url": "https://api.github.com/users/SDAIer/repos",
"events_url": "https://api.github.com/users/SDAIer/events{/privacy}",
"received_events_url": "https://api.github.com/users/SDAIer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8369/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8369/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2232
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2232/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2232/comments
|
https://api.github.com/repos/ollama/ollama/issues/2232/events
|
https://github.com/ollama/ollama/issues/2232
| 2,103,661,792
|
I_kwDOJ0Z1Ps59Y1Tg
| 2,232
|
Error: error loading model /root/.ollama/models/blobs when I use any other directory for the volume.
|
{
"login": "jds11111",
"id": 15203860,
"node_id": "MDQ6VXNlcjE1MjAzODYw",
"avatar_url": "https://avatars.githubusercontent.com/u/15203860?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jds11111",
"html_url": "https://github.com/jds11111",
"followers_url": "https://api.github.com/users/jds11111/followers",
"following_url": "https://api.github.com/users/jds11111/following{/other_user}",
"gists_url": "https://api.github.com/users/jds11111/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jds11111/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jds11111/subscriptions",
"organizations_url": "https://api.github.com/users/jds11111/orgs",
"repos_url": "https://api.github.com/users/jds11111/repos",
"events_url": "https://api.github.com/users/jds11111/events{/privacy}",
"received_events_url": "https://api.github.com/users/jds11111/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-01-27T16:28:18
| 2024-01-29T16:23:45
| 2024-01-29T16:23:44
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am using docker-compose to create both the server and webgui of ollama. These compile fine. Going to either port shows that both are running. There are no errors in the logs. However, if I use exec in the container, or go to a console in portainer, there are problems loading any model (I tried openchat and llama2). It downloads the files fine, but then ends with the above error statement:
```
root@c19165979f14:~/.ollama# ollama run openchat
pulling manifest
pulling 1cecc26325a1... 100% ▕█████████████████████████████████████████████████████████████████████████████████████████████████▏ 4.1 GB
pulling 43070e2d4e53... 100% ▕█████████████████████████████████████████████████████████████████████████████████████████████████▏ 11 KB
pulling d68706c17530... 100% ▕█████████████████████████████████████████████████████████████████████████████████████████████████▏ 98 B
pulling 415f0f6b43dd... 100% ▕█████████████████████████████████████████████████████████████████████████████████████████████████▏ 65 B
pulling 278996753456... 100% ▕█████████████████████████████████████████████████████████████████████████████████████████████████▏ 483 B
verifying sha256 digest
writing manifest
removing any unused layers
success
Error: error loading model /root/.ollama/models/blobs/sha256:1cecc26325a197571a1961bfacf64dc6e35e0f05faf57d3c6941a982e1eb2e1d
```
The problem seems to stem from trying to use a chosen directory to store all the data, instead of using `/data/compose`. If I use
```
version: '3.8'
services:
ollama-server:
image: ollama/ollama
container_name: ollama-server
volumes:
- ./ollama:/root/.ollama
restart: unless-stopped
ports:
- '11434:11434'
```
then I can load models fine. However, if I change that volume to something else, it fails. Changing permissions on that volume does not help.
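When a bind-mounted host directory fails like this, a Docker-managed named volume often sidesteps host-path ownership and permission problems. A sketch of that variant, assuming the same image and ports (not verified against this particular setup):

```
version: '3.8'
services:
  ollama-server:
    image: ollama/ollama
    container_name: ollama-server
    volumes:
      - ollama-data:/root/.ollama   # named volume managed by Docker
    restart: unless-stopped
    ports:
      - '11434:11434'

volumes:
  ollama-data: {}
```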
|
{
"login": "jds11111",
"id": 15203860,
"node_id": "MDQ6VXNlcjE1MjAzODYw",
"avatar_url": "https://avatars.githubusercontent.com/u/15203860?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jds11111",
"html_url": "https://github.com/jds11111",
"followers_url": "https://api.github.com/users/jds11111/followers",
"following_url": "https://api.github.com/users/jds11111/following{/other_user}",
"gists_url": "https://api.github.com/users/jds11111/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jds11111/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jds11111/subscriptions",
"organizations_url": "https://api.github.com/users/jds11111/orgs",
"repos_url": "https://api.github.com/users/jds11111/repos",
"events_url": "https://api.github.com/users/jds11111/events{/privacy}",
"received_events_url": "https://api.github.com/users/jds11111/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2232/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2232/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5527
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5527/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5527/comments
|
https://api.github.com/repos/ollama/ollama/issues/5527/events
|
https://github.com/ollama/ollama/pull/5527
| 2,394,007,683
|
PR_kwDOJ0Z1Ps50nFkp
| 5,527
|
Add Environment Variable For Row Split
|
{
"login": "datacrystals",
"id": 49183477,
"node_id": "MDQ6VXNlcjQ5MTgzNDc3",
"avatar_url": "https://avatars.githubusercontent.com/u/49183477?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/datacrystals",
"html_url": "https://github.com/datacrystals",
"followers_url": "https://api.github.com/users/datacrystals/followers",
"following_url": "https://api.github.com/users/datacrystals/following{/other_user}",
"gists_url": "https://api.github.com/users/datacrystals/gists{/gist_id}",
"starred_url": "https://api.github.com/users/datacrystals/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/datacrystals/subscriptions",
"organizations_url": "https://api.github.com/users/datacrystals/orgs",
"repos_url": "https://api.github.com/users/datacrystals/repos",
"events_url": "https://api.github.com/users/datacrystals/events{/privacy}",
"received_events_url": "https://api.github.com/users/datacrystals/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5808482718,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng",
"url": "https://api.github.com/repos/ollama/ollama/labels/performance",
"name": "performance",
"color": "A5B5C6",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 7
| 2024-07-07T11:16:32
| 2024-10-20T03:46:09
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5527",
"html_url": "https://github.com/ollama/ollama/pull/5527",
"diff_url": "https://github.com/ollama/ollama/pull/5527.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5527.patch",
"merged_at": null
}
|
As discussed in #5458, there does not appear to be a way to enable row splitting rather than layer splitting, which on older multi-GPU setups seems to yield a 40-70% performance improvement.
I tested this on 3x P40 24 GB GPUs running Ubuntu Server 22.04 and observed about a 70% improvement in throughput, though it does seem to vary from model to model.
I've added this with a simple change: an environment variable which, when set to 1, enables row splitting rather than layer splitting. Please let me know if any edits are needed; I'd be happy to make any changes.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5527/reactions",
"total_count": 5,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/5527/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5343
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5343/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5343/comments
|
https://api.github.com/repos/ollama/ollama/issues/5343/events
|
https://github.com/ollama/ollama/issues/5343
| 2,379,120,272
|
I_kwDOJ0Z1Ps6Nzn6Q
| 5,343
|
Add mobillama
|
{
"login": "mak448a",
"id": 94062293,
"node_id": "U_kgDOBZtG1Q",
"avatar_url": "https://avatars.githubusercontent.com/u/94062293?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mak448a",
"html_url": "https://github.com/mak448a",
"followers_url": "https://api.github.com/users/mak448a/followers",
"following_url": "https://api.github.com/users/mak448a/following{/other_user}",
"gists_url": "https://api.github.com/users/mak448a/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mak448a/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mak448a/subscriptions",
"organizations_url": "https://api.github.com/users/mak448a/orgs",
"repos_url": "https://api.github.com/users/mak448a/repos",
"events_url": "https://api.github.com/users/mak448a/events{/privacy}",
"received_events_url": "https://api.github.com/users/mak448a/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-06-27T21:30:22
| 2024-06-27T21:30:22
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Could you add MobiLlama?
https://github.com/mbzuai-oryx/MobiLlama
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5343/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5343/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/920
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/920/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/920/comments
|
https://api.github.com/repos/ollama/ollama/issues/920/events
|
https://github.com/ollama/ollama/issues/920
| 1,964,218,737
|
I_kwDOJ0Z1Ps51E5lx
| 920
|
[Starcoder:7b] Not using CUDA
|
{
"login": "ManuLinares",
"id": 3720020,
"node_id": "MDQ6VXNlcjM3MjAwMjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/3720020?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ManuLinares",
"html_url": "https://github.com/ManuLinares",
"followers_url": "https://api.github.com/users/ManuLinares/followers",
"following_url": "https://api.github.com/users/ManuLinares/following{/other_user}",
"gists_url": "https://api.github.com/users/ManuLinares/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ManuLinares/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ManuLinares/subscriptions",
"organizations_url": "https://api.github.com/users/ManuLinares/orgs",
"repos_url": "https://api.github.com/users/ManuLinares/repos",
"events_url": "https://api.github.com/users/ManuLinares/events{/privacy}",
"received_events_url": "https://api.github.com/users/ManuLinares/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2023-10-26T19:15:54
| 2023-11-21T20:32:55
| 2023-11-21T20:31:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
`# ollama run starcoder:7b prompt`
```
...
llm_load_tensors: VRAM used: 3968.42 MB
..GGML_ASSERT: /build/ollama/src/ollama-cuda/llm/llama.cpp/gguf/ggml-cuda.cu:6115: false
2023/10/26 16:09:13 llama.go:378: signal: aborted (core dumped)
2023/10/26 16:09:13 llama.go:386: error starting llama runner: llama runner process has terminated
2023/10/26 16:09:13 llama.go:452: llama runner stopped successfully
2023/10/26 16:09:13 llama.go:363: starting llama runner
2023/10/26 16:09:13 llama.go:421: waiting for llama runner to start responding
{"timestamp":1698347353,"level":"WARNING","function":"server_params_parse","line":871,"message":"Not compiled with GPU offload support, --n-gpu-layers option will be ignored. See main README.md for information on enabling GPU BLAS support","n_gpu_layers":-1}
...
```
Other models work fine:
example with **ollama run mistral:instruct**:
```
...
2023/10/26 16:12:55 llama.go:252: 7456 MB VRAM available, loading up to 55 GPU layers
2023/10/26 16:12:55 llama.go:363: starting llama runner
2023/10/26 16:12:55 llama.go:421: waiting for llama runner to start responding
ggml_init_cublas: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 3070, compute capability 8.6
...
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/920/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/920/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/94
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/94/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/94/comments
|
https://api.github.com/repos/ollama/ollama/issues/94/events
|
https://github.com/ollama/ollama/pull/94
| 1,808,815,249
|
PR_kwDOJ0Z1Ps5Vufup
| 94
|
stop condition
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-07-18T00:09:42
| 2023-08-18T00:13:34
| 2023-07-28T15:27:05
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/94",
"html_url": "https://github.com/ollama/ollama/pull/94",
"diff_url": "https://github.com/ollama/ollama/pull/94.diff",
"patch_url": "https://github.com/ollama/ollama/pull/94.patch",
"merged_at": null
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/94/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/94/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7007
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7007/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7007/comments
|
https://api.github.com/repos/ollama/ollama/issues/7007/events
|
https://github.com/ollama/ollama/issues/7007
| 2,553,623,580
|
I_kwDOJ0Z1Ps6YNTQc
| 7,007
|
llama3.2 can't tell where it's installed or that it exists
|
{
"login": "superbonaci",
"id": 8339596,
"node_id": "MDQ6VXNlcjgzMzk1OTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/8339596?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/superbonaci",
"html_url": "https://github.com/superbonaci",
"followers_url": "https://api.github.com/users/superbonaci/followers",
"following_url": "https://api.github.com/users/superbonaci/following{/other_user}",
"gists_url": "https://api.github.com/users/superbonaci/gists{/gist_id}",
"starred_url": "https://api.github.com/users/superbonaci/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/superbonaci/subscriptions",
"organizations_url": "https://api.github.com/users/superbonaci/orgs",
"repos_url": "https://api.github.com/users/superbonaci/repos",
"events_url": "https://api.github.com/users/superbonaci/events{/privacy}",
"received_events_url": "https://api.github.com/users/superbonaci/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 11
| 2024-09-27T19:43:51
| 2024-10-01T23:55:32
| 2024-10-01T23:55:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Installed as website claims:
```
ollama run llama3.2
```
Can't even help with basic installation steps or where local files are located:
```
$ ollama run llama3.2
pulling manifest
pulling dde5aa3fc5ff... 100% ▕██▏ 2.0 GB
pulling 966de95ca8a6... 100% ▕██▏ 1.4 KB
pulling fcc5a6bec9da... 100% ▕██▏ 7.7 KB
pulling a70ff7e570d9... 100% ▕██▏ 6.0 KB
pulling 56bb8bd477a5... 100% ▕██▏ 96 B
pulling 34bb5ab01051... 100% ▕██▏ 561 B
verifying sha256 digest
writing manifest
success
>>> Which operating system is this one where you are installed?
I don't have a physical installation on an operating system. I exist as a cloud-based language model, which means I'm a software application that runs on remote servers and can be accessed through the internet.
When you interact with me, you're essentially talking to a computer program that's running on those servers, not on your local device. This allows me to provide information and answer questions for multiple users simultaneously, without requiring a physical installation of an operating system.
>>> I ran the command 'ollama run llama3.2', so now there must some local files in my computer. Which computer is it and where are those files located?
I'm not aware of any specific operating system or software that has a "run" command with an option like "llama3.2". It's possible that you may have installed a custom application or tool, but without more information, I can only speculate.
However, I do know that there is a popular language model called LLaMA ( Large Language Model Application) developed by Meta AI. This model has been released as an open-source project, and it's possible that you may be referring to this.
If you've installed the LLaMA model on your computer, it's likely stored in a specific directory, such as `/usr/local/lib` or `~/.local/lib`, depending on your operating system and installation method. However, I don't have enough information to pinpoint the exact location of those files without knowing more
about your system configuration.
Can you tell me more about how you installed the LLaMA model? Which operating system are you using (Windows, macOS, Linux)?
>>> Where are the local files installed in macOS, for llama3.2?
I couldn't find any information on a specific version of LLaMA called "llama3.2". The LLaMA model is typically referred to by its version number, which is usually provided as a numerical string (e.g., `v1.1`, `v2.0`, etc.).
However, if you're looking for the installation location of the LLaMA model on macOS, it's likely stored in one of the following directories:
* `/usr/local/lib`: This directory contains system-wide libraries and binaries.
* `~/Library/Developer/Xcode/Versions/Applications/llama3.2.app/Contents/Resources/`: If you installed the app through Xcode or a similar development environment.
If you installed LLaMA using pip (Python package manager) or another method, it's likely stored in a directory like:
* `/usr/local/lib/pythonX.X/site-packages/llama3.2/` (for Python installations)
To find the exact location of the files on your system, I recommend searching for `llama3.2` within these directories using Spotlight (on macOS) or the command line.
>>> /bye
```
How can this model be so bad? How is it even supposed to help me?
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
3.2
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7007/reactions",
"total_count": 7,
"+1": 0,
"-1": 4,
"laugh": 3,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7007/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6844
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6844/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6844/comments
|
https://api.github.com/repos/ollama/ollama/issues/6844/events
|
https://github.com/ollama/ollama/pull/6844
| 2,532,167,023
|
PR_kwDOJ0Z1Ps570TMf
| 6,844
|
runner.go: Don't panic when processing sequences
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-09-17T21:10:55
| 2024-09-23T19:13:04
| 2024-09-23T19:12:53
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6844",
"html_url": "https://github.com/ollama/ollama/pull/6844",
"diff_url": "https://github.com/ollama/ollama/pull/6844.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6844.patch",
"merged_at": null
}
|
If there is an error processing a sequence, we should simply return
an HTTP error and abort that sequence rather than panicking the whole
runner. This makes us more resilient to transient failures.
Panics can still occur during startup, as there is no way to serve
requests if startup fails.
Based on some code that was originally part of the vision work.
Co-authored-by: jmorganca <jmorganca@gmail.com>
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6844/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6844/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4930
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4930/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4930/comments
|
https://api.github.com/repos/ollama/ollama/issues/4930/events
|
https://github.com/ollama/ollama/issues/4930
| 2,341,631,707
|
I_kwDOJ0Z1Ps6Lknbb
| 4,930
|
Application Failure on Latest MacBook M1, Runs Smoothly on Intel Mac
|
{
"login": "hylexus",
"id": 13914832,
"node_id": "MDQ6VXNlcjEzOTE0ODMy",
"avatar_url": "https://avatars.githubusercontent.com/u/13914832?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hylexus",
"html_url": "https://github.com/hylexus",
"followers_url": "https://api.github.com/users/hylexus/followers",
"following_url": "https://api.github.com/users/hylexus/following{/other_user}",
"gists_url": "https://api.github.com/users/hylexus/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hylexus/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hylexus/subscriptions",
"organizations_url": "https://api.github.com/users/hylexus/orgs",
"repos_url": "https://api.github.com/users/hylexus/repos",
"events_url": "https://api.github.com/users/hylexus/events{/privacy}",
"received_events_url": "https://api.github.com/users/hylexus/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-06-08T12:19:25
| 2024-06-08T21:58:16
| 2024-06-08T21:58:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
ollama run qwen2
Error: llama runner process has terminated: signal: abort trap error:error loading model vocabulary: unknown pre-tokenizer type: 'qwen2'
```
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.1.37
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4930/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4930/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2028
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2028/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2028/comments
|
https://api.github.com/repos/ollama/ollama/issues/2028/events
|
https://github.com/ollama/ollama/issues/2028
| 2,085,766,643
|
I_kwDOJ0Z1Ps58UkXz
| 2,028
|
how to remove ollama from macos?
|
{
"login": "artnoimann",
"id": 30262687,
"node_id": "MDQ6VXNlcjMwMjYyNjg3",
"avatar_url": "https://avatars.githubusercontent.com/u/30262687?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/artnoimann",
"html_url": "https://github.com/artnoimann",
"followers_url": "https://api.github.com/users/artnoimann/followers",
"following_url": "https://api.github.com/users/artnoimann/following{/other_user}",
"gists_url": "https://api.github.com/users/artnoimann/gists{/gist_id}",
"starred_url": "https://api.github.com/users/artnoimann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/artnoimann/subscriptions",
"organizations_url": "https://api.github.com/users/artnoimann/orgs",
"repos_url": "https://api.github.com/users/artnoimann/repos",
"events_url": "https://api.github.com/users/artnoimann/events{/privacy}",
"received_events_url": "https://api.github.com/users/artnoimann/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 13
| 2024-01-17T09:27:03
| 2025-01-30T04:41:42
| 2024-01-22T17:51:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When deleting the application, the error "ollama is still running" is displayed.
If you terminate the processes from Activity Monitor, they restart immediately, and the application itself cannot be deleted either.
|
{
"login": "artnoimann",
"id": 30262687,
"node_id": "MDQ6VXNlcjMwMjYyNjg3",
"avatar_url": "https://avatars.githubusercontent.com/u/30262687?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/artnoimann",
"html_url": "https://github.com/artnoimann",
"followers_url": "https://api.github.com/users/artnoimann/followers",
"following_url": "https://api.github.com/users/artnoimann/following{/other_user}",
"gists_url": "https://api.github.com/users/artnoimann/gists{/gist_id}",
"starred_url": "https://api.github.com/users/artnoimann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/artnoimann/subscriptions",
"organizations_url": "https://api.github.com/users/artnoimann/orgs",
"repos_url": "https://api.github.com/users/artnoimann/repos",
"events_url": "https://api.github.com/users/artnoimann/events{/privacy}",
"received_events_url": "https://api.github.com/users/artnoimann/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2028/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2028/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4318
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4318/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4318/comments
|
https://api.github.com/repos/ollama/ollama/issues/4318/events
|
https://github.com/ollama/ollama/issues/4318
| 2,290,200,694
|
I_kwDOJ0Z1Ps6IgbB2
| 4,318
|
Can this set the number of cores on the cpu?
|
{
"login": "ghost-him",
"id": 70733461,
"node_id": "MDQ6VXNlcjcwNzMzNDYx",
"avatar_url": "https://avatars.githubusercontent.com/u/70733461?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ghost-him",
"html_url": "https://github.com/ghost-him",
"followers_url": "https://api.github.com/users/ghost-him/followers",
"following_url": "https://api.github.com/users/ghost-him/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost-him/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ghost-him/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost-him/subscriptions",
"organizations_url": "https://api.github.com/users/ghost-him/orgs",
"repos_url": "https://api.github.com/users/ghost-him/repos",
"events_url": "https://api.github.com/users/ghost-him/events{/privacy}",
"received_events_url": "https://api.github.com/users/ghost-him/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-05-10T18:21:28
| 2024-05-14T03:36:20
| 2024-05-14T03:36:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
My server has 24 cores, but each run only uses a portion of them. If I could use all of these cores, it would run much more efficiently!
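For reference, Ollama's Modelfile parameter list includes a `num_thread` option for controlling computation threads, which can also be passed per request in the API's `options` field (model name below is just an example):

```json
{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "options": { "num_thread": 24 }
}
```

Note that raising the thread count beyond the number of physical cores does not necessarily improve throughput for memory-bound inference.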
|
{
"login": "ghost-him",
"id": 70733461,
"node_id": "MDQ6VXNlcjcwNzMzNDYx",
"avatar_url": "https://avatars.githubusercontent.com/u/70733461?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ghost-him",
"html_url": "https://github.com/ghost-him",
"followers_url": "https://api.github.com/users/ghost-him/followers",
"following_url": "https://api.github.com/users/ghost-him/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost-him/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ghost-him/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost-him/subscriptions",
"organizations_url": "https://api.github.com/users/ghost-him/orgs",
"repos_url": "https://api.github.com/users/ghost-him/repos",
"events_url": "https://api.github.com/users/ghost-him/events{/privacy}",
"received_events_url": "https://api.github.com/users/ghost-him/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4318/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4318/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1201
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1201/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1201/comments
|
https://api.github.com/repos/ollama/ollama/issues/1201/events
|
https://github.com/ollama/ollama/issues/1201
| 2,001,162,928
|
I_kwDOJ0Z1Ps53R1Kw
| 1,201
|
Issues with files written to /tmp
|
{
"login": "jmont-dev",
"id": 17934059,
"node_id": "MDQ6VXNlcjE3OTM0MDU5",
"avatar_url": "https://avatars.githubusercontent.com/u/17934059?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmont-dev",
"html_url": "https://github.com/jmont-dev",
"followers_url": "https://api.github.com/users/jmont-dev/followers",
"following_url": "https://api.github.com/users/jmont-dev/following{/other_user}",
"gists_url": "https://api.github.com/users/jmont-dev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmont-dev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmont-dev/subscriptions",
"organizations_url": "https://api.github.com/users/jmont-dev/orgs",
"repos_url": "https://api.github.com/users/jmont-dev/repos",
"events_url": "https://api.github.com/users/jmont-dev/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmont-dev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-11-20T00:45:21
| 2023-12-21T15:15:10
| 2023-11-20T04:30:31
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I noticed that when creating modelfiles, ollama will write very large files to /tmp named similarly to `/tmp/sha256:73bcb64521284e790e63aa8a017669fa5fa40448bc4895c500d7529e374aad7b2640781174`
This is causing a few issues. First, /tmp is frequently part of a separate logical volume than /home/ and other partitions. In my case, /tmp is much smaller than my other filesystems (about 20G of space, and cannot be expanded) which is making generation of larger models fail due to filling up all of the disk space in that partition (an example is Goliath, which even as a Q_2 takes up 47GB of space).
Secondly, even if I create smaller modelfiles, ollama appears to be doing some kind of link or copy operation that fails because the files are on different filesystems. When I attempt to create a model from a 7B-parameter base GGUF I receive the following error:
transferring context Error: rename /tmp/sha256:73bcb64521284e790e63aa8a017669fa5fa40448bc4895c500d7529e374aad7b2640781174 /home/jmont/.ollama/models/blobs/sha256:73bcb64521284e790e63aa8a017669fa5fa40448bc4895c500d7529e374aad7b: invalid cross-device link
A quick Google search associates that error with creating hard links across different filesystems, though I'm not sure what ollama is doing under the hood.
These were all models that were working just fine in the last version of ollama that I had before updating. Is there any way to change this behavior so that these files can be written to a different location?
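One possible workaround sketch, assuming ollama honors the standard `TMPDIR` environment variable for its intermediate blob writes (the directory path here is illustrative):

```shell
# Point temporary blob writes at a filesystem with room to spare,
# then restart the server in the same environment.
export TMPDIR="$HOME/ollama-tmp"
mkdir -p "$TMPDIR"
echo "temp dir: $TMPDIR"
# ollama serve   # restart so the new TMPDIR takes effect
```

Keeping the temp directory on the same filesystem as `~/.ollama/models` would also sidestep the cross-device rename.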
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1201/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1201/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5362
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5362/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5362/comments
|
https://api.github.com/repos/ollama/ollama/issues/5362/events
|
https://github.com/ollama/ollama/issues/5362
| 2,380,928,840
|
I_kwDOJ0Z1Ps6N6hdI
| 5,362
|
allow temperature to be set on command line ( w/out using a modelfile )
|
{
"login": "pracplayopen",
"id": 11450154,
"node_id": "MDQ6VXNlcjExNDUwMTU0",
"avatar_url": "https://avatars.githubusercontent.com/u/11450154?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pracplayopen",
"html_url": "https://github.com/pracplayopen",
"followers_url": "https://api.github.com/users/pracplayopen/followers",
"following_url": "https://api.github.com/users/pracplayopen/following{/other_user}",
"gists_url": "https://api.github.com/users/pracplayopen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pracplayopen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pracplayopen/subscriptions",
"organizations_url": "https://api.github.com/users/pracplayopen/orgs",
"repos_url": "https://api.github.com/users/pracplayopen/repos",
"events_url": "https://api.github.com/users/pracplayopen/events{/privacy}",
"received_events_url": "https://api.github.com/users/pracplayopen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 7
| 2024-06-28T18:07:44
| 2025-01-09T22:21:48
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It would be super helpful to be able to set the temperature for a model from the command line, rather than having to create a separate Modelfile for every model and temperature combination.
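Until a CLI flag exists, one workaround sketch is to pass the temperature per request through the REST API's `options` field, so no Modelfile is needed (model name and value below are placeholders):

```python
import json

# Per-request temperature override: /api/generate accepts an
# options.temperature field alongside the prompt.
req = {
    "model": "llama3",               # placeholder model name
    "prompt": "why is the sky blue?",
    "options": {"temperature": 0.2},
}
print(json.dumps(req))  # POST to http://localhost:11434/api/generate
```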
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5362/reactions",
"total_count": 6,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5362/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1462
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1462/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1462/comments
|
https://api.github.com/repos/ollama/ollama/issues/1462/events
|
https://github.com/ollama/ollama/issues/1462
| 2,035,032,940
|
I_kwDOJ0Z1Ps55TCNs
| 1,462
|
--test ignores --model
|
{
"login": "kfsone",
"id": 323009,
"node_id": "MDQ6VXNlcjMyMzAwOQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/323009?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kfsone",
"html_url": "https://github.com/kfsone",
"followers_url": "https://api.github.com/users/kfsone/followers",
"following_url": "https://api.github.com/users/kfsone/following{/other_user}",
"gists_url": "https://api.github.com/users/kfsone/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kfsone/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kfsone/subscriptions",
"organizations_url": "https://api.github.com/users/kfsone/orgs",
"repos_url": "https://api.github.com/users/kfsone/repos",
"events_url": "https://api.github.com/users/kfsone/events{/privacy}",
"received_events_url": "https://api.github.com/users/kfsone/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-12-11T07:48:59
| 2023-12-11T19:59:03
| 2023-12-11T17:17:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
```
(venv) root@afa266a7b553:/workspace# litellm --model starling-lm --test
/workspace/venv/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_list" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
warnings.warn(
LiteLLM: Making a test ChatCompletions request to your proxy
Traceback (most recent call last):
File "/workspace/venv/bin/litellm", line 8, in <module>
sys.exit(run_server())
File "/workspace/venv/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)
File "/workspace/venv/lib/python3.10/site-packages/click/core.py", line 1078, in main
rv = self.invoke(ctx)
File "/workspace/venv/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/workspace/venv/lib/python3.10/site-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "/workspace/venv/lib/python3.10/site-packages/litellm/proxy/proxy_cli.py", line 198, in run_server
response = client.chat.completions.create(model="gpt-3.5-turbo", messages = [
File "/workspace/venv/lib/python3.10/site-packages/openai/_utils/_utils.py", line 303, in wrapper
return func(*args, **kwargs)
File "/workspace/venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 598, in create
return self._post(
File "/workspace/venv/lib/python3.10/site-packages/openai/_base_client.py", line 1086, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/workspace/venv/lib/python3.10/site-packages/openai/_base_client.py", line 846, in request
return self._request(
File "/workspace/venv/lib/python3.10/site-packages/openai/_base_client.py", line 884, in _request
return self._retry_request(
File "/workspace/venv/lib/python3.10/site-packages/openai/_base_client.py", line 956, in _retry_request
return self._request(
File "/workspace/venv/lib/python3.10/site-packages/openai/_base_client.py", line 884, in _request
return self._retry_request(
File "/workspace/venv/lib/python3.10/site-packages/openai/_base_client.py", line 956, in _retry_request
return self._request(
File "/workspace/venv/lib/python3.10/site-packages/openai/_base_client.py", line 898, in _request
raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 500 - {'detail': 'OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable\n\nTraceback (most recent call last):\n File "/workspace/venv/lib/python3.10/site-packages/litellm/llms/openai.py", line 266, in acompletion\n openai_aclient = AsyncOpenAI(api_key=api_key, base_url=api_base, http_client=litellm.aclient_session, timeout=timeout, max_retries=max_retries)\n File "/workspace/venv/lib/python3.10/site-packages/openai/_client.py", line 303, in __init__\n raise OpenAIError(\nopenai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "/workspace/venv/lib/python3.10/site-packages/litellm/main.py", line 187, in acompletion\n response = await init_response\n File "/workspace/venv/lib/python3.10/site-packages/litellm/llms/openai.py", line 278, in acompletion\n raise OpenAIError(status_code=500, message=f"{str(e)}")\nlitellm.llms.openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File "/workspace/venv/lib/python3.10/site-packages/litellm/proxy/proxy_server.py", line 1033, in chat_completion\n response = await litellm.acompletion(**data)\n File "/workspace/venv/lib/python3.10/site-packages/litellm/utils.py", line 1682, in wrapper_async\n raise e\n File "/workspace/venv/lib/python3.10/site-packages/litellm/utils.py", line 1626, in wrapper_async\n result = await original_function(*args, **kwargs)\n File "/workspace/venv/lib/python3.10/site-packages/litellm/main.py", line 197, in acompletion\n raise exception_type(\n File 
"/workspace/venv/lib/python3.10/site-packages/litellm/utils.py", line 4973, in exception_type\n raise e\n File "/workspace/venv/lib/python3.10/site-packages/litellm/utils.py", line 4115, in exception_type\n raise APIError(\nlitellm.exceptions.APIError: OpenAIException - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable\n'}
```
(same for `litellm --test --model ...`)
/workspace/config.yaml:
```
model_list:
- model_name: starling-lm
litellm_params:
model: ollama/starling-lm
api_base: http://192.168.86.26:11434
api_key: "none"
rpm: 100
- model_name: vicuna:7b-16k
litellm_params:
model: ollama/vicuna:7b-16k
api_base: http://192.168.86.26:11434
api_key: "none"
rpm: 100
litellm_settings: # module level litellm settings - https://github.com/BerriAI/litellm/blob/main/litellm/__init__.py
drop_params: True
set_verbose: True
```
litellm command line:
```
litellm --config /workspace/config.yaml
```
Output from litellm starting up:
```
(venv) root@afa266a7b553:/workspace# litellm --config /workspace/config.yaml
/workspace/venv/lib/python3.10/site-packages/pydantic/_internal/_fields.py:149: UserWarning: Field "model_list" has conflict with protected namespace "model_".
You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
warnings.warn(
INFO: Started server process [130]
INFO: Waiting for application startup.
#------------------------------------------------------------#
# #
# 'I don't like how this works...' #
# https://github.com/BerriAI/litellm/issues/new #
# #
#------------------------------------------------------------#
Thank you for using LiteLLM! - Krrish & Ishaan
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM: Proxy initialized with Config, Set models:
starling-lm
vicuna:7b-16k
LiteLLM.Router:
Initialized Model List [{'model_name': 'starling-lm', 'litellm_params': {'model': 'ollama/starling-lm-ModelID-ollama/starling-lmhttp://192.168.86.26:11434100', 'api_base': 'http://192.168.86.26:11434', 'api_key': 'none', 'rpm': 100}, 'model_info': {'id': '26263866-4b46-471d-a4eb-41826662724c'}}, {'model_name': 'vicuna:7b-16k', 'litellm_params': {'model': 'ollama/vicuna:7b-16k', 'api_base': 'http://192.168.86.26:11434', 'api_key': 'none', 'rpm': 100}}]
LiteLLM.Router:
Initialized Model List [{'model_name': 'starling-lm', 'litellm_params': {'model': 'ollama/starling-lm-ModelID-ollama/starling-lmhttp://192.168.86.26:11434100', 'api_base': 'http://192.168.86.26:11434', 'api_key': 'none', 'rpm': 100}, 'model_info': {'id': '26263866-4b46-471d-a4eb-41826662724c'}}, {'model_name': 'vicuna:7b-16k', 'litellm_params': {'model': 'ollama/vicuna:7b-16k-ModelID-ollama/vicuna:7b-16khttp://192.168.86.26:11434100', 'api_base': 'http://192.168.86.26:11434', 'api_key': 'none', 'rpm': 100}, 'model_info': {'id': '37bc152d-f977-4d30-b021-d55fbc5a828a'}}]
LiteLLM.Router: Intialized router with Routing strategy: simple-shuffle
LiteLLM: Test your local proxy with: "litellm --test" This runs an openai.ChatCompletion request to your proxy [In a new terminal tab]
LiteLLM: Curl Command Test for your local proxy
curl --location 'http://0.0.0.0:8000/chat/completions' \
--header 'Content-Type: application/json' \
--data ' {
"model": "gpt-3.5-turbo",
"messages": [
{
"role": "user",
"content": "what llm are you"
}
]
}'
Docs: https://docs.litellm.ai/docs/simple_proxy
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8000 (Press CTRL+C to quit)
```
successful query forward to ollama:
```
LiteLLM.Router: Inside async function with retries: args - (); kwargs - {'proxy_server_request': {'url': 'http://0.0.0.0:8000/chat/completions', 'method': 'POST', 'headers': {'host': '0.0.0.0:8000', 'user-agent': 'curl/7.81.0', 'accept': '*/*', 'content-type': 'application/json', 'content-length': '142'}, 'body': {'model': 'starling-lm', 'messages': [{'role': 'user', 'content': 'what llm are you'}]}}, 'user': None, 'metadata': {'user_api_key': None, 'headers': {'host': '0.0.0.0:8000', 'user-agent': 'curl/7.81.0', 'accept': '*/*', 'content-type': 'application/json', 'content-length': '142'}, 'model_group': 'starling-lm'}, 'request_timeout': 600, 'model': 'starling-lm', 'messages': [{'role': 'user', 'content': 'what llm are you'}], 'original_function': <bound method Router._acompletion of <litellm.router.Router object at 0x7f04918cf2e0>>, 'num_retries': 3}
LiteLLM.Router: async function w/ retries: original_function - <bound method Router._acompletion of <litellm.router.Router object at 0x7f04918cf2e0>>
LiteLLM.Router: Inside _acompletion()- model: starling-lm; kwargs: {'proxy_server_request': {'url': 'http://0.0.0.0:8000/chat/completions', 'method': 'POST', 'headers': {'host': '0.0.0.0:8000', 'user-agent': 'curl/7.81.0', 'accept': '*/*', 'content-type': 'application/json', 'content-length': '142'}, 'body': {'model': 'starling-lm', 'messages': [{'role': 'user', 'content': 'what llm are you'}]}}, 'user': None, 'metadata': {'user_api_key': None, 'headers': {'host': '0.0.0.0:8000', 'user-agent': 'curl/7.81.0', 'accept': '*/*', 'content-type': 'application/json', 'content-length': '142'}, 'model_group': 'starling-lm'}, 'request_timeout': 600}
LiteLLM.Router: initial list of deployments: [{'model_name': 'starling-lm', 'litellm_params': {'model': 'ollama/starling-lm-ModelID-ollama/starling-lmhttp://192.168.86.26:11434100', 'api_base': 'http://192.168.86.26:11434', 'api_key': 'none', 'rpm': 100}, 'model_info': {'id': '26263866-4b46-471d-a4eb-41826662724c'}}]
get cache: cache key: 07-47:cooldown_models
get cache: cache result: None
LiteLLM.Router: retrieve cooldown models: []
LiteLLM.Router: cooldown deployments: []
LiteLLM.Router: healthy deployments: length 1 [{'model_name': 'starling-lm', 'litellm_params': {'model': 'ollama/starling-lm-ModelID-ollama/starling-lmhttp://192.168.86.26:11434100', 'api_base': 'http://192.168.86.26:11434', 'api_key': 'none', 'rpm': 100}, 'model_info': {'id': '26263866-4b46-471d-a4eb-41826662724c'}}]
LiteLLM.Router:
rpms [100]
LiteLLM.Router:
weights [1.0]
LiteLLM.Router:
selected index, 0
callback: <bound method Router.deployment_callback_on_failure of <litellm.router.Router object at 0x7f04918cf2e0>>
callback: <bound method Router.deployment_callback of <litellm.router.Router object at 0x7f04918cf2e0>>
litellm.cache: None
kwargs[caching]: False; litellm.cache: None
kwargs[caching]: False; litellm.cache: None
LiteLLM completion() model= starling-lm; provider = ollama
LiteLLM: Params passed to completion() {'functions': [], 'function_call': '', 'temperature': None, 'top_p': None, 'stream': None, 'max_tokens': None, 'presence_penalty': None, 'frequency_penalty': None, 'logit_bias': None, 'user': None, 'response_format': None, 'seed': None, 'tools': None, 'tool_choice': None, 'max_retries': 0, 'custom_llm_provider': 'ollama', 'model': 'starling-lm', 'n': None, 'stop': None}
LiteLLM: Non-Default params passed to completion() {'max_retries': 0}
self.optional_params: {}
PRE-API-CALL ADDITIONAL ARGS: {'api_base': 'http://192.168.86.26:11434/api/generate', 'complete_input_dict': {'model': 'starling-lm', 'prompt': 'what llm are you'}}
POST Request Sent from LiteLLM:
curl -X POST \
http://192.168.86.26:11434/api/generate \
-d '{'model': 'starling-lm', 'prompt': 'what llm are you'}'
Async Wrapper: Completed Call, calling async_success_handler
Logging Details LiteLLM-Success Call
success callbacks: [<bound method Router.deployment_callback of <litellm.router.Router object at 0x7f04918cf2e0>>]
LiteLLM.Router: Async Response: ModelResponse(id='chatcmpl-3ce1d54e-7acf-416b-9c22-85eeb6189572', choices=[Choices(finish_reason='stop', index=0, message=Message(content=' I am an AI language model known as OpenAI GPT-4, designed to assist users with various tasks, including answering questions and providing information.', role='assistant'))], created=1702280848, model='ollama/starling-lm', object='chat.completion', system_fingerprint=None, usage=Usage(prompt_tokens=5, completion_tokens=30, total_tokens=35))
success callbacks: Running Custom Callback Function
final response: ModelResponse(id='chatcmpl-3ce1d54e-7acf-416b-9c22-85eeb6189572', choices=[Choices(finish_reason='stop', index=0, message=Message(content=' I am an AI language model known as OpenAI GPT-4, designed to assist users with various tasks, including answering questions and providing information.', role='assistant'))], created=1702280848, model='ollama/starling-lm', object='chat.completion', system_fingerprint=None, usage=Usage(prompt_tokens=5, completion_tokens=30, total_tokens=35))
get cache: cache key: ollama/starling-lm:tpm:07-47
get cache: cache result: None
INFO: 127.0.0.1:44184 - "POST /chat/completions HTTP/1.1" 200 OK
set cache: key: ollama/starling-lm:tpm:07-47; value: 35
get cache: cache key: ollama/starling-lm:rpm:07-47
get cache: cache result: None
set cache: key: ollama/starling-lm:rpm:07-47; value: 1
Custom Logger - final response object: {'id': 'chatcmpl-3ce1d54e-7acf-416b-9c22-85eeb6189572', 'choices': [{'finish_reason': 'stop', 'index': 0, 'message': {'content': ' I am an AI language model known as OpenAI GPT-4, designed to assist users with various tasks, including answering questions and providing information.', 'role': 'assistant'}}], 'created': 1702280848, 'model': 'ollama/starling-lm', 'object': 'chat.completion', 'system_fingerprint': None, 'usage': {'prompt_tokens': 5, 'completion_tokens': 30, 'total_tokens': 35}}
Async success callbacks: []
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1462/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1462/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/203
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/203/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/203/comments
|
https://api.github.com/repos/ollama/ollama/issues/203/events
|
https://github.com/ollama/ollama/pull/203
| 1,819,216,086
|
PR_kwDOJ0Z1Ps5WRiec
| 203
|
pull model on make if not present locally
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2023-07-24T22:08:01
| 2023-07-25T20:53:02
| 2023-07-25T20:53:01
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/203",
"html_url": "https://github.com/ollama/ollama/pull/203",
"diff_url": "https://github.com/ollama/ollama/pull/203.diff",
"patch_url": "https://github.com/ollama/ollama/pull/203.patch",
"merged_at": "2023-07-25T20:53:01"
}
|
resolves #129
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/203/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/203/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/322
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/322/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/322/comments
|
https://api.github.com/repos/ollama/ollama/issues/322/events
|
https://github.com/ollama/ollama/pull/322
| 1,846,012,074
|
PR_kwDOJ0Z1Ps5Xrv39
| 322
|
no warning on comments
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-08-10T23:22:55
| 2023-08-10T23:24:42
| 2023-08-10T23:24:41
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/322",
"html_url": "https://github.com/ollama/ollama/pull/322",
"diff_url": "https://github.com/ollama/ollama/pull/322.diff",
"patch_url": "https://github.com/ollama/ollama/pull/322.patch",
"merged_at": "2023-08-10T23:24:41"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/322/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/322/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1/comments
|
https://api.github.com/repos/ollama/ollama/issues/1/events
|
https://github.com/ollama/ollama/pull/1
| 1,777,625,644
|
PR_kwDOJ0Z1Ps5UElI4
| 1
|
refactor
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-06-27T19:29:54
| 2023-06-27T21:47:53
| 2023-06-27T20:49:08
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1",
"html_url": "https://github.com/ollama/ollama/pull/1",
"diff_url": "https://github.com/ollama/ollama/pull/1.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1.patch",
"merged_at": "2023-06-27T20:49:08"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4595
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4595/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4595/comments
|
https://api.github.com/repos/ollama/ollama/issues/4595/events
|
https://github.com/ollama/ollama/issues/4595
| 2,313,421,708
|
I_kwDOJ0Z1Ps6J5AOM
| 4,595
|
Possible to add numDownloadParts as option?
|
{
"login": "Gomez12",
"id": 138985,
"node_id": "MDQ6VXNlcjEzODk4NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/138985?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Gomez12",
"html_url": "https://github.com/Gomez12",
"followers_url": "https://api.github.com/users/Gomez12/followers",
"following_url": "https://api.github.com/users/Gomez12/following{/other_user}",
"gists_url": "https://api.github.com/users/Gomez12/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Gomez12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Gomez12/subscriptions",
"organizations_url": "https://api.github.com/users/Gomez12/orgs",
"repos_url": "https://api.github.com/users/Gomez12/repos",
"events_url": "https://api.github.com/users/Gomez12/events{/privacy}",
"received_events_url": "https://api.github.com/users/Gomez12/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6677370291,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw",
"url": "https://api.github.com/repos/ollama/ollama/labels/networking",
"name": "networking",
"color": "0B5368",
"default": false,
"description": "Issues relating to ollama pull and push"
}
] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-05-23T16:52:48
| 2024-12-29T19:29:07
| 2024-12-29T19:29:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently in server/download.go there is a constant: numDownloadParts = 64.
I would like to see this exposed as an option, so the value can either be changed or simply set to 1.
Right now it overloads my router (which does content inspection), and then my whole network has almost no internet any more.
I think it also blocks itself, and it errors out with: max retries exceeded.
Then the network works fine again, until I restart the Ollama pull (which resumes fine) and the internet goes down again.
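As a rough illustration (a hypothetical helper, not ollama's actual implementation), here is how a download might be split into a configurable number of byte ranges; with the part count set to 1 the whole file is fetched in a single request instead of 64 concurrent ones:

```python
def split_into_parts(total_size: int, num_parts: int) -> list[tuple[int, int]]:
    """Return (offset, length) byte ranges covering total_size bytes.

    Clamps num_parts so there is always at least 1 part and never
    more parts than bytes.
    """
    num_parts = max(1, min(num_parts, total_size))
    base, rem = divmod(total_size, num_parts)
    parts, offset = [], 0
    for i in range(num_parts):
        # Spread the remainder over the first `rem` parts.
        length = base + (1 if i < rem else 0)
        parts.append((offset, length))
        offset += length
    return parts

# num_parts=1 means one sequential request, which is much gentler on
# routers that do per-connection content inspection.
print(split_into_parts(100, 1))   # [(0, 100)]
print(split_into_parts(100, 3))   # [(0, 34), (34, 33), (67, 33)]
```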
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4595/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4595/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7511
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7511/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7511/comments
|
https://api.github.com/repos/ollama/ollama/issues/7511/events
|
https://github.com/ollama/ollama/issues/7511
| 2,635,526,685
|
I_kwDOJ0Z1Ps6dFvId
| 7,511
|
About OLLAMA_SCHED_SPREAD env,How to load a model on two GPUs
|
{
"login": "Kouuh",
"id": 46215418,
"node_id": "MDQ6VXNlcjQ2MjE1NDE4",
"avatar_url": "https://avatars.githubusercontent.com/u/46215418?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kouuh",
"html_url": "https://github.com/Kouuh",
"followers_url": "https://api.github.com/users/Kouuh/followers",
"following_url": "https://api.github.com/users/Kouuh/following{/other_user}",
"gists_url": "https://api.github.com/users/Kouuh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kouuh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kouuh/subscriptions",
"organizations_url": "https://api.github.com/users/Kouuh/orgs",
"repos_url": "https://api.github.com/users/Kouuh/repos",
"events_url": "https://api.github.com/users/Kouuh/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kouuh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 13
| 2024-11-05T13:50:56
| 2024-11-17T14:09:42
| 2024-11-17T14:09:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Previously, I was using the 515 version of the NVIDIA driver with CUDA 11.7, and I could load a model (qwen1.5_32b_q8) across two GPUs by setting OLLAMA_SCHED_SPREAD=1.
But now I am using the 535 version of the NVIDIA driver with CUDA 12.1, and I want to load a model (qwen1.5_32b_q8) across two GPUs. Even though I set OLLAMA_SCHED_SPREAD=1 and tried ollama v0.2.5 and v0.3.14, ollama still runs a complete model on the two graphics cards. How can I solve this problem?
I run the service via the command "CUDA_VISIBLE_DEVICES=6,7 /usr/bin/ollama serve"
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.2.5 and 0.3.14
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7511/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7511/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8600
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8600/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8600/comments
|
https://api.github.com/repos/ollama/ollama/issues/8600/events
|
https://github.com/ollama/ollama/issues/8600
| 2,812,007,544
|
I_kwDOJ0Z1Ps6nm9R4
| 8,600
|
requesting support new model(maybe not new): Qwen2.5-1M
|
{
"login": "utopeadia",
"id": 98788152,
"node_id": "U_kgDOBeNjOA",
"avatar_url": "https://avatars.githubusercontent.com/u/98788152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/utopeadia",
"html_url": "https://github.com/utopeadia",
"followers_url": "https://api.github.com/users/utopeadia/followers",
"following_url": "https://api.github.com/users/utopeadia/following{/other_user}",
"gists_url": "https://api.github.com/users/utopeadia/gists{/gist_id}",
"starred_url": "https://api.github.com/users/utopeadia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/utopeadia/subscriptions",
"organizations_url": "https://api.github.com/users/utopeadia/orgs",
"repos_url": "https://api.github.com/users/utopeadia/repos",
"events_url": "https://api.github.com/users/utopeadia/events{/privacy}",
"received_events_url": "https://api.github.com/users/utopeadia/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-27T03:25:33
| 2025-01-27T17:39:47
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Qwen has released a long-context version of Qwen 2.5 on Hugging Face; I hope support for it can be added:
https://huggingface.co/collections/Qwen/qwen25-1m-679325716327ec07860530ba
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8600/reactions",
"total_count": 9,
"+1": 9,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8600/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4047
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4047/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4047/comments
|
https://api.github.com/repos/ollama/ollama/issues/4047/events
|
https://github.com/ollama/ollama/issues/4047
| 2,271,173,363
|
I_kwDOJ0Z1Ps6HX1rz
| 4,047
|
Document the different windows downloads
|
{
"login": "eabase",
"id": 52289379,
"node_id": "MDQ6VXNlcjUyMjg5Mzc5",
"avatar_url": "https://avatars.githubusercontent.com/u/52289379?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eabase",
"html_url": "https://github.com/eabase",
"followers_url": "https://api.github.com/users/eabase/followers",
"following_url": "https://api.github.com/users/eabase/following{/other_user}",
"gists_url": "https://api.github.com/users/eabase/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eabase/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eabase/subscriptions",
"organizations_url": "https://api.github.com/users/eabase/orgs",
"repos_url": "https://api.github.com/users/eabase/repos",
"events_url": "https://api.github.com/users/eabase/events{/privacy}",
"received_events_url": "https://api.github.com/users/eabase/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-04-30T11:12:04
| 2024-05-06T14:37:34
| 2024-05-04T19:50:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Please document what the different Windows downloads available are actually doing and how they differ.

### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4047/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4047/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4175
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4175/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4175/comments
|
https://api.github.com/repos/ollama/ollama/issues/4175/events
|
https://github.com/ollama/ollama/pull/4175
| 2,279,669,906
|
PR_kwDOJ0Z1Ps5ukud1
| 4,175
|
validate the format of the digest when getting the model path
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-05T18:36:35
| 2024-05-05T18:46:13
| 2024-05-05T18:46:12
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4175",
"html_url": "https://github.com/ollama/ollama/pull/4175",
"diff_url": "https://github.com/ollama/ollama/pull/4175.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4175.patch",
"merged_at": "2024-05-05T18:46:12"
}
| null |
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4175/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4175/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3763
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3763/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3763/comments
|
https://api.github.com/repos/ollama/ollama/issues/3763/events
|
https://github.com/ollama/ollama/issues/3763
| 2,253,785,618
|
I_kwDOJ0Z1Ps6GVgoS
| 3,763
|
Ollama 0.1.32 not working as expected in Windows.
|
{
"login": "BCoskun",
"id": 2695679,
"node_id": "MDQ6VXNlcjI2OTU2Nzk=",
"avatar_url": "https://avatars.githubusercontent.com/u/2695679?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BCoskun",
"html_url": "https://github.com/BCoskun",
"followers_url": "https://api.github.com/users/BCoskun/followers",
"following_url": "https://api.github.com/users/BCoskun/following{/other_user}",
"gists_url": "https://api.github.com/users/BCoskun/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BCoskun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BCoskun/subscriptions",
"organizations_url": "https://api.github.com/users/BCoskun/orgs",
"repos_url": "https://api.github.com/users/BCoskun/repos",
"events_url": "https://api.github.com/users/BCoskun/events{/privacy}",
"received_events_url": "https://api.github.com/users/BCoskun/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 9
| 2024-04-19T19:30:25
| 2024-04-23T23:35:21
| 2024-04-23T23:35:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I can no longer serve ollama as a regular user. When I try to run the "ollama serve" command and use it from another terminal window, I get an error about the runner.
time=2024-04-19T15:16:31.832-04:00 level=ERROR source=routes.go:120 msg="error loading llama server" error="llama runner process no longer running: 3221225781 "
When I try with the "run as administrator" option, it works okay.
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.1.32
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3763/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3763/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2875
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2875/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2875/comments
|
https://api.github.com/repos/ollama/ollama/issues/2875/events
|
https://github.com/ollama/ollama/issues/2875
| 2,164,707,586
|
I_kwDOJ0Z1Ps6BBtEC
| 2,875
|
Set attention type for Mistral 7B
|
{
"login": "aiseei",
"id": 30615541,
"node_id": "MDQ6VXNlcjMwNjE1NTQx",
"avatar_url": "https://avatars.githubusercontent.com/u/30615541?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aiseei",
"html_url": "https://github.com/aiseei",
"followers_url": "https://api.github.com/users/aiseei/followers",
"following_url": "https://api.github.com/users/aiseei/following{/other_user}",
"gists_url": "https://api.github.com/users/aiseei/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aiseei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aiseei/subscriptions",
"organizations_url": "https://api.github.com/users/aiseei/orgs",
"repos_url": "https://api.github.com/users/aiseei/repos",
"events_url": "https://api.github.com/users/aiseei/events{/privacy}",
"received_events_url": "https://api.github.com/users/aiseei/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-02T08:34:04
| 2024-05-18T03:11:03
| 2024-05-18T03:11:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi - is there any documentation that explains how attention types can be set?
There seem to be options as per the Mistral repository:
https://github.com/mistralai/mistral-src
https://huggingface.co/docs/transformers/main/model_doc/mistral (the HF docs discuss using flash attention)
Also, any pointer on how to interpret the debug messages, especially when the model loads and when it receives a `generate` request?
Any inputs would be much appreciated!
Thanks.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2875/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2875/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8378
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8378/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8378/comments
|
https://api.github.com/repos/ollama/ollama/issues/8378/events
|
https://github.com/ollama/ollama/pull/8378
| 2,781,249,727
|
PR_kwDOJ0Z1Ps6HY-Hf
| 8,378
|
convert: fix tensor iteration by calling Items() method
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2025-01-10T21:50:28
| 2025-01-23T23:45:45
| 2025-01-10T21:54:43
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8378",
"html_url": "https://github.com/ollama/ollama/pull/8378",
"diff_url": "https://github.com/ollama/ollama/pull/8378.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8378.patch",
"merged_at": null
}
|
This fixes a range iteration error by properly calling the `Items()` method instead of trying to range over the method itself.
Previously the code attempted to range over the method value `Items` itself, which is not a valid range expression in Go. Adding parentheses to call the method lets the loop correctly iterate over the returned `[]*Tensor` slice.
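A minimal sketch of the distinction (the `Tensor`/`Items` names here mirror the PR description, not the exact ollama `convert` types):

```go
package main

import "fmt"

// Tensor and tensorSet are illustrative stand-ins, not the real convert types.
type Tensor struct{ Name string }

type tensorSet struct{ items []*Tensor }

// Items returns the underlying []*Tensor slice.
func (s *tensorSet) Items() []*Tensor { return s.items }

// tensorNames collects names by ranging over the slice Items() returns.
// Ranging over `s.Items` (the method value, without parentheses) does not
// compile, since a plain method is not a valid range expression.
func tensorNames(s *tensorSet) []string {
	var names []string
	for _, t := range s.Items() {
		names = append(names, t.Name)
	}
	return names
}

func main() {
	s := &tensorSet{items: []*Tensor{{Name: "a"}, {Name: "b"}}}
	fmt.Println(tensorNames(s)) // [a b]
}
```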
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8378/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8378/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7530
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7530/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7530/comments
|
https://api.github.com/repos/ollama/ollama/issues/7530/events
|
https://github.com/ollama/ollama/issues/7530
| 2,638,873,204
|
I_kwDOJ0Z1Ps6dSgJ0
| 7,530
|
submitting corrections to models.
|
{
"login": "ht990332",
"id": 1168407,
"node_id": "MDQ6VXNlcjExNjg0MDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/1168407?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ht990332",
"html_url": "https://github.com/ht990332",
"followers_url": "https://api.github.com/users/ht990332/followers",
"following_url": "https://api.github.com/users/ht990332/following{/other_user}",
"gists_url": "https://api.github.com/users/ht990332/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ht990332/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ht990332/subscriptions",
"organizations_url": "https://api.github.com/users/ht990332/orgs",
"repos_url": "https://api.github.com/users/ht990332/repos",
"events_url": "https://api.github.com/users/ht990332/events{/privacy}",
"received_events_url": "https://api.github.com/users/ht990332/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-06T18:19:53
| 2024-11-13T21:59:00
| 2024-11-13T21:58:59
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hello. I am using Llama 3.2 and I asked it an RxJS question. The answer was clear and pleasantly complete, but there was a big mistake in it. Citations for the corrections can be found in the RxJS documentation. How do I submit corrections to the model's answers?
### OS
Linux
### GPU
Intel
### CPU
Intel
### Ollama version
0.3.14
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7530/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7530/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6482
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6482/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6482/comments
|
https://api.github.com/repos/ollama/ollama/issues/6482/events
|
https://github.com/ollama/ollama/pull/6482
| 2,483,833,920
|
PR_kwDOJ0Z1Ps55SLaQ
| 6,482
|
passthrough OLLAMA_HOST path to client
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-23T20:21:56
| 2024-08-30T16:40:36
| 2024-08-30T16:40:34
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6482",
"html_url": "https://github.com/ollama/ollama/pull/6482",
"diff_url": "https://github.com/ollama/ollama/pull/6482.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6482.patch",
"merged_at": "2024-08-30T16:40:34"
}
|
This allows users to run Ollama behind a proxy with a non-root path.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6482/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6482/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8413
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8413/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8413/comments
|
https://api.github.com/repos/ollama/ollama/issues/8413/events
|
https://github.com/ollama/ollama/pull/8413
| 2,786,108,016
|
PR_kwDOJ0Z1Ps6HplkM
| 8,413
|
remove .prettierrc.json
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2025-01-14T03:04:23
| 2025-01-14T17:30:37
| 2025-01-14T17:30:35
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8413",
"html_url": "https://github.com/ollama/ollama/pull/8413",
"diff_url": "https://github.com/ollama/ollama/pull/8413.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8413.patch",
"merged_at": "2025-01-14T17:30:35"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8413/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8413/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8295
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8295/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8295/comments
|
https://api.github.com/repos/ollama/ollama/issues/8295/events
|
https://github.com/ollama/ollama/issues/8295
| 2,767,878,471
|
I_kwDOJ0Z1Ps6k-nlH
| 8,295
|
Considerably slower response via API than terminal
|
{
"login": "leo-petrucci",
"id": 23533178,
"node_id": "MDQ6VXNlcjIzNTMzMTc4",
"avatar_url": "https://avatars.githubusercontent.com/u/23533178?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leo-petrucci",
"html_url": "https://github.com/leo-petrucci",
"followers_url": "https://api.github.com/users/leo-petrucci/followers",
"following_url": "https://api.github.com/users/leo-petrucci/following{/other_user}",
"gists_url": "https://api.github.com/users/leo-petrucci/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leo-petrucci/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leo-petrucci/subscriptions",
"organizations_url": "https://api.github.com/users/leo-petrucci/orgs",
"repos_url": "https://api.github.com/users/leo-petrucci/repos",
"events_url": "https://api.github.com/users/leo-petrucci/events{/privacy}",
"received_events_url": "https://api.github.com/users/leo-petrucci/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 4
| 2025-01-03T16:49:33
| 2025-01-09T10:45:39
| 2025-01-09T10:45:39
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I've been trying to run a small model (`gemma2:2b`) on my Raspberry Pi 5. I heard the performance was quite impressive and it's true! When running Ollama in terminal responses are very fast considering the hardware.
https://github.com/user-attachments/assets/eb7b3add-a35a-4123-8e5b-01ca109ae395
That said, I wanted to try to use it via a frontend so I attempted to hit the API endpoints Ollama exposes and the response time was **much** slower than the terminal.
In the video below the first response is almost instant, and then it slows down to a crawl before almost crashing. Most times the first response crashes the system before finishing.
https://github.com/user-attachments/assets/53691d7a-c263-4482-905e-9e8dc9893946
Is there a reason for this? How can I debug the issue?
I'm aware Ollama isn't really made for a GPU-less Rpi but hopefully someone can let me know if I'm doing something wrong or if it's a known issue!
### OS
Linux
### GPU
Other
### CPU
Other
### Ollama version
0.5.4-0-g2ddc32d-dirty
|
{
"login": "leo-petrucci",
"id": 23533178,
"node_id": "MDQ6VXNlcjIzNTMzMTc4",
"avatar_url": "https://avatars.githubusercontent.com/u/23533178?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leo-petrucci",
"html_url": "https://github.com/leo-petrucci",
"followers_url": "https://api.github.com/users/leo-petrucci/followers",
"following_url": "https://api.github.com/users/leo-petrucci/following{/other_user}",
"gists_url": "https://api.github.com/users/leo-petrucci/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leo-petrucci/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leo-petrucci/subscriptions",
"organizations_url": "https://api.github.com/users/leo-petrucci/orgs",
"repos_url": "https://api.github.com/users/leo-petrucci/repos",
"events_url": "https://api.github.com/users/leo-petrucci/events{/privacy}",
"received_events_url": "https://api.github.com/users/leo-petrucci/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8295/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8295/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4712
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4712/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4712/comments
|
https://api.github.com/repos/ollama/ollama/issues/4712/events
|
https://github.com/ollama/ollama/pull/4712
| 2,324,340,007
|
PR_kwDOJ0Z1Ps5w8LNr
| 4,712
|
server: skip blob verification for already verified blobs
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2024-05-29T22:26:02
| 2024-06-06T07:49:20
| 2024-06-05T23:39:11
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4712",
"html_url": "https://github.com/ollama/ollama/pull/4712",
"diff_url": "https://github.com/ollama/ollama/pull/4712.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4712.patch",
"merged_at": "2024-06-05T23:39:11"
}
|
WIP.
This should also include a way to force verification via a flag or env or other means. I'll let the bike shedding begin on the _best_ way to handle that. :)
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4712/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4712/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/627
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/627/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/627/comments
|
https://api.github.com/repos/ollama/ollama/issues/627/events
|
https://github.com/ollama/ollama/issues/627
| 1,916,520,682
|
I_kwDOJ0Z1Ps5yO8jq
| 627
|
Docs request: `codellama` model on Hugging Face, and broken whitepaper link
|
{
"login": "jamesbraza",
"id": 8990777,
"node_id": "MDQ6VXNlcjg5OTA3Nzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8990777?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jamesbraza",
"html_url": "https://github.com/jamesbraza",
"followers_url": "https://api.github.com/users/jamesbraza/followers",
"following_url": "https://api.github.com/users/jamesbraza/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesbraza/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jamesbraza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesbraza/subscriptions",
"organizations_url": "https://api.github.com/users/jamesbraza/orgs",
"repos_url": "https://api.github.com/users/jamesbraza/repos",
"events_url": "https://api.github.com/users/jamesbraza/events{/privacy}",
"received_events_url": "https://api.github.com/users/jamesbraza/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-09-28T00:16:33
| 2023-09-30T05:01:56
| 2023-09-30T05:01:56
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
On https://ollama.ai/library/codellama, I have two requests:
- The Whitepaper link is broken
- Can we link the source models on Hugging Face?
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/627/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/627/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7924
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7924/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7924/comments
|
https://api.github.com/repos/ollama/ollama/issues/7924/events
|
https://github.com/ollama/ollama/pull/7924
| 2,716,236,674
|
PR_kwDOJ0Z1Ps6D9V8W
| 7,924
|
Clarify num_ctx is measured in tokens
|
{
"login": "GhostDog98",
"id": 46546505,
"node_id": "MDQ6VXNlcjQ2NTQ2NTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/46546505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/GhostDog98",
"html_url": "https://github.com/GhostDog98",
"followers_url": "https://api.github.com/users/GhostDog98/followers",
"following_url": "https://api.github.com/users/GhostDog98/following{/other_user}",
"gists_url": "https://api.github.com/users/GhostDog98/gists{/gist_id}",
"starred_url": "https://api.github.com/users/GhostDog98/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/GhostDog98/subscriptions",
"organizations_url": "https://api.github.com/users/GhostDog98/orgs",
"repos_url": "https://api.github.com/users/GhostDog98/repos",
"events_url": "https://api.github.com/users/GhostDog98/events{/privacy}",
"received_events_url": "https://api.github.com/users/GhostDog98/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2024-12-03T23:48:45
| 2024-12-03T23:48:45
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7924",
"html_url": "https://github.com/ollama/ollama/pull/7924",
"diff_url": "https://github.com/ollama/ollama/pull/7924.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7924.patch",
"merged_at": null
}
|
It is not clear in the Modelfile documentation that `num_ctx` is measured in tokens rather than characters, which may lead to significant over-sizing of the context window where it is not needed.
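As context for the change, `num_ctx` appears as a Modelfile parameter, and its value is a token count, so the illustrative fragment below reserves a 4096-token window, not 4096 characters (the model name is just an example):

```
# Modelfile fragment (illustrative)
FROM llama3
PARAMETER num_ctx 4096
```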
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7924/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7924/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3571
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3571/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3571/comments
|
https://api.github.com/repos/ollama/ollama/issues/3571/events
|
https://github.com/ollama/ollama/issues/3571
| 2,235,082,155
|
I_kwDOJ0Z1Ps6FOKWr
| 3,571
|
Let ollama's locally started api server support cross-domain!!
|
{
"login": "kirakiray",
"id": 5945154,
"node_id": "MDQ6VXNlcjU5NDUxNTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/5945154?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kirakiray",
"html_url": "https://github.com/kirakiray",
"followers_url": "https://api.github.com/users/kirakiray/followers",
"following_url": "https://api.github.com/users/kirakiray/following{/other_user}",
"gists_url": "https://api.github.com/users/kirakiray/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kirakiray/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kirakiray/subscriptions",
"organizations_url": "https://api.github.com/users/kirakiray/orgs",
"repos_url": "https://api.github.com/users/kirakiray/repos",
"events_url": "https://api.github.com/users/kirakiray/events{/privacy}",
"received_events_url": "https://api.github.com/users/kirakiray/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2024-04-10T08:53:23
| 2024-10-28T09:17:33
| 2024-05-14T19:01:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
I am a front-end developer. I found that if the API service started by Ollama allowed cross-origin requests, online web applications could call the localhost API directly, which would greatly lower the barrier to using Ollama-based applications.
### How should we solve this?
You need to add a CORS header to the response headers, similar to the following code:
`response.Header.Set("Access-Control-Allow-Origin", "*")`
If you are worried about abuse of this cross-origin interface, you could gate it behind a flag that enables it on a specific port, similar to:
`ollama serve --port:5551 --cross-domain`
### What is the impact of not solving this?
Currently, Ollama-related web applications must be started locally on the user's computer before they can be used, which is unfriendly to users.
Some applications instead need to expose (or proxy) the local Ollama port to the public network in order to be usable, which causes bigger security issues.
If this cross-origin support were added, online web applications could easily use the local Ollama API. That is a very convenient setup, because users would not need to download or install the application itself.
### Anything else?
For now I use Node.js to expose the local Ollama port (11434) on 11001 and spoof the request's Origin to achieve the effect I want. This lets my web application run online while using the native Ollama API (Chrome and Firefox work fine).
The script is at: https://github.com/kirakiray/test-website/blob/main/scripts/proxy-server.js
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3571/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3571/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4550
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4550/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4550/comments
|
https://api.github.com/repos/ollama/ollama/issues/4550/events
|
https://github.com/ollama/ollama/issues/4550
| 2,307,112,901
|
I_kwDOJ0Z1Ps6Jg7_F
| 4,550
|
GUI for ollama mac app
|
{
"login": "robot-penguin34",
"id": 137585700,
"node_id": "U_kgDOCDNkJA",
"avatar_url": "https://avatars.githubusercontent.com/u/137585700?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robot-penguin34",
"html_url": "https://github.com/robot-penguin34",
"followers_url": "https://api.github.com/users/robot-penguin34/followers",
"following_url": "https://api.github.com/users/robot-penguin34/following{/other_user}",
"gists_url": "https://api.github.com/users/robot-penguin34/gists{/gist_id}",
"starred_url": "https://api.github.com/users/robot-penguin34/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robot-penguin34/subscriptions",
"organizations_url": "https://api.github.com/users/robot-penguin34/orgs",
"repos_url": "https://api.github.com/users/robot-penguin34/repos",
"events_url": "https://api.github.com/users/robot-penguin34/events{/privacy}",
"received_events_url": "https://api.github.com/users/robot-penguin34/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-05-21T02:07:55
| 2024-05-22T07:08:43
| 2024-05-21T18:50:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ollama has so much potential to be the best virtual assistant, but it doesn't have a built-in GUI for those who don't plan to host it over the network.
**Please add an option during the setup wizard to enable a GUI, and add one.**
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4550/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4550/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3567
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3567/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3567/comments
|
https://api.github.com/repos/ollama/ollama/issues/3567/events
|
https://github.com/ollama/ollama/pull/3567
| 2,234,524,686
|
PR_kwDOJ0Z1Ps5sMIyz
| 3,567
|
partial offloading
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-04-10T00:30:22
| 2024-04-10T17:58:05
| 2024-04-10T17:58:04
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3567",
"html_url": "https://github.com/ollama/ollama/pull/3567",
"diff_url": "https://github.com/ollama/ollama/pull/3567.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3567.patch",
"merged_at": "2024-04-10T17:58:04"
}
|
Partial offloading of a model requires more memory in most cases (except for Metal).
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3567/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3567/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6535
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6535/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6535/comments
|
https://api.github.com/repos/ollama/ollama/issues/6535/events
|
https://github.com/ollama/ollama/pull/6535
| 2,490,487,418
|
PR_kwDOJ0Z1Ps55oa0G
| 6,535
|
Move ollama executable out of bin dir
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-27T22:39:06
| 2024-08-27T23:19:03
| 2024-08-27T23:19:00
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6535",
"html_url": "https://github.com/ollama/ollama/pull/6535",
"diff_url": "https://github.com/ollama/ollama/pull/6535.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6535.patch",
"merged_at": "2024-08-27T23:19:00"
}
|
Draft until I can verify I didn't miss anything...
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6535/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6535/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6688
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6688/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6688/comments
|
https://api.github.com/repos/ollama/ollama/issues/6688/events
|
https://github.com/ollama/ollama/pull/6688
| 2,511,671,025
|
PR_kwDOJ0Z1Ps56u0wT
| 6,688
|
Align OpenAI Chat option processing with Completion option processing
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-07T14:19:13
| 2024-09-07T16:08:09
| 2024-09-07T16:08:09
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6688",
"html_url": "https://github.com/ollama/ollama/pull/6688",
"diff_url": "https://github.com/ollama/ollama/pull/6688.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6688.patch",
"merged_at": "2024-09-07T16:08:09"
}
|
https://github.com/ollama/ollama/pull/6514 removed the option scaling for OpenAI Completion requests. Do the same for Chat requests.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6688/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6688/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5997
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5997/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5997/comments
|
https://api.github.com/repos/ollama/ollama/issues/5997/events
|
https://github.com/ollama/ollama/issues/5997
| 2,432,973,858
|
I_kwDOJ0Z1Ps6RBDwi
| 5,997
|
Tools documentation
|
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-07-26T21:02:24
| 2024-09-24T18:00:51
| 2024-09-24T18:00:51
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I like the new tools feature; however, I can't find a list of known tools other than the example code given. Can we add the list of known tools to the documentation?
|
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5997/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5997/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3631
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3631/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3631/comments
|
https://api.github.com/repos/ollama/ollama/issues/3631/events
|
https://github.com/ollama/ollama/issues/3631
| 2,241,723,652
|
I_kwDOJ0Z1Ps6Fnf0E
| 3,631
|
Support bge-reranker-large
|
{
"login": "wwjCMP",
"id": 32979859,
"node_id": "MDQ6VXNlcjMyOTc5ODU5",
"avatar_url": "https://avatars.githubusercontent.com/u/32979859?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wwjCMP",
"html_url": "https://github.com/wwjCMP",
"followers_url": "https://api.github.com/users/wwjCMP/followers",
"following_url": "https://api.github.com/users/wwjCMP/following{/other_user}",
"gists_url": "https://api.github.com/users/wwjCMP/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wwjCMP/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wwjCMP/subscriptions",
"organizations_url": "https://api.github.com/users/wwjCMP/orgs",
"repos_url": "https://api.github.com/users/wwjCMP/repos",
"events_url": "https://api.github.com/users/wwjCMP/events{/privacy}",
"received_events_url": "https://api.github.com/users/wwjCMP/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-13T20:00:09
| 2024-04-15T12:28:41
| 2024-04-15T12:28:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What model would you like?
bge-reranker-large
https://huggingface.co/BAAI/bge-reranker-large
|
{
"login": "wwjCMP",
"id": 32979859,
"node_id": "MDQ6VXNlcjMyOTc5ODU5",
"avatar_url": "https://avatars.githubusercontent.com/u/32979859?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wwjCMP",
"html_url": "https://github.com/wwjCMP",
"followers_url": "https://api.github.com/users/wwjCMP/followers",
"following_url": "https://api.github.com/users/wwjCMP/following{/other_user}",
"gists_url": "https://api.github.com/users/wwjCMP/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wwjCMP/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wwjCMP/subscriptions",
"organizations_url": "https://api.github.com/users/wwjCMP/orgs",
"repos_url": "https://api.github.com/users/wwjCMP/repos",
"events_url": "https://api.github.com/users/wwjCMP/events{/privacy}",
"received_events_url": "https://api.github.com/users/wwjCMP/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3631/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3631/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3863
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3863/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3863/comments
|
https://api.github.com/repos/ollama/ollama/issues/3863/events
|
https://github.com/ollama/ollama/pull/3863
| 2,260,225,471
|
PR_kwDOJ0Z1Ps5ti8AN
| 3,863
|
fixes for gguf
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-24T03:31:32
| 2024-04-24T03:57:20
| 2024-04-24T03:57:20
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3863",
"html_url": "https://github.com/ollama/ollama/pull/3863",
"diff_url": "https://github.com/ollama/ollama/pull/3863.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3863.patch",
"merged_at": "2024-04-24T03:57:20"
}
|
This change:
* adds the KVs for MoEs
* catches when an improper type is used in the KVs (we can add more if/when this changes)
* accounts for n tensors in 8x22b MoE models (where n > 2)
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3863/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3863/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2469
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2469/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2469/comments
|
https://api.github.com/repos/ollama/ollama/issues/2469/events
|
https://github.com/ollama/ollama/pull/2469
| 2,131,116,795
|
PR_kwDOJ0Z1Ps5msDiO
| 2,469
|
Adjust ROCm v5 build
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-02-12T22:41:43
| 2024-03-09T17:29:45
| 2024-03-09T17:29:43
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2469",
"html_url": "https://github.com/ollama/ollama/pull/2469",
"diff_url": "https://github.com/ollama/ollama/pull/2469.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2469.patch",
"merged_at": null
}
|
It looks like within the ROCm 5.x family of releases, there were two significant library major-version bumps that seem to be causing problems for some users on some distros. In 5.6, the hipblas and rocblas libraries were bumped from .so.0 to newer versions, and the old versions are not bundled in new packages. While multiple versions can be installed side by side, this complicates the UX. Building on either side of this library change should hopefully yield an LLM variant that can link on their system.
Related to #738
Marking draft for now until I can test this on more permutations of drivers/ROCm libraries, to see whether it truly improves things for both the pre-built binaries and the container image.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2469/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2469/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5369
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5369/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5369/comments
|
https://api.github.com/repos/ollama/ollama/issues/5369/events
|
https://github.com/ollama/ollama/issues/5369
| 2,381,355,321
|
I_kwDOJ0Z1Ps6N8Jk5
| 5,369
|
run gemma2 error
|
{
"login": "isold23",
"id": 1543138,
"node_id": "MDQ6VXNlcjE1NDMxMzg=",
"avatar_url": "https://avatars.githubusercontent.com/u/1543138?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/isold23",
"html_url": "https://github.com/isold23",
"followers_url": "https://api.github.com/users/isold23/followers",
"following_url": "https://api.github.com/users/isold23/following{/other_user}",
"gists_url": "https://api.github.com/users/isold23/gists{/gist_id}",
"starred_url": "https://api.github.com/users/isold23/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/isold23/subscriptions",
"organizations_url": "https://api.github.com/users/isold23/orgs",
"repos_url": "https://api.github.com/users/isold23/repos",
"events_url": "https://api.github.com/users/isold23/events{/privacy}",
"received_events_url": "https://api.github.com/users/isold23/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-06-29T00:53:27
| 2024-06-29T14:28:35
| 2024-06-29T00:57:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Error: llama runner process has terminated: signal: abort trap error:error loading model architecture: unknown model architecture: 'gemma2'
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
ollama version is 0.1.42
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5369/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5369/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4991
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4991/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4991/comments
|
https://api.github.com/repos/ollama/ollama/issues/4991/events
|
https://github.com/ollama/ollama/pull/4991
| 2,347,741,372
|
PR_kwDOJ0Z1Ps5yLpc5
| 4,991
|
Create 1
|
{
"login": "renjy0219",
"id": 172387258,
"node_id": "U_kgDOCkZrug",
"avatar_url": "https://avatars.githubusercontent.com/u/172387258?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/renjy0219",
"html_url": "https://github.com/renjy0219",
"followers_url": "https://api.github.com/users/renjy0219/followers",
"following_url": "https://api.github.com/users/renjy0219/following{/other_user}",
"gists_url": "https://api.github.com/users/renjy0219/gists{/gist_id}",
"starred_url": "https://api.github.com/users/renjy0219/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/renjy0219/subscriptions",
"organizations_url": "https://api.github.com/users/renjy0219/orgs",
"repos_url": "https://api.github.com/users/renjy0219/repos",
"events_url": "https://api.github.com/users/renjy0219/events{/privacy}",
"received_events_url": "https://api.github.com/users/renjy0219/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-06-12T03:49:06
| 2024-06-12T03:53:55
| 2024-06-12T03:53:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4991",
"html_url": "https://github.com/ollama/ollama/pull/4991",
"diff_url": "https://github.com/ollama/ollama/pull/4991.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4991.patch",
"merged_at": null
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4991/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4991/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1937
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1937/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1937/comments
|
https://api.github.com/repos/ollama/ollama/issues/1937/events
|
https://github.com/ollama/ollama/pull/1937
| 2,077,790,585
|
PR_kwDOJ0Z1Ps5j3no7
| 1,937
|
remove client.py
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-01-11T23:52:08
| 2024-01-16T19:01:42
| 2024-01-16T19:01:41
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1937",
"html_url": "https://github.com/ollama/ollama/pull/1937",
"diff_url": "https://github.com/ollama/ollama/pull/1937.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1937.patch",
"merged_at": "2024-01-16T19:01:41"
}
|
See [ollama-python](https://github.com/jmorganca/ollama-python)
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1937/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1937/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2554
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2554/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2554/comments
|
https://api.github.com/repos/ollama/ollama/issues/2554/events
|
https://github.com/ollama/ollama/pull/2554
| 2,139,744,384
|
PR_kwDOJ0Z1Ps5nJhnH
| 2,554
|
Added FAQ section in readme to reduce issue count
|
{
"login": "saamerm",
"id": 8262287,
"node_id": "MDQ6VXNlcjgyNjIyODc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8262287?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saamerm",
"html_url": "https://github.com/saamerm",
"followers_url": "https://api.github.com/users/saamerm/followers",
"following_url": "https://api.github.com/users/saamerm/following{/other_user}",
"gists_url": "https://api.github.com/users/saamerm/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saamerm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saamerm/subscriptions",
"organizations_url": "https://api.github.com/users/saamerm/orgs",
"repos_url": "https://api.github.com/users/saamerm/repos",
"events_url": "https://api.github.com/users/saamerm/events{/privacy}",
"received_events_url": "https://api.github.com/users/saamerm/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-02-17T02:05:51
| 2024-05-07T23:44:09
| 2024-05-07T23:44:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2554",
"html_url": "https://github.com/ollama/ollama/pull/2554",
"diff_url": "https://github.com/ollama/ollama/pull/2554.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2554.patch",
"merged_at": null
}
|
* There are a lot of issues being used as an Ollama Stack Overflow; adding a section like this will help a lot of people.
* Fixes #2497
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2554/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2554/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1144
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1144/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1144/comments
|
https://api.github.com/repos/ollama/ollama/issues/1144/events
|
https://github.com/ollama/ollama/issues/1144
| 1,995,638,118
|
I_kwDOJ0Z1Ps528wVm
| 1,144
|
not found, try pulling it first
|
{
"login": "KadirErturk4r",
"id": 48729162,
"node_id": "MDQ6VXNlcjQ4NzI5MTYy",
"avatar_url": "https://avatars.githubusercontent.com/u/48729162?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/KadirErturk4r",
"html_url": "https://github.com/KadirErturk4r",
"followers_url": "https://api.github.com/users/KadirErturk4r/followers",
"following_url": "https://api.github.com/users/KadirErturk4r/following{/other_user}",
"gists_url": "https://api.github.com/users/KadirErturk4r/gists{/gist_id}",
"starred_url": "https://api.github.com/users/KadirErturk4r/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/KadirErturk4r/subscriptions",
"organizations_url": "https://api.github.com/users/KadirErturk4r/orgs",
"repos_url": "https://api.github.com/users/KadirErturk4r/repos",
"events_url": "https://api.github.com/users/KadirErturk4r/events{/privacy}",
"received_events_url": "https://api.github.com/users/KadirErturk4r/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-15T21:58:07
| 2023-11-16T16:23:04
| 2023-11-16T16:23:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am on Ubuntu 22.04 with Ollama 0.1.9. If I run the app with `ollama run mistral`, everything works great.
However, when I run `OLLAMA_HOST=0.0.0.0:1234 ollama serve`, then
`curl -X POST http://ipaddress:1234/api/generate -d '{"model": "mistral", "prompt": "Why is the sky blue?" }'`
returns:
`{"error":"model 'mistral' not found, try pulling it first"}`
Server restarts, model reloads, etc. did not help.
Any suggestions?
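
(Editor's sketch, not part of the original report: a frequent cause is that `ollama serve` started under a different user or environment uses its own model store, so that instance has no `mistral` yet. Pulling through the same server usually resolves it; the host and port below mirror the report and are otherwise assumptions.)

```shell
# Point the client at the running server and pull the model there first.
export OLLAMA_HOST=0.0.0.0:1234
ollama pull mistral       # downloads into the store used by that serve process
# The generate call should then find the model:
curl -X POST http://localhost:1234/api/generate \
  -d '{"model": "mistral", "prompt": "Why is the sky blue?"}'
```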
|
{
"login": "KadirErturk4r",
"id": 48729162,
"node_id": "MDQ6VXNlcjQ4NzI5MTYy",
"avatar_url": "https://avatars.githubusercontent.com/u/48729162?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/KadirErturk4r",
"html_url": "https://github.com/KadirErturk4r",
"followers_url": "https://api.github.com/users/KadirErturk4r/followers",
"following_url": "https://api.github.com/users/KadirErturk4r/following{/other_user}",
"gists_url": "https://api.github.com/users/KadirErturk4r/gists{/gist_id}",
"starred_url": "https://api.github.com/users/KadirErturk4r/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/KadirErturk4r/subscriptions",
"organizations_url": "https://api.github.com/users/KadirErturk4r/orgs",
"repos_url": "https://api.github.com/users/KadirErturk4r/repos",
"events_url": "https://api.github.com/users/KadirErturk4r/events{/privacy}",
"received_events_url": "https://api.github.com/users/KadirErturk4r/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1144/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1144/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6787
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6787/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6787/comments
|
https://api.github.com/repos/ollama/ollama/issues/6787/events
|
https://github.com/ollama/ollama/issues/6787
| 2,523,940,488
|
I_kwDOJ0Z1Ps6WcEaI
| 6,787
|
Support googles new "DataGemma" model
|
{
"login": "muehlburger",
"id": 141285,
"node_id": "MDQ6VXNlcjE0MTI4NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/141285?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/muehlburger",
"html_url": "https://github.com/muehlburger",
"followers_url": "https://api.github.com/users/muehlburger/followers",
"following_url": "https://api.github.com/users/muehlburger/following{/other_user}",
"gists_url": "https://api.github.com/users/muehlburger/gists{/gist_id}",
"starred_url": "https://api.github.com/users/muehlburger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/muehlburger/subscriptions",
"organizations_url": "https://api.github.com/users/muehlburger/orgs",
"repos_url": "https://api.github.com/users/muehlburger/repos",
"events_url": "https://api.github.com/users/muehlburger/events{/privacy}",
"received_events_url": "https://api.github.com/users/muehlburger/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-09-13T05:59:15
| 2024-09-13T05:59:15
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Google just released DataGemma, a new model grounded in real-world data:
https://blog.google/technology/ai/google-datagemma-ai-llm/
https://huggingface.co/collections/google/datagemma-release-66df7636084d2b150a4e6643
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6787/reactions",
"total_count": 6,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6787/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3476
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3476/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3476/comments
|
https://api.github.com/repos/ollama/ollama/issues/3476/events
|
https://github.com/ollama/ollama/issues/3476
| 2,223,288,431
|
I_kwDOJ0Z1Ps6EhLBv
| 3,476
|
Vision models (bak/llava) crash and exit Ollama docker container
|
{
"login": "RELNO",
"id": 14296774,
"node_id": "MDQ6VXNlcjE0Mjk2Nzc0",
"avatar_url": "https://avatars.githubusercontent.com/u/14296774?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RELNO",
"html_url": "https://github.com/RELNO",
"followers_url": "https://api.github.com/users/RELNO/followers",
"following_url": "https://api.github.com/users/RELNO/following{/other_user}",
"gists_url": "https://api.github.com/users/RELNO/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RELNO/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RELNO/subscriptions",
"organizations_url": "https://api.github.com/users/RELNO/orgs",
"repos_url": "https://api.github.com/users/RELNO/repos",
"events_url": "https://api.github.com/users/RELNO/events{/privacy}",
"received_events_url": "https://api.github.com/users/RELNO/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-04-03T15:46:44
| 2024-07-25T00:24:30
| 2024-07-25T00:24:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When running vision models such as bakllava or llava (all sizes), the docker container is crashing, returning the following error: `RemoteProtocolError: Server disconnected without sending a response.`
Note: llama2 and other LLMs run well, e.g.:
```
completion = ollama.chat(
model='llama2', messages=[{"role": "user", "content": 'What is the capital of France?'}])
print(completion)
{'model': 'llama2', 'created_at': '2024-04-03T15:30:45.152975648Z', 'message': {'role': 'assistant', 'content': '\nThe capital of France is Paris.'}, 'done': True, 'total_duration': 4938479062, 'load_duration': 2702775270, 'prompt_eval_count': 27, 'prompt_eval_duration': 1461042000, 'eval_count': 9, 'eval_duration': 773529000}
```
The Docker logs [shortened; 100+ goroutines omitted]:
```
goroutine 100 gp=0xc000702fc0 m=34 mp=0xc000b01008 [syscall]:
runtime.cgocall(0xebebf0, 0xc0000485e0)
runtime/cgocall.go:157 +0x4b fp=0xc0000485b8 sp=0xc000048580 pc=0x40a72b
github.com/ollama/ollama/llm._Cfunc_dyn_llama_server_init({0x7f8bdc003f00, 0x7f8bce0a5450, 0x7f8bce09c720, 0x7f8bce0a48a0, 0x7f8bce0abb40, 0x7f8bce0a3250, 0x7f8bce09e840, 0x7f8bce09c7a0, 0x7f8bce0ac580, 0x7f8bce0ab6e0, ...}, ...)
_cgo_gotypes.go:286 +0x45 fp=0xc0000485e0 sp=0xc0000485b8 pc=0xce9345
github.com/ollama/ollama/llm.newDynExtServer.func7(0x200?, 0xc000458000, 0xc000a69ce0)
github.com/ollama/ollama/llm/dyn_ext_server.go:151 +0x112 fp=0xc000048720 sp=0xc0000485e0 pc=0xceaaf2
github.com/ollama/ollama/llm.newDynExtServer({0xc00062c000, 0x36}, {0xc00044a2a0, _}, {_, _, _}, {0xc000931910, 0x1, 0x1}, ...)
github.com/ollama/ollama/llm/dyn_ext_server.go:151 +0xb58 fp=0xc000048a80 sp=0xc000048720 pc=0xcea638
github.com/ollama/ollama/llm.newLlmServer({{_, _, _}, {_, _}, {_, _}}, {_, _}, {0x0, ...}, ...)
github.com/ollama/ollama/llm/llm.go:166 +0x4c5 fp=0xc000048c40 sp=0xc000048a80 pc=0xce6b25
github.com/ollama/ollama/llm.New({0xc00044a2a0, 0x62}, {0x0, 0x0, 0x0}, {0xc000931910, _, _}, {{0x0, 0x1000, ...}, ...})
github.com/ollama/ollama/llm/llm.go:131 +0x90e fp=0xc000048ed8 sp=0xc000048c40 pc=0xce64ce
github.com/ollama/ollama/server.load(0xc000003080?, 0xc000003080, {{0x0, 0x1000, 0x200, 0x1, 0xffffffffffffffff, 0x0, 0x0, 0x1, ...}, ...}, ...)
github.com/ollama/ollama/server/routes.go:86 +0x325 fp=0xc000049028 sp=0xc000048ed8 pc=0xe973a5
github.com/ollama/ollama/server.ChatHandler(0xc00023a100)
github.com/ollama/ollama/server/routes.go:1272 +0xa4b fp=0xc000049730 sp=0xc000049028 pc=0xea310b
github.com/gin-gonic/gin.(*Context).Next(0xc00023a100)
github.com/gin-gonic/gin@v1.9.1/context.go:174 +0x2b fp=0xc000049750 sp=0xc000049730 pc=0xe6ab8b
github.com/ollama/ollama/server.(*Server).GenerateRoutes.allowedHostsMiddleware.func3(0xc00023a100)
github.com/ollama/ollama/server/routes.go:990 +0x15d fp=0xc0000497a8 sp=0xc000049750 pc=0xea187d
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.CustomRecoveryWithWriter.func1(0xc00023a100)
github.com/gin-gonic/gin@v1.9.1/recovery.go:102 +0x7a fp=0xc0000497f8 sp=0xc0000497a8 pc=0xe77a7a
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.LoggerWithConfig.func1(0xc00023a100)
github.com/gin-gonic/gin@v1.9.1/logger.go:240 +0xdd fp=0xc0000499a8 sp=0xc0000497f8 pc=0xe76bbd
github.com/gin-gonic/gin.(*Context).Next(...)
github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.(*Engine).handleHTTPRequest(0xc0001c24e0, 0xc00023a100)
github.com/gin-gonic/gin@v1.9.1/gin.go:620 +0x66e fp=0xc000049b28 sp=0xc0000499a8 pc=0xe760ae
github.com/gin-gonic/gin.(*Engine).ServeHTTP(0xc0001c24e0, {0x11872730, 0xc0005caa80}, 0xc000613560)
github.com/gin-gonic/gin@v1.9.1/gin.go:576 +0x1b2 fp=0xc000049b60 sp=0xc000049b28 pc=0xe75872
net/http.serverHandler.ServeHTTP({0x11870610?}, {0x11872730?, 0xc0005caa80?}, 0x6?)
net/http/server.go:3137 +0x8e fp=0xc000049b90 sp=0xc000049b60 pc=0x70028e
net/http.(*conn).serve(0xc0001fcc60, {0x11874ae8, 0xc00060e570})
net/http/server.go:2039 +0x5e8 fp=0xc000049fb8 sp=0xc000049b90 pc=0x6fb648
net/http.(*Server).Serve.gowrap3()
net/http/server.go:3285 +0x28 fp=0xc000049fe0 sp=0xc000049fb8 pc=0x700aa8
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc000049fe8 sp=0xc000049fe0 pc=0x4742e1
created by net/http.(*Server).Serve in goroutine 1
net/http/server.go:3285 +0x4b4
goroutine 1 gp=0xc0000061c0 m=nil [IO wait, 1 minutes]:
runtime.gopark(0xc00004e008?, 0x0?, 0xc0?, 0x61?, 0xc0007d9868?)
runtime/proc.go:402 +0xce fp=0xc0007d9830 sp=0xc0007d9810 pc=0x44160e
runtime.netpollblock(0xc0007d98c8?, 0x409ec6?, 0x0?)
runtime/netpoll.go:573 +0xf7 fp=0xc0007d9868 sp=0xc0007d9830 pc=0x43a377
internal/poll.runtime_pollWait(0x7f8cbbfc66d0, 0x72)
runtime/netpoll.go:345 +0x85 fp=0xc0007d9888 sp=0xc0007d9868 pc=0x46e9e5
internal/poll.(*pollDesc).wait(0x3?, 0x1?, 0x0)
internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc0007d98b0 sp=0xc0007d9888 pc=0x503707
internal/poll.(*pollDesc).waitRead(...)
internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0xc0005b4600)
internal/poll/fd_unix.go:611 +0x2ac fp=0xc0007d9958 sp=0xc0007d98b0 pc=0x508aac
net.(*netFD).accept(0xc0005b4600)
net/fd_unix.go:172 +0x29 fp=0xc0007d9a10 sp=0xc0007d9958 pc=0x598f89
net.(*TCPListener).accept(0xc00049a820)
net/tcpsock_posix.go:159 +0x1e fp=0xc0007d9a38 sp=0xc0007d9a10 pc=0x5aeebe
net.(*TCPListener).Accept(0xc00049a820)
net/tcpsock.go:327 +0x30 fp=0xc0007d9a68 sp=0xc0007d9a38 pc=0x5ae0b0
net/http.(*onceCloseListener).Accept(0xc0001fcc60?)
<autogenerated>:1 +0x24 fp=0xc0007d9a80 sp=0xc0007d9a68 pc=0x722ce4
net/http.(*Server).Serve(0xc0005cc690, {0x118724c0, 0xc00049a820})
net/http/server.go:3255 +0x33e fp=0xc0007d9bb0 sp=0xc0007d9a80 pc=0x7006be
github.com/ollama/ollama/server.Serve({0x118724c0, 0xc00049a820})
github.com/ollama/ollama/server/routes.go:1145 +0x4eb fp=0xc0007d9cc0 sp=0xc0007d9bb0 pc=0xea1dab
github.com/ollama/ollama/cmd.RunServer(0xc00023ae00?, {0x11fc38c0?, 0x4?, 0x1055b75?})
github.com/ollama/ollama/cmd/cmd.go:787 +0x1b9 fp=0xc0007d9d58 sp=0xc0007d9cc0 pc=0xeb5b59
github.com/spf13/cobra.(*Command).execute(0xc0004e0f08, {0x11fc38c0, 0x0, 0x0})
github.com/spf13/cobra@v1.7.0/command.go:940 +0x882 fp=0xc0007d9e78 sp=0xc0007d9d58 pc=0x795c62
github.com/spf13/cobra.(*Command).ExecuteC(0xc0004e0308)
github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5 fp=0xc0007d9f30 sp=0xc0007d9e78 pc=0x7964a5
github.com/spf13/cobra.(*Command).Execute(...)
github.com/spf13/cobra@v1.7.0/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(...)
github.com/spf13/cobra@v1.7.0/command.go:985
main.main()
github.com/ollama/ollama/main.go:11 +0x4d fp=0xc0007d9f50 sp=0xc0007d9f30 pc=0xebdd0d
runtime.main()
runtime/proc.go:271 +0x29d fp=0xc0007d9fe0 sp=0xc0007d9f50 pc=0x4411dd
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0007d9fe8 sp=0xc0007d9fe0 pc=0x4742e1
...
goroutine 101 gp=0xc001284700 m=nil [IO wait, 1 minutes]:
runtime.gopark(0x40?, 0x3f8ccccd3f4ccccd?, 0x0?, 0x0?, 0xb?)
runtime/proc.go:402 +0xce fp=0xc0004cb5a8 sp=0xc0004cb588 pc=0x44160e
runtime.netpollblock(0x4866f8?, 0x409ec6?, 0x0?)
runtime/netpoll.go:573 +0xf7 fp=0xc0004cb5e0 sp=0xc0004cb5a8 pc=0x43a377
internal/poll.runtime_pollWait(0x7f8cbbfc65d8, 0x72)
runtime/netpoll.go:345 +0x85 fp=0xc0004cb600 sp=0xc0004cb5e0 pc=0x46e9e5
internal/poll.(*pollDesc).wait(0xc0005b4180?, 0xc000acca61?, 0x0)
internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc0004cb628 sp=0xc0004cb600 pc=0x503707
internal/poll.(*pollDesc).waitRead(...)
internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc0005b4180, {0xc000acca61, 0x1, 0x1})
internal/poll/fd_unix.go:164 +0x27a fp=0xc0004cb6c0 sp=0xc0004cb628 pc=0x5049fa
net.(*netFD).Read(0xc0005b4180, {0xc000acca61?, 0x40?, 0x3f8ccccd3f4ccccd?})
net/fd_posix.go:55 +0x25 fp=0xc0004cb708 sp=0xc0004cb6c0 pc=0x596fa5
net.(*conn).Read(0xc0000c6738, {0xc000acca61?, 0x0?, 0xc0004cb7d0?})
net/net.go:179 +0x45 fp=0xc0004cb750 sp=0xc0004cb708 pc=0x5a5e05
net.(*TCPConn).Read(0x0?, {0xc000acca61?, 0x11f5d540?, 0xea4be0?})
<autogenerated>:1 +0x25 fp=0xc0004cb780 sp=0xc0004cb750 pc=0x5b7485
net/http.(*connReader).backgroundRead(0xc000acca50)
net/http/server.go:681 +0x37 fp=0xc0004cb7c8 sp=0xc0004cb780 pc=0x6f55b7
net/http.(*connReader).startBackgroundRead.gowrap2()
net/http/server.go:677 +0x25 fp=0xc0004cb7e0 sp=0xc0004cb7c8 pc=0x6f54e5
runtime.goexit({})
runtime/asm_amd64.s:1695 +0x1 fp=0xc0004cb7e8 sp=0xc0004cb7e0 pc=0x4742e1
created by net/http.(*connReader).startBackgroundRead in goroutine 100
net/http/server.go:677 +0xba
```
### What did you expect to see?
Vision models should work as expected.
### Steps to reproduce
```
docker pull ollama/ollama
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
docker exec -it ollama ollama run llama2
docker exec -it ollama ollama run bakllava   # <-- crashes the container here
```
-- or in python --
```
ollama.chat(
model="bakllava",
messages=[
{
'role': 'user',
'content': 'how to make a cake',
'images': []
}
]
)
```
### Are there any recent changes that introduced the issue?
No; it was running well a couple of days ago, with no updates to any of the pieces.
### OS
Linux
### Architecture
_No response_
### Platform
Docker
### Ollama version
ollama version is 0.1.30
### GPU
Nvidia
### GPU info
4 X NVIDIA GeForce GTX TITAN X @ 12gb
### CPU
Intel
### Other software
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3476/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3476/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3602
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3602/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3602/comments
|
https://api.github.com/repos/ollama/ollama/issues/3602/events
|
https://github.com/ollama/ollama/pull/3602
| 2,238,410,253
|
PR_kwDOJ0Z1Ps5sZb_U
| 3,602
|
examples: Update imports for deprecation
|
{
"login": "erikos",
"id": 3714785,
"node_id": "MDQ6VXNlcjM3MTQ3ODU=",
"avatar_url": "https://avatars.githubusercontent.com/u/3714785?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/erikos",
"html_url": "https://github.com/erikos",
"followers_url": "https://api.github.com/users/erikos/followers",
"following_url": "https://api.github.com/users/erikos/following{/other_user}",
"gists_url": "https://api.github.com/users/erikos/gists{/gist_id}",
"starred_url": "https://api.github.com/users/erikos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/erikos/subscriptions",
"organizations_url": "https://api.github.com/users/erikos/orgs",
"repos_url": "https://api.github.com/users/erikos/repos",
"events_url": "https://api.github.com/users/erikos/events{/privacy}",
"received_events_url": "https://api.github.com/users/erikos/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-11T19:26:29
| 2024-12-29T19:36:25
| 2024-12-29T19:36:25
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3602",
"html_url": "https://github.com/ollama/ollama/pull/3602",
"diff_url": "https://github.com/ollama/ollama/pull/3602.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3602.patch",
"merged_at": "2024-12-29T19:36:25"
}
|
From the error message: Importing from langchain will no longer be supported as of langchain 0.2.0. Please import from langchain-community instead.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3602/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3602/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5308
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5308/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5308/comments
|
https://api.github.com/repos/ollama/ollama/issues/5308/events
|
https://github.com/ollama/ollama/pull/5308
| 2,376,069,278
|
PR_kwDOJ0Z1Ps5zrJfV
| 5,308
|
Update README.md to Portuguese Brazilian and Optimized the image files of the project
|
{
"login": "ItaloGustavoS",
"id": 42496107,
"node_id": "MDQ6VXNlcjQyNDk2MTA3",
"avatar_url": "https://avatars.githubusercontent.com/u/42496107?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ItaloGustavoS",
"html_url": "https://github.com/ItaloGustavoS",
"followers_url": "https://api.github.com/users/ItaloGustavoS/followers",
"following_url": "https://api.github.com/users/ItaloGustavoS/following{/other_user}",
"gists_url": "https://api.github.com/users/ItaloGustavoS/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ItaloGustavoS/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ItaloGustavoS/subscriptions",
"organizations_url": "https://api.github.com/users/ItaloGustavoS/orgs",
"repos_url": "https://api.github.com/users/ItaloGustavoS/repos",
"events_url": "https://api.github.com/users/ItaloGustavoS/events{/privacy}",
"received_events_url": "https://api.github.com/users/ItaloGustavoS/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-26T19:01:01
| 2024-07-11T14:01:24
| 2024-07-11T14:01:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5308",
"html_url": "https://github.com/ollama/ollama/pull/5308",
"diff_url": "https://github.com/ollama/ollama/pull/5308.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5308.patch",
"merged_at": null
}
|
[ImgBot] Optimize images
*Total -- 463.96kb -> 380.99kb (17.88%)
/examples/modelfile-mario/logo.png -- 445.60kb -> 362.69kb (18.61%)
/macapp/assets/iconTemplate@2x.png -- 0.87kb -> 0.84kb (3.82%)
/macapp/assets/iconUpdateTemplate@2x.png -- 0.82kb -> 0.81kb (1.42%)
/macapp/src/ollama.svg -- 16.66kb -> 16.65kb (0.05%)
Signed-off-by: ImgBotApp <ImgBotHelp@gmail.com>
|
{
"login": "ItaloGustavoS",
"id": 42496107,
"node_id": "MDQ6VXNlcjQyNDk2MTA3",
"avatar_url": "https://avatars.githubusercontent.com/u/42496107?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ItaloGustavoS",
"html_url": "https://github.com/ItaloGustavoS",
"followers_url": "https://api.github.com/users/ItaloGustavoS/followers",
"following_url": "https://api.github.com/users/ItaloGustavoS/following{/other_user}",
"gists_url": "https://api.github.com/users/ItaloGustavoS/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ItaloGustavoS/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ItaloGustavoS/subscriptions",
"organizations_url": "https://api.github.com/users/ItaloGustavoS/orgs",
"repos_url": "https://api.github.com/users/ItaloGustavoS/repos",
"events_url": "https://api.github.com/users/ItaloGustavoS/events{/privacy}",
"received_events_url": "https://api.github.com/users/ItaloGustavoS/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5308/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5308/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3375
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3375/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3375/comments
|
https://api.github.com/repos/ollama/ollama/issues/3375/events
|
https://github.com/ollama/ollama/pull/3375
| 2,211,703,858
|
PR_kwDOJ0Z1Ps5q-SVg
| 3,375
|
only generate cuda/rocm when changes to llm detected
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-27T19:09:41
| 2024-03-27T19:40:56
| 2024-03-27T19:40:55
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3375",
"html_url": "https://github.com/ollama/ollama/pull/3375",
"diff_url": "https://github.com/ollama/ollama/pull/3375.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3375.patch",
"merged_at": "2024-03-27T19:40:55"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3375/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3375/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5568
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5568/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5568/comments
|
https://api.github.com/repos/ollama/ollama/issues/5568/events
|
https://github.com/ollama/ollama/pull/5568
| 2,397,609,616
|
PR_kwDOJ0Z1Ps50zRdo
| 5,568
|
Fix cannot range over constant error
|
{
"login": "tiendreiliass0x",
"id": 167472499,
"node_id": "U_kgDOCfttcw",
"avatar_url": "https://avatars.githubusercontent.com/u/167472499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tiendreiliass0x",
"html_url": "https://github.com/tiendreiliass0x",
"followers_url": "https://api.github.com/users/tiendreiliass0x/followers",
"following_url": "https://api.github.com/users/tiendreiliass0x/following{/other_user}",
"gists_url": "https://api.github.com/users/tiendreiliass0x/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tiendreiliass0x/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tiendreiliass0x/subscriptions",
"organizations_url": "https://api.github.com/users/tiendreiliass0x/orgs",
"repos_url": "https://api.github.com/users/tiendreiliass0x/repos",
"events_url": "https://api.github.com/users/tiendreiliass0x/events{/privacy}",
"received_events_url": "https://api.github.com/users/tiendreiliass0x/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-07-09T09:09:32
| 2024-07-30T23:03:08
| 2024-07-30T23:03:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5568",
"html_url": "https://github.com/ollama/ollama/pull/5568",
"diff_url": "https://github.com/ollama/ollama/pull/5568.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5568.patch",
"merged_at": null
}
|
`for try := range maxRetries {` -> `for try := 0; try < maxRetries; try++ {`
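For context, the `range`-over-an-integer form only compiles on Go 1.22 and later; older toolchains reject it with a "cannot range over" error. A minimal sketch of the portable equivalent (the function and variable names here are illustrative, not taken from the PR):

```go
package main

import "fmt"

// attempts collects the loop indices using the classic three-clause
// form, which compiles on any Go toolchain. On Go 1.22+ the body
// could instead be written `for try := range maxRetries { ... }`.
func attempts(maxRetries int) []int {
	var tries []int
	for try := 0; try < maxRetries; try++ {
		tries = append(tries, try)
	}
	return tries
}

func main() {
	fmt.Println(attempts(3)) // indices 0 through 2
}
```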
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5568/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5568/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1263
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1263/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1263/comments
|
https://api.github.com/repos/ollama/ollama/issues/1263/events
|
https://github.com/ollama/ollama/issues/1263
| 2,009,331,611
|
I_kwDOJ0Z1Ps53w_eb
| 1,263
|
Running Ollama for Orca2:13b on Limited VRAM: Model Configuration and Quantization Inquiry
|
{
"login": "derekhsu",
"id": 2805401,
"node_id": "MDQ6VXNlcjI4MDU0MDE=",
"avatar_url": "https://avatars.githubusercontent.com/u/2805401?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/derekhsu",
"html_url": "https://github.com/derekhsu",
"followers_url": "https://api.github.com/users/derekhsu/followers",
"following_url": "https://api.github.com/users/derekhsu/following{/other_user}",
"gists_url": "https://api.github.com/users/derekhsu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/derekhsu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/derekhsu/subscriptions",
"organizations_url": "https://api.github.com/users/derekhsu/orgs",
"repos_url": "https://api.github.com/users/derekhsu/repos",
"events_url": "https://api.github.com/users/derekhsu/events{/privacy}",
"received_events_url": "https://api.github.com/users/derekhsu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-24T08:46:51
| 2024-03-12T20:20:04
| 2024-03-12T20:20:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I successfully ran Ollama for Orca2:13b on my local machine, which has only 16GB of VRAM. However, I encountered difficulties running the Orca-2-13b model downloaded from Hugging Face due to insufficient VRAM. What model configuration does Ollama use to run Orca2:13b? Also, is it quantized?
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1263/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1263/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6077
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6077/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6077/comments
|
https://api.github.com/repos/ollama/ollama/issues/6077/events
|
https://github.com/ollama/ollama/issues/6077
| 2,438,172,162
|
I_kwDOJ0Z1Ps6RU44C
| 6,077
|
Ollama Services run on tcp6
|
{
"login": "niravgshah28",
"id": 40138337,
"node_id": "MDQ6VXNlcjQwMTM4MzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/40138337?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/niravgshah28",
"html_url": "https://github.com/niravgshah28",
"followers_url": "https://api.github.com/users/niravgshah28/followers",
"following_url": "https://api.github.com/users/niravgshah28/following{/other_user}",
"gists_url": "https://api.github.com/users/niravgshah28/gists{/gist_id}",
"starred_url": "https://api.github.com/users/niravgshah28/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/niravgshah28/subscriptions",
"organizations_url": "https://api.github.com/users/niravgshah28/orgs",
"repos_url": "https://api.github.com/users/niravgshah28/repos",
"events_url": "https://api.github.com/users/niravgshah28/events{/privacy}",
"received_events_url": "https://api.github.com/users/niravgshah28/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-30T15:56:47
| 2024-12-14T14:03:22
| 2024-08-09T21:40:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi
I am trying to run the Ollama service with the Meta Llama 3 70B model, but when I start the service on my VM it still listens on tcp6, even though OLLAMA_HOST is set with a port number in the config file.
Is anyone else facing a similar issue on a RHEL 8 VM?
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
Ollama version 0.2.5
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6077/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6077/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3668
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3668/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3668/comments
|
https://api.github.com/repos/ollama/ollama/issues/3668/events
|
https://github.com/ollama/ollama/issues/3668
| 2,245,171,621
|
I_kwDOJ0Z1Ps6F0pml
| 3,668
|
wizardlm2:8x22b-q4_0 is failing to generate on M3 max
|
{
"login": "xansrnitu",
"id": 63858582,
"node_id": "MDQ6VXNlcjYzODU4NTgy",
"avatar_url": "https://avatars.githubusercontent.com/u/63858582?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xansrnitu",
"html_url": "https://github.com/xansrnitu",
"followers_url": "https://api.github.com/users/xansrnitu/followers",
"following_url": "https://api.github.com/users/xansrnitu/following{/other_user}",
"gists_url": "https://api.github.com/users/xansrnitu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xansrnitu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xansrnitu/subscriptions",
"organizations_url": "https://api.github.com/users/xansrnitu/orgs",
"repos_url": "https://api.github.com/users/xansrnitu/repos",
"events_url": "https://api.github.com/users/xansrnitu/events{/privacy}",
"received_events_url": "https://api.github.com/users/xansrnitu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-04-16T06:09:59
| 2024-04-16T16:23:44
| 2024-04-16T16:23:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am trying to run wizardlm2:8x22b-q4_0 on an M3 Max machine. However, it is not generating meaningful tokens for the given prompt.
<img width="520" alt="Screenshot 2024-04-16 at 11 35 33 AM" src="https://github.com/ollama/ollama/assets/63858582/2ceeaff6-dfe2-4c8b-b63a-a33ac9ecb1d2">
ollama version is 0.1.32
### What did you expect to see?
_No response_
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
macOS
### Architecture
arm64
### Platform
_No response_
### Ollama version
0.1.32
### GPU
Apple
### GPU info
M3 Max 128GB
### CPU
Apple
### Other software
_No response_
|
{
"login": "xansrnitu",
"id": 63858582,
"node_id": "MDQ6VXNlcjYzODU4NTgy",
"avatar_url": "https://avatars.githubusercontent.com/u/63858582?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xansrnitu",
"html_url": "https://github.com/xansrnitu",
"followers_url": "https://api.github.com/users/xansrnitu/followers",
"following_url": "https://api.github.com/users/xansrnitu/following{/other_user}",
"gists_url": "https://api.github.com/users/xansrnitu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xansrnitu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xansrnitu/subscriptions",
"organizations_url": "https://api.github.com/users/xansrnitu/orgs",
"repos_url": "https://api.github.com/users/xansrnitu/repos",
"events_url": "https://api.github.com/users/xansrnitu/events{/privacy}",
"received_events_url": "https://api.github.com/users/xansrnitu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3668/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3668/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3957
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3957/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3957/comments
|
https://api.github.com/repos/ollama/ollama/issues/3957/events
|
https://github.com/ollama/ollama/pull/3957
| 2,266,494,122
|
PR_kwDOJ0Z1Ps5t4Y_R
| 3,957
|
types/model: export ParseNameBare and Merge
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-26T20:54:58
| 2024-04-26T21:58:08
| 2024-04-26T21:58:07
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3957",
"html_url": "https://github.com/ollama/ollama/pull/3957",
"diff_url": "https://github.com/ollama/ollama/pull/3957.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3957.patch",
"merged_at": "2024-04-26T21:58:07"
}
|
These are useful outside this package.
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3957/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3957/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7105
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7105/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7105/comments
|
https://api.github.com/repos/ollama/ollama/issues/7105/events
|
https://github.com/ollama/ollama/issues/7105
| 2,567,677,203
|
I_kwDOJ0Z1Ps6ZC6UT
| 7,105
|
Embed list of string error. Is the /embed endpoint deprecated?
|
{
"login": "firdausai",
"id": 44670022,
"node_id": "MDQ6VXNlcjQ0NjcwMDIy",
"avatar_url": "https://avatars.githubusercontent.com/u/44670022?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/firdausai",
"html_url": "https://github.com/firdausai",
"followers_url": "https://api.github.com/users/firdausai/followers",
"following_url": "https://api.github.com/users/firdausai/following{/other_user}",
"gists_url": "https://api.github.com/users/firdausai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/firdausai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/firdausai/subscriptions",
"organizations_url": "https://api.github.com/users/firdausai/orgs",
"repos_url": "https://api.github.com/users/firdausai/repos",
"events_url": "https://api.github.com/users/firdausai/events{/privacy}",
"received_events_url": "https://api.github.com/users/firdausai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 1
| 2024-10-05T05:51:17
| 2024-11-06T00:03:54
| 2024-11-06T00:03:48
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
This works
```
curl http://localhost:11434/api/embeddings -d '{
"model": "mxbai-embed-large",
"prompt": "Llamas are members of the camelid family"
}'
```
But this does not work, returns empty list of embeds
```
curl http://localhost:11434/api/embed -d '{
"model": "mxbai-embed-large",
"prompt": ["Llamas are members","of the camelid family"]
}'
{"model":"mxbai-embed-large","embeddings":[]}%
```
Is the /embed endpoint not supported anymore?
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.10
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7105/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7105/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5666
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5666/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5666/comments
|
https://api.github.com/repos/ollama/ollama/issues/5666/events
|
https://github.com/ollama/ollama/issues/5666
| 2,406,744,397
|
I_kwDOJ0Z1Ps6PdAFN
| 5,666
|
Curious, what's the correct TEMPLATE parameter for the Google Gemma model, in the context of a Modelfile? I am converting GGUF to Ollama myself using the command "ollama create xxx -f xxx"
|
{
"login": "Astroherodvaipayan",
"id": 105009701,
"node_id": "U_kgDOBkJSJQ",
"avatar_url": "https://avatars.githubusercontent.com/u/105009701?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Astroherodvaipayan",
"html_url": "https://github.com/Astroherodvaipayan",
"followers_url": "https://api.github.com/users/Astroherodvaipayan/followers",
"following_url": "https://api.github.com/users/Astroherodvaipayan/following{/other_user}",
"gists_url": "https://api.github.com/users/Astroherodvaipayan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Astroherodvaipayan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Astroherodvaipayan/subscriptions",
"organizations_url": "https://api.github.com/users/Astroherodvaipayan/orgs",
"repos_url": "https://api.github.com/users/Astroherodvaipayan/repos",
"events_url": "https://api.github.com/users/Astroherodvaipayan/events{/privacy}",
"received_events_url": "https://api.github.com/users/Astroherodvaipayan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-07-13T07:04:29
| 2024-11-06T12:36:59
| 2024-11-06T12:36:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Curious, what's the correct TEMPLATE parameter for the Google Gemma model, in the context of a Modelfile? I am converting GGUF to Ollama myself using the command "ollama create xxx -f xxx".
The original Hugging Face repo chat_template is as follows:
```
{% if messages[0]['role'] == 'system' %}
{{ raise_exception('System role not supported') }}
{% endif %}
{% for message in messages %}
{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}
{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}
{% endif %}
{% if (message['role'] == 'assistant') %}
{% set role = 'model' %}
{% else %}
{% set role = message['role'] %}
{% endif %}
{{ '<start_of_turn>' + role + '\n' + message['content'] | trim + '<end_of_turn>\n' }}
{% endfor %}
{% if add_generation_prompt %}{{'<start_of_turn>model\n'}}{% endif %}
```
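A Modelfile TEMPLATE approximating that chat_template could look like the sketch below, written in the Go template syntax Ollama Modelfiles use. This is a hedged illustration, not the exact template shipped with the official gemma model; note that the original chat_template rejects a system role, so folding `.System` into the user turn is an assumption:

```
TEMPLATE """<start_of_turn>user
{{ if .System }}{{ .System }} {{ end }}{{ .Prompt }}<end_of_turn>
<start_of_turn>model
{{ .Response }}<end_of_turn>
"""
PARAMETER stop <start_of_turn>
PARAMETER stop <end_of_turn>
```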
_Originally posted by @nps798 in https://github.com/ollama/ollama/issues/2636#issuecomment-1957025970_
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5666/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5666/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8620
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8620/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8620/comments
|
https://api.github.com/repos/ollama/ollama/issues/8620/events
|
https://github.com/ollama/ollama/issues/8620
| 2,814,178,540
|
I_kwDOJ0Z1Ps6nvPTs
| 8,620
|
Add support for Qwen 2.5 VL models (3B, 7B and 32B) instruct versions
|
{
"login": "YarvixPA",
"id": 152553832,
"node_id": "U_kgDOCRfJaA",
"avatar_url": "https://avatars.githubusercontent.com/u/152553832?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YarvixPA",
"html_url": "https://github.com/YarvixPA",
"followers_url": "https://api.github.com/users/YarvixPA/followers",
"following_url": "https://api.github.com/users/YarvixPA/following{/other_user}",
"gists_url": "https://api.github.com/users/YarvixPA/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YarvixPA/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YarvixPA/subscriptions",
"organizations_url": "https://api.github.com/users/YarvixPA/orgs",
"repos_url": "https://api.github.com/users/YarvixPA/repos",
"events_url": "https://api.github.com/users/YarvixPA/events{/privacy}",
"received_events_url": "https://api.github.com/users/YarvixPA/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 9
| 2025-01-27T22:20:04
| 2025-01-30T11:41:36
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello, just hours after their release, I would like to suggest adding support for the Qwen2.5-VL models.
**[Qwen2.5-VL - Hugging Face collection](https://huggingface.co/collections/Qwen/qwen25-vl-6795ffac22b334a837c0f9a5)**
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8620/reactions",
"total_count": 13,
"+1": 13,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8620/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2495
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2495/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2495/comments
|
https://api.github.com/repos/ollama/ollama/issues/2495/events
|
https://github.com/ollama/ollama/issues/2495
| 2,134,771,810
|
I_kwDOJ0Z1Ps5_Pghi
| 2,495
|
Llama2: `q4_km` as default?
|
{
"login": "matthiasgeihs",
"id": 62935430,
"node_id": "MDQ6VXNlcjYyOTM1NDMw",
"avatar_url": "https://avatars.githubusercontent.com/u/62935430?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/matthiasgeihs",
"html_url": "https://github.com/matthiasgeihs",
"followers_url": "https://api.github.com/users/matthiasgeihs/followers",
"following_url": "https://api.github.com/users/matthiasgeihs/following{/other_user}",
"gists_url": "https://api.github.com/users/matthiasgeihs/gists{/gist_id}",
"starred_url": "https://api.github.com/users/matthiasgeihs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matthiasgeihs/subscriptions",
"organizations_url": "https://api.github.com/users/matthiasgeihs/orgs",
"repos_url": "https://api.github.com/users/matthiasgeihs/repos",
"events_url": "https://api.github.com/users/matthiasgeihs/events{/privacy}",
"received_events_url": "https://api.github.com/users/matthiasgeihs/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-02-14T16:53:03
| 2024-06-07T08:56:09
| 2024-05-17T01:22:10
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Just saw this table: https://github.com/ggerganov/llama.cpp/pull/1684#issuecomment-1579252501
Perplexity loss is considerably higher for `q4_0` compared to `q4_km`. `q4_km` provides the best tradeoff between size and performance, as noted also [here](https://huggingface.co/TheBloke/Mixtral-8x7B-v0.1-GGUF).
`ollama` currently uses `q4_0` as the default for `llama2`: https://ollama.com/library/llama2:latest (as observed by comparing the ID 78e26419b446).
**Suggestion:**
Use [`q4_km`](https://ollama.com/library/llama2:7b-chat-q4_K_M) as the default instead.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2495/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2495/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5026
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5026/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5026/comments
|
https://api.github.com/repos/ollama/ollama/issues/5026/events
|
https://github.com/ollama/ollama/issues/5026
| 2,351,705,927
|
I_kwDOJ0Z1Ps6MLC9H
| 5,026
|
Can I customize OLLAMA_TMPDIR?
|
{
"login": "prince21000",
"id": 132249561,
"node_id": "U_kgDOB-H32Q",
"avatar_url": "https://avatars.githubusercontent.com/u/132249561?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/prince21000",
"html_url": "https://github.com/prince21000",
"followers_url": "https://api.github.com/users/prince21000/followers",
"following_url": "https://api.github.com/users/prince21000/following{/other_user}",
"gists_url": "https://api.github.com/users/prince21000/gists{/gist_id}",
"starred_url": "https://api.github.com/users/prince21000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/prince21000/subscriptions",
"organizations_url": "https://api.github.com/users/prince21000/orgs",
"repos_url": "https://api.github.com/users/prince21000/repos",
"events_url": "https://api.github.com/users/prince21000/events{/privacy}",
"received_events_url": "https://api.github.com/users/prince21000/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-06-13T17:31:09
| 2024-10-23T17:09:10
| 2024-08-01T22:34:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I use a Modelfile to create a new model with `ollama create xxx -f xxx.modelfile`, an error occurs (error: /tmp/ollama-tfxxxxxxx no space left). I noticed that /tmp and /usr/share/ollama share the same root filesystem, which has no space left, so I changed OLLAMA_TMPDIR and OLLAMA_MODELS to a customized location.
In ollama.service, I added:
Environment = "OLLAMA_TMPDIR = /apprun/tmp"
Environment = "OLLAMA_MODELS = /apprun/models"
Then I ran sudo chmod -R 777 tmp and sudo chown -R root:root tmp (also tried sudo chown -R ollama:ollama tmp),
sudo chmod -R 777 models and sudo chown -R root:root models (also tried sudo chown -R ollama:ollama models),
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
sudo systemctl status ollama
Afterwards I found that my own private folders and files in /apprun were all missing, with only /apprun/tmp remaining. The /models folder only contains /blobs (but no /manifest); the models themselves are gone (it should have contained the model files I downloaded before).
Is this normal? Does restarting ollama.service delete all files under the folder where /tmp is located?
Do you have any suggestions for the "no space left" error on Linux when using `ollama create`? I can confirm that models download to the customized location when using `ollama pull`, but `ollama create` still uses the root filesystem where /usr/share is located, so I get the no-space error.
Thank you!
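As a side note, systemd keeps whatever is inside the quotes literally, so spaces around the inner `=` may not parse as an assignment the way one would expect. A drop-in that sets these variables would conventionally look like the following sketch (the override path is hypothetical; the directory paths are the ones from this report):

```
# /etc/systemd/system/ollama.service.d/override.conf (hypothetical path)
[Service]
Environment="OLLAMA_TMPDIR=/apprun/tmp"
Environment="OLLAMA_MODELS=/apprun/models"
```

After editing, `systemctl daemon-reload` and a service restart are still required, as described above.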
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.42
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5026/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5026/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3913
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3913/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3913/comments
|
https://api.github.com/repos/ollama/ollama/issues/3913/events
|
https://github.com/ollama/ollama/issues/3913
| 2,264,017,933
|
I_kwDOJ0Z1Ps6G8iwN
| 3,913
|
Add option to list/switch models with right click on sys tray icon
|
{
"login": "EaglePPP",
"id": 19349338,
"node_id": "MDQ6VXNlcjE5MzQ5MzM4",
"avatar_url": "https://avatars.githubusercontent.com/u/19349338?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/EaglePPP",
"html_url": "https://github.com/EaglePPP",
"followers_url": "https://api.github.com/users/EaglePPP/followers",
"following_url": "https://api.github.com/users/EaglePPP/following{/other_user}",
"gists_url": "https://api.github.com/users/EaglePPP/gists{/gist_id}",
"starred_url": "https://api.github.com/users/EaglePPP/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/EaglePPP/subscriptions",
"organizations_url": "https://api.github.com/users/EaglePPP/orgs",
"repos_url": "https://api.github.com/users/EaglePPP/repos",
"events_url": "https://api.github.com/users/EaglePPP/events{/privacy}",
"received_events_url": "https://api.github.com/users/EaglePPP/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2024-04-25T16:28:33
| 2024-04-25T16:30:44
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Heyy Llamas,
so I got this idea: how about adding an option to show which models we have with a simple right click on the llama icon?
here

yeah yeah I know _ollama list_ can show 'em all, but it's not convenient when I use Ollama's REST API with other programs rather than in the terminal.
Sometimes I forget which one is in my VRAM, that's it.
like this
llama3
llama2
Mixtral
• Phi3
Phi2
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3913/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3913/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4338
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4338/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4338/comments
|
https://api.github.com/repos/ollama/ollama/issues/4338/events
|
https://github.com/ollama/ollama/issues/4338
| 2,290,624,072
|
I_kwDOJ0Z1Ps6IiCZI
| 4,338
|
Is there a way to make the LLM model say something first, instead of waiting blankly for me to say something?
|
{
"login": "yaohwang",
"id": 11722528,
"node_id": "MDQ6VXNlcjExNzIyNTI4",
"avatar_url": "https://avatars.githubusercontent.com/u/11722528?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yaohwang",
"html_url": "https://github.com/yaohwang",
"followers_url": "https://api.github.com/users/yaohwang/followers",
"following_url": "https://api.github.com/users/yaohwang/following{/other_user}",
"gists_url": "https://api.github.com/users/yaohwang/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yaohwang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yaohwang/subscriptions",
"organizations_url": "https://api.github.com/users/yaohwang/orgs",
"repos_url": "https://api.github.com/users/yaohwang/repos",
"events_url": "https://api.github.com/users/yaohwang/events{/privacy}",
"received_events_url": "https://api.github.com/users/yaohwang/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2024-05-11T03:10:40
| 2024-05-11T03:10:40
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null | null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4338/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4338/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6547
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6547/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6547/comments
|
https://api.github.com/repos/ollama/ollama/issues/6547/events
|
https://github.com/ollama/ollama/pull/6547
| 2,493,002,215
|
PR_kwDOJ0Z1Ps55we4u
| 6,547
|
Optimize container images for startup
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-08-28T21:14:51
| 2024-09-12T19:10:33
| 2024-09-12T19:10:30
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6547",
"html_url": "https://github.com/ollama/ollama/pull/6547",
"diff_url": "https://github.com/ollama/ollama/pull/6547.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6547.patch",
"merged_at": "2024-09-12T19:10:30"
}
|
Replaces #6485
Move the payload handling logic to a discrete Go module so we can start to lay the foundation for toggling between the C++ and Go runner implementations at build time.
This change adjusts how runner payloads are handled to support container builds where we keep them extracted in the filesystem. This makes it easier to optimize the cpu/cuda vs cpu/rocm images for size, and should result in faster startup times for container images.
Looks like container startup time is down to ~100ms on a warm system.
The ROCm image is updated to use a base Ubuntu image with just our libraries. The official images and packages pull in compilers as dependencies, so this seems to be the optimal lean setup.
Fixes #6541
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6547/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6547/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2762
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2762/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2762/comments
|
https://api.github.com/repos/ollama/ollama/issues/2762/events
|
https://github.com/ollama/ollama/issues/2762
| 2,153,642,588
|
I_kwDOJ0Z1Ps6AXfpc
| 2,762
|
QuIP# support for better quantization and quality.
|
{
"login": "Mayorc1978",
"id": 32271686,
"node_id": "MDQ6VXNlcjMyMjcxNjg2",
"avatar_url": "https://avatars.githubusercontent.com/u/32271686?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Mayorc1978",
"html_url": "https://github.com/Mayorc1978",
"followers_url": "https://api.github.com/users/Mayorc1978/followers",
"following_url": "https://api.github.com/users/Mayorc1978/following{/other_user}",
"gists_url": "https://api.github.com/users/Mayorc1978/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Mayorc1978/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Mayorc1978/subscriptions",
"organizations_url": "https://api.github.com/users/Mayorc1978/orgs",
"repos_url": "https://api.github.com/users/Mayorc1978/repos",
"events_url": "https://api.github.com/users/Mayorc1978/events{/privacy}",
"received_events_url": "https://api.github.com/users/Mayorc1978/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2024-02-26T09:09:52
| 2024-02-26T09:09:52
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
QuIP# quantization methods show a drastic improvement, retaining model quality while quantizing aggressively — notably, **QuIP# 3-bit quantization achieves better results than theoretically lossless (FP16) 4-bit**.
I think supporting these new quantization methods would greatly benefit the Ollama community.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2762/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2762/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6913
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6913/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6913/comments
|
https://api.github.com/repos/ollama/ollama/issues/6913/events
|
https://github.com/ollama/ollama/issues/6913
| 2,541,987,694
|
I_kwDOJ0Z1Ps6Xg6du
| 6,913
|
On multi-GPU inference speed limited by performance of single CPU core
|
{
"login": "jonathankfmn",
"id": 80793551,
"node_id": "MDQ6VXNlcjgwNzkzNTUx",
"avatar_url": "https://avatars.githubusercontent.com/u/80793551?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jonathankfmn",
"html_url": "https://github.com/jonathankfmn",
"followers_url": "https://api.github.com/users/jonathankfmn/followers",
"following_url": "https://api.github.com/users/jonathankfmn/following{/other_user}",
"gists_url": "https://api.github.com/users/jonathankfmn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jonathankfmn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jonathankfmn/subscriptions",
"organizations_url": "https://api.github.com/users/jonathankfmn/orgs",
"repos_url": "https://api.github.com/users/jonathankfmn/repos",
"events_url": "https://api.github.com/users/jonathankfmn/events{/privacy}",
"received_events_url": "https://api.github.com/users/jonathankfmn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5808482718,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng",
"url": "https://api.github.com/repos/ollama/ollama/labels/performance",
"name": "performance",
"color": "A5B5C6",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-23T08:56:11
| 2025-01-07T23:59:17
| 2025-01-07T23:59:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?


I have a server with 4 RTX 4090 GPUs. When I run a model, all GPUs are utilized at 10-25%, while the CPU is at 100% but only using one thread. All layers are loaded onto the GPUs.
I already tried the `num_thread` parameter, but it does not change anything.
I believe the single-threaded CPU process is limiting the response speed. Is there a way to improve the speed by changing the configuration?
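For anyone landing here, `num_thread` is normally passed per-request through the API's `options` object. A minimal sketch, where the model name, prompt, and default endpoint are assumptions:

```python
import json
import urllib.request

# Build a generate request that pins the CPU thread count via the
# per-request "options" object (model name and prompt are placeholders).
payload = {
    "model": "llama3",
    "prompt": "hello",
    "stream": False,
    "options": {"num_thread": 8},  # CPU threads used for generation
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # default Ollama endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Sending it requires a running Ollama server:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

Note that, as the report above observes, this may not help: when all layers are offloaded to GPU, the remaining single hot CPU thread is typically the orchestration/sampling loop, which `num_thread` does not parallelize.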
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.3.11
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6913/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6913/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7983
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7983/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7983/comments
|
https://api.github.com/repos/ollama/ollama/issues/7983/events
|
https://github.com/ollama/ollama/pull/7983
| 2,724,336,702
|
PR_kwDOJ0Z1Ps6EZSqk
| 7,983
|
Add K/V cache quantization config to Modelfile (Follow-Up to PR #6279)
|
{
"login": "dmatora",
"id": 647062,
"node_id": "MDQ6VXNlcjY0NzA2Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/647062?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dmatora",
"html_url": "https://github.com/dmatora",
"followers_url": "https://api.github.com/users/dmatora/followers",
"following_url": "https://api.github.com/users/dmatora/following{/other_user}",
"gists_url": "https://api.github.com/users/dmatora/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dmatora/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dmatora/subscriptions",
"organizations_url": "https://api.github.com/users/dmatora/orgs",
"repos_url": "https://api.github.com/users/dmatora/repos",
"events_url": "https://api.github.com/users/dmatora/events{/privacy}",
"received_events_url": "https://api.github.com/users/dmatora/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 1
| 2024-12-07T04:41:39
| 2024-12-10T03:55:38
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7983",
"html_url": "https://github.com/ollama/ollama/pull/7983",
"diff_url": "https://github.com/ollama/ollama/pull/7983.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7983.patch",
"merged_at": null
}
|
This PR attempts to add support for setting the K/V cache quantization type directly from a model’s Modelfile, building on the recently merged K/V cache quantization feature introduced in PR #6279. The original contributor, @sammcj, spent months navigating architectural challenges and extensive review cycles to bring K/V cache quantization into Ollama. Given the complexity and the Ollama team’s reluctance to modify the API, @sammcj understandably decided not to pursue Modelfile support further.
The core issue is that introducing Modelfile parameters for `kv_cache_type` without changing the API would require hacky workarounds—such as writing the Modelfile’s `kv_cache_type` value into an environment variable in `server/routes.go:modelOptions` and then relying on `envconfig.KvCacheType()` downstream. Since the Ollama team has clearly indicated they’re not willing to expand or alter the API for this feature at this time, providing a clean, fully integrated solution is not feasible.
This PR, therefore, is not expected to be merged. Instead, it serves as a reference point or temporary measure for users who can’t wait for a more elegant, long-term architectural change and are willing to maintain a custom build of Ollama. It acknowledges the trade-offs and imperfection in the approach, hoping that future involvement from the Ollama team will enable a more robust solution eventually.
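Under a custom build with this patch applied, usage would presumably be an ordinary Modelfile parameter. The sketch below is hypothetical — the parameter name comes from the PR description and the value is an example, neither is part of any released Ollama version:

```
FROM llama3.1
PARAMETER kv_cache_type q8_0
```

This would pin the K/V cache quantization type per model instead of relying on the process-wide `OLLAMA_KV_CACHE_TYPE` environment variable introduced in PR #6279.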
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7983/reactions",
"total_count": 5,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 5,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7983/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2439
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2439/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2439/comments
|
https://api.github.com/repos/ollama/ollama/issues/2439/events
|
https://github.com/ollama/ollama/issues/2439
| 2,128,030,833
|
I_kwDOJ0Z1Ps5-1yxx
| 2,439
|
Feature
|
{
"login": "MikeyBeez",
"id": 14264000,
"node_id": "MDQ6VXNlcjE0MjY0MDAw",
"avatar_url": "https://avatars.githubusercontent.com/u/14264000?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MikeyBeez",
"html_url": "https://github.com/MikeyBeez",
"followers_url": "https://api.github.com/users/MikeyBeez/followers",
"following_url": "https://api.github.com/users/MikeyBeez/following{/other_user}",
"gists_url": "https://api.github.com/users/MikeyBeez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MikeyBeez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MikeyBeez/subscriptions",
"organizations_url": "https://api.github.com/users/MikeyBeez/orgs",
"repos_url": "https://api.github.com/users/MikeyBeez/repos",
"events_url": "https://api.github.com/users/MikeyBeez/events{/privacy}",
"received_events_url": "https://api.github.com/users/MikeyBeez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-02-10T01:20:26
| 2024-02-14T01:19:19
| 2024-02-14T01:19:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I wrote this code to chunk streaming into sentences. If you want to use it to enhance your streaming API call, please do.

```python
import nltk
from langchain_community.llms import Ollama

# ANSI escape codes for colors
NEON_GREEN = '\033[92m'
RESET_COLOR = '\033[0m'

# Download NLTK's PunktSentenceTokenizer
nltk.download('punkt')

llm = Ollama(model="llama2")
query = "Explain how large language models work"

sentence_buffer = ""
for chunk in llm.stream(query):
    # Directly process the chunk since it's a string
    sentence_buffer += chunk
    sentences = nltk.sent_tokenize(sentence_buffer)
    if len(sentences) > 1:
        # Print the completed sentence; keep the trailing partial one buffered
        print(NEON_GREEN + sentences[0] + RESET_COLOR)
        sentence_buffer = " ".join(sentences[1:])

# Print any remaining text at the end
if sentence_buffer:
    print(NEON_GREEN + sentence_buffer + RESET_COLOR)
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2439/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2439/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/5606
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5606/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5606/comments
|
https://api.github.com/repos/ollama/ollama/issues/5606/events
|
https://github.com/ollama/ollama/pull/5606
| 2,401,331,859
|
PR_kwDOJ0Z1Ps50_8EA
| 5,606
|
server/llama.cpp: stream quantize progress to client
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-10T17:42:13
| 2024-07-10T21:25:39
| 2024-07-10T21:25:39
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5606",
"html_url": "https://github.com/ollama/ollama/pull/5606",
"diff_url": "https://github.com/ollama/ollama/pull/5606.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5606.patch",
"merged_at": null
}
|
Added streaming to `Quantize` so tensor progress is streamed back during quantization.
Patched llama.cpp to invoke a callback that reports streaming progress:
`quantizing model tensors 69/195 ⠹ `
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5606/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5606/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3080
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3080/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3080/comments
|
https://api.github.com/repos/ollama/ollama/issues/3080/events
|
https://github.com/ollama/ollama/pull/3080
| 2,182,350,683
|
PR_kwDOJ0Z1Ps5patyC
| 3,080
|
token repeat limit for prediction requests
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-12T18:03:51
| 2024-03-13T02:08:26
| 2024-03-13T02:08:25
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3080",
"html_url": "https://github.com/ollama/ollama/pull/3080",
"diff_url": "https://github.com/ollama/ollama/pull/3080.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3080.patch",
"merged_at": "2024-03-13T02:08:25"
}
|
- abort prediction (generate/chat) requests when a token repeat limit is hit
- this prevents JSON format infinite loops
- this prevents a stuck request from starving the other queued requests
- move completion cancellation to its own function
Tested with this code from #1910
```python
import requests
import json
country = "france"
schema = {
"city": {
"type": "string",
"description": "Name of the city"
},
"lat":{
"type": "float",
"description": "Decimal Latitude of the city"
},
"lon":{
"type": "float",
"description": "Decimal Longitude of the city"
}
}
payload = {
"model": "mistral-no-repeat",
"messages": [
{"role": "system", "content": f"You are a helpful AI assistant. The user will enter a country name and the assistant will return the decimal latitude and decimal longitude of the capital of the country. Output in JSON using the schema defined here: {schema}."},
{"role": "user", "content": "japan"},
{"role": "assistant", "content": "{\"city\": \"Tokyo\", \"lat\": 35.6748, \"lon\": 139.7624}"},
{"role": "user", "content": country},
],
"format": "json",
"stream": False
}
response = requests.post("http://localhost:11434/api/chat", json=payload)
response.raise_for_status()
chat = response.json()
try:
message_content_json = json.loads(chat['message']['content'])
print(message_content_json)
except json.JSONDecodeError:
print("JSONDecodeError: The content is not in proper JSON format.")
```
output is more reliable, with occasional JSON format failures which can be handled with a retry:
```bash
bruce@Bruces-MBP triage % poetry run python3 make_json_request.py
{'city': 'Paris', ' lat ': 48.8566, ' lon': 2.3522}
bruce@Bruces-MBP triage % poetry run python3 make_json_request.py
{'city': 'Paris', 'lat': 48.8566, ' lon ': 2.3522}
bruce@Bruces-MBP triage % poetry run python3 make_json_request.py
{'city': 'Paris', 'lat': 48.8566, ' lon ': 2.3522}
bruce@Bruces-MBP triage % poetry run python3 make_json_request.py
{'city': 'Paris', ' lat': 48.8566, ' lon': 2.3522}
bruce@Bruces-MBP triage % poetry run python3 make_json_request.py
{'city': 'Paris', 'lat': 48.8566, ' lon ': 2.3522}
bruce@Bruces-MBP triage % poetry run python3 make_json_request.py
{'city': 'Paris', ' lat ': 48.8566, ' lon': 2.3522}
bruce@Bruces-MBP triage % poetry run python3 make_json_request.py
JSONDecodeError: The content is not in proper JSON format.
bruce@Bruces-MBP triage % poetry run python3 make_json_request.py
{'city': 'Paris', ' lat ': 48.8534, ' lon': 2.3522}
```
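The retry handling mentioned above can be sketched as a small wrapper. This is illustrative only; the `request_fn` callable and the attempt count are assumptions, not part of this PR:

```python
import json

def request_with_retry(request_fn, max_attempts=3):
    """Call request_fn() and retry when its result is not valid JSON.

    request_fn should return the raw model output as a string.
    """
    last_error = None
    for _ in range(max_attempts):
        raw = request_fn()
        try:
            return json.loads(raw)
        except json.JSONDecodeError as err:
            last_error = err
    raise last_error

# Example with a flaky producer that fails once, then succeeds.
attempts = iter(['not json', '{"city": "Paris", "lat": 48.8566, "lon": 2.3522}'])
result = request_with_retry(lambda: next(attempts))
print(result["city"])
```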
resolves #1910
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3080/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3080/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5974
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5974/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5974/comments
|
https://api.github.com/repos/ollama/ollama/issues/5974/events
|
https://github.com/ollama/ollama/issues/5974
| 2,431,692,490
|
I_kwDOJ0Z1Ps6Q8K7K
| 5,974
|
tls: failed to verify certificate: x509: certificate is valid for ollama.com, www.ollama.com, registry.ollama.com, not registry.ollama.ai
|
{
"login": "zmiimz",
"id": 1889372,
"node_id": "MDQ6VXNlcjE4ODkzNzI=",
"avatar_url": "https://avatars.githubusercontent.com/u/1889372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zmiimz",
"html_url": "https://github.com/zmiimz",
"followers_url": "https://api.github.com/users/zmiimz/followers",
"following_url": "https://api.github.com/users/zmiimz/following{/other_user}",
"gists_url": "https://api.github.com/users/zmiimz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zmiimz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zmiimz/subscriptions",
"organizations_url": "https://api.github.com/users/zmiimz/orgs",
"repos_url": "https://api.github.com/users/zmiimz/repos",
"events_url": "https://api.github.com/users/zmiimz/events{/privacy}",
"received_events_url": "https://api.github.com/users/zmiimz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-07-26T08:19:36
| 2024-07-26T08:27:31
| 2024-07-26T08:27:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Error message on attempt to pull some model:
>Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama3.1/manifests/latest": tls: failed to verify certificate: x509: certificate is valid for ollama.com, www.ollama.com, registry.ollama.com, not registry.ollama.ai
checked
>openssl s_client -connect registry.ollama.ai:443 -showcerts </dev/null 2>/dev/null | openssl x509 -noout -text | grep -A1 "Subject Alternative Name"
X509v3 Subject Alternative Name:
DNS:ollama.ai, DNS:*.ollama.ai
The linked library in the binary seems to ignore the SAN entry DNS:*.ollama.ai?
>./ollama-linux-amd64 --version
ollama version is 0.3.0
>openssl -v
OpenSSL 3.3.1 4 Jun 2024 (Library: OpenSSL 3.3.1 4 Jun 2024)
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.0, 0.2.8, 0.2.7
|
{
"login": "zmiimz",
"id": 1889372,
"node_id": "MDQ6VXNlcjE4ODkzNzI=",
"avatar_url": "https://avatars.githubusercontent.com/u/1889372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zmiimz",
"html_url": "https://github.com/zmiimz",
"followers_url": "https://api.github.com/users/zmiimz/followers",
"following_url": "https://api.github.com/users/zmiimz/following{/other_user}",
"gists_url": "https://api.github.com/users/zmiimz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zmiimz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zmiimz/subscriptions",
"organizations_url": "https://api.github.com/users/zmiimz/orgs",
"repos_url": "https://api.github.com/users/zmiimz/repos",
"events_url": "https://api.github.com/users/zmiimz/events{/privacy}",
"received_events_url": "https://api.github.com/users/zmiimz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5974/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5974/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1825
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1825/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1825/comments
|
https://api.github.com/repos/ollama/ollama/issues/1825/events
|
https://github.com/ollama/ollama/pull/1825
| 2,068,679,870
|
PR_kwDOJ0Z1Ps5jYiYI
| 1,825
|
feat: add support for min_p (resolve #1142)
|
{
"login": "Robitx",
"id": 8431097,
"node_id": "MDQ6VXNlcjg0MzEwOTc=",
"avatar_url": "https://avatars.githubusercontent.com/u/8431097?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Robitx",
"html_url": "https://github.com/Robitx",
"followers_url": "https://api.github.com/users/Robitx/followers",
"following_url": "https://api.github.com/users/Robitx/following{/other_user}",
"gists_url": "https://api.github.com/users/Robitx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Robitx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Robitx/subscriptions",
"organizations_url": "https://api.github.com/users/Robitx/orgs",
"repos_url": "https://api.github.com/users/Robitx/repos",
"events_url": "https://api.github.com/users/Robitx/events{/privacy}",
"received_events_url": "https://api.github.com/users/Robitx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 35
| 2024-01-06T15:28:07
| 2024-07-27T21:44:52
| 2024-07-27T21:37:40
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1825",
"html_url": "https://github.com/ollama/ollama/pull/1825",
"diff_url": "https://github.com/ollama/ollama/pull/1825.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1825.patch",
"merged_at": "2024-07-27T21:37:40"
}
|
Heavy lifting was done by https://github.com/ggerganov/llama.cpp/pull/3841; this PR just makes the option accessible.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1825/reactions",
"total_count": 20,
"+1": 18,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 2,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1825/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7736
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7736/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7736/comments
|
https://api.github.com/repos/ollama/ollama/issues/7736/events
|
https://github.com/ollama/ollama/issues/7736
| 2,671,149,673
|
I_kwDOJ0Z1Ps6fNoJp
| 7,736
|
SSL support (For Brave Leo AI & others)
|
{
"login": "NikosDi",
"id": 31016247,
"node_id": "MDQ6VXNlcjMxMDE2MjQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/31016247?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NikosDi",
"html_url": "https://github.com/NikosDi",
"followers_url": "https://api.github.com/users/NikosDi/followers",
"following_url": "https://api.github.com/users/NikosDi/following{/other_user}",
"gists_url": "https://api.github.com/users/NikosDi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NikosDi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NikosDi/subscriptions",
"organizations_url": "https://api.github.com/users/NikosDi/orgs",
"repos_url": "https://api.github.com/users/NikosDi/repos",
"events_url": "https://api.github.com/users/NikosDi/events{/privacy}",
"received_events_url": "https://api.github.com/users/NikosDi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-11-19T07:46:27
| 2024-12-02T15:33:00
| 2024-12-02T15:33:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello.
Using Brave Leo AI with Ollama locally, on the same computer, works like a charm via HTTP://localhost:11434/v1/chat/completions.
But when I try to use Ollama over the LAN from a different PC, Brave Leo demands a secure connection: HTTPS is mandatory, and the Brave setting doesn't accept a plain HTTP address.
I have set `OLLAMA_HOST=0.0.0.0` as an environment variable to enable network access, and Ollama listens on HTTP://myip:11434 locally.
Is it possible for the Ollama serving framework to also listen over HTTPS (SSL)?
Thank you
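In case it helps others reading this: one workaround I am considering (not a built-in Ollama feature) is terminating TLS in a reverse proxy in front of Ollama, e.g. with nginx. A sketch, where the hostname and certificate paths are placeholders:

```
server {
    listen 443 ssl;
    server_name ollama.lan.example;           # placeholder hostname

    ssl_certificate     /etc/ssl/ollama.crt;  # placeholder cert
    ssl_certificate_key /etc/ssl/ollama.key;  # placeholder key

    location / {
        proxy_pass http://127.0.0.1:11434;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_buffering off;   # keep streamed responses flowing
    }
}
```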
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7736/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7736/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/274
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/274/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/274/comments
|
https://api.github.com/repos/ollama/ollama/issues/274/events
|
https://github.com/ollama/ollama/issues/274
| 1,835,902,839
|
I_kwDOJ0Z1Ps5tbad3
| 274
|
Ability to download Vicuna v1.5 Models
|
{
"login": "jlarmstrongiv",
"id": 20903247,
"node_id": "MDQ6VXNlcjIwOTAzMjQ3",
"avatar_url": "https://avatars.githubusercontent.com/u/20903247?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jlarmstrongiv",
"html_url": "https://github.com/jlarmstrongiv",
"followers_url": "https://api.github.com/users/jlarmstrongiv/followers",
"following_url": "https://api.github.com/users/jlarmstrongiv/following{/other_user}",
"gists_url": "https://api.github.com/users/jlarmstrongiv/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jlarmstrongiv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jlarmstrongiv/subscriptions",
"organizations_url": "https://api.github.com/users/jlarmstrongiv/orgs",
"repos_url": "https://api.github.com/users/jlarmstrongiv/repos",
"events_url": "https://api.github.com/users/jlarmstrongiv/events{/privacy}",
"received_events_url": "https://api.github.com/users/jlarmstrongiv/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-08-04T00:42:15
| 2023-08-04T05:13:47
| 2023-08-04T04:48:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
- https://twitter.com/lmsysorg/status/1686794639469371393
- https://github.com/lm-sys/FastChat/tree/main#model-weights
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/274/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/274/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4718
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4718/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4718/comments
|
https://api.github.com/repos/ollama/ollama/issues/4718/events
|
https://github.com/ollama/ollama/issues/4718
| 2,325,130,484
|
I_kwDOJ0Z1Ps6Klqz0
| 4,718
|
Ensuring Flash Attention Support in Official Docker Images by Setting Environment Variables
|
{
"login": "00010110",
"id": 8935044,
"node_id": "MDQ6VXNlcjg5MzUwNDQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/8935044?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/00010110",
"html_url": "https://github.com/00010110",
"followers_url": "https://api.github.com/users/00010110/followers",
"following_url": "https://api.github.com/users/00010110/following{/other_user}",
"gists_url": "https://api.github.com/users/00010110/gists{/gist_id}",
"starred_url": "https://api.github.com/users/00010110/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/00010110/subscriptions",
"organizations_url": "https://api.github.com/users/00010110/orgs",
"repos_url": "https://api.github.com/users/00010110/repos",
"events_url": "https://api.github.com/users/00010110/events{/privacy}",
"received_events_url": "https://api.github.com/users/00010110/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-05-30T08:59:18
| 2024-05-30T16:23:01
| 2024-05-30T16:23:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Does the official Docker image have a runtime environment for Flash Attention? Is it sufficient to directly set the environment variable OLLAMA_FLASH_ATTENTION to configure it?
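For reference, this is how I would pass the variable when starting the official image. A sketch only; whether this alone is sufficient inside the container is exactly my question:

```
# Start the official image with flash attention requested via env var
docker run -d --gpus=all \
  -e OLLAMA_FLASH_ATTENTION=1 \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama
```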
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
v0.1.39
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4718/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4718/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2840
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2840/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2840/comments
|
https://api.github.com/repos/ollama/ollama/issues/2840/events
|
https://github.com/ollama/ollama/issues/2840
| 2,161,765,792
|
I_kwDOJ0Z1Ps6A2e2g
| 2,840
|
Plans for Ollama Go library?
|
{
"login": "eliben",
"id": 1130906,
"node_id": "MDQ6VXNlcjExMzA5MDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/1130906?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eliben",
"html_url": "https://github.com/eliben",
"followers_url": "https://api.github.com/users/eliben/followers",
"following_url": "https://api.github.com/users/eliben/following{/other_user}",
"gists_url": "https://api.github.com/users/eliben/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eliben/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eliben/subscriptions",
"organizations_url": "https://api.github.com/users/eliben/orgs",
"repos_url": "https://api.github.com/users/eliben/repos",
"events_url": "https://api.github.com/users/eliben/events{/privacy}",
"received_events_url": "https://api.github.com/users/eliben/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 9
| 2024-02-29T17:30:56
| 2024-05-17T22:53:04
| 2024-05-17T22:53:04
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The Ollama team has recently released native client libraries for JS (https://github.com/ollama/ollama-js) and Python (https://github.com/ollama/ollama-python).
Are there plans to develop a similar native Go library?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2840/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2840/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5462
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5462/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5462/comments
|
https://api.github.com/repos/ollama/ollama/issues/5462/events
|
https://github.com/ollama/ollama/issues/5462
| 2,388,867,949
|
I_kwDOJ0Z1Ps6OYztt
| 5,462
|
Inaccurate reasoning of multimodal (llava) using HTTP
|
{
"login": "peanutpaste",
"id": 62008312,
"node_id": "MDQ6VXNlcjYyMDA4MzEy",
"avatar_url": "https://avatars.githubusercontent.com/u/62008312?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/peanutpaste",
"html_url": "https://github.com/peanutpaste",
"followers_url": "https://api.github.com/users/peanutpaste/followers",
"following_url": "https://api.github.com/users/peanutpaste/following{/other_user}",
"gists_url": "https://api.github.com/users/peanutpaste/gists{/gist_id}",
"starred_url": "https://api.github.com/users/peanutpaste/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/peanutpaste/subscriptions",
"organizations_url": "https://api.github.com/users/peanutpaste/orgs",
"repos_url": "https://api.github.com/users/peanutpaste/repos",
"events_url": "https://api.github.com/users/peanutpaste/events{/privacy}",
"received_events_url": "https://api.github.com/users/peanutpaste/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-07-03T15:03:11
| 2024-07-08T02:18:20
| 2024-07-04T16:22:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I passed the image path directly on the command line through cmd and asked llava to describe the image, and the description was very accurate. But when I call llava over HTTP (http://127.0.0.1:11434/v1), its description is incorrect. How can I get the same inference quality as the cmd run? Could it be a problem with my base64 conversion?
<img width="183" alt="dog" src="https://github.com/ollama/ollama/assets/62008312/fac417e1-68f2-46cb-b2e8-259a22594b7b">
this is cmd:
<img width="729" alt="aaaa" src="https://github.com/ollama/ollama/assets/62008312/4aff72bc-395e-4256-822d-fd0512811826">
this is http:
<img width="726" alt="Snipaste_2024-07-03_21-07-39" src="https://github.com/ollama/ollama/assets/62008312/baab806c-268d-4d97-b294-0db66127988e">
this is my img2base64 code:
```python
import base64

def img_to_base64(image_path):
    with open(image_path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")
```
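For comparison, here is a minimal stdlib-only sketch of sending that base64 string to Ollama's native `/api/generate` endpoint, which takes raw un-prefixed base64 strings in an `images` array (the helper names below are made up for illustration, and a local server with a llava model is assumed):

```python
import base64
import json
import urllib.request

def build_generate_payload(image_path, prompt, model="llava"):
    # Ollama's native /api/generate expects raw base64 in "images",
    # with no "data:image/...;base64," prefix.
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")
    return {"model": model, "prompt": prompt, "images": [b64], "stream": False}

def describe_image(image_path, prompt="Describe this image."):
    payload = build_generate_payload(image_path, prompt)
    req = urllib.request.Request(
        "http://127.0.0.1:11434/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

If the native endpoint describes the image correctly while the OpenAI-compatible `/v1` route does not, the difference is likely in how the image is packaged for `/v1` (data-URL inside a message) rather than in the base64 conversion itself.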
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5462/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5462/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1077
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1077/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1077/comments
|
https://api.github.com/repos/ollama/ollama/issues/1077/events
|
https://github.com/ollama/ollama/issues/1077
| 1,988,302,452
|
I_kwDOJ0Z1Ps52gxZ0
| 1,077
|
Error when attempting to download the Ollama Mac/Linux version: `Error fetching release`
|
{
"login": "abdulazizalmass",
"id": 14962433,
"node_id": "MDQ6VXNlcjE0OTYyNDMz",
"avatar_url": "https://avatars.githubusercontent.com/u/14962433?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/abdulazizalmass",
"html_url": "https://github.com/abdulazizalmass",
"followers_url": "https://api.github.com/users/abdulazizalmass/followers",
"following_url": "https://api.github.com/users/abdulazizalmass/following{/other_user}",
"gists_url": "https://api.github.com/users/abdulazizalmass/gists{/gist_id}",
"starred_url": "https://api.github.com/users/abdulazizalmass/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/abdulazizalmass/subscriptions",
"organizations_url": "https://api.github.com/users/abdulazizalmass/orgs",
"repos_url": "https://api.github.com/users/abdulazizalmass/repos",
"events_url": "https://api.github.com/users/abdulazizalmass/events{/privacy}",
"received_events_url": "https://api.github.com/users/abdulazizalmass/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2023-11-10T19:58:07
| 2023-11-10T21:02:10
| 2023-11-10T21:02:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Thank you for the amazing project. Kindly close this once the download issue is resolved.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1077/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1077/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7559
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7559/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7559/comments
|
https://api.github.com/repos/ollama/ollama/issues/7559/events
|
https://github.com/ollama/ollama/issues/7559
| 2,641,615,197
|
I_kwDOJ0Z1Ps6dc9ld
| 7,559
|
llama3.2-vision projector_info vision encoder absence
|
{
"login": "iBog",
"id": 168304,
"node_id": "MDQ6VXNlcjE2ODMwNA==",
"avatar_url": "https://avatars.githubusercontent.com/u/168304?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iBog",
"html_url": "https://github.com/iBog",
"followers_url": "https://api.github.com/users/iBog/followers",
"following_url": "https://api.github.com/users/iBog/following{/other_user}",
"gists_url": "https://api.github.com/users/iBog/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iBog/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iBog/subscriptions",
"organizations_url": "https://api.github.com/users/iBog/orgs",
"repos_url": "https://api.github.com/users/iBog/repos",
"events_url": "https://api.github.com/users/iBog/events{/privacy}",
"received_events_url": "https://api.github.com/users/iBog/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 0
| 2024-11-07T16:48:34
| 2024-11-08T22:21:58
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
How can I definitively identify a model as vision-compatible without relying on keywords like "vision," "llava," or "-v" in its name? I used to rely on the `projector_info.has_vision_encoder` parameter from the API request `POST http://localhost:11434/api/show` (with the correct body), but it's absent in llama3.2-vision.
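A possible workaround is to inspect several parts of the `/api/show` response instead of only `projector_info`. The sketch below is a heuristic only: the field names are assumptions drawn from observed responses (llava-style models report a projector, while llama3.2-vision appears to report vision metadata under `model_info` and `details.families`), not a documented contract.

```python
import json
import urllib.request

def looks_vision_capable(show_response: dict) -> bool:
    # Heuristic: field names here are assumptions from observed
    # /api/show responses, not a documented contract.
    proj = show_response.get("projector_info") or {}
    # llava-style models ship a separate projector advertising a vision encoder
    if any("has_vision_encoder" in key and value for key, value in proj.items()):
        return True
    # llama3.2-vision-style models instead report a vision-capable family
    families = (show_response.get("details") or {}).get("families") or []
    if any(f in ("clip", "mllama") for f in families):
        return True
    # ...or vision-related keys inside model_info
    model_info = show_response.get("model_info") or {}
    return any(".vision." in key for key in model_info)

def show_model(name: str) -> dict:
    # POST {"name": ...} to /api/show on a local server
    req = urllib.request.Request(
        "http://localhost:11434/api/show",
        data=json.dumps({"name": name}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Usage would be `looks_vision_capable(show_model("llama3.2-vision"))`; the check degrades gracefully when a given field is missing.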
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.4.0
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7559/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7559/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2731
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2731/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2731/comments
|
https://api.github.com/repos/ollama/ollama/issues/2731/events
|
https://github.com/ollama/ollama/issues/2731
| 2,152,311,654
|
I_kwDOJ0Z1Ps6ASatm
| 2,731
|
[Issue] using gemma model as a chatbot
|
{
"login": "cosmo3769",
"id": 53268607,
"node_id": "MDQ6VXNlcjUzMjY4NjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/53268607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cosmo3769",
"html_url": "https://github.com/cosmo3769",
"followers_url": "https://api.github.com/users/cosmo3769/followers",
"following_url": "https://api.github.com/users/cosmo3769/following{/other_user}",
"gists_url": "https://api.github.com/users/cosmo3769/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cosmo3769/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cosmo3769/subscriptions",
"organizations_url": "https://api.github.com/users/cosmo3769/orgs",
"repos_url": "https://api.github.com/users/cosmo3769/repos",
"events_url": "https://api.github.com/users/cosmo3769/events{/privacy}",
"received_events_url": "https://api.github.com/users/cosmo3769/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 13
| 2024-02-24T13:30:29
| 2024-07-18T22:37:15
| 2024-07-18T22:37:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I was using the `mistral` model for my PDF chatbot. With the arrival of the gemma model, I am trying to use it instead. But it gives me an issue: ***after embedding an external PDF document, when I ask a question, it always responds that it is not able to provide any information about the provided context.***
## Example of an issue:
If I upload the `ssl cookbook` document and ask the question `What is SSL?`,
the chatbot answers: `The context does not provide any information about what SSL is, so I cannot answer this question from the provided context.`
## Tech stack involved
* Using the gemma:2b model. Also tried gemma:7b (won't use it, since it runs slowly locally).
* Using `Xenova/all-MiniLM-L6-v2` embedding model from `@xenova/transformers` package.
* Using Langchain.
* Using Chroma as vectorstore.
## Reproduce
It is a Next.js application using Langchain, Chroma, and transformers.js.
* Clone this repo: `https://github.com/cosmo3769/PDFChatter/tree/gemma-model`
* Follow `README.md` setup guide.
The same code works for `mistral` and `llama2:7b-chat` but fails to work when using `gemma:2b` or `gemma:7b`. Any specific tweaks needed for this?
@jmorganca @mxyng
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2731/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/2731/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7995
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7995/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7995/comments
|
https://api.github.com/repos/ollama/ollama/issues/7995/events
|
https://github.com/ollama/ollama/issues/7995
| 2,724,964,024
|
I_kwDOJ0Z1Ps6ia6a4
| 7,995
|
Not working properly after upgrading to the latest macOS version.
|
{
"login": "liujuncn",
"id": 902005,
"node_id": "MDQ6VXNlcjkwMjAwNQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/902005?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/liujuncn",
"html_url": "https://github.com/liujuncn",
"followers_url": "https://api.github.com/users/liujuncn/followers",
"following_url": "https://api.github.com/users/liujuncn/following{/other_user}",
"gists_url": "https://api.github.com/users/liujuncn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/liujuncn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/liujuncn/subscriptions",
"organizations_url": "https://api.github.com/users/liujuncn/orgs",
"repos_url": "https://api.github.com/users/liujuncn/repos",
"events_url": "https://api.github.com/users/liujuncn/events{/privacy}",
"received_events_url": "https://api.github.com/users/liujuncn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-12-08T05:27:46
| 2024-12-13T20:15:08
| 2024-12-09T06:00:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Ollama does not work properly after upgrading to 0.5.1 on Mac. In the previous version, all downloaded models could perform inference normally. The new version, however, seems unable to utilize macOS unified memory and has a memory leak: memory usage increases continuously.
Hardware: Apple M4 Mac mini
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.5.1
|
{
"login": "liujuncn",
"id": 902005,
"node_id": "MDQ6VXNlcjkwMjAwNQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/902005?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/liujuncn",
"html_url": "https://github.com/liujuncn",
"followers_url": "https://api.github.com/users/liujuncn/followers",
"following_url": "https://api.github.com/users/liujuncn/following{/other_user}",
"gists_url": "https://api.github.com/users/liujuncn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/liujuncn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/liujuncn/subscriptions",
"organizations_url": "https://api.github.com/users/liujuncn/orgs",
"repos_url": "https://api.github.com/users/liujuncn/repos",
"events_url": "https://api.github.com/users/liujuncn/events{/privacy}",
"received_events_url": "https://api.github.com/users/liujuncn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7995/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7995/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7262
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7262/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7262/comments
|
https://api.github.com/repos/ollama/ollama/issues/7262/events
|
https://github.com/ollama/ollama/pull/7262
| 2,598,371,296
|
PR_kwDOJ0Z1Ps5_JQJJ
| 7,262
|
Expose GPU and basic system info
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 2
| 2024-10-18T20:52:57
| 2024-11-19T17:54:38
| null |
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7262",
"html_url": "https://github.com/ollama/ollama/pull/7262",
"diff_url": "https://github.com/ollama/ollama/pull/7262.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7262.patch",
"merged_at": null
}
|
This adds a new API and CLI UX to expose some basic information about the system.
Example CLI output:
```
% ollama info
Client:
Version: 0.0.0
Configuration: /Users/daniel/.ollama
Connection: http://localhost:11434
Server:
Version: 0.3.13-13-g436602d
Models:
Store: /Users/daniel/.ollama/models
Downloaded: 41
Filesystem Used: 475.8 GiB
Running: 0
VRAM Used: 0 B
Compute:
Available Runners: cpu, cpu_avx, cpu_avx2,
cuda_v11, cuda_v12, rocm
System:
CPU Cores: 16
Total Memory: 62.6 GiB
Free Memory: 60.9 GiB
Free Swap: 7.5 GiB
Supported GPUs:
cuda_v12 GPU-19fc4f1e-fbcc-de33-f14a-ae21199420b6:
Name: NVIDIA GeForce RTX 3060
Total Memory: 11.8 GiB
Free Memory: 11.6 GiB
Compute: 8.6
Driver: 12.4
cuda_v12 GPU-f3a94ab8-b31d-61ff-9fbb-ce91ac1cdd95:
Name: NVIDIA GeForce RTX 3060
Total Memory: 11.8 GiB
Free Memory: 11.6 GiB
Compute: 8.6
Driver: 12.4
Unsupported GPUs: N/A
Discovery Errors:
AMD GPUs not detected
```
Fixes #7180
Fixes #3822
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7262/reactions",
"total_count": 5,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 5,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7262/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1903
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1903/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1903/comments
|
https://api.github.com/repos/ollama/ollama/issues/1903/events
|
https://github.com/ollama/ollama/issues/1903
| 2,074,768,994
|
I_kwDOJ0Z1Ps57qnZi
| 1,903
|
Container image not using CUDA despite GPU being visible
|
{
"login": "otavio-silva",
"id": 22914610,
"node_id": "MDQ6VXNlcjIyOTE0NjEw",
"avatar_url": "https://avatars.githubusercontent.com/u/22914610?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/otavio-silva",
"html_url": "https://github.com/otavio-silva",
"followers_url": "https://api.github.com/users/otavio-silva/followers",
"following_url": "https://api.github.com/users/otavio-silva/following{/other_user}",
"gists_url": "https://api.github.com/users/otavio-silva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/otavio-silva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/otavio-silva/subscriptions",
"organizations_url": "https://api.github.com/users/otavio-silva/orgs",
"repos_url": "https://api.github.com/users/otavio-silva/repos",
"events_url": "https://api.github.com/users/otavio-silva/events{/privacy}",
"received_events_url": "https://api.github.com/users/otavio-silva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-01-10T16:47:23
| 2024-01-10T23:21:58
| 2024-01-10T23:21:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
# Description
When running the ollama container, it does not use the GPU even though the GPU is visible inside the same container; `nvidia-smi` runs normally there.
# Steps to reproduce
1. Run the command `podman run --device nvidia.com/gpu=all --security-opt label=disable --detach --volume .ollama:/root/.ollama -p 11434:11434 --name ollama-19 ollama/ollama:0.1.19`
2. Run the command `podman exec -it ollama-19 ollama run llama2-uncensored`
3. Run the command `podman exec -it ollama-19 nvidia-smi` and confirm the output:
```
Wed Jan 10 16:44:39 2024
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 545.36 Driver Version: 546.33 CUDA Version: 12.3 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA GeForce RTX 3080 ... On | 00000000:01:00.0 Off | N/A |
| N/A 53C P0 32W / 165W | 0MiB / 16384MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=======================================================================================|
| No running processes found |
+---------------------------------------------------------------------------------------+
```
# System info
```
Nome do host: GE76RAIDER
Nome do sistema operacional: Microsoft Windows 11 Pro
Versão do sistema operacional: 10.0.22631 N/A compilação 22631
Fabricante do sistema operacional: Microsoft Corporation
Configuração do SO: Estação de trabalho autônoma
Tipo de compilação do sistema operacional: Multiprocessor Free
Proprietário registrado: otavioasilva@hotmail.com
Organização registrada: N/A
Identificação do produto: 00330-80000-00000-AA520
Data da instalação original: 02/08/2023, 14:30:14
Tempo de Inicialização do Sistema: 10/01/2024, 12:32:44
Fabricante do sistema: Micro-Star International Co., Ltd.
Modelo do sistema: Raider GE76 12UHS
Tipo de sistema: x64-based PC
Processador(es): 1 processador(es) instalado(s).
[01]: Intel64 Family 6 Model 154 Stepping 3 GenuineIntel ~2900 Mhz
Versão do BIOS: American Megatrends International, LLC. E17K4IMS.20D, 26/06/2023
Pasta do Windows: C:\WINDOWS
Pasta do sistema: C:\WINDOWS\system32
Inicializar dispositivo: \Device\HarddiskVolume1
Localidade do sistema: pt-br;Português (Brasil)
Localidade de entrada: en-us;Inglês (Estados Unidos)
Fuso horário: (UTC-03:00) Brasília
Memória física total: 65.237 MB
Memória física disponível: 46.571 MB
Memória Virtual: Tamanho Máximo: 74.965 MB
Memória Virtual: Disponível: 50.991 MB
Memória Virtual: Em Uso: 23.974 MB
Local(is) de arquivo de paginação: C:\pagefile.sys
Domínio: WORKGROUP
Servidor de Logon: \\GE76RAIDER
Hotfix(es): 4 hotfix(es) instalado(s).
[01]: KB5033920
[02]: KB5027397
[03]: KB5034123
[04]: KB5032393
Placa(s) de Rede: 3 NIC(s) instalado(s).
[01]: Killer E3100G 2.5 Gigabit Ethernet Controller
Nome da conexão: Ethernet
Status: Mídia desconectada
[02]: Killer(R) Wi-Fi 6E AX1675i 160MHz Wireless Network Adapter (211NGW)
Nome da conexão: Wi-Fi
DHCP ativado: Sim
Servidor DHCP: 192.168.1.1
Endereço(es) IP
[01]: 192.168.1.26
[03]: TAP-Windows Adapter V9
Nome da conexão: TAP-Windows
Status: Mídia desconectada
Requisitos do Hyper-V: Hipervisor detectado. Recursos necessários para o Hyper-V não serão exibidos.
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1903/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1903/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2983
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2983/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2983/comments
|
https://api.github.com/repos/ollama/ollama/issues/2983/events
|
https://github.com/ollama/ollama/pull/2983
| 2,174,194,890
|
PR_kwDOJ0Z1Ps5o--OM
| 2,983
|
Get term size from stderr and /dev/tty to avoid size-detection errors caused by redirection.
|
{
"login": "powerfooI",
"id": 26134111,
"node_id": "MDQ6VXNlcjI2MTM0MTEx",
"avatar_url": "https://avatars.githubusercontent.com/u/26134111?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/powerfooI",
"html_url": "https://github.com/powerfooI",
"followers_url": "https://api.github.com/users/powerfooI/followers",
"following_url": "https://api.github.com/users/powerfooI/following{/other_user}",
"gists_url": "https://api.github.com/users/powerfooI/gists{/gist_id}",
"starred_url": "https://api.github.com/users/powerfooI/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/powerfooI/subscriptions",
"organizations_url": "https://api.github.com/users/powerfooI/orgs",
"repos_url": "https://api.github.com/users/powerfooI/repos",
"events_url": "https://api.github.com/users/powerfooI/events{/privacy}",
"received_events_url": "https://api.github.com/users/powerfooI/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-03-07T15:57:52
| 2024-03-10T15:54:13
| 2024-03-10T15:54:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2983",
"html_url": "https://github.com/ollama/ollama/pull/2983",
"diff_url": "https://github.com/ollama/ollama/pull/2983.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2983.patch",
"merged_at": null
}
|
As issue #2970 said, `ollama run phi | tee llms/out.txt` leads to an error. When a pipe operator follows the `ollama` command, `os.Stdout` points to a pipe instead of the terminal, so `term.GetSize` fails.
Now the code gets the terminal size from stderr first, which allows `ollama run phi | tee llms/out.txt` to execute normally.
If the user enters `ollama run phi 2>&1 | tee llms/out.txt`, the code falls back to `/dev/tty` instead.
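A minimal, Linux-oriented sketch of that fallback order (a standalone illustration using a raw `ioctl`, not this PR's actual diff; function names here are illustrative):

```go
package main

import (
	"fmt"
	"os"
	"syscall"
	"unsafe"
)

// winsize mirrors the kernel's struct winsize returned by TIOCGWINSZ.
type winsize struct {
	Row, Col, Xpixel, Ypixel uint16
}

// sizeOf queries the terminal size of a single file descriptor.
func sizeOf(fd uintptr) (w, h int, ok bool) {
	var ws winsize
	_, _, errno := syscall.Syscall(syscall.SYS_IOCTL, fd,
		uintptr(syscall.TIOCGWINSZ), uintptr(unsafe.Pointer(&ws)))
	if errno != 0 || ws.Col == 0 {
		return 0, 0, false
	}
	return int(ws.Col), int(ws.Row), true
}

// terminalSize tries stderr, then stdout, then /dev/tty, so that
// redirecting stdout (`ollama run phi | tee out.txt`) or both streams
// (`... 2>&1 | tee out.txt`) still finds a real terminal if one exists.
func terminalSize() (int, int) {
	if w, h, ok := sizeOf(os.Stderr.Fd()); ok {
		return w, h
	}
	if w, h, ok := sizeOf(os.Stdout.Fd()); ok {
		return w, h
	}
	if tty, err := os.Open("/dev/tty"); err == nil {
		defer tty.Close()
		if w, h, ok := sizeOf(tty.Fd()); ok {
			return w, h
		}
	}
	return 80, 24 // conservative default when no terminal is attached
}

func main() {
	w, h := terminalSize()
	fmt.Printf("%d %d\n", w, h)
}
```

With every candidate redirected to a pipe and no controlling terminal, the sketch degrades to the 80x24 default instead of erroring out.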
|
{
"login": "powerfooI",
"id": 26134111,
"node_id": "MDQ6VXNlcjI2MTM0MTEx",
"avatar_url": "https://avatars.githubusercontent.com/u/26134111?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/powerfooI",
"html_url": "https://github.com/powerfooI",
"followers_url": "https://api.github.com/users/powerfooI/followers",
"following_url": "https://api.github.com/users/powerfooI/following{/other_user}",
"gists_url": "https://api.github.com/users/powerfooI/gists{/gist_id}",
"starred_url": "https://api.github.com/users/powerfooI/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/powerfooI/subscriptions",
"organizations_url": "https://api.github.com/users/powerfooI/orgs",
"repos_url": "https://api.github.com/users/powerfooI/repos",
"events_url": "https://api.github.com/users/powerfooI/events{/privacy}",
"received_events_url": "https://api.github.com/users/powerfooI/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2983/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2983/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/176
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/176/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/176/comments
|
https://api.github.com/repos/ollama/ollama/issues/176/events
|
https://github.com/ollama/ollama/issues/176
| 1,816,779,137
|
I_kwDOJ0Z1Ps5sSdmB
| 176
|
Javascript package
|
{
"login": "mekb-turtle",
"id": 48314599,
"node_id": "MDQ6VXNlcjQ4MzE0NTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/48314599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mekb-turtle",
"html_url": "https://github.com/mekb-turtle",
"followers_url": "https://api.github.com/users/mekb-turtle/followers",
"following_url": "https://api.github.com/users/mekb-turtle/following{/other_user}",
"gists_url": "https://api.github.com/users/mekb-turtle/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mekb-turtle/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mekb-turtle/subscriptions",
"organizations_url": "https://api.github.com/users/mekb-turtle/orgs",
"repos_url": "https://api.github.com/users/mekb-turtle/repos",
"events_url": "https://api.github.com/users/mekb-turtle/events{/privacy}",
"received_events_url": "https://api.github.com/users/mekb-turtle/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2023-07-22T14:11:32
| 2024-02-20T00:48:47
| 2024-02-20T00:48:47
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hey! Does a Node.js package exist to interact with models through ollama? I don't particularly want to run external commands to interact with it. Thanks
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/176/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/176/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3266
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3266/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3266/comments
|
https://api.github.com/repos/ollama/ollama/issues/3266/events
|
https://github.com/ollama/ollama/issues/3266
| 2,197,049,612
|
I_kwDOJ0Z1Ps6C9FEM
| 3,266
|
When will the openai embedding format be compatible?
|
{
"login": "yuanjie-ai",
"id": 20265321,
"node_id": "MDQ6VXNlcjIwMjY1MzIx",
"avatar_url": "https://avatars.githubusercontent.com/u/20265321?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuanjie-ai",
"html_url": "https://github.com/yuanjie-ai",
"followers_url": "https://api.github.com/users/yuanjie-ai/followers",
"following_url": "https://api.github.com/users/yuanjie-ai/following{/other_user}",
"gists_url": "https://api.github.com/users/yuanjie-ai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuanjie-ai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuanjie-ai/subscriptions",
"organizations_url": "https://api.github.com/users/yuanjie-ai/orgs",
"repos_url": "https://api.github.com/users/yuanjie-ai/repos",
"events_url": "https://api.github.com/users/yuanjie-ai/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuanjie-ai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-03-20T09:27:29
| 2024-07-31T18:02:24
| 2024-07-31T18:02:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
When will the OpenAI embeddings format be supported?
### How should we solve this?
When will the OpenAI embeddings format be supported?
### What is the impact of not solving this?
When will the OpenAI embeddings format be supported?
### Anything else?
When will the OpenAI embeddings format be supported?
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3266/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3266/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7169
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7169/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7169/comments
|
https://api.github.com/repos/ollama/ollama/issues/7169/events
|
https://github.com/ollama/ollama/pull/7169
| 2,580,012,198
|
PR_kwDOJ0Z1Ps5-R2sz
| 7,169
|
CI testing...
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-10T23:00:47
| 2024-10-11T00:00:26
| 2024-10-11T00:00:22
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7169",
"html_url": "https://github.com/ollama/ollama/pull/7169",
"diff_url": "https://github.com/ollama/ollama/pull/7169.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7169.patch",
"merged_at": null
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7169/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7169/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4998
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4998/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4998/comments
|
https://api.github.com/repos/ollama/ollama/issues/4998/events
|
https://github.com/ollama/ollama/issues/4998
| 2,348,021,924
|
I_kwDOJ0Z1Ps6L8_ik
| 4,998
|
Error "transferring model data " when creating a model
|
{
"login": "tigerkin89",
"id": 52234230,
"node_id": "MDQ6VXNlcjUyMjM0MjMw",
"avatar_url": "https://avatars.githubusercontent.com/u/52234230?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tigerkin89",
"html_url": "https://github.com/tigerkin89",
"followers_url": "https://api.github.com/users/tigerkin89/followers",
"following_url": "https://api.github.com/users/tigerkin89/following{/other_user}",
"gists_url": "https://api.github.com/users/tigerkin89/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tigerkin89/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tigerkin89/subscriptions",
"organizations_url": "https://api.github.com/users/tigerkin89/orgs",
"repos_url": "https://api.github.com/users/tigerkin89/repos",
"events_url": "https://api.github.com/users/tigerkin89/events{/privacy}",
"received_events_url": "https://api.github.com/users/tigerkin89/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 10
| 2024-06-12T07:24:20
| 2024-09-19T01:23:36
| 2024-07-01T23:34:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I create a model, an error occurs: "Error: invalid model reference: qwen2-7b-instruct-q8_0.gguf".
```
$ ollama create qwen2.7b -f ./Modelfile
transferring model data
Error: invalid model reference: qwen2-7b-instruct-q8_0.gguf

$ more Modelfile
FROM qwen2-7b-instruct-q8_0.gguf
TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
{{ .Response }}<|im_end|>
"""
PARAMETER num_ctx 64000
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"

$ ls
Modelfile  qwen2-7b-instruct-q8_0.gguf
```
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.43
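For anyone hitting the same message, a likely workaround (an assumption, based on the parser treating a bare `FROM` argument as a model reference rather than a local file) is to spell the GGUF filename as an explicit relative path in the Modelfile:

```
FROM ./qwen2-7b-instruct-q8_0.gguf
```

With the `./` prefix the argument is resolved as a file next to the Modelfile instead of being validated as a model name.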
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4998/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4998/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5916
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5916/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5916/comments
|
https://api.github.com/repos/ollama/ollama/issues/5916/events
|
https://github.com/ollama/ollama/issues/5916
| 2,427,751,063
|
I_kwDOJ0Z1Ps6QtIqX
| 5,916
|
Simplify model loading via model numbering
|
{
"login": "jpummill",
"id": 9271237,
"node_id": "MDQ6VXNlcjkyNzEyMzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/9271237?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jpummill",
"html_url": "https://github.com/jpummill",
"followers_url": "https://api.github.com/users/jpummill/followers",
"following_url": "https://api.github.com/users/jpummill/following{/other_user}",
"gists_url": "https://api.github.com/users/jpummill/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jpummill/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jpummill/subscriptions",
"organizations_url": "https://api.github.com/users/jpummill/orgs",
"repos_url": "https://api.github.com/users/jpummill/repos",
"events_url": "https://api.github.com/users/jpummill/events{/privacy}",
"received_events_url": "https://api.github.com/users/jpummill/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-07-24T14:30:07
| 2024-09-04T03:37:14
| 2024-09-04T03:37:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Model names can be very long and somewhat cryptic.
For example, I may have the following models on my system for testing:
mistral-nemo:12b-instruct-2407-q3_K_S
mistral-nemo:12b-instruct-2407-q4_K_S
mistral-nemo:12b-instruct-2407-q5_K_M
I think it would be helpful to see a numbered list of model names when using the load command from within the Ollama interface. For example, if I type "/load mi", instead of seeing "Error: model 'mi' not found", I would instead see the following:
1: mistral-nemo:12b-instruct-2407-q3_K_S
2: mistral-nemo:12b-instruct-2407-q4_K_S
3: mistral-nemo:12b-instruct-2407-q5_K_M
At this point the "/load" prompt would be waiting for me to either type out the model name or just select the number of the model from the list, allowing us to avoid typing out long complex names with lots of hyphens (-) and underscores (_).
You could still have the existing functionality with the error message if no downloaded models match the search filter.
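The selection logic described above could be sketched like this (a standalone illustration; function names are hypothetical, not Ollama's actual CLI code):

```go
package main

import (
	"fmt"
	"strings"
)

// matchModels returns installed model names that start with the given
// prefix, in the order they were listed.
func matchModels(installed []string, prefix string) []string {
	var out []string
	for _, name := range installed {
		if strings.HasPrefix(name, prefix) {
			out = append(out, name)
		}
	}
	return out
}

// resolve turns a /load argument into a model name: a unique prefix
// match loads directly; several matches are returned as a menu for the
// user to pick by number; none reproduces today's "not found" error.
func resolve(installed []string, arg string) (string, []string) {
	matches := matchModels(installed, arg)
	switch len(matches) {
	case 0:
		return "", nil // caller reports "model 'arg' not found"
	case 1:
		return matches[0], nil
	default:
		return "", matches // caller shows the numbered list
	}
}

func main() {
	installed := []string{
		"mistral-nemo:12b-instruct-2407-q3_K_S",
		"mistral-nemo:12b-instruct-2407-q4_K_S",
		"mistral-nemo:12b-instruct-2407-q5_K_M",
	}
	if _, menu := resolve(installed, "mi"); menu != nil {
		for i, m := range menu {
			fmt.Printf("%d: %s\n", i+1, m)
		}
	}
}
```

Typing `/load mi` would then print the three numbered candidates, and a follow-up numeric reply selects one without retyping the full name.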
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5916/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5916/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2434
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2434/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2434/comments
|
https://api.github.com/repos/ollama/ollama/issues/2434/events
|
https://github.com/ollama/ollama/issues/2434
| 2,127,951,783
|
I_kwDOJ0Z1Ps5-1fen
| 2,434
|
Linux Install Instructions
|
{
"login": "Deetss",
"id": 1056037,
"node_id": "MDQ6VXNlcjEwNTYwMzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/1056037?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Deetss",
"html_url": "https://github.com/Deetss",
"followers_url": "https://api.github.com/users/Deetss/followers",
"following_url": "https://api.github.com/users/Deetss/following{/other_user}",
"gists_url": "https://api.github.com/users/Deetss/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Deetss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Deetss/subscriptions",
"organizations_url": "https://api.github.com/users/Deetss/orgs",
"repos_url": "https://api.github.com/users/Deetss/repos",
"events_url": "https://api.github.com/users/Deetss/events{/privacy}",
"received_events_url": "https://api.github.com/users/Deetss/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-02-09T22:54:08
| 2024-02-10T18:30:36
| 2024-02-09T23:01:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The current install instructions show a one-liner that does not work.
`curl https://ollama.ai/install.sh | sh`
I had to change the command to this, for it to work.
`curl https://ollama.com/install.sh | sh`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2434/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2434/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/267
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/267/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/267/comments
|
https://api.github.com/repos/ollama/ollama/issues/267/events
|
https://github.com/ollama/ollama/issues/267
| 1,835,232,142
|
I_kwDOJ0Z1Ps5tY2uO
| 267
|
Allow overriding the prompt template at generate time
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-08-03T14:56:29
| 2023-08-08T05:02:43
| 2023-08-08T05:02:43
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/267/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/267/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1322
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1322/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1322/comments
|
https://api.github.com/repos/ollama/ollama/issues/1322/events
|
https://github.com/ollama/ollama/issues/1322
| 2,017,333,231
|
I_kwDOJ0Z1Ps54Pg_v
| 1,322
|
:grey_question: Run a given LLM/model within docker/podman/cloud run :baby:
|
{
"login": "adriens",
"id": 5235127,
"node_id": "MDQ6VXNlcjUyMzUxMjc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5235127?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adriens",
"html_url": "https://github.com/adriens",
"followers_url": "https://api.github.com/users/adriens/followers",
"following_url": "https://api.github.com/users/adriens/following{/other_user}",
"gists_url": "https://api.github.com/users/adriens/gists{/gist_id}",
"starred_url": "https://api.github.com/users/adriens/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adriens/subscriptions",
"organizations_url": "https://api.github.com/users/adriens/orgs",
"repos_url": "https://api.github.com/users/adriens/repos",
"events_url": "https://api.github.com/users/adriens/events{/privacy}",
"received_events_url": "https://api.github.com/users/adriens/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 10
| 2023-11-29T20:04:21
| 2024-05-09T23:25:17
| 2024-05-09T22:52:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
# :grey_question: About
I intend to deploy a given LLM (out of the box or customized) thanks to `ollama` (let's say `https://ollama.ai/library/mistral`), then call its APIs from various clients.
Here would be an example of my scenario:
1. **Build/customize the image** that runs mistral (for example) within ollama
2. **Push the image to any private docker registry** (let's say `ghcr` or dockerhub)
3. **Pull & run the image** (local [podman](https://podman.io/) on a workstation, [GCP Google Cloud Run](https://cloud.google.com/run)) to boot the API
4. **(Optional) : publish the resulting API through on any API Management** ([Google APIGEE](https://cloud.google.com/apigee/docs/api-platform/get-started/what-apigee) in my case, but could also land on [Tyk](https://tyk.io/), [Kong](https://konghq.com/products/kong-gateway),...)
5. **Call it from any client (Java/Python, js)** so we can build integrations and release next-gen ([llama_index](https://www.llamaindex.ai/) or [langchain](https://www.langchain.com/) based) solutions (under the hood)
# :dart: The question
:point_right: The question may be answered by **providing (or pointing to) a howto**, as so far I did not find a step-by-step (for-dummies-style) tutorial.
:pray: Thank you in advance for any help or feedback.
# :tickets: Related contents
- https://github.com/jmorganca/ollama/issues/1318
- [🦙 Mistral 7B & Ollama: LLMs 💏 Apache 2.0 Open Source on small hardwares](https://dev.to/adriens/mistral-7b-ollama-llms-apache-20-open-source-on-small-hardwares-3ln3)
- https://github.com/jmorganca/ollama/issues/546
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1322/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/1322/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/991
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/991/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/991/comments
|
https://api.github.com/repos/ollama/ollama/issues/991/events
|
https://github.com/ollama/ollama/pull/991
| 1,976,983,241
|
PR_kwDOJ0Z1Ps5elAyW
| 991
|
Add `/chat` API
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 11
| 2023-11-03T22:01:57
| 2023-12-04T23:01:07
| 2023-12-04T23:01:06
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/991",
"html_url": "https://github.com/ollama/ollama/pull/991",
"diff_url": "https://github.com/ollama/ollama/pull/991.diff",
"patch_url": "https://github.com/ollama/ollama/pull/991.patch",
"merged_at": "2023-12-04T23:01:06"
}
|
- add a new `/api/chat` API endpoint that takes an array of `message` objects. This endpoint is an alternative to `/api/generate`.
- deprecate generation context and template, but continue to support them
- rebuild chat content from messages
This change adds a `/api/chat` endpoint to the API, which takes an array of messages. This makes modifying and tracking the conversation history on the fly much simpler. It is an alternative to prompt/response.
`context` will continue to work as expected for now, but at some point in the future we may want to replace it completely with `/api/chat`.
```
curl -X POST http://localhost:11434/api/generate -d '{
"model": "mistral",
"prompt": "hello, how are you?"
}'
OR
### Basic generate request with messages
curl -X POST http://localhost:11434/api/chat -d '{
"model": "mistral",
"messages": [
{
"role": "user",
"content": "why is the sky blue?"
}
]
}'
```
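Because `/api/chat` takes the full message array, a client can keep the history itself and simply append turns. A minimal Python sketch of that pattern follows; the helper names are mine, not part of the API, and the `"message"` key reflects the non-streaming response shape:

```python
import json
import urllib.request


def build_chat_payload(history: list, model: str = "mistral") -> dict:
    """Build the JSON body for /api/chat from an existing message history."""
    return {"model": model, "messages": history, "stream": False}


def chat(history: list, user_content: str,
         url: str = "http://localhost:11434/api/chat") -> list:
    """Append a user turn, call /api/chat, and return the extended history."""
    history = history + [{"role": "user", "content": user_content}]
    req = urllib.request.Request(
        url,
        data=json.dumps(build_chat_payload(history)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # non-streaming responses carry the assistant reply under "message"
        return history + [json.load(resp)["message"]]
```

Each call returns the history including the assistant reply, so multi-turn usage is just `h = chat(h, "why is the sky blue?")` repeated — no manual context threading as with `/api/generate`.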
resolves #981
resolves #1203
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/991/reactions",
"total_count": 4,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 4,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/991/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4809
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4809/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4809/comments
|
https://api.github.com/repos/ollama/ollama/issues/4809/events
|
https://github.com/ollama/ollama/issues/4809
| 2,333,107,602
|
I_kwDOJ0Z1Ps6LEGWS
| 4,809
|
Add information on RAM and VRAM requirements of model quantization in library
|
{
"login": "geroldmeisinger",
"id": 112266044,
"node_id": "U_kgDOBrELPA",
"avatar_url": "https://avatars.githubusercontent.com/u/112266044?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/geroldmeisinger",
"html_url": "https://github.com/geroldmeisinger",
"followers_url": "https://api.github.com/users/geroldmeisinger/followers",
"following_url": "https://api.github.com/users/geroldmeisinger/following{/other_user}",
"gists_url": "https://api.github.com/users/geroldmeisinger/gists{/gist_id}",
"starred_url": "https://api.github.com/users/geroldmeisinger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/geroldmeisinger/subscriptions",
"organizations_url": "https://api.github.com/users/geroldmeisinger/orgs",
"repos_url": "https://api.github.com/users/geroldmeisinger/repos",
"events_url": "https://api.github.com/users/geroldmeisinger/events{/privacy}",
"received_events_url": "https://api.github.com/users/geroldmeisinger/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 2
| 2024-06-04T10:03:43
| 2024-07-11T06:02:14
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
This would help to decide which model can run
a) at all
b) at highest quality, although slowly
c) at highest quality at GPU speed
before downloading huge files.
I think one of the great pluses of Ollama is the curated model library. Adding memory information would make it even more convenient.
_Thank you for this great tool!_
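As a rough rule of thumb (my own assumption, not an official formula), the quantized weights alone need about parameter count × bits per weight ÷ 8 bytes; KV cache, context length, and runtime overhead come on top:

```python
def estimate_weights_gb(n_params_billion: float, bits_per_weight: float) -> float:
    """Rough size of the quantized weights in GB (1 GB = 1e9 bytes).

    Ignores KV cache, context length, and runtime overhead, so treat the
    result as a lower bound on required RAM/VRAM.
    """
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9


# e.g. a 7B model at ~4.5 effective bits/weight (Q4 incl. scales) -> ~3.9 GB
```

Surfacing even such a coarse estimate per quantization in the library would already answer questions a) through c) before the download starts.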
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4809/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4809/timeline
| null | null | false
|