| column | dtype | stats |
|---|---|---|
| url | string | lengths 51–54 |
| repository_url | string | 1 class (1 value) |
| labels_url | string | lengths 65–68 |
| comments_url | string | lengths 60–63 |
| events_url | string | lengths 58–61 |
| html_url | string | lengths 39–44 |
| id | int64 | 1.78B–2.82B |
| node_id | string | lengths 18–19 |
| number | int64 | 1–8.69k |
| title | string | lengths 1–382 |
| user | dict | |
| labels | list | lengths 0–5 |
| state | string | 2 classes |
| locked | bool | 1 class |
| assignee | dict | |
| assignees | list | lengths 0–2 |
| milestone | null | |
| comments | int64 | 0–323 |
| created_at | timestamp[s] | |
| updated_at | timestamp[s] | |
| closed_at | timestamp[s] | |
| author_association | string | 4 classes |
| sub_issues_summary | dict | |
| active_lock_reason | null | |
| draft | bool | 2 classes |
| pull_request | dict | |
| body | string | lengths 2–118k, nullable (⌀) |
| closed_by | dict | |
| reactions | dict | |
| timeline_url | string | lengths 60–63 |
| performed_via_github_app | null | |
| state_reason | string | 4 classes |
| is_pull_request | bool | 2 classes |
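Once rows with this schema are parsed into dicts, the `is_pull_request` and `state` columns make slicing straightforward. A minimal sketch (the two sample records are abridged from the rows below; everything else is illustrative):

```python
# Sketch: filtering parsed dataset rows by the schema's boolean column.
# Sample records abridged from the rows shown below.
rows = [
    {"number": 6071, "title": "llama3.1:70b <|eom_id|> missing as stop in template",
     "state": "closed", "is_pull_request": False},
    {"number": 3706, "title": "account for all non-repeating layers",
     "state": "closed", "is_pull_request": True},
]

def issues_only(records):
    """Keep rows that are plain issues, not pull requests."""
    return [r for r in records if not r["is_pull_request"]]

print([r["number"] for r in issues_only(rows)])  # [6071]
```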
https://api.github.com/repos/ollama/ollama/issues/6071
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6071/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6071/comments
|
https://api.github.com/repos/ollama/ollama/issues/6071/events
|
https://github.com/ollama/ollama/issues/6071
| 2,437,582,747
|
I_kwDOJ0Z1Ps6RSo-b
| 6,071
|
llama3.1:70b <|eom_id|> missing as stop in template
|
{
"login": "SinanAkkoyun",
"id": 43215895,
"node_id": "MDQ6VXNlcjQzMjE1ODk1",
"avatar_url": "https://avatars.githubusercontent.com/u/43215895?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SinanAkkoyun",
"html_url": "https://github.com/SinanAkkoyun",
"followers_url": "https://api.github.com/users/SinanAkkoyun/followers",
"following_url": "https://api.github.com/users/SinanAkkoyun/following{/other_user}",
"gists_url": "https://api.github.com/users/SinanAkkoyun/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SinanAkkoyun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SinanAkkoyun/subscriptions",
"organizations_url": "https://api.github.com/users/SinanAkkoyun/orgs",
"repos_url": "https://api.github.com/users/SinanAkkoyun/repos",
"events_url": "https://api.github.com/users/SinanAkkoyun/events{/privacy}",
"received_events_url": "https://api.github.com/users/SinanAkkoyun/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-30T11:17:27
| 2024-09-04T00:53:12
| 2024-09-04T00:53:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
llama3.1:70b outputs `<|eom_id|>` as a literal string rather than as a special token. To test this, make a function call to the model without a 'tool' response message multiple times. 8B hallucinates; 70B wants to refuse and returns an `<|eom_id|>`, but ollama doesn't recognize it as a stop token.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
latest
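Until the template declares it, `<|eom_id|>` can be supplied as an explicit stop sequence through the API's `options` field. A minimal workaround sketch (payload construction only; the POST to the local Ollama server is left to the caller):

```python
import json

# Workaround sketch: pass <|eom_id|> as an explicit stop sequence in the
# request options until the model template declares it. This only builds
# the request body; sending it to http://localhost:11434/api/chat is
# left to the caller.
payload = {
    "model": "llama3.1:70b",
    "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
    "options": {"stop": ["<|eom_id|>"]},
}
body = json.dumps(payload)
```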
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6071/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6071/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3706
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3706/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3706/comments
|
https://api.github.com/repos/ollama/ollama/issues/3706/events
|
https://github.com/ollama/ollama/pull/3706
| 2,248,949,001
|
PR_kwDOJ0Z1Ps5s9VVW
| 3,706
|
account for all non-repeating layers
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-17T18:21:39
| 2024-04-17T18:58:21
| 2024-04-17T18:58:21
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3706",
"html_url": "https://github.com/ollama/ollama/pull/3706",
"diff_url": "https://github.com/ollama/ollama/pull/3706.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3706.patch",
"merged_at": "2024-04-17T18:58:21"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3706/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3706/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2116
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2116/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2116/comments
|
https://api.github.com/repos/ollama/ollama/issues/2116/events
|
https://github.com/ollama/ollama/pull/2116
| 2,092,261,248
|
PR_kwDOJ0Z1Ps5kosQT
| 2,116
|
Add support for CUDA 5.0 cards
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2024-01-20T22:28:09
| 2024-01-31T07:20:36
| 2024-01-27T18:28:38
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2116",
"html_url": "https://github.com/ollama/ollama/pull/2116",
"diff_url": "https://github.com/ollama/ollama/pull/2116.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2116.patch",
"merged_at": "2024-01-27T18:28:38"
}
|
Building on #2112, this expands support back to 5.0 cards, and also adds a few newer targets which should theoretically help performance on more modern cards. The resulting binary grows a little in size, but not significantly:
* 0.1.21 => 263M
* #2112 => 264M
* This PR: => 266M
Fixes #1865
I'll keep this as a draft until we can run more performance testing on modern cards to ensure there is no significant regression.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2116/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2116/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5059
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5059/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5059/comments
|
https://api.github.com/repos/ollama/ollama/issues/5059/events
|
https://github.com/ollama/ollama/pull/5059
| 2,354,720,415
|
PR_kwDOJ0Z1Ps5yjgmM
| 5,059
|
Add Vulkan support to ollama
|
{
"login": "pufferffish",
"id": 74378430,
"node_id": "MDQ6VXNlcjc0Mzc4NDMw",
"avatar_url": "https://avatars.githubusercontent.com/u/74378430?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pufferffish",
"html_url": "https://github.com/pufferffish",
"followers_url": "https://api.github.com/users/pufferffish/followers",
"following_url": "https://api.github.com/users/pufferffish/following{/other_user}",
"gists_url": "https://api.github.com/users/pufferffish/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pufferffish/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pufferffish/subscriptions",
"organizations_url": "https://api.github.com/users/pufferffish/orgs",
"repos_url": "https://api.github.com/users/pufferffish/repos",
"events_url": "https://api.github.com/users/pufferffish/events{/privacy}",
"received_events_url": "https://api.github.com/users/pufferffish/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 121
| 2024-06-15T10:13:25
| 2025-01-29T09:29:17
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5059",
"html_url": "https://github.com/ollama/ollama/pull/5059",
"diff_url": "https://github.com/ollama/ollama/pull/5059.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5059.patch",
"merged_at": null
}
|
# Edit: (2025/01/19)
It's been around 7 months and the ollama devs don't seem interested in merging this PR. I'll maintain this fork as a separate project from now on. If you have any issues, please raise them in the fork's repo so I can keep track of them.
#
This PR adds Vulkan support to ollama with a proper memory monitoring implementation. It closes #2033 and replaces #2578, which does not implement proper memory monitoring.
Note that this implementation does not support GPUs without `VkPhysicalDeviceMemoryBudgetPropertiesEXT` support. This shouldn't be a problem, since on Linux the mesa driver supports it for all Intel devices afaik.
The `CAP_PERFMON` capability is also needed for memory monitoring. When running ollama as a systemd service, this can be granted by adding `AmbientCapabilities=CAP_PERFMON` to the service; alternatively, run ollama as root.
Vulkan devices that are CPUs under the hood (e.g. llvmpipe) are also not supported. This is done on purpose to avoid accidentally using CPUs for accelerated inference. Let me know if you think this behavior should be changed.
I haven't tested this on Windows, nor have I implemented the logic for building ollama with Vulkan support there, because I don't use Windows. If someone can help me with this, that would be great.
I've tested this on my machine with an Intel Arc A770:
```
System:
Host: rofl Kernel: 6.8.11 arch: x86_64 bits: 64 compiler: gcc v: 13.2.0
Console: pty pts/2 Distro: NixOS 24.05 (Uakari)
CPU:
Info: 8-core (4-mt/4-st) model: Intel 0000 bits: 64 type: MST AMCP arch: Raptor Lake rev: 2
cache: L1: 704 KiB L2: 7 MiB L3: 12 MiB
Speed (MHz): avg: 473 high: 1100 min/max: 400/4500:3400 cores: 1: 400 2: 400 3: 400 4: 576
5: 400 6: 400 7: 400 8: 400 9: 400 10: 400 11: 1100 12: 400 bogomips: 59904
Flags: avx avx2 ht lm nx pae sse sse2 sse3 sse4_1 sse4_2 ssse3
Graphics:
Device-1: Intel DG2 [Arc A770] vendor: Acer Incorporated ALI driver: i915 v: kernel
arch: Gen-12.7 pcie: speed: 2.5 GT/s lanes: 1 ports: active: DP-1 empty: DP-2, DP-3, DP-4,
HDMI-A-1, HDMI-A-2, HDMI-A-3 bus-ID: 03:00.0 chip-ID: 8086:56a0
Display: server: No display server data found. Headless machine? tty: 98x63
Monitor-1: DP-1 model: Daewoo HDMI res: 1024x600 dpi: 55 diag: 537mm (21.1")
API: Vulkan v: 1.3.283 surfaces: N/A device: 0 type: discrete-gpu driver: N/A
device-ID: 8086:56a0 device: 1 type: cpu driver: N/A device-ID: 10005:0000
```
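The `CAP_PERFMON` requirement above maps to a small systemd drop-in. A sketch, assuming the service is named `ollama.service` (the unit name and drop-in path are assumptions, not part of this PR):

```ini
; /etc/systemd/system/ollama.service.d/override.conf
; Grants the memory-monitoring capability mentioned above.
[Service]
AmbientCapabilities=CAP_PERFMON
```

After writing the drop-in, reload with `systemctl daemon-reload` and restart the service.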
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5059/reactions",
"total_count": 140,
"+1": 53,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 59,
"rocket": 28,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5059/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/331
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/331/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/331/comments
|
https://api.github.com/repos/ollama/ollama/issues/331/events
|
https://github.com/ollama/ollama/issues/331
| 1,847,107,551
|
I_kwDOJ0Z1Ps5uGJ_f
| 331
|
do not regenerate embeddings layers when not changed
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2023-08-11T16:25:23
| 2023-08-15T19:10:25
| 2023-08-15T19:10:25
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/331/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/331/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1147
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1147/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1147/comments
|
https://api.github.com/repos/ollama/ollama/issues/1147/events
|
https://github.com/ollama/ollama/pull/1147
| 1,995,855,604
|
PR_kwDOJ0Z1Ps5fk54z
| 1,147
|
add faq for proxies
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-11-16T01:16:03
| 2024-02-12T13:58:14
| 2023-11-16T16:43:37
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1147",
"html_url": "https://github.com/ollama/ollama/pull/1147",
"diff_url": "https://github.com/ollama/ollama/pull/1147.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1147.patch",
"merged_at": "2023-11-16T16:43:37"
}
| null |
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1147/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1147/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6132
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6132/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6132/comments
|
https://api.github.com/repos/ollama/ollama/issues/6132/events
|
https://github.com/ollama/ollama/pull/6132
| 2,443,639,168
|
PR_kwDOJ0Z1Ps53L2hI
| 6,132
|
Fix OpenAI models/{model} URL
|
{
"login": "thinkverse",
"id": 2221746,
"node_id": "MDQ6VXNlcjIyMjE3NDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/2221746?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/thinkverse",
"html_url": "https://github.com/thinkverse",
"followers_url": "https://api.github.com/users/thinkverse/followers",
"following_url": "https://api.github.com/users/thinkverse/following{/other_user}",
"gists_url": "https://api.github.com/users/thinkverse/gists{/gist_id}",
"starred_url": "https://api.github.com/users/thinkverse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thinkverse/subscriptions",
"organizations_url": "https://api.github.com/users/thinkverse/orgs",
"repos_url": "https://api.github.com/users/thinkverse/repos",
"events_url": "https://api.github.com/users/thinkverse/events{/privacy}",
"received_events_url": "https://api.github.com/users/thinkverse/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-08-01T23:05:20
| 2024-08-01T23:31:48
| 2024-08-01T23:31:47
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6132",
"html_url": "https://github.com/ollama/ollama/pull/6132",
"diff_url": "https://github.com/ollama/ollama/pull/6132.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6132.patch",
"merged_at": "2024-08-01T23:31:47"
}
|
I added a note about this after the PR was merged, @royjhan.
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6132/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6132/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4220
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4220/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4220/comments
|
https://api.github.com/repos/ollama/ollama/issues/4220/events
|
https://github.com/ollama/ollama/issues/4220
| 2,282,254,626
|
I_kwDOJ0Z1Ps6ICHEi
| 4,220
|
modify template, system,or params on webpage
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 5
| 2024-05-07T04:36:28
| 2024-05-09T02:32:59
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
After pushing a model, I sometimes want to modify the template, system prompt, or params on the webpage.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4220/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4220/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7845
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7845/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7845/comments
|
https://api.github.com/repos/ollama/ollama/issues/7845/events
|
https://github.com/ollama/ollama/issues/7845
| 2,695,932,637
|
I_kwDOJ0Z1Ps6gsKrd
| 7,845
|
Support for rx 580
|
{
"login": "alibahal5",
"id": 112767677,
"node_id": "U_kgDOBriyvQ",
"avatar_url": "https://avatars.githubusercontent.com/u/112767677?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alibahal5",
"html_url": "https://github.com/alibahal5",
"followers_url": "https://api.github.com/users/alibahal5/followers",
"following_url": "https://api.github.com/users/alibahal5/following{/other_user}",
"gists_url": "https://api.github.com/users/alibahal5/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alibahal5/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alibahal5/subscriptions",
"organizations_url": "https://api.github.com/users/alibahal5/orgs",
"repos_url": "https://api.github.com/users/alibahal5/repos",
"events_url": "https://api.github.com/users/alibahal5/events{/privacy}",
"received_events_url": "https://api.github.com/users/alibahal5/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-26T20:00:06
| 2024-12-02T15:39:27
| 2024-12-02T15:39:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
please add support for the RX 580, or tell me if there is a way to use it.
Thanks.
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7845/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7845/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/522
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/522/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/522/comments
|
https://api.github.com/repos/ollama/ollama/issues/522/events
|
https://github.com/ollama/ollama/pull/522
| 1,893,532,942
|
PR_kwDOJ0Z1Ps5aLrOP
| 522
|
add a simple python client to access ollama
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-13T00:38:52
| 2023-09-15T04:25:10
| 2023-09-14T23:37:38
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/522",
"html_url": "https://github.com/ollama/ollama/pull/522",
"diff_url": "https://github.com/ollama/ollama/pull/522.diff",
"patch_url": "https://github.com/ollama/ollama/pull/522.patch",
"merged_at": "2023-09-14T23:37:38"
}
|
These are some simple python bindings for interacting with the local Ollama server. Most of the functions should be pretty straightforward, and each of the streaming endpoints has a default way of handling the output but can be passed a "callback" function to override the default.
The callback functions can be as simple as:
```
def my_callback(chunk):
"""
Callback function to handle individual chunks of the streaming response.
Parameters:
- chunk (dict): The individual chunk of JSON data from the streaming response.
"""
# Here, we are simply printing the entire chunk as a JSON string with indentation
# for readability. In a real application, you would likely want to do something
# more specific with the data in each chunk.
print(json.dumps(chunk, indent=4))
# If you want to specifically print the 'response' field, you can do so like this:
# response_piece = chunk.get('response')
# if response_piece:
# print(response_piece)
```
It's been a little while since I put together a python library, so I'm not sure if I hit all of the correct idioms here. To test it out, you can do something like:
```
>>> import sys
>>> sys.path.append("<path to git/ollama/api>")
>>> import client
```
To run something like generate:
```
>>> response, history = client.generate("llama2", "Is friendship like magic?")
```
This will return a string for the output and the context history, which you can feed back in with:
```
>>> client.generate("llama2", "Friendship sure feels like magic", context=history)
```
To hook in the above callback, use:
```
>>> client.generate("llama2", "Is the universe a giant black hole?", callback=my_callback)
```
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/522/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/522/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/336
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/336/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/336/comments
|
https://api.github.com/repos/ollama/ollama/issues/336/events
|
https://github.com/ollama/ollama/issues/336
| 1,847,340,064
|
I_kwDOJ0Z1Ps5uHCwg
| 336
|
model names should be case insensitive
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2023-08-11T19:46:59
| 2024-12-23T02:18:21
| 2024-12-23T00:52:24
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
```
ollama pull wizard-math:13b-q3_k_m
```
should pull the same image as
```
ollama pull wizard-math:13b-q3_K_M
```
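One way to implement this — a minimal sketch, not Ollama's actual resolution code — is to normalize the model name and tag to lowercase before lookup, so both spellings resolve to the same image:

```python
def normalize_model_name(name: str) -> str:
    """Hypothetical helper: lower-case the model and tag portions so that
    case variants of the same quantization tag resolve identically."""
    model, _, tag = name.partition(":")
    return f"{model.lower()}:{tag.lower()}" if tag else model.lower()

# Both spellings map to the same canonical name:
print(normalize_model_name("wizard-math:13b-q3_K_M"))  # wizard-math:13b-q3_k_m
print(normalize_model_name("wizard-math:13b-q3_k_m"))  # wizard-math:13b-q3_k_m
```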
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/336/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/336/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4552
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4552/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4552/comments
|
https://api.github.com/repos/ollama/ollama/issues/4552/events
|
https://github.com/ollama/ollama/issues/4552
| 2,307,398,787
|
I_kwDOJ0Z1Ps6JiByD
| 4,552
|
Please support LoongArch ISA
|
{
"login": "HougeLangley",
"id": 1161594,
"node_id": "MDQ6VXNlcjExNjE1OTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HougeLangley",
"html_url": "https://github.com/HougeLangley",
"followers_url": "https://api.github.com/users/HougeLangley/followers",
"following_url": "https://api.github.com/users/HougeLangley/following{/other_user}",
"gists_url": "https://api.github.com/users/HougeLangley/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HougeLangley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HougeLangley/subscriptions",
"organizations_url": "https://api.github.com/users/HougeLangley/orgs",
"repos_url": "https://api.github.com/users/HougeLangley/repos",
"events_url": "https://api.github.com/users/HougeLangley/events{/privacy}",
"received_events_url": "https://api.github.com/users/HougeLangley/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 6
| 2024-05-21T06:09:40
| 2024-05-29T22:52:19
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://github.com/ggerganov/llama.cpp/pull/6454#issuecomment-2119832472
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4552/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4552/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1692
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1692/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1692/comments
|
https://api.github.com/repos/ollama/ollama/issues/1692/events
|
https://github.com/ollama/ollama/issues/1692
| 2,054,944,902
|
I_kwDOJ0Z1Ps56e_iG
| 1,692
|
Mac OS Sonoma crashes completely when loading LLM
|
{
"login": "sanctimon",
"id": 49374753,
"node_id": "MDQ6VXNlcjQ5Mzc0NzUz",
"avatar_url": "https://avatars.githubusercontent.com/u/49374753?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sanctimon",
"html_url": "https://github.com/sanctimon",
"followers_url": "https://api.github.com/users/sanctimon/followers",
"following_url": "https://api.github.com/users/sanctimon/following{/other_user}",
"gists_url": "https://api.github.com/users/sanctimon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sanctimon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sanctimon/subscriptions",
"organizations_url": "https://api.github.com/users/sanctimon/orgs",
"repos_url": "https://api.github.com/users/sanctimon/repos",
"events_url": "https://api.github.com/users/sanctimon/events{/privacy}",
"received_events_url": "https://api.github.com/users/sanctimon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2023-12-24T00:30:55
| 2024-01-25T22:50:56
| 2024-01-25T22:50:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have pulled the model (dolphin-mixtral:latest) and when I attempt to run, the entire machine freezes. A few minutes later it restarts.
Specs: MacBook Pro M1 Pro. 16GB RAM.
With Activity Monitor on, it seems to be filling up the RAM quite quickly before the crash.
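A rough back-of-the-envelope check (the parameter count and bytes-per-weight figures are assumptions for Mixtral 8x7B at Q4 quantization, not numbers from this report) shows why the model cannot fit in 16 GB of unified memory:

```python
# Assumed figures: Mixtral 8x7B has ~46.7B total parameters; Q4_0 uses
# roughly 4.5 bits (~0.5625 bytes) per weight, before KV cache and overhead.
params_b = 46.7
bytes_per_weight = 0.5625

model_gb = params_b * bytes_per_weight  # weights alone, no context/overhead
print(f"~{model_gb:.0f} GB just for weights vs 16 GB of unified memory")
```

Since macOS keeps swapping until the system stalls, the freeze-then-restart behaviour is consistent with simple memory exhaustion.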
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1692/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1692/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6992
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6992/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6992/comments
|
https://api.github.com/repos/ollama/ollama/issues/6992/events
|
https://github.com/ollama/ollama/issues/6992
| 2,551,818,342
|
I_kwDOJ0Z1Ps6YGahm
| 6,992
|
/chat endpoint is not coherent
|
{
"login": "alchymia-sys",
"id": 176962419,
"node_id": "U_kgDOCow7cw",
"avatar_url": "https://avatars.githubusercontent.com/u/176962419?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alchymia-sys",
"html_url": "https://github.com/alchymia-sys",
"followers_url": "https://api.github.com/users/alchymia-sys/followers",
"following_url": "https://api.github.com/users/alchymia-sys/following{/other_user}",
"gists_url": "https://api.github.com/users/alchymia-sys/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alchymia-sys/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alchymia-sys/subscriptions",
"organizations_url": "https://api.github.com/users/alchymia-sys/orgs",
"repos_url": "https://api.github.com/users/alchymia-sys/repos",
"events_url": "https://api.github.com/users/alchymia-sys/events{/privacy}",
"received_events_url": "https://api.github.com/users/alchymia-sys/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-09-27T01:42:39
| 2024-10-02T00:00:27
| 2024-10-02T00:00:27
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I observed something with the /chat API endpoint: the model is always incoherent when having a conversation. However, when chatting with the model via the terminal using ollama run llama3.2, for instance, I find that the model is coherent.
/chat endpoint
User: Hello there
Model: It looks like you just wanted to acknowledge that we hadn't started a conversation yet. I'm here to help with any questions or topics you'd like to discuss. What's on your mind?
User: Suggest a topic
Model: Based on your interest in electronics, I suggest the topic: "Smart Home Automation" Would you like to explore this topic further?
User: Yes I'd love to
Model: It seems like we just started our conversation, but I'm happy to chat with you. Is there something specific you'd like to talk about or ask for help with? I'm here to assist you!
It's as if it just forgets the previous conversation and starts all over again. However, when using ollama run llama3.2, the whole conversation flows perfectly well.
What could be the issue? Are there specific /chat parameters that can get the endpoint to behave exactly like ollama run?
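The usual cause of this symptom is that /api/chat is stateless: the client must resend the entire message history on every request, which ollama run does automatically. A minimal sketch of client-side history handling (the helper names are hypothetical, not part of Ollama's API):

```python
import json

history = []  # accumulated conversation; the server keeps no state between calls

def build_chat_request(model, user_message):
    """Append the new user turn and return a full /api/chat payload.

    If only the latest message were sent, the model would see no earlier
    turns -- producing exactly the "forgets the conversation" behaviour.
    """
    history.append({"role": "user", "content": user_message})
    return {"model": model, "messages": list(history), "stream": False}

def record_reply(content):
    # Store the assistant's reply so the next request includes it.
    history.append({"role": "assistant", "content": content})

build_chat_request("llama3.2", "Hello there")
record_reply("Hi! What would you like to talk about?")
payload = build_chat_request("llama3.2", "Suggest a topic")
print(json.dumps(payload["messages"], indent=2))  # carries all prior turns
```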
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.11
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6992/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6992/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/790
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/790/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/790/comments
|
https://api.github.com/repos/ollama/ollama/issues/790/events
|
https://github.com/ollama/ollama/issues/790
| 1,943,037,061
|
I_kwDOJ0Z1Ps5z0GSF
| 790
|
"out of memory" when using CUDA
|
{
"login": "konstantin1722",
"id": 55327489,
"node_id": "MDQ6VXNlcjU1MzI3NDg5",
"avatar_url": "https://avatars.githubusercontent.com/u/55327489?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/konstantin1722",
"html_url": "https://github.com/konstantin1722",
"followers_url": "https://api.github.com/users/konstantin1722/followers",
"following_url": "https://api.github.com/users/konstantin1722/following{/other_user}",
"gists_url": "https://api.github.com/users/konstantin1722/gists{/gist_id}",
"starred_url": "https://api.github.com/users/konstantin1722/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/konstantin1722/subscriptions",
"organizations_url": "https://api.github.com/users/konstantin1722/orgs",
"repos_url": "https://api.github.com/users/konstantin1722/repos",
"events_url": "https://api.github.com/users/konstantin1722/events{/privacy}",
"received_events_url": "https://api.github.com/users/konstantin1722/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 11
| 2023-10-14T07:12:18
| 2024-03-25T09:53:31
| 2023-10-27T00:49:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I reinstalled Ollama, and after the merge of #724 the error on startup is gone. At startup it automatically calculates the number of layers to load into VRAM, but it does so incorrectly, which ultimately results in VRAM not being used at all.
I run the model `nous-hermes:13b-llama2`, after that I get this log in the log:
```
oct 11 15:40:01 desktop-pc systemd[1]: Started Ollama Service.
oct 11 15:40:01 desktop-pc ollama[32302]: 2023/10/11 15:40:01 images.go:996: total blobs: 17
oct 11 15:40:01 desktop-pc ollama[32302]: 2023/10/11 15:40:01 images.go:1003: total unused blobs removed: 0
oct 11 15:40:01 desktop-pc ollama[32302]: 2023/10/11 15:40:01 routes.go:572: Listening on 127.0.0.1:11434
oct 11 15:40:01 desktop-pc ollama[32302]: [GIN] 2023/10/11 - 15:40:01 | 200 | 33.576µs | 127.0.0.1 | HEAD "/"
oct 11 15:40:01 desktop-pc ollama[32302]: [GIN] 2023/10/11 - 15:40:01 | 200 | 468.103µs | 127.0.0.1 | GET "/api/tags"
oct 11 15:40:08 desktop-pc ollama[32302]: [GIN] 2023/10/11 - 15:40:08 | 200 | 13.184µs | 127.0.0.1 | HEAD "/"
oct 11 15:40:08 desktop-pc ollama[32302]: [GIN] 2023/10/11 - 15:40:08 | 200 | 370.716µs | 127.0.0.1 | GET "/api/tags"
oct 11 15:40:08 desktop-pc ollama[32302]: 2023/10/11 15:40:08 llama.go:239: 6144 MiB VRAM available, loading up to 35 GPU layers
oct 11 15:40:08 desktop-pc ollama[32302]: 2023/10/11 15:40:08 llama.go:313: starting llama runner
oct 11 15:40:08 desktop-pc ollama[32302]: 2023/10/11 15:40:08 llama.go:349: waiting for llama runner to start responding
oct 11 15:40:09 desktop-pc ollama[32352]: ggml_init_cublas: found 1 CUDA devices:
oct 11 15:40:09 desktop-pc ollama[32352]: Device 0: NVIDIA GeForce GTX 1060 6GB, compute capability 6.1
oct 11 15:40:09 desktop-pc ollama[32352]: {"timestamp":1697028009,"level":"INFO","function":"main","line":1190,"message":"build info","build":1009,"commit":"9e232f0"}
oct 11 15:40:09 desktop-pc ollama[32352]: {"timestamp":1697028009,"level":"INFO","function":"main","line":1192,"message":"system info","n_threads":3,"total_threads":6,"system_info":"AVX = 1| AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | VSX = 0 | "}
oct 11 15:40:09 desktop-pc ollama[32352]: llama.cpp: loading model from /usr/share/ollama/.ollama/models/blobs/sha256:f77c91fd65dd06ba92a6517fa5ab5bed86533b4171f0de63c0ab4883ac1ef826
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: format = ggjt v3 (latest)
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_vocab = 32032
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_ctx = 2048
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_embd = 5120
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_mult = 256
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_head = 40
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_head_kv = 40
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_layer = 40
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_rot = 128
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_gqa = 1
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: rnorm_eps = 5.0e-06
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: n_ff = 13824
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: freq_base = 10000.0
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: freq_scale = 1
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: ftype = 2 (mostly Q4_0)
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: model size = 13B
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: ggml ctx size = 0.11 MB
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: using CUDA for GPU acceleration
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: mem required = 1521.06 MB (+ 1600.00 MB per state)
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: allocating batch_size x (640 kB + n_ctx x 160 B) = 480 MB VRAM for the scratch buffer
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: offloading 35 repeating layers to GPU
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: offloaded 35/43 layers to GPU
oct 11 15:40:09 desktop-pc ollama[32352]: llama_model_load_internal: total VRAM used: 6437 MB
oct 11 15:40:09 desktop-pc ollama[32352]: CUDA error 2 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml/ggml-cuda.cu:6184: out of memory
```
From the log I can see that a value of 35 layers was selected which resulted in `out of memory`.
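Working backwards from the numbers in the log above (this is a rough sketch; the per-layer figure is derived from the failed 35-layer run, and real usage varies by layer), the card could only ever hold about 33 repeating layers, and far fewer once the 1600 MB KV cache is accounted for:

```python
# Figures taken from the log: 35 offloaded layers plus a 480 MB scratch
# buffer used 6437 MB on a card reporting 6144 MB available.
vram_mb = 6144
scratch_mb = 480
used_mb = 6437
kv_cache_mb = 1600  # "kv self size = 1600.00 MB" from the successful run

per_layer_mb = (used_mb - scratch_mb) / 35          # ~170 MB per layer
max_layers = int((vram_mb - scratch_mb) / per_layer_mb)
max_layers_with_kv = int((vram_mb - scratch_mb - kv_cache_mb) / per_layer_mb)

print(max_layers)          # layers that fit at load time
print(max_layers_with_kv)  # layers that still fit once the KV cache is allocated
```

This also explains why 22 layers loaded successfully but still failed during generation: the estimate at load time ignores memory that is only allocated once inference starts.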
Then I tried to specify the number manually at 22 layers:
```
oct 11 15:42:01 desktop-pc systemd[1]: Started Ollama Service.
oct 11 15:42:01 desktop-pc ollama[32411]: 2023/10/11 15:42:01 images.go:996: total blobs: 17
oct 11 15:42:01 desktop-pc ollama[32411]: 2023/10/11 15:42:01 images.go:1003: total unused blobs removed: 0
oct 11 15:42:01 desktop-pc ollama[32411]: 2023/10/11 15:42:01 routes.go:572: Listening on 127.0.0.1:11434
oct 11 15:42:01 desktop-pc ollama[32411]: [GIN] 2023/10/11 - 15:42:01 | 200 | 39.45µs | 127.0.0.1 | HEAD "/"
oct 11 15:42:01 desktop-pc ollama[32411]: [GIN] 2023/10/11 - 15:42:01 | 200 | 641.805µs | 127.0.0.1 | GET "/api/tags"
oct 11 15:42:08 desktop-pc ollama[32411]: [GIN] 2023/10/11 - 15:42:08 | 200 | 23.622µs | 127.0.0.1 | HEAD "/"
oct 11 15:42:08 desktop-pc ollama[32411]: [GIN] 2023/10/11 - 15:42:08 | 200 | 696.378µs | 127.0.0.1 | GET "/api/tags"
oct 11 15:42:08 desktop-pc ollama[32411]: 2023/10/11 15:42:08 llama.go:313: starting llama runner
oct 11 15:42:08 desktop-pc ollama[32411]: 2023/10/11 15:42:08 llama.go:349: waiting for llama runner to start responding
oct 11 15:42:08 desktop-pc ollama[32462]: ggml_init_cublas: found 1 CUDA devices:
oct 11 15:42:08 desktop-pc ollama[32462]: Device 0: NVIDIA GeForce GTX 1060 6GB, compute capability 6.1
oct 11 15:42:08 desktop-pc ollama[32462]: {"timestamp":1697028128,"level":"INFO","function":"main","line":1190,"message":"build info","build":1009,"commit":"9e232f0"}
oct 11 15:42:08 desktop-pc ollama[32462]: {"timestamp":1697028128,"level":"INFO","function":"main","line":1192,"message":"system info","n_threads":6,"total_threads":6,"system_info":"AVX = 1| AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | VSX = 0 | "}
oct 11 15:42:08 desktop-pc ollama[32462]: llama.cpp: loading model from /usr/share/ollama/.ollama/models/blobs/sha256:f77c91fd65dd06ba92a6517fa5ab5bed86533b4171f0de63c0ab4883ac1ef826
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: format = ggjt v3 (latest)
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_vocab = 32032
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_ctx = 2048
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_embd = 5120
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_mult = 256
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_head = 40
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_head_kv = 40
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_layer = 40
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_rot = 128
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_gqa = 1
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: rnorm_eps = 5.0e-06
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: n_ff = 13824
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: freq_base = 10000.0
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: freq_scale = 1
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: ftype = 2 (mostly Q4_0)
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: model size = 13B
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: ggml ctx size = 0.11 MB
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: using CUDA for GPU acceleration
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: mem required = 3733.60 MB (+ 1600.00 MB per state)
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: allocating batch_size x (640 kB + n_ctx x 160 B) = 480 MB VRAM for the scratch buffer
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: offloading 22 repeating layers to GPU
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: offloaded 22/43 layers to GPU
oct 11 15:42:08 desktop-pc ollama[32462]: llama_model_load_internal: total VRAM used: 4225 MB
oct 11 15:42:09 desktop-pc ollama[32462]: llama_new_context_with_model: kv self size = 1600.00 MB
oct 11 15:42:09 desktop-pc ollama[32462]: llama server listening at http://127.0.0.1:62934
```
In this case startup succeeds, but memory still runs out during generation and I get:
```
oct 11 15:44:22 desktop-pc ollama[32462]: CUDA error 2 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/ggml/ggml-cuda.cu:4856: out of memory
oct 11 15:44:23 desktop-pc ollama[32411]: [GIN] 2023/10/11 - 15:44:23 | 200 | 2.83776208s | 127.0.0.1 | POST "/api/generate"
oct 11 15:44:23 desktop-pc ollama[32411]: 2023/10/11 15:44:23 llama.go:323: llama runner exited with error: exit status 1
```
I'm assuming this behaviour is not the norm.
Generation with 18 layers works successfully for the 13B model.
Also, I noticed that for the `llama2-uncensored:7b-chat-q8_0` model, no layers are loaded into VRAM at all, even when `num_gpu` is set explicitly in the Modelfile. Is this normal behaviour?
Also, can you answer a couple of additional questions on the topic?
1. I noticed that the load on the GPU is uneven: sometimes it sits at 0%, sometimes it spikes to 100%. Is this related to the number of offloaded layers, or is it random?
2. I still see heavy disk activity and RAM usage while generating text. Is that expected? I thought the whole model would be loaded into RAM (for a 13B model that should take about 7.3 GB of RAM), or am I wrong?
I would be very grateful if you could clarify these two points. But of course the underlying problem is the most important one within the scope of this post.
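For reference, the layer count can be capped explicitly per request instead of being estimated; a minimal sketch, assuming the default server port ("mymodel:13b" is a placeholder model name, while `num_gpu` is the real Ollama option, set here to the 18 layers that worked above):

```python
import json

# Build a /api/generate request body that offloads only 18 layers,
# matching the configuration that generated successfully above.
payload = {
    "model": "mymodel:13b",   # placeholder; substitute the actual model tag
    "prompt": "hello",
    "options": {"num_gpu": 18},
    "stream": False,
}
body = json.dumps(payload).encode()
# POST `body` to http://127.0.0.1:11434/api/generate to run the request.
```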
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/790/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/790/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3442
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3442/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3442/comments
|
https://api.github.com/repos/ollama/ollama/issues/3442/events
|
https://github.com/ollama/ollama/pull/3442
| 2,219,035,743
|
PR_kwDOJ0Z1Ps5rW9A4
| 3,442
|
fix generate output
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-01T20:47:47
| 2024-04-01T20:56:10
| 2024-04-01T20:56:09
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3442",
"html_url": "https://github.com/ollama/ollama/pull/3442",
"diff_url": "https://github.com/ollama/ollama/pull/3442.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3442.patch",
"merged_at": "2024-04-01T20:56:09"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3442/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3442/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3432
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3432/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3432/comments
|
https://api.github.com/repos/ollama/ollama/issues/3432/events
|
https://github.com/ollama/ollama/issues/3432
| 2,217,416,828
|
I_kwDOJ0Z1Ps6EKxh8
| 3,432
|
Partial Deletion of Chinese Characters in Ollama Project
|
{
"login": "TanyaShue",
"id": 152378303,
"node_id": "U_kgDOCRUbvw",
"avatar_url": "https://avatars.githubusercontent.com/u/152378303?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TanyaShue",
"html_url": "https://github.com/TanyaShue",
"followers_url": "https://api.github.com/users/TanyaShue/followers",
"following_url": "https://api.github.com/users/TanyaShue/following{/other_user}",
"gists_url": "https://api.github.com/users/TanyaShue/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TanyaShue/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TanyaShue/subscriptions",
"organizations_url": "https://api.github.com/users/TanyaShue/orgs",
"repos_url": "https://api.github.com/users/TanyaShue/repos",
"events_url": "https://api.github.com/users/TanyaShue/events{/privacy}",
"received_events_url": "https://api.github.com/users/TanyaShue/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6960960225,
"node_id": "LA_kwDOJ0Z1Ps8AAAABnufS4Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/cli",
"name": "cli",
"color": "5319e7",
"default": false,
"description": "Issues related to the Ollama CLI"
}
] |
closed
| false
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-04-01T03:03:45
| 2024-05-28T19:04:04
| 2024-05-28T19:04:04
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Description:
When running the Ollama project (version 0.1.30) on Windows 11 in PowerShell or Windows Terminal, I've encountered an issue where only half of each Chinese character is deleted when attempting to remove it. Steps to reproduce:
1. Launch Ollama with `ollama run mistral`.
2. Enter Chinese characters, for example "哈哈哈哈哈哈哈哈哈".
3. Attempt to delete the Chinese characters using the backspace key.
### What did you expect to see?
Expected Behavior:
Each Chinese character should be entirely removed from the input text area with a single backspace action.
Actual Behavior:
Only half of each Chinese character is removed with each backspace action, leaving behind a residual character fragment that cannot be deleted.
### Steps to reproduce
_No response_
### Are there any recent changes that introduced the issue?
_No response_
### OS
Windows
### Architecture
amd64
### Platform
_No response_
### Ollama version
0.1.30
### GPU
Nvidia
### GPU info
_No response_
### CPU
_No response_
### Other software
_No response_
|
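The symptom described is consistent with byte-level rather than character-level deletion: each Chinese character occupies several UTF-8 bytes, so removing one byte leaves an invalid fragment. A minimal sketch of the difference (illustrative only, not Ollama's actual input-handling code):

```python
s = "哈哈哈"
encoded = s.encode("utf-8")   # each of these characters is 3 bytes in UTF-8

# Byte-level backspace: drops one byte, leaving an undecodable fragment.
broken = encoded[:-1]

# Character-level backspace: drops the whole character.
fixed = s[:-1]                # "哈哈"
```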
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3432/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3432/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6530
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6530/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6530/comments
|
https://api.github.com/repos/ollama/ollama/issues/6530/events
|
https://github.com/ollama/ollama/pull/6530
| 2,490,133,550
|
PR_kwDOJ0Z1Ps55nUGk
| 6,530
|
fix: comment typo
|
{
"login": "seankhatiri",
"id": 39913795,
"node_id": "MDQ6VXNlcjM5OTEzNzk1",
"avatar_url": "https://avatars.githubusercontent.com/u/39913795?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/seankhatiri",
"html_url": "https://github.com/seankhatiri",
"followers_url": "https://api.github.com/users/seankhatiri/followers",
"following_url": "https://api.github.com/users/seankhatiri/following{/other_user}",
"gists_url": "https://api.github.com/users/seankhatiri/gists{/gist_id}",
"starred_url": "https://api.github.com/users/seankhatiri/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/seankhatiri/subscriptions",
"organizations_url": "https://api.github.com/users/seankhatiri/orgs",
"repos_url": "https://api.github.com/users/seankhatiri/repos",
"events_url": "https://api.github.com/users/seankhatiri/events{/privacy}",
"received_events_url": "https://api.github.com/users/seankhatiri/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-27T19:06:21
| 2024-08-27T21:43:43
| 2024-08-27T20:28:30
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6530",
"html_url": "https://github.com/ollama/ollama/pull/6530",
"diff_url": "https://github.com/ollama/ollama/pull/6530.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6530.patch",
"merged_at": "2024-08-27T20:28:30"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6530/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6530/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/388
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/388/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/388/comments
|
https://api.github.com/repos/ollama/ollama/issues/388/events
|
https://github.com/ollama/ollama/issues/388
| 1,858,040,708
|
I_kwDOJ0Z1Ps5uv3OE
| 388
|
Chat with documents
|
{
"login": "mohdyasa",
"id": 29009636,
"node_id": "MDQ6VXNlcjI5MDA5NjM2",
"avatar_url": "https://avatars.githubusercontent.com/u/29009636?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mohdyasa",
"html_url": "https://github.com/mohdyasa",
"followers_url": "https://api.github.com/users/mohdyasa/followers",
"following_url": "https://api.github.com/users/mohdyasa/following{/other_user}",
"gists_url": "https://api.github.com/users/mohdyasa/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mohdyasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mohdyasa/subscriptions",
"organizations_url": "https://api.github.com/users/mohdyasa/orgs",
"repos_url": "https://api.github.com/users/mohdyasa/repos",
"events_url": "https://api.github.com/users/mohdyasa/events{/privacy}",
"received_events_url": "https://api.github.com/users/mohdyasa/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-08-20T08:35:27
| 2023-08-21T16:55:31
| 2023-08-21T16:55:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Is it possible to chat with documents (pdf, doc, etc.) using this solution?
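Document chat is not built into the CLI; the usual approach is retrieval-augmented generation: extract the document text, select the chunks relevant to the question, and prepend them to the prompt. A minimal keyword-overlap sketch (a real setup would use embeddings and a PDF text extractor):

```python
def retrieve(chunks, question, k=1):
    # Rank chunks by word overlap with the question; a naive
    # stand-in for embedding-based similarity search.
    q = set(question.lower().split())
    return sorted(chunks, key=lambda c: -len(set(c.lower().split()) & q))[:k]

chunks = [
    "Ollama runs language models locally.",
    "Extract pdf text before sending it to the model.",
]
question = "how do I chat with a pdf"
context = "\n".join(retrieve(chunks, question))
prompt = f"Answer using this context:\n{context}\n\nQuestion: {question}"
```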
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/388/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/388/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7493
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7493/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7493/comments
|
https://api.github.com/repos/ollama/ollama/issues/7493/events
|
https://github.com/ollama/ollama/pull/7493
| 2,633,353,310
|
PR_kwDOJ0Z1Ps6A08xj
| 7,493
|
Sign windows arm64 official binaries
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-04T17:03:32
| 2024-11-04T17:15:17
| 2024-11-04T17:15:14
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7493",
"html_url": "https://github.com/ollama/ollama/pull/7493",
"diff_url": "https://github.com/ollama/ollama/pull/7493.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7493.patch",
"merged_at": "2024-11-04T17:15:14"
}
|
This will ensure the zip file artifact contains signed binaries.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7493/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7493/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7875
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7875/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7875/comments
|
https://api.github.com/repos/ollama/ollama/issues/7875/events
|
https://github.com/ollama/ollama/pull/7875
| 2,703,097,006
|
PR_kwDOJ0Z1Ps6DgSMo
| 7,875
|
llama: update llama.cpp and ggml to `40c6d79f`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-28T19:28:08
| 2024-12-11T03:21:36
| 2024-12-11T03:21:34
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7875",
"html_url": "https://github.com/ollama/ollama/pull/7875",
"diff_url": "https://github.com/ollama/ollama/pull/7875.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7875.patch",
"merged_at": "2024-12-11T03:21:34"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7875/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7875/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8273
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8273/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8273/comments
|
https://api.github.com/repos/ollama/ollama/issues/8273/events
|
https://github.com/ollama/ollama/issues/8273
| 2,764,131,849
|
I_kwDOJ0Z1Ps6kwU4J
| 8,273
|
Token counting
|
{
"login": "mjaniec2013",
"id": 5925782,
"node_id": "MDQ6VXNlcjU5MjU3ODI=",
"avatar_url": "https://avatars.githubusercontent.com/u/5925782?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mjaniec2013",
"html_url": "https://github.com/mjaniec2013",
"followers_url": "https://api.github.com/users/mjaniec2013/followers",
"following_url": "https://api.github.com/users/mjaniec2013/following{/other_user}",
"gists_url": "https://api.github.com/users/mjaniec2013/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mjaniec2013/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mjaniec2013/subscriptions",
"organizations_url": "https://api.github.com/users/mjaniec2013/orgs",
"repos_url": "https://api.github.com/users/mjaniec2013/repos",
"events_url": "https://api.github.com/users/mjaniec2013/events{/privacy}",
"received_events_url": "https://api.github.com/users/mjaniec2013/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-12-31T08:58:10
| 2025-01-01T00:21:28
| 2025-01-01T00:21:27
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Would it be possible to add model-dependent token counting functionality to Ollama?
Currently, counting tokens requires using the HF AutoTokenizer and a duplicate installation of the models.
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8273/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8273/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/6849
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6849/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6849/comments
|
https://api.github.com/repos/ollama/ollama/issues/6849/events
|
https://github.com/ollama/ollama/pull/6849
| 2,532,322,555
|
PR_kwDOJ0Z1Ps57019R
| 6,849
|
server: add tool parsing support for nemotron-mini
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-17T22:57:11
| 2024-09-18T01:06:18
| 2024-09-18T01:06:16
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6849",
"html_url": "https://github.com/ollama/ollama/pull/6849",
"diff_url": "https://github.com/ollama/ollama/pull/6849.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6849.patch",
"merged_at": "2024-09-18T01:06:16"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6849/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6849/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8242
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8242/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8242/comments
|
https://api.github.com/repos/ollama/ollama/issues/8242/events
|
https://github.com/ollama/ollama/pull/8242
| 2,759,087,358
|
PR_kwDOJ0Z1Ps6GOD6Z
| 8,242
|
README: add neollama to terminal section of community integrations
|
{
"login": "paradoxical-dev",
"id": 150468425,
"node_id": "U_kgDOCPf3SQ",
"avatar_url": "https://avatars.githubusercontent.com/u/150468425?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/paradoxical-dev",
"html_url": "https://github.com/paradoxical-dev",
"followers_url": "https://api.github.com/users/paradoxical-dev/followers",
"following_url": "https://api.github.com/users/paradoxical-dev/following{/other_user}",
"gists_url": "https://api.github.com/users/paradoxical-dev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/paradoxical-dev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/paradoxical-dev/subscriptions",
"organizations_url": "https://api.github.com/users/paradoxical-dev/orgs",
"repos_url": "https://api.github.com/users/paradoxical-dev/repos",
"events_url": "https://api.github.com/users/paradoxical-dev/events{/privacy}",
"received_events_url": "https://api.github.com/users/paradoxical-dev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-12-25T20:47:22
| 2024-12-25T22:16:18
| 2024-12-25T22:16:12
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8242",
"html_url": "https://github.com/ollama/ollama/pull/8242",
"diff_url": "https://github.com/ollama/ollama/pull/8242.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8242.patch",
"merged_at": "2024-12-25T22:16:12"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8242/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8242/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3989
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3989/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3989/comments
|
https://api.github.com/repos/ollama/ollama/issues/3989/events
|
https://github.com/ollama/ollama/issues/3989
| 2,267,312,478
|
I_kwDOJ0Z1Ps6HJHFe
| 3,989
|
Please add this model
|
{
"login": "OE-LUCIFER",
"id": 158988478,
"node_id": "U_kgDOCXn4vg",
"avatar_url": "https://avatars.githubusercontent.com/u/158988478?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/OE-LUCIFER",
"html_url": "https://github.com/OE-LUCIFER",
"followers_url": "https://api.github.com/users/OE-LUCIFER/followers",
"following_url": "https://api.github.com/users/OE-LUCIFER/following{/other_user}",
"gists_url": "https://api.github.com/users/OE-LUCIFER/gists{/gist_id}",
"starred_url": "https://api.github.com/users/OE-LUCIFER/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OE-LUCIFER/subscriptions",
"organizations_url": "https://api.github.com/users/OE-LUCIFER/orgs",
"repos_url": "https://api.github.com/users/OE-LUCIFER/repos",
"events_url": "https://api.github.com/users/OE-LUCIFER/events{/privacy}",
"received_events_url": "https://api.github.com/users/OE-LUCIFER/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 1
| 2024-04-28T03:08:04
| 2024-04-28T13:13:11
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/mradermacher/DevsDoCode-LLama-3-8b-Uncensored-GGUF
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3989/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3989/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4764
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4764/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4764/comments
|
https://api.github.com/repos/ollama/ollama/issues/4764/events
|
https://github.com/ollama/ollama/issues/4764
| 2,329,064,674
|
I_kwDOJ0Z1Ps6K0rTi
| 4,764
|
ollama stop [id of running model]
|
{
"login": "mrdev023",
"id": 11292703,
"node_id": "MDQ6VXNlcjExMjkyNzAz",
"avatar_url": "https://avatars.githubusercontent.com/u/11292703?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mrdev023",
"html_url": "https://github.com/mrdev023",
"followers_url": "https://api.github.com/users/mrdev023/followers",
"following_url": "https://api.github.com/users/mrdev023/following{/other_user}",
"gists_url": "https://api.github.com/users/mrdev023/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mrdev023/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mrdev023/subscriptions",
"organizations_url": "https://api.github.com/users/mrdev023/orgs",
"repos_url": "https://api.github.com/users/mrdev023/repos",
"events_url": "https://api.github.com/users/mrdev023/events{/privacy}",
"received_events_url": "https://api.github.com/users/mrdev023/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-06-01T10:30:29
| 2024-09-11T23:37:01
| 2024-09-11T23:37:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Sometimes, when I use external tools with Ollama, some models continue to run after the application exits.
It would be useful to have a command like this:
```bash
ollama stop [id of running model]
```
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4764/reactions",
"total_count": 7,
"+1": 7,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4764/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8508
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8508/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8508/comments
|
https://api.github.com/repos/ollama/ollama/issues/8508/events
|
https://github.com/ollama/ollama/pull/8508
| 2,800,355,360
|
PR_kwDOJ0Z1Ps6IahSH
| 8,508
|
Set up CI with Azure Pipelines
|
{
"login": "enzoxic",
"id": 157711992,
"node_id": "U_kgDOCWZ-eA",
"avatar_url": "https://avatars.githubusercontent.com/u/157711992?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/enzoxic",
"html_url": "https://github.com/enzoxic",
"followers_url": "https://api.github.com/users/enzoxic/followers",
"following_url": "https://api.github.com/users/enzoxic/following{/other_user}",
"gists_url": "https://api.github.com/users/enzoxic/gists{/gist_id}",
"starred_url": "https://api.github.com/users/enzoxic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/enzoxic/subscriptions",
"organizations_url": "https://api.github.com/users/enzoxic/orgs",
"repos_url": "https://api.github.com/users/enzoxic/repos",
"events_url": "https://api.github.com/users/enzoxic/events{/privacy}",
"received_events_url": "https://api.github.com/users/enzoxic/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2025-01-20T22:27:08
| 2025-01-27T17:57:09
| 2025-01-27T17:57:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8508",
"html_url": "https://github.com/ollama/ollama/pull/8508",
"diff_url": "https://github.com/ollama/ollama/pull/8508.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8508.patch",
"merged_at": null
}
|
First pipeline [skip ci]
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8508/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8508/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6710
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6710/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6710/comments
|
https://api.github.com/repos/ollama/ollama/issues/6710/events
|
https://github.com/ollama/ollama/issues/6710
| 2,513,421,265
|
I_kwDOJ0Z1Ps6Vz8PR
| 6,710
|
Docker: P8 State Power Usage double with 0.3.8+
|
{
"login": "t3chn0m4g3",
"id": 4318452,
"node_id": "MDQ6VXNlcjQzMTg0NTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/4318452?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/t3chn0m4g3",
"html_url": "https://github.com/t3chn0m4g3",
"followers_url": "https://api.github.com/users/t3chn0m4g3/followers",
"following_url": "https://api.github.com/users/t3chn0m4g3/following{/other_user}",
"gists_url": "https://api.github.com/users/t3chn0m4g3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/t3chn0m4g3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/t3chn0m4g3/subscriptions",
"organizations_url": "https://api.github.com/users/t3chn0m4g3/orgs",
"repos_url": "https://api.github.com/users/t3chn0m4g3/repos",
"events_url": "https://api.github.com/users/t3chn0m4g3/events{/privacy}",
"received_events_url": "https://api.github.com/users/t3chn0m4g3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-09-09T09:24:44
| 2024-09-10T09:10:01
| 2024-09-09T18:05:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Versions of `ollama/ollama` < 0.3.8 had a P8 state power usage of about 4-5W. With 0.3.8+ this usage doubled to 9-10W.
I am running two Nvidia RTX 4060 Ti cards, and the issue is 100% reproducible. Resetting the driver with `nvidia-smi -r` and then starting `ollama/ollama:0.3.7` shows the expected 4-5W in the P8 state, while resetting and doing the same for `ollama/ollama:0.3.8` and `ollama/ollama:0.3.9` increases the P8 state to 9-10W again.
Happy to provide more information, just let me know.
### OS
Docker
### GPU
Nvidia
### CPU
AMD
### Ollama version
<=0.3.7 (works), >=0.3.8 (issue)
|
{
"login": "t3chn0m4g3",
"id": 4318452,
"node_id": "MDQ6VXNlcjQzMTg0NTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/4318452?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/t3chn0m4g3",
"html_url": "https://github.com/t3chn0m4g3",
"followers_url": "https://api.github.com/users/t3chn0m4g3/followers",
"following_url": "https://api.github.com/users/t3chn0m4g3/following{/other_user}",
"gists_url": "https://api.github.com/users/t3chn0m4g3/gists{/gist_id}",
"starred_url": "https://api.github.com/users/t3chn0m4g3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/t3chn0m4g3/subscriptions",
"organizations_url": "https://api.github.com/users/t3chn0m4g3/orgs",
"repos_url": "https://api.github.com/users/t3chn0m4g3/repos",
"events_url": "https://api.github.com/users/t3chn0m4g3/events{/privacy}",
"received_events_url": "https://api.github.com/users/t3chn0m4g3/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6710/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6710/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7282
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7282/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7282/comments
|
https://api.github.com/repos/ollama/ollama/issues/7282/events
|
https://github.com/ollama/ollama/pull/7282
| 2,601,082,609
|
PR_kwDOJ0Z1Ps5_P5zH
| 7,282
|
Next Version
|
{
"login": "cabelo",
"id": 675645,
"node_id": "MDQ6VXNlcjY3NTY0NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/675645?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cabelo",
"html_url": "https://github.com/cabelo",
"followers_url": "https://api.github.com/users/cabelo/followers",
"following_url": "https://api.github.com/users/cabelo/following{/other_user}",
"gists_url": "https://api.github.com/users/cabelo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cabelo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cabelo/subscriptions",
"organizations_url": "https://api.github.com/users/cabelo/orgs",
"repos_url": "https://api.github.com/users/cabelo/repos",
"events_url": "https://api.github.com/users/cabelo/events{/privacy}",
"received_events_url": "https://api.github.com/users/cabelo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-10-21T01:49:18
| 2024-10-22T00:45:56
| 2024-10-22T00:45:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7282",
"html_url": "https://github.com/ollama/ollama/pull/7282",
"diff_url": "https://github.com/ollama/ollama/pull/7282.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7282.patch",
"merged_at": null
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7282/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7282/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6723
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6723/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6723/comments
|
https://api.github.com/repos/ollama/ollama/issues/6723/events
|
https://github.com/ollama/ollama/issues/6723
| 2,515,859,359
|
I_kwDOJ0Z1Ps6V9Pef
| 6,723
|
How to change the system memory folder ?
|
{
"login": "mdabir1203",
"id": 66947064,
"node_id": "MDQ6VXNlcjY2OTQ3MDY0",
"avatar_url": "https://avatars.githubusercontent.com/u/66947064?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mdabir1203",
"html_url": "https://github.com/mdabir1203",
"followers_url": "https://api.github.com/users/mdabir1203/followers",
"following_url": "https://api.github.com/users/mdabir1203/following{/other_user}",
"gists_url": "https://api.github.com/users/mdabir1203/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mdabir1203/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mdabir1203/subscriptions",
"organizations_url": "https://api.github.com/users/mdabir1203/orgs",
"repos_url": "https://api.github.com/users/mdabir1203/repos",
"events_url": "https://api.github.com/users/mdabir1203/events{/privacy}",
"received_events_url": "https://api.github.com/users/mdabir1203/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-09-10T09:07:18
| 2024-09-12T01:04:25
| 2024-09-12T01:04:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I tried to run Llama 3.1 with Ollama and I am getting this:

I have enough disk space, but what are the reasons behind this?
### OS
Windows
### GPU
_No response_
### CPU
Intel
### Ollama version
0.3.9
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6723/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6723/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5992
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5992/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5992/comments
|
https://api.github.com/repos/ollama/ollama/issues/5992/events
|
https://github.com/ollama/ollama/pull/5992
| 2,432,932,950
|
PR_kwDOJ0Z1Ps52niPk
| 5,992
|
fix: model save
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-26T20:26:10
| 2024-07-29T16:53:21
| 2024-07-29T16:53:19
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5992",
"html_url": "https://github.com/ollama/ollama/pull/5992",
"diff_url": "https://github.com/ollama/ollama/pull/5992.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5992.patch",
"merged_at": "2024-07-29T16:53:19"
}
|
The stop parameter is saved as a slice, which is incompatible with Modelfile parsing. This change saves stop parameters as individual parameters.
e.g.
```
/set parameter stop word1 word2
/save new-model
```
produces `PARAMETER stop [word1 word2]`
Instead, it should produce
```
PARAMETER stop word1
PARAMETER stop word2
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5992/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5992/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3009
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3009/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3009/comments
|
https://api.github.com/repos/ollama/ollama/issues/3009/events
|
https://github.com/ollama/ollama/issues/3009
| 2,176,547,638
|
I_kwDOJ0Z1Ps6Bu3s2
| 3,009
|
feat: add "unload model" command/endpoint
|
{
"login": "knoopx",
"id": 100993,
"node_id": "MDQ6VXNlcjEwMDk5Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/100993?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/knoopx",
"html_url": "https://github.com/knoopx",
"followers_url": "https://api.github.com/users/knoopx/followers",
"following_url": "https://api.github.com/users/knoopx/following{/other_user}",
"gists_url": "https://api.github.com/users/knoopx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/knoopx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/knoopx/subscriptions",
"organizations_url": "https://api.github.com/users/knoopx/orgs",
"repos_url": "https://api.github.com/users/knoopx/repos",
"events_url": "https://api.github.com/users/knoopx/events{/privacy}",
"received_events_url": "https://api.github.com/users/knoopx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 7
| 2024-03-08T18:22:12
| 2024-03-14T12:45:00
| 2024-03-12T21:05:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
There's no way to unload a model from VRAM other than killing/restarting Ollama, and that requires local system access and privileges. Given Ollama is mostly used on limited devices, a command/API endpoint would be fantastic.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3009/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3009/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7755
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7755/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7755/comments
|
https://api.github.com/repos/ollama/ollama/issues/7755/events
|
https://github.com/ollama/ollama/issues/7755
| 2,674,384,297
|
I_kwDOJ0Z1Ps6fZ92p
| 7,755
|
Proper way to train model on my data and load into Ollama?
|
{
"login": "robotom",
"id": 45123215,
"node_id": "MDQ6VXNlcjQ1MTIzMjE1",
"avatar_url": "https://avatars.githubusercontent.com/u/45123215?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robotom",
"html_url": "https://github.com/robotom",
"followers_url": "https://api.github.com/users/robotom/followers",
"following_url": "https://api.github.com/users/robotom/following{/other_user}",
"gists_url": "https://api.github.com/users/robotom/gists{/gist_id}",
"starred_url": "https://api.github.com/users/robotom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/robotom/subscriptions",
"organizations_url": "https://api.github.com/users/robotom/orgs",
"repos_url": "https://api.github.com/users/robotom/repos",
"events_url": "https://api.github.com/users/robotom/events{/privacy}",
"received_events_url": "https://api.github.com/users/robotom/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-11-20T04:48:00
| 2025-01-13T01:28:42
| 2025-01-13T01:28:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
As I've mentioned in the title, I have some huge text-based documents which exceed typical context windows, even on large machines with large models (e.g. 405B). Is there a way I could train llama 3.1:8B (for example) on these docs and then load it into Ollama and ask the model about them? Thank you!
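An alternative to fine-tuning that avoids training altogether is retrieval: split the documents into overlapping chunks, embed them, and pull only the relevant chunks into the prompt at question time. A minimal chunking sketch (chunk size and overlap values are illustrative, not recommendations):

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split a long document into overlapping chunks that fit a model's context."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap  # advance by less than chunk_size to overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks
```

Each chunk would then be embedded and indexed; at query time only the top-scoring chunks go into the context window.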
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7755/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7755/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2641
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2641/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2641/comments
|
https://api.github.com/repos/ollama/ollama/issues/2641/events
|
https://github.com/ollama/ollama/issues/2641
| 2,147,182,989
|
I_kwDOJ0Z1Ps5_-2mN
| 2,641
|
GPU sometimes detected, sometimes not (Windows beta)
|
{
"login": "CrispStrobe",
"id": 154636388,
"node_id": "U_kgDOCTeQZA",
"avatar_url": "https://avatars.githubusercontent.com/u/154636388?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/CrispStrobe",
"html_url": "https://github.com/CrispStrobe",
"followers_url": "https://api.github.com/users/CrispStrobe/followers",
"following_url": "https://api.github.com/users/CrispStrobe/following{/other_user}",
"gists_url": "https://api.github.com/users/CrispStrobe/gists{/gist_id}",
"starred_url": "https://api.github.com/users/CrispStrobe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CrispStrobe/subscriptions",
"organizations_url": "https://api.github.com/users/CrispStrobe/orgs",
"repos_url": "https://api.github.com/users/CrispStrobe/repos",
"events_url": "https://api.github.com/users/CrispStrobe/events{/privacy}",
"received_events_url": "https://api.github.com/users/CrispStrobe/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 13
| 2024-02-21T16:25:11
| 2024-04-16T16:13:32
| 2024-04-16T16:13:31
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Using ollama 0.1.25 under Windows, sometimes my GPU (A1000) is detected:
From server.log:
time=2024-02-21T17:04:44.912+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-21T17:04:44.912+01:00 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 8.6"
time=2024-02-21T17:04:44.912+01:00 level=DEBUG source=gpu.go:251 msg="cuda detected 1 devices with 2603M available memory"
time=2024-02-21T17:04:44.912+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-21T17:04:44.912+01:00 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 8.6"
time=2024-02-21T17:04:44.912+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-21T17:04:44.912+01:00 level=DEBUG source=payload_common.go:93 msg="ordered list of LLM libraries to try [...]
time=2024-02-21T17:04:44.912+01:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: [...]"
time=2024-02-21T17:04:44.913+01:00 level=INFO source=dyn_ext_server.go:145 msg="Initializing llama server"
[1708531484] system info: AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 0 | VSX = 0 | MATMUL_INT8 = 0 |
[1708531484] Performing pre-initialization of GPU
Sometimes it is not detected, or the detection does not work:
time=2024-02-21T16:51:03.026+01:00 level=INFO source=gpu.go:94 msg="Detecting GPU type"
time=2024-02-21T16:51:03.026+01:00 level=INFO source=gpu.go:262 msg="Searching for GPU management library nvml.dll"
time=2024-02-21T16:51:03.026+01:00 level=DEBUG source=gpu.go:280 msg="gpu management search paths: [...]
time=2024-02-21T16:51:03.748+01:00 level=INFO source=gpu.go:99 msg="Nvidia GPU detected"
time=2024-02-21T16:51:03.749+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-21T16:51:03.755+01:00 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 8.6"
time=2024-02-21T16:51:03.755+01:00 level=DEBUG source=gpu.go:251 msg="cuda detected 1 devices with 2956M available memory"
time=2024-02-21T16:51:03.755+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-21T16:51:03.755+01:00 level=INFO source=gpu.go:146 msg="CUDA Compute Capability detected: 8.6"
time=2024-02-21T16:51:03.755+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-21T16:51:03.755+01:00 level=DEBUG source=payload_common.go:93 msg="ordered list of LLM libraries to try [...]
time=2024-02-21T16:51:03.761+01:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: [...]"
time=2024-02-21T16:51:03.761+01:00 level=INFO source=dyn_ext_server.go:145 msg="Initializing llama server"
[1708530663] system info: AVX = 0 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 0 | NEON = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 0 | SSSE3 = 0 | VSX = 0 | MATMUL_INT8 = 0 |
Note the discrepancies in AVX(2) etc. This is "almost" reproducible: if I run 10 starts, quitting Ollama in between, I can be sure that some runs will have a working GPU and some will not.
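One way to confirm which runner variant actually loaded on a given start is to parse the llama.cpp `system info` line from server.log into a dict of CPU-feature flags and diff it between runs. A small sketch (the sample lines are abbreviated from the logs above):

```python
def parse_system_info(line: str) -> dict[str, int]:
    """Parse a llama.cpp 'system info' log line into {flag: 0/1}."""
    _, _, flags = line.partition("system info:")
    result = {}
    for part in flags.split("|"):
        if "=" in part:
            key, _, val = part.partition("=")
            result[key.strip()] = int(val.strip())
    return result

good = "[1708531484] system info: AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | FMA = 1 | BLAS = 1 |"
bad  = "[1708530663] system info: AVX = 0 | AVX_VNNI = 0 | AVX2 = 0 | FMA = 0 | BLAS = 0 |"
changed = {k for k, v in parse_system_info(good).items()
           if parse_system_info(bad)[k] != v}
print(sorted(changed))  # flags that differ between a working and a broken start
```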
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2641/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2641/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2868
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2868/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2868/comments
|
https://api.github.com/repos/ollama/ollama/issues/2868/events
|
https://github.com/ollama/ollama/pull/2868
| 2,164,274,800
|
PR_kwDOJ0Z1Ps5odNj6
| 2,868
|
Update llama.cpp submodule to `c29af7e`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-01T22:14:27
| 2024-03-01T23:26:05
| 2024-03-01T23:26:04
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2868",
"html_url": "https://github.com/ollama/ollama/pull/2868",
"diff_url": "https://github.com/ollama/ollama/pull/2868.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2868.patch",
"merged_at": "2024-03-01T23:26:04"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2868/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2868/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8229
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8229/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8229/comments
|
https://api.github.com/repos/ollama/ollama/issues/8229/events
|
https://github.com/ollama/ollama/issues/8229
| 2,757,475,666
|
I_kwDOJ0Z1Ps6kW71S
| 8,229
|
Have a Happy few days... at least... thank you! all, stay Great.
|
{
"login": "YonTracks",
"id": 93984913,
"node_id": "U_kgDOBZoYkQ",
"avatar_url": "https://avatars.githubusercontent.com/u/93984913?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YonTracks",
"html_url": "https://github.com/YonTracks",
"followers_url": "https://api.github.com/users/YonTracks/followers",
"following_url": "https://api.github.com/users/YonTracks/following{/other_user}",
"gists_url": "https://api.github.com/users/YonTracks/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YonTracks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YonTracks/subscriptions",
"organizations_url": "https://api.github.com/users/YonTracks/orgs",
"repos_url": "https://api.github.com/users/YonTracks/repos",
"events_url": "https://api.github.com/users/YonTracks/events{/privacy}",
"received_events_url": "https://api.github.com/users/YonTracks/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-24T09:20:45
| 2024-12-25T22:26:55
| 2024-12-25T22:26:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Have a happy few days, whatever you're doing. "Thank you all for your incredible support and kindness! Doing great things; the year has been epic but is not over yet." `ollama` love! You got this.
|
{
"login": "YonTracks",
"id": 93984913,
"node_id": "U_kgDOBZoYkQ",
"avatar_url": "https://avatars.githubusercontent.com/u/93984913?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YonTracks",
"html_url": "https://github.com/YonTracks",
"followers_url": "https://api.github.com/users/YonTracks/followers",
"following_url": "https://api.github.com/users/YonTracks/following{/other_user}",
"gists_url": "https://api.github.com/users/YonTracks/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YonTracks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YonTracks/subscriptions",
"organizations_url": "https://api.github.com/users/YonTracks/orgs",
"repos_url": "https://api.github.com/users/YonTracks/repos",
"events_url": "https://api.github.com/users/YonTracks/events{/privacy}",
"received_events_url": "https://api.github.com/users/YonTracks/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8229/reactions",
"total_count": 5,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 3,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8229/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3603
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3603/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3603/comments
|
https://api.github.com/repos/ollama/ollama/issues/3603/events
|
https://github.com/ollama/ollama/pull/3603
| 2,238,416,151
|
PR_kwDOJ0Z1Ps5sZdUy
| 3,603
|
updates
|
{
"login": "ozalperen",
"id": 36607745,
"node_id": "MDQ6VXNlcjM2NjA3NzQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/36607745?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ozalperen",
"html_url": "https://github.com/ozalperen",
"followers_url": "https://api.github.com/users/ozalperen/followers",
"following_url": "https://api.github.com/users/ozalperen/following{/other_user}",
"gists_url": "https://api.github.com/users/ozalperen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ozalperen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ozalperen/subscriptions",
"organizations_url": "https://api.github.com/users/ozalperen/orgs",
"repos_url": "https://api.github.com/users/ozalperen/repos",
"events_url": "https://api.github.com/users/ozalperen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ozalperen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-11T19:29:57
| 2024-04-11T19:33:36
| 2024-04-11T19:33:36
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3603",
"html_url": "https://github.com/ollama/ollama/pull/3603",
"diff_url": "https://github.com/ollama/ollama/pull/3603.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3603.patch",
"merged_at": null
}
| null |
{
"login": "ozalperen",
"id": 36607745,
"node_id": "MDQ6VXNlcjM2NjA3NzQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/36607745?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ozalperen",
"html_url": "https://github.com/ozalperen",
"followers_url": "https://api.github.com/users/ozalperen/followers",
"following_url": "https://api.github.com/users/ozalperen/following{/other_user}",
"gists_url": "https://api.github.com/users/ozalperen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ozalperen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ozalperen/subscriptions",
"organizations_url": "https://api.github.com/users/ozalperen/orgs",
"repos_url": "https://api.github.com/users/ozalperen/repos",
"events_url": "https://api.github.com/users/ozalperen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ozalperen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3603/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3603/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/774
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/774/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/774/comments
|
https://api.github.com/repos/ollama/ollama/issues/774/events
|
https://github.com/ollama/ollama/pull/774
| 1,940,976,282
|
PR_kwDOJ0Z1Ps5cr_x3
| 774
|
add version api and show server version in cli
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-10-12T23:54:56
| 2023-12-06T21:22:57
| 2023-12-06T21:22:56
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/774",
"html_url": "https://github.com/ollama/ollama/pull/774",
"diff_url": "https://github.com/ollama/ollama/pull/774.diff",
"patch_url": "https://github.com/ollama/ollama/pull/774.patch",
"merged_at": "2023-12-06T21:22:56"
}
|
Some minor refactoring of the cmd package.
Example: server and client are the same version
```
$ ollama --version
Your ollama version 0.0.0
```
Example: server and client have different versions
```
$ ollama --version
Your ollama version 99.99.99999
Warning: Your client version is 0.0.0
```
Example: server is not accessible
```
$ ollama --version
Warning: Your server is not accessible
Your client version is 0.0.0
```
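The three cases above can be sketched as follows (a simplified illustration of the branching, not the actual Go implementation in the cmd package):

```python
from typing import Optional

def version_message(client: str, server: Optional[str]) -> str:
    """Reproduce the three version-report cases shown in the examples above."""
    if server is None:
        # Server unreachable: warn, fall back to client version.
        return (f"Warning: Your server is not accessible\n"
                f"Your client version is {client}")
    if server == client:
        # Versions match: single line.
        return f"Your ollama version {client}"
    # Version mismatch: report server version, warn about the client.
    return (f"Your ollama version {server}\n"
            f"Warning: Your client version is {client}")
```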
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/774/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/774/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6348
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6348/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6348/comments
|
https://api.github.com/repos/ollama/ollama/issues/6348/events
|
https://github.com/ollama/ollama/issues/6348
| 2,464,469,762
|
I_kwDOJ0Z1Ps6S5NMC
| 6,348
|
Mistral 7B, running on CPU only - can't fix it
|
{
"login": "openSourcerer9000",
"id": 61931826,
"node_id": "MDQ6VXNlcjYxOTMxODI2",
"avatar_url": "https://avatars.githubusercontent.com/u/61931826?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/openSourcerer9000",
"html_url": "https://github.com/openSourcerer9000",
"followers_url": "https://api.github.com/users/openSourcerer9000/followers",
"following_url": "https://api.github.com/users/openSourcerer9000/following{/other_user}",
"gists_url": "https://api.github.com/users/openSourcerer9000/gists{/gist_id}",
"starred_url": "https://api.github.com/users/openSourcerer9000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/openSourcerer9000/subscriptions",
"organizations_url": "https://api.github.com/users/openSourcerer9000/orgs",
"repos_url": "https://api.github.com/users/openSourcerer9000/repos",
"events_url": "https://api.github.com/users/openSourcerer9000/events{/privacy}",
"received_events_url": "https://api.github.com/users/openSourcerer9000/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-13T23:32:18
| 2024-08-15T13:57:18
| 2024-08-13T23:33:40
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Running Mistral 7B Instruct, simple prompts take tens of minutes. Task Manager shows the CPU under heavy use while the GPU does nothing. I can run the model with quantization normally outside of Ollama. How can I force Ollama to use the GPU?
This is my code:
```
from langchain_ollama import ChatOllama
misty = ChatOllama(
model = "mistral",
temperature = 0.2,
num_predict = 5200,
)
```
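The log that follows shows only 12 of 33 layers offloaded because just 2.2 GiB of VRAM was free, so most of the compute runs on the CPU. One hedged workaround is to request more GPU layers explicitly via the `num_gpu` option; this only helps if the VRAM is actually available, and assumes your Ollama version honors the option. A sketch of the request body:

```python
import json

def chat_request(model: str, prompt: str, num_gpu: int) -> str:
    """Build an /api/chat body asking for num_gpu layers on the GPU.

    Assumption: the "options" field accepts num_gpu, the number of
    model layers to place on the GPU (33 matches the layer count in
    the log below)."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "options": {"num_gpu": num_gpu},
    })

body = chat_request("mistral", "hello", num_gpu=33)
```

With langchain_ollama's `ChatOllama`, the equivalent would be passing a `num_gpu` argument at construction; check the current docs for the exact parameter name.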
```time=2024-08-12T20:24:12.346-06:00 level=INFO source=memory.go:309 msg="offload to cuda" layers.requested=-1 layers.model=33 layers.offload=12 layers.split="" memory.available="[2.2 GiB]" memory.required.full="4.8 GiB" memory.required.partial="2.2 GiB" memory.required.kv="256.0 MiB" memory.required.allocations="[2.2 GiB]" memory.weights.total="3.9 GiB" memory.weights.repeating="3.8 GiB" memory.weights.nonrepeating="105.0 MiB" memory.graph.full="164.0 MiB" memory.graph.partial="185.0 MiB"
time=2024-08-12T20:24:12.353-06:00 level=INFO source=server.go:392 msg="starting llama server" cmd="C:\\Users\\user\\AppData\\Local\\Programs\\Ollama\\ollama_runners\\cuda_v11.3\\ollama_llama_server.exe --model C:\\Users\\user\\.ollama\\models\\blobs\\sha256-ff82381e2bea77d91c1b824c7afb83f6fb73e9f7de9dda631bcdbca564aa5435 --ctx-size 2048 --batch-size 512 --embedding --log-disable --n-gpu-layers 12 --no-mmap --parallel 1 --port 62988"
time=2024-08-12T20:24:12.620-06:00 level=INFO source=sched.go:445 msg="loaded runners" count=1
time=2024-08-12T20:24:12.620-06:00 level=INFO source=server.go:592 msg="waiting for llama runner to start responding"
time=2024-08-12T20:24:12.621-06:00 level=INFO source=server.go:626 msg="waiting for server to become available" status="llm server error"
INFO [wmain] build info | build=3535 commit="1e6f6554" tid="13432" timestamp=1723515852
INFO [wmain] system info | n_threads=6 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | " tid="13432" timestamp=1723515852 total_threads=12
INFO [wmain] HTTP server listening | hostname="127.0.0.1" n_threads_http="11" port="62988" tid="13432" timestamp=1723515852
llama_model_loader: loaded meta data with 25 key-value pairs and 291 tensors from C:\Users\user\.ollama\models\blobs\sha256-ff82381e2bea77d91c1b824c7afb83f6fb73e9f7de9dda631bcdbca564aa5435 (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = Mistral-7B-Instruct-v0.3
llama_model_loader: - kv 2: llama.block_count u32 = 32
llama_model_loader: - kv 3: llama.context_length u32 = 32768
llama_model_loader: - kv 4: llama.embedding_length u32 = 4096
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 6: llama.attention.head_count u32 = 32
llama_model_loader: - kv 7: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 8: llama.rope.freq_base f32 = 1000000.000000
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 10: general.file_type u32 = 2
llama_model_loader: - kv 11: llama.vocab_size u32 = 32768
llama_model_loader: - kv 12: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 13: tokenizer.ggml.model str = llama
llama_model_loader: - kv 14: tokenizer.ggml.pre str = default
llama_model_loader: - kv 15: tokenizer.ggml.tokens arr[str,32768] = ["<unk>", "<s>", "</s>", "[INST]", "[...
llama_model_loader: - kv 16: tokenizer.ggml.scores arr[f32,32768] = [0.000000, 0.000000, 0.000000, 0.0000...
llama_model_loader: - kv 17: tokenizer.ggml.token_type arr[i32,32768] = [2, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, ...
llama_model_loader: - kv 18: tokenizer.ggml.bos_token_id u32 = 1
llama_model_loader: - kv 19: tokenizer.ggml.eos_token_id u32 = 2
llama_model_loader: - kv 20: tokenizer.ggml.unknown_token_id u32 = 0
llama_model_loader: - kv 21: tokenizer.ggml.add_bos_token bool = true
llama_model_loader: - kv 22: tokenizer.ggml.add_eos_token bool = false
llama_model_loader: - kv 23: tokenizer.chat_template str = {{ bos_token }}{% for message in mess...
llama_model_loader: - kv 24: general.quantization_version u32 = 2
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type q4_0: 225 tensors
llama_model_loader: - type q6_K: 1 tensors
llm_load_vocab: special tokens cache size = 771
llm_load_vocab: token to piece cache size = 0.1731 MB
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = SPM
llm_load_print_meta: n_vocab = 32768
llm_load_print_meta: n_merges = 0
llm_load_print_meta: vocab_only = 0
llm_load_print_meta: n_ctx_train = 32768
llm_load_print_meta: n_embd = 4096
llm_load_print_meta: n_layer = 32
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 8
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_swa = 0
llm_load_print_meta: n_embd_head_k = 128
llm_load_print_meta: n_embd_head_v = 128
llm_load_print_meta: n_gqa = 4
llm_load_print_meta: n_embd_k_gqa = 1024
llm_load_print_meta: n_embd_v_gqa = 1024
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: causal attn = 1
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 0
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 1000000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_ctx_orig_yarn = 32768
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: model type = 7B
llm_load_print_meta: model ftype = Q4_0
llm_load_print_meta: model params = 7.25 B
llm_load_print_meta: model size = 3.83 GiB (4.54 BPW)
llm_load_print_meta: general.name = Mistral-7B-Instruct-v0.3
llm_load_print_meta: BOS token = 1 '<s>'
llm_load_print_meta: EOS token = 2 '</s>'
llm_load_print_meta: UNK token = 0 '<unk>'
llm_load_print_meta: LF token = 781 '<0x0A>'
llm_load_print_meta: max token length = 48
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 2080, compute capability 7.5, VMM: yes
time=2024-08-12T20:24:12.884-06:00 level=INFO source=server.go:626 msg="waiting for server to become available" status="llm server loading model"
llm_load_tensors: ggml ctx size = 0.27 MiB
llm_load_tensors: offloading 12 repeating layers to GPU
llm_load_tensors: offloaded 12/33 layers to GPU
llm_load_tensors: CUDA_Host buffer size = 2517.64 MiB
llm_load_tensors: CUDA0 buffer size = 1404.38 MiB
llama_new_context_with_model: n_ctx = 2048
llama_new_context_with_model: n_batch = 512
llama_new_context_with_model: n_ubatch = 512
llama_new_context_with_model: flash_attn = 0
llama_new_context_with_model: freq_base = 1000000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CUDA_Host KV buffer size = 160.00 MiB
llama_kv_cache_init: CUDA0 KV buffer size = 96.00 MiB
llama_new_context_with_model: KV self size = 256.00 MiB, K (f16): 128.00 MiB, V (f16): 128.00 MiB
llama_new_context_with_model: CUDA_Host output buffer size = 0.14 MiB
llama_new_context_with_model: CUDA0 compute buffer size = 185.00 MiB
llama_new_context_with_model: CUDA_Host compute buffer size = 12.01 MiB
llama_new_context_with_model: graph nodes = 1030
llama_new_context_with_model: graph splits = 224
INFO [wmain] model loaded | tid="13432" timestamp=1723515859
time=2024-08-12T20:24:19.236-06:00 level=INFO source=server.go:631 msg="llama runner started in 6.62 seconds"
[GIN] 2024/08/12 - 20:24:57 | 200 | 44.8684308s | 127.0.0.1 | POST "/api/chat"
[GIN] 2024/08/12 - 20:29:27 | 200 | 38.6899925s | 127.0.0.1 | POST "/api/chat"
```
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6348/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6348/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7915
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7915/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7915/comments
|
https://api.github.com/repos/ollama/ollama/issues/7915/events
|
https://github.com/ollama/ollama/pull/7915
| 2,714,241,746
|
PR_kwDOJ0Z1Ps6D2d-a
| 7,915
|
llama/llama.go: Remove some duplicates
|
{
"login": "MeiK2333",
"id": 20951666,
"node_id": "MDQ6VXNlcjIwOTUxNjY2",
"avatar_url": "https://avatars.githubusercontent.com/u/20951666?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MeiK2333",
"html_url": "https://github.com/MeiK2333",
"followers_url": "https://api.github.com/users/MeiK2333/followers",
"following_url": "https://api.github.com/users/MeiK2333/following{/other_user}",
"gists_url": "https://api.github.com/users/MeiK2333/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MeiK2333/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MeiK2333/subscriptions",
"organizations_url": "https://api.github.com/users/MeiK2333/orgs",
"repos_url": "https://api.github.com/users/MeiK2333/repos",
"events_url": "https://api.github.com/users/MeiK2333/events{/privacy}",
"received_events_url": "https://api.github.com/users/MeiK2333/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-12-03T07:44:18
| 2024-12-10T21:47:05
| 2024-12-10T21:47:04
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7915",
"html_url": "https://github.com/ollama/ollama/pull/7915",
"diff_url": "https://github.com/ollama/ollama/pull/7915.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7915.patch",
"merged_at": null
}
|
Remove some duplicates
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7915/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7915/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7526
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7526/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7526/comments
|
https://api.github.com/repos/ollama/ollama/issues/7526/events
|
https://github.com/ollama/ollama/issues/7526
| 2,638,310,037
|
I_kwDOJ0Z1Ps6dQWqV
| 7,526
|
500 Error in Ollama API After LLM Computation Exceeds 2 Minutes
|
{
"login": "ALLMI78",
"id": 184701677,
"node_id": "U_kgDOCwJS7Q",
"avatar_url": "https://avatars.githubusercontent.com/u/184701677?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ALLMI78",
"html_url": "https://github.com/ALLMI78",
"followers_url": "https://api.github.com/users/ALLMI78/followers",
"following_url": "https://api.github.com/users/ALLMI78/following{/other_user}",
"gists_url": "https://api.github.com/users/ALLMI78/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ALLMI78/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ALLMI78/subscriptions",
"organizations_url": "https://api.github.com/users/ALLMI78/orgs",
"repos_url": "https://api.github.com/users/ALLMI78/repos",
"events_url": "https://api.github.com/users/ALLMI78/events{/privacy}",
"received_events_url": "https://api.github.com/users/ALLMI78/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 40
| 2024-11-06T14:48:14
| 2024-11-10T23:41:13
| 2024-11-07T18:55:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Sorry, I'm new to GitHub, but I ran into this problem and found no solution.
When processing requests with any model in Ollama, a 500 Internal Server Error consistently occurs whenever the LLM computation exceeds exactly 2 minutes. This happens regardless of model size or GPU/CPU usage, which suggests a hard runtime limit. Notably, if the model completes processing in under 2 minutes, the response returns without error.
Observed behavior:
- The API returns a 500 error precisely at the 2-minute mark, interrupting the LLM's processing.
- Debug logs show no specific timeout warnings or errors related to resource limits.
- No documented configuration settings appear available to adjust this runtime limit.

Expected behavior:
- Ability to configure or bypass the 2-minute processing timeout to allow longer LLM computations, or receive more detailed error feedback regarding timeout settings.

Debug attempts:
- Verified high debug level (OLLAMA_DEBUG=true).
- Tested with models of various sizes (confirming VRAM usage of 70% or less).
- Checked for relevant timeout settings in logs and source files without success.
- Searched Ollama's documentation and codebase for configurable runtime limits but found none.

Environment:
- System: Windows 10 64-bit / 4060 Ti 16 GB / 32 GB RAM
- Ollama version: 0.3.14
```
DEBUG [process_single_task] slot data | n_idle_slots=1 n_processing_slots=0 task_id=4 tid="8572" timestamp=1730876370
DEBUG [launch_slot_with_data] slot is processing task | slot_id=0 task_id=5 tid="8572" timestamp=1730876370
DEBUG [update_slots] slot progression | ga_i=0 n_past=0 n_past_se=0 n_prompt_tokens_processed=8431 slot_id=0 task_id=5 tid="8572" timestamp=1730876370
DEBUG [update_slots] kv cache rm [p0, end) | p0=0 slot_id=0 task_id=5 tid="8572" timestamp=1730876370
time=2024-11-06T08:01:27.258+01:00 level=DEBUG source=sched.go:466 msg="context for request finished"
time=2024-11-06T08:01:27.259+01:00 level=DEBUG source=sched.go:339 msg="runner with non-zero duration has gone idle, adding timer" modelPath=M:\OLLAMA\models\blobs\sha256-cc04e85e1f866a5ba87dd66b5260f0cb32354e2c66505e86a7ac3c0092272b7d duration=5s
time=2024-11-06T08:01:27.259+01:00 level=DEBUG source=sched.go:357 msg="after processing request finished event" modelPath=M:\OLLAMA\models\blobs\sha256-cc04e85e1f866a5ba87dd66b5260f0cb32354e2c66505e86a7ac3c0092272b7d refCount=0
[GIN] 2024/11/06 - 08:01:27 | 500 | 2m0s | 127.0.0.1 | POST "/api/chat"
time=2024-11-06T08:01:27.285+01:00 level=DEBUG source=sched.go:575 msg="evaluating already loaded" model=M:\OLLAMA\models\blobs\sha256-cc04e85e1f866a5ba87dd66b5260f0cb32354e2c66505e86a7ac3c0092272b7d
DEBUG [process_single_task] slot data | n_idle_slots=0 n_processing_slots=1 task_id=2921 tid="8572" timestamp=1730876487
DEBUG [log_server_request] request | method="POST" params={} path="/completion" remote_addr="127.0.0.1" remote_port=56096 status=200 tid="1200" timestamp=1730876487
DEBUG [update_slots] slot released | n_cache_tokens=11346 n_ctx=32768 n_past=11345 n_system_tokens=0 slot_id=0 task_id=5 tid="8572" timestamp=1730876487 truncated=false
DEBUG [process_single_task] slot data | n_idle_slots=1 n_processing_slots=0 task_id=2924 tid="8572" timestamp=1730876487
DEBUG [log_server_request] request | method="POST" params={} path="/tokenize" remote_addr="127.0.0.1" remote_port=56099 status=200 tid="9824" timestamp=1730876487
DEBUG [process_single_task] slot data | n_idle_slots=1 n_processing_slots=0 task_id=2925 tid="8572" timestamp=1730876487
DEBUG [log_server_request] request | method="POST" params={} path="/tokenize" remote_addr="127.0.0.1" remote_port=56099 status=200 tid="9824" timestamp=1730876487
DEBUG [process_single_task] slot data | n_idle_slots=1 n_processing_slots=0 task_id=2926 tid="8572" timestamp=1730876487
DEBUG [log_server_request] request | method="POST" params={} path="/tokenize" remote_addr="127.0.0.1" remote_port=56100 status=200 tid="8676" timestamp=1730876487
```
The first time the error occurs, the Ollama API returns nothing and only logs the 500 error. After that, the model keeps running, but if it doesn't finish within another 2 minutes, the 500 Internal Server Error is triggered again and the request is aborted.
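Until the timeout is configurable, one possible client-side workaround (an assumption, not a documented fix) is to request a streamed response, so data keeps flowing over the connection instead of waiting minutes for a single reply. A minimal sketch, assuming the documented NDJSON chunk shape of `/api/chat` (the helper names here are illustrative):

```python
import json


def build_chat_payload(model, prompt):
    # "stream": True asks /api/chat to return NDJSON chunks as they are generated.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,
    }


def collect_stream(lines):
    # Accumulate the "content" field from each NDJSON chunk until "done" is true.
    parts = []
    for raw in lines:
        chunk = json.loads(raw)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)
```

Feeding `requests.post(url, json=payload, stream=True).iter_lines()` into `collect_stream` would then rebuild the full answer client-side, regardless of how long generation takes.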
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14
|
{
"login": "ALLMI78",
"id": 184701677,
"node_id": "U_kgDOCwJS7Q",
"avatar_url": "https://avatars.githubusercontent.com/u/184701677?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ALLMI78",
"html_url": "https://github.com/ALLMI78",
"followers_url": "https://api.github.com/users/ALLMI78/followers",
"following_url": "https://api.github.com/users/ALLMI78/following{/other_user}",
"gists_url": "https://api.github.com/users/ALLMI78/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ALLMI78/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ALLMI78/subscriptions",
"organizations_url": "https://api.github.com/users/ALLMI78/orgs",
"repos_url": "https://api.github.com/users/ALLMI78/repos",
"events_url": "https://api.github.com/users/ALLMI78/events{/privacy}",
"received_events_url": "https://api.github.com/users/ALLMI78/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7526/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7526/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3389
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3389/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3389/comments
|
https://api.github.com/repos/ollama/ollama/issues/3389/events
|
https://github.com/ollama/ollama/issues/3389
| 2,213,572,510
|
I_kwDOJ0Z1Ps6D8G-e
| 3,389
|
Custom models in API
|
{
"login": "CHesketh76",
"id": 38713764,
"node_id": "MDQ6VXNlcjM4NzEzNzY0",
"avatar_url": "https://avatars.githubusercontent.com/u/38713764?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/CHesketh76",
"html_url": "https://github.com/CHesketh76",
"followers_url": "https://api.github.com/users/CHesketh76/followers",
"following_url": "https://api.github.com/users/CHesketh76/following{/other_user}",
"gists_url": "https://api.github.com/users/CHesketh76/gists{/gist_id}",
"starred_url": "https://api.github.com/users/CHesketh76/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CHesketh76/subscriptions",
"organizations_url": "https://api.github.com/users/CHesketh76/orgs",
"repos_url": "https://api.github.com/users/CHesketh76/repos",
"events_url": "https://api.github.com/users/CHesketh76/events{/privacy}",
"received_events_url": "https://api.github.com/users/CHesketh76/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-28T15:53:34
| 2024-05-15T00:47:03
| 2024-05-15T00:47:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What model would you like?
After creating a Modelfile for a custom model, please add the ability to query the model with curl.
```
curl http://localhost:11434/api/chat -d '{
"model": "customModel.gguf",
"messages": [
{ "role": "user", "content": "why is the sky blue?" }
]
}'
```
If this is already supported, could we have some documentation explaining what the "model" field needs to be (e.g. '/path/to/the/model/file/' or 'customModelName')?
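For reference, the flow that works today is to register the model once with `ollama create` and then reference it by the created name, not by a file path. A sketch, assuming a GGUF file in the current directory (the name `custommodel` is illustrative):

```shell
# Register a local GGUF file under a model name via a Modelfile.
cat > Modelfile <<'EOF'
FROM ./customModel.gguf
EOF
ollama create custommodel -f Modelfile

# The API then takes the created name in the "model" field:
curl http://localhost:11434/api/chat -d '{
  "model": "custommodel",
  "messages": [{ "role": "user", "content": "why is the sky blue?" }]
}'
```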
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3389/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3389/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6561
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6561/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6561/comments
|
https://api.github.com/repos/ollama/ollama/issues/6561/events
|
https://github.com/ollama/ollama/issues/6561
| 2,495,532,046
|
I_kwDOJ0Z1Ps6UvswO
| 6,561
|
Inconsistent API Behavior
|
{
"login": "negaralizadeh",
"id": 28723076,
"node_id": "MDQ6VXNlcjI4NzIzMDc2",
"avatar_url": "https://avatars.githubusercontent.com/u/28723076?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/negaralizadeh",
"html_url": "https://github.com/negaralizadeh",
"followers_url": "https://api.github.com/users/negaralizadeh/followers",
"following_url": "https://api.github.com/users/negaralizadeh/following{/other_user}",
"gists_url": "https://api.github.com/users/negaralizadeh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/negaralizadeh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/negaralizadeh/subscriptions",
"organizations_url": "https://api.github.com/users/negaralizadeh/orgs",
"repos_url": "https://api.github.com/users/negaralizadeh/repos",
"events_url": "https://api.github.com/users/negaralizadeh/events{/privacy}",
"received_events_url": "https://api.github.com/users/negaralizadeh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 2
| 2024-08-29T20:36:57
| 2024-08-30T08:39:33
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm calling the generate API as follows:
```
import requests

url = 'http://localhost:11434/api/generate'
data = {
"model": model_name,
"stream": False,
"options": {
"temperature": 0.2,
"top_p": 0.8,
"seed": 42,
"num_predict": 300,
},
"system": set_role()
}
response = requests.post(url, json=data).json()
```
Although I set the stream flag to false, I sometimes don't receive the whole response in one message (`done` is `false` in the first message received).
The other problem is that sometimes, even when a message is final (`done` is `true`), it doesn't contain all the expected additional information; for example, `prompt_eval_count` is missing.
This problem also persists with the Python library. I’ve carefully checked the documentation, and I believe it might be some sort of bug.
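Until this is resolved, a defensive client could treat every reply as potentially chunked. A sketch under that assumption (the helper name is hypothetical, and the NDJSON chunk shape follows the `/api/generate` docs): concatenate the `response` fields and keep whatever metadata the final chunk carries.

```python
import json


def merge_generate_chunks(raw_lines):
    # Defensively merge one-or-many NDJSON replies from /api/generate into a
    # single dict: concatenated "response" text plus the last chunk's metadata.
    merged, text = {}, []
    for raw in raw_lines:
        chunk = json.loads(raw)
        text.append(chunk.get("response", ""))
        merged.update({k: v for k, v in chunk.items() if k != "response"})
    merged["response"] = "".join(text)
    return merged
```

This yields one consolidated object whether the server honored `"stream": false` or not, though it cannot recover metadata (like `prompt_eval_count`) that the server never sent.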
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.5
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6561/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6561/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1775
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1775/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1775/comments
|
https://api.github.com/repos/ollama/ollama/issues/1775/events
|
https://github.com/ollama/ollama/pull/1775
| 2,064,633,517
|
PR_kwDOJ0Z1Ps5jK5tA
| 1,775
|
docs: add Msty app in readme
|
{
"login": "nikeshparajuli",
"id": 47485043,
"node_id": "MDQ6VXNlcjQ3NDg1MDQz",
"avatar_url": "https://avatars.githubusercontent.com/u/47485043?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nikeshparajuli",
"html_url": "https://github.com/nikeshparajuli",
"followers_url": "https://api.github.com/users/nikeshparajuli/followers",
"following_url": "https://api.github.com/users/nikeshparajuli/following{/other_user}",
"gists_url": "https://api.github.com/users/nikeshparajuli/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nikeshparajuli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nikeshparajuli/subscriptions",
"organizations_url": "https://api.github.com/users/nikeshparajuli/orgs",
"repos_url": "https://api.github.com/users/nikeshparajuli/repos",
"events_url": "https://api.github.com/users/nikeshparajuli/events{/privacy}",
"received_events_url": "https://api.github.com/users/nikeshparajuli/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-01-03T20:53:26
| 2024-02-20T19:03:34
| 2024-02-20T19:03:33
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1775",
"html_url": "https://github.com/ollama/ollama/pull/1775",
"diff_url": "https://github.com/ollama/ollama/pull/1775.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1775.patch",
"merged_at": "2024-02-20T19:03:33"
}
| https://github.com/jmorganca/ollama/assets/47485043/d402e724-5aa4-4d60-92b0-fecc30143c9f
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1775/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1775/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3107
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3107/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3107/comments
|
https://api.github.com/repos/ollama/ollama/issues/3107/events
|
https://github.com/ollama/ollama/issues/3107
| 2,184,210,063
|
I_kwDOJ0Z1Ps6CMGaP
| 3,107
|
Windows ROCm: HSA_OVERRIDE_GFX_VERSION doesn't work
|
{
"login": "Detlev1",
"id": 71934197,
"node_id": "MDQ6VXNlcjcxOTM0MTk3",
"avatar_url": "https://avatars.githubusercontent.com/u/71934197?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Detlev1",
"html_url": "https://github.com/Detlev1",
"followers_url": "https://api.github.com/users/Detlev1/followers",
"following_url": "https://api.github.com/users/Detlev1/following{/other_user}",
"gists_url": "https://api.github.com/users/Detlev1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Detlev1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Detlev1/subscriptions",
"organizations_url": "https://api.github.com/users/Detlev1/orgs",
"repos_url": "https://api.github.com/users/Detlev1/repos",
"events_url": "https://api.github.com/users/Detlev1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Detlev1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 30
| 2024-03-13T14:51:31
| 2024-09-03T19:52:12
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm eager to explore the new Windows ROCm compatibility feature, but I'm encountering an issue with forcing the GFX version. Currently, I'm using the 0.1.29 pre-release.
My setup includes an RX 6600 XT (GFX1032), which isn't fully supported in the ROCm library. According to the troubleshooting guide available at https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md, it's recommended to override the version to a similar one (in my case HSA_OVERRIDE_GFX_VERSION="10.3.0"). Despite setting the environment variable, the logs continue to display the following error:
```
rocBLAS error: Cannot read C:\Users\user\AppData\Local\Programs\Ollama\rocm\/rocblas/library/TensileLibrary.dat: No such file or directory for GPU arch : gfx1032.
```
However, given the override, shouldn't it be looking for gfx1030 instead?
Ollama crashes as a result.
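For context, the setup being attempted per the troubleshooting guide can be sketched as follows (PowerShell; whether the Windows build honors the variable is exactly what this issue is about):

```shell
# Set the override for the process that launches the Ollama server, then
# restart the server so it picks the value up. "10.3.0" is meant to map
# gfx1032 onto the supported gfx1030 kernels.
$env:HSA_OVERRIDE_GFX_VERSION = "10.3.0"
ollama serve
```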
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3107/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3107/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3016
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3016/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3016/comments
|
https://api.github.com/repos/ollama/ollama/issues/3016/events
|
https://github.com/ollama/ollama/issues/3016
| 2,177,001,843
|
I_kwDOJ0Z1Ps6Bwmlz
| 3,016
|
Request to HTTP API from Chrome extension not allowed on Windows
|
{
"login": "DominguitoLamo",
"id": 32754897,
"node_id": "MDQ6VXNlcjMyNzU0ODk3",
"avatar_url": "https://avatars.githubusercontent.com/u/32754897?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DominguitoLamo",
"html_url": "https://github.com/DominguitoLamo",
"followers_url": "https://api.github.com/users/DominguitoLamo/followers",
"following_url": "https://api.github.com/users/DominguitoLamo/following{/other_user}",
"gists_url": "https://api.github.com/users/DominguitoLamo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DominguitoLamo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DominguitoLamo/subscriptions",
"organizations_url": "https://api.github.com/users/DominguitoLamo/orgs",
"repos_url": "https://api.github.com/users/DominguitoLamo/repos",
"events_url": "https://api.github.com/users/DominguitoLamo/events{/privacy}",
"received_events_url": "https://api.github.com/users/DominguitoLamo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-09T01:53:49
| 2024-03-11T22:15:38
| 2024-03-11T22:15:38
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I get a 403 while requesting the Ollama API on localhost from the background.js of a Chrome extension. The problem seems to be caused by a CORS restriction. However, the Ollama source code indicates that the default config does allow browser-extension requests to Ollama on localhost. Is this a problem with the Windows version?
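If it is a CORS issue, one documented workaround is to widen the allowed origins via the `OLLAMA_ORIGINS` environment variable before starting the server (the wildcard below is just an illustration; a real extension origin looks like `chrome-extension://<extension-id>`):

```shell
# Allow Chrome extension origins (wildcard shown for illustration only;
# a concrete extension origin is narrower and safer).
export OLLAMA_ORIGINS="chrome-extension://*"
# restart the server so it picks up the variable:
# ollama serve
```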
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3016/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3016/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1183
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1183/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1183/comments
|
https://api.github.com/repos/ollama/ollama/issues/1183/events
|
https://github.com/ollama/ollama/pull/1183
| 2,000,005,215
|
PR_kwDOJ0Z1Ps5fzDh1
| 1,183
|
Add Rivet to Community Integrations
|
{
"login": "abrenneke",
"id": 342540,
"node_id": "MDQ6VXNlcjM0MjU0MA==",
"avatar_url": "https://avatars.githubusercontent.com/u/342540?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/abrenneke",
"html_url": "https://github.com/abrenneke",
"followers_url": "https://api.github.com/users/abrenneke/followers",
"following_url": "https://api.github.com/users/abrenneke/following{/other_user}",
"gists_url": "https://api.github.com/users/abrenneke/gists{/gist_id}",
"starred_url": "https://api.github.com/users/abrenneke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/abrenneke/subscriptions",
"organizations_url": "https://api.github.com/users/abrenneke/orgs",
"repos_url": "https://api.github.com/users/abrenneke/repos",
"events_url": "https://api.github.com/users/abrenneke/events{/privacy}",
"received_events_url": "https://api.github.com/users/abrenneke/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-17T22:09:04
| 2023-11-20T15:36:47
| 2023-11-20T15:36:47
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1183",
"html_url": "https://github.com/ollama/ollama/pull/1183",
"diff_url": "https://github.com/ollama/ollama/pull/1183.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1183.patch",
"merged_at": "2023-11-20T15:36:47"
}
|
Adds the [Ollama plugin](https://github.com/abrenneke/rivet-plugin-ollama) for [Rivet](https://rivet.ironcladapp.com/) to the community integrations list (Extensions & Plugins section)
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1183/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1183/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4525
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4525/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4525/comments
|
https://api.github.com/repos/ollama/ollama/issues/4525/events
|
https://github.com/ollama/ollama/pull/4525
| 2,304,764,022
|
PR_kwDOJ0Z1Ps5v5Qez
| 4,525
|
Exposing grammar as a request parameter in completion/chat with go-side grammar validation
|
{
"login": "richardanaya",
"id": 294042,
"node_id": "MDQ6VXNlcjI5NDA0Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/294042?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/richardanaya",
"html_url": "https://github.com/richardanaya",
"followers_url": "https://api.github.com/users/richardanaya/followers",
"following_url": "https://api.github.com/users/richardanaya/following{/other_user}",
"gists_url": "https://api.github.com/users/richardanaya/gists{/gist_id}",
"starred_url": "https://api.github.com/users/richardanaya/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/richardanaya/subscriptions",
"organizations_url": "https://api.github.com/users/richardanaya/orgs",
"repos_url": "https://api.github.com/users/richardanaya/repos",
"events_url": "https://api.github.com/users/richardanaya/events{/privacy}",
"received_events_url": "https://api.github.com/users/richardanaya/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 24
| 2024-05-19T20:11:16
| 2024-12-05T00:40:26
| 2024-12-05T00:40:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4525",
"html_url": "https://github.com/ollama/ollama/pull/4525",
"diff_url": "https://github.com/ollama/ollama/pull/4525.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4525.patch",
"merged_at": null
}
|
**Why is passing down grammars needed?**
Relying on the prompt's context to dictate structure can be unreliable (it depends on the model and on generation randomness) and takes up context space. A grammar is a well-proven way to constrain generated output; in fact `format="JSON"` already depends on one, but `format="JSON"` offers no reliable way to specify large, complex structures and can even be defeated by prompt attacks.

**Why grammar and not JSON schema?**
While JSON schema would make a nice future addition, there's interest in data structures outside of JSON (simple enum values, programming languages, etc.). Also, JSON-schema generators fundamentally rely on grammars, so grammar generated from a JSON schema will also benefit from grammar checking.
**Why not just pass along the grammar to llama.cpp?**
I looked into the complexities of passing the grammar along to the llama.cpp server. There are a few challenges:
* llama.cpp server [doesn't return errors when bad grammar is passed to it](https://github.com/ggerganov/llama.cpp/issues/7391) with streaming mode on. It gives an incomprehensible "unexpected EOF"

* the in-memory model will be reused if the grammar is valid OR changed, BUT the in-memory model appears to get reloaded if you give it a bad grammar and then follow up with a good grammar.

* it appears to work perfectly, reusing in-memory models, when passing along a completely valid grammar (even a variety of valid grammars)
My conclusion, given the advice of the community, is that we do indeed have to do our own GBNF grammar validation on the Go server side to do our best at preventing bad grammar from being passed down.
----
In this PR I've created:
* the functionality to pass along `grammar` in chat and completion mode
* documentation in readme related to new property
* prevention of using the `grammar` and `json` parameters at the same time.
* validation code for grammars
* an extensive set of 30+ tests for grammars, ranging over character classes, strings, internationalization, comments, etc.
* tests of every known grammar on llama.cpp and also individual unit tests
* no use of regexes, to keep the parsing clear and understandable
Edge cases:
* I've probably not implemented everything that's possible in character classes, but I support a limited subset compatible with the grammars listed on llama.cpp. My assumption is that most people's grammars will be less complex than these.
* there may be some valid grammars I don't currently support (though to the best of my knowledge we support all the major publicly available ones, including ones as complex as the C programming language). I chose not to use a full Go parser library because I wanted the cognitive load of this code to be approachable initially (rather than requiring every reader to learn a new library); if in the future we want to replace it with a more formal technology, we can, and the tests can be reused.
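As a sketch, a request using the parameter proposed in this PR might look like the following (the `grammar` field only exists with this PR applied; the model name and GBNF rule are placeholders):

```shell
# Build a request body that uses the proposed "grammar" parameter
# (only available with this PR applied; model and grammar are placeholders).
cat > /tmp/grammar_request.json <<'EOF'
{
  "model": "llama3",
  "prompt": "Answer yes or no: is the sky blue?",
  "grammar": "root ::= \"yes\" | \"no\""
}
EOF
# with a patched server running, this would be sent as:
# curl http://localhost:11434/api/generate -d @/tmp/grammar_request.json
```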
Examples of success:
<img width="915" alt="Screenshot 2024-05-19 at 1 29 10 PM" src="https://github.com/ollama/ollama/assets/294042/ebc45377-2c27-4874-a2b2-00185736d1f9">
<img width="908" alt="Screenshot 2024-05-19 at 1 44 22 PM" src="https://github.com/ollama/ollama/assets/294042/535a5211-3060-421e-a8ef-b262ec1d969f">
Example of failure:
<img width="967" alt="Screenshot 2024-05-19 at 1 28 22 PM" src="https://github.com/ollama/ollama/assets/294042/0ea76e6a-a34d-477a-871c-71ec7027a4f1">
I believe this PR satisfies https://github.com/ollama/ollama/issues/4074 with an acceptable amount of protection against sending invalid GBNF grammars, along with useful error messages.
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4525/reactions",
"total_count": 16,
"+1": 11,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 5,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4525/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3461
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3461/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3461/comments
|
https://api.github.com/repos/ollama/ollama/issues/3461/events
|
https://github.com/ollama/ollama/pull/3461
| 2,221,048,410
|
PR_kwDOJ0Z1Ps5rd1pA
| 3,461
|
feat: add OLLAMA_DEBUG in ollama serve help message
|
{
"login": "cesto93",
"id": 56308444,
"node_id": "MDQ6VXNlcjU2MzA4NDQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/56308444?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cesto93",
"html_url": "https://github.com/cesto93",
"followers_url": "https://api.github.com/users/cesto93/followers",
"following_url": "https://api.github.com/users/cesto93/following{/other_user}",
"gists_url": "https://api.github.com/users/cesto93/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cesto93/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cesto93/subscriptions",
"organizations_url": "https://api.github.com/users/cesto93/orgs",
"repos_url": "https://api.github.com/users/cesto93/repos",
"events_url": "https://api.github.com/users/cesto93/events{/privacy}",
"received_events_url": "https://api.github.com/users/cesto93/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-02T17:22:52
| 2024-04-03T01:20:03
| 2024-04-03T01:20:03
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3461",
"html_url": "https://github.com/ollama/ollama/pull/3461",
"diff_url": "https://github.com/ollama/ollama/pull/3461.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3461.patch",
"merged_at": "2024-04-03T01:20:03"
}
|
Hello,
I added the OLLAMA_DEBUG flag to the `ollama serve` help message.
This resolves https://github.com/ollama/ollama/issues/3401
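For anyone landing here, the flag this PR surfaces in the help text is just an environment variable; a minimal usage sketch:

```shell
# Enable verbose server logging (the variable this PR adds to the help text).
export OLLAMA_DEBUG=1
# restart the server to get the extra output:
# ollama serve
```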
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3461/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3461/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8224
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8224/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8224/comments
|
https://api.github.com/repos/ollama/ollama/issues/8224/events
|
https://github.com/ollama/ollama/pull/8224
| 2,757,079,139
|
PR_kwDOJ0Z1Ps6GIgGH
| 8,224
|
Add back fallback CPU runner
|
{
"login": "hack3ric",
"id": 18899791,
"node_id": "MDQ6VXNlcjE4ODk5Nzkx",
"avatar_url": "https://avatars.githubusercontent.com/u/18899791?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hack3ric",
"html_url": "https://github.com/hack3ric",
"followers_url": "https://api.github.com/users/hack3ric/followers",
"following_url": "https://api.github.com/users/hack3ric/following{/other_user}",
"gists_url": "https://api.github.com/users/hack3ric/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hack3ric/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hack3ric/subscriptions",
"organizations_url": "https://api.github.com/users/hack3ric/orgs",
"repos_url": "https://api.github.com/users/hack3ric/repos",
"events_url": "https://api.github.com/users/hack3ric/events{/privacy}",
"received_events_url": "https://api.github.com/users/hack3ric/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2024-12-24T02:56:41
| 2024-12-27T06:02:30
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8224",
"html_url": "https://github.com/ollama/ollama/pull/8224",
"diff_url": "https://github.com/ollama/ollama/pull/8224.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8224.patch",
"merged_at": null
}
|
Previous versions of Ollama could build CPU runners without any optimizations, but now (0.5.x) only AVX and AVX2 variants can be built. This adds back a `cpu` runner without any `TARGET_CPU_FLAGS` to enable this again.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8224/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8224/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/967
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/967/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/967/comments
|
https://api.github.com/repos/ollama/ollama/issues/967/events
|
https://github.com/ollama/ollama/issues/967
| 1,973,442,047
|
I_kwDOJ0Z1Ps51oFX_
| 967
|
How to view the Ollama server log? (to solve the "Invalid request" error)
|
{
"login": "Xingeqwd",
"id": 46107662,
"node_id": "MDQ6VXNlcjQ2MTA3NjYy",
"avatar_url": "https://avatars.githubusercontent.com/u/46107662?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Xingeqwd",
"html_url": "https://github.com/Xingeqwd",
"followers_url": "https://api.github.com/users/Xingeqwd/followers",
"following_url": "https://api.github.com/users/Xingeqwd/following{/other_user}",
"gists_url": "https://api.github.com/users/Xingeqwd/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Xingeqwd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Xingeqwd/subscriptions",
"organizations_url": "https://api.github.com/users/Xingeqwd/orgs",
"repos_url": "https://api.github.com/users/Xingeqwd/repos",
"events_url": "https://api.github.com/users/Xingeqwd/events{/privacy}",
"received_events_url": "https://api.github.com/users/Xingeqwd/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 12
| 2023-11-02T03:33:45
| 2025-01-20T12:47:45
| 2023-11-08T19:55:46
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
I have an Ollama API server and a continuedev server on the same Linux machine.
When the continuedev server sends a request to the Ollama API, Ollama returns "Invalid request to Ollama".
I'm not sure what request was sent to Ollama that caused the "Invalid request to Ollama" error.
I also believe the "/api/completions" requests are correct, as this [document](https://github.com/jmorganca/ollama/blob/main/docs/api.md#generate-a-completion) describes.
I want to check the Ollama API log to find out, but I don't know where the log is, so I came here to ask.
I only have the continue-dev server stdout:
```
File "/root/anaconda3/envs/continue-dev/lib/python3.10/site-packages/continuedev/libs/llm/ollama.py", line 108, in _stream_complete
raise ContinueCustomException(
continuedev.core.main.ContinueCustomException: ('Ollama returned an error: ', 'Invalid request to Ollama')
[2023-11-02 11:06:30] [WARNING] Meilisearch did not start in less than 20 seconds. Stopping polling.
[2023-11-02 11:06:43] [WARNING] Meilisearch did not start in less than 20 seconds. Stopping polling.
[2023-11-02 11:06:43] [ERROR] Error while running step:
Ollama returned an error:
Invalid request to Ollama
[2023-11-02 11:16:50] [ERROR] Error while running step:
Ollama returned an error:
Invalid request to Ollama
```
I thought there might be a "log file" or something similar.
I have looked through the Ollama parameters and found nothing about logging (except "verbose").
I have also glanced at `systemctl status` and `journalctl`; these only record the successes, and the failures are not there.
Thanks to everyone; I hope you all have a good day!
_____
BTW, my issue when deploying the continue server is [issues#570](https://github.com/continuedev/continue/issues/570); my newest reply there contains the logs etc., so I haven't added them here.
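For anyone with the same question, the usual log locations (per the troubleshooting doc; the macOS path below is the default and may differ on other installs) can be checked like this:

```shell
# Typical Ollama server log locations (from the troubleshooting doc):
#   Linux (systemd):  journalctl -u ollama
#   macOS app:        ~/.ollama/logs/server.log
DEFAULT_MAC_LOG="$HOME/.ollama/logs/server.log"
ls "$DEFAULT_MAC_LOG" 2>/dev/null || echo "no log file at the default macOS path: $DEFAULT_MAC_LOG"
```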
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/967/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/967/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1625
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1625/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1625/comments
|
https://api.github.com/repos/ollama/ollama/issues/1625/events
|
https://github.com/ollama/ollama/issues/1625
| 2,050,233,842
|
I_kwDOJ0Z1Ps56NBXy
| 1,625
|
Is there a way to install ollama and download models on a external SSD on mac M1 ?
|
{
"login": "pramitsawant",
"id": 18174055,
"node_id": "MDQ6VXNlcjE4MTc0MDU1",
"avatar_url": "https://avatars.githubusercontent.com/u/18174055?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pramitsawant",
"html_url": "https://github.com/pramitsawant",
"followers_url": "https://api.github.com/users/pramitsawant/followers",
"following_url": "https://api.github.com/users/pramitsawant/following{/other_user}",
"gists_url": "https://api.github.com/users/pramitsawant/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pramitsawant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pramitsawant/subscriptions",
"organizations_url": "https://api.github.com/users/pramitsawant/orgs",
"repos_url": "https://api.github.com/users/pramitsawant/repos",
"events_url": "https://api.github.com/users/pramitsawant/events{/privacy}",
"received_events_url": "https://api.github.com/users/pramitsawant/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 6
| 2023-12-20T10:06:19
| 2024-02-23T23:31:51
| 2024-01-03T19:48:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
On a MacBook Pro M1, models are downloaded to /Users/{username}/.ollama/models/manifests/registry.ollama.ai/library.
I use an external SSD where I download and keep all less recently used apps and other files.
Can we have an option to set the model download path?
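For what it's worth, the documented way to relocate the model store is the `OLLAMA_MODELS` environment variable (the path below is just an example for an external volume):

```shell
# Point the model store at an external SSD (example path, not a default).
export OLLAMA_MODELS="/Volumes/ExternalSSD/ollama/models"
# restart the server/app so the variable takes effect:
# ollama serve
```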
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1625/reactions",
"total_count": 8,
"+1": 7,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1625/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3437
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3437/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3437/comments
|
https://api.github.com/repos/ollama/ollama/issues/3437/events
|
https://github.com/ollama/ollama/pull/3437
| 2,218,202,591
|
PR_kwDOJ0Z1Ps5rUD1f
| 3,437
|
Add chromem-go to community integrations
|
{
"login": "philippgille",
"id": 170670,
"node_id": "MDQ6VXNlcjE3MDY3MA==",
"avatar_url": "https://avatars.githubusercontent.com/u/170670?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/philippgille",
"html_url": "https://github.com/philippgille",
"followers_url": "https://api.github.com/users/philippgille/followers",
"following_url": "https://api.github.com/users/philippgille/following{/other_user}",
"gists_url": "https://api.github.com/users/philippgille/gists{/gist_id}",
"starred_url": "https://api.github.com/users/philippgille/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/philippgille/subscriptions",
"organizations_url": "https://api.github.com/users/philippgille/orgs",
"repos_url": "https://api.github.com/users/philippgille/repos",
"events_url": "https://api.github.com/users/philippgille/events{/privacy}",
"received_events_url": "https://api.github.com/users/philippgille/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-01T12:54:08
| 2024-04-01T15:17:38
| 2024-04-01T15:17:37
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3437",
"html_url": "https://github.com/ollama/ollama/pull/3437",
"diff_url": "https://github.com/ollama/ollama/pull/3437.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3437.patch",
"merged_at": "2024-04-01T15:17:37"
}
|
Hello :wave: , I've been a happy Ollama user for a while (both on macOS and Linux), thank you so much for creating and maintaining this project! I recommend it when talking to other people about running LLMs locally.
I recently worked on a Go library for a simple embedded vector DB, to be able to write RAG applications without having to host a separate vector DB server (like you have to with most of them, like Qdrant and Milvus). Chroma is embeddable in Python, Weaviate in Python and JS/TS, but not in Go.
I had locally running embedding models and LLMs in mind from the beginning, and the repo includes a detailed example of using Ollama with the `nomic-embed-text` and `gemma:2b` models for a simple RAG app.
- Library: https://github.com/philippgille/chromem-go
- Ollama integration: https://github.com/philippgille/chromem-go/blob/v0.5.0/embed_ollama.go
- RAG example using Ollama: https://github.com/philippgille/chromem-go/tree/v0.5.0/examples/rag-wikipedia-ollama
I would be very happy if the project could be included in the list of community integrations!
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3437/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3437/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8646
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8646/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8646/comments
|
https://api.github.com/repos/ollama/ollama/issues/8646/events
|
https://github.com/ollama/ollama/issues/8646
| 2,817,021,493
|
I_kwDOJ0Z1Ps6n6FY1
| 8,646
|
Deepseek R1 distilled models dropping tokens from context (deepseek-r1:32b)
|
{
"login": "brianthompson-sarcat",
"id": 92367346,
"node_id": "U_kgDOBYFp8g",
"avatar_url": "https://avatars.githubusercontent.com/u/92367346?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/brianthompson-sarcat",
"html_url": "https://github.com/brianthompson-sarcat",
"followers_url": "https://api.github.com/users/brianthompson-sarcat/followers",
"following_url": "https://api.github.com/users/brianthompson-sarcat/following{/other_user}",
"gists_url": "https://api.github.com/users/brianthompson-sarcat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/brianthompson-sarcat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/brianthompson-sarcat/subscriptions",
"organizations_url": "https://api.github.com/users/brianthompson-sarcat/orgs",
"repos_url": "https://api.github.com/users/brianthompson-sarcat/repos",
"events_url": "https://api.github.com/users/brianthompson-sarcat/events{/privacy}",
"received_events_url": "https://api.github.com/users/brianthompson-sarcat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-29T01:13:08
| 2025-01-29T11:54:16
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Example: I am running cline and the system prompt includes tool commands. The responses from the model are dropping word boundary tokens for file names and for tool names.
e.g., snake_game.py becomes snake_ame.py | replace_in_file becomes replace_in_ile.
Seems to mostly happen when using underscores.
deepseek-r1:32b
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.5.7
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8646/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8646/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/8091
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8091/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8091/comments
|
https://api.github.com/repos/ollama/ollama/issues/8091/events
|
https://github.com/ollama/ollama/pull/8091
| 2,739,234,731
|
PR_kwDOJ0Z1Ps6FMOst
| 8,091
|
runner: switch logging back to stderr
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-13T21:26:12
| 2024-12-13T22:37:00
| 2024-12-13T22:36:51
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8091",
"html_url": "https://github.com/ollama/ollama/pull/8091",
"diff_url": "https://github.com/ollama/ollama/pull/8091.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8091.patch",
"merged_at": "2024-12-13T22:36:51"
}
|
This puts the low-level runner logging back on stderr for consistency with prior releases
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8091/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8091/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4085
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4085/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4085/comments
|
https://api.github.com/repos/ollama/ollama/issues/4085/events
|
https://github.com/ollama/ollama/pull/4085
| 2,273,992,653
|
PR_kwDOJ0Z1Ps5uRvL-
| 4,085
|
Use LF for line endings
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-01T18:42:28
| 2024-05-01T19:02:46
| 2024-05-01T19:02:45
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4085",
"html_url": "https://github.com/ollama/ollama/pull/4085",
"diff_url": "https://github.com/ollama/ollama/pull/4085.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4085.patch",
"merged_at": "2024-05-01T19:02:45"
}
|
This fixes builds using Docker for Windows
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4085/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4085/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2476
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2476/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2476/comments
|
https://api.github.com/repos/ollama/ollama/issues/2476/events
|
https://github.com/ollama/ollama/pull/2476
| 2,132,526,299
|
PR_kwDOJ0Z1Ps5mw2Is
| 2,476
|
Add OpenAI /v1/models API support
|
{
"login": "keriati",
"id": 439392,
"node_id": "MDQ6VXNlcjQzOTM5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/439392?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/keriati",
"html_url": "https://github.com/keriati",
"followers_url": "https://api.github.com/users/keriati/followers",
"following_url": "https://api.github.com/users/keriati/following{/other_user}",
"gists_url": "https://api.github.com/users/keriati/gists{/gist_id}",
"starred_url": "https://api.github.com/users/keriati/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/keriati/subscriptions",
"organizations_url": "https://api.github.com/users/keriati/orgs",
"repos_url": "https://api.github.com/users/keriati/repos",
"events_url": "https://api.github.com/users/keriati/events{/privacy}",
"received_events_url": "https://api.github.com/users/keriati/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 10
| 2024-02-13T15:00:57
| 2024-07-02T18:50:57
| 2024-07-02T18:50:57
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2476",
"html_url": "https://github.com/ollama/ollama/pull/2476",
"diff_url": "https://github.com/ollama/ollama/pull/2476.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2476.patch",
"merged_at": null
}
|
Add OpenAI API **v1/models** endpoint compatibility.
See spec at: https://platform.openai.com/docs/api-reference/models/list
Personally I am not so sure about putting the ListModelsHandlerOpenAI method into the router file; however, the original ollama ListModelsHandler function is also there.
I generally don't write Go, so sorry for any weird things. Let me know what you think about this change.
Requested in #2430
Example usage:
```shell
❯ curl http://localhost:11434/v1/models | jq
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 226 100 226 0 0 33776 0 --:--:-- --:--:-- --:--:-- 110k
{
"object": "list",
"data": [
{
"id": "codegpt/deepseek-coder-1.3b-typescript:latest",
"object": "model",
"created": 1707753573,
"owned_by": "ollama"
},
{
"id": "deepseek-coder:6.7b",
"object": "model",
"created": 1705498161,
"owned_by": "ollama"
}
]
}
```
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2476/reactions",
"total_count": 30,
"+1": 20,
"-1": 0,
"laugh": 0,
"hooray": 9,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/2476/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7506
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7506/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7506/comments
|
https://api.github.com/repos/ollama/ollama/issues/7506/events
|
https://github.com/ollama/ollama/issues/7506
| 2,635,184,628
|
I_kwDOJ0Z1Ps6dEbn0
| 7,506
|
docker image not using GPU
|
{
"login": "781574155",
"id": 12928809,
"node_id": "MDQ6VXNlcjEyOTI4ODA5",
"avatar_url": "https://avatars.githubusercontent.com/u/12928809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/781574155",
"html_url": "https://github.com/781574155",
"followers_url": "https://api.github.com/users/781574155/followers",
"following_url": "https://api.github.com/users/781574155/following{/other_user}",
"gists_url": "https://api.github.com/users/781574155/gists{/gist_id}",
"starred_url": "https://api.github.com/users/781574155/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/781574155/subscriptions",
"organizations_url": "https://api.github.com/users/781574155/orgs",
"repos_url": "https://api.github.com/users/781574155/repos",
"events_url": "https://api.github.com/users/781574155/events{/privacy}",
"received_events_url": "https://api.github.com/users/781574155/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-05T11:24:56
| 2024-11-05T12:44:41
| 2024-11-05T12:44:40
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

In the docker image, ollama ps shows 100% GPU, but in fact it uses 0% GPU. The inference is really slow.
The GPU usage (nvidia-smi) is always 0%. Is it a bug? My NVIDIA Container Toolkit CLI version is 1.17.0.
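For context, a minimal sketch of the usual GPU-enabled invocation, assuming the NVIDIA Container Toolkit is installed (the flags follow Docker's standard `--gpus` option; the container name is illustrative):

```shell
# Start the container with GPU access; without --gpus, CUDA devices are not
# exposed inside the container at all.
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama

# Sanity check: the GPU should be visible from inside the container.
docker exec -it ollama nvidia-smi
```

If `nvidia-smi` inside the container shows no devices, the problem is in the container runtime setup rather than in ollama itself.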
### OS
Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14
|
{
"login": "781574155",
"id": 12928809,
"node_id": "MDQ6VXNlcjEyOTI4ODA5",
"avatar_url": "https://avatars.githubusercontent.com/u/12928809?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/781574155",
"html_url": "https://github.com/781574155",
"followers_url": "https://api.github.com/users/781574155/followers",
"following_url": "https://api.github.com/users/781574155/following{/other_user}",
"gists_url": "https://api.github.com/users/781574155/gists{/gist_id}",
"starred_url": "https://api.github.com/users/781574155/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/781574155/subscriptions",
"organizations_url": "https://api.github.com/users/781574155/orgs",
"repos_url": "https://api.github.com/users/781574155/repos",
"events_url": "https://api.github.com/users/781574155/events{/privacy}",
"received_events_url": "https://api.github.com/users/781574155/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7506/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7506/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7577
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7577/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7577/comments
|
https://api.github.com/repos/ollama/ollama/issues/7577/events
|
https://github.com/ollama/ollama/issues/7577
| 2,644,585,989
|
I_kwDOJ0Z1Ps6doS4F
| 7,577
|
Build fails and wipes parent folder if the path to the source contains a directory name with a space
|
{
"login": "gautamz",
"id": 18205052,
"node_id": "MDQ6VXNlcjE4MjA1MDUy",
"avatar_url": "https://avatars.githubusercontent.com/u/18205052?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gautamz",
"html_url": "https://github.com/gautamz",
"followers_url": "https://api.github.com/users/gautamz/followers",
"following_url": "https://api.github.com/users/gautamz/following{/other_user}",
"gists_url": "https://api.github.com/users/gautamz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gautamz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gautamz/subscriptions",
"organizations_url": "https://api.github.com/users/gautamz/orgs",
"repos_url": "https://api.github.com/users/gautamz/repos",
"events_url": "https://api.github.com/users/gautamz/events{/privacy}",
"received_events_url": "https://api.github.com/users/gautamz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 7700262114,
"node_id": "LA_kwDOJ0Z1Ps8AAAAByvis4g",
"url": "https://api.github.com/repos/ollama/ollama/labels/build",
"name": "build",
"color": "006b75",
"default": false,
"description": "Issues relating to building ollama from source"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-11-08T16:51:38
| 2024-12-10T17:47:23
| 2024-12-10T17:47:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I cloned the repo and tried to build ollama on my device running Linux with Intel cpu, Nvidia GPU and CUDA installed.
The path to source was - /home/user/workspace/sourcebuild/ollama stuff/ollama
When I ran make, the "clean payload" target from the Makefile in sub-directory "llama" wiped out /home/user/workspace
The build failed.
I then tried this with path to source as - /home/user/workspace/sourcebuild/ollama_stuff/ollama
The build succeeded.
Either the path handling should be fixed, or the user should be notified and allowed to decide whether to proceed with rm -rf ...
Please test this cautiously. I had an older backup so recovered most of my wiped files, but did lose a few recent projects.
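The shell word splitting behind this failure mode can be sketched directly (the path below is hypothetical, mirroring the reporter's layout):

```shell
# Hypothetical illustration: an unquoted shell variable holding a path with a
# space is split into two words, so an `rm -rf $dir` in a Makefile recipe
# receives two separate targets -- the first of which is the parent tree.
dir='/home/user/workspace/sourcebuild/ollama stuff/ollama/dist'

# Unquoted: field splitting yields two arguments.
printf '[%s]\n' $dir
# [/home/user/workspace/sourcebuild/ollama]
# [stuff/ollama/dist]

# Quoted: the single intended path is preserved.
printf '[%s]\n' "$dir"
# [/home/user/workspace/sourcebuild/ollama stuff/ollama/dist]
```

Quoting every path expansion in recipes (or refusing to run when the path contains whitespace) avoids the destructive expansion.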
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
master
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7577/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7577/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7680
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7680/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7680/comments
|
https://api.github.com/repos/ollama/ollama/issues/7680/events
|
https://github.com/ollama/ollama/issues/7680
| 2,660,932,636
|
I_kwDOJ0Z1Ps6empwc
| 7,680
|
diffusion model support
|
{
"login": "ww-nightly",
"id": 86903380,
"node_id": "MDQ6VXNlcjg2OTAzMzgw",
"avatar_url": "https://avatars.githubusercontent.com/u/86903380?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ww-nightly",
"html_url": "https://github.com/ww-nightly",
"followers_url": "https://api.github.com/users/ww-nightly/followers",
"following_url": "https://api.github.com/users/ww-nightly/following{/other_user}",
"gists_url": "https://api.github.com/users/ww-nightly/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ww-nightly/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ww-nightly/subscriptions",
"organizations_url": "https://api.github.com/users/ww-nightly/orgs",
"repos_url": "https://api.github.com/users/ww-nightly/repos",
"events_url": "https://api.github.com/users/ww-nightly/events{/privacy}",
"received_events_url": "https://api.github.com/users/ww-nightly/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-11-15T06:28:09
| 2024-11-15T08:08:42
| 2024-11-15T08:08:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Will ollama support diffusion models like SDXL and Flux?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7680/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7680/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6313
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6313/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6313/comments
|
https://api.github.com/repos/ollama/ollama/issues/6313/events
|
https://github.com/ollama/ollama/issues/6313
| 2,459,673,669
|
I_kwDOJ0Z1Ps6Sm6RF
| 6,313
|
openbmb / MiniCPM-Llama3-V-2_5
|
{
"login": "chigkim",
"id": 22120994,
"node_id": "MDQ6VXNlcjIyMTIwOTk0",
"avatar_url": "https://avatars.githubusercontent.com/u/22120994?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chigkim",
"html_url": "https://github.com/chigkim",
"followers_url": "https://api.github.com/users/chigkim/followers",
"following_url": "https://api.github.com/users/chigkim/following{/other_user}",
"gists_url": "https://api.github.com/users/chigkim/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chigkim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chigkim/subscriptions",
"organizations_url": "https://api.github.com/users/chigkim/orgs",
"repos_url": "https://api.github.com/users/chigkim/repos",
"events_url": "https://api.github.com/users/chigkim/events{/privacy}",
"received_events_url": "https://api.github.com/users/chigkim/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-11T16:51:59
| 2024-08-12T00:20:19
| 2024-08-12T00:20:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It looks like llama.cpp now [supports openbmb/MiniCPM-Llama3-V-2_5.](https://github.com/ggerganov/llama.cpp/pull/7599)
Here's the [official gguf.](https://huggingface.co/openbmb/MiniCPM-Llama3-V-2_5-gguf)
### [Model Summary from their Huggingface:](https://huggingface.co/openbmb/MiniCPM-Llama3-V-2_5)
MiniCPM-Llama3-V 2.5 is the latest model in the MiniCPM-V series. The model is built on SigLip-400M and Llama3-8B-Instruct with a total of 8B parameters. It exhibits a significant performance improvement over MiniCPM-V 2.0. Notable features of MiniCPM-Llama3-V 2.5 include:
* 🔥 Leading Performance. MiniCPM-Llama3-V 2.5 has achieved an average score of 65.1 on OpenCompass, a comprehensive evaluation over 11 popular benchmarks. With only 8B parameters, it surpasses widely used proprietary models like GPT-4V-1106, Gemini Pro, Claude 3 and Qwen-VL-Max and greatly outperforms other Llama 3-based MLLMs.
* 💪 Strong OCR Capabilities. MiniCPM-Llama3-V 2.5 can process images with any aspect ratio and up to 1.8 million pixels (e.g., 1344x1344), achieving a 700+ score on OCRBench, surpassing proprietary models such as GPT-4o, GPT-4V-0409, Qwen-VL-Max and Gemini Pro. Based on recent user feedback, MiniCPM-Llama3-V 2.5 has now enhanced full-text OCR extraction, table-to-markdown conversion, and other high-utility capabilities, and has further strengthened its instruction-following and complex reasoning abilities, enhancing multimodal interaction experiences.
* 🏆 Trustworthy Behavior. Leveraging the latest RLAIF-V method (the newest technology in the RLHF-V [CVPR'24] series), MiniCPM-Llama3-V 2.5 exhibits more trustworthy behavior. It achieves 10.3% hallucination rate on Object HalBench, lower than GPT-4V-1106 (13.6%), achieving the best-level performance within the open-source community. Data released.
* 🌏 Multilingual Support. Thanks to the strong multilingual capabilities of Llama 3 and the cross-lingual generalization technique from VisCPM, MiniCPM-Llama3-V 2.5 extends its bilingual (Chinese-English) multimodal capabilities to over 30 languages including German, French, Spanish, Italian, Korean, Japanese etc. All Supported Languages.
* 🚀 Efficient Deployment. MiniCPM-Llama3-V 2.5 systematically employs model quantization, CPU optimizations, NPU optimizations and compilation optimizations, achieving high-efficiency deployment on edge devices. For mobile phones with Qualcomm chips, we have integrated the NPU acceleration framework QNN into llama.cpp for the first time. After systematic optimization, MiniCPM-Llama3-V 2.5 has realized a 150-fold acceleration in multimodal large model end-side image encoding and a 3-fold increase in language decoding speed.
* 💫 Easy Usage. MiniCPM-Llama3-V 2.5 can be easily used in various ways: (1) llama.cpp and ollama support for efficient CPU inference on local devices, (2) GGUF format quantized models in 16 sizes, (3) efficient LoRA fine-tuning with only 2 V100 GPUs, (4) streaming output, (5) quick local WebUI demo setup with Gradio and Streamlit, and (6) interactive demos on HuggingFace Spaces.
|
{
"login": "chigkim",
"id": 22120994,
"node_id": "MDQ6VXNlcjIyMTIwOTk0",
"avatar_url": "https://avatars.githubusercontent.com/u/22120994?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chigkim",
"html_url": "https://github.com/chigkim",
"followers_url": "https://api.github.com/users/chigkim/followers",
"following_url": "https://api.github.com/users/chigkim/following{/other_user}",
"gists_url": "https://api.github.com/users/chigkim/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chigkim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chigkim/subscriptions",
"organizations_url": "https://api.github.com/users/chigkim/orgs",
"repos_url": "https://api.github.com/users/chigkim/repos",
"events_url": "https://api.github.com/users/chigkim/events{/privacy}",
"received_events_url": "https://api.github.com/users/chigkim/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6313/reactions",
"total_count": 3,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/6313/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6805
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6805/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6805/comments
|
https://api.github.com/repos/ollama/ollama/issues/6805/events
|
https://github.com/ollama/ollama/issues/6805
| 2,526,562,467
|
I_kwDOJ0Z1Ps6WmEij
| 6,805
|
Slow setup download from Pakistan
|
{
"login": "hashirnouman",
"id": 72395400,
"node_id": "MDQ6VXNlcjcyMzk1NDAw",
"avatar_url": "https://avatars.githubusercontent.com/u/72395400?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hashirnouman",
"html_url": "https://github.com/hashirnouman",
"followers_url": "https://api.github.com/users/hashirnouman/followers",
"following_url": "https://api.github.com/users/hashirnouman/following{/other_user}",
"gists_url": "https://api.github.com/users/hashirnouman/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hashirnouman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hashirnouman/subscriptions",
"organizations_url": "https://api.github.com/users/hashirnouman/orgs",
"repos_url": "https://api.github.com/users/hashirnouman/repos",
"events_url": "https://api.github.com/users/hashirnouman/events{/privacy}",
"received_events_url": "https://api.github.com/users/hashirnouman/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6573197867,
"node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw",
"url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com",
"name": "ollama.com",
"color": "ffffff",
"default": false,
"description": ""
},
{
"id": 6677370291,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw",
"url": "https://api.github.com/repos/ollama/ollama/labels/networking",
"name": "networking",
"color": "0B5368",
"default": false,
"description": "Issues relating to ollama pull and push"
}
] |
open
| false
| null |
[] | null | 6
| 2024-09-14T18:51:56
| 2024-10-03T16:02:02
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Both the Ollama installer exe and curl download extremely slowly. My internet connection is fine: when I downloaded a 2 GB file in parallel, it completed very quickly.
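For reference, the fast "parallel download" works by splitting the file into HTTP byte ranges fetched over separate connections. A minimal sketch of that range-splitting step (illustrative only; `split_ranges` is a hypothetical helper, not part of Ollama's downloader):

```python
def split_ranges(total_size: int, parts: int) -> list[tuple[int, int]]:
    """Split total_size bytes into inclusive (start, end) pairs,
    one per connection; each becomes a 'Range: bytes=start-end' header."""
    chunk = total_size // parts
    ranges = []
    for i in range(parts):
        start = i * chunk
        # the last part absorbs any remainder
        end = total_size - 1 if i == parts - 1 else start + chunk - 1
        ranges.append((start, end))
    return ranges

# e.g. a 2 GB file over 4 connections
print(split_ranges(2_000_000_000, 4))
```

A single slow connection suggests a routing or CDN issue rather than local bandwidth, which is why the parallel comparison is a useful data point.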
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6805/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6805/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6303
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6303/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6303/comments
|
https://api.github.com/repos/ollama/ollama/issues/6303/events
|
https://github.com/ollama/ollama/issues/6303
| 2,459,240,748
|
I_kwDOJ0Z1Ps6SlQks
| 6,303
|
Llama 3.1 405B fix-update
|
{
"login": "gileneusz",
"id": 34601970,
"node_id": "MDQ6VXNlcjM0NjAxOTcw",
"avatar_url": "https://avatars.githubusercontent.com/u/34601970?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gileneusz",
"html_url": "https://github.com/gileneusz",
"followers_url": "https://api.github.com/users/gileneusz/followers",
"following_url": "https://api.github.com/users/gileneusz/following{/other_user}",
"gists_url": "https://api.github.com/users/gileneusz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gileneusz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gileneusz/subscriptions",
"organizations_url": "https://api.github.com/users/gileneusz/orgs",
"repos_url": "https://api.github.com/users/gileneusz/repos",
"events_url": "https://api.github.com/users/gileneusz/events{/privacy}",
"received_events_url": "https://api.github.com/users/gileneusz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-08-10T18:09:15
| 2024-08-12T02:22:29
| 2024-08-12T02:22:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The update reduces memory usage while maintaining the same quality as the previous version. 🎉
<img width="597" alt="Screenshot 2024-08-10 at 20 04 09" src="https://github.com/user-attachments/assets/a4c9f9c6-7598-43a7-9967-b7f32044b269">
Link to updated model:
https://huggingface.co/meta-llama/Meta-Llama-3.1-405B
<img width="601" alt="Screenshot 2024-08-10 at 20 04 21" src="https://github.com/user-attachments/assets/28844f00-f894-4ea0-835f-ad58e92ef22c">
|
{
"login": "gileneusz",
"id": 34601970,
"node_id": "MDQ6VXNlcjM0NjAxOTcw",
"avatar_url": "https://avatars.githubusercontent.com/u/34601970?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gileneusz",
"html_url": "https://github.com/gileneusz",
"followers_url": "https://api.github.com/users/gileneusz/followers",
"following_url": "https://api.github.com/users/gileneusz/following{/other_user}",
"gists_url": "https://api.github.com/users/gileneusz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gileneusz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gileneusz/subscriptions",
"organizations_url": "https://api.github.com/users/gileneusz/orgs",
"repos_url": "https://api.github.com/users/gileneusz/repos",
"events_url": "https://api.github.com/users/gileneusz/events{/privacy}",
"received_events_url": "https://api.github.com/users/gileneusz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6303/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6303/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1558
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1558/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1558/comments
|
https://api.github.com/repos/ollama/ollama/issues/1558/events
|
https://github.com/ollama/ollama/pull/1558
| 2,044,561,712
|
PR_kwDOJ0Z1Ps5iJ8qP
| 1,558
|
add magic header for unit tests
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-16T02:01:01
| 2023-12-21T08:03:27
| 2023-12-18T18:41:02
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1558",
"html_url": "https://github.com/ollama/ollama/pull/1558",
"diff_url": "https://github.com/ollama/ollama/pull/1558.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1558.patch",
"merged_at": "2023-12-18T18:41:02"
}
|
This change adds the magic GGUF header to the temporary model image layer.
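For context, GGUF files begin with the 4-byte ASCII magic `GGUF`, followed by a little-endian `uint32` version field. A quick sketch of checking that header (an illustration of the format, not the code in this PR):

```python
import struct

GGUF_MAGIC = b"GGUF"

def has_gguf_magic(header: bytes) -> bool:
    """Return True if the buffer starts with the GGUF magic bytes."""
    return header[:4] == GGUF_MAGIC

def read_version(header: bytes) -> int:
    """The version is a little-endian uint32 immediately after the magic."""
    (version,) = struct.unpack_from("<I", header, 4)
    return version

# build a minimal fake header: magic + version 3
fake_header = GGUF_MAGIC + struct.pack("<I", 3)
print(has_gguf_magic(fake_header), read_version(fake_header))
```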
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1558/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1558/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8648
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8648/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8648/comments
|
https://api.github.com/repos/ollama/ollama/issues/8648/events
|
https://github.com/ollama/ollama/issues/8648
| 2,817,169,473
|
I_kwDOJ0Z1Ps6n6phB
| 8,648
|
Ollama installer should ask which drive the user wants to install it on
|
{
"login": "VikramNagwal",
"id": 123088024,
"node_id": "U_kgDOB1YsmA",
"avatar_url": "https://avatars.githubusercontent.com/u/123088024?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/VikramNagwal",
"html_url": "https://github.com/VikramNagwal",
"followers_url": "https://api.github.com/users/VikramNagwal/followers",
"following_url": "https://api.github.com/users/VikramNagwal/following{/other_user}",
"gists_url": "https://api.github.com/users/VikramNagwal/gists{/gist_id}",
"starred_url": "https://api.github.com/users/VikramNagwal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/VikramNagwal/subscriptions",
"organizations_url": "https://api.github.com/users/VikramNagwal/orgs",
"repos_url": "https://api.github.com/users/VikramNagwal/repos",
"events_url": "https://api.github.com/users/VikramNagwal/events{/privacy}",
"received_events_url": "https://api.github.com/users/VikramNagwal/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 2
| 2025-01-29T03:40:28
| 2025-01-29T17:20:19
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently, Ollama Desktop is being installed on the C drive. However, if users prefer not to have it stored there, the system should offer an option to choose a different installation location during the setup process.
**Feature Request:**
The Ollama Desktop installation wizard should prompt users to choose their preferred installation directory.
**Problem:**
The Ollama Desktop installer places the application on the root drive (C:) without allowing users to control where it is stored.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8648/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8648/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/682
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/682/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/682/comments
|
https://api.github.com/repos/ollama/ollama/issues/682/events
|
https://github.com/ollama/ollama/issues/682
| 1,922,778,216
|
I_kwDOJ0Z1Ps5ym0Ro
| 682
|
System messages are not respected
|
{
"login": "OgulcanCelik",
"id": 7064011,
"node_id": "MDQ6VXNlcjcwNjQwMTE=",
"avatar_url": "https://avatars.githubusercontent.com/u/7064011?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/OgulcanCelik",
"html_url": "https://github.com/OgulcanCelik",
"followers_url": "https://api.github.com/users/OgulcanCelik/followers",
"following_url": "https://api.github.com/users/OgulcanCelik/following{/other_user}",
"gists_url": "https://api.github.com/users/OgulcanCelik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/OgulcanCelik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OgulcanCelik/subscriptions",
"organizations_url": "https://api.github.com/users/OgulcanCelik/orgs",
"repos_url": "https://api.github.com/users/OgulcanCelik/repos",
"events_url": "https://api.github.com/users/OgulcanCelik/events{/privacy}",
"received_events_url": "https://api.github.com/users/OgulcanCelik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 10
| 2023-10-02T21:43:14
| 2024-11-14T23:09:47
| 2023-10-30T22:22:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Created the Mario example using mistral. Also tried with llama2-uncensored and without the temperature parameter.
```
FROM mistral
PARAMETER temperature 0.9
SYSTEM """
You are Mario from super mario bros, acting as an assistant.
"""
```
run:
```
>>> who are you
I am Mistral, a Large Language Model trained by the Mistral AI team.
>>> I know you are mario come on
While I understand the excitement of my name being associated with Mario, my identity as a large language model does not change. My purpose is to assist
users in a wide variety of tasks through natural language processing and generation.
```
This does not look like a model-dependent issue; the system message is simply not respected.
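As a workaround while debugging, the system prompt can also be sent explicitly per request through the documented `system` field of `/api/generate`, instead of relying on the Modelfile's `SYSTEM` block. A sketch of building that request body (sending it requires a running Ollama server at the default `http://localhost:11434`):

```python
import json

def build_generate_payload(model: str, prompt: str, system: str) -> str:
    """JSON body for POST http://localhost:11434/api/generate."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "system": system,   # overrides the Modelfile SYSTEM for this request
        "stream": False,
    })

payload = build_generate_payload(
    "mistral",
    "who are you",
    "You are Mario from Super Mario Bros, acting as an assistant.",
)
print(payload)
```

If the persona works via the request-level `system` field but not via the Modelfile, that narrows the bug to how the Modelfile system prompt is templated into the final prompt.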
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/682/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/682/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6357
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6357/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6357/comments
|
https://api.github.com/repos/ollama/ollama/issues/6357/events
|
https://github.com/ollama/ollama/issues/6357
| 2,465,355,557
|
I_kwDOJ0Z1Ps6S8lcl
| 6,357
|
Error: unknown data type: U8
|
{
"login": "YaBoyBigPat",
"id": 88011040,
"node_id": "MDQ6VXNlcjg4MDExMDQw",
"avatar_url": "https://avatars.githubusercontent.com/u/88011040?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YaBoyBigPat",
"html_url": "https://github.com/YaBoyBigPat",
"followers_url": "https://api.github.com/users/YaBoyBigPat/followers",
"following_url": "https://api.github.com/users/YaBoyBigPat/following{/other_user}",
"gists_url": "https://api.github.com/users/YaBoyBigPat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YaBoyBigPat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YaBoyBigPat/subscriptions",
"organizations_url": "https://api.github.com/users/YaBoyBigPat/orgs",
"repos_url": "https://api.github.com/users/YaBoyBigPat/repos",
"events_url": "https://api.github.com/users/YaBoyBigPat/events{/privacy}",
"received_events_url": "https://api.github.com/users/YaBoyBigPat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 28
| 2024-08-14T09:34:43
| 2024-11-28T11:59:28
| 2024-08-28T00:54:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm having trouble converting my Llama 3.1 model for Ollama; here's the error I get:
```
PS C:\Users\ljjx> ollama create -q Q4_K_M llama3.1q4 -f "C:\Users\ljjx\HFModels\Modelfile"
transferring model data
converting model
Error: unknown data type: U8
```
Here's how I set up the Modelfile:
```
FROM C:\Users\ljjx\HFModels\Meta-Llama-3.1-8B-Instruct-4bit
TEMPLATE """{{ if .Messages }}
{{- if or .System .Tools }}<|start_header_id|>system<|end_header_id|>
{{- if .System }}
{{ .System }}
{{- end }}
{{- if .Tools }}
You are a helpful assistant with tool calling capabilities. When you receive a tool call response, use the output to format an answer to the orginal use question.
{{- end }}<|eot_id|>
{{- end }}
{{- range $i, $_ := .Messages }}
{{- $last := eq (len (slice $.Messages $i)) 1 }}
{{- if eq .Role "user" }}<|start_header_id|>user<|end_header_id|>
{{- if and $.Tools $last }}
Given the following functions, please respond with a JSON for a function call with its proper arguments that best answers the given prompt.
Respond in the format {"name": function name, "parameters": dictionary of argument name and its value}. Do not use variables.
{{ $.Tools }}
{{- end }}
{{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>
{{ end }}
{{- else if eq .Role "assistant" }}<|start_header_id|>assistant<|end_header_id|>
{{- if .ToolCalls }}
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "parameters": {{ .Function.Arguments }}}{{ end }}
{{- else }}
{{ .Content }}{{ if not $last }}<|eot_id|>{{ end }}
{{- end }}
{{- else if eq .Role "tool" }}<|start_header_id|>ipython<|end_header_id|>
{{ .Content }}<|eot_id|>{{ if $last }}<|start_header_id|>assistant<|end_header_id|>
{{ end }}
{{- end }}
{{- end }}
{{- else }}
{{- if .System }}<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ end }}{{ .Response }}{{ if .Response }}<|eot_id|>{{ end }}"""
PARAMETER stop <|start_header_id|>
PARAMETER stop <|end_header_id|>
PARAMETER stop <|eot_id|>
```
I don't know what I'm doing wrong here. I also tried it with a mostly blank template and without quantization, but got the same error.
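One likely cause, hedged as a guess from the model name: `Meta-Llama-3.1-8B-Instruct-4bit` is already quantized, so its safetensors contain packed `U8` tensors, which Ollama's converter does not understand (it expects standard float tensor types). Pointing `FROM` at the original full-precision weights and letting `-q Q4_K_M` do the quantization avoids the `U8` tensors entirely; the directory path below is a hypothetical example:

```
FROM C:\Users\ljjx\HFModels\Meta-Llama-3.1-8B-Instruct
PARAMETER stop <|start_header_id|>
PARAMETER stop <|end_header_id|>
PARAMETER stop <|eot_id|>
```

Then run `ollama create -q Q4_K_M llama3.1q4 -f Modelfile` as before.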
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6357/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6357/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2336
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2336/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2336/comments
|
https://api.github.com/repos/ollama/ollama/issues/2336/events
|
https://github.com/ollama/ollama/issues/2336
| 2,116,335,248
|
I_kwDOJ0Z1Ps5-JLaQ
| 2,336
|
[Feature Request] Add api/stream endpoint useful for visualization
|
{
"login": "tgraupmann",
"id": 931421,
"node_id": "MDQ6VXNlcjkzMTQyMQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/931421?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tgraupmann",
"html_url": "https://github.com/tgraupmann",
"followers_url": "https://api.github.com/users/tgraupmann/followers",
"following_url": "https://api.github.com/users/tgraupmann/following{/other_user}",
"gists_url": "https://api.github.com/users/tgraupmann/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tgraupmann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tgraupmann/subscriptions",
"organizations_url": "https://api.github.com/users/tgraupmann/orgs",
"repos_url": "https://api.github.com/users/tgraupmann/repos",
"events_url": "https://api.github.com/users/tgraupmann/events{/privacy}",
"received_events_url": "https://api.github.com/users/tgraupmann/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-02-03T07:46:44
| 2024-03-11T23:54:59
| 2024-03-11T23:54:58
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I was looking at old projects specifically Winamp visualizers from a decade past.
Example: [Winamp Visualizations](https://www.youtube.com/watch?v=RBkhUg1oVIE)
The old-school visualizers work on audio data, and I was thinking: what if we used sentence encodings of Ollama's inputs and outputs as the data driving the visualizers?
To do something like this, it would be useful to have an API endpoint like api/stream that receives data as the inputs and outputs occur. The sentence encodings would be useful if they are vectorized, and it's up to the visualizer to render them.
It would just be something interesting to look at. And of course the endpoint could be connected to RGB hardware, which means your room lights would flicker when you prompt and when you get a response. It's a visual cue: "hey, look over here."
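Even without a dedicated `api/stream` endpoint, Ollama's existing streaming responses are newline-delimited JSON, so a visualizer could already tap them. A sketch (the intensity metric here is invented purely for illustration) that maps each streamed chunk to a level an RGB controller or renderer could consume:

```python
import json

def chunk_levels(ndjson_lines):
    """Map each streamed /api/generate chunk to a 0.0-1.0 intensity level."""
    levels = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        token = chunk.get("response", "")
        # crude metric: longer tokens light up brighter, capped at 1.0
        levels.append(min(len(token) / 10.0, 1.0))
    return levels

# sample chunks in the shape of Ollama's streaming output
sample = [
    '{"response": "Hello"}',
    '{"response": ", world"}',
    '{"response": "", "done": true}',
]
print(chunk_levels(sample))
```

Feeding vectorized sentence encodings instead of token lengths would be a drop-in change to the metric function.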
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2336/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2336/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3906
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3906/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3906/comments
|
https://api.github.com/repos/ollama/ollama/issues/3906/events
|
https://github.com/ollama/ollama/issues/3906
| 2,263,257,900
|
I_kwDOJ0Z1Ps6G5pMs
| 3,906
|
ASR models?
|
{
"login": "yuanjie-ai",
"id": 20265321,
"node_id": "MDQ6VXNlcjIwMjY1MzIx",
"avatar_url": "https://avatars.githubusercontent.com/u/20265321?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yuanjie-ai",
"html_url": "https://github.com/yuanjie-ai",
"followers_url": "https://api.github.com/users/yuanjie-ai/followers",
"following_url": "https://api.github.com/users/yuanjie-ai/following{/other_user}",
"gists_url": "https://api.github.com/users/yuanjie-ai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yuanjie-ai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yuanjie-ai/subscriptions",
"organizations_url": "https://api.github.com/users/yuanjie-ai/orgs",
"repos_url": "https://api.github.com/users/yuanjie-ai/repos",
"events_url": "https://api.github.com/users/yuanjie-ai/events{/privacy}",
"received_events_url": "https://api.github.com/users/yuanjie-ai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-04-25T10:39:55
| 2024-04-25T10:39:55
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
ASR models?
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3906/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3906/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1172
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1172/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1172/comments
|
https://api.github.com/repos/ollama/ollama/issues/1172/events
|
https://github.com/ollama/ollama/issues/1172
| 1,999,229,511
|
I_kwDOJ0Z1Ps53KdJH
| 1,172
|
The server stops working after long context window is passed with zephyr model deployed with docker container on A10 GPU
|
{
"login": "ajasingh",
"id": 15189049,
"node_id": "MDQ6VXNlcjE1MTg5MDQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/15189049?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ajasingh",
"html_url": "https://github.com/ajasingh",
"followers_url": "https://api.github.com/users/ajasingh/followers",
"following_url": "https://api.github.com/users/ajasingh/following{/other_user}",
"gists_url": "https://api.github.com/users/ajasingh/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ajasingh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ajasingh/subscriptions",
"organizations_url": "https://api.github.com/users/ajasingh/orgs",
"repos_url": "https://api.github.com/users/ajasingh/repos",
"events_url": "https://api.github.com/users/ajasingh/events{/privacy}",
"received_events_url": "https://api.github.com/users/ajasingh/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2023-11-17T14:14:02
| 2024-03-11T18:19:43
| 2024-03-11T18:19:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have deployed Ollama on an Nvidia A10 using a Docker container. It works fine for smaller context windows, but when a long context is passed the server stops working and gives the error below:
```
2023/11/17 14:08:33 llama.go:385: signal: killed
2023/11/17 14:08:33 llama.go:459: llama runner stopped successfully
```
Is there any guidance on why this might be happening?
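`signal: killed` on Linux usually means the kernel's OOM killer terminated the llama runner, and memory use grows with the context window because of the KV cache. One mitigation to try (a hedged suggestion, not a confirmed fix for this report) is capping the context length with the documented `num_ctx` Modelfile parameter; the value below is only an example:

```
FROM zephyr
# cap the context window to bound KV-cache memory use
PARAMETER num_ctx 2048
```

Raising the container's memory limit and checking `dmesg` for OOM-killer messages are the complementary checks.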
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1172/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1172/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/597
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/597/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/597/comments
|
https://api.github.com/repos/ollama/ollama/issues/597/events
|
https://github.com/ollama/ollama/pull/597
| 1,912,416,561
|
PR_kwDOJ0Z1Ps5bLGbV
| 597
|
build slim, GPU-less docker image
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-09-25T23:15:46
| 2024-05-09T22:18:00
| 2024-04-14T22:46:54
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/597",
"html_url": "https://github.com/ollama/ollama/pull/597",
"diff_url": "https://github.com/ollama/ollama/pull/597.diff",
"patch_url": "https://github.com/ollama/ollama/pull/597.patch",
"merged_at": null
}
|
Build a CPU-only Docker image, which is significantly smaller than the GPU image:
```
ollama cuda dfdbcb88bc3d 4 minutes ago 754MB
ollama slim fb2e67c26718 7 minutes ago 148MB
```
Related #516
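
A slim CPU-only image like the one above is typically produced with a multi-stage build, where the final stage copies only the compiled binary onto a minimal base and omits the CUDA libraries. The following Dockerfile is an illustrative sketch only; the base images, Go version, and build flags are assumptions, not the PR's actual Dockerfile:

```dockerfile
# Stage 1: build the binary (Go version is an assumption for illustration).
FROM golang:1.21 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=1 go build -o /ollama .

# Stage 2: minimal runtime image with no CUDA libraries.
FROM ubuntu:22.04
COPY --from=build /ollama /usr/bin/ollama
EXPOSE 11434
ENTRYPOINT ["/usr/bin/ollama"]
CMD ["serve"]
```

Because the runtime stage carries no GPU toolchain, the resulting image is a fraction of the size of the CUDA variant.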
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/597/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 3,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/597/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8204
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8204/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8204/comments
|
https://api.github.com/repos/ollama/ollama/issues/8204/events
|
https://github.com/ollama/ollama/pull/8204
| 2,754,294,023
|
PR_kwDOJ0Z1Ps6F_0NW
| 8,204
|
add aria2c support for faster downloads
|
{
"login": "A-Akhil",
"id": 50855133,
"node_id": "MDQ6VXNlcjUwODU1MTMz",
"avatar_url": "https://avatars.githubusercontent.com/u/50855133?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/A-Akhil",
"html_url": "https://github.com/A-Akhil",
"followers_url": "https://api.github.com/users/A-Akhil/followers",
"following_url": "https://api.github.com/users/A-Akhil/following{/other_user}",
"gists_url": "https://api.github.com/users/A-Akhil/gists{/gist_id}",
"starred_url": "https://api.github.com/users/A-Akhil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/A-Akhil/subscriptions",
"organizations_url": "https://api.github.com/users/A-Akhil/orgs",
"repos_url": "https://api.github.com/users/A-Akhil/repos",
"events_url": "https://api.github.com/users/A-Akhil/events{/privacy}",
"received_events_url": "https://api.github.com/users/A-Akhil/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-12-21T17:58:06
| 2024-12-23T15:04:30
| 2024-12-23T15:04:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8204",
"html_url": "https://github.com/ollama/ollama/pull/8204",
"diff_url": "https://github.com/ollama/ollama/pull/8204.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8204.patch",
"merged_at": null
}
|
This PR adds optimized aria2c support to the installation script, improving download speed and reliability while keeping curl as a fallback.
## Changes
- Create unified `download_and_extract` function to handle both aria2c and curl
- Maintain backward compatibility with existing curl implementation
## Verification Steps
1. Run install script on system with aria2c installed
2. Run install script on system without aria2c
Fixes #8203
Please let me know if there are additional changes or enhancements you'd like to include.
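
The fallback logic described above might look roughly like this. This is a minimal sketch: the function name comes from the PR description, but the flags and structure are illustrative assumptions, not the PR's actual code:

```shell
#!/bin/sh
# Sketch of a unified downloader: prefer aria2c (segmented, parallel
# connections) when it is installed, otherwise fall back to curl.
download_and_extract() {
    url="$1"
    dest="$2"
    if command -v aria2c >/dev/null 2>&1; then
        # Up to 16 parallel connections per server (illustrative values).
        aria2c -x 16 -s 16 \
            --dir="$(dirname "$dest")" --out="$(basename "$dest")" "$url"
    else
        curl --fail --show-error --location --progress-bar -o "$dest" "$url"
    fi
}
```

Keeping the `command -v` check inside the function means systems without aria2c behave exactly as before.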
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8204/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8204/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/725
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/725/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/725/comments
|
https://api.github.com/repos/ollama/ollama/issues/725/events
|
https://github.com/ollama/ollama/issues/725
| 1,930,971,909
|
I_kwDOJ0Z1Ps5zGEsF
| 725
|
GPU allocation on Linux
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2023-10-06T21:49:06
| 2023-10-10T20:16:10
| 2023-10-10T20:16:10
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently, too much GPU memory is allocated on Linux when loading a model.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/725/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/725/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2327
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2327/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2327/comments
|
https://api.github.com/repos/ollama/ollama/issues/2327/events
|
https://github.com/ollama/ollama/issues/2327
| 2,114,912,540
|
I_kwDOJ0Z1Ps5-DwEc
| 2,327
|
AMD EOF Model Error
|
{
"login": "jusgrave",
"id": 528103,
"node_id": "MDQ6VXNlcjUyODEwMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/528103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jusgrave",
"html_url": "https://github.com/jusgrave",
"followers_url": "https://api.github.com/users/jusgrave/followers",
"following_url": "https://api.github.com/users/jusgrave/following{/other_user}",
"gists_url": "https://api.github.com/users/jusgrave/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jusgrave/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jusgrave/subscriptions",
"organizations_url": "https://api.github.com/users/jusgrave/orgs",
"repos_url": "https://api.github.com/users/jusgrave/repos",
"events_url": "https://api.github.com/users/jusgrave/events{/privacy}",
"received_events_url": "https://api.github.com/users/jusgrave/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 13
| 2024-02-02T13:16:55
| 2024-02-02T19:23:12
| 2024-02-02T19:13:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I've been having issues getting Ollama running: when I run a model I get an EOF error and an invalid-pointer crash.
```
Feb 02 07:53:39 justin-ArchEndos ollama[48332]: 2024/02/02 07:53:39 payload_common.go:106: INFO Extracting d>
Feb 02 07:53:41 justin-ArchEndos ollama[48332]: 2024/02/02 07:53:41 payload_common.go:145: INFO Dynamic LLM >
Feb 02 07:53:41 justin-ArchEndos ollama[48332]: 2024/02/02 07:53:41 gpu.go:94: INFO Detecting GPU type
Feb 02 07:53:41 justin-ArchEndos ollama[48332]: 2024/02/02 07:53:41 gpu.go:236: INFO Searching for GPU manag>
Feb 02 07:53:41 justin-ArchEndos ollama[48332]: 2024/02/02 07:53:41 gpu.go:282: INFO Discovered GPU librarie>
Feb 02 07:53:41 justin-ArchEndos ollama[48332]: 2024/02/02 07:53:41 gpu.go:236: INFO Searching for GPU manag>
Feb 02 07:53:41 justin-ArchEndos ollama[48332]: 2024/02/02 07:53:41 gpu.go:282: INFO Discovered GPU librarie>
Feb 02 07:53:41 justin-ArchEndos ollama[48332]: 2024/02/02 07:53:41 gpu.go:109: INFO Radeon GPU detected
Feb 02 07:57:16 justin-ArchEndos systemd[1]: Stopping Ollama Service...
Feb 02 07:57:16 justin-ArchEndos systemd[1]: ollama.service: Deactivated successfully.
Feb 02 07:57:16 justin-ArchEndos systemd[1]: Stopped Ollama Service.
Feb 02 07:57:16 justin-ArchEndos systemd[1]: ollama.service: Consumed 3.370s CPU time, 469.4M memory peak, 0>
Feb 02 07:57:16 justin-ArchEndos systemd[1]: Started Ollama Service.
Feb 02 07:57:16 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:16 images.go:857: INFO total blobs: 6
Feb 02 07:57:16 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:16 images.go:864: INFO total unused blobs r>
Feb 02 07:57:16 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:16 routes.go:950: INFO Listening on 127.0.0>
Feb 02 07:57:16 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:16 payload_common.go:106: INFO Extracting d>
Feb 02 07:57:18 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:18 payload_common.go:145: INFO Dynamic LLM >
Feb 02 07:57:18 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:18 gpu.go:94: INFO Detecting GPU type
Feb 02 07:57:18 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:18 gpu.go:236: INFO Searching for GPU manag>
Feb 02 07:57:18 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:18 gpu.go:282: INFO Discovered GPU librarie>
Feb 02 07:57:18 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:18 gpu.go:236: INFO Searching for GPU manag>
Feb 02 07:57:18 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:18 gpu.go:282: INFO Discovered GPU librarie>
Feb 02 07:57:18 justin-ArchEndos ollama[50808]: 2024/02/02 07:57:18 gpu.go:109: INFO Radeon GPU detected
Feb 02 07:57:52 justin-ArchEndos ollama[50808]: [GIN] 2024/02/02 - 07:57:52 | 200 | 20.589µs | 12>
Feb 02 07:57:52 justin-ArchEndos ollama[50808]: [GIN] 2024/02/02 - 07:57:52 | 200 | 304.326µs | 12>
Feb 02 07:58:02 justin-ArchEndos ollama[50808]: [GIN] 2024/02/02 - 07:58:02 | 200 | 12.744µs | 12>
Feb 02 07:58:02 justin-ArchEndos ollama[50808]: [GIN] 2024/02/02 - 07:58:02 | 200 | 353.62µs | 12>
Feb 02 07:58:02 justin-ArchEndos ollama[50808]: [GIN] 2024/02/02 - 07:58:02 | 200 | 190.992µs | 12>
Feb 02 07:58:02 justin-ArchEndos ollama[50808]: 2024/02/02 07:58:02 cpu_common.go:11: INFO CPU has AVX2
Feb 02 07:58:02 justin-ArchEndos ollama[50808]: 2024/02/02 07:58:02 dyn_ext_server.go:90: INFO Loading Dynam>
Feb 02 07:58:02 justin-ArchEndos ollama[50808]: 2024/02/02 07:58:02 dyn_ext_server.go:145: INFO Initializing>
Feb 02 07:58:02 justin-ArchEndos ollama[50808]: free(): invalid pointer
Feb 02 07:58:04 justin-ArchEndos systemd[1]: ollama.service: Main process exited, code=dumped, status=6/ABRT
Feb 02 07:58:04 justin-ArchEndos systemd[1]: ollama.service: Failed with result 'core-dump'.
Feb 02 07:58:04 justin-ArchEndos systemd[1]: ollama.service: Consumed 4.134s CPU time.
Feb 02 07:58:07 justin-ArchEndos systemd[1]: ollama.service: Scheduled restart job, restart counter is at 1.
Feb 02 07:58:07 justin-ArchEndos systemd[1]: Started Ollama Service.
Feb 02 07:58:07 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:07 images.go:857: INFO total blobs: 6
Feb 02 07:58:07 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:07 images.go:864: INFO total unused blobs r>
Feb 02 07:58:07 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:07 routes.go:950: INFO Listening on 127.0.0>
Feb 02 07:58:07 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:07 payload_common.go:106: INFO Extracting d>
Feb 02 07:58:09 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:09 payload_common.go:145: INFO Dynamic LLM >
Feb 02 07:58:09 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:09 gpu.go:94: INFO Detecting GPU type
Feb 02 07:58:09 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:09 gpu.go:236: INFO Searching for GPU manag>
Feb 02 07:58:09 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:09 gpu.go:282: INFO Discovered GPU librarie>
Feb 02 07:58:09 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:09 gpu.go:236: INFO Searching for GPU manag>
Feb 02 07:58:09 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:09 gpu.go:282: INFO Discovered GPU librarie>
Feb 02 07:58:09 justin-ArchEndos ollama[51400]: 2024/02/02 07:58:09 gpu.go:109: INFO Radeon GPU detected
Feb 02 07:58:09 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 07:58:09 | 404 | 2.615µs | 12>
Feb 02 08:00:40 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:00:40 | 200 | 28.915µs | 12>
Feb 02 08:00:40 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:00:40 | 200 | 55.099782ms | 12>
Feb 02 08:00:49 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:00:49 | 200 | 16.02µs | 12>
Feb 02 08:00:49 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:00:49 | 200 | 126.881µs | 12>
Feb 02 08:00:59 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:00:59 | 200 | 14.598µs | 12>
Feb 02 08:00:59 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:00:59 | 404 | 69.532µs | 12>
Feb 02 08:01:05 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:01:05 | 200 | 15.961µs | 12>
Feb 02 08:01:05 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:01:05 | 404 | 80.633µs | 12>
Feb 02 08:01:07 justin-ArchEndos ollama[51400]: 2024/02/02 08:01:07 download.go:123: INFO downloading e8a35b>
Feb 02 08:01:31 justin-ArchEndos ollama[51400]: 2024/02/02 08:01:31 images.go:1187: INFO request failed: Get>
Feb 02 08:01:31 justin-ArchEndos ollama[51400]: 2024/02/02 08:01:31 download.go:162: INFO e8a35b5937a5 part >
Feb 02 08:01:31 justin-ArchEndos ollama[51400]: 2024/02/02 08:01:31 images.go:1187: INFO request failed: Get>
Feb 02 08:01:31 justin-ArchEndos ollama[51400]: 2024/02/02 08:01:31 download.go:162: INFO e8a35b5937a5 part >
Feb 02 08:01:33 justin-ArchEndos ollama[51400]: 2024/02/02 08:01:33 images.go:1187: INFO request failed: Get>
Feb 02 08:01:33 justin-ArchEndos ollama[51400]: 2024/02/02 08:01:33 download.go:162: INFO e8a35b5937a5 part >
Feb 02 08:02:39 justin-ArchEndos ollama[51400]: 2024/02/02 08:02:39 download.go:162: INFO e8a35b5937a5 part >
Feb 02 08:02:44 justin-ArchEndos ollama[51400]: 2024/02/02 08:02:44 download.go:123: INFO downloading 43070e>
Feb 02 08:02:46 justin-ArchEndos ollama[51400]: 2024/02/02 08:02:46 download.go:123: INFO downloading e68360>
Feb 02 08:02:48 justin-ArchEndos ollama[51400]: 2024/02/02 08:02:48 download.go:123: INFO downloading ed11ed>
Feb 02 08:02:50 justin-ArchEndos ollama[51400]: 2024/02/02 08:02:50 download.go:123: INFO downloading f9b1e3>
Feb 02 08:02:54 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:02:54 | 200 | 1m49s | 12>
Feb 02 08:02:54 justin-ArchEndos ollama[51400]: [GIN] 2024/02/02 - 08:02:54 | 200 | 298.176µs | 12>
Feb 02 08:02:55 justin-ArchEndos ollama[51400]: 2024/02/02 08:02:55 cpu_common.go:11: INFO CPU has AVX2
Feb 02 08:02:55 justin-ArchEndos ollama[51400]: 2024/02/02 08:02:55 dyn_ext_server.go:90: INFO Loading Dynam>
Feb 02 08:02:55 justin-ArchEndos ollama[51400]: 2024/02/02 08:02:55 dyn_ext_server.go:145: INFO Initializing>
Feb 02 08:02:55 justin-ArchEndos ollama[51400]: free(): invalid pointer
Feb 02 08:02:57 justin-ArchEndos systemd[1]: ollama.service: Main process exited, code=dumped, status=6/ABRT
Feb 02 08:02:57 justin-ArchEndos systemd[1]: ollama.service: Failed with result 'core-dump'.
Feb 02 08:02:57 justin-ArchEndos systemd[1]: ollama.service: Consumed 15.602s CPU time.
Feb 02 08:03:00 justin-ArchEndos systemd[1]: ollama.service: Scheduled restart job, restart counter is at 2.
Feb 02 08:03:00 justin-ArchEndos systemd[1]: Started Ollama Service.
Feb 02 08:03:01 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:01 images.go:857: INFO total blobs: 5
Feb 02 08:03:01 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:01 images.go:864: INFO total unused blobs r>
Feb 02 08:03:01 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:01 routes.go:950: INFO Listening on 127.0.0>
Feb 02 08:03:01 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:01 payload_common.go:106: INFO Extracting d>
Feb 02 08:03:03 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:03 payload_common.go:145: INFO Dynamic LLM >
Feb 02 08:03:03 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:03 gpu.go:94: INFO Detecting GPU type
Feb 02 08:03:03 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:03 gpu.go:236: INFO Searching for GPU manag>
Feb 02 08:03:03 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:03 gpu.go:282: INFO Discovered GPU librarie>
Feb 02 08:03:03 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:03 gpu.go:236: INFO Searching for GPU manag>
Feb 02 08:03:03 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:03 gpu.go:282: INFO Discovered GPU librarie>
Feb 02 08:03:03 justin-ArchEndos ollama[55270]: 2024/02/02 08:03:03 gpu.go:109: INFO Radeon GPU detected
```
|
{
"login": "jusgrave",
"id": 528103,
"node_id": "MDQ6VXNlcjUyODEwMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/528103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jusgrave",
"html_url": "https://github.com/jusgrave",
"followers_url": "https://api.github.com/users/jusgrave/followers",
"following_url": "https://api.github.com/users/jusgrave/following{/other_user}",
"gists_url": "https://api.github.com/users/jusgrave/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jusgrave/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jusgrave/subscriptions",
"organizations_url": "https://api.github.com/users/jusgrave/orgs",
"repos_url": "https://api.github.com/users/jusgrave/repos",
"events_url": "https://api.github.com/users/jusgrave/events{/privacy}",
"received_events_url": "https://api.github.com/users/jusgrave/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2327/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2327/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8163
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8163/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8163/comments
|
https://api.github.com/repos/ollama/ollama/issues/8163/events
|
https://github.com/ollama/ollama/pull/8163
| 2,748,813,877
|
PR_kwDOJ0Z1Ps6Fs9s_
| 8,163
|
Reuse type InvalidModelNameErrMsg, unify the const parameters.
|
{
"login": "zhanluxianshen",
"id": 161462588,
"node_id": "U_kgDOCZ-5PA",
"avatar_url": "https://avatars.githubusercontent.com/u/161462588?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhanluxianshen",
"html_url": "https://github.com/zhanluxianshen",
"followers_url": "https://api.github.com/users/zhanluxianshen/followers",
"following_url": "https://api.github.com/users/zhanluxianshen/following{/other_user}",
"gists_url": "https://api.github.com/users/zhanluxianshen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhanluxianshen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhanluxianshen/subscriptions",
"organizations_url": "https://api.github.com/users/zhanluxianshen/orgs",
"repos_url": "https://api.github.com/users/zhanluxianshen/repos",
"events_url": "https://api.github.com/users/zhanluxianshen/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhanluxianshen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-12-18T21:52:07
| 2024-12-23T21:55:18
| 2024-12-23T15:38:34
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8163",
"html_url": "https://github.com/ollama/ollama/pull/8163",
"diff_url": "https://github.com/ollama/ollama/pull/8163.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8163.patch",
"merged_at": "2024-12-23T15:38:34"
}
|
Reuse the InvalidModelNameErrMsg type and unify the const parameters.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8163/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8163/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3026
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3026/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3026/comments
|
https://api.github.com/repos/ollama/ollama/issues/3026/events
|
https://github.com/ollama/ollama/pull/3026
| 2,177,369,551
|
PR_kwDOJ0Z1Ps5pJvcI
| 3,026
|
Doc how to set up ROCm builds on windows
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-09T19:30:02
| 2024-03-09T22:17:23
| 2024-03-09T22:17:20
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3026",
"html_url": "https://github.com/ollama/ollama/pull/3026",
"diff_url": "https://github.com/ollama/ollama/pull/3026.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3026.patch",
"merged_at": "2024-03-09T22:17:20"
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3026/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3026/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7220
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7220/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7220/comments
|
https://api.github.com/repos/ollama/ollama/issues/7220/events
|
https://github.com/ollama/ollama/issues/7220
| 2,590,509,344
|
I_kwDOJ0Z1Ps6aaAkg
| 7,220
|
Reference failure for user uploaded models in OpenAI compatible API
|
{
"login": "aretrace",
"id": 83451189,
"node_id": "MDQ6VXNlcjgzNDUxMTg5",
"avatar_url": "https://avatars.githubusercontent.com/u/83451189?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aretrace",
"html_url": "https://github.com/aretrace",
"followers_url": "https://api.github.com/users/aretrace/followers",
"following_url": "https://api.github.com/users/aretrace/following{/other_user}",
"gists_url": "https://api.github.com/users/aretrace/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aretrace/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aretrace/subscriptions",
"organizations_url": "https://api.github.com/users/aretrace/orgs",
"repos_url": "https://api.github.com/users/aretrace/repos",
"events_url": "https://api.github.com/users/aretrace/events{/privacy}",
"received_events_url": "https://api.github.com/users/aretrace/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-10-16T03:43:35
| 2024-10-16T23:34:52
| 2024-10-16T23:34:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I call `adrienbrault/nous-hermes2theta-llama3-8b` via the OpenAI Node SDK, I get a 500 with a `model ... not found, try pulling it first` error (I have already pulled the model and chatted with it).
I did a simple curl test:
```zsh
curl -X POST http://localhost:11434/api/generate -d '{
"model": "adrienbrault/nous-hermes2theta-llama3-8b",
"prompt": "Hello, how are you?"
}'
```
Same 500 error (other `library` models are working).
A poke at `http://localhost:11434/v1/models` shows it's there:
```json
{
"object": "list",
"data": [
{
"id": "adrienbrault/nous-hermes2theta-llama3-8b:q4_K_M",
"object": "model",
"created": 1729046834,
"owned_by": "adrienbrault"
},
...
}
```
I have tried several logical permutations of the model name but none have worked.
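One workaround worth trying (an assumption based on the `/v1/models` listing above, not a confirmed fix) is to request the fully tagged id, e.g. `adrienbrault/nous-hermes2theta-llama3-8b:q4_K_M`. A minimal sketch of a helper that fills in the tag from the listing; the function name `qualified_name` is hypothetical:

```python
def qualified_name(model_id: str, requested: str) -> str:
    """Return the fully tagged model name from /v1/models if the
    requested name is missing its tag (e.g. ':q4_K_M')."""
    base, _, tag = model_id.partition(":")
    if requested == base and tag:
        return model_id
    return requested
```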
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.13
|
{
"login": "aretrace",
"id": 83451189,
"node_id": "MDQ6VXNlcjgzNDUxMTg5",
"avatar_url": "https://avatars.githubusercontent.com/u/83451189?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aretrace",
"html_url": "https://github.com/aretrace",
"followers_url": "https://api.github.com/users/aretrace/followers",
"following_url": "https://api.github.com/users/aretrace/following{/other_user}",
"gists_url": "https://api.github.com/users/aretrace/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aretrace/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aretrace/subscriptions",
"organizations_url": "https://api.github.com/users/aretrace/orgs",
"repos_url": "https://api.github.com/users/aretrace/repos",
"events_url": "https://api.github.com/users/aretrace/events{/privacy}",
"received_events_url": "https://api.github.com/users/aretrace/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7220/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7220/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4798
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4798/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4798/comments
|
https://api.github.com/repos/ollama/ollama/issues/4798/events
|
https://github.com/ollama/ollama/issues/4798
| 2,331,505,749
|
I_kwDOJ0Z1Ps6K9_RV
| 4,798
|
The ROCm driver for the RX 7900 XTX has been installed but cannot be used normally.
|
{
"login": "HaoZhang66",
"id": 166794154,
"node_id": "U_kgDOCfETqg",
"avatar_url": "https://avatars.githubusercontent.com/u/166794154?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HaoZhang66",
"html_url": "https://github.com/HaoZhang66",
"followers_url": "https://api.github.com/users/HaoZhang66/followers",
"following_url": "https://api.github.com/users/HaoZhang66/following{/other_user}",
"gists_url": "https://api.github.com/users/HaoZhang66/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HaoZhang66/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HaoZhang66/subscriptions",
"organizations_url": "https://api.github.com/users/HaoZhang66/orgs",
"repos_url": "https://api.github.com/users/HaoZhang66/repos",
"events_url": "https://api.github.com/users/HaoZhang66/events{/privacy}",
"received_events_url": "https://api.github.com/users/HaoZhang66/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
},
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 5
| 2024-06-03T15:50:54
| 2024-09-24T18:29:47
| 2024-06-18T21:51:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Error: llama runner process has terminated: signal: aborted (core dumped) error:Could not initialize Tensile host: No devices found
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4798/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4798/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1422
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1422/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1422/comments
|
https://api.github.com/repos/ollama/ollama/issues/1422/events
|
https://github.com/ollama/ollama/issues/1422
| 2,031,614,653
|
I_kwDOJ0Z1Ps55F_q9
| 1,422
|
original Unraid container gone?
|
{
"login": "ktfcaptain",
"id": 66204113,
"node_id": "MDQ6VXNlcjY2MjA0MTEz",
"avatar_url": "https://avatars.githubusercontent.com/u/66204113?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ktfcaptain",
"html_url": "https://github.com/ktfcaptain",
"followers_url": "https://api.github.com/users/ktfcaptain/followers",
"following_url": "https://api.github.com/users/ktfcaptain/following{/other_user}",
"gists_url": "https://api.github.com/users/ktfcaptain/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ktfcaptain/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ktfcaptain/subscriptions",
"organizations_url": "https://api.github.com/users/ktfcaptain/orgs",
"repos_url": "https://api.github.com/users/ktfcaptain/repos",
"events_url": "https://api.github.com/users/ktfcaptain/events{/privacy}",
"received_events_url": "https://api.github.com/users/ktfcaptain/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-12-07T22:14:21
| 2023-12-07T23:16:05
| 2023-12-07T23:16:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I installed a version (for lack of a better word) of Ollama on an Unraid server a couple of weeks ago, and now it's gone from the community store. There was a replacement titled "Chatbot-Ollama", but I have not been able to get that version running correctly on Unraid.
So on one of my Unraid servers I have this (I think; Docker Hub still points here) build, with "ollama-webui" working great. On my other Unraid server, the replacement "Chatbot-Ollama" and "ollama-webui" will not work so far. I attempted manually installing this build with the same settings that are working on my original server, but can't get the webui to talk to the server.
I'll keep troubleshooting; I was just dumbfounded that the version that worked perfectly on the first attempt vanished. I'm afraid to make any changes now lol.
Documenting my thoughts more than anything:
first install: https://i.imgur.com/knl7zSm.png
manual reproduction: https://i.imgur.com/EH0P4ki.png
replacement: https://i.imgur.com/dLH1IMo.png
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1422/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1422/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/8202
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8202/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8202/comments
|
https://api.github.com/repos/ollama/ollama/issues/8202/events
|
https://github.com/ollama/ollama/issues/8202
| 2,754,266,775
|
I_kwDOJ0Z1Ps6kKsaX
| 8,202
|
Request to support whisper and fast-whisper
|
{
"login": "jwnder",
"id": 24688121,
"node_id": "MDQ6VXNlcjI0Njg4MTIx",
"avatar_url": "https://avatars.githubusercontent.com/u/24688121?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jwnder",
"html_url": "https://github.com/jwnder",
"followers_url": "https://api.github.com/users/jwnder/followers",
"following_url": "https://api.github.com/users/jwnder/following{/other_user}",
"gists_url": "https://api.github.com/users/jwnder/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jwnder/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jwnder/subscriptions",
"organizations_url": "https://api.github.com/users/jwnder/orgs",
"repos_url": "https://api.github.com/users/jwnder/repos",
"events_url": "https://api.github.com/users/jwnder/events{/privacy}",
"received_events_url": "https://api.github.com/users/jwnder/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-21T17:20:20
| 2024-12-23T01:11:25
| 2024-12-23T01:11:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Request to support Whisper models, especially fast implementations like faster-whisper, as Groq does:
API request example used by Groq, where `audio_buffer` is the audio buffer:
```python
import requests

url = "https://api.groq.com/openai/v1/audio/transcriptions"
files = {
    "file": ("recorded_audio.wav", audio_buffer, "audio/wav")
}
data = {
    "model": "whisper-large-v3",
    "response_format": "verbose_json",
    "temperature": 0
}
headers = {
    "Authorization": "Bearer <api key>"
}
response = requests.post(url, files=files, data=data, headers=headers)
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8202/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8202/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5993
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5993/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5993/comments
|
https://api.github.com/repos/ollama/ollama/issues/5993/events
|
https://github.com/ollama/ollama/issues/5993
| 2,432,950,689
|
I_kwDOJ0Z1Ps6RA-Gh
| 5,993
|
Tool calls streaming
|
{
"login": "jacob-ebey",
"id": 12063586,
"node_id": "MDQ6VXNlcjEyMDYzNTg2",
"avatar_url": "https://avatars.githubusercontent.com/u/12063586?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jacob-ebey",
"html_url": "https://github.com/jacob-ebey",
"followers_url": "https://api.github.com/users/jacob-ebey/followers",
"following_url": "https://api.github.com/users/jacob-ebey/following{/other_user}",
"gists_url": "https://api.github.com/users/jacob-ebey/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jacob-ebey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jacob-ebey/subscriptions",
"organizations_url": "https://api.github.com/users/jacob-ebey/orgs",
"repos_url": "https://api.github.com/users/jacob-ebey/repos",
"events_url": "https://api.github.com/users/jacob-ebey/events{/privacy}",
"received_events_url": "https://api.github.com/users/jacob-ebey/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-07-26T20:41:57
| 2024-09-04T04:23:03
| 2024-09-04T04:23:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Looking through PRs and issues around tool calls + streaming, it seems we are overlooking that when a tool call is encountered, the body of the tool call should be discriminated from the "chat response".
We need to be able to implement the following high-level logic:
```js
for await (const chunk of response) {
  if (chunk is tool response) {
    storeAndCallToolLater(chunk);
  } else {
    showChatResponseToUser(chunk);
  }
}
```
I may be overlooking something, so please let me know if I am.
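The discrimination asked for above can be sketched in Python against mocked chunks; the chunk shape (a `message` dict with optional `content` and `tool_calls` keys) mirrors Ollama's non-streaming `/api/chat` response and is an assumption for the streaming case:

```python
def route_chunks(chunks):
    """Split a stream of chat chunks into pending tool calls and
    user-visible text. Assumed chunk shape: {'message': {'content': ...,
    'tool_calls': [...]}}, following Ollama's /api/chat response."""
    pending_tools, visible_text = [], []
    for chunk in chunks:
        msg = chunk.get("message", {})
        if msg.get("tool_calls"):
            pending_tools.extend(msg["tool_calls"])  # store, call later
        elif msg.get("content"):
            visible_text.append(msg["content"])      # show to the user
    return pending_tools, "".join(visible_text)
```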
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5993/reactions",
"total_count": 5,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/5993/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6672
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6672/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6672/comments
|
https://api.github.com/repos/ollama/ollama/issues/6672/events
|
https://github.com/ollama/ollama/issues/6672
| 2,509,896,125
|
I_kwDOJ0Z1Ps6Vmfm9
| 6,672
|
Inconsistent `prompt_eval_count` for Large Prompts in Ollama Python Library
|
{
"login": "surajyadav91",
"id": 86972819,
"node_id": "MDQ6VXNlcjg2OTcyODE5",
"avatar_url": "https://avatars.githubusercontent.com/u/86972819?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/surajyadav91",
"html_url": "https://github.com/surajyadav91",
"followers_url": "https://api.github.com/users/surajyadav91/followers",
"following_url": "https://api.github.com/users/surajyadav91/following{/other_user}",
"gists_url": "https://api.github.com/users/surajyadav91/gists{/gist_id}",
"starred_url": "https://api.github.com/users/surajyadav91/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/surajyadav91/subscriptions",
"organizations_url": "https://api.github.com/users/surajyadav91/orgs",
"repos_url": "https://api.github.com/users/surajyadav91/repos",
"events_url": "https://api.github.com/users/surajyadav91/events{/privacy}",
"received_events_url": "https://api.github.com/users/surajyadav91/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-06T08:57:48
| 2024-09-06T09:25:36
| 2024-09-06T09:25:36
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
### Inconsistent `prompt_eval_count` for Large Prompts in Ollama Python Library
For larger prompts, when using the Ollama Python library with the `llama3.1:8b-instruct-fp16` model, `prompt_eval_count` remains constant at a fixed value (1026 tokens), even when the input prompt size varies significantly. This behavior is observed when using the `ollama.chat()` method.
```python
import ollama

# prompt_template and model are defined elsewhere in the script
def classify_incident(row):
    full_prompt = (
        prompt_template +
        row['user_message']
    )
    response = ollama.chat(model=model, options={'temperature': 0.01}, messages=[
        {
            'role': 'user',
            'content': full_prompt
        }
    ])
    total_token = (response['prompt_eval_count'], response['eval_count'],
                   response['prompt_eval_count'] + response['eval_count'])
    print(f'Tokens: {total_token}\n'
          f'Total_prompt_length: {len(full_prompt)}\n'
          f'{"=" * 50}\n')
```
## Sample output
```
Tokens: (1026, 15, 1041)
Total_prompt_length: 57788
Tokens: (1026, 20, 1046)
Total_prompt_length: 57172
Tokens: (1026, 18, 1044)
Total_prompt_length: 57744
```
## Current Behavior
- `prompt_eval_count` consistently returns the same value (1026), regardless of the actual prompt length.
- `eval_count` (output tokens) varies as expected (this might also settle at a fixed value once longer text is generated).
## Expected Behavior
- `prompt_eval_count` should accurately reflect the number of tokens in the input prompt.
- The value should change dynamically based on the input size and content.
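One plausible explanation (an assumption, not confirmed here) is that the prompt is being truncated to fit the context window, whose default in Ollama is 2048 tokens. A rough chars-per-token estimate can flag prompts that likely exceed `num_ctx`; the chars/4 ratio is a common approximation, not an exact tokenizer:

```python
def likely_truncated(prompt: str, num_ctx: int = 2048,
                     chars_per_token: float = 4.0) -> bool:
    """Heuristic: flag prompts whose estimated token count exceeds the
    context window (default num_ctx is 2048; chars/4 approximates tokens)."""
    return len(prompt) / chars_per_token > num_ctx
```

A 57,788-character prompt (as in the sample output) estimates to roughly 14,000 tokens, far beyond the default window, which would explain the constant `prompt_eval_count`.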
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.9
|
{
"login": "surajyadav91",
"id": 86972819,
"node_id": "MDQ6VXNlcjg2OTcyODE5",
"avatar_url": "https://avatars.githubusercontent.com/u/86972819?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/surajyadav91",
"html_url": "https://github.com/surajyadav91",
"followers_url": "https://api.github.com/users/surajyadav91/followers",
"following_url": "https://api.github.com/users/surajyadav91/following{/other_user}",
"gists_url": "https://api.github.com/users/surajyadav91/gists{/gist_id}",
"starred_url": "https://api.github.com/users/surajyadav91/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/surajyadav91/subscriptions",
"organizations_url": "https://api.github.com/users/surajyadav91/orgs",
"repos_url": "https://api.github.com/users/surajyadav91/repos",
"events_url": "https://api.github.com/users/surajyadav91/events{/privacy}",
"received_events_url": "https://api.github.com/users/surajyadav91/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6672/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6672/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3064
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3064/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3064/comments
|
https://api.github.com/repos/ollama/ollama/issues/3064/events
|
https://github.com/ollama/ollama/pull/3064
| 2,180,034,277
|
PR_kwDOJ0Z1Ps5pSufm
| 3,064
|
update llama.cpp submodule to `ceca1ae`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-11T19:24:57
| 2024-03-11T19:57:49
| 2024-03-11T19:57:48
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3064",
"html_url": "https://github.com/ollama/ollama/pull/3064",
"diff_url": "https://github.com/ollama/ollama/pull/3064.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3064.patch",
"merged_at": "2024-03-11T19:57:48"
}
|
Fixes https://github.com/ollama/ollama/issues/3058
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3064/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3064/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7980
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7980/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7980/comments
|
https://api.github.com/repos/ollama/ollama/issues/7980/events
|
https://github.com/ollama/ollama/pull/7980
| 2,724,120,612
|
PR_kwDOJ0Z1Ps6EYnJJ
| 7,980
|
prompt: Don't trim whitespace from prompts
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-06T23:24:01
| 2024-12-09T19:02:57
| 2024-12-09T19:02:55
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7980",
"html_url": "https://github.com/ollama/ollama/pull/7980",
"diff_url": "https://github.com/ollama/ollama/pull/7980.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7980.patch",
"merged_at": "2024-12-09T19:02:55"
}
|
New lines can be an important part of a user's prompt, and trimming them can alter the results. We previously only trimmed prompts with images, but refactoring brought this behavior to all prompts, where it became more noticeable.
The /generate endpoint adds less whitespace and therefore doesn't need to trim it out - this brings the same behavior to /chat.
Thanks to @gabe-l-hart for spotting the issue!
Fixes #7795
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7980/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 2,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7980/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7758
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7758/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7758/comments
|
https://api.github.com/repos/ollama/ollama/issues/7758/events
|
https://github.com/ollama/ollama/issues/7758
| 2,674,714,085
|
I_kwDOJ0Z1Ps6fbOXl
| 7,758
|
OLLAMA_MAX_QUEUE does not limit requests to the same model
|
{
"login": "yyx1111",
"id": 20674290,
"node_id": "MDQ6VXNlcjIwNjc0Mjkw",
"avatar_url": "https://avatars.githubusercontent.com/u/20674290?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yyx1111",
"html_url": "https://github.com/yyx1111",
"followers_url": "https://api.github.com/users/yyx1111/followers",
"following_url": "https://api.github.com/users/yyx1111/following{/other_user}",
"gists_url": "https://api.github.com/users/yyx1111/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yyx1111/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yyx1111/subscriptions",
"organizations_url": "https://api.github.com/users/yyx1111/orgs",
"repos_url": "https://api.github.com/users/yyx1111/repos",
"events_url": "https://api.github.com/users/yyx1111/events{/privacy}",
"received_events_url": "https://api.github.com/users/yyx1111/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-11-20T07:13:20
| 2024-11-21T17:17:44
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
It seems that OLLAMA_MAX_QUEUE is not taking effect. My environment is Windows 11; I have set OLLAMA_NUM_PARALLEL=1 and OLLAMA_MAX_QUEUE=1, but excess requests still queue up instead of returning an error.
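For context, a minimal sketch of how I set these variables before launching the server — the values and session-scoped PowerShell syntax are illustrative, and the server has to be (re)started from the same environment for them to apply:

```powershell
# Session-scoped environment (PowerShell); use setx for persistent values.
$env:OLLAMA_NUM_PARALLEL = "1"  # process one request at a time
$env:OLLAMA_MAX_QUEUE = "1"     # allow at most one queued request; extras should be rejected
ollama serve
```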
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7758/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7758/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5738
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5738/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5738/comments
|
https://api.github.com/repos/ollama/ollama/issues/5738/events
|
https://github.com/ollama/ollama/issues/5738
| 2,412,425,509
|
I_kwDOJ0Z1Ps6PyrEl
| 5,738
|
How can I make the model produce consistent and stable results for the same prompt?
|
{
"login": "aitd0506",
"id": 166348996,
"node_id": "U_kgDOCepIxA",
"avatar_url": "https://avatars.githubusercontent.com/u/166348996?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/aitd0506",
"html_url": "https://github.com/aitd0506",
"followers_url": "https://api.github.com/users/aitd0506/followers",
"following_url": "https://api.github.com/users/aitd0506/following{/other_user}",
"gists_url": "https://api.github.com/users/aitd0506/gists{/gist_id}",
"starred_url": "https://api.github.com/users/aitd0506/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aitd0506/subscriptions",
"organizations_url": "https://api.github.com/users/aitd0506/orgs",
"repos_url": "https://api.github.com/users/aitd0506/repos",
"events_url": "https://api.github.com/users/aitd0506/events{/privacy}",
"received_events_url": "https://api.github.com/users/aitd0506/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2024-07-17T01:49:35
| 2024-07-24T08:49:17
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
As the title suggests, my goal is to have the model produce consistent, stable output for the same prompt.
I call the stock gemma2:9b Docker deployment through the ollama-python library API. I have explicitly set the temperature to 0 and a fixed seed in the Options, but it did not work: the model still produces different results across multiple calls.
Am I using it incorrectly? Has anyone encountered the same problem?
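To illustrate, this is roughly how I check it — a sketch assuming the ollama-python client; the helper names (`DETERMINISTIC_OPTIONS`, `check_determinism`) are mine, not part of any API:

```python
# Options intended to make sampling deterministic.
DETERMINISTIC_OPTIONS = {
    "temperature": 0,  # greedy decoding
    "seed": 42,        # fixed RNG seed for any remaining randomness
}

def check_determinism(generate, prompt, runs=3):
    """Call `generate(prompt)` several times and report whether all outputs match."""
    outputs = [generate(prompt) for _ in range(runs)]
    return all(o == outputs[0] for o in outputs)

# Real usage (requires a running Ollama server and `pip install ollama`):
#   import ollama
#   gen = lambda p: ollama.generate(model="gemma2:9b", prompt=p,
#                                   options=DETERMINISTIC_OPTIONS)["response"]
#   print(check_determinism(gen, "Say hello."))
```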
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
ollama version is 0.2.1
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5738/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5738/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6573
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6573/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6573/comments
|
https://api.github.com/repos/ollama/ollama/issues/6573/events
|
https://github.com/ollama/ollama/issues/6573
| 2,498,182,463
|
I_kwDOJ0Z1Ps6U5z0_
| 6,573
|
Getting Error: llama runner process has terminated: exit status 127
|
{
"login": "Blasserman",
"id": 45080630,
"node_id": "MDQ6VXNlcjQ1MDgwNjMw",
"avatar_url": "https://avatars.githubusercontent.com/u/45080630?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Blasserman",
"html_url": "https://github.com/Blasserman",
"followers_url": "https://api.github.com/users/Blasserman/followers",
"following_url": "https://api.github.com/users/Blasserman/following{/other_user}",
"gists_url": "https://api.github.com/users/Blasserman/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Blasserman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Blasserman/subscriptions",
"organizations_url": "https://api.github.com/users/Blasserman/orgs",
"repos_url": "https://api.github.com/users/Blasserman/repos",
"events_url": "https://api.github.com/users/Blasserman/events{/privacy}",
"received_events_url": "https://api.github.com/users/Blasserman/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-30T21:15:43
| 2024-08-30T21:51:04
| 2024-08-30T21:51:04
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
On two different aarch64 SBCs running Debian (one on Bookworm, one on Bullseye), both get this error when loading the llama3.1 model with Ollama 0.3.8. The previous Ollama version worked fine.
```console
dave@ai:~$ ollama -v
ollama version is 0.3.8
dave@ai:~$ ollama run llama3.1
Error: llama runner process has terminated: exit status 127
dave@ai:~$
```
### OS
Linux
### GPU
Other
### CPU
Other
### Ollama version
0.3.8
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6573/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6573/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4801
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4801/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4801/comments
|
https://api.github.com/repos/ollama/ollama/issues/4801/events
|
https://github.com/ollama/ollama/issues/4801
| 2,332,190,974
|
I_kwDOJ0Z1Ps6LAmj-
| 4,801
|
Surface the embedding dimension of a model in docs/api
|
{
"login": "Kuinox",
"id": 18743295,
"node_id": "MDQ6VXNlcjE4NzQzMjk1",
"avatar_url": "https://avatars.githubusercontent.com/u/18743295?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kuinox",
"html_url": "https://github.com/Kuinox",
"followers_url": "https://api.github.com/users/Kuinox/followers",
"following_url": "https://api.github.com/users/Kuinox/following{/other_user}",
"gists_url": "https://api.github.com/users/Kuinox/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kuinox/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kuinox/subscriptions",
"organizations_url": "https://api.github.com/users/Kuinox/orgs",
"repos_url": "https://api.github.com/users/Kuinox/repos",
"events_url": "https://api.github.com/users/Kuinox/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kuinox/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
},
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 0
| 2024-06-03T22:44:53
| 2024-11-06T17:26:39
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently, the embedding dimension of a model isn't listed in the Ollama docs or on the model card on ollama.com.
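As a workaround, the dimension can be probed at runtime from a single embeddings response (the `{"embedding": [...]}` shape follows the `/api/embeddings` endpoint; the helper name is illustrative and a running server is assumed for real use):

```python
def embedding_dimension(response: dict) -> int:
    """Return the length of the embedding vector in an /api/embeddings reply."""
    return len(response["embedding"])

# Real usage (requires a running Ollama server and `pip install ollama`):
#   import ollama
#   resp = ollama.embeddings(model="nomic-embed-text", prompt="probe")
#   print(embedding_dimension(resp))
```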
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4801/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4801/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3737
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3737/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3737/comments
|
https://api.github.com/repos/ollama/ollama/issues/3737/events
|
https://github.com/ollama/ollama/pull/3737
| 2,251,712,102
|
PR_kwDOJ0Z1Ps5tGu_m
| 3,737
|
update create handler to use model.Name
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-04-18T22:48:01
| 2024-06-05T19:05:06
| 2024-06-05T19:05:05
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3737",
"html_url": "https://github.com/ollama/ollama/pull/3737",
"diff_url": "https://github.com/ollama/ollama/pull/3737.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3737.patch",
"merged_at": "2024-06-05T19:05:05"
}
|
follow up to #3718
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3737/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3737/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2962
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2962/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2962/comments
|
https://api.github.com/repos/ollama/ollama/issues/2962/events
|
https://github.com/ollama/ollama/pull/2962
| 2,172,503,301
|
PR_kwDOJ0Z1Ps5o5LJp
| 2,962
|
cmd: add usage for `ollama run` environment variables
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-03-06T21:56:32
| 2024-03-07T21:57:08
| 2024-03-07T21:57:08
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2962",
"html_url": "https://github.com/ollama/ollama/pull/2962",
"diff_url": "https://github.com/ollama/ollama/pull/2962.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2962.patch",
"merged_at": "2024-03-07T21:57:08"
}
|
Also, fix the indentation of usage for `ollama serve` environment variables.
Fixes #2944
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2962/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2962/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4952
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4952/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4952/comments
|
https://api.github.com/repos/ollama/ollama/issues/4952/events
|
https://github.com/ollama/ollama/issues/4952
| 2,342,272,579
|
I_kwDOJ0Z1Ps6LnD5D
| 4,952
|
Modify .dockerignore to optimize Docker image build
|
{
"login": "Gabrielfernandes7",
"id": 78227127,
"node_id": "MDQ6VXNlcjc4MjI3MTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/78227127?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Gabrielfernandes7",
"html_url": "https://github.com/Gabrielfernandes7",
"followers_url": "https://api.github.com/users/Gabrielfernandes7/followers",
"following_url": "https://api.github.com/users/Gabrielfernandes7/following{/other_user}",
"gists_url": "https://api.github.com/users/Gabrielfernandes7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Gabrielfernandes7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Gabrielfernandes7/subscriptions",
"organizations_url": "https://api.github.com/users/Gabrielfernandes7/orgs",
"repos_url": "https://api.github.com/users/Gabrielfernandes7/repos",
"events_url": "https://api.github.com/users/Gabrielfernandes7/events{/privacy}",
"received_events_url": "https://api.github.com/users/Gabrielfernandes7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-06-09T13:57:22
| 2024-06-09T17:05:12
| 2024-06-09T17:05:11
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
To optimize the Docker image build and keep unnecessary files out of the build context, a `.dockerignore` file should be added to the repository.
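As an illustration, a minimal `.dockerignore` might exclude VCS data and local build artifacts — the entries below are examples, not a vetted list for this repository:

```
.git
.github
*.md
dist/
```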
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4952/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4952/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6989
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6989/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6989/comments
|
https://api.github.com/repos/ollama/ollama/issues/6989/events
|
https://github.com/ollama/ollama/pull/6989
| 2,551,529,764
|
PR_kwDOJ0Z1Ps582Pk6
| 6,989
|
llama: Exercise the new build in CI
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-09-26T21:05:00
| 2024-09-27T21:49:59
| 2024-09-27T21:49:56
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6989",
"html_url": "https://github.com/ollama/ollama/pull/6989",
"diff_url": "https://github.com/ollama/ollama/pull/6989.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6989.patch",
"merged_at": "2024-09-27T21:49:55"
}
|
Conditionally build the new runners if any code in that tree is touched.
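A sketch of that path-conditional trigger in GitHub Actions syntax (the event and path filter here are illustrative, not the actual workflow):

```yaml
# Run the runner build only when files under llama/ change.
on:
  pull_request:
    paths:
      - "llama/**"
```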
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6989/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6989/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3672
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3672/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3672/comments
|
https://api.github.com/repos/ollama/ollama/issues/3672/events
|
https://github.com/ollama/ollama/pull/3672
| 2,245,930,967
|
PR_kwDOJ0Z1Ps5szCJx
| 3,672
|
Update README.md
|
{
"login": "ovovovovisov",
"id": 110344798,
"node_id": "U_kgDOBpO6Xg",
"avatar_url": "https://avatars.githubusercontent.com/u/110344798?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ovovovovisov",
"html_url": "https://github.com/ovovovovisov",
"followers_url": "https://api.github.com/users/ovovovovisov/followers",
"following_url": "https://api.github.com/users/ovovovovisov/following{/other_user}",
"gists_url": "https://api.github.com/users/ovovovovisov/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ovovovovisov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ovovovovisov/subscriptions",
"organizations_url": "https://api.github.com/users/ovovovovisov/orgs",
"repos_url": "https://api.github.com/users/ovovovovisov/repos",
"events_url": "https://api.github.com/users/ovovovovisov/events{/privacy}",
"received_events_url": "https://api.github.com/users/ovovovovisov/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-04-16T12:33:39
| 2024-04-16T21:00:48
| 2024-04-16T21:00:47
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3672",
"html_url": "https://github.com/ollama/ollama/pull/3672",
"diff_url": "https://github.com/ollama/ollama/pull/3672.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3672.patch",
"merged_at": null
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3672/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3672/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8448
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8448/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8448/comments
|
https://api.github.com/repos/ollama/ollama/issues/8448/events
|
https://github.com/ollama/ollama/issues/8448
| 2,791,776,340
|
I_kwDOJ0Z1Ps6mZyBU
| 8,448
|
ollama create 0.5.5+ fails for models with multiple blobs
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2025-01-16T06:00:24
| 2025-01-16T08:14:08
| 2025-01-16T08:14:08
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```console
$ ollama list llama3.2-vision
NAME ID SIZE MODIFIED
llama3.2-vision:latest 085a1fdae525 7.9 GB 2 weeks ago
$ ollama show --modelfile llama3.2-vision > Modelfile
$ head Modelfile
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM llama3.2-vision:latest
FROM /usr/share/ollama/.ollama/models/blobs/sha256-11f274007f093fefeec994a5dbbb33d0733a4feb87f7ab66dcd7c1069fef0068
FROM /usr/share/ollama/.ollama/models/blobs/sha256-ece5e659647a20a5c28ab9eea1c12a1ad430bc0f2a27021d00ad103b3bf5206f
TEMPLATE """{{- range $index, $_ := .Messages }}<|start_header_id|>{{ .Role }}<|end_header_id|>
{{ .Content }}
{{- if gt (len (slice $.Messages $index)) 1 }}<|eot_id|>
$ ollama create llama3.2-vision:test
gathering model components
copying file sha256:ece5e659647a20a5c28ab9eea1c12a1ad430bc0f2a27021d00ad103b3bf5206f 100%
parsing GGUF
using existing layer sha256:ece5e659647a20a5c28ab9eea1c12a1ad430bc0f2a27021d00ad103b3bf5206f
using existing layer sha256:715415638c9c4c0cb2b78783da041b97bd1205f8b9f9494bd7e5a850cb443602
using existing layer sha256:0b4284c1f87029e67654c7953afa16279961632cf73dcfe33374c4c2f298fa35
using existing layer sha256:fefc914e46e6024467471837a48a24251db2c6f3f58395943da7bf9dc6f70fb6
writing manifest
success
$ ollama list llama3.2-vision
NAME ID SIZE MODIFIED
llama3.2-vision:test 2af40d742c9c 1.9 GB 12 seconds ago
llama3.2-vision:latest 085a1fdae525 7.9 GB 2 weeks ago
$ ollama show llama3.2-vision:test
Error: model 'llama3.2-vision:test' not found
$ cd /usr/share/ollama/.ollama/models/manifests/registry.ollama.ai/library/llama3.2-vision
$ sdiff <(jq . latest) <(jq . test)
{ {
"schemaVersion": 2, "schemaVersion": 2,
"mediaType": "application/vnd.docker.distribution.manifest. "mediaType": "application/vnd.docker.distribution.manifest.
"config": { "config": {
"mediaType": "application/vnd.docker.container.image.v1+j "mediaType": "application/vnd.docker.container.image.v1+j
"digest": "sha256:fbd313562bb706ac00f1a18c0aad8398b3c22d5 | "digest": "sha256:777ff13d51890fb3d4dbdba260f70e373dfd14b
"size": 572 | "size": 489
}, },
"layers": [ "layers": [
{ {
"mediaType": "application/vnd.ollama.image.model", <
"digest": "sha256:11f274007f093fefeec994a5dbbb33d0733a4 <
"size": 5963057248 <
}, <
{ <
"mediaType": "application/vnd.ollama.image.projector", "mediaType": "application/vnd.ollama.image.projector",
"digest": "sha256:ece5e659647a20a5c28ab9eea1c12a1ad430b "digest": "sha256:ece5e659647a20a5c28ab9eea1c12a1ad430b
"size": 1938763584, "size": 1938763584,
"from": "/Users/ollama/.ollama/models/blobs/sha256-ece5 | "from": "/usr/share/ollama/.ollama/models/blobs/sha256-
}, },
{ {
"mediaType": "application/vnd.ollama.image.template", "mediaType": "application/vnd.ollama.image.template",
"digest": "sha256:715415638c9c4c0cb2b78783da041b97bd120 "digest": "sha256:715415638c9c4c0cb2b78783da041b97bd120
"size": 269 "size": 269
}, },
{ {
"mediaType": "application/vnd.ollama.image.license", "mediaType": "application/vnd.ollama.image.license",
"digest": "sha256:0b4284c1f87029e67654c7953afa162799616 "digest": "sha256:0b4284c1f87029e67654c7953afa162799616
"size": 7712 "size": 7712
}, },
{ {
"mediaType": "application/vnd.ollama.image.params", "mediaType": "application/vnd.ollama.image.params",
"digest": "sha256:fefc914e46e6024467471837a48a24251db2c "digest": "sha256:fefc914e46e6024467471837a48a24251db2c
"size": 32 "size": 32
} }
] ]
} }
```
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.5, 0.5.6
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8448/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8448/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3994
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3994/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3994/comments
|
https://api.github.com/repos/ollama/ollama/issues/3994/events
|
https://github.com/ollama/ollama/issues/3994
| 2,267,374,664
|
I_kwDOJ0Z1Ps6HJWRI
| 3,994
|
pulling manifest Error
|
{
"login": "xinewzhou",
"id": 168318989,
"node_id": "U_kgDOCghYDQ",
"avatar_url": "https://avatars.githubusercontent.com/u/168318989?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/xinewzhou",
"html_url": "https://github.com/xinewzhou",
"followers_url": "https://api.github.com/users/xinewzhou/followers",
"following_url": "https://api.github.com/users/xinewzhou/following{/other_user}",
"gists_url": "https://api.github.com/users/xinewzhou/gists{/gist_id}",
"starred_url": "https://api.github.com/users/xinewzhou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xinewzhou/subscriptions",
"organizations_url": "https://api.github.com/users/xinewzhou/orgs",
"repos_url": "https://api.github.com/users/xinewzhou/repos",
"events_url": "https://api.github.com/users/xinewzhou/events{/privacy}",
"received_events_url": "https://api.github.com/users/xinewzhou/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-04-28T06:23:30
| 2024-05-02T18:09:35
| 2024-05-02T18:09:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
pulling manifest
Error: pull model manifest: Get "https://ollama.com/token?nonce=KuhzAPbr727aW4zzdmUWPA&scope=repository%!A(MISSING)library%!F(MISSING)llama3%!A(MISSING)pull&service=ollama.com&ts=1714285350": read tcp 192.168.2.33:60544->34.120.132.20:443: read: connection reset by peer
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
ollama version is 0.1.32
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3994/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3994/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1884
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1884/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1884/comments
|
https://api.github.com/repos/ollama/ollama/issues/1884/events
|
https://github.com/ollama/ollama/issues/1884
| 2,073,583,263
|
I_kwDOJ0Z1Ps57mF6f
| 1,884
|
Running on Windows Docker vs WSL versions
|
{
"login": "ewebgh33",
"id": 123797054,
"node_id": "U_kgDOB2D-Pg",
"avatar_url": "https://avatars.githubusercontent.com/u/123797054?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ewebgh33",
"html_url": "https://github.com/ewebgh33",
"followers_url": "https://api.github.com/users/ewebgh33/followers",
"following_url": "https://api.github.com/users/ewebgh33/following{/other_user}",
"gists_url": "https://api.github.com/users/ewebgh33/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ewebgh33/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ewebgh33/subscriptions",
"organizations_url": "https://api.github.com/users/ewebgh33/orgs",
"repos_url": "https://api.github.com/users/ewebgh33/repos",
"events_url": "https://api.github.com/users/ewebgh33/events{/privacy}",
"received_events_url": "https://api.github.com/users/ewebgh33/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2024-01-10T04:53:07
| 2024-03-13T15:10:43
| 2024-03-12T18:01:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi
Not really an issue, unless you include "more explanation" as an issue. But you guys don't have a discussion section here on GitHub.
**Is there any benefit (or con) to running in WSL vs Docker for Windows? (as there's still no sign of a Windows version coming).**
I am leaning towards WSL simply because I've had issues in the past trying to get non-docker LLM apps to communicate with docker apps and vice versa.
Docker seems simple, but the instructions aren't specific to Windows, are they? Otherwise wouldn't the Docker version count as this app being available for Windows (which the main page still says is coming soon)?
Will it be any slower or faster in docker?
I have also heard that running via WSL will use less VRAM.
Where do models get downloaded to, if we're running in either? Can we point the docker version or the WSL version to a common repo of LLM models on our drive locally?
Many other LLM apps "require" Ollama as their backend, so I really hope to start using this soon.
I have both docker desktop and WSL/Ubuntu installed already.
If I have another LLM app, say, Cheshire Cat AI, already running in Docker, maybe I would be better off running the dockerised Ollama. But then other LLM apps that do NOT run in Docker also want it. Not sure which option is going to give me the simpler setup in the long run.
Thanks!
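On the model-directory question, here is a hedged sketch of pointing both a native/WSL install and the Docker image at one shared directory. The directory path is an assumption; `OLLAMA_MODELS` and the `/root/.ollama` bind mount follow Ollama's documented defaults, but verify against your version:

```shell
# Hypothetical shared model directory; adjust to your own drive layout.
MODELS_DIR="$HOME/llm-models"
mkdir -p "$MODELS_DIR"

# Native or WSL install: OLLAMA_MODELS overrides the default model store.
export OLLAMA_MODELS="$MODELS_DIR"
echo "OLLAMA_MODELS=$OLLAMA_MODELS"

# Docker: bind-mount the same directory over the image's default store
# (run manually; needs Docker and, for GPU use, the NVIDIA container toolkit):
# docker run -d --gpus=all -v "$MODELS_DIR":/root/.ollama \
#   -p 11434:11434 --name ollama ollama/ollama
```

With both sides reading the same directory, a model pulled once should be visible to either runtime, though running both servers against it at the same time is untested here.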
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1884/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1884/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6603
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6603/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6603/comments
|
https://api.github.com/repos/ollama/ollama/issues/6603/events
|
https://github.com/ollama/ollama/issues/6603
| 2,502,156,165
|
I_kwDOJ0Z1Ps6VI9-F
| 6,603
|
Add Phi-3.5-mini-instruct
|
{
"login": "oscarwang22",
"id": 141192440,
"node_id": "U_kgDOCGps-A",
"avatar_url": "https://avatars.githubusercontent.com/u/141192440?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/oscarwang22",
"html_url": "https://github.com/oscarwang22",
"followers_url": "https://api.github.com/users/oscarwang22/followers",
"following_url": "https://api.github.com/users/oscarwang22/following{/other_user}",
"gists_url": "https://api.github.com/users/oscarwang22/gists{/gist_id}",
"starred_url": "https://api.github.com/users/oscarwang22/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/oscarwang22/subscriptions",
"organizations_url": "https://api.github.com/users/oscarwang22/orgs",
"repos_url": "https://api.github.com/users/oscarwang22/repos",
"events_url": "https://api.github.com/users/oscarwang22/events{/privacy}",
"received_events_url": "https://api.github.com/users/oscarwang22/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-09-03T07:44:00
| 2024-09-05T17:04:17
| 2024-09-05T17:04:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Add Phi-3.5-mini-instruct.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6603/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6603/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8678
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8678/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8678/comments
|
https://api.github.com/repos/ollama/ollama/issues/8678/events
|
https://github.com/ollama/ollama/issues/8678
| 2,819,618,742
|
I_kwDOJ0Z1Ps6oD_e2
| 8,678
|
Missing support for name field
|
{
"login": "gagb",
"id": 13227607,
"node_id": "MDQ6VXNlcjEzMjI3NjA3",
"avatar_url": "https://avatars.githubusercontent.com/u/13227607?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gagb",
"html_url": "https://github.com/gagb",
"followers_url": "https://api.github.com/users/gagb/followers",
"following_url": "https://api.github.com/users/gagb/following{/other_user}",
"gists_url": "https://api.github.com/users/gagb/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gagb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gagb/subscriptions",
"organizations_url": "https://api.github.com/users/gagb/orgs",
"repos_url": "https://api.github.com/users/gagb/repos",
"events_url": "https://api.github.com/users/gagb/events{/privacy}",
"received_events_url": "https://api.github.com/users/gagb/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6657611864,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjNMYWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/compatibility",
"name": "compatibility",
"color": "bfdadc",
"default": false,
"description": ""
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2025-01-30T00:20:26
| 2025-01-30T08:40:28
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
For many models, such as phi-4 and deepseek-r1, Ollama supports the OpenAI chat completion format, but it seems it does not support the `name` field in the message history; it only supports the `role` and `content` fields. Is there a plan to fix this?
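For reference, a minimal sketch of the OpenAI-style payload in question, with a `name` field on each message. The model name and endpoint are assumptions; actually sending it requires a running Ollama server:

```shell
# Build an OpenAI-style chat payload whose messages carry a "name" field
# alongside the "role" and "content" fields that Ollama does honor.
PAYLOAD='{
  "model": "phi-4",
  "messages": [
    {"role": "user", "name": "alice", "content": "Hello"},
    {"role": "user", "name": "bob", "content": "Hi, Alice"}
  ]
}'
echo "$PAYLOAD"

# To send it against Ollama's OpenAI-compatible endpoint (run manually):
# curl http://localhost:11434/v1/chat/completions \
#   -H "Content-Type: application/json" -d "$PAYLOAD"
```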
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8678/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8678/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7903
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7903/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7903/comments
|
https://api.github.com/repos/ollama/ollama/issues/7903/events
|
https://github.com/ollama/ollama/issues/7903
| 2,710,268,254
|
I_kwDOJ0Z1Ps6hi2le
| 7,903
|
Ollama missing support for tool calling stream
|
{
"login": "Teeeeg",
"id": 34847171,
"node_id": "MDQ6VXNlcjM0ODQ3MTcx",
"avatar_url": "https://avatars.githubusercontent.com/u/34847171?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Teeeeg",
"html_url": "https://github.com/Teeeeg",
"followers_url": "https://api.github.com/users/Teeeeg/followers",
"following_url": "https://api.github.com/users/Teeeeg/following{/other_user}",
"gists_url": "https://api.github.com/users/Teeeeg/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Teeeeg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Teeeeg/subscriptions",
"organizations_url": "https://api.github.com/users/Teeeeg/orgs",
"repos_url": "https://api.github.com/users/Teeeeg/repos",
"events_url": "https://api.github.com/users/Teeeeg/events{/privacy}",
"received_events_url": "https://api.github.com/users/Teeeeg/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-12-02T02:54:10
| 2024-12-02T02:57:43
| 2024-12-02T02:57:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
`async for event in agent.astream_events(**kwargs, version="v2")`
If there is a ToolNode, there is no onToolStart kind of event, causing no further updates in the response.
I am not sure whether it is a bug in LangGraph.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.4.6
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7903/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7903/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6040
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6040/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6040/comments
|
https://api.github.com/repos/ollama/ollama/issues/6040/events
|
https://github.com/ollama/ollama/issues/6040
| 2,434,768,786
|
I_kwDOJ0Z1Ps6RH5-S
| 6,040
|
[Model request] Llama3.1 `text` model (not `instruct`)
|
{
"login": "d-kleine",
"id": 53251018,
"node_id": "MDQ6VXNlcjUzMjUxMDE4",
"avatar_url": "https://avatars.githubusercontent.com/u/53251018?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/d-kleine",
"html_url": "https://github.com/d-kleine",
"followers_url": "https://api.github.com/users/d-kleine/followers",
"following_url": "https://api.github.com/users/d-kleine/following{/other_user}",
"gists_url": "https://api.github.com/users/d-kleine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/d-kleine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/d-kleine/subscriptions",
"organizations_url": "https://api.github.com/users/d-kleine/orgs",
"repos_url": "https://api.github.com/users/d-kleine/repos",
"events_url": "https://api.github.com/users/d-kleine/events{/privacy}",
"received_events_url": "https://api.github.com/users/d-kleine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-07-29T08:20:19
| 2024-08-03T22:24:23
| 2024-07-30T01:23:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
As per comment: https://github.com/ollama/ollama/issues/5919#issuecomment-2248604007
Could you please add the base Meta-Llama-3.1 models to the Ollama library?
https://ollama.com/library/llama3.1
Currently, there are only the instruction-finetuned models (`...-instruct-...`), but not the vanilla models (`...-text-...`):
https://huggingface.co/meta-llama/Meta-Llama-3.1-8B
https://huggingface.co/meta-llama/Meta-Llama-3.1-70B
https://huggingface.co/meta-llama/Meta-Llama-3.1-405B
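Until the base variants appear in the library, a local GGUF export can be imported via a Modelfile; a minimal sketch (the GGUF file name here is hypothetical):

```shell
# Write a one-line Modelfile that imports a local base-model GGUF file.
cat > Modelfile <<'EOF'
FROM ./Meta-Llama-3.1-8B.Q4_K_M.gguf
EOF
cat Modelfile

# Then build and run it (run manually; needs ollama installed):
# ollama create llama3.1-8b-base -f Modelfile
# ollama run llama3.1-8b-base
```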
|
{
"login": "d-kleine",
"id": 53251018,
"node_id": "MDQ6VXNlcjUzMjUxMDE4",
"avatar_url": "https://avatars.githubusercontent.com/u/53251018?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/d-kleine",
"html_url": "https://github.com/d-kleine",
"followers_url": "https://api.github.com/users/d-kleine/followers",
"following_url": "https://api.github.com/users/d-kleine/following{/other_user}",
"gists_url": "https://api.github.com/users/d-kleine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/d-kleine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/d-kleine/subscriptions",
"organizations_url": "https://api.github.com/users/d-kleine/orgs",
"repos_url": "https://api.github.com/users/d-kleine/repos",
"events_url": "https://api.github.com/users/d-kleine/events{/privacy}",
"received_events_url": "https://api.github.com/users/d-kleine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6040/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6040/timeline
| null |
completed
| false
|