| url (string, 51–54) | repository_url (string, 1 class) | labels_url (string, 65–68) | comments_url (string, 60–63) | events_url (string, 58–61) | html_url (string, 39–44) | id (int64, 1.78B–2.82B) | node_id (string, 18–19) | number (int64, 1–8.69k) | title (string, 1–382) | user (dict) | labels (list, 0–5) | state (string, 2 classes) | locked (bool, 1 class) | assignee (dict) | assignees (list, 0–2) | milestone (null) | comments (int64, 0–323) | created_at (timestamp[s]) | updated_at (timestamp[s]) | closed_at (timestamp[s]) | author_association (string, 4 classes) | sub_issues_summary (dict) | active_lock_reason (null) | draft (bool, 2 classes) | pull_request (dict) | body (string, 2–118k, nullable) | closed_by (dict) | reactions (dict) | timeline_url (string, 60–63) | performed_via_github_app (null) | state_reason (string, 4 classes) | is_pull_request (bool, 2 classes) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/495
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/495/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/495/comments
|
https://api.github.com/repos/ollama/ollama/issues/495/events
|
https://github.com/ollama/ollama/issues/495
| 1,887,329,211
|
I_kwDOJ0Z1Ps5wflu7
| 495
|
Build Error: Unable to Apply Patch in 'examples/server/server.cpp' during Docker Build Process
|
{
"login": "avri-schneider",
"id": 6785181,
"node_id": "MDQ6VXNlcjY3ODUxODE=",
"avatar_url": "https://avatars.githubusercontent.com/u/6785181?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/avri-schneider",
"html_url": "https://github.com/avri-schneider",
"followers_url": "https://api.github.com/users/avri-schneider/followers",
"following_url": "https://api.github.com/users/avri-schneider/following{/other_user}",
"gists_url": "https://api.github.com/users/avri-schneider/gists{/gist_id}",
"starred_url": "https://api.github.com/users/avri-schneider/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/avri-schneider/subscriptions",
"organizations_url": "https://api.github.com/users/avri-schneider/orgs",
"repos_url": "https://api.github.com/users/avri-schneider/repos",
"events_url": "https://api.github.com/users/avri-schneider/events{/privacy}",
"received_events_url": "https://api.github.com/users/avri-schneider/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 11
| 2023-09-08T09:47:09
| 2023-12-06T11:41:27
| 2023-10-28T19:34:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
**Issue Description:**
During the Docker build, applying the patches to the 'examples/server/server.cpp' file fails with a "patch does not apply" error. Investigation showed that the patches have already been applied upstream in the submodules used by the project.
**Error Details:**
```less
...<--snip-->...
1.228 go: downloading github.com/go-playground/locales v0.14.1
3.836 Submodule 'llm/llama.cpp/gguf' (https://github.com/ggerganov/llama.cpp.git) registered for path 'gguf'
3.845 Cloning into '/go/src/github.com/jmorganca/ollama/llm/llama.cpp/gguf'...
5.359 Submodule path 'ggml': checked out '9e232f0234073358e7031c1b8d7aa45020469a3b'
8.199 From https://github.com/ggerganov/llama.cpp
8.199 * branch 53885d7256909ec3e2176cdc2477f3986c15ec69 -> FETCH_HEAD
8.226 Submodule path 'gguf': checked out '53885d7256909ec3e2176cdc2477f3986c15ec69'
8.227 error: patch failed: examples/server/server.cpp:1075
8.227 error: examples/server/server.cpp: patch does not apply
8.227 llm/llama.cpp/generate.go:8: running "git": exit status 1
------
Dockerfile:7
--------------------
5 |
6 | COPY . .
7 | >>> RUN go generate ./... && go build -ldflags '-linkmode external -extldflags "-static"' .
8 |
9 | FROM alpine
--------------------
ERROR: failed to solve: process "/bin/sh -c go generate ./... && go build -ldflags '-linkmode external -extldflags \"-static\"' ." did not complete successfully: exit code: 1
```
**Solution:**
This error occurs because the patches have already been integrated upstream into the submodules used by the project, so they no longer apply. A pull request resolving this has been submitted: [Pull Request #494](https://github.com/jmorganca/ollama/pull/494).
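When patches are maintained against a moving submodule, checking whether a patch is already present avoids this failure mode. A minimal sketch, using a throwaway repository and an illustrative patch (none of the paths below are from the ollama repo):

```shell
# Sketch: detect an already-applied patch with `git apply --reverse --check`.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q .
printf 'old\n' > file.txt
git add file.txt && git -c user.email=a@b -c user.name=a commit -qm init
cat > fix.patch <<'EOF'
--- a/file.txt
+++ b/file.txt
@@ -1 +1 @@
-old
+new
EOF
git apply fix.patch                       # applies cleanly the first time
if git apply --check fix.patch 2>/dev/null; then
  echo "not yet applied"
else
  # If the reverse of the patch applies cleanly, the patch is already in the tree.
  git apply --reverse --check fix.patch && echo "already applied; skipping"
fi
```

A generate step guarded this way skips patches that the pinned submodule commit already contains instead of aborting the build.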
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/495/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/495/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1687
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1687/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1687/comments
|
https://api.github.com/repos/ollama/ollama/issues/1687/events
|
https://github.com/ollama/ollama/issues/1687
| 2,054,752,935
|
I_kwDOJ0Z1Ps56eQqn
| 1,687
|
Old Models disappear after Ollama Update (0.1.17)
|
{
"login": "sthufnagl",
"id": 1492014,
"node_id": "MDQ6VXNlcjE0OTIwMTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1492014?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sthufnagl",
"html_url": "https://github.com/sthufnagl",
"followers_url": "https://api.github.com/users/sthufnagl/followers",
"following_url": "https://api.github.com/users/sthufnagl/following{/other_user}",
"gists_url": "https://api.github.com/users/sthufnagl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sthufnagl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sthufnagl/subscriptions",
"organizations_url": "https://api.github.com/users/sthufnagl/orgs",
"repos_url": "https://api.github.com/users/sthufnagl/repos",
"events_url": "https://api.github.com/users/sthufnagl/events{/privacy}",
"received_events_url": "https://api.github.com/users/sthufnagl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 7
| 2023-12-23T10:59:30
| 2024-07-19T07:15:27
| 2023-12-26T12:10:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
**Environment:**
My environment is WSL on Windows 11.
**Update Command:**
curl https://ollama.ai/install.sh | sh
**Situation:**
After updating to Ollama 0.1.17, all my old models (202 GB) are no longer visible, and when I try to start one, the model is downloaded again. The model files are physically present on disk, but they are not listed by `ollama list` or used.
I want to avoid downloading the old models again.
**Reason for Update:**
I wanted to use Phi, but after downloading it I got some error messages, so an update of Ollama was necessary.
**Question:**
* Can I restore my old models?
* Is there an entry in a config file?
Thanks in advance
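For what it's worth, a common cause of this symptom is a model-store path difference between install methods: a manually started server keeps models under `~/.ollama/models`, while the install script's systemd service runs as the `ollama` user and typically uses `/usr/share/ollama/.ollama/models`. Both paths are assumptions to verify on this setup, not confirmed facts about it. A sketch of checking and repointing:

```shell
# Look for the old model store in both candidate locations (paths assumed).
ls "$HOME/.ollama/models" /usr/share/ollama/.ollama/models 2>/dev/null || true

# Point the server at whichever directory actually holds the old blobs,
# then restart it (e.g. `ollama serve`, or via the systemd unit).
export OLLAMA_MODELS="$HOME/.ollama/models"
```

If the files turn out to live in the service account's directory, moving or symlinking them is an alternative to setting the variable.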
|
{
"login": "sthufnagl",
"id": 1492014,
"node_id": "MDQ6VXNlcjE0OTIwMTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1492014?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sthufnagl",
"html_url": "https://github.com/sthufnagl",
"followers_url": "https://api.github.com/users/sthufnagl/followers",
"following_url": "https://api.github.com/users/sthufnagl/following{/other_user}",
"gists_url": "https://api.github.com/users/sthufnagl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sthufnagl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sthufnagl/subscriptions",
"organizations_url": "https://api.github.com/users/sthufnagl/orgs",
"repos_url": "https://api.github.com/users/sthufnagl/repos",
"events_url": "https://api.github.com/users/sthufnagl/events{/privacy}",
"received_events_url": "https://api.github.com/users/sthufnagl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1687/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/1687/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7403
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7403/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7403/comments
|
https://api.github.com/repos/ollama/ollama/issues/7403/events
|
https://github.com/ollama/ollama/issues/7403
| 2,619,039,241
|
I_kwDOJ0Z1Ps6cG14J
| 7,403
|
Memory leaks after each prompt on 6.11 kernel with nvidia GPU
|
{
"login": "regularRandom",
"id": 14252934,
"node_id": "MDQ6VXNlcjE0MjUyOTM0",
"avatar_url": "https://avatars.githubusercontent.com/u/14252934?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/regularRandom",
"html_url": "https://github.com/regularRandom",
"followers_url": "https://api.github.com/users/regularRandom/followers",
"following_url": "https://api.github.com/users/regularRandom/following{/other_user}",
"gists_url": "https://api.github.com/users/regularRandom/gists{/gist_id}",
"starred_url": "https://api.github.com/users/regularRandom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/regularRandom/subscriptions",
"organizations_url": "https://api.github.com/users/regularRandom/orgs",
"repos_url": "https://api.github.com/users/regularRandom/repos",
"events_url": "https://api.github.com/users/regularRandom/events{/privacy}",
"received_events_url": "https://api.github.com/users/regularRandom/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 20
| 2024-10-28T17:21:15
| 2024-11-18T20:05:00
| 2024-11-18T20:05:00
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
It seems Ollama has a memory leak, or does not release memory after a prompt finishes executing. I have the following in the kernel logs:
> [Mon Oct 28 13:03:00 2024] ------------[ cut here ]------------
> [Mon Oct 28 13:03:00 2024] WARNING: CPU: 38 PID: 15739 at mm/page_alloc.c:4678 __alloc_pages_noprof+0x2df/0x370
> [Mon Oct 28 13:03:00 2024] Modules linked in: tls(E) xt_nat(E) veth(E) xt_conntrack(E) xt_MASQUERADE(E) bridge(E) stp(E) llc(E) nf_conntrack_netlink(E) xt_addrtype(E) ipt_REJECT(E) nft_compat(E) tun(E) nft_masq(E) overlay(E) nf_conntrack_netbios_ns(E) nf_conntrack_broadcast(E) nft_fib_inet(E) nft_fib_ipv4(E) nft_fib_ipv6(E) nft_fib(E) nft_reject_inet(E) nf_reject_ipv4(E) nf_reject_ipv6(E) nft_reject(E) sunrpc(E) nft_ct(E) nft_chain_nat(E) nf_nat(E) nf_conntrack(E) nf_defrag_ipv6(E) nf_defrag_ipv4(E) rfkill(E) ip_set(E) nf_tables(E) nfnetlink(E) qrtr(E) binfmt_misc(E) vfat(E) fat(E) nvidia_drm(POE) nvidia_modeset(POE) nvidia_uvm(POE) nvidia(POE) intel_rapl_msr(E) intel_rapl_common(E) intel_uncore_frequency(E) intel_uncore_frequency_common(E) sb_edac(E) x86_pkg_temp_thermal(E) intel_powerclamp(E) coretemp(E) kvm_intel(E) kvm(E) raid456(E) snd_hda_codec_realtek(E) async_raid6_recov(E) async_memcpy(E) async_pq(E) async_xor(E) snd_hda_codec_generic(E) async_tx(E) xor(E) snd_hda_codec_hdmi(E) snd_hda_scodec_component(E) snd_hda_intel(E)
> [Mon Oct 28 13:03:00 2024] ucsi_ccg(E) iTCO_wdt(E) snd_intel_dspcfg(E) intel_pmc_bxt(E) snd_intel_sdw_acpi(E) iTCO_vendor_support(E) snd_hda_codec(E) snd_hda_core(E) snd_hwdep(E) snd_seq(E) snd_seq_device(E) snd_pcm(E) rapl(E) raid6_pq(E) mei_me(E) snd_timer(E) drm_kms_helper(E) intel_cstate(E) snd(E) i2c_i801(E) intel_uncore(E) mei(E) soundcore(E) pcspkr(E) i2c_mux(E) video(E) i2c_nvidia_gpu(E) i2c_smbus(E) i2c_ccgx_ucsi(E) lpc_ich(E) joydev(E) drm(E) dm_mod(E) xfs(E) libcrc32c(E) sd_mod(E) sg(E) hid_logitech_hidpp(E) crct10dif_pclmul(E) ahci(E) crc32_pclmul(E) libahci(E) crc32c_intel(E) mxm_wmi(E) ixgbe(E) polyval_clmulni(E) polyval_generic(E) libata(E) nvme(E) nvme_core(E) ghash_clmulni_intel(E) mdio(E) dca(E) wmi(E) hid_logitech_dj(E) i2c_dev(E) fuse(E)
> [Mon Oct 28 13:03:00 2024] CPU: 38 UID: 959 PID: 15739 Comm: ollama_llama_se Kdump: loaded Tainted: P OE 6.11.5-1.el9.elrepo.x86_64 #1
> [Mon Oct 28 13:03:00 2024] Tainted: [P]=PROPRIETARY_MODULE, [O]=OOT_MODULE, [E]=UNSIGNED_MODULE
> [Mon Oct 28 13:03:00 2024] Hardware name: MSI MS-7885/X99A RAIDER (MS-7885), BIOS P.71 06/13/2019
> [Mon Oct 28 13:03:00 2024] RIP: 0010:__alloc_pages_noprof+0x2df/0x370
> [Mon Oct 28 13:03:00 2024] Code: e8 76 4c fa ff e9 7b fe ff ff 83 fe 0a 0f 86 a5 fd ff ff 45 31 f6 80 3d b0 9d f7 01 00 0f 85 ca fe ff ff c6 05 a3 9d f7 01 01 <0f> 0b e9 bc fe ff ff 45 31 f6 e9 b4 fe ff ff f7 c2 00 00 80 00 75
> [Mon Oct 28 13:03:00 2024] RSP: 0018:ffffb42e4f5fbad0 EFLAGS: 00010246
> [Mon Oct 28 13:03:00 2024] RAX: 0000000000000000 RBX: 0000000000000000 RCX: 0000000000000000
> [Mon Oct 28 13:03:00 2024] RDX: 0000000000000000 RSI: 000000000000000e RDI: 0000000000000000
> [Mon Oct 28 13:03:00 2024] RBP: 0000000000040cc0 R08: 0000000000000000 R09: 0000000000000009
> [Mon Oct 28 13:03:00 2024] R10: ffffb42e4f5fbbf8 R11: 0000000000000023 R12: 000000000000000e
> [Mon Oct 28 13:03:00 2024] R13: 000000000000000e R14: 0000000000000000 R15: 0000000000000cc0
> [Mon Oct 28 13:03:00 2024] FS: 00007f920a152000(0000) GS:ffff9f365ed00000(0000) knlGS:0000000000000000
> [Mon Oct 28 13:03:00 2024] CS: 0010 DS: 0000 ES: 0000 CR0: 0000000080050033
> [Mon Oct 28 13:03:00 2024] CR2: 00007f91df908ba0 CR3: 00000002c9c3c004 CR4: 00000000003706f0
> [Mon Oct 28 13:03:00 2024] DR0: 0000000000000000 DR1: 0000000000000000 DR2: 0000000000000000
> [Mon Oct 28 13:03:00 2024] DR3: 0000000000000000 DR6: 00000000fffe0ff0 DR7: 0000000000000400
> [Mon Oct 28 13:03:00 2024] Call Trace:
> [Mon Oct 28 13:03:00 2024] <TASK>
> [Mon Oct 28 13:03:00 2024] ? __warn+0x7f/0x120
> [Mon Oct 28 13:03:00 2024] ? __alloc_pages_noprof+0x2df/0x370
> [Mon Oct 28 13:03:00 2024] ? report_bug+0x1c3/0x1d0
> [Mon Oct 28 13:03:00 2024] ? handle_bug+0x42/0x70
> [Mon Oct 28 13:03:00 2024] ? exc_invalid_op+0x14/0x70
> [Mon Oct 28 13:03:00 2024] ? asm_exc_invalid_op+0x16/0x20
> [Mon Oct 28 13:03:00 2024] ? __alloc_pages_noprof+0x2df/0x370
> [Mon Oct 28 13:03:00 2024] ___kmalloc_large_node+0x74/0x120
> [Mon Oct 28 13:03:00 2024] ? check_and_migrate_movable_pages+0x2a/0xb0
> [Mon Oct 28 13:03:00 2024] __kmalloc_large_node_noprof+0x17/0xa0
> [Mon Oct 28 13:03:00 2024] ? check_and_migrate_movable_pages+0x2a/0xb0
> [Mon Oct 28 13:03:00 2024] __kmalloc_noprof+0x2a4/0x3b0
> [Mon Oct 28 13:03:00 2024] ? check_and_migrate_movable_pages+0x2a/0xb0
> [Mon Oct 28 13:03:00 2024] check_and_migrate_movable_pages+0x2a/0xb0
> [Mon Oct 28 13:03:00 2024] __gup_longterm_locked+0x5de/0x8e0
> [Mon Oct 28 13:03:00 2024] ? __vmalloc_node_range_noprof+0x10/0x220
> [Mon Oct 28 13:03:00 2024] pin_user_pages+0x78/0xa0
> [Mon Oct 28 13:03:00 2024] os_lock_user_pages+0xaa/0x1a0 [nvidia]
> [Mon Oct 28 13:03:00 2024] _nv000676rm+0x67/0x110 [nvidia]
> [Mon Oct 28 13:03:00 2024] ? _nv000745rm+0xbe6/0xe00 [nvidia]
> [Mon Oct 28 13:03:00 2024] ? rm_ioctl+0x7f/0x400 [nvidia]
> [Mon Oct 28 13:03:00 2024] ? nvidia_ioctl.isra.0+0x53e/0x7f0 [nvidia]
> [Mon Oct 28 13:03:00 2024] ? vm_mmap_pgoff+0x124/0x1b0
> [Mon Oct 28 13:03:00 2024] ? nvidia_unlocked_ioctl+0x21/0x30 [nvidia]
> [Mon Oct 28 13:03:00 2024] ? __x64_sys_ioctl+0x8d/0xc0
> [Mon Oct 28 13:03:00 2024] ? do_syscall_64+0x60/0x180
> [Mon Oct 28 13:03:00 2024] ? entry_SYSCALL_64_after_hwframe+0x76/0x7e
> [Mon Oct 28 13:03:00 2024] </TASK>
> [Mon Oct 28 13:03:00 2024] ---[ end trace 0000000000000000 ]---
> [Mon Oct 28 13:03:00 2024] Cannot map memory with base addr 0x7f86ae000000 and size of 0x43edf1 pages
Each prompt adds 20-30 GB of RAM usage, and after 5-6 executions Ollama fails with "Cannot map memory with base addr ".
I have a 22-core Xeon CPU, 128 GB of RAM, and a 2080 Ti GPU with 11 GB of VRAM, so plenty for regular use. The OS is CentOS Stream 9 with a 6.11.5 kernel.
The behaviour is the same whether Ollama is built from source or downloaded from GitHub.
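As a sanity check on the reported numbers, the mapping size in the last kernel log line is consistent with the observed per-prompt growth: 0x43edf1 pages of 4 KiB each is about 17 GiB for the single failed mapping alone.

```shell
# Convert the failed mapping size (0x43edf1 pages of 4 KiB) to MiB.
pages=$((0x43edf1))
mib=$((pages * 4096 / 1024 / 1024))
echo "$pages pages = $mib MiB"   # roughly 17 GiB per failed mapping
```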
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14-16
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7403/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7403/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5010
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5010/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5010/comments
|
https://api.github.com/repos/ollama/ollama/issues/5010/events
|
https://github.com/ollama/ollama/issues/5010
| 2,349,811,726
|
I_kwDOJ0Z1Ps6MD0gO
| 5,010
|
Suggestion for RFC7231 Compliant Endpoint for Model Deletion
|
{
"login": "JerrettDavis",
"id": 2610199,
"node_id": "MDQ6VXNlcjI2MTAxOTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/2610199?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JerrettDavis",
"html_url": "https://github.com/JerrettDavis",
"followers_url": "https://api.github.com/users/JerrettDavis/followers",
"following_url": "https://api.github.com/users/JerrettDavis/following{/other_user}",
"gists_url": "https://api.github.com/users/JerrettDavis/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JerrettDavis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JerrettDavis/subscriptions",
"organizations_url": "https://api.github.com/users/JerrettDavis/orgs",
"repos_url": "https://api.github.com/users/JerrettDavis/repos",
"events_url": "https://api.github.com/users/JerrettDavis/events{/privacy}",
"received_events_url": "https://api.github.com/users/JerrettDavis/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 7706482389,
"node_id": "LA_kwDOJ0Z1Ps8AAAABy1eW1Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/api",
"name": "api",
"color": "bfdadc",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 0
| 2024-06-12T23:00:52
| 2024-11-06T01:22:20
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
**Description:**
Currently, the Ollama library’s DELETE /api/delete endpoint requires the model name to be provided in the request body. However, it would be beneficial and more in line with RFC7231 standards to support a URL-based model name specification. This approach is more intuitive and aligns with common RESTful API practices.
**RFC Reference:**
[RFC7231 Section 4.3.5](https://datatracker.ietf.org/doc/html/rfc7231#section-4.3.5) states: "A payload within a DELETE request message has no defined semantics; sending a payload body on a DELETE request might cause some existing implementations to reject the request." While Ollama handles this fine, certain tooling and source-code generators may take issue with it in the future, especially anything targeting an OpenAPI spec earlier than 3.1.
**Proposed Change:**
Introduce a new endpoint or modify the existing endpoint to allow specifying the model name directly in the URL path. This would make the API more user-friendly and compliant with RESTful principles, ensuring compatibility with a wider range of HTTP implementations.
**Current Implementation:**
```bash
curl -X DELETE http://localhost:11434/api/delete -d '{
"name": "llama3:13b"
}'
```
**Proposed Implementation:**
```bash
curl -X DELETE http://localhost:11434/api/models/llama3:13b
```
**Request:**
- **Endpoint:** `DELETE /api/models/{name}`
- **Parameters:** `name` (model name to delete)
- **Response:**
- 200 OK if successful
- 404 Not Found if the model to be deleted doesn’t exist
**Benefits:**
- Enhanced clarity and usability of the API.
- Improved compliance with established HTTP standards.
- Avoids potential issues with HTTP implementations that reject DELETE requests with payloads.
- Easier integration and interaction with the API for developers.
**Example:**
```bash
curl -X DELETE http://localhost:11434/api/models/llama3:13b
```
**Response:**
```json
{
"status": "success",
"message": "Model llama3:13b deleted successfully."
}
```
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5010/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5010/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3206
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3206/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3206/comments
|
https://api.github.com/repos/ollama/ollama/issues/3206/events
|
https://github.com/ollama/ollama/issues/3206
| 2,191,084,152
|
I_kwDOJ0Z1Ps6CmUp4
| 3,206
|
I successfully imported the MiniCPM-2B-dpo-bf16-gguf.gguf model into ollama and got it running, but during inference the model talks nonsense and hallucinates heavily. See the screenshot for details.
|
{
"login": "zhao1012",
"id": 38517343,
"node_id": "MDQ6VXNlcjM4NTE3MzQz",
"avatar_url": "https://avatars.githubusercontent.com/u/38517343?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zhao1012",
"html_url": "https://github.com/zhao1012",
"followers_url": "https://api.github.com/users/zhao1012/followers",
"following_url": "https://api.github.com/users/zhao1012/following{/other_user}",
"gists_url": "https://api.github.com/users/zhao1012/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zhao1012/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhao1012/subscriptions",
"organizations_url": "https://api.github.com/users/zhao1012/orgs",
"repos_url": "https://api.github.com/users/zhao1012/repos",
"events_url": "https://api.github.com/users/zhao1012/events{/privacy}",
"received_events_url": "https://api.github.com/users/zhao1012/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 13
| 2024-03-18T02:16:50
| 2024-08-09T16:08:35
| 2024-06-09T17:12:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I successfully imported the MiniCPM-2B-dpo-bf16-gguf.gguf model into ollama and got it running, but during inference the model talks nonsense and hallucinates heavily. See the screenshot for details.

_Originally posted by @zhao1012 in https://github.com/ollama/ollama/issues/2383#issuecomment-2002754353_
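Gibberish from a freshly imported GGUF is often a prompt-template mismatch rather than a broken model. A hedged sketch of a Modelfile that sets an explicit template — the MiniCPM chat markers and stop token below are assumptions taken from the model's documented chat format, and should be checked against the model card:

```shell
# Write a Modelfile with an explicit prompt template for the imported GGUF.
cat > Modelfile <<'EOF'
FROM ./MiniCPM-2B-dpo-bf16-gguf.gguf
TEMPLATE """<用户>{{ .Prompt }}<AI>"""
PARAMETER stop "<用户>"
EOF
# Then rebuild and test the model:
#   ollama create minicpm-2b -f Modelfile && ollama run minicpm-2b
```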
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3206/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3206/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7498
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7498/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7498/comments
|
https://api.github.com/repos/ollama/ollama/issues/7498/events
|
https://github.com/ollama/ollama/pull/7498
| 2,633,737,969
|
PR_kwDOJ0Z1Ps6A2LnI
| 7,498
|
CI: Switch to v13 macos runner
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-04T20:07:11
| 2024-11-04T21:04:25
| 2024-11-04T21:02:07
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7498",
"html_url": "https://github.com/ollama/ollama/pull/7498",
"diff_url": "https://github.com/ollama/ollama/pull/7498.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7498.patch",
"merged_at": "2024-11-04T21:02:07"
}
|
GitHub has started doing brown-outs of the deprecated macos-12 runner, which has blocked the 0.4.0 release CI.
I tried using Xcode 14.3.1, but it generates warnings when targeting macOS v11. I've verified that 14.1.0 generates valid v11 binaries without these warnings with our current build_darwin.sh setup.
Build logs from attempting 14.3.1:
```
ggml-blas.cpp:194:13: warning: 'cblas_sgemm' is only available on macOS 13.3 or newer [-Wunguarded-availability-new]
/Applications/Xcode_14.3.1.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/System/Library/Frameworks/vecLib.framework/Headers/cblas_new.h:891:6: note: 'cblas_sgemm' has been marked as being introduced in macOS 13.3 here, but the deployment target is macOS 13.0.0
ggml-blas.cpp:194:13: note: enclose 'cblas_sgemm' in a __builtin_available check to silence this warning
ggml-blas.cpp:259:5: warning: 'cblas_sgemm' is only available on macOS 13.3 or newer [-Wunguarded-availability-new]
/Applications/Xcode_14.3.1.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX.sdk/System/Library/Frameworks/vecLib.framework/Headers/cblas_new.h:891:6: note: 'cblas_sgemm' has been marked as being introduced in macOS 13.3 here, but the deployment target is macOS 13.0.0
ggml-blas.cpp:259:5: note: enclose 'cblas_sgemm' in a __builtin_available check to silence this warning
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7498/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7498/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7037
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7037/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7037/comments
|
https://api.github.com/repos/ollama/ollama/issues/7037/events
|
https://github.com/ollama/ollama/issues/7037
| 2,555,089,394
|
I_kwDOJ0Z1Ps6YS5Hy
| 7,037
|
ollama app not running
|
{
"login": "horyekhunley",
"id": 23106322,
"node_id": "MDQ6VXNlcjIzMTA2MzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/23106322?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/horyekhunley",
"html_url": "https://github.com/horyekhunley",
"followers_url": "https://api.github.com/users/horyekhunley/followers",
"following_url": "https://api.github.com/users/horyekhunley/following{/other_user}",
"gists_url": "https://api.github.com/users/horyekhunley/gists{/gist_id}",
"starred_url": "https://api.github.com/users/horyekhunley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/horyekhunley/subscriptions",
"organizations_url": "https://api.github.com/users/horyekhunley/orgs",
"repos_url": "https://api.github.com/users/horyekhunley/repos",
"events_url": "https://api.github.com/users/horyekhunley/events{/privacy}",
"received_events_url": "https://api.github.com/users/horyekhunley/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-09-29T18:59:47
| 2024-10-23T00:17:22
| 2024-10-23T00:17:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Normally when I install Ollama, it just works as is; I don't usually have to run `ollama serve`. But now I have to do that if I want to use Ollama, and if I don't, I get the error:
`Error: could not connect to ollama app, is it running?`
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.3.12
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7037/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7037/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8097
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8097/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8097/comments
|
https://api.github.com/repos/ollama/ollama/issues/8097/events
|
https://github.com/ollama/ollama/issues/8097
| 2,739,860,598
|
I_kwDOJ0Z1Ps6jTvR2
| 8,097
|
Add the ability to "skip vision decoder" to make it easier to support future models
|
{
"login": "vYLQs6",
"id": 143073604,
"node_id": "U_kgDOCIchRA",
"avatar_url": "https://avatars.githubusercontent.com/u/143073604?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vYLQs6",
"html_url": "https://github.com/vYLQs6",
"followers_url": "https://api.github.com/users/vYLQs6/followers",
"following_url": "https://api.github.com/users/vYLQs6/following{/other_user}",
"gists_url": "https://api.github.com/users/vYLQs6/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vYLQs6/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vYLQs6/subscriptions",
"organizations_url": "https://api.github.com/users/vYLQs6/orgs",
"repos_url": "https://api.github.com/users/vYLQs6/repos",
"events_url": "https://api.github.com/users/vYLQs6/events{/privacy}",
"received_events_url": "https://api.github.com/users/vYLQs6/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-12-14T13:00:24
| 2024-12-17T19:40:08
| 2024-12-17T19:40:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Since most AI companies now have their own vision models, in 2025 there may be more models that release only a vision variant instead of separate text and vision variants.
To make it easier to support these new models, it would be nice to be able to skip the vision decoder and run inference on the text part only.
Is this possible? It would certainly make it much easier and quicker to support new models.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8097/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8097/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1320
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1320/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1320/comments
|
https://api.github.com/repos/ollama/ollama/issues/1320/events
|
https://github.com/ollama/ollama/pull/1320
| 2,017,217,758
|
PR_kwDOJ0Z1Ps5gs_Wy
| 1,320
|
Do not overwrite systemd service file
|
{
"login": "ex3ndr",
"id": 400659,
"node_id": "MDQ6VXNlcjQwMDY1OQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/400659?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ex3ndr",
"html_url": "https://github.com/ex3ndr",
"followers_url": "https://api.github.com/users/ex3ndr/followers",
"following_url": "https://api.github.com/users/ex3ndr/following{/other_user}",
"gists_url": "https://api.github.com/users/ex3ndr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ex3ndr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ex3ndr/subscriptions",
"organizations_url": "https://api.github.com/users/ex3ndr/orgs",
"repos_url": "https://api.github.com/users/ex3ndr/repos",
"events_url": "https://api.github.com/users/ex3ndr/events{/privacy}",
"received_events_url": "https://api.github.com/users/ex3ndr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-11-29T18:54:10
| 2024-02-20T03:27:22
| 2024-02-20T03:27:22
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1320",
"html_url": "https://github.com/ollama/ollama/pull/1320",
"diff_url": "https://github.com/ollama/ollama/pull/1320.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1320.patch",
"merged_at": null
}
|
Currently, the systemd service file is lost during upgrade; this fix avoids overwriting the file.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1320/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1320/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2132
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2132/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2132/comments
|
https://api.github.com/repos/ollama/ollama/issues/2132/events
|
https://github.com/ollama/ollama/issues/2132
| 2,093,042,795
|
I_kwDOJ0Z1Ps58wUxr
| 2,132
|
How to solve ConnectionError ([Errno 111] Connection refused)
|
{
"login": "yliu2702",
"id": 154867456,
"node_id": "U_kgDOCTsXAA",
"avatar_url": "https://avatars.githubusercontent.com/u/154867456?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/yliu2702",
"html_url": "https://github.com/yliu2702",
"followers_url": "https://api.github.com/users/yliu2702/followers",
"following_url": "https://api.github.com/users/yliu2702/following{/other_user}",
"gists_url": "https://api.github.com/users/yliu2702/gists{/gist_id}",
"starred_url": "https://api.github.com/users/yliu2702/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yliu2702/subscriptions",
"organizations_url": "https://api.github.com/users/yliu2702/orgs",
"repos_url": "https://api.github.com/users/yliu2702/repos",
"events_url": "https://api.github.com/users/yliu2702/events{/privacy}",
"received_events_url": "https://api.github.com/users/yliu2702/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 27
| 2024-01-22T04:20:16
| 2024-10-02T15:02:28
| 2024-05-14T19:06:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello, I tried to access the 'llama 2' and 'mistral' models to build a local open-source LLM chatbot. However, perhaps because I accessed the endpoint too often during debugging, I hit this error: 'ConnectionError: HTTPConnectionPool(host=‘0.0.0.0’, port=11434): Max retries exceeded with url: /api/chat (Caused by NewConnectionError(‘<urllib3.connection.HTTPConnection object at 0x7fe32765ca30>: Failed to establish a new connection: [Errno 111] Connection refused’))'.
I tried sending the request directly:
```python
r = requests.post(
    "http://0.0.0.0:11434/api/chat",
    json={"model": model, "messages": messages, "stream": True, "options": {
        "temperature": temp
    }},
)
```
and also went through LangChain, but both failed.
How can I solve this problem so I can use Ollama again? Thanks!
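An `[Errno 111] Connection refused` usually means nothing is listening on the port at all, so a quick reachability check before posting can distinguish "server down" from an API problem. A minimal stdlib-only sketch (the helper name is illustrative; Ollama listens on 127.0.0.1:11434 by default):

```python
import urllib.request
import urllib.error


def server_is_up(base_url: str = "http://127.0.0.1:11434", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url, False on connection failure."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        # Covers connection refused, DNS failure, and timeouts.
        return False


if __name__ == "__main__":
    if not server_is_up():
        print("Ollama is not reachable; start it with `ollama serve` and retry.")
```

If the check returns False, starting the server (`ollama serve`, or the systemd service on Linux) before retrying the `/api/chat` request typically resolves the error.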
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2132/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2132/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4641
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4641/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4641/comments
|
https://api.github.com/repos/ollama/ollama/issues/4641/events
|
https://github.com/ollama/ollama/issues/4641
| 2,317,318,135
|
I_kwDOJ0Z1Ps6KH3f3
| 4,641
|
What's happening? When I enter the Serve command.
|
{
"login": "SuzuKaO",
"id": 32011143,
"node_id": "MDQ6VXNlcjMyMDExMTQz",
"avatar_url": "https://avatars.githubusercontent.com/u/32011143?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SuzuKaO",
"html_url": "https://github.com/SuzuKaO",
"followers_url": "https://api.github.com/users/SuzuKaO/followers",
"following_url": "https://api.github.com/users/SuzuKaO/following{/other_user}",
"gists_url": "https://api.github.com/users/SuzuKaO/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SuzuKaO/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SuzuKaO/subscriptions",
"organizations_url": "https://api.github.com/users/SuzuKaO/orgs",
"repos_url": "https://api.github.com/users/SuzuKaO/repos",
"events_url": "https://api.github.com/users/SuzuKaO/events{/privacy}",
"received_events_url": "https://api.github.com/users/SuzuKaO/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-05-25T23:55:37
| 2024-08-09T23:48:41
| 2024-08-09T23:48:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR:C:\\Users\\***\\AppData\\Local\\Programs\\Ollama\\ollama_runners OLLAMA_TMPDIR:]"
time=2024-05-26T07:41:02.855+08:00 level=INFO source=images.go:704 msg="total blobs: 10"
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xc0000005 code=0x0 addr=0x10 pc=0xe1b921]
goroutine 1 [running]:
github.com/ollama/ollama/server.deleteUnusedLayers.func1({0xc00026e320, 0x5a}, {0x14655c8?, 0xc0002d2b60?}, {0x14655c8?, 0xc0002d2b60?})
github.com/ollama/ollama/server/images.go:648 +0x501
path/filepath.walk({0xc00026e320, 0x5a}, {0x14655c8, 0xc0002d2b60}, 0xc000515800)
path/filepath/path.go:478 +0x105
path/filepath.walk({0xc000254000, 0x4e}, {0x14655c8, 0xc0002d25b0}, 0xc000515800)
path/filepath/path.go:502 +0x254
path/filepath.walk({0xc00059a9c0, 0x35}, {0x14655c8, 0xc0002d2540}, 0xc000515800)
path/filepath/path.go:502 +0x254
path/filepath.walk({0xc0004b6680, 0x2d}, {0x14655c8, 0xc0002d24d0}, 0xc000515800)
path/filepath/path.go:502 +0x254
path/filepath.walk({0xc000480bc0, 0x1a}, {0x14655c8, 0xc0002d2460}, 0xc000515800)
path/filepath/path.go:502 +0x254
path/filepath.Walk({0xc000480bc0, 0x1a}, 0xc000515800)
path/filepath/path.go:560 +0x66
github.com/ollama/ollama/server.deleteUnusedLayers(0x0, 0xc000515a48)
github.com/ollama/ollama/server/images.go:652 +0xa5
github.com/ollama/ollama/server.PruneLayers()
github.com/ollama/ollama/server/images.go:706 +0x465
github.com/ollama/ollama/server.Serve({0x1460800, 0xc0004c31a0})
github.com/ollama/ollama/server/routes.go:1034 +0x2a5
github.com/ollama/ollama/cmd.RunServer(0xc0001a7400?, {0x1c2d700?, 0x4?, 0x12be960?})
github.com/ollama/ollama/cmd/cmd.go:966 +0x17c
github.com/spf13/cobra.(*Command).execute(0xc0000a2308, {0x1c2d700, 0x0, 0x0})
github.com/spf13/cobra@v1.7.0/command.go:940 +0x882
github.com/spf13/cobra.(*Command).ExecuteC(0xc000453b08)
github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5
github.com/spf13/cobra.(*Command).Execute(...)
github.com/spf13/cobra@v1.7.0/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(...)
github.com/spf13/cobra@v1.7.0/command.go:985
main.main()
github.com/ollama/ollama/main.go:11 +0x4d
### OS
Windows
### GPU
Nvidia, Intel
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4641/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4641/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8661
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8661/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8661/comments
|
https://api.github.com/repos/ollama/ollama/issues/8661/events
|
https://github.com/ollama/ollama/issues/8661
| 2,818,282,626
|
I_kwDOJ0Z1Ps6n-5SC
| 8,661
|
Will Ollama run on the NPU(ANE) of Apple M silicon?
|
{
"login": "imJack6",
"id": 58357771,
"node_id": "MDQ6VXNlcjU4MzU3Nzcx",
"avatar_url": "https://avatars.githubusercontent.com/u/58357771?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/imJack6",
"html_url": "https://github.com/imJack6",
"followers_url": "https://api.github.com/users/imJack6/followers",
"following_url": "https://api.github.com/users/imJack6/following{/other_user}",
"gists_url": "https://api.github.com/users/imJack6/gists{/gist_id}",
"starred_url": "https://api.github.com/users/imJack6/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/imJack6/subscriptions",
"organizations_url": "https://api.github.com/users/imJack6/orgs",
"repos_url": "https://api.github.com/users/imJack6/repos",
"events_url": "https://api.github.com/users/imJack6/events{/privacy}",
"received_events_url": "https://api.github.com/users/imJack6/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2025-01-29T13:50:08
| 2025-01-29T13:50:08
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
RT
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8661/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8661/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/8237
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8237/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8237/comments
|
https://api.github.com/repos/ollama/ollama/issues/8237/events
|
https://github.com/ollama/ollama/pull/8237
| 2,758,607,598
|
PR_kwDOJ0Z1Ps6GM7FK
| 8,237
|
Changes macOS installer to skip symlink step if ollama is already in path.
|
{
"login": "dey-indranil",
"id": 18570914,
"node_id": "MDQ6VXNlcjE4NTcwOTE0",
"avatar_url": "https://avatars.githubusercontent.com/u/18570914?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dey-indranil",
"html_url": "https://github.com/dey-indranil",
"followers_url": "https://api.github.com/users/dey-indranil/followers",
"following_url": "https://api.github.com/users/dey-indranil/following{/other_user}",
"gists_url": "https://api.github.com/users/dey-indranil/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dey-indranil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dey-indranil/subscriptions",
"organizations_url": "https://api.github.com/users/dey-indranil/orgs",
"repos_url": "https://api.github.com/users/dey-indranil/repos",
"events_url": "https://api.github.com/users/dey-indranil/events{/privacy}",
"received_events_url": "https://api.github.com/users/dey-indranil/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-12-25T08:08:00
| 2025-01-28T21:42:38
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8237",
"html_url": "https://github.com/ollama/ollama/pull/8237",
"diff_url": "https://github.com/ollama/ollama/pull/8237.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8237.patch",
"merged_at": null
}
|
Resolves [283](https://github.com/ollama/ollama/issues/283)
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8237/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8237/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8530
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8530/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8530/comments
|
https://api.github.com/repos/ollama/ollama/issues/8530/events
|
https://github.com/ollama/ollama/issues/8530
| 2,803,368,900
|
I_kwDOJ0Z1Ps6nGAPE
| 8,530
|
ollama pull hangs at ~90% completion
|
{
"login": "bdytx5",
"id": 32812705,
"node_id": "MDQ6VXNlcjMyODEyNzA1",
"avatar_url": "https://avatars.githubusercontent.com/u/32812705?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bdytx5",
"html_url": "https://github.com/bdytx5",
"followers_url": "https://api.github.com/users/bdytx5/followers",
"following_url": "https://api.github.com/users/bdytx5/following{/other_user}",
"gists_url": "https://api.github.com/users/bdytx5/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bdytx5/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bdytx5/subscriptions",
"organizations_url": "https://api.github.com/users/bdytx5/orgs",
"repos_url": "https://api.github.com/users/bdytx5/repos",
"events_url": "https://api.github.com/users/bdytx5/events{/privacy}",
"received_events_url": "https://api.github.com/users/bdytx5/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2025-01-22T04:47:45
| 2025-01-22T04:47:45
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
<img width="1266" alt="Image" src="https://github.com/user-attachments/assets/821bda2a-f119-4c46-b72d-c00305072cc4" />
It seems to work fine after a couple of retries... Very strange.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.7
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8530/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8530/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6630
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6630/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6630/comments
|
https://api.github.com/repos/ollama/ollama/issues/6630/events
|
https://github.com/ollama/ollama/pull/6630
| 2,504,571,966
|
PR_kwDOJ0Z1Ps56Wqct
| 6,630
|
docs(integrations): add claude-dev
|
{
"login": "sammcj",
"id": 862951,
"node_id": "MDQ6VXNlcjg2Mjk1MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/862951?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sammcj",
"html_url": "https://github.com/sammcj",
"followers_url": "https://api.github.com/users/sammcj/followers",
"following_url": "https://api.github.com/users/sammcj/following{/other_user}",
"gists_url": "https://api.github.com/users/sammcj/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sammcj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sammcj/subscriptions",
"organizations_url": "https://api.github.com/users/sammcj/orgs",
"repos_url": "https://api.github.com/users/sammcj/repos",
"events_url": "https://api.github.com/users/sammcj/events{/privacy}",
"received_events_url": "https://api.github.com/users/sammcj/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-04T07:50:19
| 2024-09-04T20:01:55
| 2024-09-04T13:32:26
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6630",
"html_url": "https://github.com/ollama/ollama/pull/6630",
"diff_url": "https://github.com/ollama/ollama/pull/6630.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6630.patch",
"merged_at": "2024-09-04T13:32:26"
}
|
- Claude Dev [just added](https://github.com/saoudrizwan/claude-dev/releases/tag/v1.5.19) support for Ollama.
It's currently via the OpenAI compatible API, but specifically calls out Ollama as an option.
<img width="594" alt="image" src="https://github.com/user-attachments/assets/21167eb3-5020-4f21-b354-27d4e7e04275">
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6630/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6630/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1897
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1897/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1897/comments
|
https://api.github.com/repos/ollama/ollama/issues/1897/events
|
https://github.com/ollama/ollama/pull/1897
| 2,074,502,272
|
PR_kwDOJ0Z1Ps5jsRjD
| 1,897
|
Make sure the WSL version of libnvidia-ml.so is loaded
|
{
"login": "taweili",
"id": 6722,
"node_id": "MDQ6VXNlcjY3MjI=",
"avatar_url": "https://avatars.githubusercontent.com/u/6722?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taweili",
"html_url": "https://github.com/taweili",
"followers_url": "https://api.github.com/users/taweili/followers",
"following_url": "https://api.github.com/users/taweili/following{/other_user}",
"gists_url": "https://api.github.com/users/taweili/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taweili/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taweili/subscriptions",
"organizations_url": "https://api.github.com/users/taweili/orgs",
"repos_url": "https://api.github.com/users/taweili/repos",
"events_url": "https://api.github.com/users/taweili/events{/privacy}",
"received_events_url": "https://api.github.com/users/taweili/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-10T14:30:52
| 2024-01-11T08:37:46
| 2024-01-11T08:37:46
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1897",
"html_url": "https://github.com/ollama/ollama/pull/1897",
"diff_url": "https://github.com/ollama/ollama/pull/1897.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1897.patch",
"merged_at": null
}
|
In WSL environment, the /usr/lib/wsl/lib/libnvidia-ml.so.1 should be used instead of the generic libnvidia-ml from nvidia-compute.
|
{
"login": "taweili",
"id": 6722,
"node_id": "MDQ6VXNlcjY3MjI=",
"avatar_url": "https://avatars.githubusercontent.com/u/6722?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taweili",
"html_url": "https://github.com/taweili",
"followers_url": "https://api.github.com/users/taweili/followers",
"following_url": "https://api.github.com/users/taweili/following{/other_user}",
"gists_url": "https://api.github.com/users/taweili/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taweili/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taweili/subscriptions",
"organizations_url": "https://api.github.com/users/taweili/orgs",
"repos_url": "https://api.github.com/users/taweili/repos",
"events_url": "https://api.github.com/users/taweili/events{/privacy}",
"received_events_url": "https://api.github.com/users/taweili/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1897/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1897/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5648
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5648/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5648/comments
|
https://api.github.com/repos/ollama/ollama/issues/5648/events
|
https://github.com/ollama/ollama/issues/5648
| 2,404,993,015
|
I_kwDOJ0Z1Ps6PWUf3
| 5,648
|
image description model is too slow
|
{
"login": "codeMonkey-shin",
"id": 80636401,
"node_id": "MDQ6VXNlcjgwNjM2NDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/80636401?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codeMonkey-shin",
"html_url": "https://github.com/codeMonkey-shin",
"followers_url": "https://api.github.com/users/codeMonkey-shin/followers",
"following_url": "https://api.github.com/users/codeMonkey-shin/following{/other_user}",
"gists_url": "https://api.github.com/users/codeMonkey-shin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codeMonkey-shin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codeMonkey-shin/subscriptions",
"organizations_url": "https://api.github.com/users/codeMonkey-shin/orgs",
"repos_url": "https://api.github.com/users/codeMonkey-shin/repos",
"events_url": "https://api.github.com/users/codeMonkey-shin/events{/privacy}",
"received_events_url": "https://api.github.com/users/codeMonkey-shin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-07-12T08:00:57
| 2024-07-23T21:55:10
| 2024-07-23T21:54:38
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After updating to the latest version, I am using llava:13b on Ubuntu, and the API call now takes about 1 minute.
It was originally about 10 seconds, but it has become too slow.
The graphics card is A30.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
I just used `curl -fsSL https://ollama.com/install.sh | sh`
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5648/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5648/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/931
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/931/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/931/comments
|
https://api.github.com/repos/ollama/ollama/issues/931/events
|
https://github.com/ollama/ollama/issues/931
| 1,964,792,441
|
I_kwDOJ0Z1Ps51HFp5
| 931
|
How do we stop a model to release GPU memory? (not ollama server).
|
{
"login": "riskk21",
"id": 22312065,
"node_id": "MDQ6VXNlcjIyMzEyMDY1",
"avatar_url": "https://avatars.githubusercontent.com/u/22312065?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/riskk21",
"html_url": "https://github.com/riskk21",
"followers_url": "https://api.github.com/users/riskk21/followers",
"following_url": "https://api.github.com/users/riskk21/following{/other_user}",
"gists_url": "https://api.github.com/users/riskk21/gists{/gist_id}",
"starred_url": "https://api.github.com/users/riskk21/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/riskk21/subscriptions",
"organizations_url": "https://api.github.com/users/riskk21/orgs",
"repos_url": "https://api.github.com/users/riskk21/repos",
"events_url": "https://api.github.com/users/riskk21/events{/privacy}",
"received_events_url": "https://api.github.com/users/riskk21/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 11
| 2023-10-27T05:26:44
| 2024-04-23T12:41:37
| 2024-02-20T00:57:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
How do we stop a model to release GPU memory? (not ollama server).
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/931/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/931/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4862
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4862/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4862/comments
|
https://api.github.com/repos/ollama/ollama/issues/4862/events
|
https://github.com/ollama/ollama/issues/4862
| 2,338,720,902
|
I_kwDOJ0Z1Ps6LZgyG
| 4,862
|
Probably I am missing something...
|
{
"login": "Zibri",
"id": 855176,
"node_id": "MDQ6VXNlcjg1NTE3Ng==",
"avatar_url": "https://avatars.githubusercontent.com/u/855176?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Zibri",
"html_url": "https://github.com/Zibri",
"followers_url": "https://api.github.com/users/Zibri/followers",
"following_url": "https://api.github.com/users/Zibri/following{/other_user}",
"gists_url": "https://api.github.com/users/Zibri/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Zibri/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Zibri/subscriptions",
"organizations_url": "https://api.github.com/users/Zibri/orgs",
"repos_url": "https://api.github.com/users/Zibri/repos",
"events_url": "https://api.github.com/users/Zibri/events{/privacy}",
"received_events_url": "https://api.github.com/users/Zibri/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-06-06T16:44:53
| 2024-06-06T21:26:23
| 2024-06-06T21:26:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I created a file containing: ``FROM I:\models\Mistral-7b-Instruct-v0.3.f16.q6_k.gguf``
Then I did: `ollama create mistral file`
The model loaded.
Then I did:
`ollama run mistral`
and if I say "Hello" it starts talking by itself, introducing itself every time with a different identity.
Also, my GPU is an NVIDIA GTX 970M (which works with llama.cpp and OpenCL), but it apparently isn't used.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.41
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4862/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4862/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2877
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2877/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2877/comments
|
https://api.github.com/repos/ollama/ollama/issues/2877/events
|
https://github.com/ollama/ollama/issues/2877
| 2,164,785,912
|
I_kwDOJ0Z1Ps6BCAL4
| 2,877
|
Getting error with `nomic-embed-text`
|
{
"login": "isavita",
"id": 5805397,
"node_id": "MDQ6VXNlcjU4MDUzOTc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5805397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/isavita",
"html_url": "https://github.com/isavita",
"followers_url": "https://api.github.com/users/isavita/followers",
"following_url": "https://api.github.com/users/isavita/following{/other_user}",
"gists_url": "https://api.github.com/users/isavita/gists{/gist_id}",
"starred_url": "https://api.github.com/users/isavita/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/isavita/subscriptions",
"organizations_url": "https://api.github.com/users/isavita/orgs",
"repos_url": "https://api.github.com/users/isavita/repos",
"events_url": "https://api.github.com/users/isavita/events{/privacy}",
"received_events_url": "https://api.github.com/users/isavita/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-03-02T12:20:27
| 2024-08-07T02:41:32
| 2024-03-02T13:00:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
# Description
It might be me doing something wrong, but when I try to run `ollama run nomic-embed-text` I am getting the following error
```shell
> ollama run nomic-embed-text
Error: embedding models do not support chat
```
Here is info for my os and the ollama version
```shell
> ollama -v
ollama version is 0.1.27
> sw_vers
ProductName: macOS
ProductVersion: 14.2.1
BuildVersion: 23C71
```
|
{
"login": "isavita",
"id": 5805397,
"node_id": "MDQ6VXNlcjU4MDUzOTc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5805397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/isavita",
"html_url": "https://github.com/isavita",
"followers_url": "https://api.github.com/users/isavita/followers",
"following_url": "https://api.github.com/users/isavita/following{/other_user}",
"gists_url": "https://api.github.com/users/isavita/gists{/gist_id}",
"starred_url": "https://api.github.com/users/isavita/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/isavita/subscriptions",
"organizations_url": "https://api.github.com/users/isavita/orgs",
"repos_url": "https://api.github.com/users/isavita/repos",
"events_url": "https://api.github.com/users/isavita/events{/privacy}",
"received_events_url": "https://api.github.com/users/isavita/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2877/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2877/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6795
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6795/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6795/comments
|
https://api.github.com/repos/ollama/ollama/issues/6795/events
|
https://github.com/ollama/ollama/issues/6795
| 2,525,444,823
|
I_kwDOJ0Z1Ps6WhzrX
| 6,795
|
There are various models provided by default by Meta Llama when downloaded; I have tried but couldn't find them
|
{
"login": "olumolu",
"id": 162728301,
"node_id": "U_kgDOCbMJbQ",
"avatar_url": "https://avatars.githubusercontent.com/u/162728301?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/olumolu",
"html_url": "https://github.com/olumolu",
"followers_url": "https://api.github.com/users/olumolu/followers",
"following_url": "https://api.github.com/users/olumolu/following{/other_user}",
"gists_url": "https://api.github.com/users/olumolu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/olumolu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/olumolu/subscriptions",
"organizations_url": "https://api.github.com/users/olumolu/orgs",
"repos_url": "https://api.github.com/users/olumolu/repos",
"events_url": "https://api.github.com/users/olumolu/events{/privacy}",
"received_events_url": "https://api.github.com/users/olumolu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-09-13T18:40:54
| 2024-09-14T18:09:59
| 2024-09-14T16:58:41
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
[8b-instruct-fp16](https://ollama.com/library/llama3.1:8b-instruct-fp16)
4aacac419454 • 16GB • Updated 3 days ago
[8b-instruct-q2_K](https://ollama.com/library/llama3.1:8b-instruct-q2_K)
44a139eeb344 • 3.2GB • Updated 3 days ago
[8b-instruct-q3_K_S](https://ollama.com/library/llama3.1:8b-instruct-q3_K_S)
16268e519444 • 3.7GB • Updated 3 days ago
[8b-instruct-q3_K_M](https://ollama.com/library/llama3.1:8b-instruct-q3_K_M)
4faa21fca5a2 • 4.0GB • Updated 3 days ago
[8b-instruct-q3_K_L](https://ollama.com/library/llama3.1:8b-instruct-q3_K_L)
04a2f1e44de7 • 4.3GB • Updated 3 days ago
[8b-instruct-q4_0](https://ollama.com/library/llama3.1:8b-instruct-q4_0)
42182419e950 • 4.7GB • Updated 3 days ago
[8b-instruct-q4_1](https://ollama.com/library/llama3.1:8b-instruct-q4_1)
e129e608a752 • 5.1GB • Updated 3 days ago
[8b-instruct-q4_K_S](https://ollama.com/library/llama3.1:8b-instruct-q4_K_S)
778e1e675704 • 4.7GB • Updated 3 days ago
[8b-instruct-q4_K_M](https://ollama.com/library/llama3.1:8b-instruct-q4_K_M)
46e0c10c039e • 4.9GB • Updated 3 days ago
[8b-instruct-q5_0](https://ollama.com/library/llama3.1:8b-instruct-q5_0)
26bc223a1709 • 5.6GB • Updated 3 days ago
[8b-instruct-q5_1](https://ollama.com/library/llama3.1:8b-instruct-q5_1)
8faaa53f9cda • 6.1GB • Updated 3 days ago
[8b-instruct-q5_K_S](https://ollama.com/library/llama3.1:8b-instruct-q5_K_S)
2d79e69bc236 • 5.6GB • Updated 3 days ago
[8b-instruct-q5_K_M](https://ollama.com/library/llama3.1:8b-instruct-q5_K_M)
27fe1b0ab52c • 5.7GB • Updated 3 days ago
[8b-instruct-q6_K](https://ollama.com/library/llama3.1:8b-instruct-q6_K)
81e7664fda9c • 6.6GB • Updated 3 days ago
[8b-instruct-q8_0](https://ollama.com/library/llama3.1:8b-instruct-q8_0)
b158ded76fa0 • 8.5GB • Updated 3 days ago
[8b-text-fp16](https://ollama.com/library/llama3.1:8b-text-fp16)
722fd1ff1fda • 16GB • Updated 5 weeks ago
[8b-text-q2_K](https://ollama.com/library/llama3.1:8b-text-q2_K)
82bedef0ef47 • 3.2GB • Updated 5 weeks ago
[8b-text-q3_K_S](https://ollama.com/library/llama3.1:8b-text-q3_K_S)
92c2cffe1a17 • 3.7GB • Updated 5 weeks ago
[8b-text-q3_K_M](https://ollama.com/library/llama3.1:8b-text-q3_K_M)
abcf9215e3df • 4.0GB • Updated 5 weeks ago
[8b-text-q3_K_L](https://ollama.com/library/llama3.1:8b-text-q3_K_L)
4d9a56f79245 • 4.3GB • Updated 5 weeks ago
[8b-text-q4_0](https://ollama.com/library/llama3.1:8b-text-q4_0)
025059e83055 • 4.7GB • Updated 4 weeks ago
[8b-text-q4_1](https://ollama.com/library/llama3.1:8b-text-q4_1)
4b32d52187b4 • 5.1GB • Updated 5 weeks ago
[8b-text-q4_K_S](https://ollama.com/library/llama3.1:8b-text-q4_K_S)
d1a421604a57 • 4.7GB • Updated 5 weeks ago
[8b-text-q4_K_M](https://ollama.com/library/llama3.1:8b-text-q4_K_M)
6f98b5a6e4b7 • 4.9GB • Updated 5 weeks ago
[8b-text-q5_0](https://ollama.com/library/llama3.1:8b-text-q5_0)
ee0f9a2ffa00 • 5.6GB • Updated 5 weeks ago
[8b-text-q5_1](https://ollama.com/library/llama3.1:8b-text-q5_1)
ad68371dbd08 • 6.1GB • Updated 5 weeks ago
[8b-text-q5_K_S](https://ollama.com/library/llama3.1:8b-text-q5_K_S)
a7562b693302 • 5.6GB • Updated 5 weeks ago
[8b-text-q5_K_M](https://ollama.com/library/llama3.1:8b-text-q5_K_M)
e0b14a625560 • 5.7GB • Updated 5 weeks ago
[8b-text-q6_K](https://ollama.com/library/llama3.1:8b-text-q6_K)
40e61b3c96cc • 6.6GB • Updated 5 weeks ago
[8b-text-q8_0](https://ollama.com/library/llama3.1:8b-text-q8_0)
56fd90f1aa19 • 8.5GB • Updated 5 weeks ago
[70b-instruct-fp16](https://ollama.com/library/llama3.1:70b-instruct-fp16)
80d34437631f • 141GB • Updated 3 days ago
[70b-instruct-q2_K](https://ollama.com/library/llama3.1:70b-instruct-q2_K)
3cbf499d6905 • 26GB • Updated 3 days ago
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6795/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6795/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6400
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6400/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6400/comments
|
https://api.github.com/repos/ollama/ollama/issues/6400/events
|
https://github.com/ollama/ollama/pull/6400
| 2,471,622,819
|
PR_kwDOJ0Z1Ps54pUoL
| 6,400
|
Add arm64 cuda jetpack variants
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-08-17T18:25:44
| 2024-10-15T22:39:59
| 2024-10-15T22:39:27
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6400",
"html_url": "https://github.com/ollama/ollama/pull/6400",
"diff_url": "https://github.com/ollama/ollama/pull/6400.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6400.patch",
"merged_at": null
}
|
This adds 2 new variants for the arm64 build to support nvidia jetson systems based on jetpack 5 and 6. Jetpack 4 is too old to be built with our toolchain (the older cuda requires an old gcc which can't build llama.cpp) and will remain unsupported.
The sbsa discrete GPU cuda libraries we bundle in the existing arm64 build are incompatible with jetson iGPU systems. Unfortunately swapping them at runtime isn't viable given the way nvcc compilation/linking works, so we need to actually build and link against those specific cuda libraries, and bundle them.
Fixes #2408
Fixes #4693
Fixes #5100
Fixes #4861
Resulting artifacts:
```
% ls -lh dist/ollama-linux-arm64.tgz
-rw-r--r-- 1 daniel staff 2.1G Aug 17 10:47 dist/ollama-linux-arm64.tgz
% ls -lh dist/linux-arm64/bin/ollama
-rwxr-xr-x 1 daniel staff 868M Aug 17 10:47 dist/linux-arm64/bin/ollama
```
Draft until #5049 merges
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6400/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6400/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1623
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1623/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1623/comments
|
https://api.github.com/repos/ollama/ollama/issues/1623/events
|
https://github.com/ollama/ollama/pull/1623
| 2,050,062,266
|
PR_kwDOJ0Z1Ps5icesJ
| 1,623
|
adds ooo to Community Integrations in README
|
{
"login": "Npahlfer",
"id": 1068840,
"node_id": "MDQ6VXNlcjEwNjg4NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1068840?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Npahlfer",
"html_url": "https://github.com/Npahlfer",
"followers_url": "https://api.github.com/users/Npahlfer/followers",
"following_url": "https://api.github.com/users/Npahlfer/following{/other_user}",
"gists_url": "https://api.github.com/users/Npahlfer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Npahlfer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Npahlfer/subscriptions",
"organizations_url": "https://api.github.com/users/Npahlfer/orgs",
"repos_url": "https://api.github.com/users/Npahlfer/repos",
"events_url": "https://api.github.com/users/Npahlfer/events{/privacy}",
"received_events_url": "https://api.github.com/users/Npahlfer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-20T08:14:18
| 2024-03-25T19:08:34
| 2024-03-25T19:08:33
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1623",
"html_url": "https://github.com/ollama/ollama/pull/1623",
"diff_url": "https://github.com/ollama/ollama/pull/1623.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1623.patch",
"merged_at": "2024-03-25T19:08:33"
}
|
Adds a link to a terminal command (https://github.com/npahlfer/ooo) that lets you pipe output from other terminal commands into Ollama and run it through your prompt.
This makes it easy to process command output!
You can also just prompt Ollama as you normally would, e.g. `$ ooo how long is a rope`.
Thanks, I love your work!
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1623/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1623/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4614
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4614/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4614/comments
|
https://api.github.com/repos/ollama/ollama/issues/4614/events
|
https://github.com/ollama/ollama/issues/4614
| 2,315,522,822
|
I_kwDOJ0Z1Ps6KBBMG
| 4,614
|
Cpu selected over GPU when running ollama service
|
{
"login": "Talleyrand-34",
"id": 119809076,
"node_id": "U_kgDOByQkNA",
"avatar_url": "https://avatars.githubusercontent.com/u/119809076?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Talleyrand-34",
"html_url": "https://github.com/Talleyrand-34",
"followers_url": "https://api.github.com/users/Talleyrand-34/followers",
"following_url": "https://api.github.com/users/Talleyrand-34/following{/other_user}",
"gists_url": "https://api.github.com/users/Talleyrand-34/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Talleyrand-34/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Talleyrand-34/subscriptions",
"organizations_url": "https://api.github.com/users/Talleyrand-34/orgs",
"repos_url": "https://api.github.com/users/Talleyrand-34/repos",
"events_url": "https://api.github.com/users/Talleyrand-34/events{/privacy}",
"received_events_url": "https://api.github.com/users/Talleyrand-34/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
},
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-05-24T14:16:23
| 2024-05-28T16:51:07
| 2024-05-28T16:51:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The problem is that I cannot make Ollama use the GPU instead of the CPU. The instructions for getting the GPU running are also not clear.
Is the GPU not supported? If so, which options do I have?
```
ollama ps
NAME ID SIZE PROCESSOR UNTIL
codellama:latest 8fdf8f752f6e 5.1 GB 100% CPU 4 minutes from now
```
systemctl service
```
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
Enviroment="OLLAMA_MODELS=/home/deck/.ollama/models"
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/root/.nix-profile/bin:/nix/var/nix/profiles/default/bin:/usr/local/sbin:/usr/local/bin:/usr/bin:/var/lib/flatpak/exports/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl:/usr/lib/rustup/bin"
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_DEBUG=1"
#Enviroment="HSA_OVERRIDE_GFX_VERSION=gfx1030"
[Install]
WantedBy=default.target
```
systemctl status
```
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.970+02:00 level=DEBUG source=amd_linux.go:243 msg="amdgpu memory" gpu=0 available="1.0 GiB"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.970+02:00 level=DEBUG source=amd_common.go:16 msg="evaluating potential rocm lib dir /opt/rocm/lib"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.970+02:00 level=DEBUG source=amd_common.go:16 msg="evaluating potential rocm lib dir /usr/lib64"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.975+02:00 level=DEBUG source=amd_common.go:16 msg="evaluating potential rocm lib dir /usr/local/bin/rocm"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.975+02:00 level=DEBUG source=amd_common.go:16 msg="evaluating potential rocm lib dir /usr/share/ollama/lib/rocm"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.977+02:00 level=DEBUG source=amd_linux.go:292 msg="rocm supported GPUs" types="[gfx1030 gfx1100 gfx1101 gfx1102 gfx900 gfx906 gfx908 gfx90a gfx940 gfx941 gfx942]"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.977+02:00 level=WARN source=amd_linux.go:296 msg="amdgpu is not supported" gpu=0 gpu_type=gfx1033 library=/usr/share/ollama/lib/rocm supported_types="[gfx1030 gfx1100 gfx1101 gfx1102 gfx900 gfx906 gfx908 gfx90a gfx940 gfx941 gfx942]"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.977+02:00 level=WARN source=amd_linux.go:298 msg="See https://github.com/ollama/ollama/blob/main/docs/gpu.md#overrides for HSA_OVERRIDE_GFX_VERSION usage"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.977+02:00 level=INFO source=amd_linux.go:311 msg="no compatible amdgpu devices detected"
May 24 20:10:14 steamdeck ollama[43437]: time=2024-05-24T20:10:14.977+02:00 level=INFO source=types.go:71 msg="inference compute" id=0 library=cpu compute="" driver=0.0 name="" total="14.5 GiB" available="2.8 GiB"
```
GPU info
```
04:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] VanGogh [AMD Custom GPU 0405] (rev ae) (prog-if 00 [VGA controller])
Subsystem: Advanced Micro Devices, Inc. [AMD/ATI] VanGogh [AMD Custom GPU 0405]
Flags: bus master, fast devsel, latency 0, IRQ 41
Memory at f8e0000000 (64-bit, prefetchable) [size=256M]
Memory at f8f0000000 (64-bit, prefetchable) [size=2M]
I/O ports at 1000 [size=256]
Memory at 80300000 (32-bit, non-prefetchable) [size=512K]
Capabilities: [48] Vendor Specific Information: Len=08 <?>
Capabilities: [50] Power Management version 3
Capabilities: [64] Express Legacy Endpoint, MSI 00
Capabilities: [a0] MSI: Enable- Count=1/4 Maskable- 64bit+
Capabilities: [c0] MSI-X: Enable+ Count=4 Masked-
Capabilities: [100] Vendor Specific Information: ID=0001 Rev=1 Len=010 <?>
Capabilities: [270] Secondary PCI Express
Capabilities: [2b0] Address Translation Service (ATS)
Capabilities: [2c0] Page Request Interface (PRI)
Capabilities: [2d0] Process Address Space ID (PASID)
Capabilities: [410] Physical Layer 16.0 GT/s <?>
Capabilities: [440] Lane Margining at the Receiver <?>
Kernel driver in use: amdgpu
Kernel modules: amdgpu
```
I am using the steamdeck
### OS
Linux // Arch linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.1.39
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4614/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4614/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3846
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3846/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3846/comments
|
https://api.github.com/repos/ollama/ollama/issues/3846/events
|
https://github.com/ollama/ollama/pull/3846
| 2,259,362,406
|
PR_kwDOJ0Z1Ps5tgGN3
| 3,846
|
Detect and recover if runner removed
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-23T17:06:59
| 2024-04-23T20:14:14
| 2024-04-23T20:14:12
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3846",
"html_url": "https://github.com/ollama/ollama/pull/3846",
"diff_url": "https://github.com/ollama/ollama/pull/3846.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3846.patch",
"merged_at": "2024-04-23T20:14:12"
}
|
Tmp cleaners can nuke the file out from underneath us. This detects the missing runner and re-initializes the payloads.
Manually tested by removing the server binary by hand after loading it once, triggering an unload with `keep_alive: 0`, and sending another request; I saw the log message and the model loaded correctly.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3846/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3846/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2561
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2561/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2561/comments
|
https://api.github.com/repos/ollama/ollama/issues/2561/events
|
https://github.com/ollama/ollama/issues/2561
| 2,140,009,166
|
I_kwDOJ0Z1Ps5_jfLO
| 2,561
|
Dark mode request
|
{
"login": "nav9",
"id": 2093933,
"node_id": "MDQ6VXNlcjIwOTM5MzM=",
"avatar_url": "https://avatars.githubusercontent.com/u/2093933?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nav9",
"html_url": "https://github.com/nav9",
"followers_url": "https://api.github.com/users/nav9/followers",
"following_url": "https://api.github.com/users/nav9/following{/other_user}",
"gists_url": "https://api.github.com/users/nav9/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nav9/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nav9/subscriptions",
"organizations_url": "https://api.github.com/users/nav9/orgs",
"repos_url": "https://api.github.com/users/nav9/repos",
"events_url": "https://api.github.com/users/nav9/events{/privacy}",
"received_events_url": "https://api.github.com/users/nav9/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6573197867,
"node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw",
"url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com",
"name": "ollama.com",
"color": "ffffff",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 1
| 2024-02-17T11:57:00
| 2024-03-05T19:19:05
| 2024-03-05T19:19:04
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Kindly provide a dark theme for https://ollama.com/. The intensity of the bright white color gets strenuous on the eyes. A toggle button at the top of the screen to activate dark mode or a default dark mode would help.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2561/reactions",
"total_count": 8,
"+1": 8,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2561/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6694
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6694/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6694/comments
|
https://api.github.com/repos/ollama/ollama/issues/6694/events
|
https://github.com/ollama/ollama/issues/6694
| 2,512,149,605
|
I_kwDOJ0Z1Ps6VvFxl
| 6,694
|
A mixture of experts model
|
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-08T01:37:57
| 2024-09-12T00:23:11
| 2024-09-12T00:23:11
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/allenai/OLMoE-1B-7B-0924
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6694/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6694/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4817
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4817/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4817/comments
|
https://api.github.com/repos/ollama/ollama/issues/4817/events
|
https://github.com/ollama/ollama/issues/4817
| 2,333,886,407
|
I_kwDOJ0Z1Ps6LHEfH
| 4,817
|
Apple neural engine
|
{
"login": "EnderRobber101",
"id": 116851736,
"node_id": "U_kgDOBvcEGA",
"avatar_url": "https://avatars.githubusercontent.com/u/116851736?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/EnderRobber101",
"html_url": "https://github.com/EnderRobber101",
"followers_url": "https://api.github.com/users/EnderRobber101/followers",
"following_url": "https://api.github.com/users/EnderRobber101/following{/other_user}",
"gists_url": "https://api.github.com/users/EnderRobber101/gists{/gist_id}",
"starred_url": "https://api.github.com/users/EnderRobber101/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/EnderRobber101/subscriptions",
"organizations_url": "https://api.github.com/users/EnderRobber101/orgs",
"repos_url": "https://api.github.com/users/EnderRobber101/repos",
"events_url": "https://api.github.com/users/EnderRobber101/events{/privacy}",
"received_events_url": "https://api.github.com/users/EnderRobber101/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-06-04T16:01:07
| 2024-07-11T02:40:07
| 2024-07-11T02:40:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I was wondering if Ollama will support the Apple Neural Engine for faster computation in the future?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4817/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4817/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7921
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7921/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7921/comments
|
https://api.github.com/repos/ollama/ollama/issues/7921/events
|
https://github.com/ollama/ollama/pull/7921
| 2,716,124,067
|
PR_kwDOJ0Z1Ps6D88PC
| 7,921
|
server: feedback before failing push on uppercase
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-03T22:49:22
| 2024-12-09T22:31:27
| 2024-12-09T22:31:27
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7921",
"html_url": "https://github.com/ollama/ollama/pull/7921",
"diff_url": "https://github.com/ollama/ollama/pull/7921.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7921.patch",
"merged_at": null
}
|
When a username or model name is uppercase, the registry rejects the push; this is done for file-system compatibility. If we rely on the registry's error on push, the message returned is 'file not found', which does not convey why the push actually failed.
Before:
```bash
> ollama push TEST_CAPS/x
retrieving manifest
Error: file does not exist
> ollama push test_caps/X
retrieving manifest
Error: file does not exist
```
Now:
```bash
> ollama push TEST_CAPS/x
retrieving manifest
Error: namespace must be lowercase, but is TEST_CAPS
> ollama push test_caps/X
retrieving manifest
Error: model name must be lowercase, but is X
```
An alternative option is to check this when the model is created, but I think it's OK for users to name the model whatever they want on their local system. The model will still run.
related to #3501
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7921/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7921/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1664
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1664/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1664/comments
|
https://api.github.com/repos/ollama/ollama/issues/1664/events
|
https://github.com/ollama/ollama/issues/1664
| 2,052,976,992
|
I_kwDOJ0Z1Ps56XfFg
| 1,664
|
CLI display flickers in SSH session on pull
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 4
| 2023-12-21T19:47:38
| 2024-11-06T18:50:26
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When pulling, I occasionally see the progress bar flicker during and after the download. This can be seen more dramatically on a fast connection.
System details:
```
OS: Debian 11
Terminal: Warp
Ollama: v0.1.17
```
https://github.com/jmorganca/ollama/assets/5853428/ec9c2410-f5c0-4a41-a9ef-73bee50b99f2
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1664/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1664/timeline
| null |
reopened
| false
|
https://api.github.com/repos/ollama/ollama/issues/3888
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3888/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3888/comments
|
https://api.github.com/repos/ollama/ollama/issues/3888/events
|
https://github.com/ollama/ollama/issues/3888
| 2,262,037,183
|
I_kwDOJ0Z1Ps6G0_K_
| 3,888
|
Restrict model pulling based on license
|
{
"login": "slyt",
"id": 5429371,
"node_id": "MDQ6VXNlcjU0MjkzNzE=",
"avatar_url": "https://avatars.githubusercontent.com/u/5429371?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/slyt",
"html_url": "https://github.com/slyt",
"followers_url": "https://api.github.com/users/slyt/followers",
"following_url": "https://api.github.com/users/slyt/following{/other_user}",
"gists_url": "https://api.github.com/users/slyt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/slyt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/slyt/subscriptions",
"organizations_url": "https://api.github.com/users/slyt/orgs",
"repos_url": "https://api.github.com/users/slyt/repos",
"events_url": "https://api.github.com/users/slyt/events{/privacy}",
"received_events_url": "https://api.github.com/users/slyt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 0
| 2024-04-24T19:36:13
| 2024-04-24T21:04:54
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It would be great if there were a configurable option to restrict the Ollama server's ability to pull certain models based on the model's license. This is necessary for organizations that would like to use Ollama as a model runtime but cannot use some models due to limitations in their licenses.
A check could be placed after [the model manifest](https://github.com/ollama/ollama/blob/790cf34d172e40c0aa4689011b5323cbdd7b83b6/server/images.go#L1034-L1037) is pulled during `PullModel()`. If the license in the manifest is on a block list (conversely, not on an allow list), then return an error.
Ideally the allow/block lists could be passed via CLI (e.g. `--allow-license ["apache-2.0", "mit"]`) or environment variable (e.g. `export OLLAMA_ALLOW_LICENSE='["apache-2.0", "mit"]'`).
The current model manifest only contains the entire raw license text, not a simple license identifier (e.g. `apache-2.0`). [The Licenses page on Huggingface hub docs](https://huggingface.co/docs/hub/en/repositories-licenses) is a good reference for common license identifiers. Some work might be required to create and track a new license identifier field in the model manifests. Hypothetically, it is possible to compare the raw license text using hashing, but requiring the user to calculate the license hash and pass it like `--allow-license-hash ["12345ABCDEF"]` is not user friendly.
It's likely that allowing/blocking models based on other attributes in the model manifest (hash, tag, family, etc.) would also be useful to many.
This would be a good alternative to locally hosting the model library with a subset of allowed models: https://github.com/ollama/ollama/issues/914
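To illustrate the idea, here is a minimal sketch of the kind of gate that could run after the manifest is pulled. The function name `checkLicense` and the shape of the allow list are assumptions for illustration, not existing Ollama code; the real check would hook into `PullModel()` and read its policy from a flag or an environment variable such as the proposed `OLLAMA_ALLOW_LICENSE`.

```go
package main

import (
	"fmt"
	"slices"
	"strings"
)

// checkLicense is a hypothetical gate: it rejects a pull when an allow
// list is configured and the manifest's license identifier is not on it.
// An empty allow list means no policy is configured, so everything passes.
func checkLicense(manifestLicense string, allowed []string) error {
	if len(allowed) == 0 {
		return nil
	}
	if slices.Contains(allowed, strings.ToLower(manifestLicense)) {
		return nil
	}
	return fmt.Errorf("license %q is not on the allow list", manifestLicense)
}

func main() {
	allowed := []string{"apache-2.0", "mit"}
	fmt.Println(checkLicense("MIT", allowed))         // nil: allowed
	fmt.Println(checkLicense("proprietary", allowed)) // error: blocked
}
```

Note this assumes the manifest carries a simple license identifier; as discussed above, today it only carries the raw license text, so an identifier field would need to be added first.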
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3888/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3888/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1589
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1589/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1589/comments
|
https://api.github.com/repos/ollama/ollama/issues/1589/events
|
https://github.com/ollama/ollama/issues/1589
| 2,047,604,820
|
I_kwDOJ0Z1Ps56C_hU
| 1,589
|
Access internet
|
{
"login": "PeachesMLG",
"id": 26843204,
"node_id": "MDQ6VXNlcjI2ODQzMjA0",
"avatar_url": "https://avatars.githubusercontent.com/u/26843204?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PeachesMLG",
"html_url": "https://github.com/PeachesMLG",
"followers_url": "https://api.github.com/users/PeachesMLG/followers",
"following_url": "https://api.github.com/users/PeachesMLG/following{/other_user}",
"gists_url": "https://api.github.com/users/PeachesMLG/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PeachesMLG/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PeachesMLG/subscriptions",
"organizations_url": "https://api.github.com/users/PeachesMLG/orgs",
"repos_url": "https://api.github.com/users/PeachesMLG/repos",
"events_url": "https://api.github.com/users/PeachesMLG/events{/privacy}",
"received_events_url": "https://api.github.com/users/PeachesMLG/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-12-18T23:01:44
| 2024-02-05T18:00:33
| 2023-12-19T05:08:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm customising my own model, using the steps in the README.
In this Modelfile I added a link to an FAQ with a bunch of information, as well as a GitHub URL, in the hope that it can search open/closed issues to answer queries.
However, it doesn't seem to be querying the URLs like OpenAI GPT-4 does.
Is this currently possible?
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1589/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1589/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6693
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6693/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6693/comments
|
https://api.github.com/repos/ollama/ollama/issues/6693/events
|
https://github.com/ollama/ollama/pull/6693
| 2,512,092,218
|
PR_kwDOJ0Z1Ps56wUK_
| 6,693
|
Notify the user if systemd is not running during install
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-09-07T21:58:33
| 2024-11-19T09:40:11
| 2024-11-18T23:02:41
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6693",
"html_url": "https://github.com/ollama/ollama/pull/6693",
"diff_url": "https://github.com/ollama/ollama/pull/6693.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6693.patch",
"merged_at": "2024-11-18T23:02:41"
}
|
Fixes: https://github.com/ollama/ollama/issues/6636
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6693/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6693/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7640
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7640/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7640/comments
|
https://api.github.com/repos/ollama/ollama/issues/7640/events
|
https://github.com/ollama/ollama/issues/7640
| 2,653,937,136
|
I_kwDOJ0Z1Ps6eL93w
| 7,640
|
Error: POST predict: Post "http://127.0.0.1:42623/completion": EOF
|
{
"login": "phalexo",
"id": 4603365,
"node_id": "MDQ6VXNlcjQ2MDMzNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4603365?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phalexo",
"html_url": "https://github.com/phalexo",
"followers_url": "https://api.github.com/users/phalexo/followers",
"following_url": "https://api.github.com/users/phalexo/following{/other_user}",
"gists_url": "https://api.github.com/users/phalexo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/phalexo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/phalexo/subscriptions",
"organizations_url": "https://api.github.com/users/phalexo/orgs",
"repos_url": "https://api.github.com/users/phalexo/repos",
"events_url": "https://api.github.com/users/phalexo/events{/privacy}",
"received_events_url": "https://api.github.com/users/phalexo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 29
| 2024-11-13T02:35:29
| 2024-11-25T19:41:55
| 2024-11-25T19:41:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
(CodeLlama) developer@ai:~/PROJECTS/OllamaModelFiles$ ~/ollama/ollama run gemma-2-27b-it-Q8_0:latest
>>> Hello.
Error: POST predict: Post "http://127.0.0.1:42623/completion": EOF
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
Latest
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7640/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7640/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5484
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5484/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5484/comments
|
https://api.github.com/repos/ollama/ollama/issues/5484/events
|
https://github.com/ollama/ollama/issues/5484
| 2,390,764,489
|
I_kwDOJ0Z1Ps6OgCvJ
| 5,484
|
Unnecessary use of GPUs when I run "ollama pull"
|
{
"login": "eliranwong",
"id": 25262722,
"node_id": "MDQ6VXNlcjI1MjYyNzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/25262722?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eliranwong",
"html_url": "https://github.com/eliranwong",
"followers_url": "https://api.github.com/users/eliranwong/followers",
"following_url": "https://api.github.com/users/eliranwong/following{/other_user}",
"gists_url": "https://api.github.com/users/eliranwong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eliranwong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eliranwong/subscriptions",
"organizations_url": "https://api.github.com/users/eliranwong/orgs",
"repos_url": "https://api.github.com/users/eliranwong/repos",
"events_url": "https://api.github.com/users/eliranwong/events{/privacy}",
"received_events_url": "https://api.github.com/users/eliranwong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-07-04T12:32:02
| 2024-07-04T12:43:23
| 2024-07-04T12:43:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I run "ollama pull" to just download models, e.g.
> ollama pull deepseek-v2:16b
Both of my GPUs are fully used at 100%, which is unnecessary for a download-only task.
```
========================================== ROCm System Management Interface ==========================================
==================================================== Concise Info ====================================================
Device Node IDs Temp Power Partitions SCLK MCLK Fan Perf PwrCap VRAM% GPU%
(DID, GUID) (Edge) (Avg) (Mem, Compute, ID)
======================================================================================================================
0 2 0x744c, 45048 66.0°C 138.0W N/A, N/A, 0 3138Mhz 96Mhz 22.75% auto 327.0W 29% 100%
1 1 0x744c, 12575 58.0°C 136.0W N/A, N/A, 0 3052Mhz 96Mhz 14.9% auto 327.0W 29% 100%
======================================================================================================================
================================================ End of ROCm SMI Log =================================================
```
Remarks: I run ROCm 6.1.3 for my dual AMD RX 7900 XTX
### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.1.47
|
{
"login": "eliranwong",
"id": 25262722,
"node_id": "MDQ6VXNlcjI1MjYyNzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/25262722?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eliranwong",
"html_url": "https://github.com/eliranwong",
"followers_url": "https://api.github.com/users/eliranwong/followers",
"following_url": "https://api.github.com/users/eliranwong/following{/other_user}",
"gists_url": "https://api.github.com/users/eliranwong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eliranwong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eliranwong/subscriptions",
"organizations_url": "https://api.github.com/users/eliranwong/orgs",
"repos_url": "https://api.github.com/users/eliranwong/repos",
"events_url": "https://api.github.com/users/eliranwong/events{/privacy}",
"received_events_url": "https://api.github.com/users/eliranwong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5484/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5484/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5352
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5352/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5352/comments
|
https://api.github.com/repos/ollama/ollama/issues/5352/events
|
https://github.com/ollama/ollama/issues/5352
| 2,379,429,798
|
I_kwDOJ0Z1Ps6N0zem
| 5,352
|
[BUG]: Gemma2 crashes on run.
|
{
"login": "jasper-clarke",
"id": 154771146,
"node_id": "U_kgDOCTmeyg",
"avatar_url": "https://avatars.githubusercontent.com/u/154771146?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jasper-clarke",
"html_url": "https://github.com/jasper-clarke",
"followers_url": "https://api.github.com/users/jasper-clarke/followers",
"following_url": "https://api.github.com/users/jasper-clarke/following{/other_user}",
"gists_url": "https://api.github.com/users/jasper-clarke/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jasper-clarke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jasper-clarke/subscriptions",
"organizations_url": "https://api.github.com/users/jasper-clarke/orgs",
"repos_url": "https://api.github.com/users/jasper-clarke/repos",
"events_url": "https://api.github.com/users/jasper-clarke/events{/privacy}",
"received_events_url": "https://api.github.com/users/jasper-clarke/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-06-28T02:23:22
| 2024-06-28T02:36:40
| 2024-06-28T02:36:40
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Running the following in sequence crashes with the below output.
1. `ollama pull gemma2`
2. `ollama run gemma2`
Output:
`Error: llama runner process has terminated: signal: aborted (core dumped)`
Coredumpctl:
```
PID: 3776 (ollama_llama_se)
UID: 61547 (ollama)
GID: 61547 (ollama)
Signal: 6 (ABRT)
Timestamp: Fri 2024-06-28 12:19:15 AEST (1min 3s ago)
Command Line: /tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server --model /var/lib/ollama/models/blobs/sha256-e84ed7399c82fbf7dbd6cdef3f12d356c3cdb5512e5d8b2a9898080cbcdd72e5 --ctx-size 2048 --b>
Executable: /tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server
Control Group: /system.slice/ollama.service
Unit: ollama.service
Slice: system.slice
Boot ID: bba51a07162c412d9a05f2b6cb3cd45d
Machine ID: dcf2fcf728b7460d8f2ef435eaecf8e0
Hostname: nixos
Storage: /var/lib/systemd/coredump/core.ollama_llama_se.61547.bba51a07162c412d9a05f2b6cb3cd45d.3776.1719541155000000.zst (inaccessible)
Message: Process 3776 (ollama_llama_se) of user 61547 dumped core.
Module /tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server without build-id.
Module /tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server
Module /tmp/ollama3483777596/runners/cuda_v12/libcublasLt.so.12 without build-id.
Module /tmp/ollama3483777596/runners/cuda_v12/libcublasLt.so.12
Module /tmp/ollama3483777596/runners/cuda_v12/libcublas.so.12 without build-id.
Module /tmp/ollama3483777596/runners/cuda_v12/libcublas.so.12
Module /tmp/ollama3483777596/runners/cuda_v12/libcudart.so.12 without build-id.
Module /tmp/ollama3483777596/runners/cuda_v12/libcudart.so.12
Module libgcc_s.so.1 without build-id.
Module libstdc++.so.6 without build-id.
Stack trace of thread 3776:
#0 0x00007ff2bb8a1efc __pthread_kill_implementation (libc.so.6 + 0x8fefc)
#1 0x00007ff2bb851e96 raise (libc.so.6 + 0x3fe96)
#2 0x00007ff2bb83a935 abort (libc.so.6 + 0x28935)
#3 0x00007ff28fca9a89 _ZN9__gnu_cxx27__verbose_terminate_handlerEv.cold (libstdc++.so.6 + 0xa9a89)
#4 0x00007ff28fcb4f8a _ZN10__cxxabiv111__terminateEPFvvE (libstdc++.so.6 + 0xb4f8a)
#5 0x00007ff28fcb4ff5 _ZSt9terminatev (libstdc++.so.6 + 0xb4ff5)
#6 0x00007ff28fcb5298 __cxa_rethrow (libstdc++.so.6 + 0xb5298)
#7 0x000000000042bf81 n/a (/tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server + 0x2bf81)
#8 0x0000000000504deb n/a (/tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server + 0x104deb)
#9 0x0000000000499447 n/a (/tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server + 0x99447)
#10 0x000000000042e566 n/a (/tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server + 0x2e566)
#11 0x00007ff2bb83c10e __libc_start_call_main (libc.so.6 + 0x2a10e)
#12 0x00007ff2bb83c1c9 __libc_start_main@@GLIBC_2.34 (libc.so.6 + 0x2a1c9)
#13 0x00000000004469f5 n/a (/tmp/ollama3483777596/runners/cuda_v12/ollama_llama_server + 0x469f5)
ELF object binary architecture: AMD x86-64
```
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.45
|
{
"login": "jasper-clarke",
"id": 154771146,
"node_id": "U_kgDOCTmeyg",
"avatar_url": "https://avatars.githubusercontent.com/u/154771146?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jasper-clarke",
"html_url": "https://github.com/jasper-clarke",
"followers_url": "https://api.github.com/users/jasper-clarke/followers",
"following_url": "https://api.github.com/users/jasper-clarke/following{/other_user}",
"gists_url": "https://api.github.com/users/jasper-clarke/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jasper-clarke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jasper-clarke/subscriptions",
"organizations_url": "https://api.github.com/users/jasper-clarke/orgs",
"repos_url": "https://api.github.com/users/jasper-clarke/repos",
"events_url": "https://api.github.com/users/jasper-clarke/events{/privacy}",
"received_events_url": "https://api.github.com/users/jasper-clarke/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5352/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5352/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5908
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5908/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5908/comments
|
https://api.github.com/repos/ollama/ollama/issues/5908/events
|
https://github.com/ollama/ollama/issues/5908
| 2,427,384,900
|
I_kwDOJ0Z1Ps6QrvRE
| 5,908
|
GPU ID initialization incorrect - CPUs not always first in list
|
{
"login": "7910f6ba7ee4",
"id": 89554543,
"node_id": "MDQ6VXNlcjg5NTU0NTQz",
"avatar_url": "https://avatars.githubusercontent.com/u/89554543?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/7910f6ba7ee4",
"html_url": "https://github.com/7910f6ba7ee4",
"followers_url": "https://api.github.com/users/7910f6ba7ee4/followers",
"following_url": "https://api.github.com/users/7910f6ba7ee4/following{/other_user}",
"gists_url": "https://api.github.com/users/7910f6ba7ee4/gists{/gist_id}",
"starred_url": "https://api.github.com/users/7910f6ba7ee4/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/7910f6ba7ee4/subscriptions",
"organizations_url": "https://api.github.com/users/7910f6ba7ee4/orgs",
"repos_url": "https://api.github.com/users/7910f6ba7ee4/repos",
"events_url": "https://api.github.com/users/7910f6ba7ee4/events{/privacy}",
"received_events_url": "https://api.github.com/users/7910f6ba7ee4/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-07-24T11:50:36
| 2024-07-29T21:24:58
| 2024-07-29T21:24:58
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
In `amd_linux.go`, it is assumed that "CPUs are always first in the list" when calculating the gpu id:
```
// CPUs are always first in the list
gpuID := nodeID - cpuCount
```
however, this is not always the case when the number of topology nodes is greater than 10 (which I unfortunately have). The file glob returns entries alphabetically, producing node IDs in the order (0, 1, 10, 11, 2, 3, ...). This is a problem when nodes 0-3 are CPUs and 4-11 are GPUs, as 10 and 11 get mapped to GPU IDs 8 and 9 with the current implementation, leaving a gap at GPU IDs 6-7.
I am guessing this could be solved through a natural sort of the list of filenames before looping through them, or by first counting all CPU nodes before offsetting the GPU IDs by the CPU count.
### OS
Linux
### GPU
AMD
### CPU
_No response_
### Ollama version
0.2.1
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5908/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5908/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3320
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3320/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3320/comments
|
https://api.github.com/repos/ollama/ollama/issues/3320/events
|
https://github.com/ollama/ollama/pull/3320
| 2,204,165,579
|
PR_kwDOJ0Z1Ps5qk1BJ
| 3,320
|
llm: prevent race appending to slice
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-24T03:48:22
| 2024-03-24T18:35:55
| 2024-03-24T18:35:55
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3320",
"html_url": "https://github.com/ollama/ollama/pull/3320",
"diff_url": "https://github.com/ollama/ollama/pull/3320.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3320.patch",
"merged_at": "2024-03-24T18:35:55"
}
|
llm: prevent race appending to slice
Previously, multiple goroutines were appending to the same unguarded
slice.
Also, convert the slice and errgroup.Group declarations to idiomatic zero value form.
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3320/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3320/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5447
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5447/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5447/comments
|
https://api.github.com/repos/ollama/ollama/issues/5447/events
|
https://github.com/ollama/ollama/pull/5447
| 2,387,271,135
|
PR_kwDOJ0Z1Ps50QRW1
| 5,447
|
Only set default keep_alive on initial model load
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-02T22:35:29
| 2024-07-03T22:34:40
| 2024-07-03T22:34:38
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5447",
"html_url": "https://github.com/ollama/ollama/pull/5447",
"diff_url": "https://github.com/ollama/ollama/pull/5447.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5447.patch",
"merged_at": "2024-07-03T22:34:38"
}
|
This change fixes the handling of keep_alive so that if a client request omits the setting, we only apply the default on the initial model load. Once the model is loaded, requests that leave it unset keep whatever keep_alive is already in effect.
Fixes #5272
```
% ollama run llama3 --keepalive 1h hello
Hello! It's nice to meet you. Is there something I can help you with, or would you like to chat?
% ollama ps
NAME ID SIZE PROCESSOR UNTIL
llama3:latest 365c0bd3c000 6.7 GB 100% GPU 59 minutes from now
% curl http://localhost:11434/api/generate -d '{
"model": "llama3",
"prompt": "hi",
"stream": false
}' > /dev/null
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 594 100 534 100 60 797 89 --:--:-- --:--:-- --:--:-- 886
% ollama ps
NAME ID SIZE PROCESSOR UNTIL
llama3:latest 365c0bd3c000 6.7 GB 100% GPU 59 minutes from now
```
Compare against https://github.com/ollama/ollama/issues/5272#issuecomment-2204491896 showing the incorrect behavior before.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5447/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5447/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7229
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7229/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7229/comments
|
https://api.github.com/repos/ollama/ollama/issues/7229/events
|
https://github.com/ollama/ollama/pull/7229
| 2,592,597,237
|
PR_kwDOJ0Z1Ps5-3czw
| 7,229
|
Move Go code out of llm package
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-10-16T17:38:20
| 2025-01-19T19:28:44
| 2025-01-19T19:28:43
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7229",
"html_url": "https://github.com/ollama/ollama/pull/7229",
"diff_url": "https://github.com/ollama/ollama/pull/7229.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7229.patch",
"merged_at": null
}
|
This can be deferred to after the 0.4.0 release as a follow-up cleanup step
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7229/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7229/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1105
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1105/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1105/comments
|
https://api.github.com/repos/ollama/ollama/issues/1105/events
|
https://github.com/ollama/ollama/issues/1105
| 1,989,631,987
|
I_kwDOJ0Z1Ps52l1_z
| 1,105
|
Out of memory when using multiple GPUs
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2023-11-12T23:24:50
| 2024-01-10T13:46:31
| 2024-01-10T13:46:31
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When a system has multiple GPUs generation (ex: `ollama run ...`) may fail with an `out of memory` error.
```
Nov 05 22:41:50 example.com ollama[943528]: 2023/11/05 22:41:50 llama.go:259: 7197 MB VRAM available, loading up to 47 GPU layers
Nov 05 22:41:50 example.com ollama[943528]: 2023/11/05 22:41:50 llama.go:370: starting llama runner
Nov 05 22:41:50 example.com ollama[943528]: 2023/11/05 22:41:50 llama.go:428: waiting for llama runner to start responding
Nov 05 22:41:50 example.com ollama[943528]: ggml_init_cublas: found 2 CUDA devices:
Nov 05 22:41:50 example.com ollama[943528]: Device 0: NVIDIA GeForce RTX 3060 Ti, compute capability 8.6
Nov 05 22:41:50 example.com ollama[943528]: Device 1: NVIDIA GeForce RTX 3060, compute capability 8.6
Nov 05 22:41:52 example.com ollama[1418565]: {"timestamp":1699245712,"level":"INFO","function":"main","line":1323,"message":"build info","build":219,"commit":"9e70cc0"}
Nov 05 22:41:52 example.com ollama[1418565]: {"timestamp":1699245712,"level":"INFO","function":"main","line":1325,"message":"system info","n_threads":8,"n_threads_batch":-1,"total_threads":16,"system_info":"AVX = 1 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 0 | NEON = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | "}
Nov 05 22:41:52 example.com ollama[943528]: llama_model_loader: loaded meta data with 19 key-value pairs and 291 tensors from /usr/share/ollama/.ollama/models/blobs/sha256:22f7f8ef5f4c791c1b03d7eb414399294764d7cc82c7e94aa81a1feb80a983a2 (version GGUF V2 (latest))
Nov 05 22:41:52 example.com ollama[943528]: llama_model_loader: - tensor 0: token_embd.weight q4_0 [ 4096, 32000, 1, 1 ]
...
Nov 05 22:41:52 example.com ollama[943528]: llm_load_tensors: ggml ctx size = 0.10 MB
Nov 05 22:41:52 example.com ollama[943528]: llm_load_tensors: using CUDA for GPU acceleration
Nov 05 22:41:52 example.com ollama[943528]: ggml_cuda_set_main_device: using device 0 (NVIDIA GeForce RTX 3060 Ti) as main device
Nov 05 22:41:52 example.com ollama[943528]: llm_load_tensors: mem required = 70.41 MB
Nov 05 22:41:52 example.com ollama[943528]: llm_load_tensors: offloading 32 repeating layers to GPU
Nov 05 22:41:52 example.com ollama[943528]: llm_load_tensors: offloading non-repeating layers to GPU
Nov 05 22:41:52 example.com ollama[943528]: llm_load_tensors: offloaded 35/35 layers to GPU
Nov 05 22:41:52 example.com ollama[943528]: llm_load_tensors: VRAM used: 3577.55 MB
Nov 05 22:41:53 example.com ollama[943528]: ....................................................................
Nov 05 22:41:53 example.com ollama[943528]: CUDA error 2 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/gguf/ggml-cuda.cu:7233: out of memory
Nov 05 22:41:53 example.com ollama[943528]: current device: 0
Nov 05 22:41:53 example.com ollama[943528]: 2023/11/05 22:41:53 llama.go:385: 2 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/gguf/ggml-cuda.cu:7233: out of memory
Nov 05 22:41:53 example.com ollama[943528]: current device: 0
Nov 05 22:41:53 example.com ollama[943528]: 2023/11/05 22:41:53 llama.go:393: error starting llama runner: llama runner process has terminated
Nov 05 22:41:53 example.com ollama[943528]: 2023/11/05 22:41:53 llama.go:459: llama runner stopped successfully
```
Possibly related:
https://github.com/ggerganov/llama.cpp/issues/1866
https://github.com/ggerganov/llama.cpp/issues/2432
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1105/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1105/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3133
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3133/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3133/comments
|
https://api.github.com/repos/ollama/ollama/issues/3133/events
|
https://github.com/ollama/ollama/issues/3133
| 2,185,252,286
|
I_kwDOJ0Z1Ps6CQE2-
| 3,133
|
v0.1.29 #bug
|
{
"login": "enryteam",
"id": 20081090,
"node_id": "MDQ6VXNlcjIwMDgxMDkw",
"avatar_url": "https://avatars.githubusercontent.com/u/20081090?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/enryteam",
"html_url": "https://github.com/enryteam",
"followers_url": "https://api.github.com/users/enryteam/followers",
"following_url": "https://api.github.com/users/enryteam/following{/other_user}",
"gists_url": "https://api.github.com/users/enryteam/gists{/gist_id}",
"starred_url": "https://api.github.com/users/enryteam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/enryteam/subscriptions",
"organizations_url": "https://api.github.com/users/enryteam/orgs",
"repos_url": "https://api.github.com/users/enryteam/repos",
"events_url": "https://api.github.com/users/enryteam/events{/privacy}",
"received_events_url": "https://api.github.com/users/enryteam/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-03-14T02:25:29
| 2024-06-11T05:53:34
| 2024-03-14T02:39:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
v0.1.29 returns a 403 error! To clarify: I access port 11434 through an frpc proxy, and every version before v0.1.29 worked fine this way.
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3133/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3133/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1645
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1645/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1645/comments
|
https://api.github.com/repos/ollama/ollama/issues/1645/events
|
https://github.com/ollama/ollama/issues/1645
| 2,051,405,991
|
I_kwDOJ0Z1Ps56Rfin
| 1,645
|
Dark mode for ollama.com
|
{
"login": "MaherJendoubi",
"id": 1798510,
"node_id": "MDQ6VXNlcjE3OTg1MTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1798510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MaherJendoubi",
"html_url": "https://github.com/MaherJendoubi",
"followers_url": "https://api.github.com/users/MaherJendoubi/followers",
"following_url": "https://api.github.com/users/MaherJendoubi/following{/other_user}",
"gists_url": "https://api.github.com/users/MaherJendoubi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MaherJendoubi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MaherJendoubi/subscriptions",
"organizations_url": "https://api.github.com/users/MaherJendoubi/orgs",
"repos_url": "https://api.github.com/users/MaherJendoubi/repos",
"events_url": "https://api.github.com/users/MaherJendoubi/events{/privacy}",
"received_events_url": "https://api.github.com/users/MaherJendoubi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6573197867,
"node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw",
"url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com",
"name": "ollama.com",
"color": "ffffff",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2023-12-20T22:43:24
| 2024-12-25T21:51:37
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Just to protect your eyes, especially during the night.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1645/reactions",
"total_count": 7,
"+1": 7,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1645/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4259
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4259/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4259/comments
|
https://api.github.com/repos/ollama/ollama/issues/4259/events
|
https://github.com/ollama/ollama/issues/4259
| 2,285,514,065
|
I_kwDOJ0Z1Ps6IOi1R
| 4,259
|
stop loading model while i close my computer.
|
{
"login": "chaserstrong",
"id": 18061322,
"node_id": "MDQ6VXNlcjE4MDYxMzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/18061322?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chaserstrong",
"html_url": "https://github.com/chaserstrong",
"followers_url": "https://api.github.com/users/chaserstrong/followers",
"following_url": "https://api.github.com/users/chaserstrong/following{/other_user}",
"gists_url": "https://api.github.com/users/chaserstrong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chaserstrong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chaserstrong/subscriptions",
"organizations_url": "https://api.github.com/users/chaserstrong/orgs",
"repos_url": "https://api.github.com/users/chaserstrong/repos",
"events_url": "https://api.github.com/users/chaserstrong/events{/privacy}",
"received_events_url": "https://api.github.com/users/chaserstrong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-05-08T12:46:07
| 2024-05-08T12:46:07
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
During the download, the computer screen turned off. When I opened it again, I found the download was stuck at its previous progress. Even if I download another model, it behaves the same way.
<img width="566" alt="image" src="https://github.com/ollama/ollama/assets/18061322/ca8379f3-e898-477b-b681-3ca655bae6d3">
### OS
macOS
### GPU
AMD
### CPU
AMD
### Ollama version
0.1.27
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4259/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4259/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/575
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/575/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/575/comments
|
https://api.github.com/repos/ollama/ollama/issues/575/events
|
https://github.com/ollama/ollama/pull/575
| 1,909,295,183
|
PR_kwDOJ0Z1Ps5bAwZi
| 575
|
fix ipv6 parse ip
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-22T17:42:18
| 2023-09-22T18:47:12
| 2023-09-22T18:47:11
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/575",
"html_url": "https://github.com/ollama/ollama/pull/575",
"diff_url": "https://github.com/ollama/ollama/pull/575.diff",
"patch_url": "https://github.com/ollama/ollama/pull/575.patch",
"merged_at": "2023-09-22T18:47:11"
}
|
`net.ParseIP` for IPv6 doesn't expect `[]` so trim it
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/575/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/575/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1118
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1118/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1118/comments
|
https://api.github.com/repos/ollama/ollama/issues/1118/events
|
https://github.com/ollama/ollama/issues/1118
| 1,991,949,557
|
I_kwDOJ0Z1Ps52urz1
| 1,118
|
Verbose request logs for `ollama serve`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-11-14T04:05:32
| 2024-01-28T23:22:36
| 2024-01-28T23:22:36
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
It can be hard to debug what kind of requests `ollama serve` is receiving when using SDKs or other tooling with it. A way to log full requests would be helpful for this.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1118/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1118/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2947
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2947/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2947/comments
|
https://api.github.com/repos/ollama/ollama/issues/2947/events
|
https://github.com/ollama/ollama/issues/2947
| 2,170,956,332
|
I_kwDOJ0Z1Ps6BZios
| 2,947
|
I need to move Ollama onto a no-internet web server; how should I back up all the files in my Windows/Linux Ollama folder
|
{
"login": "sddzcuigc",
"id": 85976753,
"node_id": "MDQ6VXNlcjg1OTc2NzUz",
"avatar_url": "https://avatars.githubusercontent.com/u/85976753?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sddzcuigc",
"html_url": "https://github.com/sddzcuigc",
"followers_url": "https://api.github.com/users/sddzcuigc/followers",
"following_url": "https://api.github.com/users/sddzcuigc/following{/other_user}",
"gists_url": "https://api.github.com/users/sddzcuigc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sddzcuigc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sddzcuigc/subscriptions",
"organizations_url": "https://api.github.com/users/sddzcuigc/orgs",
"repos_url": "https://api.github.com/users/sddzcuigc/repos",
"events_url": "https://api.github.com/users/sddzcuigc/events{/privacy}",
"received_events_url": "https://api.github.com/users/sddzcuigc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-06T08:48:58
| 2024-03-12T01:21:14
| 2024-03-12T01:21:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I realise the Windows version of Ollama stores models in the .ollama folder, but that is still not a very convenient way to transport them to another computer.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2947/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2947/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3525
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3525/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3525/comments
|
https://api.github.com/repos/ollama/ollama/issues/3525/events
|
https://github.com/ollama/ollama/issues/3525
| 2,229,801,582
|
I_kwDOJ0Z1Ps6E6BJu
| 3,525
|
error: listen tcp: lookup tcp/\ollama: unknown port
|
{
"login": "bkdigitalworld",
"id": 166310020,
"node_id": "U_kgDOCemwhA",
"avatar_url": "https://avatars.githubusercontent.com/u/166310020?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bkdigitalworld",
"html_url": "https://github.com/bkdigitalworld",
"followers_url": "https://api.github.com/users/bkdigitalworld/followers",
"following_url": "https://api.github.com/users/bkdigitalworld/following{/other_user}",
"gists_url": "https://api.github.com/users/bkdigitalworld/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bkdigitalworld/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bkdigitalworld/subscriptions",
"organizations_url": "https://api.github.com/users/bkdigitalworld/orgs",
"repos_url": "https://api.github.com/users/bkdigitalworld/repos",
"events_url": "https://api.github.com/users/bkdigitalworld/events{/privacy}",
"received_events_url": "https://api.github.com/users/bkdigitalworld/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-04-07T14:42:57
| 2024-05-15T00:05:17
| 2024-05-15T00:05:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Could not open the interface of ollama:
time=2024-04-07T22:40:43.318+08:00 level=WARN source=server.go:113 msg="server crash 13 - exit code 1 - respawning"
time=2024-04-07T22:40:43.820+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-07T22:40:56.824+08:00 level=WARN source=server.go:113 msg="server crash 14 - exit code 1 - respawning"
time=2024-04-07T22:40:57.334+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-07T22:41:11.349+08:00 level=WARN source=server.go:113 msg="server crash 15 - exit code 1 - respawning"
time=2024-04-07T22:41:11.862+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-07T22:41:26.875+08:00 level=WARN source=server.go:113 msg="server crash 16 - exit code 1 - respawning"
time=2024-04-07T22:41:27.387+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-07T22:41:43.390+08:00 level=WARN source=server.go:113 msg="server crash 17 - exit code 1 - respawning"
time=2024-04-07T22:41:43.905+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-04-07T22:42:00.913+08:00 level=WARN source=server.go:113 msg="server crash 18 - exit code 1 - respawning"
time=2024-04-07T22:42:01.426+08:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
Error: listen tcp: lookup tcp/\Ollama: unknown port
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3525/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3525/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2475
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2475/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2475/comments
|
https://api.github.com/repos/ollama/ollama/issues/2475/events
|
https://github.com/ollama/ollama/issues/2475
| 2,132,503,645
|
I_kwDOJ0Z1Ps5_G2xd
| 2,475
|
Request to add leo-hessianai to ollama
|
{
"login": "arsenij-ust",
"id": 61419866,
"node_id": "MDQ6VXNlcjYxNDE5ODY2",
"avatar_url": "https://avatars.githubusercontent.com/u/61419866?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/arsenij-ust",
"html_url": "https://github.com/arsenij-ust",
"followers_url": "https://api.github.com/users/arsenij-ust/followers",
"following_url": "https://api.github.com/users/arsenij-ust/following{/other_user}",
"gists_url": "https://api.github.com/users/arsenij-ust/gists{/gist_id}",
"starred_url": "https://api.github.com/users/arsenij-ust/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/arsenij-ust/subscriptions",
"organizations_url": "https://api.github.com/users/arsenij-ust/orgs",
"repos_url": "https://api.github.com/users/arsenij-ust/repos",
"events_url": "https://api.github.com/users/arsenij-ust/events{/privacy}",
"received_events_url": "https://api.github.com/users/arsenij-ust/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396205,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2abQ",
"url": "https://api.github.com/repos/ollama/ollama/labels/help%20wanted",
"name": "help wanted",
"color": "008672",
"default": true,
"description": "Extra attention is needed"
}
] |
open
| false
| null |
[] | null | 2
| 2024-02-13T14:50:03
| 2024-12-31T22:47:50
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi guys,
I tried to use the leo-hessianai-7B model with Ollama. I used the GGUF file (Q4_K_M.gguf from https://huggingface.co/TheBloke/leo-hessianai-7B-GGUF/tree/main) and followed Ollama's import instructions (https://github.com/ollama/ollama/blob/main/docs/import.md). I can already generate answers with the model, but they are extremely wrong and hallucinated (you could say crazy). Unfortunately, I don't know what I'm doing wrong. I assume the parameters or the template in the Modelfile I created for Ollama are incorrect.
Hope you can help me out 🙂
I tried the following Modelfiles:
```
FROM ./leo-hessianai-7b.Q4_K_M.gguf
TEMPLATE """{{- if .System }}
<|im_start|>system {{ .System }}<|im_end|>
{{- end }}
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
SYSTEM """"""
PARAMETER stop <|im_start|>
PARAMETER stop <|im_end|>
```
```
FROM ./leo-hessianai-7b.Q4_K_M.gguf
TEMPLATE "[INST] {{ .Prompt }} [/INST]"
```
(The same problem occurred when I used the safetensors from this repo and used the ollama tools to convert and quantize the model.)
--> So it would be great if `leo-hessianai-7B`, `leo-hessianai-13B`, and `leo-hessianai-70B` could be added to ollama - find the models at https://huggingface.co/LeoLM
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2475/reactions",
"total_count": 6,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2475/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6223
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6223/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6223/comments
|
https://api.github.com/repos/ollama/ollama/issues/6223/events
|
https://github.com/ollama/ollama/pull/6223
| 2,452,425,684
|
PR_kwDOJ0Z1Ps53p9b9
| 6,223
|
feat: add gin BasicAuth using OLLAMA_BASIC_AUTH_KEY setup in env
|
{
"login": "kemalelmizan",
"id": 15223219,
"node_id": "MDQ6VXNlcjE1MjIzMjE5",
"avatar_url": "https://avatars.githubusercontent.com/u/15223219?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kemalelmizan",
"html_url": "https://github.com/kemalelmizan",
"followers_url": "https://api.github.com/users/kemalelmizan/followers",
"following_url": "https://api.github.com/users/kemalelmizan/following{/other_user}",
"gists_url": "https://api.github.com/users/kemalelmizan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kemalelmizan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kemalelmizan/subscriptions",
"organizations_url": "https://api.github.com/users/kemalelmizan/orgs",
"repos_url": "https://api.github.com/users/kemalelmizan/repos",
"events_url": "https://api.github.com/users/kemalelmizan/events{/privacy}",
"received_events_url": "https://api.github.com/users/kemalelmizan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-08-07T05:06:21
| 2024-11-25T00:03:03
| 2024-11-25T00:03:02
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6223",
"html_url": "https://github.com/ollama/ollama/pull/6223",
"diff_url": "https://github.com/ollama/ollama/pull/6223.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6223.patch",
"merged_at": null
}
|
This adds gin BasicAuth for a username:password pair configured via an environment variable. The Ollama server uses gin, and gin offers [basic auth middleware](https://gin-gonic.com/docs/examples/using-basicauth-middleware/). In this PR I use that middleware to validate requests against the env var `OLLAMA_BASIC_AUTH_KEY`. Inputs, comments and feedback are welcome 🙏
Requested in https://github.com/ollama/ollama/issues/1053
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6223/reactions",
"total_count": 17,
"+1": 11,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 6,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6223/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6790
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6790/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6790/comments
|
https://api.github.com/repos/ollama/ollama/issues/6790/events
|
https://github.com/ollama/ollama/issues/6790
| 2,524,252,141
|
I_kwDOJ0Z1Ps6WdQft
| 6,790
|
openai tools streaming support coming soon?
|
{
"login": "LuckLittleBoy",
"id": 17702771,
"node_id": "MDQ6VXNlcjE3NzAyNzcx",
"avatar_url": "https://avatars.githubusercontent.com/u/17702771?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LuckLittleBoy",
"html_url": "https://github.com/LuckLittleBoy",
"followers_url": "https://api.github.com/users/LuckLittleBoy/followers",
"following_url": "https://api.github.com/users/LuckLittleBoy/following{/other_user}",
"gists_url": "https://api.github.com/users/LuckLittleBoy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LuckLittleBoy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LuckLittleBoy/subscriptions",
"organizations_url": "https://api.github.com/users/LuckLittleBoy/orgs",
"repos_url": "https://api.github.com/users/LuckLittleBoy/repos",
"events_url": "https://api.github.com/users/LuckLittleBoy/events{/privacy}",
"received_events_url": "https://api.github.com/users/LuckLittleBoy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 13
| 2024-09-13T08:45:09
| 2024-09-19T06:36:27
| 2024-09-14T01:57:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
In which version is OpenAI tools streaming support planned?
When will it be supported?

|
{
"login": "LuckLittleBoy",
"id": 17702771,
"node_id": "MDQ6VXNlcjE3NzAyNzcx",
"avatar_url": "https://avatars.githubusercontent.com/u/17702771?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LuckLittleBoy",
"html_url": "https://github.com/LuckLittleBoy",
"followers_url": "https://api.github.com/users/LuckLittleBoy/followers",
"following_url": "https://api.github.com/users/LuckLittleBoy/following{/other_user}",
"gists_url": "https://api.github.com/users/LuckLittleBoy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LuckLittleBoy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LuckLittleBoy/subscriptions",
"organizations_url": "https://api.github.com/users/LuckLittleBoy/orgs",
"repos_url": "https://api.github.com/users/LuckLittleBoy/repos",
"events_url": "https://api.github.com/users/LuckLittleBoy/events{/privacy}",
"received_events_url": "https://api.github.com/users/LuckLittleBoy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6790/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6790/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5744
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5744/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5744/comments
|
https://api.github.com/repos/ollama/ollama/issues/5744/events
|
https://github.com/ollama/ollama/issues/5744
| 2,413,433,875
|
I_kwDOJ0Z1Ps6P2hQT
| 5,744
|
Model Cold Storage and user manual management possibility
|
{
"login": "nikhil-swamix",
"id": 54004431,
"node_id": "MDQ6VXNlcjU0MDA0NDMx",
"avatar_url": "https://avatars.githubusercontent.com/u/54004431?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nikhil-swamix",
"html_url": "https://github.com/nikhil-swamix",
"followers_url": "https://api.github.com/users/nikhil-swamix/followers",
"following_url": "https://api.github.com/users/nikhil-swamix/following{/other_user}",
"gists_url": "https://api.github.com/users/nikhil-swamix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nikhil-swamix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nikhil-swamix/subscriptions",
"organizations_url": "https://api.github.com/users/nikhil-swamix/orgs",
"repos_url": "https://api.github.com/users/nikhil-swamix/repos",
"events_url": "https://api.github.com/users/nikhil-swamix/events{/privacy}",
"received_events_url": "https://api.github.com/users/nikhil-swamix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 5
| 2024-07-17T12:00:57
| 2024-08-31T08:29:59
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |

# model management
It's nearly impossible to manage models manually, since Ollama stores them as hash-named files.
What I was trying to do was move some models to cold storage (an HDD) and keep others on SSD, but I couldn't find a way other than moving the whole repository, and I'm faced with this:

This kind of management takes a lot of time due to the sheer volume of data to transfer. And why does one model generate hundreds of blobs? Couldn't they be stored in one folder per model rather than scattered everywhere? My best bet is to check the date-modified times and work from there.
# proposed
` ollama archive <model_name> <Disk_or_path>`
`ollama pull <Disk_or_path>` will show options for which models to restore to the cache.
# urgent request
# observations
Some users reported that `ollama pull` takes a long time (#2850, #5361, etc.). I suspect SSD behavior around reserving large amounts of space: for the first few seconds Task Manager showed 1 GB/s, then it fell to 200 MB/s, then a pathetic 5 MB/s. It could be a write-protection mechanism. Maybe chunked download and merge, or a Hugging Face-style loader with `_part001`, `_part002` files for loading layers?
@bmizerany @drnic @anaisbetts @sqs @lstep
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5744/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5744/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5536
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5536/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5536/comments
|
https://api.github.com/repos/ollama/ollama/issues/5536/events
|
https://github.com/ollama/ollama/issues/5536
| 2,394,280,218
|
I_kwDOJ0Z1Ps6OtdEa
| 5,536
|
gemma2 27b is too slow
|
{
"login": "codeMonkey-shin",
"id": 80636401,
"node_id": "MDQ6VXNlcjgwNjM2NDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/80636401?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codeMonkey-shin",
"html_url": "https://github.com/codeMonkey-shin",
"followers_url": "https://api.github.com/users/codeMonkey-shin/followers",
"following_url": "https://api.github.com/users/codeMonkey-shin/following{/other_user}",
"gists_url": "https://api.github.com/users/codeMonkey-shin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codeMonkey-shin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codeMonkey-shin/subscriptions",
"organizations_url": "https://api.github.com/users/codeMonkey-shin/orgs",
"repos_url": "https://api.github.com/users/codeMonkey-shin/repos",
"events_url": "https://api.github.com/users/codeMonkey-shin/events{/privacy}",
"received_events_url": "https://api.github.com/users/codeMonkey-shin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5808482718,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWjZpng",
"url": "https://api.github.com/repos/ollama/ollama/labels/performance",
"name": "performance",
"color": "A5B5C6",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 4
| 2024-07-07T23:53:42
| 2024-10-16T16:18:58
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Compared to 9b, 27b is ridiculously slow. Is it because of the structure?
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.1.49 Pre-release
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5536/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5536/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/474
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/474/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/474/comments
|
https://api.github.com/repos/ollama/ollama/issues/474/events
|
https://github.com/ollama/ollama/pull/474
| 1,882,978,743
|
PR_kwDOJ0Z1Ps5ZoL22
| 474
|
add show command
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-06T01:13:42
| 2023-09-06T18:04:18
| 2023-09-06T18:04:17
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/474",
"html_url": "https://github.com/ollama/ollama/pull/474",
"diff_url": "https://github.com/ollama/ollama/pull/474.diff",
"patch_url": "https://github.com/ollama/ollama/pull/474.patch",
"merged_at": "2023-09-06T18:04:17"
}
|
This change adds the ability to inspect various parts of a given model. It adds this functionality both from the CLI (via the `ollama show` command) and from the REPL (through the various `/show ...` commands).
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/474/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/474/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4469
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4469/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4469/comments
|
https://api.github.com/repos/ollama/ollama/issues/4469/events
|
https://github.com/ollama/ollama/issues/4469
| 2,299,562,084
|
I_kwDOJ0Z1Ps6JEIhk
| 4,469
|
Ollama memory consumption
|
{
"login": "hugefrog",
"id": 83398604,
"node_id": "MDQ6VXNlcjgzMzk4NjA0",
"avatar_url": "https://avatars.githubusercontent.com/u/83398604?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hugefrog",
"html_url": "https://github.com/hugefrog",
"followers_url": "https://api.github.com/users/hugefrog/followers",
"following_url": "https://api.github.com/users/hugefrog/following{/other_user}",
"gists_url": "https://api.github.com/users/hugefrog/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hugefrog/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hugefrog/subscriptions",
"organizations_url": "https://api.github.com/users/hugefrog/orgs",
"repos_url": "https://api.github.com/users/hugefrog/repos",
"events_url": "https://api.github.com/users/hugefrog/events{/privacy}",
"received_events_url": "https://api.github.com/users/hugefrog/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-05-16T07:22:16
| 2024-07-25T22:53:45
| 2024-07-25T22:53:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Why does Ollama consume so much memory? With a 3090 graphics card and 24GB of VRAM, after loading a yi-34b-4bit model of around 20GB, both system RAM and VRAM consumption increase by approximately 20GB simultaneously.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.38
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4469/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4469/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1057
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1057/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1057/comments
|
https://api.github.com/repos/ollama/ollama/issues/1057/events
|
https://github.com/ollama/ollama/issues/1057
| 1,985,785,851
|
I_kwDOJ0Z1Ps52XK_7
| 1,057
|
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": EOF
|
{
"login": "fabianslife",
"id": 49265757,
"node_id": "MDQ6VXNlcjQ5MjY1NzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/49265757?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fabianslife",
"html_url": "https://github.com/fabianslife",
"followers_url": "https://api.github.com/users/fabianslife/followers",
"following_url": "https://api.github.com/users/fabianslife/following{/other_user}",
"gists_url": "https://api.github.com/users/fabianslife/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fabianslife/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fabianslife/subscriptions",
"organizations_url": "https://api.github.com/users/fabianslife/orgs",
"repos_url": "https://api.github.com/users/fabianslife/repos",
"events_url": "https://api.github.com/users/fabianslife/events{/privacy}",
"received_events_url": "https://api.github.com/users/fabianslife/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-09T14:48:39
| 2023-12-24T21:52:43
| 2023-12-24T21:52:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am running Ubuntu 20.04 and wanted to try out Ollama, but the install one-liner does not seem to work fully:
When installing Ollama with `curl https://ollama.ai/install.sh | sh`, everything is OK and the installation runs fine:
```
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 7650 0 7650 0 0 43220 0 --:--:-- --:--:-- --:--:-- 43465
>>> Downloading ollama...
######################################################################## 100,0%######################################################################### 100,0%
>>> Installing ollama to /usr/local/bin...
[sudo] password for fabian_iki:
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> NVIDIA GPU installed.
```
But when I then try to pull a model with `ollama pull llama2`, I get the error:
```
pulling manifest
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest": EOF
```
I tried it from the system's terminal.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1057/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1057/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/528
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/528/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/528/comments
|
https://api.github.com/repos/ollama/ollama/issues/528/events
|
https://github.com/ollama/ollama/issues/528
| 1,896,514,625
|
I_kwDOJ0Z1Ps5xCoRB
| 528
|
416 response when pulling a model
|
{
"login": "codazoda",
"id": 527246,
"node_id": "MDQ6VXNlcjUyNzI0Ng==",
"avatar_url": "https://avatars.githubusercontent.com/u/527246?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codazoda",
"html_url": "https://github.com/codazoda",
"followers_url": "https://api.github.com/users/codazoda/followers",
"following_url": "https://api.github.com/users/codazoda/following{/other_user}",
"gists_url": "https://api.github.com/users/codazoda/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codazoda/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codazoda/subscriptions",
"organizations_url": "https://api.github.com/users/codazoda/orgs",
"repos_url": "https://api.github.com/users/codazoda/repos",
"events_url": "https://api.github.com/users/codazoda/events{/privacy}",
"received_events_url": "https://api.github.com/users/codazoda/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-09-14T12:52:46
| 2023-09-30T05:07:52
| 2023-09-30T05:07:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm getting the following error when I try to pull the llama2-uncensored model.
```
$ollama pull llama2-uncensored
pulling manifest
Error: download failed: on download registry responded with code 416:
```
This might be a registry problem or a problem with the model I'm pulling. I'm not really sure of the appropriate place to report the error. It's also entirely possible this has something to do with my internet connection, as I'm currently traveling.
Pulling llama2 works fine. Pulling llama2-uncensored:7b works fine (which should be the same thing, I think).
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/528/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/528/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/584
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/584/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/584/comments
|
https://api.github.com/repos/ollama/ollama/issues/584/events
|
https://github.com/ollama/ollama/issues/584
| 1,910,303,715
|
I_kwDOJ0Z1Ps5x3Ovj
| 584
|
Adhere to the MacOS File System Programming Guide
|
{
"login": "offsetcyan",
"id": 49906709,
"node_id": "MDQ6VXNlcjQ5OTA2NzA5",
"avatar_url": "https://avatars.githubusercontent.com/u/49906709?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/offsetcyan",
"html_url": "https://github.com/offsetcyan",
"followers_url": "https://api.github.com/users/offsetcyan/followers",
"following_url": "https://api.github.com/users/offsetcyan/following{/other_user}",
"gists_url": "https://api.github.com/users/offsetcyan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/offsetcyan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/offsetcyan/subscriptions",
"organizations_url": "https://api.github.com/users/offsetcyan/orgs",
"repos_url": "https://api.github.com/users/offsetcyan/repos",
"events_url": "https://api.github.com/users/offsetcyan/events{/privacy}",
"received_events_url": "https://api.github.com/users/offsetcyan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6677279472,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjf8y8A",
"url": "https://api.github.com/repos/ollama/ollama/labels/macos",
"name": "macos",
"color": "E2DBC0",
"default": false,
"description": ""
}
] |
open
| false
| null |
[] | null | 4
| 2023-09-24T16:53:40
| 2024-03-11T19:30:47
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The user's home directory is not the place to dump program data, and handling it this way would be inappropriate for future cross-platform compatibility. Currently Ollama stores user data in `~/.ollama`; however, Apple has a specification for where to place files of various types ([link](https://developer.apple.com/library/archive/documentation/FileManagement/Conceptual/FileSystemProgrammingGuide/FileSystemOverview/FileSystemOverview.html#//apple_ref/doc/uid/TP40010672-CH2-SW1)). In Ollama's case, `~/Library/Application Support/Ollama` seems appropriate.
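For reference, a minimal sketch of the two locations in question. The macOS path follows Apple's guide; this is purely illustrative and not Ollama's actual lookup logic:

```shell
# Current location vs. the one suggested by Apple's File System
# Programming Guide (illustrative; not Ollama's actual logic).
legacy_dir="$HOME/.ollama"
macos_dir="$HOME/Library/Application Support/Ollama"

# An app following the guide would pick the platform-appropriate path:
case "$(uname -s)" in
  Darwin) data_dir="$macos_dir" ;;
  *)      data_dir="$legacy_dir" ;;
esac
echo "$data_dir"
```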
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/584/reactions",
"total_count": 9,
"+1": 9,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/584/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2638
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2638/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2638/comments
|
https://api.github.com/repos/ollama/ollama/issues/2638/events
|
https://github.com/ollama/ollama/issues/2638
| 2,146,978,852
|
I_kwDOJ0Z1Ps5_-Ewk
| 2,638
|
on windows skipping models
|
{
"login": "stream74",
"id": 7672121,
"node_id": "MDQ6VXNlcjc2NzIxMjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/7672121?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/stream74",
"html_url": "https://github.com/stream74",
"followers_url": "https://api.github.com/users/stream74/followers",
"following_url": "https://api.github.com/users/stream74/following{/other_user}",
"gists_url": "https://api.github.com/users/stream74/gists{/gist_id}",
"starred_url": "https://api.github.com/users/stream74/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stream74/subscriptions",
"organizations_url": "https://api.github.com/users/stream74/orgs",
"repos_url": "https://api.github.com/users/stream74/repos",
"events_url": "https://api.github.com/users/stream74/events{/privacy}",
"received_events_url": "https://api.github.com/users/stream74/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-02-21T15:03:54
| 2024-03-12T01:59:09
| 2024-03-12T01:59:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Sorry for my bad English.
I set an environment variable on Windows pointing to my models folder.
If I pull new models, they go to the folder I set,
but I already have a lot of models and Ollama can't see them when I run `ollama list`.
The server log indicates:
[GIN] 2024/02/21 - 15:51:59 | 200 | 6.082ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/02/21 - 15:56:04 | 200 | 0s | 127.0.0.1 | HEAD "/"
time=2024-02-21T15:56:04.036+01:00 level=INFO source=routes.go:809 msg="skipping file: hub/chirag/chloe:latest"
time=2024-02-21T15:56:04.036+01:00 level=INFO source=routes.go:809 msg="skipping file: hub/chmurli/sarah-lovely-caring-girlfriend:latest"
time=2024-02-21T15:56:04.036+01:00 level=INFO source=routes.go:809 msg="skipping file: hub/languages/french:latest"
time=2024-02-21T15:56:04.037+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/fixt/home-3b-v2:q8_0"
time=2024-02-21T15:56:04.037+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/assistante-grossesse:latest"
time=2024-02-21T15:56:04.037+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/assistante-photographe:latest"
time=2024-02-21T15:56:04.038+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/chloe-french:latest"
time=2024-02-21T15:56:04.038+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/dolphin-mistral:latest"
time=2024-02-21T15:56:04.038+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/french-home:latest"
time=2024-02-21T15:56:04.038+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/jarvis-home-3b:latest"
time=2024-02-21T15:56:04.039+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/mistral:latest"
time=2024-02-21T15:56:04.039+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/moly:latest"
time=2024-02-21T15:56:04.039+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/ollamacreate:latest"
time=2024-02-21T15:56:04.040+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/openhermes2.5-mistral:latest"
time=2024-02-21T15:56:04.040+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/samantha-mistral:7b-v1.2-text-q6_K"
time=2024-02-21T15:56:04.040+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/samantha-mistral:latest"
time=2024-02-21T15:56:04.040+01:00 level=INFO source=routes.go:809 msg="skipping file: registry.ollama.ai/library/wizard-vicuna-uncensored:13b"
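For context, a minimal sketch of overriding the models directory via the standard `OLLAMA_MODELS` variable and the manifest layout Ollama expects underneath it. The paths are illustrative (on Windows you would use `setx OLLAMA_MODELS ...`; the POSIX form is shown so the snippet runs anywhere):

```shell
# Point Ollama at a custom models directory (illustrative path).
export OLLAMA_MODELS="$HOME/ollama-models"

# Ollama lists only models whose manifests sit in the layout it expects:
#   $OLLAMA_MODELS/manifests/<registry>/<namespace>/<model>/<tag>
# Pre-existing models copied under a different hierarchy get "skipping file".
mkdir -p "$OLLAMA_MODELS/manifests/registry.ollama.ai/library"
echo "$OLLAMA_MODELS"
```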
|
{
"login": "hoyyeva",
"id": 63033505,
"node_id": "MDQ6VXNlcjYzMDMzNTA1",
"avatar_url": "https://avatars.githubusercontent.com/u/63033505?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hoyyeva",
"html_url": "https://github.com/hoyyeva",
"followers_url": "https://api.github.com/users/hoyyeva/followers",
"following_url": "https://api.github.com/users/hoyyeva/following{/other_user}",
"gists_url": "https://api.github.com/users/hoyyeva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hoyyeva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hoyyeva/subscriptions",
"organizations_url": "https://api.github.com/users/hoyyeva/orgs",
"repos_url": "https://api.github.com/users/hoyyeva/repos",
"events_url": "https://api.github.com/users/hoyyeva/events{/privacy}",
"received_events_url": "https://api.github.com/users/hoyyeva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2638/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2638/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7346
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7346/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7346/comments
|
https://api.github.com/repos/ollama/ollama/issues/7346/events
|
https://github.com/ollama/ollama/issues/7346
| 2,612,137,611
|
I_kwDOJ0Z1Ps6bsg6L
| 7,346
|
Ollama does not run on GPU at 0.4.0-rc5-rocm version
|
{
"login": "chiehpower",
"id": 32332200,
"node_id": "MDQ6VXNlcjMyMzMyMjAw",
"avatar_url": "https://avatars.githubusercontent.com/u/32332200?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chiehpower",
"html_url": "https://github.com/chiehpower",
"followers_url": "https://api.github.com/users/chiehpower/followers",
"following_url": "https://api.github.com/users/chiehpower/following{/other_user}",
"gists_url": "https://api.github.com/users/chiehpower/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chiehpower/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chiehpower/subscriptions",
"organizations_url": "https://api.github.com/users/chiehpower/orgs",
"repos_url": "https://api.github.com/users/chiehpower/repos",
"events_url": "https://api.github.com/users/chiehpower/events{/privacy}",
"received_events_url": "https://api.github.com/users/chiehpower/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-10-24T17:04:49
| 2024-10-26T16:02:44
| 2024-10-25T21:44:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi all,
I was testing a very new version (`0.4.0-rc5-rocm`), with the server deployed in a Docker container.
```
docker run -itd --name=ollama --gpus=all --shm-size=100GB \
-v ollama:/root/.ollama -p 11434:11434 \
ollama/ollama:0.4.0-rc5-rocm
```
The client sent this request:
```
curl http://10.1.2.10:11434/api/generate -d '{
"model": "llama3.1",
"prompt": "Why is the sky blue?",
"stream": false,
"options": {
"top_k": 20,
"temperature": 0.8
}
}'
```
I monitored GPU usage with `nvidia-smi` and the GPU was not being used.
However, if I changed the version to `0.3.14` (changed the Docker image), it worked fine.
### Spec
- GPU: A100
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.4.0-rc5-rocm
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7346/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7346/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5067
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5067/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5067/comments
|
https://api.github.com/repos/ollama/ollama/issues/5067/events
|
https://github.com/ollama/ollama/pull/5067
| 2,355,062,343
|
PR_kwDOJ0Z1Ps5ykqTH
| 5,067
|
Add LoongArch64 ISA Support
|
{
"login": "HougeLangley",
"id": 1161594,
"node_id": "MDQ6VXNlcjExNjE1OTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HougeLangley",
"html_url": "https://github.com/HougeLangley",
"followers_url": "https://api.github.com/users/HougeLangley/followers",
"following_url": "https://api.github.com/users/HougeLangley/following{/other_user}",
"gists_url": "https://api.github.com/users/HougeLangley/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HougeLangley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HougeLangley/subscriptions",
"organizations_url": "https://api.github.com/users/HougeLangley/orgs",
"repos_url": "https://api.github.com/users/HougeLangley/repos",
"events_url": "https://api.github.com/users/HougeLangley/events{/privacy}",
"received_events_url": "https://api.github.com/users/HougeLangley/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-06-15T17:37:04
| 2024-08-04T23:39:01
| 2024-08-04T23:39:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5067",
"html_url": "https://github.com/ollama/ollama/pull/5067",
"diff_url": "https://github.com/ollama/ollama/pull/5067.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5067.patch",
"merged_at": null
}
|
1. Fixed `go build .` failing on LoongArch: in go.mod, replace github.com/chewxy/math32 v1.10.1 with github.com/chewxy/math32 v1.10.2-0.20240509203351 (fixes https://github.com/chewxy/math32/issues/23).
2. Updated go.sum accordingly.
3. Added loong64 support to llm.go.
4. Added 64-bit LoongArch support to gen_common.sh.
5. Added loongarch64 LASX/LSX ISA support to gen_linux.sh.
6. Fixes "Please support LoongArch ISA" (https://github.com/ollama/ollama/issues/4552).
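
Item 1 above corresponds to a go.mod replace directive along these lines (a sketch; the pseudo-version is copied as written in this description and may additionally need the trailing commit-hash component that `go mod edit -replace` would generate):

```
replace github.com/chewxy/math32 v1.10.1 => github.com/chewxy/math32 v1.10.2-0.20240509203351
```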
|
{
"login": "HougeLangley",
"id": 1161594,
"node_id": "MDQ6VXNlcjExNjE1OTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1161594?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/HougeLangley",
"html_url": "https://github.com/HougeLangley",
"followers_url": "https://api.github.com/users/HougeLangley/followers",
"following_url": "https://api.github.com/users/HougeLangley/following{/other_user}",
"gists_url": "https://api.github.com/users/HougeLangley/gists{/gist_id}",
"starred_url": "https://api.github.com/users/HougeLangley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HougeLangley/subscriptions",
"organizations_url": "https://api.github.com/users/HougeLangley/orgs",
"repos_url": "https://api.github.com/users/HougeLangley/repos",
"events_url": "https://api.github.com/users/HougeLangley/events{/privacy}",
"received_events_url": "https://api.github.com/users/HougeLangley/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5067/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5067/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6733
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6733/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6733/comments
|
https://api.github.com/repos/ollama/ollama/issues/6733/events
|
https://github.com/ollama/ollama/issues/6733
| 2,517,227,844
|
I_kwDOJ0Z1Ps6WCdlE
| 6,733
|
curl
|
{
"login": "ayttop",
"id": 178673810,
"node_id": "U_kgDOCqZYkg",
"avatar_url": "https://avatars.githubusercontent.com/u/178673810?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ayttop",
"html_url": "https://github.com/ayttop",
"followers_url": "https://api.github.com/users/ayttop/followers",
"following_url": "https://api.github.com/users/ayttop/following{/other_user}",
"gists_url": "https://api.github.com/users/ayttop/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ayttop/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ayttop/subscriptions",
"organizations_url": "https://api.github.com/users/ayttop/orgs",
"repos_url": "https://api.github.com/users/ayttop/repos",
"events_url": "https://api.github.com/users/ayttop/events{/privacy}",
"received_events_url": "https://api.github.com/users/ayttop/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-10T18:08:22
| 2024-09-10T18:13:43
| 2024-09-10T18:13:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
How do I run the following command from cmd in the same way?

```
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "Why is the sky blue?"
}'
```
### OS
Windows
### GPU
Intel
### CPU
Intel
### Ollama version
last
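
For reference, cmd.exe does not treat single quotes as string delimiters, so the payload has to be wrapped in double quotes with the inner quotes backslash-escaped. A minimal sketch (the curl invocation assumes Ollama is listening on its default port 11434):

```shell
# Windows cmd.exe equivalent of the bash command above (single quotes do not
# work in cmd, so the JSON payload uses backslash-escaped double quotes):
#
#   curl http://localhost:11434/api/generate -d "{\"model\": \"llama3.1\", \"prompt\": \"Why is the sky blue?\"}"
#
# The \" escaping happens to behave the same way in POSIX shell, so we can
# verify here that it produces the intended JSON payload:
payload="{\"model\": \"llama3.1\", \"prompt\": \"Why is the sky blue?\"}"
echo "$payload"
```

Alternatively, putting the JSON in a file and using `-d @payload.json` sidesteps cmd quoting entirely.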
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6733/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6733/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3917
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3917/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3917/comments
|
https://api.github.com/repos/ollama/ollama/issues/3917/events
|
https://github.com/ollama/ollama/issues/3917
| 2,264,178,975
|
I_kwDOJ0Z1Ps6G9KEf
| 3,917
|
I have noticed something extremely strange about what ollama does with Phi-3 models.
|
{
"login": "phalexo",
"id": 4603365,
"node_id": "MDQ6VXNlcjQ2MDMzNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4603365?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phalexo",
"html_url": "https://github.com/phalexo",
"followers_url": "https://api.github.com/users/phalexo/followers",
"following_url": "https://api.github.com/users/phalexo/following{/other_user}",
"gists_url": "https://api.github.com/users/phalexo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/phalexo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/phalexo/subscriptions",
"organizations_url": "https://api.github.com/users/phalexo/orgs",
"repos_url": "https://api.github.com/users/phalexo/repos",
"events_url": "https://api.github.com/users/phalexo/events{/privacy}",
"received_events_url": "https://api.github.com/users/phalexo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-04-25T17:56:41
| 2024-06-02T00:06:55
| 2024-06-02T00:06:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
(Pythogora) developer@ai:~/PROJECTS/gpt-pilot/pilot$ ~/ollama/ollama list
NAME ID SIZE MODIFIED
Meta-Llama-3-70B-Instruct-.Q5_K_M:latest 746bce3a52ed 49 GB 2 days ago
hermes-2-Pro-Mistral-7B.Q8_0:latest 86624d435749 7.7 GB 2 weeks ago
llama3:latest 71a106a91016 4.7 GB 5 days ago
mistral-7b-instruct-v0.2.Q6_K:latest 37b7edd947a2 5.9 GB 4 months ago
mixtral-8x7b-instruct-v0.1.Q6_K:latest 611ec22ab3e7 38 GB 4 weeks ago
notus-7b-v1.Q6_K:latest f04807d7e58e 5.9 GB 4 months ago
phi-3-mini-128k-instruct.Q6_K:latest 3a035ccf60bd 3.1 GB 44 minutes ago
phi-3-mini-4k-instruct.16b:latest 4eb8627d3836 7.6 GB 24 hours ago
phind-codellama-34b-v2.Q6_K:latest e7f0d1897af2 27 GB 4 weeks ago
stable-code-instruct-3b-Q8_0:latest 390e72938bcf 3.0 GB 4 weeks ago
```
**_One phi-3 model is 3.1GB and the other is 7.6GB. It appears to load multiple copies into GPUs.
If I run an interactive session, only one appears to respond, but with gpt-pilot, they seem to start talking all at the same time, giving me gibberish._**
```bash
Thu Apr 25 14:06:02 2024
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 520.61.05 Driver Version: 520.61.05 CUDA Version: 11.8 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 NVIDIA GeForce ... On | 00000000:04:00.0 Off | N/A |
| 22% 14C P8 17W / 275W | 11927MiB / 12288MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 1 NVIDIA GeForce ... On | 00000000:05:00.0 Off | N/A |
| 22% 16C P8 19W / 275W | 11136MiB / 12288MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 2 NVIDIA GeForce ... On | 00000000:08:00.0 Off | N/A |
| 22% 15C P8 18W / 275W | 11136MiB / 12288MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 3 NVIDIA GeForce ... On | 00000000:09:00.0 Off | N/A |
| 22% 14C P8 19W / 275W | 10421MiB / 12288MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 4 NVIDIA GeForce ... On | 00000000:85:00.0 Off | N/A |
| 0% 22C P8 12W / 177W | 0MiB / 4096MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| 0 N/A N/A 3651168 C ...a_v11/ollama_llama_server 11922MiB |
| 1 N/A N/A 3651168 C ...a_v11/ollama_llama_server 11131MiB |
| 2 N/A N/A 3651168 C ...a_v11/ollama_llama_server 11131MiB |
| 3 N/A N/A 3651168 C ...a_v11/ollama_llama_server 10416MiB |
+-----------------------------------------------------------------------------+
```
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
built from github source this morning.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3917/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3917/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2294
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2294/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2294/comments
|
https://api.github.com/repos/ollama/ollama/issues/2294/events
|
https://github.com/ollama/ollama/pull/2294
| 2,111,131,194
|
PR_kwDOJ0Z1Ps5loFSM
| 2,294
|
update slog handler options
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-31T23:00:55
| 2024-01-31T23:29:12
| 2024-01-31T23:29:11
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2294",
"html_url": "https://github.com/ollama/ollama/pull/2294",
"diff_url": "https://github.com/ollama/ollama/pull/2294.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2294.patch",
"merged_at": "2024-01-31T23:29:11"
}
|
- consistent format by using text handler for debug and non-debug
- truncate source file to just the file name
sample outputs:
```
time=2024-01-31T15:01:02.632-08:00 level=INFO source=routes.go:983 msg="Listening on 127.0.0.1:11434 (version 0.0.0)"
time=2024-01-31T15:01:02.632-08:00 level=INFO source=payload_common.go:106 msg="Extracting dynamic libraries..."
time=2024-01-31T15:01:02.653-08:00 level=INFO source=payload_common.go:145 msg="Dynamic LLM libraries [metal]"
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2294/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2294/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2284
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2284/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2284/comments
|
https://api.github.com/repos/ollama/ollama/issues/2284/events
|
https://github.com/ollama/ollama/pull/2284
| 2,109,086,249
|
PR_kwDOJ0Z1Ps5lhHXn
| 2,284
|
remove unnecessary parse raw
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-31T01:02:05
| 2024-01-31T17:40:49
| 2024-01-31T17:40:48
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2284",
"html_url": "https://github.com/ollama/ollama/pull/2284",
"diff_url": "https://github.com/ollama/ollama/pull/2284.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2284.patch",
"merged_at": "2024-01-31T17:40:48"
}
|
There's no point in parsing the raw private key when all it's doing is creating an SSH key.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2284/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2284/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6416
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6416/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6416/comments
|
https://api.github.com/repos/ollama/ollama/issues/6416/events
|
https://github.com/ollama/ollama/issues/6416
| 2,472,843,119
|
I_kwDOJ0Z1Ps6TZJdv
| 6,416
|
Computer crashes after switching several Ollama models in a relatively short amount of time
|
{
"login": "elsatch",
"id": 653433,
"node_id": "MDQ6VXNlcjY1MzQzMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/653433?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/elsatch",
"html_url": "https://github.com/elsatch",
"followers_url": "https://api.github.com/users/elsatch/followers",
"following_url": "https://api.github.com/users/elsatch/following{/other_user}",
"gists_url": "https://api.github.com/users/elsatch/gists{/gist_id}",
"starred_url": "https://api.github.com/users/elsatch/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/elsatch/subscriptions",
"organizations_url": "https://api.github.com/users/elsatch/orgs",
"repos_url": "https://api.github.com/users/elsatch/repos",
"events_url": "https://api.github.com/users/elsatch/events{/privacy}",
"received_events_url": "https://api.github.com/users/elsatch/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-08-19T09:03:04
| 2024-11-05T23:22:35
| 2024-11-05T23:22:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I love to run tests to compare different model outputs. To do so, I've used tools like promptfoo or langfuse (over Haystack or Langchain). In these tools, you set a list of models and then the program calls Ollama to load the models one after the other. I am using a Linux computer with Ubuntu 22.04 and an RTX3090.
After the program loads some of the big models (fp16, command-r or gemma2:27B), the computer freezes while loading the next model. I lose all access to the machine over SSH and it doesn't respond anymore. To recover I have to hard-reset it by pressing the physical button.
This is a sample run when it crashes loading mistral-nemo:12b (but it has happened before with other models too, so this might not be model specific):

```
Cache is disabled.
Providers are running in serial with user input.
Running 1 evaluations for provider ollama:chat:command-r:latest with concurrency=4...
[████████████████████████████████████████] 100% | ETA: 0s | 1/1 | ollama:chat:command-r:latest "Eres un as" lista_ingr
Ready to continue to the next provider? (Y/n)
Running 1 evaluations for provider ollama:chat:gemma2:27b-instruct-q5_K_M with concurrency=4...
[████████████████████████████████████████] 100% | ETA: 0s | 1/1 | ollama:chat:gemma2:27b-instruct-q5_K_M "Eres un as" lista_ingr
Ready to continue to the next provider? (Y/n)
Running 1 evaluations for provider ollama:chat:gemma2:2b-instruct-q8_0 with concurrency=4...
[████████████████████████████████████████] 100% | ETA: 0s | 1/1 | ollama:chat:gemma2:2b-instruct-q8_0 "Eres un as" lista_ingr
Ready to continue to the next provider? (Y/n)
Running 1 evaluations for provider ollama:chat:gemma2:9b with concurrency=4...
[████████████████████████████████████████] 100% | ETA: 0s | 1/1 | ollama:chat:gemma2:9b "Eres un as" lista_ingr
Ready to continue to the next provider? (Y/n)
Running 1 evaluations for provider ollama:chat:llama3:8b-instruct-fp16 with concurrency=4...
[████████████████████████████████████████] 100% | ETA: 0s | 1/1 | ollama:chat:llama3:8b-instruct-fp16 "Eres un as" lista_ingr
Ready to continue to the next provider? (Y/n)
Running 1 evaluations for provider ollama:chat:mayflowergmbh/occiglot-7b-eu5-instruct:latest with concurrency=4...
[████████████████████████████████████████] 100% | ETA: 0s | 1/1 | ollama:chat:mayflowergmbh/occiglot-7b-eu5-instruct:latest "Eres un as" lista_ingr
Ready to continue to the next provider? (Y/n)
Running 1 evaluations for provider ollama:chat:mistral-nemo:12b-instruct-2407-q8_0 with concurrency=4...
[░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░] 0% | ETA: 0s | 0/1 | ""
```

Mistral-Nemo never loads successfully and the computer is not responsive. The local console is also frozen (I have Psensor displaying realtime metrics and... they are quite normal and frozen too).
Using other backends besides Ollama has not resulted in such crashes, but they are less convenient to use, so I'd love to debug this and find out what's making the OS crash.
Checking the logs I see no message related to the crash or the load of the new model:
```
NVRM: loading NVIDIA UNIX x86_64 Kernel Module 550.90.07 Fri May 31 09:35:42 UTC 2024
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: model ftype = Q4_0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: model params = 7.24 B
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: model size = 3.83 GiB (4.54 BPW)
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: general.name = mayflowergmbh
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: BOS token = 1 '<s>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: EOS token = 2 '</s>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: UNK token = 0 '<unk>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: PAD token = 2 '</s>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: LF token = 13 '<0x0A>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: EOT token = 32001 '<|im_end|>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: max token length = 48
ago 19 10:47:19 ananke ollama[2292]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ago 19 10:47:19 ananke ollama[2292]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ago 19 10:47:19 ananke ollama[2292]: ggml_cuda_init: found 1 CUDA devices:
ago 19 10:47:19 ananke ollama[2292]: Device 0: NVIDIA GeForce RTX 3090, compute capability 8.6, VMM: yes
ago 19 10:47:20 ananke ollama[2292]: llm_load_tensors: ggml ctx size = 0.27 MiB
ago 19 10:47:20 ananke ollama[2292]: time=2024-08-19T10:47:20.104+02:00 level=INFO source=server.go:627 msg="waiting for server to become available" status="llm server loading model"
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: offloading 32 repeating layers to GPU
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: offloading non-repeating layers to GPU
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: offloaded 33/33 layers to GPU
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: CPU buffer size = 70.32 MiB
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: CUDA0 buffer size = 3847.56 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: n_ctx = 65536
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: n_batch = 512
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: n_ubatch = 512
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: flash_attn = 0
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: freq_base = 10000.0
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: freq_scale = 1
ago 19 10:47:23 ananke ollama[2292]: llama_kv_cache_init: CUDA0 KV buffer size = 8192.00 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: KV self size = 8192.00 MiB, K (f16): 4096.00 MiB, V (f16): 4096.00 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: CUDA_Host output buffer size = 0.55 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: CUDA0 compute buffer size = 4256.00 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: CUDA_Host compute buffer size = 136.01 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: graph nodes = 1030
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: graph splits = 2
ago 19 10:47:23 ananke ollama[8085]: INFO [main] model loaded | tid="130282049744896" timestamp=1724057243
ago 19 10:47:23 ananke ollama[2292]: time=2024-08-19T10:47:23.622+02:00 level=INFO source=server.go:632 msg="llama runner started in 3.77 seconds"
ago 19 10:47:26 ananke ollama[2292]: [GIN] 2024/08/19 - 10:47:26 | 200 | 6.932276921s | 127.0.0.1 | POST "/api/chat"
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: f_logit_scale = 0.0e+00
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: n_ff = 14336
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: n_expert = 0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: n_expert_used = 0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: causal attn = 1
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: pooling type = 0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: rope type = 0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: rope scaling = linear
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: freq_base_train = 10000.0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: freq_scale_train = 1
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: n_ctx_orig_yarn = 32768
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: rope_finetuned = unknown
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: ssm_d_conv = 0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: ssm_d_inner = 0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: ssm_d_state = 0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: ssm_dt_rank = 0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: model type = 7B
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: model ftype = Q4_0
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: model params = 7.24 B
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: model size = 3.83 GiB (4.54 BPW)
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: general.name = mayflowergmbh
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: BOS token = 1 '<s>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: EOS token = 2 '</s>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: UNK token = 0 '<unk>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: PAD token = 2 '</s>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: LF token = 13 '<0x0A>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: EOT token = 32001 '<|im_end|>'
ago 19 10:47:19 ananke ollama[2292]: llm_load_print_meta: max token length = 48
ago 19 10:47:19 ananke ollama[2292]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ago 19 10:47:19 ananke ollama[2292]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ago 19 10:47:19 ananke ollama[2292]: ggml_cuda_init: found 1 CUDA devices:
ago 19 10:47:19 ananke ollama[2292]: Device 0: NVIDIA GeForce RTX 3090, compute capability 8.6, VMM: yes
ago 19 10:47:20 ananke ollama[2292]: llm_load_tensors: ggml ctx size = 0.27 MiB
ago 19 10:47:20 ananke ollama[2292]: time=2024-08-19T10:47:20.104+02:00 level=INFO source=server.go:627 msg="waiting for server to become available" status="llm server loading model"
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: offloading 32 repeating layers to GPU
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: offloading non-repeating layers to GPU
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: offloaded 33/33 layers to GPU
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: CPU buffer size = 70.32 MiB
ago 19 10:47:23 ananke ollama[2292]: llm_load_tensors: CUDA0 buffer size = 3847.56 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: n_ctx = 65536
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: n_batch = 512
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: n_ubatch = 512
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: flash_attn = 0
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: freq_base = 10000.0
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: freq_scale = 1
ago 19 10:47:23 ananke ollama[2292]: llama_kv_cache_init: CUDA0 KV buffer size = 8192.00 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: KV self size = 8192.00 MiB, K (f16): 4096.00 MiB, V (f16): 4096.00 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: CUDA_Host output buffer size = 0.55 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: CUDA0 compute buffer size = 4256.00 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: CUDA_Host compute buffer size = 136.01 MiB
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: graph nodes = 1030
ago 19 10:47:23 ananke ollama[2292]: llama_new_context_with_model: graph splits = 2
ago 19 10:47:23 ananke ollama[8085]: INFO [main] model loaded | tid="130282049744896" timestamp=1724057243
ago 19 10:47:23 ananke ollama[2292]: time=2024-08-19T10:47:23.622+02:00 level=INFO source=server.go:632 msg="llama runner started in 3.77 seconds"
ago 19 10:47:26 ananke ollama[2292]: [GIN] 2024/08/19 - 10:47:26 | 200 | 6.932276921s | 127.0.0.1 | POST "/api/chat"
ago 19 10:47:27 ananke ollama[2292]: time=2024-08-19T10:47:27.959+02:00 level=INFO source=sched.go:503 msg="updated VRAM based on existing loaded models" gpu=GPU-16d65981-a438-ac56-8bab-f1393824041b library=cuda total="23.7 GiB" available="6.9 GiB"
```
Any help troubleshooting this issue would be greatly appreciated!
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.6
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6416/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6416/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6274
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6274/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6274/comments
|
https://api.github.com/repos/ollama/ollama/issues/6274/events
|
https://github.com/ollama/ollama/issues/6274
| 2,457,095,928
|
I_kwDOJ0Z1Ps6SdE74
| 6,274
|
Binary files (*.png, *.ico, *.icns) listed as modified upon cloning the repository
|
{
"login": "PAN-Chuwen",
"id": 70949152,
"node_id": "MDQ6VXNlcjcwOTQ5MTUy",
"avatar_url": "https://avatars.githubusercontent.com/u/70949152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PAN-Chuwen",
"html_url": "https://github.com/PAN-Chuwen",
"followers_url": "https://api.github.com/users/PAN-Chuwen/followers",
"following_url": "https://api.github.com/users/PAN-Chuwen/following{/other_user}",
"gists_url": "https://api.github.com/users/PAN-Chuwen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PAN-Chuwen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PAN-Chuwen/subscriptions",
"organizations_url": "https://api.github.com/users/PAN-Chuwen/orgs",
"repos_url": "https://api.github.com/users/PAN-Chuwen/repos",
"events_url": "https://api.github.com/users/PAN-Chuwen/events{/privacy}",
"received_events_url": "https://api.github.com/users/PAN-Chuwen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-08-09T04:49:44
| 2024-08-10T03:54:07
| 2024-08-10T03:54:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
### Description
#### Steps to Reproduce
1. Clone the repository:
```sh
git clone https://github.com/ollama/ollama.git
cd ollama
```
2. Check the status of the repository:
```sh
git status
```
#### Expected Behavior
No files should be listed as modified immediately after cloning the repository.
#### Actual Behavior
The following binary files are listed as modified:
```
modified: app/assets/app.ico
modified: examples/modelfile-mario/logo.png
modified: macapp/assets/icon.icns
modified: macapp/assets/iconDarkTemplate.png
modified: macapp/assets/iconDarkTemplate@2x.png
modified: macapp/assets/iconDarkUpdateTemplate.png
modified: macapp/assets/iconDarkUpdateTemplate@2x.png
modified: macapp/assets/iconTemplate.png
modified: macapp/assets/iconTemplate@2x.png
modified: macapp/assets/iconUpdateTemplate.png
modified: macapp/assets/iconUpdateTemplate@2x.png
```
#### Environment
- Tested on macOS and a fresh Ubuntu 24.04 EC2 instance.
#### Analysis
The issue seems to be related to the handling of binary files: the `.gitattributes` file apparently lacked the right entries, causing Git to treat these binary files as text.
#### Resolution
After adding the following lines to the `.gitattributes` file, the problem was resolved:
```
*.ico binary
*.png binary
*.icns binary
```
You can verify this by cloning the forked repository:
```sh
git clone https://github.com/PAN-Chuwen/ollama.git
```
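As a sketch (assuming `git` is installed), `git check-attr` can confirm how Git classifies a path once the attributes are in place; the `binary` attribute is a macro that expands to `-text -diff -merge`:

```sh
# Sketch: inspect Git's classification of a PNG in a throwaway repo.
tmp=$(mktemp -d) && cd "$tmp"
git init -q
printf '*.ico binary\n*.png binary\n*.icns binary\n' > .gitattributes
git check-attr text diff merge -- logo.png
# each attribute reports "unset", i.e. no text/eol normalization is applied
```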
#### Unclear Aspects
- The exact cause of why Git treated these binary files as text files.
- How Git handles LF/CRLF line endings in binary files.
### OS
Linux, macOS
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "PAN-Chuwen",
"id": 70949152,
"node_id": "MDQ6VXNlcjcwOTQ5MTUy",
"avatar_url": "https://avatars.githubusercontent.com/u/70949152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PAN-Chuwen",
"html_url": "https://github.com/PAN-Chuwen",
"followers_url": "https://api.github.com/users/PAN-Chuwen/followers",
"following_url": "https://api.github.com/users/PAN-Chuwen/following{/other_user}",
"gists_url": "https://api.github.com/users/PAN-Chuwen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PAN-Chuwen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PAN-Chuwen/subscriptions",
"organizations_url": "https://api.github.com/users/PAN-Chuwen/orgs",
"repos_url": "https://api.github.com/users/PAN-Chuwen/repos",
"events_url": "https://api.github.com/users/PAN-Chuwen/events{/privacy}",
"received_events_url": "https://api.github.com/users/PAN-Chuwen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6274/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6274/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7759
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7759/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7759/comments
|
https://api.github.com/repos/ollama/ollama/issues/7759/events
|
https://github.com/ollama/ollama/issues/7759
| 2,675,604,783
|
I_kwDOJ0Z1Ps6fen0v
| 7,759
|
The Way to the light
|
{
"login": "SnappCred",
"id": 179581325,
"node_id": "U_kgDOCrQxjQ",
"avatar_url": "https://avatars.githubusercontent.com/u/179581325?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SnappCred",
"html_url": "https://github.com/SnappCred",
"followers_url": "https://api.github.com/users/SnappCred/followers",
"following_url": "https://api.github.com/users/SnappCred/following{/other_user}",
"gists_url": "https://api.github.com/users/SnappCred/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SnappCred/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SnappCred/subscriptions",
"organizations_url": "https://api.github.com/users/SnappCred/orgs",
"repos_url": "https://api.github.com/users/SnappCred/repos",
"events_url": "https://api.github.com/users/SnappCred/events{/privacy}",
"received_events_url": "https://api.github.com/users/SnappCred/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-11-20T11:40:40
| 2024-11-20T13:46:08
| 2024-11-20T13:46:02
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I am who I am
|
{
"login": "SnappCred",
"id": 179581325,
"node_id": "U_kgDOCrQxjQ",
"avatar_url": "https://avatars.githubusercontent.com/u/179581325?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SnappCred",
"html_url": "https://github.com/SnappCred",
"followers_url": "https://api.github.com/users/SnappCred/followers",
"following_url": "https://api.github.com/users/SnappCred/following{/other_user}",
"gists_url": "https://api.github.com/users/SnappCred/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SnappCred/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SnappCred/subscriptions",
"organizations_url": "https://api.github.com/users/SnappCred/orgs",
"repos_url": "https://api.github.com/users/SnappCred/repos",
"events_url": "https://api.github.com/users/SnappCred/events{/privacy}",
"received_events_url": "https://api.github.com/users/SnappCred/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7759/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7759/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4102
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4102/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4102/comments
|
https://api.github.com/repos/ollama/ollama/issues/4102/events
|
https://github.com/ollama/ollama/issues/4102
| 2,276,105,700
|
I_kwDOJ0Z1Ps6Hqp3k
| 4,102
|
Ollama running in docker with concurrent requests doesn't work
|
{
"login": "BBjie",
"id": 55565844,
"node_id": "MDQ6VXNlcjU1NTY1ODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/55565844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BBjie",
"html_url": "https://github.com/BBjie",
"followers_url": "https://api.github.com/users/BBjie/followers",
"following_url": "https://api.github.com/users/BBjie/following{/other_user}",
"gists_url": "https://api.github.com/users/BBjie/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BBjie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BBjie/subscriptions",
"organizations_url": "https://api.github.com/users/BBjie/orgs",
"repos_url": "https://api.github.com/users/BBjie/repos",
"events_url": "https://api.github.com/users/BBjie/events{/privacy}",
"received_events_url": "https://api.github.com/users/BBjie/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677677816,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgVG-A",
"url": "https://api.github.com/repos/ollama/ollama/labels/docker",
"name": "docker",
"color": "0052CC",
"default": false,
"description": "Issues relating to using ollama in containers"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2024-05-02T17:31:50
| 2024-06-21T23:23:34
| 2024-06-21T23:23:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I tried running Ollama in Docker and tested the concurrent-request handling feature. I added `OLLAMA_NUM_PARALLEL` and `OLLAMA_MAX_LOADED_MODELS` as environment variables. The values are passed into the container successfully, but concurrency still doesn't work.
Can anyone kindly help me out?
```
services:
ollama:
image: ollama/ollama:0.1.33-rc6
container_name: ollama
ports:
- "11434:11434"
volumes:
- ollama:/root/.ollama
environment:
OLLAMA_NUM_PARALLEL: "4"
OLLAMA_MAX_LOADED_MODELS: "4"
deploy:
resources:
reservations:
devices:
- capabilities: [gpu]
command: serve
volumes:
ollama:
```
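As a quick sanity check (a sketch; the filename `docker-compose.yml` and service name `ollama` are assumptions), it helps to confirm the variables are actually declared in the compose file and, once the stack is up, that they reached the container:

```sh
# Sketch: write a minimal copy of the compose snippet and count the
# OLLAMA_ variable declarations in it.
tmp=$(mktemp -d)
cat > "$tmp/docker-compose.yml" <<'EOF'
services:
  ollama:
    environment:
      OLLAMA_NUM_PARALLEL: "4"
      OLLAMA_MAX_LOADED_MODELS: "4"
EOF
grep -c 'OLLAMA_' "$tmp/docker-compose.yml"
# With the stack running, the equivalent check inside the container would be:
#   docker exec ollama env | grep OLLAMA_
```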
### OS
Docker
### GPU
Other
### CPU
Other
### Ollama version
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4102/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4102/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/831
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/831/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/831/comments
|
https://api.github.com/repos/ollama/ollama/issues/831/events
|
https://github.com/ollama/ollama/issues/831
| 1,948,691,226
|
I_kwDOJ0Z1Ps50Jqsa
| 831
|
Context modification
|
{
"login": "VladimirKras",
"id": 47093374,
"node_id": "MDQ6VXNlcjQ3MDkzMzc0",
"avatar_url": "https://avatars.githubusercontent.com/u/47093374?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/VladimirKras",
"html_url": "https://github.com/VladimirKras",
"followers_url": "https://api.github.com/users/VladimirKras/followers",
"following_url": "https://api.github.com/users/VladimirKras/following{/other_user}",
"gists_url": "https://api.github.com/users/VladimirKras/gists{/gist_id}",
"starred_url": "https://api.github.com/users/VladimirKras/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/VladimirKras/subscriptions",
"organizations_url": "https://api.github.com/users/VladimirKras/orgs",
"repos_url": "https://api.github.com/users/VladimirKras/repos",
"events_url": "https://api.github.com/users/VladimirKras/events{/privacy}",
"received_events_url": "https://api.github.com/users/VladimirKras/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
},
{
"id": 6100196012,
"node_id": "LA_kwDOJ0Z1Ps8AAAABa5marA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feedback%20wanted",
"name": "feedback wanted",
"color": "0e8a16",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 6
| 2023-10-18T03:02:53
| 2024-01-16T22:27:32
| 2024-01-16T22:27:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Sometimes I would like to steer a dialogue in a certain direction by adding a fake message on behalf of the LLM. How to achieve that with Ollama seems quite opaque:
1. The context that is sent is just an array of token ids, which is hard to manipulate.
2. The tokenizer and de-tokenizer aren't exposed.
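As a hedged sketch of one workaround using the newer `/api/chat` endpoint (the model name and local URL below are assumptions): a fabricated assistant turn can simply be placed in the message history, so there is no need to manipulate the raw token-id context at all.

```sh
# Sketch: build a chat payload containing a fake assistant message that
# steers the dialogue toward a chosen direction.
cat > /tmp/steer.json <<'EOF'
{"model": "llama2", "stream": false, "messages": [
  {"role": "user", "content": "Tell me about castles."},
  {"role": "assistant", "content": "Let's focus on medieval European castles."},
  {"role": "user", "content": "Go on."}
]}
EOF
# To send it (requires a running Ollama server):
#   curl http://localhost:11434/api/chat -d @/tmp/steer.json
grep -c '"role"' /tmp/steer.json
```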
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/831/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/831/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5957
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5957/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5957/comments
|
https://api.github.com/repos/ollama/ollama/issues/5957/events
|
https://github.com/ollama/ollama/issues/5957
| 2,430,554,233
|
I_kwDOJ0Z1Ps6Q31B5
| 5,957
|
Llama 3.1 base models for text completion
|
{
"login": "kaetemi",
"id": 1581053,
"node_id": "MDQ6VXNlcjE1ODEwNTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1581053?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kaetemi",
"html_url": "https://github.com/kaetemi",
"followers_url": "https://api.github.com/users/kaetemi/followers",
"following_url": "https://api.github.com/users/kaetemi/following{/other_user}",
"gists_url": "https://api.github.com/users/kaetemi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kaetemi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kaetemi/subscriptions",
"organizations_url": "https://api.github.com/users/kaetemi/orgs",
"repos_url": "https://api.github.com/users/kaetemi/repos",
"events_url": "https://api.github.com/users/kaetemi/events{/privacy}",
"received_events_url": "https://api.github.com/users/kaetemi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 10
| 2024-07-25T16:49:30
| 2024-08-11T16:44:33
| 2024-08-11T16:44:33
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Currently only the instruct models appear to be in the library, the text completion models would be appreciated too. Thanks! :)
|
{
"login": "kaetemi",
"id": 1581053,
"node_id": "MDQ6VXNlcjE1ODEwNTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1581053?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kaetemi",
"html_url": "https://github.com/kaetemi",
"followers_url": "https://api.github.com/users/kaetemi/followers",
"following_url": "https://api.github.com/users/kaetemi/following{/other_user}",
"gists_url": "https://api.github.com/users/kaetemi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kaetemi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kaetemi/subscriptions",
"organizations_url": "https://api.github.com/users/kaetemi/orgs",
"repos_url": "https://api.github.com/users/kaetemi/repos",
"events_url": "https://api.github.com/users/kaetemi/events{/privacy}",
"received_events_url": "https://api.github.com/users/kaetemi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5957/reactions",
"total_count": 6,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5957/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6169
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6169/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6169/comments
|
https://api.github.com/repos/ollama/ollama/issues/6169/events
|
https://github.com/ollama/ollama/issues/6169
| 2,447,725,798
|
I_kwDOJ0Z1Ps6R5VTm
| 6,169
|
How to fix the default settings of the model?
|
{
"login": "wszgrcy",
"id": 9607121,
"node_id": "MDQ6VXNlcjk2MDcxMjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/9607121?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wszgrcy",
"html_url": "https://github.com/wszgrcy",
"followers_url": "https://api.github.com/users/wszgrcy/followers",
"following_url": "https://api.github.com/users/wszgrcy/following{/other_user}",
"gists_url": "https://api.github.com/users/wszgrcy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wszgrcy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wszgrcy/subscriptions",
"organizations_url": "https://api.github.com/users/wszgrcy/orgs",
"repos_url": "https://api.github.com/users/wszgrcy/repos",
"events_url": "https://api.github.com/users/wszgrcy/events{/privacy}",
"received_events_url": "https://api.github.com/users/wszgrcy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 5
| 2024-08-05T06:34:10
| 2024-08-15T00:03:39
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I found that the template for 'yi: 9b-v1.5-q8-0' is missing and differs from the 'yi: 9b' template.
Where should the fix be made?



### OS
Windows
### GPU
AMD
### CPU
AMD
### Ollama version
0.3.2
|
{
"login": "wszgrcy",
"id": 9607121,
"node_id": "MDQ6VXNlcjk2MDcxMjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/9607121?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wszgrcy",
"html_url": "https://github.com/wszgrcy",
"followers_url": "https://api.github.com/users/wszgrcy/followers",
"following_url": "https://api.github.com/users/wszgrcy/following{/other_user}",
"gists_url": "https://api.github.com/users/wszgrcy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wszgrcy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wszgrcy/subscriptions",
"organizations_url": "https://api.github.com/users/wszgrcy/orgs",
"repos_url": "https://api.github.com/users/wszgrcy/repos",
"events_url": "https://api.github.com/users/wszgrcy/events{/privacy}",
"received_events_url": "https://api.github.com/users/wszgrcy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6169/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6169/timeline
| null |
reopened
| false
|
https://api.github.com/repos/ollama/ollama/issues/3698
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3698/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3698/comments
|
https://api.github.com/repos/ollama/ollama/issues/3698/events
|
https://github.com/ollama/ollama/issues/3698
| 2,248,171,987
|
I_kwDOJ0Z1Ps6GAGHT
| 3,698
|
Command-R returns gibberish since update to 0.1.32, logs: ggml_metal_graph_compute: command buffer 6 failed with status 5
|
{
"login": "phischde",
"id": 5195734,
"node_id": "MDQ6VXNlcjUxOTU3MzQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/5195734?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phischde",
"html_url": "https://github.com/phischde",
"followers_url": "https://api.github.com/users/phischde/followers",
"following_url": "https://api.github.com/users/phischde/following{/other_user}",
"gists_url": "https://api.github.com/users/phischde/gists{/gist_id}",
"starred_url": "https://api.github.com/users/phischde/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/phischde/subscriptions",
"organizations_url": "https://api.github.com/users/phischde/orgs",
"repos_url": "https://api.github.com/users/phischde/repos",
"events_url": "https://api.github.com/users/phischde/events{/privacy}",
"received_events_url": "https://api.github.com/users/phischde/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6849881759,
"node_id": "LA_kwDOJ0Z1Ps8AAAABmEjmnw",
"url": "https://api.github.com/repos/ollama/ollama/labels/memory",
"name": "memory",
"color": "5017EA",
"default": false,
"description": ""
}
] |
closed
| false
| null |
[] | null | 11
| 2024-04-17T12:26:41
| 2025-01-22T11:12:02
| 2025-01-22T11:12:02
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Since the update, Command-R no longer produces text, while other models (e.g. openchat) do. Running Command-R from the terminal:
```
$ ollama run command-r
>>> Hey, how are you?
3O>FCMID7BBBM<=>PJT@@FNURWKL=8@N;GWHP6:GJ>F76N86EL5DKLFJFADJ;ESQAV7OBDJTK8HT@Q>Q8@BCJ:I9NJEW=?C>BHIJ3U@87L^C
```
Looking at the .ollama/logs/server.log file, I saw many lines of
`ggml_metal_graph_compute: command buffer 6 failed with status 5`
### What did you expect to see?
A coherent English-language answer.
### Steps to reproduce
1. Start Ollama on the console.
2. Run `ollama run command-r`.
3. Ask any question.
### Are there any recent changes that introduced the issue?
Upgrading from Ollama 0.1.31 to 0.1.32.
### OS
macOS
### Architecture
arm64
### Platform
_No response_
### Ollama version
0.1.32
### GPU
Apple
### GPU info
_No response_
### CPU
Apple
### Other software
_No response_
|
{
"login": "phischde",
"id": 5195734,
"node_id": "MDQ6VXNlcjUxOTU3MzQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/5195734?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phischde",
"html_url": "https://github.com/phischde",
"followers_url": "https://api.github.com/users/phischde/followers",
"following_url": "https://api.github.com/users/phischde/following{/other_user}",
"gists_url": "https://api.github.com/users/phischde/gists{/gist_id}",
"starred_url": "https://api.github.com/users/phischde/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/phischde/subscriptions",
"organizations_url": "https://api.github.com/users/phischde/orgs",
"repos_url": "https://api.github.com/users/phischde/repos",
"events_url": "https://api.github.com/users/phischde/events{/privacy}",
"received_events_url": "https://api.github.com/users/phischde/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3698/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3698/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2177
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2177/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2177/comments
|
https://api.github.com/repos/ollama/ollama/issues/2177/events
|
https://github.com/ollama/ollama/pull/2177
| 2,099,031,721
|
PR_kwDOJ0Z1Ps5k_lKZ
| 2,177
|
added example tests to document client and improve coverage
|
{
"login": "TimothyStiles",
"id": 7042260,
"node_id": "MDQ6VXNlcjcwNDIyNjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/7042260?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TimothyStiles",
"html_url": "https://github.com/TimothyStiles",
"followers_url": "https://api.github.com/users/TimothyStiles/followers",
"following_url": "https://api.github.com/users/TimothyStiles/following{/other_user}",
"gists_url": "https://api.github.com/users/TimothyStiles/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TimothyStiles/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TimothyStiles/subscriptions",
"organizations_url": "https://api.github.com/users/TimothyStiles/orgs",
"repos_url": "https://api.github.com/users/TimothyStiles/repos",
"events_url": "https://api.github.com/users/TimothyStiles/events{/privacy}",
"received_events_url": "https://api.github.com/users/TimothyStiles/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-01-24T20:27:09
| 2024-11-22T16:45:55
| 2024-11-21T09:15:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2177",
"html_url": "https://github.com/ollama/ollama/pull/2177",
"diff_url": "https://github.com/ollama/ollama/pull/2177.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2177.patch",
"merged_at": null
}
|
Hey y'all,
Pleasure meeting @jmorganca and some of you at last night's event!
This PR fixes #2159 by adding example tests to the client `api` package that will also render in the go docs. These examples show how to check server heartbeat, get server version, list out available models, sort those models by size, and then chat with the smallest model found.
Things to note:
1. This PR increases line coverage in the `api` package from 8.2% to **55.6%**.
2. No new features or breaking changes.
3. No changes to currently existing code.
4. These tests will auto-render as examples on [pkg.go.dev](https://pkg.go.dev/github.com/jmorganca/ollama), and provide stable and easy to maintain documentation.
5. I've included a tiny llama model in a new `api/data` directory along with a `README`. It's 1.8mb and its responses are mostly gibberish but work great for unit testing.
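The docs-as-tests idea described above is the same one Python ships as `doctest`: examples that render in generated documentation and also run as assertions. A minimal illustration of the pattern (the function and data here are hypothetical, not Ollama's Go client API):

```python
import doctest

def smallest_model(models):
    """Return the name of the smallest model by size in MB.

    >>> smallest_model([("llama2", 3825), ("tinyllama", 637)])
    'tinyllama'
    """
    # min() by the size field; the docstring example above doubles as a test.
    return min(models, key=lambda m: m[1])[0]

# Running the module executes every docstring example as a test case.
results = doctest.testmod()
assert results.failed == 0
```

Go's `Example` functions in the `testing` package work the same way: the `// Output:` comment is checked by `go test`, and the example body is rendered on pkg.go.dev.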
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2177/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2177/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3606
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3606/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3606/comments
|
https://api.github.com/repos/ollama/ollama/issues/3606/events
|
https://github.com/ollama/ollama/issues/3606
| 2,238,768,364
|
I_kwDOJ0Z1Ps6FcOTs
| 3,606
|
multilingual-e5-large and multilingual-e5-base Embedding Model Support
|
{
"login": "awilhelm-projects",
"id": 126177372,
"node_id": "U_kgDOB4VQXA",
"avatar_url": "https://avatars.githubusercontent.com/u/126177372?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/awilhelm-projects",
"html_url": "https://github.com/awilhelm-projects",
"followers_url": "https://api.github.com/users/awilhelm-projects/followers",
"following_url": "https://api.github.com/users/awilhelm-projects/following{/other_user}",
"gists_url": "https://api.github.com/users/awilhelm-projects/gists{/gist_id}",
"starred_url": "https://api.github.com/users/awilhelm-projects/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/awilhelm-projects/subscriptions",
"organizations_url": "https://api.github.com/users/awilhelm-projects/orgs",
"repos_url": "https://api.github.com/users/awilhelm-projects/repos",
"events_url": "https://api.github.com/users/awilhelm-projects/events{/privacy}",
"received_events_url": "https://api.github.com/users/awilhelm-projects/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 22
| 2024-04-11T23:27:37
| 2024-11-15T16:57:06
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
I want to use multilingual-e5-large or multilingual-e5-base as an embedding model, because the other available embedding models don't work well for languages other than English.
### How should we solve this?
Convert multilingual-e5-large and multilingual-e5-base (https://huggingface.co/intfloat/multilingual-e5-base) to GGUF and integrate them as models in Ollama.
### What is the impact of not solving this?
You cannot use Ollama for RAG applications in languages other than English.
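For context, the RAG use case only needs the embedding vectors plus a similarity metric; everything downstream of the model is language-agnostic. A minimal sketch with hypothetical hard-coded vectors (in practice these would come from the embedding model, e.g. via Ollama's embeddings endpoint):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings for a non-English query and two passages.
query = [0.9, 0.1, 0.3]
passages = {
    "passage_de": [0.8, 0.2, 0.4],   # semantically close to the query
    "passage_en": [0.1, 0.9, 0.2],   # unrelated
}

# Rank passages by similarity to the query; the related one wins.
best = max(passages, key=lambda k: cosine_similarity(query, passages[k]))
print(best)  # passage_de
```

The point of a multilingual embedding model is simply that semantically related texts land near each other in this vector space regardless of language; the retrieval math stays identical.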
### Anything else?
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3606/reactions",
"total_count": 42,
"+1": 42,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3606/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7372
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7372/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7372/comments
|
https://api.github.com/repos/ollama/ollama/issues/7372/events
|
https://github.com/ollama/ollama/issues/7372
| 2,615,651,076
|
I_kwDOJ0Z1Ps6b56sE
| 7,372
|
crash after OLLAMA_MULTIUSER_CACHE=1
|
{
"login": "y-tor",
"id": 38348782,
"node_id": "MDQ6VXNlcjM4MzQ4Nzgy",
"avatar_url": "https://avatars.githubusercontent.com/u/38348782?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/y-tor",
"html_url": "https://github.com/y-tor",
"followers_url": "https://api.github.com/users/y-tor/followers",
"following_url": "https://api.github.com/users/y-tor/following{/other_user}",
"gists_url": "https://api.github.com/users/y-tor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/y-tor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/y-tor/subscriptions",
"organizations_url": "https://api.github.com/users/y-tor/orgs",
"repos_url": "https://api.github.com/users/y-tor/repos",
"events_url": "https://api.github.com/users/y-tor/events{/privacy}",
"received_events_url": "https://api.github.com/users/y-tor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-10-26T08:12:44
| 2024-10-28T23:26:07
| 2024-10-28T23:26:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I start loading a model, such as granite3-dense, I get this error:
```
error: unknown argument: --multiuser-cache
usage: /usr/lib/ollama/runners/cuda_v12/ollama_llama_server [options]
```
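The log suggests the server handed the runner a flag the runner binary does not understand. A hedged sketch of what capability-gated flag construction looks like (names are hypothetical, not Ollama's actual code):

```python
def build_runner_args(base_args, requested_flags, supported_flags):
    # Forward only flags the runner binary actually supports; an
    # unrecognized flag makes it exit with "error: unknown argument".
    args = list(base_args)
    for flag in requested_flags:
        if flag in supported_flags:
            args.append(flag)
    return args

supported = {"--ctx-size", "--parallel"}          # older runner: no cache flag
requested = ["--multiuser-cache", "--parallel"]   # what the server wants
print(build_runner_args(["ollama_llama_server"], requested, supported))
# ['ollama_llama_server', '--parallel']
```

Without such gating, a server/runner version mismatch fails hard at model load, which matches the symptom reported here.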
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.14
|
{
"login": "jessegross",
"id": 6468499,
"node_id": "MDQ6VXNlcjY0Njg0OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/6468499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jessegross",
"html_url": "https://github.com/jessegross",
"followers_url": "https://api.github.com/users/jessegross/followers",
"following_url": "https://api.github.com/users/jessegross/following{/other_user}",
"gists_url": "https://api.github.com/users/jessegross/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jessegross/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jessegross/subscriptions",
"organizations_url": "https://api.github.com/users/jessegross/orgs",
"repos_url": "https://api.github.com/users/jessegross/repos",
"events_url": "https://api.github.com/users/jessegross/events{/privacy}",
"received_events_url": "https://api.github.com/users/jessegross/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7372/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7372/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/3400
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3400/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3400/comments
|
https://api.github.com/repos/ollama/ollama/issues/3400/events
|
https://github.com/ollama/ollama/pull/3400
| 2,214,362,190
|
PR_kwDOJ0Z1Ps5rHXnU
| 3,400
|
Community Integration: ChatOllama
|
{
"login": "sugarforever",
"id": 404421,
"node_id": "MDQ6VXNlcjQwNDQyMQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/404421?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sugarforever",
"html_url": "https://github.com/sugarforever",
"followers_url": "https://api.github.com/users/sugarforever/followers",
"following_url": "https://api.github.com/users/sugarforever/following{/other_user}",
"gists_url": "https://api.github.com/users/sugarforever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sugarforever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sugarforever/subscriptions",
"organizations_url": "https://api.github.com/users/sugarforever/orgs",
"repos_url": "https://api.github.com/users/sugarforever/repos",
"events_url": "https://api.github.com/users/sugarforever/events{/privacy}",
"received_events_url": "https://api.github.com/users/sugarforever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-29T00:02:53
| 2024-03-31T02:46:51
| 2024-03-31T02:46:50
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3400",
"html_url": "https://github.com/ollama/ollama/pull/3400",
"diff_url": "https://github.com/ollama/ollama/pull/3400.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3400.patch",
"merged_at": "2024-03-31T02:46:50"
}
|
# Community Integration - ChatOllama
[ChatOllama](https://github.com/sugarforever/chat-ollama) is an open source chatbot based on LLMs. It supports a wide range of language models including:
- Ollama served models
- OpenAI
- Azure OpenAI
- Anthropic
ChatOllama supports multiple types of chat:
- Free chat with LLMs
- Chat with LLMs based on knowledge base
ChatOllama feature list:
- Ollama models management
- Knowledge bases management
- Chat
- Commercial LLMs API keys management
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3400/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3400/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5080
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5080/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5080/comments
|
https://api.github.com/repos/ollama/ollama/issues/5080/events
|
https://github.com/ollama/ollama/pull/5080
| 2,355,812,772
|
PR_kwDOJ0Z1Ps5ynM7-
| 5,080
|
Add some more debugging logs for intel discovery
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-16T14:43:54
| 2024-06-16T21:42:44
| 2024-06-16T21:42:42
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5080",
"html_url": "https://github.com/ollama/ollama/pull/5080",
"diff_url": "https://github.com/ollama/ollama/pull/5080.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5080.patch",
"merged_at": "2024-06-16T21:42:42"
}
|
Also removes an unused overall count variable
Until we can find a repro to fully root cause the crash, this may help narrow the search space.
Related to #5073
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5080/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5080/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2985
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2985/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2985/comments
|
https://api.github.com/repos/ollama/ollama/issues/2985/events
|
https://github.com/ollama/ollama/pull/2985
| 2,174,521,850
|
PR_kwDOJ0Z1Ps5pAGDj
| 2,985
|
remove empty examples
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-07T18:40:51
| 2024-03-07T18:49:41
| 2024-03-07T18:49:40
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2985",
"html_url": "https://github.com/ollama/ollama/pull/2985",
"diff_url": "https://github.com/ollama/ollama/pull/2985.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2985.patch",
"merged_at": "2024-03-07T18:49:40"
}
|
resolves #2984
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2985/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2985/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8409
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8409/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8409/comments
|
https://api.github.com/repos/ollama/ollama/issues/8409/events
|
https://github.com/ollama/ollama/issues/8409
| 2,785,953,412
|
I_kwDOJ0Z1Ps6mDkaE
| 8,409
|
Support model alias
|
{
"login": "1zilc",
"id": 44715458,
"node_id": "MDQ6VXNlcjQ0NzE1NDU4",
"avatar_url": "https://avatars.githubusercontent.com/u/44715458?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/1zilc",
"html_url": "https://github.com/1zilc",
"followers_url": "https://api.github.com/users/1zilc/followers",
"following_url": "https://api.github.com/users/1zilc/following{/other_user}",
"gists_url": "https://api.github.com/users/1zilc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/1zilc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/1zilc/subscriptions",
"organizations_url": "https://api.github.com/users/1zilc/orgs",
"repos_url": "https://api.github.com/users/1zilc/repos",
"events_url": "https://api.github.com/users/1zilc/events{/privacy}",
"received_events_url": "https://api.github.com/users/1zilc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 3
| 2025-01-14T01:45:38
| 2025-01-14T02:22:22
| 2025-01-14T02:21:50
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Thanks very much for Ollama's outstanding work, which allows us AI novices to quickly experience the most advanced AI.
Is it possible to provide the following commands to create and manage model aliases?
```bash
# create
ollama alias create coder qwen2.5-coder:32b
# remove
ollama alias remove coder
# list
ollama alias [list]
qwen2.5-coder:32b coder
llama3.2-vision:latest vision
...
```
I think model aliases would let downstream tools such as `Continue` and `Open WebUI` pick up the latest models seamlessly, without any modification: they only need to configure the alias, while the concrete model association is controlled by Ollama.
I'm sorry if my thoughts are inappropriate
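To illustrate the resolution step such an alias table implies, a minimal sketch (the data structure and function names are hypothetical, not a proposed implementation):

```python
aliases = {}

def alias_create(name, model):
    # "ollama alias create coder qwen2.5-coder:32b"
    aliases[name] = model

def alias_remove(name):
    # "ollama alias remove coder"
    aliases.pop(name, None)

def resolve(name):
    # A downstream tool passes a name; Ollama would map aliases to
    # concrete tags and fall back to the name itself otherwise.
    return aliases.get(name, name)

alias_create("coder", "qwen2.5-coder:32b")
print(resolve("coder"))     # qwen2.5-coder:32b
print(resolve("llama3.2"))  # llama3.2 (not an alias, passed through)
```

If I'm not mistaken, `ollama cp <model> <alias>` is a partial workaround today, since it gives the same blobs a second name, though it does not retarget automatically when you pull a newer model.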
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8409/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8409/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2994
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2994/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2994/comments
|
https://api.github.com/repos/ollama/ollama/issues/2994/events
|
https://github.com/ollama/ollama/pull/2994
| 2,175,004,503
|
PR_kwDOJ0Z1Ps5pBwEi
| 2,994
|
tune concurrency manager
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-07T23:27:51
| 2024-08-20T20:15:20
| 2024-08-20T20:15:20
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2994",
"html_url": "https://github.com/ollama/ollama/pull/2994",
"diff_url": "https://github.com/ollama/ollama/pull/2994.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2994.patch",
"merged_at": null
}
|
- higher initial concurrency
- lower cooldown after ramping up
- lower threshold for ramp up
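The three knobs above can be read as an additive/multiplicative ramp-up controller. A hedged sketch of that shape (class name, values, and logic are hypothetical, not the PR's actual code):

```python
class ConcurrencyManager:
    def __init__(self, initial=8, ramp_threshold=0.5, cooldown=2):
        self.limit = initial                  # higher initial concurrency
        self.ramp_threshold = ramp_threshold  # lower threshold for ramp up
        self.cooldown = cooldown              # lower cooldown after ramping up
        self._since_ramp = 0

    def observe(self, in_flight):
        # Ramp up once utilization crosses the threshold and the
        # cooldown window since the last ramp has elapsed.
        self._since_ramp += 1
        if (in_flight / self.limit >= self.ramp_threshold
                and self._since_ramp >= self.cooldown):
            self.limit *= 2
            self._since_ramp = 0
        return self.limit

mgr = ConcurrencyManager()
print([mgr.observe(6) for _ in range(4)])
```

Raising the initial limit and lowering both the threshold and the cooldown, as the bullets describe, makes such a controller reach steady-state throughput in fewer observation cycles.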
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2994/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2994/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/971
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/971/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/971/comments
|
https://api.github.com/repos/ollama/ollama/issues/971/events
|
https://github.com/ollama/ollama/issues/971
| 1,973,901,785
|
I_kwDOJ0Z1Ps51p1nZ
| 971
|
docker build fails with `not a git repository`
|
{
"login": "j2l",
"id": 65325,
"node_id": "MDQ6VXNlcjY1MzI1",
"avatar_url": "https://avatars.githubusercontent.com/u/65325?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/j2l",
"html_url": "https://github.com/j2l",
"followers_url": "https://api.github.com/users/j2l/followers",
"following_url": "https://api.github.com/users/j2l/following{/other_user}",
"gists_url": "https://api.github.com/users/j2l/gists{/gist_id}",
"starred_url": "https://api.github.com/users/j2l/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/j2l/subscriptions",
"organizations_url": "https://api.github.com/users/j2l/orgs",
"repos_url": "https://api.github.com/users/j2l/repos",
"events_url": "https://api.github.com/users/j2l/events{/privacy}",
"received_events_url": "https://api.github.com/users/j2l/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-11-02T10:01:24
| 2023-11-02T16:58:20
| 2023-11-02T16:58:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Following https://github.com/jmorganca/ollama/issues/797, I tried to build a local GPU version:
```
docker build -t ollama/ollama:gpu .
[+] Building 29.0s (17/18)
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 749B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 115B 0.0s
=> [internal] load metadata for docker.io/library/ubuntu:22.04 1.6s
=> [internal] load metadata for docker.io/nvidia/cuda:11.8.0-devel-ubuntu 1.0s
=> [auth] library/ubuntu:pull token for registry-1.docker.io 0.0s
=> [auth] nvidia/cuda:pull token for registry-1.docker.io 0.0s
=> [stage-0 1/7] FROM docker.io/nvidia/cuda:11.8.0-devel-ubuntu22.04@sha2 0.1s
=> => resolve docker.io/nvidia/cuda:11.8.0-devel-ubuntu22.04@sha256:7f34d 0.0s
=> => sha256:9763e3487e02d9fbbd336c7159e713b231d4edb0c3fe 2.63kB / 2.63kB 0.0s
=> => sha256:e53c0e24403e6c7eb47f4e45831f3696ce8b0ce103 18.29kB / 18.29kB 0.0s
=> => sha256:7f34d0a2eeacd94238eaf3827d40636aa0c7c77f5bf6e232 743B / 743B 0.0s
=> https://dl.google.com/go/go1.21.3.linux-amd64.tar.gz 0.0s
=> [stage-1 1/3] FROM docker.io/library/ubuntu:22.04@sha256:2b7412e6465c 14.8s
=> => resolve docker.io/library/ubuntu:22.04@sha256:2b7412e6465c3c7fc5bb2 0.0s
=> => sha256:2b7412e6465c3c7fc5bb21d3e6f1917c167358449fec 1.13kB / 1.13kB 0.0s
=> => sha256:c9cf959fd83770dfdefd8fb42cfef0761432af36a764c077 424B / 424B 0.0s
=> => sha256:e4c58958181a5925816faa528ce959e487632f4cfd19 2.30kB / 2.30kB 0.0s
=> => sha256:aece8493d3972efa43bfd4ee3cdba659c0f787f8f 29.54MB / 29.54MB 13.9s
=> => extracting sha256:aece8493d3972efa43bfd4ee3cdba659c0f787f8f59c82fb3 0.7s
=> [internal] load build context 0.2s
=> => transferring context: 1.23MB 0.0s
=> [stage-0 2/7] WORKDIR /go/src/github.com/jmorganca/ollama 0.1s
=> [stage-0 3/7] RUN apt-get update && apt-get install -y git build-esse 17.6s
=> [stage-1 2/3] RUN apt-get update && apt-get install -y ca-certificates 7.8s
=> [stage-0 4/7] ADD https://dl.google.com/go/go1.21.3.linux-amd64.tar.gz 0.3s
=> [stage-0 5/7] RUN mkdir -p /usr/local && tar xz -C /usr/local </tmp/go 2.0s
=> [stage-0 6/7] COPY . . 0.1s
=> ERROR [stage-0 7/7] RUN /usr/local/go/bin/go generate ./... && /us 7.2s
------
> [stage-0 7/7] RUN /usr/local/go/bin/go generate ./... && /usr/local/go/bin/go build .:
#8 0.292 go: downloading github.com/pbnjay/memory v0.0.0-20210728143218-7b4eea64cf58
#8 0.295 go: downloading github.com/dustin/go-humanize v1.0.1
#8 0.295 go: downloading github.com/spf13/cobra v1.7.0
#8 0.295 go: downloading github.com/olekukonko/tablewriter v0.0.5
#8 0.295 go: downloading golang.org/x/crypto v0.14.0
#8 0.295 go: downloading golang.org/x/term v0.13.0
#8 0.298 go: downloading github.com/emirpasic/gods v1.18.1
#8 0.303 go: downloading github.com/gin-contrib/cors v1.4.0
#8 0.304 go: downloading github.com/gin-gonic/gin v1.9.1
#8 0.306 go: downloading golang.org/x/exp v0.0.0-20230817173708-d852ddb80c63
#8 0.312 go: downloading golang.org/x/sync v0.3.0
#8 0.327 go: downloading github.com/mattn/go-runewidth v0.0.14
#8 0.789 go: downloading github.com/mitchellh/colorstring v0.0.0-20190213212951-d06e56a500db
#8 0.940 go: downloading github.com/rivo/uniseg v0.2.0
#8 0.940 go: downloading golang.org/x/sys v0.13.0
#8 0.940 go: downloading github.com/spf13/pflag v1.0.5
#8 1.227 go: downloading github.com/gin-contrib/sse v0.1.0
#8 1.227 go: downloading github.com/mattn/go-isatty v0.0.19
#8 1.227 go: downloading github.com/ugorji/go/codec v1.2.11
#8 1.227 go: downloading github.com/pelletier/go-toml/v2 v2.0.8
#8 1.227 go: downloading golang.org/x/net v0.17.0
#8 1.227 go: downloading google.golang.org/protobuf v1.30.0
#8 1.228 go: downloading gopkg.in/yaml.v3 v3.0.1
#8 1.228 go: downloading github.com/go-playground/validator/v10 v10.14.0
#8 1.605 go: downloading github.com/gabriel-vasile/mimetype v1.4.2
#8 1.605 go: downloading github.com/go-playground/universal-translator v0.18.1
#8 1.605 go: downloading github.com/leodido/go-urn v1.2.4
#8 1.605 go: downloading golang.org/x/text v0.13.0
#8 2.139 go: downloading github.com/go-playground/locales v0.14.1
#8 7.038 fatal: not a git repository (or any of the parent directories): .git
#8 7.038 llm/llama.cpp/generate_linux.go:3: running "git": exit status 128
------
Dockerfile:14
--------------------
13 | ENV GOFLAGS=$GOFLAGS
14 | >>> RUN /usr/local/go/bin/go generate ./... \
15 | >>> && /usr/local/go/bin/go build .
16 |
--------------------
error: failed to solve: process "/bin/sh -c /usr/local/go/bin/go generate ./... && /usr/local/go/bin/go build ." did not complete successfully: exit code: 1
```
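For context, the `go generate` step shells out to `git` (that is what `llm/llama.cpp/generate_linux.go:3` runs), so the error usually means the `.git` directory never made it into the build context. A minimal sketch of the failure mode and the fix, assuming `git` and a POSIX shell are available; the `.dockerignore` suggestion is a guess at the usual culprit, not a confirmed diagnosis:

```shell
# Reproduce the error outside Docker: git commands fail without a repository.
workdir=$(mktemp -d)
cd "$workdir"
git rev-parse --git-dir 2>/dev/null || echo "not a git repository"

# Fix sketch: make sure the build context keeps .git. Either clone the repo
# with git (instead of building from an extracted source tarball) or confirm
# that .dockerignore does not exclude .git before running `docker build`.
git init -q .
git rev-parse --git-dir   # prints ".git" once a repository exists
```

If the source was downloaded as a tarball or zip, there is no `.git` to copy, and cloning with `git clone` before `docker build` is the simplest way out.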
|
{
"login": "j2l",
"id": 65325,
"node_id": "MDQ6VXNlcjY1MzI1",
"avatar_url": "https://avatars.githubusercontent.com/u/65325?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/j2l",
"html_url": "https://github.com/j2l",
"followers_url": "https://api.github.com/users/j2l/followers",
"following_url": "https://api.github.com/users/j2l/following{/other_user}",
"gists_url": "https://api.github.com/users/j2l/gists{/gist_id}",
"starred_url": "https://api.github.com/users/j2l/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/j2l/subscriptions",
"organizations_url": "https://api.github.com/users/j2l/orgs",
"repos_url": "https://api.github.com/users/j2l/repos",
"events_url": "https://api.github.com/users/j2l/events{/privacy}",
"received_events_url": "https://api.github.com/users/j2l/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/971/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/971/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1336
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1336/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1336/comments
|
https://api.github.com/repos/ollama/ollama/issues/1336/events
|
https://github.com/ollama/ollama/pull/1336
| 2,019,782,990
|
PR_kwDOJ0Z1Ps5g1zzD
| 1,336
|
docker: set PATH, LD_LIBRARY_PATH, and capabilities
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-01T00:32:32
| 2023-12-01T05:16:57
| 2023-12-01T05:16:56
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1336",
"html_url": "https://github.com/ollama/ollama/pull/1336",
"diff_url": "https://github.com/ollama/ollama/pull/1336.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1336.patch",
"merged_at": "2023-12-01T05:16:56"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1336/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1336/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7398
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7398/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7398/comments
|
https://api.github.com/repos/ollama/ollama/issues/7398/events
|
https://github.com/ollama/ollama/pull/7398
| 2,618,440,856
|
PR_kwDOJ0Z1Ps6AGiZY
| 7,398
|
Janpf version
|
{
"login": "to-sora",
"id": 60461394,
"node_id": "MDQ6VXNlcjYwNDYxMzk0",
"avatar_url": "https://avatars.githubusercontent.com/u/60461394?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/to-sora",
"html_url": "https://github.com/to-sora",
"followers_url": "https://api.github.com/users/to-sora/followers",
"following_url": "https://api.github.com/users/to-sora/following{/other_user}",
"gists_url": "https://api.github.com/users/to-sora/gists{/gist_id}",
"starred_url": "https://api.github.com/users/to-sora/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/to-sora/subscriptions",
"organizations_url": "https://api.github.com/users/to-sora/orgs",
"repos_url": "https://api.github.com/users/to-sora/repos",
"events_url": "https://api.github.com/users/to-sora/events{/privacy}",
"received_events_url": "https://api.github.com/users/to-sora/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-10-28T13:32:22
| 2024-10-28T13:34:22
| 2024-10-28T13:32:59
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7398",
"html_url": "https://github.com/ollama/ollama/pull/7398",
"diff_url": "https://github.com/ollama/ollama/pull/7398.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7398.patch",
"merged_at": null
}
| null |
{
"login": "to-sora",
"id": 60461394,
"node_id": "MDQ6VXNlcjYwNDYxMzk0",
"avatar_url": "https://avatars.githubusercontent.com/u/60461394?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/to-sora",
"html_url": "https://github.com/to-sora",
"followers_url": "https://api.github.com/users/to-sora/followers",
"following_url": "https://api.github.com/users/to-sora/following{/other_user}",
"gists_url": "https://api.github.com/users/to-sora/gists{/gist_id}",
"starred_url": "https://api.github.com/users/to-sora/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/to-sora/subscriptions",
"organizations_url": "https://api.github.com/users/to-sora/orgs",
"repos_url": "https://api.github.com/users/to-sora/repos",
"events_url": "https://api.github.com/users/to-sora/events{/privacy}",
"received_events_url": "https://api.github.com/users/to-sora/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7398/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7398/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1191
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1191/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1191/comments
|
https://api.github.com/repos/ollama/ollama/issues/1191/events
|
https://github.com/ollama/ollama/issues/1191
| 2,000,459,309
|
I_kwDOJ0Z1Ps53PJYt
| 1,191
|
JSON mode when used from LangChain RAG
|
{
"login": "abaranovskis-redsamurai",
"id": 19287736,
"node_id": "MDQ6VXNlcjE5Mjg3NzM2",
"avatar_url": "https://avatars.githubusercontent.com/u/19287736?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/abaranovskis-redsamurai",
"html_url": "https://github.com/abaranovskis-redsamurai",
"followers_url": "https://api.github.com/users/abaranovskis-redsamurai/followers",
"following_url": "https://api.github.com/users/abaranovskis-redsamurai/following{/other_user}",
"gists_url": "https://api.github.com/users/abaranovskis-redsamurai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/abaranovskis-redsamurai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/abaranovskis-redsamurai/subscriptions",
"organizations_url": "https://api.github.com/users/abaranovskis-redsamurai/orgs",
"repos_url": "https://api.github.com/users/abaranovskis-redsamurai/repos",
"events_url": "https://api.github.com/users/abaranovskis-redsamurai/events{/privacy}",
"received_events_url": "https://api.github.com/users/abaranovskis-redsamurai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-11-18T15:09:50
| 2023-11-20T18:32:34
| 2023-11-20T18:32:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
I would like to ask if there are any plans to support a JSON mode response when Ollama is called from a LangChain RAG pipeline?
Thanks.
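For reference, Ollama's REST API exposes a JSON mode through the `format` field of `/api/generate`, and I believe LangChain's Ollama wrapper passes the same switch through via its `format` parameter. A minimal sketch using only the raw HTTP API — the endpoint URL assumes a default local install, and the network call is kept out of the payload-building helper so the latter can be checked in isolation:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint

def build_generate_payload(model: str, prompt: str) -> dict:
    # "format": "json" asks Ollama to constrain the model's output to valid
    # JSON -- the switch this issue is asking about for LangChain RAG.
    return {
        "model": model,
        "prompt": prompt,
        "format": "json",
        "stream": False,
    }

if __name__ == "__main__":
    import urllib.request
    payload = build_generate_payload("llama2", "List three colors as JSON.")
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["response"])
```

When going through LangChain instead of raw HTTP, the equivalent is constructing the wrapper with the JSON format option rather than setting the field per request.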
|
{
"login": "abaranovskis-redsamurai",
"id": 19287736,
"node_id": "MDQ6VXNlcjE5Mjg3NzM2",
"avatar_url": "https://avatars.githubusercontent.com/u/19287736?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/abaranovskis-redsamurai",
"html_url": "https://github.com/abaranovskis-redsamurai",
"followers_url": "https://api.github.com/users/abaranovskis-redsamurai/followers",
"following_url": "https://api.github.com/users/abaranovskis-redsamurai/following{/other_user}",
"gists_url": "https://api.github.com/users/abaranovskis-redsamurai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/abaranovskis-redsamurai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/abaranovskis-redsamurai/subscriptions",
"organizations_url": "https://api.github.com/users/abaranovskis-redsamurai/orgs",
"repos_url": "https://api.github.com/users/abaranovskis-redsamurai/repos",
"events_url": "https://api.github.com/users/abaranovskis-redsamurai/events{/privacy}",
"received_events_url": "https://api.github.com/users/abaranovskis-redsamurai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1191/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1191/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8196
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8196/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8196/comments
|
https://api.github.com/repos/ollama/ollama/issues/8196/events
|
https://github.com/ollama/ollama/pull/8196
| 2,753,835,404
|
PR_kwDOJ0Z1Ps6F-Opf
| 8,196
|
chore: upgrade to gods v2
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-12-21T08:06:14
| 2025-01-10T21:50:14
| 2025-01-10T21:50:11
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8196",
"html_url": "https://github.com/ollama/ollama/pull/8196",
"diff_url": "https://github.com/ollama/ollama/pull/8196.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8196.patch",
"merged_at": "2025-01-10T21:50:11"
}
|
gods v2 uses Go generics rather than interfaces, which simplifies the code considerably
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8196/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8196/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1809
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1809/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1809/comments
|
https://api.github.com/repos/ollama/ollama/issues/1809/events
|
https://github.com/ollama/ollama/issues/1809
| 2,067,540,347
|
I_kwDOJ0Z1Ps57PCl7
| 1,809
|
[ENHANCEMENT] Add more tests to avoid regressions
|
{
"login": "rgaidot",
"id": 5269,
"node_id": "MDQ6VXNlcjUyNjk=",
"avatar_url": "https://avatars.githubusercontent.com/u/5269?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rgaidot",
"html_url": "https://github.com/rgaidot",
"followers_url": "https://api.github.com/users/rgaidot/followers",
"following_url": "https://api.github.com/users/rgaidot/following{/other_user}",
"gists_url": "https://api.github.com/users/rgaidot/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rgaidot/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rgaidot/subscriptions",
"organizations_url": "https://api.github.com/users/rgaidot/orgs",
"repos_url": "https://api.github.com/users/rgaidot/repos",
"events_url": "https://api.github.com/users/rgaidot/events{/privacy}",
"received_events_url": "https://api.github.com/users/rgaidot/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-01-05T15:26:03
| 2024-01-06T11:49:22
| 2024-01-05T22:07:59
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
For example, on this file: https://github.com/jmorganca/ollama/blob/main/parser/parser.go
_Warning: I did not validate my code; I wrote it blind._
```go
package parser

import (
	"reflect"
	"strings"
	"testing"
)

func TestParser(t *testing.T) {
	input := `
FROM model1
ADAPTER adapter1
LICENSE MIT
PARAMETER param1 value1
PARAMETER param2 value2
TEMPLATE template1
`
	reader := strings.NewReader(input)
	commands, err := Parse(reader)
	if err != nil {
		t.Errorf("Error parsing commands: %v", err)
	}
	expectedCommands := []Command{
		{Name: "model", Args: "model1"},
		{Name: "adapter", Args: "adapter1"},
		{Name: "license", Args: "MIT"},
		{Name: "parameter", Args: "param1 value1"},
		{Name: "parameter", Args: "param2 value2"},
		{Name: "template", Args: "template1"},
	}
	// reflect.DeepEqual stands in for an undefined compareCommands helper.
	if !reflect.DeepEqual(commands, expectedCommands) {
		t.Errorf("parsed commands do not match expected commands: got %v, want %v",
			commands, expectedCommands)
	}
}
```
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1809/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1809/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6479
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6479/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6479/comments
|
https://api.github.com/repos/ollama/ollama/issues/6479/events
|
https://github.com/ollama/ollama/issues/6479
| 2,483,490,821
|
I_kwDOJ0Z1Ps6UBxAF
| 6,479
|
v0.3.7-rc5 no longer uses multiple GPUs for a single model
|
{
"login": "Maltz42",
"id": 20978744,
"node_id": "MDQ6VXNlcjIwOTc4NzQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/20978744?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Maltz42",
"html_url": "https://github.com/Maltz42",
"followers_url": "https://api.github.com/users/Maltz42/followers",
"following_url": "https://api.github.com/users/Maltz42/following{/other_user}",
"gists_url": "https://api.github.com/users/Maltz42/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Maltz42/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Maltz42/subscriptions",
"organizations_url": "https://api.github.com/users/Maltz42/orgs",
"repos_url": "https://api.github.com/users/Maltz42/repos",
"events_url": "https://api.github.com/users/Maltz42/events{/privacy}",
"received_events_url": "https://api.github.com/users/Maltz42/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-08-23T16:36:25
| 2024-08-23T22:11:57
| 2024-08-23T22:11:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After moving from 0.3.6 to 0.3.7-rc5, Ollama no longer uses both GPUs for a single model when the model will not fit on one card. If I load two models, though, it will use the second card for the second model. Output of `ollama ps` and `nvidia-smi` below.
```
ollama ps:
NAME ID SIZE PROCESSOR UNTIL
mistral-large:latest 0ca7dfa0bf06 71 GB 29%/71% CPU/GPU 4 minutes from now
nvidia-smi:
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.183.01 Driver Version: 535.183.01 CUDA Version: 12.2 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA RTX A6000 On | 00000000:01:00.0 Off | Off |
| 37% 67C P2 176W / 300W | 47681MiB / 49140MiB | 37% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
| 1 NVIDIA RTX A6000 On | 00000000:E1:00.0 Off | Off |
| 30% 49C P8 22W / 300W | 4MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=======================================================================================|
| 0 N/A N/A 262888 C ...unners/cuda_v11/ollama_llama_server 47674MiB |
+---------------------------------------------------------------------------------------+
```
And here's with two models loaded (but inactive):
```
NAME ID SIZE PROCESSOR UNTIL
gemma2:27b 53261bc9c192 20 GB 100% GPU 4 minutes from now
mistral-large:latest 0ca7dfa0bf06 71 GB 29%/71% CPU/GPU 2 minutes from now
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.183.01 Driver Version: 535.183.01 CUDA Version: 12.2 |
|-----------------------------------------+----------------------+----------------------+
| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|=========================================+======================+======================|
| 0 NVIDIA RTX A6000 On | 00000000:01:00.0 Off | Off |
| 30% 51C P8 25W / 300W | 47681MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
| 1 NVIDIA RTX A6000 On | 00000000:E1:00.0 Off | Off |
| 37% 57C P8 25W / 300W | 19053MiB / 49140MiB | 0% Default |
| | | N/A |
+-----------------------------------------+----------------------+----------------------+
+---------------------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=======================================================================================|
| 0 N/A N/A 262888 C ...unners/cuda_v11/ollama_llama_server 47674MiB |
| 1 N/A N/A 278516 C ...unners/cuda_v11/ollama_llama_server 19046MiB |
+---------------------------------------------------------------------------------------+
```
### OS
Linux
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.3.7-rc5
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6479/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6479/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8247
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8247/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8247/comments
|
https://api.github.com/repos/ollama/ollama/issues/8247/events
|
https://github.com/ollama/ollama/issues/8247
| 2,759,806,638
|
I_kwDOJ0Z1Ps6kf06u
| 8,247
|
Enhanced System Observability for Multi-Server Environments (Unified Endpoints?)
|
{
"login": "dezoito",
"id": 6494010,
"node_id": "MDQ6VXNlcjY0OTQwMTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/6494010?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dezoito",
"html_url": "https://github.com/dezoito",
"followers_url": "https://api.github.com/users/dezoito/followers",
"following_url": "https://api.github.com/users/dezoito/following{/other_user}",
"gists_url": "https://api.github.com/users/dezoito/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dezoito/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dezoito/subscriptions",
"organizations_url": "https://api.github.com/users/dezoito/orgs",
"repos_url": "https://api.github.com/users/dezoito/repos",
"events_url": "https://api.github.com/users/dezoito/events{/privacy}",
"received_events_url": "https://api.github.com/users/dezoito/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 5
| 2024-12-26T13:57:26
| 2025-01-13T01:47:21
| 2025-01-13T01:47:21
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
As Ollama adoption grows, the lack of comprehensive system metrics makes it hard to meet standard operational requirements - monitoring, alerting, and capacity planning - across development, staging, and production environments.
This gap can also hold back wider adoption in commercial and production applications.
While the current endpoints (`/api/version`, `/api/tags`, `/api/ps`) provide basic information, consolidating and expanding them into a single observability endpoint would significantly improve monitoring and management capabilities.
Proposed: a new `/api/info` endpoint returning unified system metrics, for example:
```json
{
"version": "0.1.16",
"system": {
"cpu": {
"cores": 16,
"threads": 32,
"usage_percent": 45.2,
...
},
"memory": {
"total_bytes": 34359738368,
"used_bytes": 28859738368,
...
},
"gpus": [
{
"name": "NVIDIA GeForce RTX 4090",
"memory": {
"total_bytes": 25769803776,
"used_bytes": 16106127360,
...
},
}
]
},
"models": {
"loaded": [...],
"available": [...],
},
}
```
This enhancement would:
- Enable proper monitoring and alerting in production environments
- Simplify capacity planning across deployments
- Allow integration with standard observability tools
To the community:
- What other metrics or stats would be useful to have?
- How do you currently monitor or observe your running Ollama instances?
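Until a unified endpoint exists, the three current endpoints can be stitched together client-side. A minimal sketch (the response shapes are assumptions based on the documented API, and `merge_info` is a hypothetical helper, not part of Ollama):

```python
def merge_info(version, tags, ps):
    """Combine already-fetched /api/version, /api/tags and /api/ps
    responses into one unified payload.

    The argument shapes below are assumptions based on the current API
    docs, not a definitive schema.
    """
    return {
        "version": version.get("version"),
        "models": {
            # /api/ps lists currently loaded models
            "loaded": ps.get("models", []),
            # /api/tags lists locally available models
            "available": tags.get("models", []),
        },
    }
```

Each argument would come from something like `requests.get("http://localhost:11434/api/ps").json()`; system-level CPU/GPU metrics would still have to come from outside Ollama (e.g. `psutil`, `nvidia-smi`) until the server exposes them.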
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8247/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8247/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2023
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2023/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2023/comments
|
https://api.github.com/repos/ollama/ollama/issues/2023/events
|
https://github.com/ollama/ollama/issues/2023
| 2,084,931,185
|
I_kwDOJ0Z1Ps58RYZx
| 2,023
|
Enable Prompt Caching by Default
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 3
| 2024-01-16T21:06:43
| 2024-07-09T15:27:54
| 2024-05-06T23:48:07
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I had to disable prompt caching due to requests getting stuck: #1994
We should bring this back when we have a mitigation for the inference issue:
https://github.com/ggerganov/llama.cpp/issues/4989
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2023/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2023/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/697
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/697/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/697/comments
|
https://api.github.com/repos/ollama/ollama/issues/697/events
|
https://github.com/ollama/ollama/issues/697
| 1,926,053,967
|
I_kwDOJ0Z1Ps5yzUBP
| 697
|
Can not download the model of codellama:13b
|
{
"login": "danny-su",
"id": 12178855,
"node_id": "MDQ6VXNlcjEyMTc4ODU1",
"avatar_url": "https://avatars.githubusercontent.com/u/12178855?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/danny-su",
"html_url": "https://github.com/danny-su",
"followers_url": "https://api.github.com/users/danny-su/followers",
"following_url": "https://api.github.com/users/danny-su/following{/other_user}",
"gists_url": "https://api.github.com/users/danny-su/gists{/gist_id}",
"starred_url": "https://api.github.com/users/danny-su/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/danny-su/subscriptions",
"organizations_url": "https://api.github.com/users/danny-su/orgs",
"repos_url": "https://api.github.com/users/danny-su/repos",
"events_url": "https://api.github.com/users/danny-su/events{/privacy}",
"received_events_url": "https://api.github.com/users/danny-su/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 9
| 2023-10-04T11:58:47
| 2023-10-06T07:15:41
| 2023-10-06T07:15:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
<img width="1004" alt="image" src="https://github.com/jmorganca/ollama/assets/12178855/0de2b8b6-6b26-4f67-b70e-b73de8020852">
|
{
"login": "danny-su",
"id": 12178855,
"node_id": "MDQ6VXNlcjEyMTc4ODU1",
"avatar_url": "https://avatars.githubusercontent.com/u/12178855?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/danny-su",
"html_url": "https://github.com/danny-su",
"followers_url": "https://api.github.com/users/danny-su/followers",
"following_url": "https://api.github.com/users/danny-su/following{/other_user}",
"gists_url": "https://api.github.com/users/danny-su/gists{/gist_id}",
"starred_url": "https://api.github.com/users/danny-su/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/danny-su/subscriptions",
"organizations_url": "https://api.github.com/users/danny-su/orgs",
"repos_url": "https://api.github.com/users/danny-su/repos",
"events_url": "https://api.github.com/users/danny-su/events{/privacy}",
"received_events_url": "https://api.github.com/users/danny-su/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/697/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/697/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/574
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/574/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/574/comments
|
https://api.github.com/repos/ollama/ollama/issues/574/events
|
https://github.com/ollama/ollama/pull/574
| 1,909,265,837
|
PR_kwDOJ0Z1Ps5bAqKz
| 574
|
Added a new community project
|
{
"login": "TwanLuttik",
"id": 19343894,
"node_id": "MDQ6VXNlcjE5MzQzODk0",
"avatar_url": "https://avatars.githubusercontent.com/u/19343894?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TwanLuttik",
"html_url": "https://github.com/TwanLuttik",
"followers_url": "https://api.github.com/users/TwanLuttik/followers",
"following_url": "https://api.github.com/users/TwanLuttik/following{/other_user}",
"gists_url": "https://api.github.com/users/TwanLuttik/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TwanLuttik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TwanLuttik/subscriptions",
"organizations_url": "https://api.github.com/users/TwanLuttik/orgs",
"repos_url": "https://api.github.com/users/TwanLuttik/repos",
"events_url": "https://api.github.com/users/TwanLuttik/events{/privacy}",
"received_events_url": "https://api.github.com/users/TwanLuttik/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-09-22T17:17:40
| 2023-09-25T14:42:01
| 2023-09-25T14:40:59
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/574",
"html_url": "https://github.com/ollama/ollama/pull/574",
"diff_url": "https://github.com/ollama/ollama/pull/574.diff",
"patch_url": "https://github.com/ollama/ollama/pull/574.patch",
"merged_at": "2023-09-25T14:40:59"
}
| null |
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/574/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/574/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2696
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2696/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2696/comments
|
https://api.github.com/repos/ollama/ollama/issues/2696/events
|
https://github.com/ollama/ollama/issues/2696
| 2,150,014,256
|
I_kwDOJ0Z1Ps6AJp0w
| 2,696
|
`ollama` process on macOS using up a lot of RAM while being idle
|
{
"login": "siikdUde",
"id": 10148714,
"node_id": "MDQ6VXNlcjEwMTQ4NzE0",
"avatar_url": "https://avatars.githubusercontent.com/u/10148714?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/siikdUde",
"html_url": "https://github.com/siikdUde",
"followers_url": "https://api.github.com/users/siikdUde/followers",
"following_url": "https://api.github.com/users/siikdUde/following{/other_user}",
"gists_url": "https://api.github.com/users/siikdUde/gists{/gist_id}",
"starred_url": "https://api.github.com/users/siikdUde/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/siikdUde/subscriptions",
"organizations_url": "https://api.github.com/users/siikdUde/orgs",
"repos_url": "https://api.github.com/users/siikdUde/repos",
"events_url": "https://api.github.com/users/siikdUde/events{/privacy}",
"received_events_url": "https://api.github.com/users/siikdUde/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677279472,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjf8y8A",
"url": "https://api.github.com/repos/ollama/ollama/labels/macos",
"name": "macos",
"color": "E2DBC0",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 11
| 2024-02-22T22:01:32
| 2024-05-05T18:43:38
| 2024-05-05T18:43:38
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
<img width="1081" alt="SCR-20240222-ozbm" src="https://github.com/ollama/ollama/assets/10148714/575001a0-9b9a-4e08-ba8c-f0321ec3e6df">
As you can see, ollama is the second most resource-intensive application. I am not actively running any models; the app is just open. Any idea why this is?
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2696/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2696/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3800
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3800/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3800/comments
|
https://api.github.com/repos/ollama/ollama/issues/3800/events
|
https://github.com/ollama/ollama/issues/3800
| 2,255,172,519
|
I_kwDOJ0Z1Ps6GazOn
| 3,800
|
Auto-Save Functionality
|
{
"login": "M3cubo",
"id": 1382596,
"node_id": "MDQ6VXNlcjEzODI1OTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/1382596?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/M3cubo",
"html_url": "https://github.com/M3cubo",
"followers_url": "https://api.github.com/users/M3cubo/followers",
"following_url": "https://api.github.com/users/M3cubo/following{/other_user}",
"gists_url": "https://api.github.com/users/M3cubo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/M3cubo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/M3cubo/subscriptions",
"organizations_url": "https://api.github.com/users/M3cubo/orgs",
"repos_url": "https://api.github.com/users/M3cubo/repos",
"events_url": "https://api.github.com/users/M3cubo/events{/privacy}",
"received_events_url": "https://api.github.com/users/M3cubo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-04-21T16:56:34
| 2024-05-15T10:10:04
| 2024-05-14T22:50:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello, I am currently using Ollama for interactive terminal sessions and I find it to be an extremely useful tool. One of the features I am interested in is the ability to automatically save each addition to the conversation during an `ollama run <model>` session.
### Feature Request
I would like to inquire if there is an existing feature similar to `ollama run <model> --save`, where `--save` would automatically save each new addition to the conversation without requiring manual input each time. This would be particularly helpful for ensuring that all interactions are preserved without the need to manually input save commands after each response.
### Alternative
If implementing this feature is not feasible at the moment, could you provide guidance or recommend a best practice for automating this process using a shell script or any other method? Currently, I am considering using tools like `expect` to automate interactions, but any advice or recommendations would be greatly appreciated.
Thank you for considering this request. I believe this feature could greatly enhance the usability and efficiency of using Ollama for many users.
Best regards,
Martín
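As a stopgap for the `--save` idea above, each turn can be logged client-side when driving the model through the HTTP API instead of the interactive REPL. A minimal sketch (`append_turn` is a hypothetical helper, not an Ollama feature):

```python
from datetime import datetime, timezone
from pathlib import Path

def append_turn(log_path, role, text):
    """Append one conversation turn to a plain-text transcript.

    Called once per prompt and once per model response, so the full
    conversation is preserved without any manual /save commands.
    """
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    with Path(log_path).open("a", encoding="utf-8") as f:
        f.write(f"[{stamp}] {role}: {text}\n")
```

A wrapper script would call `append_turn(log, "user", prompt)` before each `/api/chat` request and `append_turn(log, "assistant", reply)` after, avoiding the fragility of `expect`-style automation of the terminal UI.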
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3800/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3800/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2395
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2395/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2395/comments
|
https://api.github.com/repos/ollama/ollama/issues/2395/events
|
https://github.com/ollama/ollama/issues/2395
| 2,123,723,886
|
I_kwDOJ0Z1Ps5-lXRu
| 2,395
|
Multi-GPU setup of Tesla P100s is slow
|
{
"login": "PhilipAmadasun",
"id": 55031054,
"node_id": "MDQ6VXNlcjU1MDMxMDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/55031054?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PhilipAmadasun",
"html_url": "https://github.com/PhilipAmadasun",
"followers_url": "https://api.github.com/users/PhilipAmadasun/followers",
"following_url": "https://api.github.com/users/PhilipAmadasun/following{/other_user}",
"gists_url": "https://api.github.com/users/PhilipAmadasun/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PhilipAmadasun/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PhilipAmadasun/subscriptions",
"organizations_url": "https://api.github.com/users/PhilipAmadasun/orgs",
"repos_url": "https://api.github.com/users/PhilipAmadasun/repos",
"events_url": "https://api.github.com/users/PhilipAmadasun/events{/privacy}",
"received_events_url": "https://api.github.com/users/PhilipAmadasun/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-02-07T19:27:34
| 2024-03-21T13:58:19
| 2024-03-21T13:58:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
A multi-GPU setup of Tesla P100s is very slow compared to a single RTX 4090. I am using ollama 0.1.22. Is there something wrong with the Teslas, or are they just poor GPUs for this workload? I was told to run ollama on just one of them to see whether that makes it faster, but I am not sure how to go about it. Is there some way to restrict ollama to a single Tesla GPU?
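One common way to restrict a CUDA application to a single GPU is to set `CUDA_VISIBLE_DEVICES` before starting the server. A minimal sketch (assuming the `ollama` binary is on `PATH`; the launch itself is left commented so the sketch stays side-effect free):

```python
import os
import subprocess

def serve_on_gpu(gpu_index, binary="ollama"):
    """Build the command and environment to pin the server to one CUDA device.

    CUDA_VISIBLE_DEVICES is a standard CUDA runtime variable; the process
    will only see the device(s) listed in it.
    """
    env = dict(os.environ, CUDA_VISIBLE_DEVICES=str(gpu_index))
    cmd = [binary, "serve"]
    return cmd, env

# To actually launch:
# cmd, env = serve_on_gpu(0)
# subprocess.Popen(cmd, env=env)
```

Running the same model pinned to one P100 versus spread across both would show whether cross-GPU layer splitting is the bottleneck.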
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2395/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2395/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8374
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8374/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8374/comments
|
https://api.github.com/repos/ollama/ollama/issues/8374/events
|
https://github.com/ollama/ollama/issues/8374
| 2,780,362,293
|
I_kwDOJ0Z1Ps6luPY1
| 8,374
|
different between Modelfile PARAMETER and API
|
{
"login": "SDAIer",
"id": 174102361,
"node_id": "U_kgDOCmCXWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/174102361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SDAIer",
"html_url": "https://github.com/SDAIer",
"followers_url": "https://api.github.com/users/SDAIer/followers",
"following_url": "https://api.github.com/users/SDAIer/following{/other_user}",
"gists_url": "https://api.github.com/users/SDAIer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SDAIer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SDAIer/subscriptions",
"organizations_url": "https://api.github.com/users/SDAIer/orgs",
"repos_url": "https://api.github.com/users/SDAIer/repos",
"events_url": "https://api.github.com/users/SDAIer/events{/privacy}",
"received_events_url": "https://api.github.com/users/SDAIer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2025-01-10T14:48:53
| 2025-01-10T15:19:18
| 2025-01-10T15:19:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I want to know the difference between a PARAMETER set in a Modelfile and the same parameter passed via the API — for example, num_ctx 2048.
If my use case is to call Ollama via the API, can I, for the sake of convenient calling, use a Modelfile to create a new model with a num_ctx that meets my requirements (provided the model parameters and GPU capabilities allow it)?
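
For context (a hedged sketch, not from the original issue; the model names are illustrative): a `PARAMETER` line in a Modelfile bakes a default into the created model, while the same key under `options` in an API request overrides that default for that single request:

```shell
# Modelfile PARAMETER becomes the model's persistent default
cat > Modelfile <<'EOF'
FROM llama2
PARAMETER num_ctx 2048
EOF
ollama create mymodel -f Modelfile

# A per-request "options" value overrides the Modelfile default for this call only
curl http://localhost:11434/api/generate -d '{
  "model": "mymodel",
  "prompt": "hello",
  "options": { "num_ctx": 4096 }
}'
```

So creating a custom model from a Modelfile is exactly the convenient pattern described: API callers that omit `options` get the baked-in num_ctx.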
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "SDAIer",
"id": 174102361,
"node_id": "U_kgDOCmCXWQ",
"avatar_url": "https://avatars.githubusercontent.com/u/174102361?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SDAIer",
"html_url": "https://github.com/SDAIer",
"followers_url": "https://api.github.com/users/SDAIer/followers",
"following_url": "https://api.github.com/users/SDAIer/following{/other_user}",
"gists_url": "https://api.github.com/users/SDAIer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SDAIer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SDAIer/subscriptions",
"organizations_url": "https://api.github.com/users/SDAIer/orgs",
"repos_url": "https://api.github.com/users/SDAIer/repos",
"events_url": "https://api.github.com/users/SDAIer/events{/privacy}",
"received_events_url": "https://api.github.com/users/SDAIer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8374/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8374/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/330
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/330/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/330/comments
|
https://api.github.com/repos/ollama/ollama/issues/330/events
|
https://github.com/ollama/ollama/issues/330
| 1,846,771,571
|
I_kwDOJ0Z1Ps5uE39z
| 330
|
ollama pull llama2:70b stuck
|
{
"login": "sarvagnan",
"id": 860916,
"node_id": "MDQ6VXNlcjg2MDkxNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/860916?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sarvagnan",
"html_url": "https://github.com/sarvagnan",
"followers_url": "https://api.github.com/users/sarvagnan/followers",
"following_url": "https://api.github.com/users/sarvagnan/following{/other_user}",
"gists_url": "https://api.github.com/users/sarvagnan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sarvagnan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sarvagnan/subscriptions",
"organizations_url": "https://api.github.com/users/sarvagnan/orgs",
"repos_url": "https://api.github.com/users/sarvagnan/repos",
"events_url": "https://api.github.com/users/sarvagnan/events{/privacy}",
"received_events_url": "https://api.github.com/users/sarvagnan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 13
| 2023-08-11T12:50:39
| 2024-08-28T22:04:58
| 2023-08-23T18:49:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I have tried to pull llama2:70b, but Ollama appears to be stuck at the "pulling manifest" stage. This repeats after cancelling as well. I tried pulling orca, and it downloaded without any issues. I have attached the server log from the logs folder; these entries repeat, with almost identical timings, on each run.
Thank you in advance for any help you can provide.
```
[GIN] 2023/08/11 - 17:30:26 | 200 | 2.584µs | 127.0.0.1 | HEAD "/"
2023/08/11 17:30:28 images.go:1164: redirected to: https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/8c/8c17c2ebb0ea011be9981cc3922db8ca8fa61e828c5d3f44cb6ae342bf80460b/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20230811%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20230811T120028Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=a5aa71ee7e1eb700ed450dfb3a31a31a27c13d86617fd8a08b17860894055c13
2023/08/11 17:30:31 download.go:213: success getting sha256:8c17c2ebb0ea011be9981cc3922db8ca8fa61e828c5d3f44cb6ae342bf80460b
2023/08/11 17:30:32 images.go:1164: redirected to: https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/7c/7c23fb36d80141c4ab8cdbb61ee4790102ebd2bf7aeff414453177d4f2110e5d/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20230811%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20230811T120032Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=65d955ee08e83d4b875cce6c584ce45c08ebe74d102161ffa0c26c325b027795
2023/08/11 17:30:33 download.go:213: success getting sha256:7c23fb36d80141c4ab8cdbb61ee4790102ebd2bf7aeff414453177d4f2110e5d
2023/08/11 17:30:34 images.go:1164: redirected to: https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/57/578a2e81f7064c5118b93336dbe53dff6049bbeb4a8cee6c32a87579022e1aba/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20230811%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20230811T120034Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=bfa4befa8b20e0c3a6f68b7af4764ad9a1485735da82c5d1c54a9336b107a76d
2023/08/11 17:30:35 download.go:213: success getting sha256:578a2e81f7064c5118b93336dbe53dff6049bbeb4a8cee6c32a87579022e1aba
2023/08/11 17:30:36 images.go:1164: redirected to: https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/e3/e35ab70a78c78ebbbc4d2e2eaec8259938a6a60c34ebd9fd2e0c8b20f2cdcfc5/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20230811%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20230811T120036Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=87e970f689aabcb7f6e8473b80d7dd67509b177a91df1991e67ae71387fdbf4a
2023/08/11 17:30:36 download.go:213: success getting sha256:e35ab70a78c78ebbbc4d2e2eaec8259938a6a60c34ebd9fd2e0c8b20f2cdcfc5
2023/08/11 17:30:38 images.go:1164: redirected to: https://dd20bb891979d25aebc8bec07b2b3bbc.r2.cloudflarestorage.com/ollama/docker/registry/v2/blobs/sha256/96/96862bb35d7760e607f893b81ddef58a0288de62aaf66200b3a0e99c3e4956e5/data?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=66040c77ac1b787c3af820529859349a%2F20230811%2Fauto%2Fs3%2Faws4_request&X-Amz-Date=20230811T120037Z&X-Amz-Expires=1200&X-Amz-SignedHeaders=host&X-Amz-Signature=f2022dfb695b9c4c3273a119aea47def6ffaa2e4198de415c86765df8c53729d
2023/08/11 17:30:39 download.go:213: success getting sha256:96862bb35d7760e607f893b81ddef58a0288de62aaf66200b3a0e99c3e4956e5
[GIN] 2023/08/11 - 17:30:41 | 200 | 14.927754625s | 127.0.0.1 | POST "/api/pull"
[GIN] 2023/08/11 - 17:30:56 | 200 | 2.208µs | 127.0.0.1 | HEAD "/"
[GIN] 2023/08/11 - 17:34:50 | 200 | 2.709µs | 127.0.0.1 | HEAD "/"
[GIN] 2023/08/11 - 17:34:50 | 404 | 185.542µs | 127.0.0.1 | DELETE "/api/delete"
[GIN] 2023/08/11 - 17:35:03 | 200 | 3.083µs | 127.0.0.1 | HEAD "/"
[GIN] 2023/08/11 - 17:35:04 | 200 | 843.496584ms | 127.0.0.1 | POST "/api/pull"
[GIN] 2023/08/11 - 17:36:28 | 200 | 2.458µs | 127.0.0.1 | HEAD "/"
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/330/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/330/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/503
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/503/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/503/comments
|
https://api.github.com/repos/ollama/ollama/issues/503/events
|
https://github.com/ollama/ollama/issues/503
| 1,889,076,152
|
I_kwDOJ0Z1Ps5wmQO4
| 503
|
ollama pull llama2 error
|
{
"login": "EasonZhaoZ",
"id": 6023767,
"node_id": "MDQ6VXNlcjYwMjM3Njc=",
"avatar_url": "https://avatars.githubusercontent.com/u/6023767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/EasonZhaoZ",
"html_url": "https://github.com/EasonZhaoZ",
"followers_url": "https://api.github.com/users/EasonZhaoZ/followers",
"following_url": "https://api.github.com/users/EasonZhaoZ/following{/other_user}",
"gists_url": "https://api.github.com/users/EasonZhaoZ/gists{/gist_id}",
"starred_url": "https://api.github.com/users/EasonZhaoZ/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/EasonZhaoZ/subscriptions",
"organizations_url": "https://api.github.com/users/EasonZhaoZ/orgs",
"repos_url": "https://api.github.com/users/EasonZhaoZ/repos",
"events_url": "https://api.github.com/users/EasonZhaoZ/events{/privacy}",
"received_events_url": "https://api.github.com/users/EasonZhaoZ/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-09-10T09:41:27
| 2023-09-11T03:20:18
| 2023-09-10T13:50:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
404 Client Error: Not Found for url: https://ollama.ai/api/models
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/503/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/503/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4475
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4475/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4475/comments
|
https://api.github.com/repos/ollama/ollama/issues/4475/events
|
https://github.com/ollama/ollama/issues/4475
| 2,300,311,089
|
I_kwDOJ0Z1Ps6JG_Yx
| 4,475
|
Is it possible to enable the OpenAI API in the Docker image?
|
{
"login": "Tomichi",
"id": 2265229,
"node_id": "MDQ6VXNlcjIyNjUyMjk=",
"avatar_url": "https://avatars.githubusercontent.com/u/2265229?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Tomichi",
"html_url": "https://github.com/Tomichi",
"followers_url": "https://api.github.com/users/Tomichi/followers",
"following_url": "https://api.github.com/users/Tomichi/following{/other_user}",
"gists_url": "https://api.github.com/users/Tomichi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Tomichi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Tomichi/subscriptions",
"organizations_url": "https://api.github.com/users/Tomichi/orgs",
"repos_url": "https://api.github.com/users/Tomichi/repos",
"events_url": "https://api.github.com/users/Tomichi/events{/privacy}",
"received_events_url": "https://api.github.com/users/Tomichi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-05-16T12:45:10
| 2024-05-16T21:44:17
| 2024-05-16T18:56:30
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
I want to ask whether it is possible to enable OpenAI API compatibility in the official Ollama Docker image. The feature works well in the desktop app, but it seems to be missing from the Docker image. https://ollama.com/blog/openai-compatibility
Thank you to anybody who can help.
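
For reference (a sketch assuming the standard `ollama/ollama` image and the default port; not part of the original issue): the OpenAI-compatible endpoints are served under `/v1` on the same port as the native API, so the Docker image only needs that port published:

```shell
# Run the official image with the API port exposed
docker run -d -p 11434:11434 --name ollama ollama/ollama
docker exec ollama ollama pull llama2

# OpenAI-compatible chat completions endpoint on the same port
curl http://localhost:11434/v1/chat/completions -d '{
  "model": "llama2",
  "messages": [{"role": "user", "content": "Hello"}]
}'
```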
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4475/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4475/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8617
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8617/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8617/comments
|
https://api.github.com/repos/ollama/ollama/issues/8617/events
|
https://github.com/ollama/ollama/issues/8617
| 2,814,009,755
|
I_kwDOJ0Z1Ps6numGb
| 8,617
|
Support Request for jonatasgrosman/wav2vec2-large-xlsr-53-italian
|
{
"login": "raphael10-collab",
"id": 70313067,
"node_id": "MDQ6VXNlcjcwMzEzMDY3",
"avatar_url": "https://avatars.githubusercontent.com/u/70313067?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/raphael10-collab",
"html_url": "https://github.com/raphael10-collab",
"followers_url": "https://api.github.com/users/raphael10-collab/followers",
"following_url": "https://api.github.com/users/raphael10-collab/following{/other_user}",
"gists_url": "https://api.github.com/users/raphael10-collab/gists{/gist_id}",
"starred_url": "https://api.github.com/users/raphael10-collab/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/raphael10-collab/subscriptions",
"organizations_url": "https://api.github.com/users/raphael10-collab/orgs",
"repos_url": "https://api.github.com/users/raphael10-collab/repos",
"events_url": "https://api.github.com/users/raphael10-collab/events{/privacy}",
"received_events_url": "https://api.github.com/users/raphael10-collab/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 3
| 2025-01-27T20:37:55
| 2025-01-27T20:44:21
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
(.venv) raphy@raohy:~/llama.cpp$ git clone https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-italian
Cloning into 'wav2vec2-large-xlsr-53-italian'...
remote: Enumerating objects: 99, done.
remote: Total 99 (delta 0), reused 0 (delta 0), pack-reused 99 (from 1)
Unpacking objects: 100% (99/99), 545.41 KiB | 1.55 MiB/s, done.
Filtering content: 100% (2/2), 2.35 GiB | 92.80 MiB/s, done.
(.venv) raphy@raohy:~/llama.cpp$ ollama create Modelfile
transferring model data
unpacking model metadata
Error: Models based on 'Wav2Vec2ForCTC' are not yet supported
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8617/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8617/timeline
| null | null | false
|