Column schema (one record per GitHub issue in ollama/ollama):
- url: string (length 51-54)
- repository_url: string (1 class)
- labels_url: string (length 65-68)
- comments_url: string (length 60-63)
- events_url: string (length 58-61)
- html_url: string (length 39-44)
- id: int64 (1.78B-2.82B)
- node_id: string (length 18-19)
- number: int64 (1-8.69k)
- title: string (length 1-382)
- user: dict
- labels: list (length 0-5)
- state: string (2 classes)
- locked: bool (1 class)
- assignee: dict
- assignees: list (length 0-2)
- milestone: null
- comments: int64 (0-323)
- created_at: timestamp[s]
- updated_at: timestamp[s]
- closed_at: timestamp[s]
- author_association: string (4 classes)
- sub_issues_summary: dict
- active_lock_reason: null
- draft: bool (2 classes)
- pull_request: dict
- body: string (length 2-118k, nullable)
- closed_by: dict
- reactions: dict
- timeline_url: string (length 60-63)
- performed_via_github_app: null
- state_reason: string (4 classes)
- is_pull_request: bool (2 classes)
## Issue #4508 (pull request): add OLLAMA_NOHISTORY to turn off history in interactive mode
- url: https://api.github.com/repos/ollama/ollama/issues/4508
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/4508/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/4508/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/4508/events
- html_url: https://github.com/ollama/ollama/pull/4508
- id: 2303689972
- node_id: PR_kwDOJ0Z1Ps5v18ny
- number: 4508
- user:
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels: []
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 4
- created_at: 2024-05-17T23:38:16
- updated_at: 2025-01-18T05:03:26
- closed_at: 2024-05-18T18:51:57
- author_association: CONTRIBUTOR
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: false
- pull_request:
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4508",
"html_url": "https://github.com/ollama/ollama/pull/4508",
"diff_url": "https://github.com/ollama/ollama/pull/4508.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4508.patch",
"merged_at": "2024-05-18T18:51:57"
}
- body: fixes #3002
- closed_by:
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4508/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/4508/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: true
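The PR above wires interactive history on/off to an environment variable. As a rough illustration of how such a toggle is typically read (the function name and exact semantics here are assumptions, not ollama's actual code, which is in Go):

```python
def history_enabled(env: dict) -> bool:
    """History stays on unless OLLAMA_NOHISTORY is set to a non-empty value.

    Illustrative sketch only; ollama's real check lives in its Go readline
    code and may treat values differently.
    """
    return not env.get("OLLAMA_NOHISTORY")
```

So `OLLAMA_NOHISTORY=1 ollama run <model>` would start an interactive session that records no history, assuming any non-empty value counts as "disabled".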
## Issue #5298: Internal error at url manifests/sha256:
- url: https://api.github.com/repos/ollama/ollama/issues/5298
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/5298/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/5298/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/5298/events
- html_url: https://github.com/ollama/ollama/issues/5298
- id: 2375365380
- node_id: I_kwDOJ0Z1Ps6NlTME
- number: 5298
- user:
{
"login": "alexeu1994",
"id": 20879475,
"node_id": "MDQ6VXNlcjIwODc5NDc1",
"avatar_url": "https://avatars.githubusercontent.com/u/20879475?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alexeu1994",
"html_url": "https://github.com/alexeu1994",
"followers_url": "https://api.github.com/users/alexeu1994/followers",
"following_url": "https://api.github.com/users/alexeu1994/following{/other_user}",
"gists_url": "https://api.github.com/users/alexeu1994/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alexeu1994/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alexeu1994/subscriptions",
"organizations_url": "https://api.github.com/users/alexeu1994/orgs",
"repos_url": "https://api.github.com/users/alexeu1994/repos",
"events_url": "https://api.github.com/users/alexeu1994/events{/privacy}",
"received_events_url": "https://api.github.com/users/alexeu1994/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
]
- state: open
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 12
- created_at: 2024-06-26T13:31:28
- updated_at: 2025-01-09T17:55:25
- closed_at: null
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
### What is the issue?
A Sonatype Nexus proxy was configured and working, but two weeks ago it started giving an error when requesting a manifest.
```
2024-06-24 22:19:00,375+0300 DEBUG [nexus-httpclient-eviction-thread] *SYSTEM org.apache.http.impl.conn.CPool - Connection [id:3377][route:{s}->[https://registry.ollama.ai:443](https://registry.ollama.ai/)][state:null] expired @ Mon Jun 24 22:18:56 MSK 2024
2024-06-24 22:20:23,263+0300 DEBUG [qtp1079333755-4212] anonymous org.sonatype.nexus.httpclient.outbound - https://registry.ollama.ai/v2/library/all-minilm/manifests/latest > GET /v2/library/all-minilm/manifests/latest HTTP/1.1
2024-06-24 22:20:25,591+0300 DEBUG [qtp1079333755-4212] anonymous org.sonatype.nexus.httpclient.outbound - https://registry.ollama.ai/v2/library/all-minilm/manifests/latest < HTTP/1.1 200 OK @ 2.329 s
2024-06-24 22:20:25,594+0300 DEBUG [qtp1079333755-4212] anonymous org.sonatype.nexus.httpclient.outbound - https://registry.ollama.ai/v2/library/all-minilm/manifests/sha256:e05286252610b4a128ff0f46cff09a65898dfff124afd0778b5a3705947869cb > GET /v2/library/all-minilm/manifests/sha256:e05286252610b4a128ff0f46cff09a65898dfff124afd0778b5a3705947869cb HTTP/1.1
2024-06-24 22:20:25,953+0300 DEBUG [qtp1079333755-4212] anonymous org.sonatype.nexus.httpclient.outbound - https://registry.ollama.ai/v2/library/all-minilm/manifests/sha256:e05286252610b4a128ff0f46cff09a65898dfff124afd0778b5a3705947869cb < HTTP/1.1 500 Internal Server Error @ 358.1 ms
2024-06-24 22:20:25,953+0300 INFO [qtp1079333755-4212] anonymous org.sonatype.nexus.repository.httpclient.internal.HttpClientFacetImpl - Repository status for ollamaio-docker-proxy changed from AVAILABLE to UNAVAILABLE - reason Internal Server Error for https://registry.ollama.ai/
```
Nexus then tries to fetch the manifest by its digest but gets an error:
https://registry.ollama.ai/v2/library/all-minilm/manifests/sha256:e05286252610b4a128ff0f46cff09a65898dfff124afd0778b5a3705947869cb
registry.ollama.ai can't handle a colon in manifest URLs.
**Any link containing a colon returns an error:**
https://registry.ollama.ai/v2/library/all-minilm/manifests/anyurl:a
500 Internal Server Error
{"errors":[{"code":"INTERNAL_ERROR","message":"internal error","detail":null}]}
**Any link without a colon works correctly:**
https://registry.ollama.ai/v2/library/all-minilm/manifests/anyurl
404 Not Found
{"errors":[{"code":"MANIFEST_UNKNOWN","message":"manifest unknown"}]}
Can you fix the "500 Internal Server Error"?
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
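The failure mode above looks like a reference parser that only expects tags. A minimal sketch of the distinction a registry needs to draw (the function and the "invalid" bucket are illustrative assumptions, not the registry's actual code): a digest reference legitimately contains a colon and should be served, while a malformed colon reference should map to a 4xx, never a 500.

```python
def classify_reference(ref: str) -> str:
    """Classify an OCI-style manifest reference as tag, digest, or invalid."""
    algo, sep, digest = ref.partition(":")
    if not sep:
        return "tag"      # e.g. "latest"
    if algo == "sha256" and len(digest) == 64 and all(c in "0123456789abcdef" for c in digest):
        return "digest"   # e.g. "sha256:e0528625..."
    return "invalid"      # e.g. "anyurl:a" -- should yield a 4xx, not a 500
```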
- closed_by: null
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5298/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/5298/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: false
## Issue #2025: model stable-code is not stable
- url: https://api.github.com/repos/ollama/ollama/issues/2025
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/2025/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/2025/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/2025/events
- html_url: https://github.com/ollama/ollama/issues/2025
- id: 2084978849
- node_id: I_kwDOJ0Z1Ps58RkCh
- number: 2025
- user:
{
"login": "iplayfast",
"id": 751306,
"node_id": "MDQ6VXNlcjc1MTMwNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/751306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iplayfast",
"html_url": "https://github.com/iplayfast",
"followers_url": "https://api.github.com/users/iplayfast/followers",
"following_url": "https://api.github.com/users/iplayfast/following{/other_user}",
"gists_url": "https://api.github.com/users/iplayfast/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iplayfast/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iplayfast/subscriptions",
"organizations_url": "https://api.github.com/users/iplayfast/orgs",
"repos_url": "https://api.github.com/users/iplayfast/repos",
"events_url": "https://api.github.com/users/iplayfast/events{/privacy}",
"received_events_url": "https://api.github.com/users/iplayfast/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels: []
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 4
- created_at: 2024-01-16T21:29:11
- updated_at: 2024-03-11T18:36:43
- closed_at: 2024-03-11T18:36:43
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
Asking "what languages do you know" results in an endless display of output such as:
```
. This particular event has actually just been added to our entire project code base here above, which means that a new unique identifier for this particular event has also been generated automatically by my very special personal computer system right now and which is why it can be said with some certainty
that the following thing has happened:
=>
```
Asked to create a snake game in Python, it writes half the program in Python and the other half in C++.
- closed_by:
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2025/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/2025/timeline
- performed_via_github_app: null
- state_reason: completed
- is_pull_request: false
## Issue #8569: Linux: Compiling Ollama with AVX-512 and CUDA support
- url: https://api.github.com/repos/ollama/ollama/issues/8569
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/8569/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/8569/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/8569/events
- html_url: https://github.com/ollama/ollama/issues/8569
- id: 2810087681
- node_id: I_kwDOJ0Z1Ps6nfokB
- number: 8569
- user:
{
"login": "graynoir",
"id": 184021645,
"node_id": "U_kgDOCvfyjQ",
"avatar_url": "https://avatars.githubusercontent.com/u/184021645?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/graynoir",
"html_url": "https://github.com/graynoir",
"followers_url": "https://api.github.com/users/graynoir/followers",
"following_url": "https://api.github.com/users/graynoir/following{/other_user}",
"gists_url": "https://api.github.com/users/graynoir/gists{/gist_id}",
"starred_url": "https://api.github.com/users/graynoir/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/graynoir/subscriptions",
"organizations_url": "https://api.github.com/users/graynoir/orgs",
"repos_url": "https://api.github.com/users/graynoir/repos",
"events_url": "https://api.github.com/users/graynoir/events{/privacy}",
"received_events_url": "https://api.github.com/users/graynoir/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels: []
- state: open
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 11
- created_at: 2025-01-24T18:13:32
- updated_at: 2025-01-26T22:06:04
- closed_at: null
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
Hi, I've been trying to compile Ollama with AVX-512 and CUDA support on Linux (Manjaro). Despite multiple attempts with different custom CPU flags I can't get it to work: `ollama serve` falls back to `level=INFO source=routes.go:1267 msg="Dynamic LLM libraries" runners=[cpu]`, whereas the prebuilt binary package at least had `runners="[cpu cpu_avx cpu_avx2 cuda_v12_avx]` (though still no AVX-512). I don't know much about Go, and I've read `development.md`, but all I was able to get from `ollama serve` after setting the custom flags was:
```
system info="CPU : SSE3 = 1 | SSSE3 = 1 | AVX = 1 | AVX2 =1 | F16C = 1 | FMA = 1 | AVX512 = 1 | AVX512_VBMI = 1 | AVX512_VNNI = 1 | AVX512_BF16 = 1 | LLAMAFILE = 1 | AARCH64_REPACK = 1 | CPU : SSE3 = 1 | SSSE3 = 1 | AVX = 1 | AVX2 = 1 | F16C = 1 | FMA = 1 | AVX512 = 1 | AVX512_VBMI = 1 | AVX512_VNNI = 1 | AVX512_BF16 = 1 | LLAMAFILE = 1 | AARCH64_REPACK = 1 | cgo(gcc)" threads=8
```
after adding the following flags to the `make` command to build ollama:
```
make CUSTOM_CPU_FLAGS=avx,avx2,avx512,avx512vbmi,avx512vnni,avx512bf16 -j16
```
But it seems neither AVX is working nor is the GPU being used, as the inference speed is much slower than with the pre-compiled ollama binary.
Does anybody have a successful build of Ollama with CUDA and AVX-512 and know how to achieve it? Thanks in advance!
Additional information:
* ollama version: 0.5.7-2
* cuda version: 12.2
* Nvidia driver version: 535.216.01
* cpu: Intel Xeon W-11955m
* gpu: RTX A5000
- closed_by: null
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8569/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/8569/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: false
## Issue #3402: "ollama run llama2" Fails with Connection Error and Runtime Panic on Windows 11
- url: https://api.github.com/repos/ollama/ollama/issues/3402
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/3402/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/3402/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/3402/events
- html_url: https://github.com/ollama/ollama/issues/3402
- id: 2214509688
- node_id: I_kwDOJ0Z1Ps6D_rx4
- number: 3402
- user:
{
"login": "cvecve147",
"id": 12343899,
"node_id": "MDQ6VXNlcjEyMzQzODk5",
"avatar_url": "https://avatars.githubusercontent.com/u/12343899?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cvecve147",
"html_url": "https://github.com/cvecve147",
"followers_url": "https://api.github.com/users/cvecve147/followers",
"following_url": "https://api.github.com/users/cvecve147/following{/other_user}",
"gists_url": "https://api.github.com/users/cvecve147/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cvecve147/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cvecve147/subscriptions",
"organizations_url": "https://api.github.com/users/cvecve147/orgs",
"repos_url": "https://api.github.com/users/cvecve147/repos",
"events_url": "https://api.github.com/users/cvecve147/events{/privacy}",
"received_events_url": "https://api.github.com/users/cvecve147/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
]
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 0
- created_at: 2024-03-29T02:44:57
- updated_at: 2024-04-08T05:33:10
- closed_at: 2024-04-08T05:33:10
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
### What is the issue?
When attempting to run the command **ollama run llama2**, I encountered a connection error followed by a runtime panic. Initially, the process attempts to pull a manifest but fails with a connection error indicating that the connection to **127.0.0.1:11434** was refused. Subsequently, inspecting the logs revealed a **panic: runtime error: index out of range** error, specifically occurring during a blob download operation as part of the PullModel process. This issue prevents any further operation of the command and effectively halts the execution.
### What did you expect to see?
Upon executing the ollama run llama2 command, I expected the process to successfully connect, download necessary blobs, and proceed without any runtime errors. Ideally, the command should result in the model being prepared and ready for use, without encountering connection issues or runtime panics.
### Steps to reproduce
1. Ensure that Ollama version 0.1.30 is installed on a Windows 11 23H2 environment.
2. Open a terminal window.
3. Execute the command: **ollama run llama2**.
4. Observe the error messages related to connection failure and runtime panic in the output or logs.
### Are there any recent changes that introduced the issue?
_No response_
### OS
Windows
### Ollama version
0.1.30
### GPU
Nvidia
### server.log
```
[GIN] 2024/03/29 - 10:38:02 | 404 | 565.8µs | 127.0.0.1 | POST "/api/show"
panic: runtime error: index out of range [0] with length 0
goroutine 22 [running]:
github.com/ollama/ollama/server.(*blobDownload).Prepare(0xc0001ec150, {0x418cf80, 0xc0000f06e0}, 0xc0004fe360, 0xc00024a240)
github.com/ollama/ollama/server/download.go:136 +0x4f6
github.com/ollama/ollama/server.downloadBlob({0x418cf80, 0xc0000f06e0}, {{{0x1869044, 0x5}, {0x187c878, 0x12}, {0x18717df, 0x7}, {0xc0001524c4, 0x6}, ...}, ...})
github.com/ollama/ollama/server/download.go:369 +0x4f6
github.com/ollama/ollama/server.PullModel({0x418cf80, 0xc0000f06e0}, {0xc0001524c4?, 0x0?}, 0xc00024a240, 0xc000500240)
github.com/ollama/ollama/server/images.go:1031 +0x508
github.com/ollama/ollama/server.PullModelHandler.func1()
github.com/ollama/ollama/server/routes.go:523 +0x14b
created by github.com/ollama/ollama/server.PullModelHandler in goroutine 20
github.com/ollama/ollama/server/routes.go:510 +0x14b
```
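The panic at `download.go:136` ("index out of range [0] with length 0") is an unguarded first-element access. The defensive pattern, sketched in Python since the Go source isn't reproduced here (`parts` is an illustrative stand-in for whatever slice `Prepare` indexes):

```python
def first_part(parts: list):
    """Return the first download part, failing with a clear error instead of
    an index panic when the part list is empty."""
    if not parts:
        raise ValueError("no download parts: blob metadata missing or empty")
    return parts[0]
```

In Go the equivalent is a `len(...) == 0` check before indexing, returning an error up the PullModel call chain instead of crashing the server.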
- closed_by:
{
"login": "cvecve147",
"id": 12343899,
"node_id": "MDQ6VXNlcjEyMzQzODk5",
"avatar_url": "https://avatars.githubusercontent.com/u/12343899?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cvecve147",
"html_url": "https://github.com/cvecve147",
"followers_url": "https://api.github.com/users/cvecve147/followers",
"following_url": "https://api.github.com/users/cvecve147/following{/other_user}",
"gists_url": "https://api.github.com/users/cvecve147/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cvecve147/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cvecve147/subscriptions",
"organizations_url": "https://api.github.com/users/cvecve147/orgs",
"repos_url": "https://api.github.com/users/cvecve147/repos",
"events_url": "https://api.github.com/users/cvecve147/events{/privacy}",
"received_events_url": "https://api.github.com/users/cvecve147/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3402/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/3402/timeline
- performed_via_github_app: null
- state_reason: completed
- is_pull_request: false
## Issue #8655: GPU process at 1-3% when running Deepseek R1 32b
- url: https://api.github.com/repos/ollama/ollama/issues/8655
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/8655/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/8655/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/8655/events
- html_url: https://github.com/ollama/ollama/issues/8655
- id: 2818034521
- node_id: I_kwDOJ0Z1Ps6n98tZ
- number: 8655
- user:
{
"login": "BananasMan",
"id": 112043755,
"node_id": "U_kgDOBq2m6w",
"avatar_url": "https://avatars.githubusercontent.com/u/112043755?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BananasMan",
"html_url": "https://github.com/BananasMan",
"followers_url": "https://api.github.com/users/BananasMan/followers",
"following_url": "https://api.github.com/users/BananasMan/following{/other_user}",
"gists_url": "https://api.github.com/users/BananasMan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BananasMan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BananasMan/subscriptions",
"organizations_url": "https://api.github.com/users/BananasMan/orgs",
"repos_url": "https://api.github.com/users/BananasMan/repos",
"events_url": "https://api.github.com/users/BananasMan/events{/privacy}",
"received_events_url": "https://api.github.com/users/BananasMan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
]
- state: open
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 3
- created_at: 2025-01-29T12:09:24
- updated_at: 2025-01-30T08:53:18
- closed_at: null
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
### What is the issue?
I'm trying to run DeepSeek R1 32b locally. It runs, but the GPU is barely used.
When it processes a simple task like multiplying numbers, Task Manager shows the GPU at only 1-3% while the CPU is at 70%.
I should add, though, that both RAM and VRAM are heavily used and both almost full.
GPU: RTX 3060 12GB
CPU: Ryzen 5 5600
RAM: 16GB
If you think my PC is not strong enough to run the 32b model, I don't mind recommendations for which R1 models I should use.
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
newest one
- closed_by: null
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8655/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/8655/timeline
- performed_via_github_app: null
- state_reason: null
- is_pull_request: false
## Issue #4680: Json Mode significantly decrease GPU usage
- url: https://api.github.com/repos/ollama/ollama/issues/4680
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/4680/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/4680/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/4680/events
- html_url: https://github.com/ollama/ollama/issues/4680
- id: 2321201470
- node_id: I_kwDOJ0Z1Ps6KWrk-
- number: 4680
- user:
{
"login": "LaetLanf",
"id": 131473617,
"node_id": "U_kgDOB9Yg0Q",
"avatar_url": "https://avatars.githubusercontent.com/u/131473617?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LaetLanf",
"html_url": "https://github.com/LaetLanf",
"followers_url": "https://api.github.com/users/LaetLanf/followers",
"following_url": "https://api.github.com/users/LaetLanf/following{/other_user}",
"gists_url": "https://api.github.com/users/LaetLanf/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LaetLanf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LaetLanf/subscriptions",
"organizations_url": "https://api.github.com/users/LaetLanf/orgs",
"repos_url": "https://api.github.com/users/LaetLanf/repos",
"events_url": "https://api.github.com/users/LaetLanf/events{/privacy}",
"received_events_url": "https://api.github.com/users/LaetLanf/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels:
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
]
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 1
- created_at: 2024-05-28T14:17:19
- updated_at: 2024-05-28T20:41:50
- closed_at: 2024-05-28T20:41:50
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
### What is the issue?
I am running Ollama Llama3:70b-instruct on an Azure Linux A100 VM.
I ran a test with and without JSON mode, using the exact same prompt and Python code. The only thing I changed was `format='json'` in the chat call.
WITHOUT JSON mode, I reached:
22-25 TPS for 1 chat call
The GPU monitoring (see attached) clearly shows that the GPU is well used.
WITH JSON mode:
6 TPS for 1 chat call
The GPU monitoring (see attached) clearly shows that the GPU is NOT fully used.

### OS
Linux
### GPU
Nvidia
### CPU
Other
### Ollama version
0.1.38
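For reference, the reporter's comparison toggles a single field. A hedged sketch of the two request payloads for ollama's `/api/chat` endpoint (payload construction only, nothing is sent; the prompt text is a placeholder):

```python
import json

def chat_payload(model: str, prompt: str, json_mode: bool) -> str:
    """Build a /api/chat request body; json_mode adds the "format" field,
    the only difference between the two runs compared above."""
    req = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    if json_mode:
        req["format"] = "json"
    return json.dumps(req)
```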
- closed_by:
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4680/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/4680/timeline
- performed_via_github_app: null
- state_reason: completed
- is_pull_request: false
## Issue #3827: Enable CORS for "app://obsidian.md"
- url: https://api.github.com/repos/ollama/ollama/issues/3827
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/3827/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/3827/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/3827/events
- html_url: https://github.com/ollama/ollama/issues/3827
- id: 2256950300
- node_id: I_kwDOJ0Z1Ps6GhlQc
- number: 3827
- user:
{
"login": "pegasusthemis",
"id": 167796164,
"node_id": "U_kgDOCgBdxA",
"avatar_url": "https://avatars.githubusercontent.com/u/167796164?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pegasusthemis",
"html_url": "https://github.com/pegasusthemis",
"followers_url": "https://api.github.com/users/pegasusthemis/followers",
"following_url": "https://api.github.com/users/pegasusthemis/following{/other_user}",
"gists_url": "https://api.github.com/users/pegasusthemis/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pegasusthemis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pegasusthemis/subscriptions",
"organizations_url": "https://api.github.com/users/pegasusthemis/orgs",
"repos_url": "https://api.github.com/users/pegasusthemis/repos",
"events_url": "https://api.github.com/users/pegasusthemis/events{/privacy}",
"received_events_url": "https://api.github.com/users/pegasusthemis/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- labels:
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
]
- state: closed
- locked: false
- assignee: null
- assignees: []
- milestone: null
- comments: 7
- created_at: 2024-04-22T16:13:48
- updated_at: 2024-08-23T08:43:31
- closed_at: 2024-04-23T22:56:16
- author_association: NONE
- sub_issues_summary:
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
- active_lock_reason: null
- draft: null
- pull_request: null
- body:
Hello,
I am a developer creating plugins for Obsidian, a popular knowledge management and note-taking software. I believe that enabling CORS for `app://obsidian.md` would significantly enhance the functionality and integration possibilities of Obsidian plugins with Ollama models.
Obsidian uses a custom protocol `app://obsidian.md` which I think is secure and used exclusively within the Obsidian environment. Allowing CORS for this protocol does not introduce significant security risks as it is a controlled environment, primarily dealing with local files and user-initiated actions. This environment ensures that enabling CORS would not expose Ollama to unwanted cross-origin requests in a way that compromises security.
By enabling CORS, developers can seamlessly integrate Ollama's advanced modeling capabilities directly into Obsidian, enhancing the user experience and providing advanced features directly within the application.
I don't know if I'm expressing myself properly, but I'm simply asking whether it would be possible to put `app://obsidian.md` in OLLAMA_ORIGINS, to avoid using `*` in this case.
Thank you for considering this enhancement. I appreciate the fantastic work you've done with Ollama and fully respect any decision you make regarding this request. Your efforts in developing such a powerful tool are greatly appreciated by the community.
Best regards,
Pegasus Themis
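What the request boils down to is adding one entry to the allowlist instead of wildcarding it. A minimal sketch of comma-separated OLLAMA_ORIGINS matching (an illustrative assumption; ollama's actual matcher also handles wildcards and its built-in default origins):

```python
def origin_allowed(origin: str, ollama_origins: str) -> bool:
    """Exact-match check of a request Origin header against a comma-separated
    OLLAMA_ORIGINS value, e.g. "app://obsidian.md,http://localhost"."""
    allowed = {o.strip() for o in ollama_origins.split(",") if o.strip()}
    return origin in allowed
```

So launching the server with `OLLAMA_ORIGINS=app://obsidian.md ollama serve` would admit Obsidian's custom scheme without resorting to `*`.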
- closed_by:
{
"login": "pegasusthemis",
"id": 167796164,
"node_id": "U_kgDOCgBdxA",
"avatar_url": "https://avatars.githubusercontent.com/u/167796164?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pegasusthemis",
"html_url": "https://github.com/pegasusthemis",
"followers_url": "https://api.github.com/users/pegasusthemis/followers",
"following_url": "https://api.github.com/users/pegasusthemis/following{/other_user}",
"gists_url": "https://api.github.com/users/pegasusthemis/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pegasusthemis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pegasusthemis/subscriptions",
"organizations_url": "https://api.github.com/users/pegasusthemis/orgs",
"repos_url": "https://api.github.com/users/pegasusthemis/repos",
"events_url": "https://api.github.com/users/pegasusthemis/events{/privacy}",
"received_events_url": "https://api.github.com/users/pegasusthemis/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
- reactions:
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3827/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
- timeline_url: https://api.github.com/repos/ollama/ollama/issues/3827/timeline
- performed_via_github_app: null
- state_reason: completed
- is_pull_request: false
## Issue #2040 (pull request): Add cuda to CI build
- url: https://api.github.com/repos/ollama/ollama/issues/2040
- repository_url: https://api.github.com/repos/ollama/ollama
- labels_url: https://api.github.com/repos/ollama/ollama/issues/2040/labels{/name}
- comments_url: https://api.github.com/repos/ollama/ollama/issues/2040/comments
- events_url: https://api.github.com/repos/ollama/ollama/issues/2040/events
- html_url: https://github.com/ollama/ollama/pull/2040
- id: 2087387204
- node_id: PR_kwDOJ0Z1Ps5kYHGf
- number: 2040
- user:
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-01-18T03:05:47
| 2024-01-27T15:14:58
| 2024-01-27T15:14:55
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2040",
"html_url": "https://github.com/ollama/ollama/pull/2040",
"diff_url": "https://github.com/ollama/ollama/pull/2040.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2040.patch",
"merged_at": null
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2040/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2040/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5730
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5730/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5730/comments
|
https://api.github.com/repos/ollama/ollama/issues/5730/events
|
https://github.com/ollama/ollama/pull/5730
| 2,412,022,138
|
PR_kwDOJ0Z1Ps51j27m
| 5,730
|
remove unneeded tool calls
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-16T20:50:20
| 2024-07-16T21:42:14
| 2024-07-16T21:42:13
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5730",
"html_url": "https://github.com/ollama/ollama/pull/5730",
"diff_url": "https://github.com/ollama/ollama/pull/5730.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5730.patch",
"merged_at": "2024-07-16T21:42:13"
}
|
ID and Type are currently unused, so leave them out for now
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5730/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5730/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1035
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1035/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1035/comments
|
https://api.github.com/repos/ollama/ollama/issues/1035/events
|
https://github.com/ollama/ollama/pull/1035
| 1,982,028,245
|
PR_kwDOJ0Z1Ps5e17Rq
| 1,035
|
add a complete /generate options example
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-07T19:01:44
| 2023-11-09T00:44:37
| 2023-11-09T00:44:37
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1035",
"html_url": "https://github.com/ollama/ollama/pull/1035",
"diff_url": "https://github.com/ollama/ollama/pull/1035.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1035.patch",
"merged_at": "2023-11-09T00:44:37"
}
|
- Add an example to the api docs that shows how all generate runtime options can be specified
- Move the `GenerateRequest` options closer to the struct declaration so it's easier for readers to find
resolves #1027
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1035/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1035/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6583
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6583/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6583/comments
|
https://api.github.com/repos/ollama/ollama/issues/6583/events
|
https://github.com/ollama/ollama/pull/6583
| 2,499,201,738
|
PR_kwDOJ0Z1Ps56EiQV
| 6,583
|
Update README.md
|
{
"login": "jonathanhecl",
"id": 1691623,
"node_id": "MDQ6VXNlcjE2OTE2MjM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1691623?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jonathanhecl",
"html_url": "https://github.com/jonathanhecl",
"followers_url": "https://api.github.com/users/jonathanhecl/followers",
"following_url": "https://api.github.com/users/jonathanhecl/following{/other_user}",
"gists_url": "https://api.github.com/users/jonathanhecl/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jonathanhecl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jonathanhecl/subscriptions",
"organizations_url": "https://api.github.com/users/jonathanhecl/orgs",
"repos_url": "https://api.github.com/users/jonathanhecl/repos",
"events_url": "https://api.github.com/users/jonathanhecl/events{/privacy}",
"received_events_url": "https://api.github.com/users/jonathanhecl/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-09-01T03:53:54
| 2024-09-02T19:34:26
| 2024-09-02T19:34:26
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6583",
"html_url": "https://github.com/ollama/ollama/pull/6583",
"diff_url": "https://github.com/ollama/ollama/pull/6583.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6583.patch",
"merged_at": "2024-09-02T19:34:26"
}
|
New links:
Go-CREW and Ollamaclient for Golang
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6583/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6583/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3758
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3758/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3758/comments
|
https://api.github.com/repos/ollama/ollama/issues/3758/events
|
https://github.com/ollama/ollama/issues/3758
| 2,253,454,463
|
I_kwDOJ0Z1Ps6GUPx_
| 3,758
|
Ollama backend down?
|
{
"login": "piratos",
"id": 8265152,
"node_id": "MDQ6VXNlcjgyNjUxNTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/8265152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/piratos",
"html_url": "https://github.com/piratos",
"followers_url": "https://api.github.com/users/piratos/followers",
"following_url": "https://api.github.com/users/piratos/following{/other_user}",
"gists_url": "https://api.github.com/users/piratos/gists{/gist_id}",
"starred_url": "https://api.github.com/users/piratos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/piratos/subscriptions",
"organizations_url": "https://api.github.com/users/piratos/orgs",
"repos_url": "https://api.github.com/users/piratos/repos",
"events_url": "https://api.github.com/users/piratos/events{/privacy}",
"received_events_url": "https://api.github.com/users/piratos/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-04-19T16:52:04
| 2024-04-19T17:00:03
| 2024-04-19T17:00:03
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
ollama pull returns
```
no healthy upstream
```
llama 3 release traffic killed your backend? :smile:
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.31
|
{
"login": "piratos",
"id": 8265152,
"node_id": "MDQ6VXNlcjgyNjUxNTI=",
"avatar_url": "https://avatars.githubusercontent.com/u/8265152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/piratos",
"html_url": "https://github.com/piratos",
"followers_url": "https://api.github.com/users/piratos/followers",
"following_url": "https://api.github.com/users/piratos/following{/other_user}",
"gists_url": "https://api.github.com/users/piratos/gists{/gist_id}",
"starred_url": "https://api.github.com/users/piratos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/piratos/subscriptions",
"organizations_url": "https://api.github.com/users/piratos/orgs",
"repos_url": "https://api.github.com/users/piratos/repos",
"events_url": "https://api.github.com/users/piratos/events{/privacy}",
"received_events_url": "https://api.github.com/users/piratos/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3758/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3758/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7332
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7332/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7332/comments
|
https://api.github.com/repos/ollama/ollama/issues/7332/events
|
https://github.com/ollama/ollama/issues/7332
| 2,608,448,576
|
I_kwDOJ0Z1Ps6becRA
| 7,332
|
Support installations in non-systemd distros
|
{
"login": "Sachin-Bhat",
"id": 25080916,
"node_id": "MDQ6VXNlcjI1MDgwOTE2",
"avatar_url": "https://avatars.githubusercontent.com/u/25080916?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sachin-Bhat",
"html_url": "https://github.com/Sachin-Bhat",
"followers_url": "https://api.github.com/users/Sachin-Bhat/followers",
"following_url": "https://api.github.com/users/Sachin-Bhat/following{/other_user}",
"gists_url": "https://api.github.com/users/Sachin-Bhat/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Sachin-Bhat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Sachin-Bhat/subscriptions",
"organizations_url": "https://api.github.com/users/Sachin-Bhat/orgs",
"repos_url": "https://api.github.com/users/Sachin-Bhat/repos",
"events_url": "https://api.github.com/users/Sachin-Bhat/events{/privacy}",
"received_events_url": "https://api.github.com/users/Sachin-Bhat/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-10-23T12:35:05
| 2024-11-21T18:58:57
| 2024-11-21T18:58:56
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello folks,
Kudos on this project! I wanted to install this on my system, which is Artix Linux running the runit init system. I was wondering if support could be added for this.
Appreciate the assistance in advance.
Cheers,
Sachin
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7332/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7332/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6237
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6237/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6237/comments
|
https://api.github.com/repos/ollama/ollama/issues/6237/events
|
https://github.com/ollama/ollama/issues/6237
| 2,453,913,814
|
I_kwDOJ0Z1Ps6SQ8DW
| 6,237
|
Ollama Product Stance on Grammar Feature / Outstanding PRs
|
{
"login": "Kinglord",
"id": 597488,
"node_id": "MDQ6VXNlcjU5NzQ4OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/597488?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kinglord",
"html_url": "https://github.com/Kinglord",
"followers_url": "https://api.github.com/users/Kinglord/followers",
"following_url": "https://api.github.com/users/Kinglord/following{/other_user}",
"gists_url": "https://api.github.com/users/Kinglord/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kinglord/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kinglord/subscriptions",
"organizations_url": "https://api.github.com/users/Kinglord/orgs",
"repos_url": "https://api.github.com/users/Kinglord/repos",
"events_url": "https://api.github.com/users/Kinglord/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kinglord/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 18
| 2024-08-07T16:47:55
| 2024-12-05T00:52:24
| 2024-12-05T00:31:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
This isn't a feature request, but it's the best category I could pick. This is really a question around merging PRs for exposing an existing feature to users of Ollama that are being ignored or declined without good context. I'm asking this to get more public visibility from the Ollama team on grammar features, specifically those implemented and existing in llama.cpp.
I understand Ollama provides JSON schema functionality as a way to direct and control the output from models; another popular approach is the use of GBNF / Grammar, which is supported and implemented in llama.cpp currently. Several PRs have been submitted to expose this feature to Ollama users, and have been either sitting idle or closed. This particular point is going to continue to surface and make noise (there was a large help thread in the Discord started today) until Ollama makes a clear and public statement on this issue. If Ollama as a product has decided not to give users this choice, and is saying that if you want to use or test this feature it must be done outside of Ollama, then you need to let us (the community) know. If there is some problem with the way the community is exposing this feature in the PRs, then again just let us know so we can fix it. I understand as a contributor it can be hard to understand why a product does not want to give users more choices and options, and I think Ollama needs to clearly state why this choice has been made for the product.
This is not a post to talk about which approach between GBNF and json is better or worse - this is a post to clarify that there is community demand for the ability to use this feature in Ollama, and Ollama apparently actively rejecting the inclusion of it based on what I have to assume are product calls the community does not have visibility on. I hope this post will end that lack of clarity for all involved, so we all will know Ollama's stance and as a community we can stop bringing this up and submitting additional PRs. If anyone wants to start a more technical post and provide data on why one approach can be better than another, I welcome you to do so and link it to this topic.
My simple personal example is this. As a newer Ollama user I actually would like to try out both approaches to see which one works better for me and my product. Right now in Ollama I simply cannot, and from appearances (which can be deceiving) it appears that what's stopping me from testing these both in Ollama is a simple code change to expose the feature in llama.cpp to me. _(**edit**: It was brought to my attention that Ollama actually uses GBNF internally to enforce json syntax, so the only thing that's really missing is exposing this feature to the end user to customize or use different grammar.)_
There might be more, but for reference here are some links to other discussions about this topic as well as a link to Discord thread from earlier today. Thanks to the Ollama team for taking a look at this and helping align the community with their future response.
Discord:
https://discord.com/channels/1128867683291627614/1236730825928741034
Github PRs:
https://github.com/ollama/ollama/pull/565
https://github.com/ollama/ollama/pull/830
https://github.com/ollama/ollama/pull/1606
https://github.com/ollama/ollama/pull/2404
https://github.com/ollama/ollama/pull/2754
https://github.com/ollama/ollama/pull/3303
https://github.com/ollama/ollama/pull/3618
https://github.com/ollama/ollama/pull/4525
https://github.com/ollama/ollama/pull/5348
Github Issues:
https://github.com/ollama/ollama/issues/808
https://github.com/ollama/ollama/issues/1507
https://github.com/ollama/ollama/issues/3616
https://github.com/ollama/ollama/issues/4074
https://github.com/ollama/ollama/issues/4370
https://github.com/ollama/ollama/issues/6002
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6237/reactions",
"total_count": 62,
"+1": 29,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 25,
"rocket": 0,
"eyes": 8
}
|
https://api.github.com/repos/ollama/ollama/issues/6237/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/625
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/625/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/625/comments
|
https://api.github.com/repos/ollama/ollama/issues/625/events
|
https://github.com/ollama/ollama/issues/625
| 1,916,332,688
|
I_kwDOJ0Z1Ps5yOOqQ
| 625
|
`ollama cp` followed by `ollama push` requires re-pushing layers
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2023-09-27T21:12:33
| 2024-01-16T22:13:55
| 2024-01-16T22:13:55
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
To reproduce:
```
ollama cp llama2 <username>/llama2
ollama push <username>/llama2
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/625/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/625/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/5075
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5075/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5075/comments
|
https://api.github.com/repos/ollama/ollama/issues/5075/events
|
https://github.com/ollama/ollama/pull/5075
| 2,355,375,310
|
PR_kwDOJ0Z1Ps5ylt6D
| 5,075
|
docs: add missing powershell package to windows development instructions
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-16T01:52:50
| 2024-06-16T03:08:10
| 2024-06-16T03:08:09
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5075",
"html_url": "https://github.com/ollama/ollama/pull/5075",
"diff_url": "https://github.com/ollama/ollama/pull/5075.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5075.patch",
"merged_at": "2024-06-16T03:08:09"
}
|
The PowerShell script for building Ollama on Windows requires the `ThreadJob` module. Add this to the instructions and dependency list.
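A minimal sketch of the added setup step (`Install-Module` is a standard PowerShell cmdlet; the module name comes from the PR description):

```shell
# Install the ThreadJob module required by the Windows build script,
# scoped to the current user so no elevated session is needed
Install-Module -Name ThreadJob -Scope CurrentUser
```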
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5075/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5075/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2217
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2217/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2217/comments
|
https://api.github.com/repos/ollama/ollama/issues/2217/events
|
https://github.com/ollama/ollama/issues/2217
| 2,102,921,772
|
I_kwDOJ0Z1Ps59WAos
| 2,217
|
Message vs Template vs System
|
{
"login": "giannisak",
"id": 154079765,
"node_id": "U_kgDOCS8SFQ",
"avatar_url": "https://avatars.githubusercontent.com/u/154079765?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/giannisak",
"html_url": "https://github.com/giannisak",
"followers_url": "https://api.github.com/users/giannisak/followers",
"following_url": "https://api.github.com/users/giannisak/following{/other_user}",
"gists_url": "https://api.github.com/users/giannisak/gists{/gist_id}",
"starred_url": "https://api.github.com/users/giannisak/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/giannisak/subscriptions",
"organizations_url": "https://api.github.com/users/giannisak/orgs",
"repos_url": "https://api.github.com/users/giannisak/repos",
"events_url": "https://api.github.com/users/giannisak/events{/privacy}",
"received_events_url": "https://api.github.com/users/giannisak/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-01-26T21:22:30
| 2024-01-27T00:57:37
| 2024-01-27T00:57:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
What is the difference between message, template, and system if I want to do few-shot prompting?
I mean, I could pass the example from the release (v0.1.21) to a model in three different ways:
1) Few-shot using Message:
SYSTEM You are a friendly assistant that only answers with 'yes' or 'no'
MESSAGE user Is Toronto in Canada?
MESSAGE assistant yes
(etc..)
2) Few-shot using Template:
TEMPLATE """
<|im_start|>system
{{ .System }}
<|im_end|>
<|im_start|>user
Is Toronto in Canada?
<|im_end|>
<|im_start|>assistant
yes
<|im_end|>
(etc..)
"""
SYSTEM You are a friendly assistant that only answers with 'yes' or 'no'
3) Few-shot using only System:
SYSTEM """
You are a friendly assistant that only answers with 'yes' or 'no'.
You will be given questions about whether a city is located in a specific country.
Example 1:
Is Toronto in Canada?
yes
Example 2:
(etc..)
"""
I am running some tests using LlamaIndex on a similar topic with 7B models, and I am getting better results with the
System format than with the Template format (I was expecting the opposite).
I will test the Message format too, but I am trying to understand the differences and the expected behavior of each.
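For context, the MESSAGE form above maps directly onto the chat API: few-shot turns can be pre-seeded in the `messages` list of a request. A minimal sketch, assuming the `/api/chat` request shape; the model name and example turns are placeholders:

```python
# Hedged sketch (not from the issue): MESSAGE directives in a Modelfile
# correspond to pre-seeding the messages list of a chat request, so
# few-shot examples can also be supplied per request. "mistral" and the
# example turns below are placeholders.
FEW_SHOT = [
    {"role": "system",
     "content": "You are a friendly assistant that only answers with 'yes' or 'no'"},
    {"role": "user", "content": "Is Toronto in Canada?"},
    {"role": "assistant", "content": "yes"},
]

def build_chat_request(question, examples=FEW_SHOT, model="mistral"):
    """Build the payload a client such as ollama.chat would send."""
    return {"model": model,
            "messages": examples + [{"role": "user", "content": question}]}

payload = build_chat_request("Is Paris in Italy?")
```

Sending `payload["messages"]` to the chat endpoint would reproduce option 1 without baking the examples into the model.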
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2217/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2217/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2956
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2956/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2956/comments
|
https://api.github.com/repos/ollama/ollama/issues/2956/events
|
https://github.com/ollama/ollama/issues/2956
| 2,172,035,695
|
I_kwDOJ0Z1Ps6BdqJv
| 2,956
|
feat: Add an "official" indicator in the library
|
{
"login": "jimscard",
"id": 26580570,
"node_id": "MDQ6VXNlcjI2NTgwNTcw",
"avatar_url": "https://avatars.githubusercontent.com/u/26580570?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jimscard",
"html_url": "https://github.com/jimscard",
"followers_url": "https://api.github.com/users/jimscard/followers",
"following_url": "https://api.github.com/users/jimscard/following{/other_user}",
"gists_url": "https://api.github.com/users/jimscard/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jimscard/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jimscard/subscriptions",
"organizations_url": "https://api.github.com/users/jimscard/orgs",
"repos_url": "https://api.github.com/users/jimscard/repos",
"events_url": "https://api.github.com/users/jimscard/events{/privacy}",
"received_events_url": "https://api.github.com/users/jimscard/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 6573197867,
"node_id": "LA_kwDOJ0Z1Ps8AAAABh8sKKw",
"url": "https://api.github.com/repos/ollama/ollama/labels/ollama.com",
"name": "ollama.com",
"color": "ffffff",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-03-06T17:12:35
| 2024-03-12T20:53:36
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
In the model library on ollama.com, there is no indication of whether a model is "official", i.e., provided by the Ollama team, or uploaded by a user, other than the username prefix.
This can cause confusion for users searching for a model by name.
Recommend implementing something similar to Docker Hub's badges to distinguish models provided by the Ollama team, models provided by the model developer, and models supported/blessed by the Ollama team.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2956/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2956/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1923
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1923/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1923/comments
|
https://api.github.com/repos/ollama/ollama/issues/1923/events
|
https://github.com/ollama/ollama/issues/1923
| 2,076,206,237
|
I_kwDOJ0Z1Ps57wGSd
| 1,923
|
choosing the right model to interact
|
{
"login": "umtksa",
"id": 12473742,
"node_id": "MDQ6VXNlcjEyNDczNzQy",
"avatar_url": "https://avatars.githubusercontent.com/u/12473742?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/umtksa",
"html_url": "https://github.com/umtksa",
"followers_url": "https://api.github.com/users/umtksa/followers",
"following_url": "https://api.github.com/users/umtksa/following{/other_user}",
"gists_url": "https://api.github.com/users/umtksa/gists{/gist_id}",
"starred_url": "https://api.github.com/users/umtksa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/umtksa/subscriptions",
"organizations_url": "https://api.github.com/users/umtksa/orgs",
"repos_url": "https://api.github.com/users/umtksa/repos",
"events_url": "https://api.github.com/users/umtksa/events{/privacy}",
"received_events_url": "https://api.github.com/users/umtksa/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-01-11T10:04:42
| 2024-03-11T19:32:12
| 2024-03-11T19:32:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I can use a custom Mistral modelfile to choose which model is the best fit based on the subject.
In my "choose" modelfile, all models have descriptions, like a copywriter model or a weather model.
Based on the subject, Mistral can choose the best model and give me the command to run,
so I can run the request through the model I want.
This is working as expected, but I'm a noob and I'm not sure this is the best way to do it.
Basically I run
`ollama run choose "weather is 16 degrees outside"`
and it gives me
`ollama run weather "weather is 16 degrees outside"`
So is using Mistral overkill for this purpose? I tried the same modelfile with other models, like TinyLlama, without success;
Mistral is the best one I have come up with for this task.
Any suggestions? (I tried regex without success before this.)
This issue could be labeled as a question, or it could be a feature request.
[cross posting to reddit](https://www.reddit.com/r/ollama/comments/193xvzm/choosing_the_right_model_to_interact/)
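The routing step above can be kept thin: ask a small "router" model for one keyword and map it to a command. A sketch under those assumptions; all model names are placeholders, and the actual `ollama run` call is left out:

```python
# Hedged sketch: route a request to a specialist model by asking a small
# "router" model for a single keyword. All model names are placeholders,
# and invoking the router itself (e.g. via `ollama run`) is omitted.
SPECIALISTS = {"weather": "weather", "copywriting": "copywriter"}

def router_prompt(user_input):
    """Prompt that constrains the router model to one keyword."""
    options = ", ".join(SPECIALISTS)
    return (f"Answer with exactly one word from [{options}] that best "
            f"matches this request:\n{user_input}")

def command_for(router_reply, user_input, default="mistral"):
    """Map the router's (possibly messy) reply to a runnable command."""
    model = SPECIALISTS.get(router_reply.strip().lower(), default)
    return f'ollama run {model} "{user_input}"'

cmd = command_for("weather", "weather is 16 degrees outside")
```

Falling back to a default model when the router's answer is unrecognized avoids the failure mode where a weaker model emits an unusable reply.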
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1923/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/1923/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4118
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4118/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4118/comments
|
https://api.github.com/repos/ollama/ollama/issues/4118/events
|
https://github.com/ollama/ollama/pull/4118
| 2,276,937,764
|
PR_kwDOJ0Z1Ps5ubzx7
| 4,118
|
Add ChatGPTBox and RWKV-Runner to community integrations
|
{
"login": "josStorer",
"id": 13366013,
"node_id": "MDQ6VXNlcjEzMzY2MDEz",
"avatar_url": "https://avatars.githubusercontent.com/u/13366013?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/josStorer",
"html_url": "https://github.com/josStorer",
"followers_url": "https://api.github.com/users/josStorer/followers",
"following_url": "https://api.github.com/users/josStorer/following{/other_user}",
"gists_url": "https://api.github.com/users/josStorer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/josStorer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/josStorer/subscriptions",
"organizations_url": "https://api.github.com/users/josStorer/orgs",
"repos_url": "https://api.github.com/users/josStorer/repos",
"events_url": "https://api.github.com/users/josStorer/events{/privacy}",
"received_events_url": "https://api.github.com/users/josStorer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-05-03T05:20:34
| 2024-11-23T21:31:27
| 2024-11-23T21:31:27
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4118",
"html_url": "https://github.com/ollama/ollama/pull/4118",
"diff_url": "https://github.com/ollama/ollama/pull/4118.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4118.patch",
"merged_at": "2024-11-23T21:31:27"
}
|
Integration tutorials:
ChatGPTBox: https://github.com/josStorer/chatGPTBox/issues/616#issuecomment-1975186467
RWKV-Runner:

|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4118/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4118/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5877
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5877/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5877/comments
|
https://api.github.com/repos/ollama/ollama/issues/5877/events
|
https://github.com/ollama/ollama/issues/5877
| 2,425,363,032
|
I_kwDOJ0Z1Ps6QkBpY
| 5,877
|
Ollama API not seeing messages provided in conversation_history
|
{
"login": "barclaybrown",
"id": 36378453,
"node_id": "MDQ6VXNlcjM2Mzc4NDUz",
"avatar_url": "https://avatars.githubusercontent.com/u/36378453?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/barclaybrown",
"html_url": "https://github.com/barclaybrown",
"followers_url": "https://api.github.com/users/barclaybrown/followers",
"following_url": "https://api.github.com/users/barclaybrown/following{/other_user}",
"gists_url": "https://api.github.com/users/barclaybrown/gists{/gist_id}",
"starred_url": "https://api.github.com/users/barclaybrown/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/barclaybrown/subscriptions",
"organizations_url": "https://api.github.com/users/barclaybrown/orgs",
"repos_url": "https://api.github.com/users/barclaybrown/repos",
"events_url": "https://api.github.com/users/barclaybrown/events{/privacy}",
"received_events_url": "https://api.github.com/users/barclaybrown/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-07-23T14:28:07
| 2024-09-12T22:02:25
| 2024-09-12T22:02:25
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I pass a list of dictionaries (messages) to ollama.chat, it seems that the model does not see anything other than the latest message. For example, I want the model to get a bunch of text and then answer a question about it. I send something like:
role : system content: You are a helpful assistant
role: user content: a bunch of reference text
role: user content: a question related to the reference text
then I get back
role: assistant content: an answer unrelated to the reference text, as if it doesn't see it
Is this a bug, or maybe I'm doing something wrong?
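For reference, the chat endpoint is stateless, so every call must resend the full history. A minimal sketch of the expected message structure; the reference text and question are placeholders, and the actual `ollama.chat` call appears only in comments since it needs a running server:

```python
# Hedged sketch: /api/chat is stateless, so the whole conversation history
# must be resent on every call. Only the message structure is built here;
# the reference text and question below are placeholders.
reference_text = "The Eiffel Tower is 330 metres tall."

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Reference text:\n" + reference_text},
    {"role": "user", "content": "How tall is the structure described above?"},
]

# With the Python client this would then be:
#   import ollama
#   reply = ollama.chat(model="llama3", messages=messages)
#   messages.append(reply["message"])   # keep history for the next turn
```

If earlier messages are silently ignored, a common cause is the history exceeding the model's context window, which truncates the oldest turns.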
### OS
Windows
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.2.7
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5877/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5877/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/936
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/936/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/936/comments
|
https://api.github.com/repos/ollama/ollama/issues/936/events
|
https://github.com/ollama/ollama/pull/936
| 1,966,094,839
|
PR_kwDOJ0Z1Ps5eAOhe
| 936
|
I've added the sample with Gradio and the scan of a folder
|
{
"login": "suoko",
"id": 3659980,
"node_id": "MDQ6VXNlcjM2NTk5ODA=",
"avatar_url": "https://avatars.githubusercontent.com/u/3659980?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/suoko",
"html_url": "https://github.com/suoko",
"followers_url": "https://api.github.com/users/suoko/followers",
"following_url": "https://api.github.com/users/suoko/following{/other_user}",
"gists_url": "https://api.github.com/users/suoko/gists{/gist_id}",
"starred_url": "https://api.github.com/users/suoko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/suoko/subscriptions",
"organizations_url": "https://api.github.com/users/suoko/orgs",
"repos_url": "https://api.github.com/users/suoko/repos",
"events_url": "https://api.github.com/users/suoko/events{/privacy}",
"received_events_url": "https://api.github.com/users/suoko/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-10-27T19:43:10
| 2023-10-30T21:55:35
| 2023-10-30T21:55:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/936",
"html_url": "https://github.com/ollama/ollama/pull/936",
"diff_url": "https://github.com/ollama/ollama/pull/936.diff",
"patch_url": "https://github.com/ollama/ollama/pull/936.patch",
"merged_at": null
}
|
The only value to change is the chunk_size, which varies according to the documents to scan.
|
{
"login": "suoko",
"id": 3659980,
"node_id": "MDQ6VXNlcjM2NTk5ODA=",
"avatar_url": "https://avatars.githubusercontent.com/u/3659980?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/suoko",
"html_url": "https://github.com/suoko",
"followers_url": "https://api.github.com/users/suoko/followers",
"following_url": "https://api.github.com/users/suoko/following{/other_user}",
"gists_url": "https://api.github.com/users/suoko/gists{/gist_id}",
"starred_url": "https://api.github.com/users/suoko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/suoko/subscriptions",
"organizations_url": "https://api.github.com/users/suoko/orgs",
"repos_url": "https://api.github.com/users/suoko/repos",
"events_url": "https://api.github.com/users/suoko/events{/privacy}",
"received_events_url": "https://api.github.com/users/suoko/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/936/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/936/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6524
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6524/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6524/comments
|
https://api.github.com/repos/ollama/ollama/issues/6524/events
|
https://github.com/ollama/ollama/pull/6524
| 2,488,115,928
|
PR_kwDOJ0Z1Ps55gdUY
| 6,524
|
server: clean up route names for consistency
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-27T02:16:33
| 2024-08-27T02:36:13
| 2024-08-27T02:36:12
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6524",
"html_url": "https://github.com/ollama/ollama/pull/6524",
"diff_url": "https://github.com/ollama/ollama/pull/6524.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6524.patch",
"merged_at": "2024-08-27T02:36:12"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6524/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6524/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5672
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5672/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5672/comments
|
https://api.github.com/repos/ollama/ollama/issues/5672/events
|
https://github.com/ollama/ollama/issues/5672
| 2,406,879,451
|
I_kwDOJ0Z1Ps6PdhDb
| 5,672
|
ollama._types.ResponseError
|
{
"login": "Lena-Van",
"id": 149133903,
"node_id": "U_kgDOCOOaTw",
"avatar_url": "https://avatars.githubusercontent.com/u/149133903?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Lena-Van",
"html_url": "https://github.com/Lena-Van",
"followers_url": "https://api.github.com/users/Lena-Van/followers",
"following_url": "https://api.github.com/users/Lena-Van/following{/other_user}",
"gists_url": "https://api.github.com/users/Lena-Van/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Lena-Van/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lena-Van/subscriptions",
"organizations_url": "https://api.github.com/users/Lena-Van/orgs",
"repos_url": "https://api.github.com/users/Lena-Van/repos",
"events_url": "https://api.github.com/users/Lena-Van/events{/privacy}",
"received_events_url": "https://api.github.com/users/Lena-Van/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-07-13T12:24:40
| 2024-07-14T13:27:29
| 2024-07-14T13:27:28
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I set the llama3_70b_ollama_model_configuration = {
"config_name": "ollama_llama3_70b",
"model_type": "ollama_chat",
"model_name": "example",
"options": {
"temperature": 0.5,
"seed": 123
},
"keep_alive": "5m"
}
the "example" model was downloaded from the Huggingface(https://huggingface.co/bartowski/Smaug-Llama-3-70B-Instruct-32K-GGUF), it's the llama3_70_instruct's 4-bit quantized version.
I've successfully run it in the past, got the responses. But today, I got the "ollama._types.ResponseError"
the latest traceback is like:
```
  File "/home/wenlu/anaconda3/envs/angent/lib/python3.10/site-packages/ollama/_client.py", line 180, in chat
    return self._request_stream(
  File "/home/wenlu/anaconda3/envs/angent/lib/python3.10/site-packages/ollama/_client.py", line 98, in _request_stream
    return self._stream(*args, **kwargs) if stream else self._request(*args, **kwargs).json()
  File "/home/wenlu/anaconda3/envs/angent/lib/python3.10/site-packages/ollama/_client.py", line 74, in _request
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError
```
I tried unsetting the proxy, but it didn't work for me.


### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.2.1
|
{
"login": "Lena-Van",
"id": 149133903,
"node_id": "U_kgDOCOOaTw",
"avatar_url": "https://avatars.githubusercontent.com/u/149133903?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Lena-Van",
"html_url": "https://github.com/Lena-Van",
"followers_url": "https://api.github.com/users/Lena-Van/followers",
"following_url": "https://api.github.com/users/Lena-Van/following{/other_user}",
"gists_url": "https://api.github.com/users/Lena-Van/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Lena-Van/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lena-Van/subscriptions",
"organizations_url": "https://api.github.com/users/Lena-Van/orgs",
"repos_url": "https://api.github.com/users/Lena-Van/repos",
"events_url": "https://api.github.com/users/Lena-Van/events{/privacy}",
"received_events_url": "https://api.github.com/users/Lena-Van/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5672/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5672/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7616
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7616/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7616/comments
|
https://api.github.com/repos/ollama/ollama/issues/7616/events
|
https://github.com/ollama/ollama/issues/7616
| 2,648,522,293
|
I_kwDOJ0Z1Ps6d3T41
| 7,616
|
Please add microsoft/OmniParser model
|
{
"login": "craftslab",
"id": 49358172,
"node_id": "MDQ6VXNlcjQ5MzU4MTcy",
"avatar_url": "https://avatars.githubusercontent.com/u/49358172?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/craftslab",
"html_url": "https://github.com/craftslab",
"followers_url": "https://api.github.com/users/craftslab/followers",
"following_url": "https://api.github.com/users/craftslab/following{/other_user}",
"gists_url": "https://api.github.com/users/craftslab/gists{/gist_id}",
"starred_url": "https://api.github.com/users/craftslab/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/craftslab/subscriptions",
"organizations_url": "https://api.github.com/users/craftslab/orgs",
"repos_url": "https://api.github.com/users/craftslab/repos",
"events_url": "https://api.github.com/users/craftslab/events{/privacy}",
"received_events_url": "https://api.github.com/users/craftslab/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 0
| 2024-11-11T08:10:34
| 2024-11-11T08:10:34
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
OmniParser is a general screen parsing tool that interprets/converts UI screenshots into a structured format to improve existing LLM-based UI agents. Training datasets include: 1) an interactable icon detection dataset, curated from popular web pages and automatically annotated to highlight clickable and actionable regions, and 2) an icon description dataset, designed to associate each UI element with its corresponding function.
https://huggingface.co/microsoft/OmniParser
Thanks!
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7616/reactions",
"total_count": 12,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 12,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7616/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7123
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7123/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7123/comments
|
https://api.github.com/repos/ollama/ollama/issues/7123/events
|
https://github.com/ollama/ollama/issues/7123
| 2,571,670,286
|
I_kwDOJ0Z1Ps6ZSJMO
| 7,123
|
Long responses can corrupt the model until unloaded
|
{
"login": "ragibson",
"id": 14023456,
"node_id": "MDQ6VXNlcjE0MDIzNDU2",
"avatar_url": "https://avatars.githubusercontent.com/u/14023456?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ragibson",
"html_url": "https://github.com/ragibson",
"followers_url": "https://api.github.com/users/ragibson/followers",
"following_url": "https://api.github.com/users/ragibson/following{/other_user}",
"gists_url": "https://api.github.com/users/ragibson/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ragibson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ragibson/subscriptions",
"organizations_url": "https://api.github.com/users/ragibson/orgs",
"repos_url": "https://api.github.com/users/ragibson/repos",
"events_url": "https://api.github.com/users/ragibson/events{/privacy}",
"received_events_url": "https://api.github.com/users/ragibson/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
open
| false
| null |
[] | null | 5
| 2024-10-07T22:51:24
| 2024-11-06T00:20:07
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
On a relatively simple prompt, one of the Phi models went off track and ranted for several thousand words. Afterward, all future responses produced (mostly) garbage output, even in separate API calls or interactive sessions with cleared session context. This persisted until the model was completely unloaded and reloaded.
It feels like something may have overflowed a buffer used for the context window or response and corrupted the model weights. Within the garbage output, the model appeared to have brief periods of "lucidity" where it demonstrated knowledge of prompts from completely separate sessions.
In the most recent case, I was using `phi3.5:3.8b-mini-instruct-q4_K_M` but have seen the same sort of behavior in other Phi releases. I'll try to find a prompt that can replicate this, though it's obviously stochastic given the nature of LLMs.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.12
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7123/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7123/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7838
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7838/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7838/comments
|
https://api.github.com/repos/ollama/ollama/issues/7838/events
|
https://github.com/ollama/ollama/issues/7838
| 2,693,402,758
|
I_kwDOJ0Z1Ps6gihCG
| 7,838
|
AMD ROCm 6.2.4 Ubuntu 24.04 `ggml-cuda.cu:132: CUDA error`
|
{
"login": "unclemusclez",
"id": 8789242,
"node_id": "MDQ6VXNlcjg3ODkyNDI=",
"avatar_url": "https://avatars.githubusercontent.com/u/8789242?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/unclemusclez",
"html_url": "https://github.com/unclemusclez",
"followers_url": "https://api.github.com/users/unclemusclez/followers",
"following_url": "https://api.github.com/users/unclemusclez/following{/other_user}",
"gists_url": "https://api.github.com/users/unclemusclez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/unclemusclez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/unclemusclez/subscriptions",
"organizations_url": "https://api.github.com/users/unclemusclez/orgs",
"repos_url": "https://api.github.com/users/unclemusclez/repos",
"events_url": "https://api.github.com/users/unclemusclez/events{/privacy}",
"received_events_url": "https://api.github.com/users/unclemusclez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-11-26T06:49:31
| 2024-12-03T05:05:10
| 2024-12-03T05:05:10
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Not sure why `ggml-cuda.cu` is being called on an AMD GPU:
compiled from origin main
```
ggml-cuda.cu:132: CUDA error
Could not attach to process. If your uid matches the uid of the target
process, check the setting of /proc/sys/kernel/yama/ptrace_scope, or try
again as the root user. For more details, see /etc/sysctl.d/10-ptrace.conf
ptrace: Operation not permitted.
No stack.
The program is not being run.
SIGABRT: abort
PC=0x7db2a829eb1c m=5 sigcode=18446744073709551610
signal arrived during cgo execution
goroutine 8 gp=0xc0000e01c0 m=5 mp=0xc000100008 [syscall]:
runtime.cgocall(0x5bd31a6880f0, 0xc000062b20)
runtime/cgocall.go:167 +0x4b fp=0xc000062af8 sp=0xc000062ac0 pc=0x5bd31a43d7ab
github.com/ollama/ollama/llama._Cfunc_llama_decode(0x7daffb4d09f0, {0x9, 0x7daffb4a59b0, 0x0, 0x0, 0x7daffb5686d0, 0x7daffb4e2da0, 0x7daffb473a40, 0x7daffb4e23b0, 0x0, ...})
_cgo_gotypes.go:551 +0x52 fp=0xc000062b20 sp=0xc000062af8 pc=0x5bd31a4e7252
github.com/ollama/ollama/llama.(*Context).Decode.func1(0x5bd31a683d4b?, 0x7daffb4d09f0?)
github.com/ollama/ollama/llama/llama.go:169 +0xd8 fp=0xc000062c40 sp=0xc000062b20 pc=0x5bd31a4e9818
github.com/ollama/ollama/llama.(*Context).Decode(0xc000388000?, 0x0?)
github.com/ollama/ollama/llama/llama.go:169 +0x17 fp=0xc000062c88 sp=0xc000062c40 pc=0x5bd31a4e9677
main.(*Server).processBatch(0xc0000ac1b0, 0xc00030e000, 0xc000062f10)
github.com/ollama/ollama/llama/runner/runner.go:427 +0x385 fp=0xc000062ed0 sp=0xc000062c88 pc=0x5bd31a682d05
main.(*Server).run(0xc0000ac1b0, {0x5bd31a9c4960, 0xc00008a0a0})
github.com/ollama/ollama/llama/runner/runner.go:327 +0x199 fp=0xc000062fb8 sp=0xc000062ed0 pc=0x5bd31a682619
main.main.gowrap2()
github.com/ollama/ollama/llama/runner/runner.go:922 +0x28 fp=0xc000062fe0 sp=0xc000062fb8 pc=0x5bd31a687048
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000062fe8 sp=0xc000062fe0 pc=0x5bd31a44b1a1
created by main.main in goroutine 1
github.com/ollama/ollama/llama/runner/runner.go:922 +0xc52
goroutine 1 gp=0xc0000061c0 m=nil [IO wait]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:424 +0xce fp=0xc000027858 sp=0xc000027838 pc=0x5bd31a44356e
runtime.netpollblock(0xc0000278a8?, 0x1a3dbf86?, 0xd3?)
runtime/netpoll.go:575 +0xf7 fp=0xc000027890 sp=0xc000027858 pc=0x5bd31a408337
internal/poll.runtime_pollWait(0x7db2a69cc008, 0x72)
runtime/netpoll.go:351 +0x85 fp=0xc0000278b0 sp=0xc000027890 pc=0x5bd31a442865
internal/poll.(*pollDesc).wait(0xc0000da100?, 0x2c?, 0x0)
internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc0000278d8 sp=0xc0000278b0 pc=0x5bd31a498187
internal/poll.(*pollDesc).waitRead(...)
internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0xc0000da100)
internal/poll/fd_unix.go:620 +0x295 fp=0xc000027980 sp=0xc0000278d8 pc=0x5bd31a4996f5
net.(*netFD).accept(0xc0000da100)
net/fd_unix.go:172 +0x29 fp=0xc000027a38 sp=0xc000027980 pc=0x5bd31a511be9
net.(*TCPListener).accept(0xc00002e740)
net/tcpsock_posix.go:159 +0x1e fp=0xc000027a88 sp=0xc000027a38 pc=0x5bd31a52223e
net.(*TCPListener).Accept(0xc00002e740)
net/tcpsock.go:372 +0x30 fp=0xc000027ab8 sp=0xc000027a88 pc=0x5bd31a521570
net/http.(*onceCloseListener).Accept(0xc0000ac240?)
<autogenerated>:1 +0x24 fp=0xc000027ad0 sp=0xc000027ab8 pc=0x5bd31a660144
net/http.(*Server).Serve(0xc0000d84b0, {0x5bd31a9c4378, 0xc00002e740})
net/http/server.go:3330 +0x30c fp=0xc000027c00 sp=0xc000027ad0 pc=0x5bd31a651e8c
main.main()
github.com/ollama/ollama/llama/runner/runner.go:942 +0xfc7 fp=0xc000027f50 sp=0xc000027c00 pc=0x5bd31a686dc7
runtime.main()
runtime/proc.go:272 +0x29d fp=0xc000027fe0 sp=0xc000027f50 pc=0x5bd31a40f91d
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000027fe8 sp=0xc000027fe0 pc=0x5bd31a44b1a1
goroutine 2 gp=0xc000006c40 m=nil [force gc (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:424 +0xce fp=0xc000050fa8 sp=0xc000050f88 pc=0x5bd31a44356e
runtime.goparkunlock(...)
runtime/proc.go:430
runtime.forcegchelper()
runtime/proc.go:337 +0xb8 fp=0xc000050fe0 sp=0xc000050fa8 pc=0x5bd31a40fc58
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000050fe8 sp=0xc000050fe0 pc=0x5bd31a44b1a1
created by runtime.init.7 in goroutine 1
runtime/proc.go:325 +0x1a
goroutine 3 gp=0xc000007180 m=nil [GC sweep wait]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:424 +0xce fp=0xc000051780 sp=0xc000051760 pc=0x5bd31a44356e
runtime.goparkunlock(...)
runtime/proc.go:430
runtime.bgsweep(0xc00007e000)
runtime/mgcsweep.go:277 +0x94 fp=0xc0000517c8 sp=0xc000051780 pc=0x5bd31a3fa5d4
runtime.gcenable.gowrap1()
runtime/mgc.go:203 +0x25 fp=0xc0000517e0 sp=0xc0000517c8 pc=0x5bd31a3eeea5
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc0000517e8 sp=0xc0000517e0 pc=0x5bd31a44b1a1
created by runtime.gcenable in goroutine 1
runtime/mgc.go:203 +0x66
goroutine 4 gp=0xc000007340 m=nil [GC scavenge wait]:
runtime.gopark(0xc00007e000?, 0x5bd31a8c0bf0?, 0x1?, 0x0?, 0xc000007340?)
runtime/proc.go:424 +0xce fp=0xc000051f78 sp=0xc000051f58 pc=0x5bd31a44356e
runtime.goparkunlock(...)
runtime/proc.go:430
runtime.(*scavengerState).park(0x5bd31abadb20)
runtime/mgcscavenge.go:425 +0x49 fp=0xc000051fa8 sp=0xc000051f78 pc=0x5bd31a3f8009
runtime.bgscavenge(0xc00007e000)
runtime/mgcscavenge.go:653 +0x3c fp=0xc000051fc8 sp=0xc000051fa8 pc=0x5bd31a3f857c
runtime.gcenable.gowrap2()
runtime/mgc.go:204 +0x25 fp=0xc000051fe0 sp=0xc000051fc8 pc=0x5bd31a3eee45
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc000051fe8 sp=0xc000051fe0 pc=0x5bd31a44b1a1
created by runtime.gcenable in goroutine 1
runtime/mgc.go:204 +0xa5
goroutine 5 gp=0xc000007c00 m=nil [finalizer wait]:
runtime.gopark(0xc000050648?, 0x5bd31a3e53a5?, 0xb0?, 0x1?, 0xc0000061c0?)
runtime/proc.go:424 +0xce fp=0xc000050620 sp=0xc000050600 pc=0x5bd31a44356e
runtime.runfinq()
runtime/mfinal.go:193 +0x107 fp=0xc0000507e0 sp=0xc000050620 pc=0x5bd31a3edf27
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc0000507e8 sp=0xc0000507e0 pc=0x5bd31a44b1a1
created by runtime.createfing in goroutine 1
runtime/mfinal.go:163 +0x3d
goroutine 6 gp=0xc000007dc0 m=nil [chan receive]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
runtime/proc.go:424 +0xce fp=0xc000052718 sp=0xc0000526f8 pc=0x5bd31a44356e
runtime.chanrecv(0xc00008c0e0, 0x0, 0x1)
runtime/chan.go:639 +0x41c fp=0xc000052790 sp=0xc000052718 pc=0x5bd31a3deb7c
runtime.chanrecv1(0x0?, 0x0?)
runtime/chan.go:489 +0x12 fp=0xc0000527b8 sp=0xc000052790 pc=0x5bd31a3de752
runtime.unique_runtime_registerUniqueMapCleanup.func1(...)
runtime/mgc.go:1732
runtime.unique_runtime_registerUniqueMapCleanup.gowrap1()
runtime/mgc.go:1735 +0x2f fp=0xc0000527e0 sp=0xc0000527b8 pc=0x5bd31a3f1cef
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc0000527e8 sp=0xc0000527e0 pc=0x5bd31a44b1a1
created by unique.runtime_registerUniqueMapCleanup in goroutine 1
runtime/mgc.go:1730 +0x96
goroutine 9 gp=0xc0000e0540 m=nil [select]:
runtime.gopark(0xc00026ba50?, 0x2?, 0x6e?, 0xf1?, 0xc00026b834?)
runtime/proc.go:424 +0xce fp=0xc00026b698 sp=0xc00026b678 pc=0x5bd31a44356e
runtime.selectgo(0xc00026ba50, 0xc00026b830, 0x9?, 0x0, 0x1?, 0x1)
runtime/select.go:335 +0x7a5 fp=0xc00026b7c0 sp=0xc00026b698 pc=0x5bd31a421825
main.(*Server).completion(0xc0000ac1b0, {0x5bd31a9c44f8, 0xc00024d6c0}, 0xc0002517c0)
github.com/ollama/ollama/llama/runner/runner.go:667 +0xa25 fp=0xc00026bac0 sp=0xc00026b7c0 pc=0x5bd31a684805
main.(*Server).completion-fm({0x5bd31a9c44f8?, 0xc00024d6c0?}, 0x5bd31a656187?)
<autogenerated>:1 +0x36 fp=0xc00026baf0 sp=0xc00026bac0 pc=0x5bd31a687836
net/http.HandlerFunc.ServeHTTP(0xc0000c40e0?, {0x5bd31a9c44f8?, 0xc00024d6c0?}, 0x0?)
net/http/server.go:2220 +0x29 fp=0xc00026bb18 sp=0xc00026baf0 pc=0x5bd31a64ed49
net/http.(*ServeMux).ServeHTTP(0x5bd31a3e53a5?, {0x5bd31a9c44f8, 0xc00024d6c0}, 0xc0002517c0)
net/http/server.go:2747 +0x1ca fp=0xc00026bb68 sp=0xc00026bb18 pc=0x5bd31a650bea
net/http.serverHandler.ServeHTTP({0x5bd31a9c3560?}, {0x5bd31a9c44f8?, 0xc00024d6c0?}, 0x6?)
net/http/server.go:3210 +0x8e fp=0xc00026bb98 sp=0xc00026bb68 pc=0x5bd31a657aee
net/http.(*conn).serve(0xc0000ac240, {0x5bd31a9c4928, 0xc0000990e0})
net/http/server.go:2092 +0x5d0 fp=0xc00026bfb8 sp=0xc00026bb98 pc=0x5bd31a64d970
net/http.(*Server).Serve.gowrap3()
net/http/server.go:3360 +0x28 fp=0xc00026bfe0 sp=0xc00026bfb8 pc=0x5bd31a652288
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc00026bfe8 sp=0xc00026bfe0 pc=0x5bd31a44b1a1
created by net/http.(*Server).Serve in goroutine 1
net/http/server.go:3360 +0x485
goroutine 547 gp=0xc000299dc0 m=nil [IO wait]:
runtime.gopark(0x5bd31a3e9885?, 0x0?, 0x0?, 0x0?, 0xb?)
runtime/proc.go:424 +0xce fp=0xc0002d6da8 sp=0xc0002d6d88 pc=0x5bd31a44356e
runtime.netpollblock(0x5bd31a47eb18?, 0x1a3dbf86?, 0xd3?)
runtime/netpoll.go:575 +0xf7 fp=0xc0002d6de0 sp=0xc0002d6da8 pc=0x5bd31a408337
internal/poll.runtime_pollWait(0x7db2a69cbf00, 0x72)
runtime/netpoll.go:351 +0x85 fp=0xc0002d6e00 sp=0xc0002d6de0 pc=0x5bd31a442865
internal/poll.(*pollDesc).wait(0xc0000da180?, 0xc000099211?, 0x0)
internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc0002d6e28 sp=0xc0002d6e00 pc=0x5bd31a498187
internal/poll.(*pollDesc).waitRead(...)
internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc0000da180, {0xc000099211, 0x1, 0x1})
internal/poll/fd_unix.go:165 +0x27a fp=0xc0002d6ec0 sp=0xc0002d6e28 pc=0x5bd31a498cda
net.(*netFD).Read(0xc0000da180, {0xc000099211?, 0xc0002d6f48?, 0x5bd31a444c10?})
net/fd_posix.go:55 +0x25 fp=0xc0002d6f08 sp=0xc0002d6ec0 pc=0x5bd31a510b05
net.(*conn).Read(0xc0000540e0, {0xc000099211?, 0x0?, 0x5bd31ac56a20?})
net/net.go:189 +0x45 fp=0xc0002d6f50 sp=0xc0002d6f08 pc=0x5bd31a51a505
net.(*TCPConn).Read(0x5bd31ab6ddb0?, {0xc000099211?, 0x0?, 0x0?})
<autogenerated>:1 +0x25 fp=0xc0002d6f80 sp=0xc0002d6f50 pc=0x5bd31a5275a5
net/http.(*connReader).backgroundRead(0xc000099200)
net/http/server.go:690 +0x37 fp=0xc0002d6fc8 sp=0xc0002d6f80 pc=0x5bd31a6482f7
net/http.(*connReader).startBackgroundRead.gowrap2()
net/http/server.go:686 +0x25 fp=0xc0002d6fe0 sp=0xc0002d6fc8 pc=0x5bd31a648225
runtime.goexit({})
runtime/asm_amd64.s:1700 +0x1 fp=0xc0002d6fe8 sp=0xc0002d6fe0 pc=0x5bd31a44b1a1
created by net/http.(*connReader).startBackgroundRead in goroutine 9
net/http/server.go:686 +0xb6
rax 0x0
rbx 0xdc5e
rcx 0x7db2a829eb1c
rdx 0x6
rdi 0xdc5a
rsi 0xdc5e
rbp 0x7db14bdf7690
rsp 0x7db14bdf7650
r8 0x0
r9 0x0
r10 0x8
r11 0x246
r12 0x6
r13 0x84
r14 0x16
r15 0x7db2a8a3f67f
rip 0x7db2a829eb1c
rflags 0x246
cs 0x33
fs 0x0
gs 0x0
```
### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.4.4 origin main
|
{
"login": "unclemusclez",
"id": 8789242,
"node_id": "MDQ6VXNlcjg3ODkyNDI=",
"avatar_url": "https://avatars.githubusercontent.com/u/8789242?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/unclemusclez",
"html_url": "https://github.com/unclemusclez",
"followers_url": "https://api.github.com/users/unclemusclez/followers",
"following_url": "https://api.github.com/users/unclemusclez/following{/other_user}",
"gists_url": "https://api.github.com/users/unclemusclez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/unclemusclez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/unclemusclez/subscriptions",
"organizations_url": "https://api.github.com/users/unclemusclez/orgs",
"repos_url": "https://api.github.com/users/unclemusclez/repos",
"events_url": "https://api.github.com/users/unclemusclez/events{/privacy}",
"received_events_url": "https://api.github.com/users/unclemusclez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7838/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7838/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8389
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8389/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8389/comments
|
https://api.github.com/repos/ollama/ollama/issues/8389/events
|
https://github.com/ollama/ollama/issues/8389
| 2,782,229,709
|
I_kwDOJ0Z1Ps6l1XTN
| 8,389
|
Ollama install script replaces the systemd profile
|
{
"login": "gerroon",
"id": 8519469,
"node_id": "MDQ6VXNlcjg1MTk0Njk=",
"avatar_url": "https://avatars.githubusercontent.com/u/8519469?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gerroon",
"html_url": "https://github.com/gerroon",
"followers_url": "https://api.github.com/users/gerroon/followers",
"following_url": "https://api.github.com/users/gerroon/following{/other_user}",
"gists_url": "https://api.github.com/users/gerroon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gerroon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gerroon/subscriptions",
"organizations_url": "https://api.github.com/users/gerroon/orgs",
"repos_url": "https://api.github.com/users/gerroon/repos",
"events_url": "https://api.github.com/users/gerroon/events{/privacy}",
"received_events_url": "https://api.github.com/users/gerroon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 8
| 2025-01-12T01:55:55
| 2025-01-28T21:11:50
| 2025-01-28T21:11:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The installation script doesn't pay attention to an existing systemd unit file, so every new install replaces it. This is not the standard behavior in Debian or other distros; the script should at least ask for permission before replacing it.
This is the recommended script:
`curl https://ollama.ai/install.sh | sh `
https://github.com/ollama/ollama/blob/main/docs/faq.md
It replaces
`/etc/systemd/system/ollama.service`
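As a workaround until the installer is changed, local customizations can live in a systemd drop-in override, which the installer does not touch since it only rewrites the main unit file. A sketch (created with `sudo systemctl edit ollama.service`; the environment variable shown is just an example value):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# Drop-in overrides survive a re-run of the install script.
[Service]
Environment="OLLAMA_DEBUG=1"
```

After editing, apply with `sudo systemctl daemon-reload && sudo systemctl restart ollama`.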
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.4
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8389/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8389/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3132
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3132/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3132/comments
|
https://api.github.com/repos/ollama/ollama/issues/3132/events
|
https://github.com/ollama/ollama/pull/3132
| 2,185,243,153
|
PR_kwDOJ0Z1Ps5pko5_
| 3,132
|
Fix Execution Error in /tmp with noexec for Issue #2436
|
{
"login": "jshbmllr",
"id": 27757825,
"node_id": "MDQ6VXNlcjI3NzU3ODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/27757825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jshbmllr",
"html_url": "https://github.com/jshbmllr",
"followers_url": "https://api.github.com/users/jshbmllr/followers",
"following_url": "https://api.github.com/users/jshbmllr/following{/other_user}",
"gists_url": "https://api.github.com/users/jshbmllr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jshbmllr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jshbmllr/subscriptions",
"organizations_url": "https://api.github.com/users/jshbmllr/orgs",
"repos_url": "https://api.github.com/users/jshbmllr/repos",
"events_url": "https://api.github.com/users/jshbmllr/events{/privacy}",
"received_events_url": "https://api.github.com/users/jshbmllr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-14T02:15:44
| 2024-03-14T10:48:15
| 2024-03-14T10:48:15
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3132",
"html_url": "https://github.com/ollama/ollama/pull/3132",
"diff_url": "https://github.com/ollama/ollama/pull/3132.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3132.patch",
"merged_at": null
}
|
In relation to [Issue #2436](https://github.com/ollama/ollama/issues/2436), which remains unresolved, this pull request introduces a fix similar to the one in [PR #2403](https://github.com/ollama/ollama/pull/2403). The issue arises on Linux systems where the /tmp directory is mounted with the noexec option, preventing the execution of libraries and mirroring the error detailed in the aforementioned issue. Mounting `/tmp` noexec is a [common hardening technique](https://www.stigviewer.com/stig/red_hat_enterprise_linux_8/2022-12-06/finding/V-230513) and it's likely that this issue will arise frequently.
For context, a similar issue was litigated by the Golang community [here](https://github.com/golang/go/issues/8451), culminating in [this solution](https://go-review.googlesource.com/c/go/+/75475).
I've written a function that first checks whether the system default temp directory is mounted with the `noexec` option. If it is, it instead writes the Ollama temp directory under `/run/user/<uid>`, a directory managed by `pam_systemd` that will be [cleaned when the user signs out](https://man7.org/linux/man-pages/man8/pam_systemd.8.html#:~:text=1.%20If%20it,are%0A%20%20%20%20%20%20%20%20%20%20%20removed%2C%20too.). I thought this would be the least intrusive way to address the issue: it shouldn't interfere if the user has an alternative `$TMPDIR`, and it doesn't require any additional knowledge on the part of the user.
Thank you all for an excellent application. I'm a big fan of Ollama and hope you find this contribution helpful, in line with the spirit with which it is offered.
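The detection step described above can be sketched as a small parser over `/proc/mounts`-style content. This is a hypothetical illustration under stated assumptions, not the code from this PR; the helper name `mountHasNoexec` is made up:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// mountHasNoexec reports whether the mount entry covering dir in the given
// /proc/mounts-style content lists the "noexec" option. It picks the longest
// mount point that is a prefix of dir, so a more specific mount (e.g. /tmp)
// overrides options inherited from / .
func mountHasNoexec(mountsContent, dir string) bool {
	bestLen := -1
	noexec := false
	sc := bufio.NewScanner(strings.NewReader(mountsContent))
	for sc.Scan() {
		// /proc/mounts fields: device mountpoint fstype options dump pass
		fields := strings.Fields(sc.Text())
		if len(fields) < 4 {
			continue
		}
		mountPoint, opts := fields[1], fields[3]
		if dir == mountPoint || strings.HasPrefix(dir, mountPoint+"/") || mountPoint == "/" {
			if len(mountPoint) > bestLen {
				bestLen = len(mountPoint)
				noexec = false
				for _, o := range strings.Split(opts, ",") {
					if o == "noexec" {
						noexec = true
					}
				}
			}
		}
	}
	return noexec
}

func main() {
	data, err := os.ReadFile("/proc/mounts")
	if err != nil {
		fmt.Println("could not read /proc/mounts (not on Linux?):", err)
		return
	}
	fmt.Println("temp dir noexec:", mountHasNoexec(string(data), os.TempDir()))
}
```

A real implementation would additionally fall back to an alternative directory (such as `/run/user/<uid>`) when the check returns true.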
|
{
"login": "jshbmllr",
"id": 27757825,
"node_id": "MDQ6VXNlcjI3NzU3ODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/27757825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jshbmllr",
"html_url": "https://github.com/jshbmllr",
"followers_url": "https://api.github.com/users/jshbmllr/followers",
"following_url": "https://api.github.com/users/jshbmllr/following{/other_user}",
"gists_url": "https://api.github.com/users/jshbmllr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jshbmllr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jshbmllr/subscriptions",
"organizations_url": "https://api.github.com/users/jshbmllr/orgs",
"repos_url": "https://api.github.com/users/jshbmllr/repos",
"events_url": "https://api.github.com/users/jshbmllr/events{/privacy}",
"received_events_url": "https://api.github.com/users/jshbmllr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3132/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3132/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4957
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4957/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4957/comments
|
https://api.github.com/repos/ollama/ollama/issues/4957/events
|
https://github.com/ollama/ollama/pull/4957
| 2,342,499,891
|
PR_kwDOJ0Z1Ps5x5w7Z
| 4,957
|
Update README.md
|
{
"login": "ZeyoYT",
"id": 61089602,
"node_id": "MDQ6VXNlcjYxMDg5NjAy",
"avatar_url": "https://avatars.githubusercontent.com/u/61089602?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ZeyoYT",
"html_url": "https://github.com/ZeyoYT",
"followers_url": "https://api.github.com/users/ZeyoYT/followers",
"following_url": "https://api.github.com/users/ZeyoYT/following{/other_user}",
"gists_url": "https://api.github.com/users/ZeyoYT/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ZeyoYT/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZeyoYT/subscriptions",
"organizations_url": "https://api.github.com/users/ZeyoYT/orgs",
"repos_url": "https://api.github.com/users/ZeyoYT/repos",
"events_url": "https://api.github.com/users/ZeyoYT/events{/privacy}",
"received_events_url": "https://api.github.com/users/ZeyoYT/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-06-09T21:29:54
| 2024-09-05T20:10:44
| 2024-09-05T20:10:44
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4957",
"html_url": "https://github.com/ollama/ollama/pull/4957",
"diff_url": "https://github.com/ollama/ollama/pull/4957.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4957.patch",
"merged_at": "2024-09-05T20:10:44"
}
|
Add AiLama to the list of community apps in Extensions & Plugins
This is a duplicate pull request of #4481, but resolves conflicts
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4957/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4957/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6116
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6116/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6116/comments
|
https://api.github.com/repos/ollama/ollama/issues/6116/events
|
https://github.com/ollama/ollama/issues/6116
| 2,441,916,883
|
I_kwDOJ0Z1Ps6RjLHT
| 6,116
|
mistral nemo
|
{
"login": "Domi31tls",
"id": 124446863,
"node_id": "U_kgDOB2rojw",
"avatar_url": "https://avatars.githubusercontent.com/u/124446863?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Domi31tls",
"html_url": "https://github.com/Domi31tls",
"followers_url": "https://api.github.com/users/Domi31tls/followers",
"following_url": "https://api.github.com/users/Domi31tls/following{/other_user}",
"gists_url": "https://api.github.com/users/Domi31tls/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Domi31tls/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Domi31tls/subscriptions",
"organizations_url": "https://api.github.com/users/Domi31tls/orgs",
"repos_url": "https://api.github.com/users/Domi31tls/repos",
"events_url": "https://api.github.com/users/Domi31tls/events{/privacy}",
"received_events_url": "https://api.github.com/users/Domi31tls/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-01T09:09:33
| 2024-08-01T20:32:55
| 2024-08-01T20:27:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I use open-webui. When I want to use mistral nemo, I get a 500 error:

### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.37
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6116/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6116/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4572
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4572/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4572/comments
|
https://api.github.com/repos/ollama/ollama/issues/4572/events
|
https://github.com/ollama/ollama/issues/4572
| 2,309,825,847
|
I_kwDOJ0Z1Ps6JrSU3
| 4,572
|
Error: llama runner process has terminated: exit status 0xc0000409
|
{
"login": "NeoFii",
"id": 155638855,
"node_id": "U_kgDOCUbcRw",
"avatar_url": "https://avatars.githubusercontent.com/u/155638855?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NeoFii",
"html_url": "https://github.com/NeoFii",
"followers_url": "https://api.github.com/users/NeoFii/followers",
"following_url": "https://api.github.com/users/NeoFii/following{/other_user}",
"gists_url": "https://api.github.com/users/NeoFii/gists{/gist_id}",
"starred_url": "https://api.github.com/users/NeoFii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NeoFii/subscriptions",
"organizations_url": "https://api.github.com/users/NeoFii/orgs",
"repos_url": "https://api.github.com/users/NeoFii/repos",
"events_url": "https://api.github.com/users/NeoFii/events{/privacy}",
"received_events_url": "https://api.github.com/users/NeoFii/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 35
| 2024-05-22T07:48:41
| 2024-12-03T01:35:05
| 2024-07-23T18:10:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I encountered issues while deploying my fine-tuned model using Ollama. I have successfully created my own model locally.

When I used the command `ollama run legalassistant`, an error occurred.
Error: llama runner process has terminated: exit status 0xc0000409
I don't know what's wrong; could you help me?
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.38
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4572/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4572/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1182
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1182/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1182/comments
|
https://api.github.com/repos/ollama/ollama/issues/1182/events
|
https://github.com/ollama/ollama/issues/1182
| 2,000,002,484
|
I_kwDOJ0Z1Ps53NZ20
| 1,182
|
Bug: --json mode going into a infinite loop?
|
{
"login": "hemanth",
"id": 18315,
"node_id": "MDQ6VXNlcjE4MzE1",
"avatar_url": "https://avatars.githubusercontent.com/u/18315?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hemanth",
"html_url": "https://github.com/hemanth",
"followers_url": "https://api.github.com/users/hemanth/followers",
"following_url": "https://api.github.com/users/hemanth/following{/other_user}",
"gists_url": "https://api.github.com/users/hemanth/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hemanth/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hemanth/subscriptions",
"organizations_url": "https://api.github.com/users/hemanth/orgs",
"repos_url": "https://api.github.com/users/hemanth/repos",
"events_url": "https://api.github.com/users/hemanth/events{/privacy}",
"received_events_url": "https://api.github.com/users/hemanth/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2023-11-17T22:06:49
| 2024-03-12T18:40:35
| 2024-03-12T18:40:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
```sh
/tmp took 37s
❯ ollama run llama2 --format json
>>> give me 10 emojis with their meanings
{
.... # never ends for that input
```
https://github.com/jmorganca/ollama/assets/18315/6771fd1f-d0e3-4f1f-9e7b-e15bacf8acad
^ recording
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1182/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1182/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5396
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5396/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5396/comments
|
https://api.github.com/repos/ollama/ollama/issues/5396/events
|
https://github.com/ollama/ollama/issues/5396
| 2,382,594,935
|
I_kwDOJ0Z1Ps6OA4N3
| 5,396
|
deepseek-v2:236b Startup Issues
|
{
"login": "SongXiaoMao",
"id": 55074934,
"node_id": "MDQ6VXNlcjU1MDc0OTM0",
"avatar_url": "https://avatars.githubusercontent.com/u/55074934?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SongXiaoMao",
"html_url": "https://github.com/SongXiaoMao",
"followers_url": "https://api.github.com/users/SongXiaoMao/followers",
"following_url": "https://api.github.com/users/SongXiaoMao/following{/other_user}",
"gists_url": "https://api.github.com/users/SongXiaoMao/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SongXiaoMao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SongXiaoMao/subscriptions",
"organizations_url": "https://api.github.com/users/SongXiaoMao/orgs",
"repos_url": "https://api.github.com/users/SongXiaoMao/repos",
"events_url": "https://api.github.com/users/SongXiaoMao/events{/privacy}",
"received_events_url": "https://api.github.com/users/SongXiaoMao/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-07-01T01:29:08
| 2024-08-01T22:36:49
| 2024-08-01T22:36:48
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

Computer configuration: 3090*4, 128 GB memory
Both models have problems starting.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
ollama version is 0.1.48
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5396/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5396/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4099
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4099/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4099/comments
|
https://api.github.com/repos/ollama/ollama/issues/4099/events
|
https://github.com/ollama/ollama/issues/4099
| 2,275,430,545
|
I_kwDOJ0Z1Ps6HoFCR
| 4,099
|
Please support gfx1103 in rocm docker image
|
{
"login": "LaurentBonnaud",
"id": 2168323,
"node_id": "MDQ6VXNlcjIxNjgzMjM=",
"avatar_url": "https://avatars.githubusercontent.com/u/2168323?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LaurentBonnaud",
"html_url": "https://github.com/LaurentBonnaud",
"followers_url": "https://api.github.com/users/LaurentBonnaud/followers",
"following_url": "https://api.github.com/users/LaurentBonnaud/following{/other_user}",
"gists_url": "https://api.github.com/users/LaurentBonnaud/gists{/gist_id}",
"starred_url": "https://api.github.com/users/LaurentBonnaud/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LaurentBonnaud/subscriptions",
"organizations_url": "https://api.github.com/users/LaurentBonnaud/orgs",
"repos_url": "https://api.github.com/users/LaurentBonnaud/repos",
"events_url": "https://api.github.com/users/LaurentBonnaud/events{/privacy}",
"received_events_url": "https://api.github.com/users/LaurentBonnaud/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-05-02T12:34:14
| 2024-05-21T17:48:13
| 2024-05-21T17:48:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
my laptop has this SoC:
```
$ lscpu
[...]
Model name: AMD Ryzen 7 PRO 7840U w/ Radeon 780M Graphics
```
```
$ rocminfo
[...]
*******
Agent 2
*******
Name: gfx1103
Uuid: GPU-XX
Marketing Name: AMD Radeon Graphics
```
Unfortunately, ollama does not support it yet:
```
$ docker run -ti --device /dev/kfd --device /dev/dri -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama:rocm
[...]
time=2024-05-02T12:17:01.881Z level=INFO source=routes.go:1143 msg="Listening on [::]:11434 (version 0.1.32)"
[...]
time=2024-05-02T12:17:03.655Z level=INFO source=amd_linux.go:88 msg="detected amdgpu versions [gfx1103]"
time=2024-05-02T12:17:03.657Z level=WARN source=amd_linux.go:116 msg="amdgpu [0] gfx1103 is not supported by /tmp/ollama2145067703/rocm [gfx1030 gfx1100 gfx1101 gfx1102 gfx900 gfx906 gfx908 gfx90a gfx940 gfx941 gfx942]"
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4099/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4099/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2166
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2166/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2166/comments
|
https://api.github.com/repos/ollama/ollama/issues/2166/events
|
https://github.com/ollama/ollama/issues/2166
| 2,097,591,985
|
I_kwDOJ0Z1Ps59Brax
| 2,166
|
ROCm container CUDA error
|
{
"login": "Eelviny",
"id": 4560915,
"node_id": "MDQ6VXNlcjQ1NjA5MTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4560915?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Eelviny",
"html_url": "https://github.com/Eelviny",
"followers_url": "https://api.github.com/users/Eelviny/followers",
"following_url": "https://api.github.com/users/Eelviny/following{/other_user}",
"gists_url": "https://api.github.com/users/Eelviny/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Eelviny/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Eelviny/subscriptions",
"organizations_url": "https://api.github.com/users/Eelviny/orgs",
"repos_url": "https://api.github.com/users/Eelviny/repos",
"events_url": "https://api.github.com/users/Eelviny/events{/privacy}",
"received_events_url": "https://api.github.com/users/Eelviny/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-01-24T07:17:20
| 2024-01-30T17:23:36
| 2024-01-25T12:51:06
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm attempting to use an AMD Radeon RX 7900 XT on ollama v0.1.21 in a container that I built from the Dockerfile. I use podman to build and run containers, and my OS is Bluefin (Fedora Silverblue spin). I'm unsure whether this is an issue because I'm missing something on my host OS, or an issue with the container.
Here's my run command: `podman run -d --privileged --device /dev/kfd:/dev/kfd -v ollama:/root/.ollama -p 11434:11434 -e OLLAMA_DEBUG=1 --name ollama localhost/ollama:v0.1.21`
Ollama starts up fine, but when I attempt to run model codellama:13b-instruct, ollama crashes. I'm running it with OLLAMA_DEBUG=1, here's the full run:
https://gist.github.com/Eelviny/1d43d6324f68977bd1c653e0b78eca03
What's interesting is that if I run `rocm-smi` inside the container, I get an error, so I suspect it might be more of a container issue than an ollama issue:
```
========================================= ROCm System Management Interface =========================================
=================================================== Concise Info ===================================================
Device [Model : Revision] Temp Power Partitions SCLK MCLK Fan Perf PwrCap VRAM% GPU%
Name (20 chars) (Edge) (Avg) (Mem, Compute)
====================================================================================================================
Traceback (most recent call last):
File "/usr/bin/rocm-smi", line 3926, in <module>
showAllConcise(deviceList)
File "/usr/bin/rocm-smi", line 1827, in showAllConcise
zip(range(len(max_widths)), values['card%s' % (str(device))])), None)
File "/usr/bin/rocm-smi", line 693, in printLog
print(logstr + '\n', end='')
UnicodeEncodeError: 'ascii' codec can't encode character '\xb0' in position 34: ordinal not in range(128)
```
I then tried to build the main branch at f63dc2d (#2162) but this exhibited completely different behaviour - no logging whatsoever, when trying to do `ollama run` I would just get the spinning loading symbol forever.
|
{
"login": "Eelviny",
"id": 4560915,
"node_id": "MDQ6VXNlcjQ1NjA5MTU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4560915?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Eelviny",
"html_url": "https://github.com/Eelviny",
"followers_url": "https://api.github.com/users/Eelviny/followers",
"following_url": "https://api.github.com/users/Eelviny/following{/other_user}",
"gists_url": "https://api.github.com/users/Eelviny/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Eelviny/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Eelviny/subscriptions",
"organizations_url": "https://api.github.com/users/Eelviny/orgs",
"repos_url": "https://api.github.com/users/Eelviny/repos",
"events_url": "https://api.github.com/users/Eelviny/events{/privacy}",
"received_events_url": "https://api.github.com/users/Eelviny/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2166/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2166/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2923
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2923/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2923/comments
|
https://api.github.com/repos/ollama/ollama/issues/2923/events
|
https://github.com/ollama/ollama/issues/2923
| 2,167,677,281
|
I_kwDOJ0Z1Ps6BNCFh
| 2,923
|
how does memory work in cmd `ollama run openchat`?
|
{
"login": "TimmekHW",
"id": 94626112,
"node_id": "U_kgDOBaPhQA",
"avatar_url": "https://avatars.githubusercontent.com/u/94626112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TimmekHW",
"html_url": "https://github.com/TimmekHW",
"followers_url": "https://api.github.com/users/TimmekHW/followers",
"following_url": "https://api.github.com/users/TimmekHW/following{/other_user}",
"gists_url": "https://api.github.com/users/TimmekHW/gists{/gist_id}",
"starred_url": "https://api.github.com/users/TimmekHW/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/TimmekHW/subscriptions",
"organizations_url": "https://api.github.com/users/TimmekHW/orgs",
"repos_url": "https://api.github.com/users/TimmekHW/repos",
"events_url": "https://api.github.com/users/TimmekHW/events{/privacy}",
"received_events_url": "https://api.github.com/users/TimmekHW/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-04T20:22:42
| 2024-03-06T22:32:11
| 2024-03-06T22:32:10
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
How does memory work in `ollama run openchat`?
Could you share the code? Remembering chat history and context works well there, but my own implementation of history is not working correctly.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2923/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2923/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/6350
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6350/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6350/comments
|
https://api.github.com/repos/ollama/ollama/issues/6350/events
|
https://github.com/ollama/ollama/issues/6350
| 2,464,842,738
|
I_kwDOJ0Z1Ps6S6oPy
| 6,350
|
Is this wrong in https://ollama.com/blog/gemma2
|
{
"login": "wonpn",
"id": 14801003,
"node_id": "MDQ6VXNlcjE0ODAxMDAz",
"avatar_url": "https://avatars.githubusercontent.com/u/14801003?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wonpn",
"html_url": "https://github.com/wonpn",
"followers_url": "https://api.github.com/users/wonpn/followers",
"following_url": "https://api.github.com/users/wonpn/following{/other_user}",
"gists_url": "https://api.github.com/users/wonpn/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wonpn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wonpn/subscriptions",
"organizations_url": "https://api.github.com/users/wonpn/orgs",
"repos_url": "https://api.github.com/users/wonpn/repos",
"events_url": "https://api.github.com/users/wonpn/events{/privacy}",
"received_events_url": "https://api.github.com/users/wonpn/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-08-14T03:54:35
| 2024-08-14T17:49:32
| 2024-08-14T17:49:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

8B to 9B?
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6350/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6350/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2062
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2062/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2062/comments
|
https://api.github.com/repos/ollama/ollama/issues/2062/events
|
https://github.com/ollama/ollama/issues/2062
| 2,089,399,218
|
I_kwDOJ0Z1Ps58ibOy
| 2,062
|
Add obsidian model
|
{
"login": "mak448a",
"id": 94062293,
"node_id": "U_kgDOBZtG1Q",
"avatar_url": "https://avatars.githubusercontent.com/u/94062293?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mak448a",
"html_url": "https://github.com/mak448a",
"followers_url": "https://api.github.com/users/mak448a/followers",
"following_url": "https://api.github.com/users/mak448a/following{/other_user}",
"gists_url": "https://api.github.com/users/mak448a/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mak448a/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mak448a/subscriptions",
"organizations_url": "https://api.github.com/users/mak448a/orgs",
"repos_url": "https://api.github.com/users/mak448a/repos",
"events_url": "https://api.github.com/users/mak448a/events{/privacy}",
"received_events_url": "https://api.github.com/users/mak448a/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 2
| 2024-01-19T01:50:25
| 2024-03-11T17:59:40
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Could you add the obsidian model to the library?
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2062/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2062/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6171
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6171/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6171/comments
|
https://api.github.com/repos/ollama/ollama/issues/6171/events
|
https://github.com/ollama/ollama/pull/6171
| 2,447,779,211
|
PR_kwDOJ0Z1Ps53Z6Jq
| 6,171
|
removeall to remove non-empty temp dirs
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-08-05T07:05:59
| 2024-08-09T22:47:15
| 2024-08-09T22:47:13
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6171",
"html_url": "https://github.com/ollama/ollama/pull/6171",
"diff_url": "https://github.com/ollama/ollama/pull/6171.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6171.patch",
"merged_at": "2024-08-09T22:47:13"
}
|
`os.Remove()` does not remove non-empty directories, so it errors with `directory not empty`. Instead, remove the expected contents (`ollama.pid` and `runners`) individually, then remove the parent directory. Removing the contents explicitly avoids accidentally deleting things Ollama does not own.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6171/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6171/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7524
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7524/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7524/comments
|
https://api.github.com/repos/ollama/ollama/issues/7524/events
|
https://github.com/ollama/ollama/issues/7524
| 2,637,347,423
|
I_kwDOJ0Z1Ps6dMrpf
| 7,524
|
Error: could not connect to ollama app, is it running?
|
{
"login": "BongozGoBOOM",
"id": 116317767,
"node_id": "U_kgDOBu7eRw",
"avatar_url": "https://avatars.githubusercontent.com/u/116317767?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BongozGoBOOM",
"html_url": "https://github.com/BongozGoBOOM",
"followers_url": "https://api.github.com/users/BongozGoBOOM/followers",
"following_url": "https://api.github.com/users/BongozGoBOOM/following{/other_user}",
"gists_url": "https://api.github.com/users/BongozGoBOOM/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BongozGoBOOM/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BongozGoBOOM/subscriptions",
"organizations_url": "https://api.github.com/users/BongozGoBOOM/orgs",
"repos_url": "https://api.github.com/users/BongozGoBOOM/repos",
"events_url": "https://api.github.com/users/BongozGoBOOM/events{/privacy}",
"received_events_url": "https://api.github.com/users/BongozGoBOOM/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 20
| 2024-11-06T08:13:04
| 2025-01-13T00:57:17
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Tried versions v0.4.0, v0.3.14, and v0.3.13, all yielded the same exact results.

Attempted to start the app through start menu, file explorer, and the ollama serve command (in separate windows), all yielded the same results.
I checked the app.log, and every single launch was showing the same thing:
"time=2024-11-06T03:02:50.988-05:00 level=WARN source=server.go:163 msg="server crash 1 - exit code 2 - respawning""
It kept going like that from server crash 1 up through about crash 30.
I avoid opening my own issues when I can, but I've spent all night trying to fix this, so I'd appreciate some help.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.4.0-rc8
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7524/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7524/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7038
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7038/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7038/comments
|
https://api.github.com/repos/ollama/ollama/issues/7038/events
|
https://github.com/ollama/ollama/issues/7038
| 2,555,361,878
|
I_kwDOJ0Z1Ps6YT7pW
| 7,038
|
Error: llama runner process has terminated: error loading model vocabulary: cannot find tokenizer merges in model file
|
{
"login": "sparklyi",
"id": 64263737,
"node_id": "MDQ6VXNlcjY0MjYzNzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/64263737?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sparklyi",
"html_url": "https://github.com/sparklyi",
"followers_url": "https://api.github.com/users/sparklyi/followers",
"following_url": "https://api.github.com/users/sparklyi/following{/other_user}",
"gists_url": "https://api.github.com/users/sparklyi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sparklyi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sparklyi/subscriptions",
"organizations_url": "https://api.github.com/users/sparklyi/orgs",
"repos_url": "https://api.github.com/users/sparklyi/repos",
"events_url": "https://api.github.com/users/sparklyi/events{/privacy}",
"received_events_url": "https://api.github.com/users/sparklyi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 10
| 2024-09-30T01:59:29
| 2024-10-06T14:03:20
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
time : 09/30/2024
script:
```
FROM "./model-quant.gguf"
TEMPLATE """{{- if .System }}
<|im_start|>system {{ .System }}<|im_end|>
{{- end }}
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
SYSTEM """"""
PARAMETER stop <|im_start|>
PARAMETER stop <|im_end|>
```
The model creation succeeded, but running it failed.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
ollama version is 0.3.12
|
{
"login": "sparklyi",
"id": 64263737,
"node_id": "MDQ6VXNlcjY0MjYzNzM3",
"avatar_url": "https://avatars.githubusercontent.com/u/64263737?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sparklyi",
"html_url": "https://github.com/sparklyi",
"followers_url": "https://api.github.com/users/sparklyi/followers",
"following_url": "https://api.github.com/users/sparklyi/following{/other_user}",
"gists_url": "https://api.github.com/users/sparklyi/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sparklyi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sparklyi/subscriptions",
"organizations_url": "https://api.github.com/users/sparklyi/orgs",
"repos_url": "https://api.github.com/users/sparklyi/repos",
"events_url": "https://api.github.com/users/sparklyi/events{/privacy}",
"received_events_url": "https://api.github.com/users/sparklyi/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7038/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7038/timeline
| null |
reopened
| false
|
https://api.github.com/repos/ollama/ollama/issues/3990
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3990/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3990/comments
|
https://api.github.com/repos/ollama/ollama/issues/3990/events
|
https://github.com/ollama/ollama/issues/3990
| 2,267,321,303
|
I_kwDOJ0Z1Ps6HJJPX
| 3,990
|
how to upgrade ollama automatically on MAC PRO?
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-04-28T03:43:02
| 2024-04-29T01:31:50
| 2024-04-29T01:31:50
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
how to upgrade ollama automatically on MAC PRO?
|
{
"login": "taozhiyuai",
"id": 146583103,
"node_id": "U_kgDOCLyuPw",
"avatar_url": "https://avatars.githubusercontent.com/u/146583103?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/taozhiyuai",
"html_url": "https://github.com/taozhiyuai",
"followers_url": "https://api.github.com/users/taozhiyuai/followers",
"following_url": "https://api.github.com/users/taozhiyuai/following{/other_user}",
"gists_url": "https://api.github.com/users/taozhiyuai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/taozhiyuai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/taozhiyuai/subscriptions",
"organizations_url": "https://api.github.com/users/taozhiyuai/orgs",
"repos_url": "https://api.github.com/users/taozhiyuai/repos",
"events_url": "https://api.github.com/users/taozhiyuai/events{/privacy}",
"received_events_url": "https://api.github.com/users/taozhiyuai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3990/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3990/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6395
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6395/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6395/comments
|
https://api.github.com/repos/ollama/ollama/issues/6395/events
|
https://github.com/ollama/ollama/pull/6395
| 2,470,948,574
|
PR_kwDOJ0Z1Ps54nSwu
| 6,395
|
Make new tokenizer logic conditional
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2024-08-16T20:30:36
| 2024-08-25T00:25:40
| 2024-08-25T00:25:38
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6395",
"html_url": "https://github.com/ollama/ollama/pull/6395",
"diff_url": "https://github.com/ollama/ollama/pull/6395.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6395.patch",
"merged_at": "2024-08-25T00:25:37"
}
|
Only use the new cgo tokenizer/detokenizer if we're using the new runners
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6395/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6395/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4882
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4882/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4882/comments
|
https://api.github.com/repos/ollama/ollama/issues/4882/events
|
https://github.com/ollama/ollama/issues/4882
| 2,339,231,080
|
I_kwDOJ0Z1Ps6LbdVo
| 4,882
|
mac app silently fails to install CLI link if /usr/local/bin/ missing
|
{
"login": "saimgulay",
"id": 120498676,
"node_id": "U_kgDOBy6p9A",
"avatar_url": "https://avatars.githubusercontent.com/u/120498676?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saimgulay",
"html_url": "https://github.com/saimgulay",
"followers_url": "https://api.github.com/users/saimgulay/followers",
"following_url": "https://api.github.com/users/saimgulay/following{/other_user}",
"gists_url": "https://api.github.com/users/saimgulay/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saimgulay/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saimgulay/subscriptions",
"organizations_url": "https://api.github.com/users/saimgulay/orgs",
"repos_url": "https://api.github.com/users/saimgulay/repos",
"events_url": "https://api.github.com/users/saimgulay/events{/privacy}",
"received_events_url": "https://api.github.com/users/saimgulay/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677279472,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjf8y8A",
"url": "https://api.github.com/repos/ollama/ollama/labels/macos",
"name": "macos",
"color": "E2DBC0",
"default": false,
"description": ""
},
{
"id": 6678628138,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjhPHKg",
"url": "https://api.github.com/repos/ollama/ollama/labels/install",
"name": "install",
"color": "E0B88D",
"default": false,
"description": ""
}
] |
open
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 5
| 2024-06-06T22:12:58
| 2025-01-09T00:40:44
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi community,
I have macOS Sonoma 14.4.1 and my Ollama version is 0.1.41. I moved the app to the Applications folder, ran it, clicked the Next button, then clicked the Install button to install the command line, but nothing happened. I also tried running as an admin but hit the same problem.
The temporary solution:
`sudo cp /Applications/Ollama.app/Contents/Resources/ollama /usr/local/bin/`
`sudo chmod +x /usr/local/bin/ollama`
`ollama --version`
### OS
macOS
### GPU
AMD
### CPU
Intel, AMD
### Ollama version
0.1.41
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4882/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4882/timeline
| null |
reopened
| false
|
https://api.github.com/repos/ollama/ollama/issues/8680
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8680/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8680/comments
|
https://api.github.com/repos/ollama/ollama/issues/8680/events
|
https://github.com/ollama/ollama/issues/8680
| 2,819,658,888
|
I_kwDOJ0Z1Ps6oEJSI
| 8,680
|
api/chat not working in parallel with docker-compose
|
{
"login": "acclayer7",
"id": 178514264,
"node_id": "U_kgDOCqPpWA",
"avatar_url": "https://avatars.githubusercontent.com/u/178514264?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/acclayer7",
"html_url": "https://github.com/acclayer7",
"followers_url": "https://api.github.com/users/acclayer7/followers",
"following_url": "https://api.github.com/users/acclayer7/following{/other_user}",
"gists_url": "https://api.github.com/users/acclayer7/gists{/gist_id}",
"starred_url": "https://api.github.com/users/acclayer7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/acclayer7/subscriptions",
"organizations_url": "https://api.github.com/users/acclayer7/orgs",
"repos_url": "https://api.github.com/users/acclayer7/repos",
"events_url": "https://api.github.com/users/acclayer7/events{/privacy}",
"received_events_url": "https://api.github.com/users/acclayer7/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-30T00:54:32
| 2025-01-30T01:05:37
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hello, my Ollama host has enough GPU memory (16 GB VRAM). I set OLLAMA_NUM_PARALLEL=2 and OLLAMA_MAX_LOADED_MODELS=2, but I don't see any memory increase.
I run Ollama with docker-compose; when I use the API, VRAM usage stays the same and I still have 10 GB of VRAM free. If requests are served in parallel, shouldn't VRAM usage go up?
### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
_No response_
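For reference, a common pitfall with docker-compose is setting these variables in the host shell instead of the container's environment, where the Ollama server actually reads them. A minimal compose fragment (service name `ollama` is an assumption) would look like:

```yaml
services:
  ollama:
    image: ollama/ollama
    environment:
      # Must be set here, inside the container, not in the host shell.
      - OLLAMA_NUM_PARALLEL=2
      - OLLAMA_MAX_LOADED_MODELS=2
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

Note also that the extra KV-cache memory for parallel slots is reserved when a model loads, so the setting only takes effect after the model is (re)loaded with the variable in place.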
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8680/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8680/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/1652
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1652/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1652/comments
|
https://api.github.com/repos/ollama/ollama/issues/1652/events
|
https://github.com/ollama/ollama/issues/1652
| 2,051,887,042
|
I_kwDOJ0Z1Ps56TU_C
| 1,652
|
In time
|
{
"login": "Xdcnft",
"id": 111935635,
"node_id": "U_kgDOBqwAkw",
"avatar_url": "https://avatars.githubusercontent.com/u/111935635?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Xdcnft",
"html_url": "https://github.com/Xdcnft",
"followers_url": "https://api.github.com/users/Xdcnft/followers",
"following_url": "https://api.github.com/users/Xdcnft/following{/other_user}",
"gists_url": "https://api.github.com/users/Xdcnft/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Xdcnft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Xdcnft/subscriptions",
"organizations_url": "https://api.github.com/users/Xdcnft/orgs",
"repos_url": "https://api.github.com/users/Xdcnft/repos",
"events_url": "https://api.github.com/users/Xdcnft/events{/privacy}",
"received_events_url": "https://api.github.com/users/Xdcnft/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-12-21T07:40:48
| 2023-12-21T07:40:57
| 2023-12-21T07:40:57
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "Xdcnft",
"id": 111935635,
"node_id": "U_kgDOBqwAkw",
"avatar_url": "https://avatars.githubusercontent.com/u/111935635?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Xdcnft",
"html_url": "https://github.com/Xdcnft",
"followers_url": "https://api.github.com/users/Xdcnft/followers",
"following_url": "https://api.github.com/users/Xdcnft/following{/other_user}",
"gists_url": "https://api.github.com/users/Xdcnft/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Xdcnft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Xdcnft/subscriptions",
"organizations_url": "https://api.github.com/users/Xdcnft/orgs",
"repos_url": "https://api.github.com/users/Xdcnft/repos",
"events_url": "https://api.github.com/users/Xdcnft/events{/privacy}",
"received_events_url": "https://api.github.com/users/Xdcnft/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1652/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1652/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8591
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8591/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8591/comments
|
https://api.github.com/repos/ollama/ollama/issues/8591/events
|
https://github.com/ollama/ollama/issues/8591
| 2,811,472,497
|
I_kwDOJ0Z1Ps6nk6px
| 8,591
|
High idle power consumption due to persistent CUDA initialization
|
{
"login": "SvenMeyer",
"id": 25609,
"node_id": "MDQ6VXNlcjI1NjA5",
"avatar_url": "https://avatars.githubusercontent.com/u/25609?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SvenMeyer",
"html_url": "https://github.com/SvenMeyer",
"followers_url": "https://api.github.com/users/SvenMeyer/followers",
"following_url": "https://api.github.com/users/SvenMeyer/following{/other_user}",
"gists_url": "https://api.github.com/users/SvenMeyer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SvenMeyer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SvenMeyer/subscriptions",
"organizations_url": "https://api.github.com/users/SvenMeyer/orgs",
"repos_url": "https://api.github.com/users/SvenMeyer/repos",
"events_url": "https://api.github.com/users/SvenMeyer/events{/privacy}",
"received_events_url": "https://api.github.com/users/SvenMeyer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 4
| 2025-01-26T11:24:50
| 2025-01-27T12:31:37
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
# High idle power consumption due to PCIe bus unable to enter sleep state
## Issue Description
When running Ollama as a service with CUDA enabled, the system maintains unnecessarily high power consumption (~14W vs ~6W) even when idle. This is primarily caused by the PCIe bus being unable to enter sleep state (D3) due to persistent CUDA initialization, despite the GPU itself being in a low-power state.
### Technical Details
- System: DELL XPS15 9530
- CPU: Intel i7-13700H
- GPU: NVIDIA RTX 4070 Laptop
- Driver Version: 550.144.03
- CUDA Version: 12.4
- Ollama Version: 0.5.7
- OS: Manjaro Linux (DISTRIB_RELEASE="24.2.1")
- Kernel: Linux 6.12.4-1-MANJARO
### Current Behavior
Power consumption breakdown:
- Base system (no Ollama): ~6W
- With Ollama service running: ~14W
- Delta: ~8W additional power
The increased power consumption is primarily due to:
1. PCIe bus stuck in D0 state (cannot enter D3 sleep)
2. CUDA runtime keeping PCIe link active
The GPU itself is actually well-behaved:
- Enters P8 power state correctly
- Only draws ~4W at idle
- Properly releases display to Intel GPU
- Memory Usage: 11MiB
### Expected Behavior
- PCIe bus should enter D3 state when CUDA inference is not active
- System should maintain low power consumption (~6W) when not actively processing
- CUDA initialization should happen only when needed for inference
- No impact on inference performance when needed
### Diagnostics
With Ollama service running but idle:
```
PCIe State: D0 (should be D3)
GPU Power State: P8
GPU Power Draw: ~4W
Display: Using Intel GPU
CUDA: Initialized but idle
```
### Attempted Solutions
1. CUDA environment configuration:
```ini
Environment="CUDA_MODULE_LOADING=LAZY"
Environment="CUDA_CACHE_DISABLE=0"
Environment="NVIDIA_DRIVER_CAPABILITIES=compute,utility"
```
2. PCIe power management:
```bash
echo auto > /sys/bus/pci/devices/0000:01:00.0/power/control
```
3. Various NVIDIA power management settings through nvidia-smi
None of these solutions allowed the PCIe bus to enter sleep state while the Ollama service is running.
### Proposed Solution
Implement lazy CUDA initialization in Ollama:
1. Initialize CUDA only when a model is loaded/inference requested
2. Release CUDA context when idle for a configurable period
3. Allow PCIe bus to enter D3 state when CUDA context is released
4. Add configuration option for power management strategy
Benefits:
- Lower idle power consumption
- Better battery life on laptops
- Same performance for actual inference
- No impact on API responsiveness
- Proper power management for PCIe bus
### Current Workaround
Users need to manually start/stop the Ollama service when needed:
```bash
sudo systemctl start ollama # When inference needed
sudo systemctl stop ollama # To restore low power state
```
This is not ideal for IDE integrations and other automated workflows that expect Ollama to be always available.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8591/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8591/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3764
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3764/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3764/comments
|
https://api.github.com/repos/ollama/ollama/issues/3764/events
|
https://github.com/ollama/ollama/issues/3764
| 2,253,875,609
|
I_kwDOJ0Z1Ps6GV2mZ
| 3,764
|
Error: pull model manifest: 400
|
{
"login": "zedmango",
"id": 33294054,
"node_id": "MDQ6VXNlcjMzMjk0MDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/33294054?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zedmango",
"html_url": "https://github.com/zedmango",
"followers_url": "https://api.github.com/users/zedmango/followers",
"following_url": "https://api.github.com/users/zedmango/following{/other_user}",
"gists_url": "https://api.github.com/users/zedmango/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zedmango/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zedmango/subscriptions",
"organizations_url": "https://api.github.com/users/zedmango/orgs",
"repos_url": "https://api.github.com/users/zedmango/repos",
"events_url": "https://api.github.com/users/zedmango/events{/privacy}",
"received_events_url": "https://api.github.com/users/zedmango/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-04-19T20:06:49
| 2024-04-19T20:12:13
| 2024-04-19T20:12:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Got this strange error while trying to create a model.
```
$ ./createmodels.sh
transferring model data
creating model layer
creating template layer
creating parameters layer
creating config layer
using already created layer sha256:5499dfd64378623dc8ae420f29fc8e6a6e43f23198a290f5c4e9623c1d19c57a
using already created layer sha256:b89fc90e483fb047c3771d09fdc30b949815b781c25487501455faf74167a747
using already created layer sha256:084d40e289fdd6e852fd8897a18bea3de65fe96b9ed40f3a64d2e7ec115dd085
writing layer sha256:1db98adc375faf95d8b7e65ee0fb29dddf905e87fecf7229d9b60e4aca928eb8
writing manifest
success
transferring model data
pulling model
pulling manifest
Error: pull model manifest: 400
transferring model data
creating model layer ⠴
```
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.32
|
{
"login": "zedmango",
"id": 33294054,
"node_id": "MDQ6VXNlcjMzMjk0MDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/33294054?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zedmango",
"html_url": "https://github.com/zedmango",
"followers_url": "https://api.github.com/users/zedmango/followers",
"following_url": "https://api.github.com/users/zedmango/following{/other_user}",
"gists_url": "https://api.github.com/users/zedmango/gists{/gist_id}",
"starred_url": "https://api.github.com/users/zedmango/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zedmango/subscriptions",
"organizations_url": "https://api.github.com/users/zedmango/orgs",
"repos_url": "https://api.github.com/users/zedmango/repos",
"events_url": "https://api.github.com/users/zedmango/events{/privacy}",
"received_events_url": "https://api.github.com/users/zedmango/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3764/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3764/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/118
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/118/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/118/comments
|
https://api.github.com/repos/ollama/ollama/issues/118/events
|
https://github.com/ollama/ollama/issues/118
| 1,811,251,138
|
I_kwDOJ0Z1Ps5r9X_C
| 118
|
Crashed on M2 Air 8GB
|
{
"login": "chsasank",
"id": 9305875,
"node_id": "MDQ6VXNlcjkzMDU4NzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/9305875?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/chsasank",
"html_url": "https://github.com/chsasank",
"followers_url": "https://api.github.com/users/chsasank/followers",
"following_url": "https://api.github.com/users/chsasank/following{/other_user}",
"gists_url": "https://api.github.com/users/chsasank/gists{/gist_id}",
"starred_url": "https://api.github.com/users/chsasank/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chsasank/subscriptions",
"organizations_url": "https://api.github.com/users/chsasank/orgs",
"repos_url": "https://api.github.com/users/chsasank/repos",
"events_url": "https://api.github.com/users/chsasank/events{/privacy}",
"received_events_url": "https://api.github.com/users/chsasank/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2023-07-19T06:29:51
| 2023-08-23T17:41:55
| 2023-08-23T17:41:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
```[GIN] 2023/07/19 - 11:58:16 | 200 | 13m51s | 127.0.0.1 | POST "/api/pull"
llama.cpp: loading model from /Users/sasank/.ollama/models/blobs/sha256:8daa9615cce30c259a9555b1cc250d461d1bc69980a274b44d7eda0be78076d8
llama_model_load_internal: format = ggjt v3 (latest)
llama_model_load_internal: n_vocab = 32000
llama_model_load_internal: n_ctx = 2048
llama_model_load_internal: n_embd = 4096
llama_model_load_internal: n_mult = 256
llama_model_load_internal: n_head = 32
llama_model_load_internal: n_layer = 32
llama_model_load_internal: n_rot = 128
llama_model_load_internal: ftype = 2 (mostly Q4_0)
llama_model_load_internal: n_ff = 11008
llama_model_load_internal: model size = 7B
llama_model_load_internal: ggml ctx size = 0.08 MB
llama_model_load_internal: mem required = 5407.72 MB (+ 1026.00 MB per state)
llama_new_context_with_model: kv self size = 1024.00 MB
ggml_metal_init: allocating
ggml_metal_init: using MPS
ggml_metal_init: loading '/Users/sasank/code/llama/ollama/ggml-metal.metal'
ggml_metal_init: loaded kernel_add 0x12aa075a0
ggml_metal_init: loaded kernel_mul 0x12ab05ee0
ggml_metal_init: loaded kernel_mul_row 0x12ab06530
ggml_metal_init: loaded kernel_scale 0x12aa07de0
ggml_metal_init: loaded kernel_silu 0x12aa08300
ggml_metal_init: loaded kernel_relu 0x12ab06930
ggml_metal_init: loaded kernel_gelu 0x12ab06e50
ggml_metal_init: loaded kernel_soft_max 0x12ab076b0
ggml_metal_init: loaded kernel_diag_mask_inf 0x12ab07d30
ggml_metal_init: loaded kernel_get_rows_f16 0x12aa089e0
ggml_metal_init: loaded kernel_get_rows_q4_0 0x12aa091a0
ggml_metal_init: loaded kernel_get_rows_q4_1 0x12aa09b30
ggml_metal_init: loaded kernel_get_rows_q2_K 0x12ab082b0
ggml_metal_init: loaded kernel_get_rows_q3_K 0x12ab08a70
ggml_metal_init: loaded kernel_get_rows_q4_K 0x12aa0a0b0
ggml_metal_init: loaded kernel_get_rows_q5_K 0x12aa0a8b0
ggml_metal_init: loaded kernel_get_rows_q6_K 0x12aa0af50
ggml_metal_init: loaded kernel_rms_norm 0x12ab09140
ggml_metal_init: loaded kernel_norm 0x12ab09920
ggml_metal_init: loaded kernel_mul_mat_f16_f32 0x12aa0b9f0
ggml_metal_init: loaded kernel_mul_mat_q4_0_f32 0x12aa0be30
ggml_metal_init: loaded kernel_mul_mat_q4_1_f32 0x12aa0c530
ggml_metal_init: loaded kernel_mul_mat_q2_K_f32 0x12ab0a350
ggml_metal_init: loaded kernel_mul_mat_q3_K_f32 0x12ab0af40
ggml_metal_init: loaded kernel_mul_mat_q4_K_f32 0x12ab0b5c0
ggml_metal_init: loaded kernel_mul_mat_q5_K_f32 0x12aa0c930
ggml_metal_init: loaded kernel_mul_mat_q6_K_f32 0x12ab0bba0
ggml_metal_init: loaded kernel_rope 0x12ab0ca80
ggml_metal_init: loaded kernel_alibi_f32 0x12ab0d360
ggml_metal_init: loaded kernel_cpy_f32_f16 0x12ab0dc10
ggml_metal_init: loaded kernel_cpy_f32_f32 0x12ab0e4c0
ggml_metal_init: loaded kernel_cpy_f16_f16 0x12aa0d550
ggml_metal_init: recommendedMaxWorkingSetSize = 5461.34 MB
ggml_metal_init: hasUnifiedMemory = true
ggml_metal_init: maxTransferRate = built-in GPU
llama_new_context_with_model: max tensor size = 70.31 MB
ggml_metal_add_buffer: allocated 'data ' buffer, size = 3616.08 MB, ( 3616.47 / 5461.34)
ggml_metal_add_buffer: allocated 'eval ' buffer, size = 768.00 MB, ( 4384.47 / 5461.34)
ggml_metal_add_buffer: allocated 'kv ' buffer, size = 1026.00 MB, ( 5410.47 / 5461.34)
ggml_metal_add_buffer: allocated 'scr0 ' buffer, size = 512.00 MB, ( 5922.47 / 5461.34), warning: current allocated size is greater than the recommended max working set size
ggml_metal_add_buffer: allocated 'scr1 ' buffer, size = 512.00 MB, ( 6434.47 / 5461.34), warning: current allocated size is greater than the recommended max working set size
ggml_metal_graph_compute: command buffer 0 failed with status 5
GGML_ASSERT: ggml-metal.m:1013: false
SIGABRT: abort
PC=0x19296c724 m=5 sigcode=0
signal arrived during cgo execution
goroutine 6 [syscall]:
runtime.cgocall(0x100c920c0, 0x140000bd298)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/cgocall.go:157 +0x54 fp=0x140000bd260 sp=0x140000bd220 pc=0x100799994
github.com/jmorganca/ollama/llama._Cfunc_llama_eval(0x144008a00, 0x14000486c88, 0x1, 0x0, 0x8)
_cgo_gotypes.go:208 +0x38 fp=0x140000bd290 sp=0x140000bd260 pc=0x100c81e18
github.com/jmorganca/ollama/llama.New.func4(0x99?, {0x14000486c88, 0x1, 0x14000178540?}, {0xffffffffffffffff, 0x0, 0x800, 0x200, 0x1, 0x0, ...})
/Users/sasank/code/llama/ollama/llama/llama.go:141 +0x7c fp=0x140000bd2e0 sp=0x140000bd290 pc=0x100c82c2c
github.com/jmorganca/ollama/llama.New({0x140007fc310, 0x6a}, {0xffffffffffffffff, 0x0, 0x800, 0x200, 0x1, 0x0, 0x0, 0x1, ...})
/Users/sasank/code/llama/ollama/llama/llama.go:141 +0x278 fp=0x140000bd4a0 sp=0x140000bd2e0 pc=0x100c829e8
github.com/jmorganca/ollama/server.generate(0x140000b4300)
/Users/sasank/code/llama/ollama/server/routes.go:70 +0x700 fp=0x140000bd6e0 sp=0x140000bd4a0 pc=0x100c8d6b0
github.com/gin-gonic/gin.(*Context).Next(...)
/Users/sasank/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.CustomRecoveryWithWriter.func1(0x140000b4300)
/Users/sasank/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/recovery.go:102 +0x7c fp=0x140000bd730 sp=0x140000bd6e0 pc=0x100c7950c
github.com/gin-gonic/gin.(*Context).Next(...)
/Users/sasank/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.LoggerWithConfig.func1(0x140000b4300)
/Users/sasank/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/logger.go:240 +0xac fp=0x140000bd8e0 sp=0x140000bd730 pc=0x100c7878c
github.com/gin-gonic/gin.(*Context).Next(...)
/Users/sasank/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/context.go:174
github.com/gin-gonic/gin.(*Engine).handleHTTPRequest(0x14000145ba0, 0x140000b4300)
/Users/sasank/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/gin.go:620 +0x54c fp=0x140000bda70 sp=0x140000bd8e0 pc=0x100c7789c
github.com/gin-gonic/gin.(*Engine).ServeHTTP(0x14000145ba0, {0x100f019c0?, 0x140004ee1c0}, 0x140000b4200)
/Users/sasank/go/pkg/mod/github.com/gin-gonic/gin@v1.9.1/gin.go:576 +0x1d4 fp=0x140000bdab0 sp=0x140000bda70 pc=0x100c771a4
net/http.serverHandler.ServeHTTP({0x100effa38?}, {0x100f019c0, 0x140004ee1c0}, 0x140000b4200)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:2936 +0x2d8 fp=0x140000bdb60 sp=0x140000bdab0 pc=0x100a152a8
net/http.(*conn).serve(0x1400017a900, {0x100f02038, 0x1400046e060})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:1995 +0x560 fp=0x140000bdfa0 sp=0x140000bdb60 pc=0x100a10fa0
net/http.(*Server).Serve.func3()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:3089 +0x30 fp=0x140000bdfd0 sp=0x140000bdfa0 pc=0x100a15ad0
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000bdfd0 sp=0x140000bdfd0 pc=0x1007fc324
created by net/http.(*Server).Serve
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:3089 +0x520
goroutine 1 [IO wait, 14 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400011f860 sp=0x1400011f840 pc=0x1007ccaa4
runtime.netpollblock(0x1400031f8f8?, 0x87f1a4?, 0x1?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:527 +0x158 fp=0x1400011f8a0 sp=0x1400011f860 pc=0x1007c6138
internal/poll.runtime_pollWait(0x1289ada18, 0x72)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:306 +0xa0 fp=0x1400011f8d0 sp=0x1400011f8a0 pc=0x1007f61b0
internal/poll.(*pollDesc).wait(0x1400044a580?, 0x0?, 0x0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x1400011f900 sp=0x1400011f8d0 pc=0x10087a7e8
internal/poll.(*pollDesc).waitRead(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0x1400044a580)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_unix.go:614 +0x250 fp=0x1400011f9b0 sp=0x1400011f900 pc=0x10087f290
net.(*netFD).accept(0x1400044a580)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/fd_unix.go:172 +0x28 fp=0x1400011fa70 sp=0x1400011f9b0 pc=0x1008be278
net.(*TCPListener).accept(0x1400000edb0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/tcpsock_posix.go:148 +0x28 fp=0x1400011faa0 sp=0x1400011fa70 pc=0x1008d3878
net.(*TCPListener).Accept(0x1400000edb0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/tcpsock.go:297 +0x2c fp=0x1400011fae0 sp=0x1400011faa0 pc=0x1008d29ec
net/http.(*onceCloseListener).Accept(0x1400017a900?)
<autogenerated>:1 +0x30 fp=0x1400011fb00 sp=0x1400011fae0 pc=0x100a39250
net/http.(*Server).Serve(0x14000366ff0, {0x100f017b0, 0x1400000edb0})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:3059 +0x304 fp=0x1400011fc30 sp=0x1400011fb00 pc=0x100a15774
github.com/jmorganca/ollama/server.Serve({0x100f017b0, 0x1400000edb0})
/Users/sasank/code/llama/ollama/server/routes.go:238 +0x250 fp=0x1400011fca0 sp=0x1400011fc30 pc=0x100c8f4e0
github.com/jmorganca/ollama/cmd.RunServer(0x14000419200?, {0x100ce1dcb?, 0x0?, 0x0?})
/Users/sasank/code/llama/ollama/cmd/cmd.go:272 +0x114 fp=0x1400011fd20 sp=0x1400011fca0 pc=0x100c91454
github.com/spf13/cobra.(*Command).execute(0x14000419200, {0x101365c48, 0x0, 0x0})
/Users/sasank/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:940 +0x5c8 fp=0x1400011fe60 sp=0x1400011fd20 pc=0x100aaf628
github.com/spf13/cobra.(*Command).ExecuteC(0x14000418900)
/Users/sasank/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:1068 +0x35c fp=0x1400011ff20 sp=0x1400011fe60 pc=0x100aafd7c
github.com/spf13/cobra.(*Command).Execute(...)
/Users/sasank/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(0x14000054768?, {0x100f01fc8?, 0x140000280b0?})
/Users/sasank/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:985 +0x50 fp=0x1400011ff40 sp=0x1400011ff20 pc=0x100aaf910
main.main()
/Users/sasank/code/llama/ollama/main.go:10 +0x34 fp=0x1400011ff70 sp=0x1400011ff40 pc=0x100c91e94
runtime.main()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:250 +0x248 fp=0x1400011ffd0 sp=0x1400011ff70 pc=0x1007cc678
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400011ffd0 sp=0x1400011ffd0 pc=0x1007fc324
goroutine 2 [force gc (idle), 14 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000054fa0 sp=0x14000054f80 pc=0x1007ccaa4
runtime.goparkunlock(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:387
runtime.forcegchelper()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:305 +0xb8 fp=0x14000054fd0 sp=0x14000054fa0 pc=0x1007cc8e8
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000054fd0 sp=0x14000054fd0 pc=0x1007fc324
created by runtime.init.6
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:293 +0x24
goroutine 3 [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000055760 sp=0x14000055740 pc=0x1007ccaa4
runtime.goparkunlock(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:387
runtime.bgsweep(0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgcsweep.go:319 +0x110 fp=0x140000557b0 sp=0x14000055760 pc=0x1007b9960
runtime.gcenable.func1()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:178 +0x28 fp=0x140000557d0 sp=0x140000557b0 pc=0x1007ae408
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000557d0 sp=0x140000557d0 pc=0x1007fc324
created by runtime.gcenable
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:178 +0x74
goroutine 4 [GC scavenge wait]:
runtime.gopark(0x12b0f92?, 0x1291938?, 0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000055f50 sp=0x14000055f30 pc=0x1007ccaa4
runtime.goparkunlock(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:387
runtime.(*scavengerState).park(0x1012aa960)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgcscavenge.go:400 +0x5c fp=0x14000055f80 sp=0x14000055f50 pc=0x1007b776c
runtime.bgscavenge(0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgcscavenge.go:633 +0xac fp=0x14000055fb0 sp=0x14000055f80 pc=0x1007b7d4c
runtime.gcenable.func2()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:179 +0x28 fp=0x14000055fd0 sp=0x14000055fb0 pc=0x1007ae3a8
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000055fd0 sp=0x14000055fd0 pc=0x1007fc324
created by runtime.gcenable
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:179 +0xb8
goroutine 5 [finalizer wait, 12 minutes]:
runtime.gopark(0x0?, 0x1400048a138?, 0x20?, 0x1?, 0x1000000010?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000065d80 sp=0x14000065d60 pc=0x1007ccaa4
runtime.runfinq()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mfinal.go:193 +0x10c fp=0x14000065fd0 sp=0x14000065d80 pc=0x1007ad49c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000065fd0 sp=0x14000065fd0 pc=0x1007fc324
created by runtime.createfing
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mfinal.go:163 +0x84
goroutine 26 [select]:
runtime.gopark(0x1400051ff80?, 0x2?, 0xa0?, 0x61?, 0x1400051ff24?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400051fdb0 sp=0x1400051fd90 pc=0x1007ccaa4
runtime.selectgo(0x1400051ff80, 0x1400051ff20, 0x14000282680?, 0x0, 0x0?, 0x1)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/select.go:327 +0x690 fp=0x1400051fed0 sp=0x1400051fdb0 pc=0x1007dd1a0
net/http.(*persistConn).writeLoop(0x14000128d80)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:2410 +0x9c fp=0x1400051ffb0 sp=0x1400051fed0 pc=0x100a2a74c
net/http.(*Transport).dialConn.func6()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1766 +0x28 fp=0x1400051ffd0 sp=0x1400051ffb0 pc=0x100a27458
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400051ffd0 sp=0x1400051ffd0 pc=0x1007fc324
created by net/http.(*Transport).dialConn
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1766 +0x1214
goroutine 13 [GC worker (idle), 1 minutes]:
runtime.gopark(0x4f330c0464e0f?, 0x1?, 0x27?, 0xdf?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000056f40 sp=0x14000056f20 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x14000056fd0 sp=0x14000056f40 pc=0x1007b034c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000056fd0 sp=0x14000056fd0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28
goroutine 20 [GC worker (idle)]:
runtime.gopark(0x1013673a0?, 0x1?, 0x16?, 0xeb?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000057740 sp=0x14000057720 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x140000577d0 sp=0x14000057740 pc=0x1007b034c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000577d0 sp=0x140000577d0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28
goroutine 21 [GC worker (idle)]:
runtime.gopark(0x4f347a631f1b8?, 0x3?, 0xc3?, 0x8e?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000050740 sp=0x14000050720 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x140000507d0 sp=0x14000050740 pc=0x1007b034c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000507d0 sp=0x140000507d0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28
goroutine 14 [GC worker (idle)]:
runtime.gopark(0x4f347a634141b?, 0x3?, 0x77?, 0xc?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000057f40 sp=0x14000057f20 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x14000057fd0 sp=0x14000057f40 pc=0x1007b034c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000057fd0 sp=0x14000057fd0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28
goroutine 22 [GC worker (idle)]:
runtime.gopark(0x4f3473d29e65d?, 0x1?, 0x9f?, 0x19?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000050f40 sp=0x14000050f20 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x14000050fd0 sp=0x14000050f40 pc=0x1007b034c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000050fd0 sp=0x14000050fd0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28
goroutine 15 [GC worker (idle)]:
runtime.gopark(0x1013673a0?, 0x3?, 0x2?, 0x4c?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400047c740 sp=0x1400047c720 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x1400047c7d0 sp=0x1400047c740 pc=0x1007b034c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400047c7d0 sp=0x1400047c7d0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28
goroutine 23 [GC worker (idle)]:
runtime.gopark(0x4f3472b8156b1?, 0x3?, 0x93?, 0x2d?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000051740 sp=0x14000051720 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x140000517d0 sp=0x14000051740 pc=0x1007b034c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x140000517d0 sp=0x140000517d0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28
goroutine 16 [GC worker (idle)]:
runtime.gopark(0x4f3474e2b3524?, 0x3?, 0xe3?, 0x7b?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400047cf40 sp=0x1400047cf20 pc=0x1007ccaa4
runtime.gcBgMarkWorker()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1275 +0xec fp=0x1400047cfd0 sp=0x1400047cf40 pc=0x1007b034c
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400047cfd0 sp=0x1400047cfd0 pc=0x1007fc324
created by runtime.gcBgMarkStartWorkers
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/mgc.go:1199 +0x28
goroutine 56 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x10080e540?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000063580 sp=0x14000063560 pc=0x1007ccaa4
runtime.netpollblock(0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:527 +0x158 fp=0x140000635c0 sp=0x14000063580 pc=0x1007c6138
internal/poll.runtime_pollWait(0x1289ad838, 0x72)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:306 +0xa0 fp=0x140000635f0 sp=0x140000635c0 pc=0x1007f61b0
internal/poll.(*pollDesc).wait(0x1400064c000?, 0x140001c4800?, 0x0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x14000063620 sp=0x140000635f0 pc=0x10087a7e8
internal/poll.(*pollDesc).waitRead(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x1400064c000, {0x140001c4800, 0x1800, 0x1800})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_unix.go:167 +0x200 fp=0x140000636c0 sp=0x14000063620 pc=0x10087bb50
net.(*netFD).Read(0x1400064c000, {0x140001c4800?, 0x14000063878?, 0x100000ece?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/fd_posix.go:55 +0x28 fp=0x14000063710 sp=0x140000636c0 pc=0x1008bc5d8
net.(*conn).Read(0x140004ba028, {0x140001c4800?, 0x140000637c8?, 0x1007a2304?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/net.go:183 +0x34 fp=0x14000063760 sp=0x14000063710 pc=0x1008cabe4
net.(*TCPConn).Read(0x140000637d8?, {0x140001c4800?, 0x1400000e828?, 0x18?})
<autogenerated>:1 +0x2c fp=0x14000063790 sp=0x14000063760 pc=0x1008dd12c
crypto/tls.(*atLeastReader).Read(0x1400000e828, {0x140001c4800?, 0x1400000e828?, 0x0?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:788 +0x40 fp=0x140000637e0 sp=0x14000063790 pc=0x10096f760
bytes.(*Buffer).ReadFrom(0x140004aa290, {0x100efd580, 0x1400000e828})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/bytes/buffer.go:202 +0x90 fp=0x14000063840 sp=0x140000637e0 pc=0x100831860
crypto/tls.(*Conn).readFromUntil(0x140004aa000, {0x128a27fc8?, 0x140004ba028}, 0x1009c421c?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:810 +0xd4 fp=0x14000063880 sp=0x14000063840 pc=0x10096f954
crypto/tls.(*Conn).readRecordOrCCS(0x140004aa000, 0x0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:617 +0xd8 fp=0x14000063bf0 sp=0x14000063880 pc=0x10096d7a8
crypto/tls.(*Conn).readRecord(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:583
crypto/tls.(*Conn).Read(0x140004aa000, {0x140000a1000, 0x1000, 0x1009e1418?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:1316 +0x178 fp=0x14000063c60 sp=0x14000063bf0 pc=0x1009726f8
bufio.(*Reader).Read(0x140006bc900, {0x14000420580, 0x9, 0x10079bfbc?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/bufio/bufio.go:237 +0x1e0 fp=0x14000063ca0 sp=0x14000063c60 pc=0x10083e7b0
io.ReadAtLeast({0x100efd3e0, 0x140006bc900}, {0x14000420580, 0x9, 0x9}, 0x9)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/io/io.go:332 +0xa0 fp=0x14000063cf0 sp=0x14000063ca0 pc=0x100827fa0
io.ReadFull(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/io/io.go:351
net/http.http2readFrameHeader({0x14000420580?, 0x9?, 0x14000063d98?}, {0x100efd3e0?, 0x140006bc900?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:1567 +0x58 fp=0x14000063d40 sp=0x14000063cf0 pc=0x1009d8548
net/http.(*http2Framer).ReadFrame(0x14000420540)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:1831 +0x84 fp=0x14000063df0 sp=0x14000063d40 pc=0x1009d8d44
net/http.(*http2clientConnReadLoop).run(0x14000063f88)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:9187 +0xfc fp=0x14000063f40 sp=0x14000063df0 pc=0x1009fa06c
net/http.(*http2ClientConn).readLoop(0x14000175080)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:9082 +0x5c fp=0x14000063fb0 sp=0x14000063f40 pc=0x1009f952c
net/http.(*http2Transport).newClientConn.func1()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:7779 +0x28 fp=0x14000063fd0 sp=0x14000063fb0 pc=0x1009f26b8
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000063fd0 sp=0x14000063fd0 pc=0x1007fc324
created by net/http.(*http2Transport).newClientConn
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/h2_bundle.go:7779 +0xad0
goroutine 39 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x10080e540?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x1400047ad40 sp=0x1400047ad20 pc=0x1007ccaa4
runtime.netpollblock(0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:527 +0x158 fp=0x1400047ad80 sp=0x1400047ad40 pc=0x1007c6138
internal/poll.runtime_pollWait(0x1289ad928, 0x72)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:306 +0xa0 fp=0x1400047adb0 sp=0x1400047ad80 pc=0x1007f61b0
internal/poll.(*pollDesc).wait(0x1400044a600?, 0x1400046e161?, 0x0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x1400047ade0 sp=0x1400047adb0 pc=0x10087a7e8
internal/poll.(*pollDesc).waitRead(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x1400044a600, {0x1400046e161, 0x1, 0x1})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_unix.go:167 +0x200 fp=0x1400047ae80 sp=0x1400047ade0 pc=0x10087bb50
net.(*netFD).Read(0x1400044a600, {0x1400046e161?, 0x0?, 0x0?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/fd_posix.go:55 +0x28 fp=0x1400047aed0 sp=0x1400047ae80 pc=0x1008bc5d8
net.(*conn).Read(0x14000010d10, {0x1400046e161?, 0x0?, 0x0?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/net.go:183 +0x34 fp=0x1400047af20 sp=0x1400047aed0 pc=0x1008cabe4
net.(*TCPConn).Read(0x0?, {0x1400046e161?, 0x0?, 0x0?})
<autogenerated>:1 +0x2c fp=0x1400047af50 sp=0x1400047af20 pc=0x1008dd12c
net/http.(*connReader).backgroundRead(0x1400046e150)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:674 +0x44 fp=0x1400047afb0 sp=0x1400047af50 pc=0x100a0b454
net/http.(*connReader).startBackgroundRead.func2()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:670 +0x28 fp=0x1400047afd0 sp=0x1400047afb0 pc=0x100a0b378
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x1400047afd0 sp=0x1400047afd0 pc=0x1007fc324
created by net/http.(*connReader).startBackgroundRead
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/server.go:670 +0xcc
goroutine 25 [IO wait]:
runtime.gopark(0xffffffffffffffff?, 0xffffffffffffffff?, 0x23?, 0x0?, 0x10080e540?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/proc.go:381 +0xe4 fp=0x14000062580 sp=0x14000062560 pc=0x1007ccaa4
runtime.netpollblock(0x0?, 0x0?, 0x0?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:527 +0x158 fp=0x140000625c0 sp=0x14000062580 pc=0x1007c6138
internal/poll.runtime_pollWait(0x1289ad748, 0x72)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/netpoll.go:306 +0xa0 fp=0x140000625f0 sp=0x140000625c0 pc=0x1007f61b0
internal/poll.(*pollDesc).wait(0x14000480200?, 0x140002d0000?, 0x0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:84 +0x28 fp=0x14000062620 sp=0x140000625f0 pc=0x10087a7e8
internal/poll.(*pollDesc).waitRead(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0x14000480200, {0x140002d0000, 0xa000, 0xa000})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/internal/poll/fd_unix.go:167 +0x200 fp=0x140000626c0 sp=0x14000062620 pc=0x10087bb50
net.(*netFD).Read(0x14000480200, {0x140002d0000?, 0x14000062878?, 0x10096df7c?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/fd_posix.go:55 +0x28 fp=0x14000062710 sp=0x140000626c0 pc=0x1008bc5d8
net.(*conn).Read(0x140004ba000, {0x140002d0000?, 0x100ce6ad4?, 0x5?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/net.go:183 +0x34 fp=0x14000062760 sp=0x14000062710 pc=0x1008cabe4
net.(*TCPConn).Read(0x140000627d8?, {0x140002d0000?, 0x140006d00d8?, 0x18?})
<autogenerated>:1 +0x2c fp=0x14000062790 sp=0x14000062760 pc=0x1008dd12c
crypto/tls.(*atLeastReader).Read(0x140006d00d8, {0x140002d0000?, 0x140006d00d8?, 0x0?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:788 +0x40 fp=0x140000627e0 sp=0x14000062790 pc=0x10096f760
bytes.(*Buffer).ReadFrom(0x14000452290, {0x100efd580, 0x140006d00d8})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/bytes/buffer.go:202 +0x90 fp=0x14000062840 sp=0x140000627e0 pc=0x100831860
crypto/tls.(*Conn).readFromUntil(0x14000452000, {0x128a27fc8?, 0x140004ba000}, 0x7fffffffffffffff?)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:810 +0xd4 fp=0x14000062880 sp=0x14000062840 pc=0x10096f954
crypto/tls.(*Conn).readRecordOrCCS(0x14000452000, 0x0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:617 +0xd8 fp=0x14000062bf0 sp=0x14000062880 pc=0x10096d7a8
crypto/tls.(*Conn).readRecord(...)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:583
crypto/tls.(*Conn).Read(0x14000452000, {0x140004df000, 0x1000, 0x140003e8180?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/crypto/tls/conn.go:1316 +0x178 fp=0x14000062c60 sp=0x14000062bf0 pc=0x1009726f8
net/http.(*persistConn).Read(0x14000128d80, {0x140004df000?, 0x10079b930?, 0x1400049e780?})
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1943 +0x50 fp=0x14000062cc0 sp=0x14000062c60 pc=0x100a27e60
bufio.(*Reader).fill(0x140004fc4e0)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/bufio/bufio.go:106 +0xfc fp=0x14000062d00 sp=0x14000062cc0 pc=0x10083e18c
bufio.(*Reader).Peek(0x140004fc4e0, 0x1)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/bufio/bufio.go:144 +0x60 fp=0x14000062d20 sp=0x14000062d00 pc=0x10083e300
net/http.(*persistConn).readLoop(0x14000128d80)
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:2107 +0x144 fp=0x14000062fb0 sp=0x14000062d20 pc=0x100a28d14
net/http.(*Transport).dialConn.func5()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1765 +0x28 fp=0x14000062fd0 sp=0x14000062fb0 pc=0x100a274b8
runtime.goexit()
/opt/homebrew/Cellar/go/1.20.6/libexec/src/runtime/asm_arm64.s:1172 +0x4 fp=0x14000062fd0 sp=0x14000062fd0 pc=0x1007fc324
created by net/http.(*Transport).dialConn
/opt/homebrew/Cellar/go/1.20.6/libexec/src/net/http/transport.go:1765 +0x11c8
r0 0x0
r1 0x0
r2 0x0
r3 0x0
r4 0x0
r5 0x171672c10
r6 0xa
r7 0x0
r8 0x6b684de7b1e616cc
r9 0x6b684de6c08ea6cc
r10 0x2
r11 0xfffffffd
r12 0x10000000000
r13 0x0
r14 0x0
r15 0x0
r16 0x148
r17 0x1f292cf20
r18 0x0
r19 0x6
r20 0x17168b000
r21 0x1a03
r22 0x17168b0e0
r23 0x8
r24 0x7
r25 0x8
r26 0x1ede07460
r27 0x100cd3094
r28 0x100df50c0
r29 0x171672bc0
lr 0x1929a3c28
sp 0x171672ba0
pc 0x19296c724
fault 0x19296c724
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/118/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/118/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2370
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2370/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2370/comments
|
https://api.github.com/repos/ollama/ollama/issues/2370/events
|
https://github.com/ollama/ollama/issues/2370
| 2,120,108,637
|
I_kwDOJ0Z1Ps5-Xkpd
| 2,370
|
36GB Macbook not using GPU for models that could fit
|
{
"login": "WinnieP",
"id": 497472,
"node_id": "MDQ6VXNlcjQ5NzQ3Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/497472?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/WinnieP",
"html_url": "https://github.com/WinnieP",
"followers_url": "https://api.github.com/users/WinnieP/followers",
"following_url": "https://api.github.com/users/WinnieP/following{/other_user}",
"gists_url": "https://api.github.com/users/WinnieP/gists{/gist_id}",
"starred_url": "https://api.github.com/users/WinnieP/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/WinnieP/subscriptions",
"organizations_url": "https://api.github.com/users/WinnieP/orgs",
"repos_url": "https://api.github.com/users/WinnieP/repos",
"events_url": "https://api.github.com/users/WinnieP/events{/privacy}",
"received_events_url": "https://api.github.com/users/WinnieP/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6677745918,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgZQ_g",
"url": "https://api.github.com/repos/ollama/ollama/labels/gpu",
"name": "gpu",
"color": "76C49E",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-02-06T07:09:27
| 2024-04-09T05:04:37
| 2024-03-12T21:30:44
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | ERROR: type should be string, got "https://github.com/ollama/ollama/blob/27aa2d4a194c6daeafbd00391f475628deccce72/gpu/gpu_darwin.go#L24C1-L28C3\r\n\r\nIn older versions of Ollama, certain models would run on the GPU of a 36GB M3 macbook pro (specifically q4_K_M quantization of mixtral). Now, it's running on CPU.\r\nI believe MacOS is allowing closer to ~75% of the memory to be allocated to GPU on this model, not 66%.\r\n\r\n```ggml_metal_init: recommendedMaxWorkingSetSize = 28991.03 MB```"
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2370/reactions",
"total_count": 4,
"+1": 4,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2370/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8287
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8287/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8287/comments
|
https://api.github.com/repos/ollama/ollama/issues/8287/events
|
https://github.com/ollama/ollama/issues/8287
| 2,765,927,156
|
I_kwDOJ0Z1Ps6k3LL0
| 8,287
|
The <toolcall> in nemotron-mini. Again.
|
{
"login": "tripolskypetr",
"id": 19227776,
"node_id": "MDQ6VXNlcjE5MjI3Nzc2",
"avatar_url": "https://avatars.githubusercontent.com/u/19227776?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tripolskypetr",
"html_url": "https://github.com/tripolskypetr",
"followers_url": "https://api.github.com/users/tripolskypetr/followers",
"following_url": "https://api.github.com/users/tripolskypetr/following{/other_user}",
"gists_url": "https://api.github.com/users/tripolskypetr/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tripolskypetr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tripolskypetr/subscriptions",
"organizations_url": "https://api.github.com/users/tripolskypetr/orgs",
"repos_url": "https://api.github.com/users/tripolskypetr/repos",
"events_url": "https://api.github.com/users/tripolskypetr/events{/privacy}",
"received_events_url": "https://api.github.com/users/tripolskypetr/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 27
| 2025-01-02T12:11:25
| 2025-01-29T19:14:51
| 2025-01-29T19:14:51
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
# The problem
I am trying to implement an agent swarm for Ollama from scratch. I made the triage agent, an intent navigator. It should call either `navigate_to_refund_agent_tool` or `navigate_to_sales_agent_tool` depending on the user's choice. Both tools take empty arguments, like:
```tsx
function navigate_to_refund_agent_tool(/* no arguments */) {
}
function navigate_to_sales_agent_tool(/* no arguments */) {
}
```
When I send the following curl request, it returns nasty raw XML output.
```bash
curl --location 'http://localhost:11434/api/chat' \
--header 'Content-Type: application/json' \
--data '{
"model": "nemotron-mini",
"messages": [
{
"role": "system",
"content": "You are to triage a users request, and call a tool to transfer to the right agent. There are two agents you can transfer to: navigate_to_refund_agent_tool and navigate_to_sales_agent_tool. Untill calling any function, you must ask the user for their agent. Before navigation make sure you choose well. Do not spam function executions"
},
{
"role": "user",
"content": "Navigate me to sales agent"
}
],
"stream": false,
"options": {
"top_k": 20,
"top_p": 0.4,
"temperature": 0.5
},
"tools": [
{
"type": "function",
"function": {
"name": "navigate_to_sales_agent_tool",
"description": "Navigate to sales agent",
"parameters": {
"type": "object",
"required": [
],
"properties": {
}
}
}
},
{
"type": "function",
"function": {
"name": "navigate_to_refund_agent_tool",
"description": "Navigate to refund agent",
"parameters": {
"type": "object",
"required": [
],
"properties": {
}
}
}
}
]
}'
```
From time to time, the output is:
```
<toolcall> {\"type\": \"function\", \"arguments\": { \"name\": \"navigate_to_sales_agent_tool\" }} </toolcall>
```
Also, the JSON inside the `<toolcall>` tag is sometimes invalid.
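As a client-side workaround, the raw span can be parsed and validated before use. This is a minimal, hypothetical Go sketch (the `extractToolCall` helper is mine, not an Ollama API):

```go
package main

import (
	"encoding/json"
	"fmt"
	"regexp"
	"strings"
)

// toolcallRe matches a raw <toolcall>...</toolcall> span that some
// models emit instead of a structured tool_calls response field.
var toolcallRe = regexp.MustCompile(`(?s)<toolcall>(.*?)</toolcall>`)

// extractToolCall pulls the JSON payload out of the span and validates
// it, so malformed emissions fail loudly instead of being executed.
func extractToolCall(s string) (map[string]any, error) {
	m := toolcallRe.FindStringSubmatch(s)
	if m == nil {
		return nil, fmt.Errorf("no <toolcall> span found")
	}
	var call map[string]any
	if err := json.Unmarshal([]byte(strings.TrimSpace(m[1])), &call); err != nil {
		return nil, fmt.Errorf("invalid JSON inside <toolcall>: %w", err)
	}
	return call, nil
}

func main() {
	raw := `<toolcall> {"type": "function", "arguments": {"name": "navigate_to_sales_agent_tool"}} </toolcall>`
	call, err := extractToolCall(raw)
	fmt.Println(call, err)
}
```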
# To solve the problem
1. **How do I pin the model version? You seem to update models without publishing a new tag.**

This will definitely break AI prompts; I need to fetch exactly the pinned model version, even if it was uploaded ten years ago.
2. **When are you planning to remove the misleading tools-support labels?**

Raw XML output is common across all the models labeled with tools support, so the label is misleading. Please give accurate information about the state of the framework: without working tools, it is unusable and a waste of time.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.4
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8287/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8287/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4244
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4244/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4244/comments
|
https://api.github.com/repos/ollama/ollama/issues/4244/events
|
https://github.com/ollama/ollama/pull/4244
| 2,284,455,893
|
PR_kwDOJ0Z1Ps5u0iHB
| 4,244
|
skip if same quantization
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-08T00:44:44
| 2024-05-08T02:03:38
| 2024-05-08T02:03:38
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4244",
"html_url": "https://github.com/ollama/ollama/pull/4244",
"diff_url": "https://github.com/ollama/ollama/pull/4244.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4244.patch",
"merged_at": "2024-05-08T02:03:38"
}
|
this skips quantization if the input and output are the same file type. most of the time, this means if the input and output are both f16
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4244/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4244/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/929
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/929/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/929/comments
|
https://api.github.com/repos/ollama/ollama/issues/929/events
|
https://github.com/ollama/ollama/issues/929
| 1,964,769,465
|
I_kwDOJ0Z1Ps51HAC5
| 929
|
FR: Increase prompt size limit on UI
|
{
"login": "hemanth",
"id": 18315,
"node_id": "MDQ6VXNlcjE4MzE1",
"avatar_url": "https://avatars.githubusercontent.com/u/18315?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hemanth",
"html_url": "https://github.com/hemanth",
"followers_url": "https://api.github.com/users/hemanth/followers",
"following_url": "https://api.github.com/users/hemanth/following{/other_user}",
"gists_url": "https://api.github.com/users/hemanth/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hemanth/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hemanth/subscriptions",
"organizations_url": "https://api.github.com/users/hemanth/orgs",
"repos_url": "https://api.github.com/users/hemanth/repos",
"events_url": "https://api.github.com/users/hemanth/events{/privacy}",
"received_events_url": "https://api.github.com/users/hemanth/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 5
| 2023-10-27T04:59:26
| 2023-10-30T22:28:49
| 2023-10-30T22:28:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
<img width="378" alt="image" src="https://github.com/jmorganca/ollama/assets/18315/2bee7056-b101-4430-a0c1-400cff494405">
This is limited to 255 chars only.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/929/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/929/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6852
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6852/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6852/comments
|
https://api.github.com/repos/ollama/ollama/issues/6852/events
|
https://github.com/ollama/ollama/issues/6852
| 2,532,873,826
|
I_kwDOJ0Z1Ps6W-JZi
| 6,852
|
Fetch Failed Error on using OLLAMA locally with nomic-embed-text and llama3.1:8b
|
{
"login": "saisandeepbalbari",
"id": 25894087,
"node_id": "MDQ6VXNlcjI1ODk0MDg3",
"avatar_url": "https://avatars.githubusercontent.com/u/25894087?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/saisandeepbalbari",
"html_url": "https://github.com/saisandeepbalbari",
"followers_url": "https://api.github.com/users/saisandeepbalbari/followers",
"following_url": "https://api.github.com/users/saisandeepbalbari/following{/other_user}",
"gists_url": "https://api.github.com/users/saisandeepbalbari/gists{/gist_id}",
"starred_url": "https://api.github.com/users/saisandeepbalbari/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saisandeepbalbari/subscriptions",
"organizations_url": "https://api.github.com/users/saisandeepbalbari/orgs",
"repos_url": "https://api.github.com/users/saisandeepbalbari/repos",
"events_url": "https://api.github.com/users/saisandeepbalbari/events{/privacy}",
"received_events_url": "https://api.github.com/users/saisandeepbalbari/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 12
| 2024-09-18T06:48:26
| 2024-12-02T22:56:23
| 2024-12-02T22:56:23
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm using Ollama with Anything LLM. It's taking a lot of time to respond to the prompts.
I'm getting the following error from the docker logs of Anything LLM:
[OllamaEmbedder] Embedding 1 chunks of text with nomic-embed-text:latest.
TypeError: fetch failed
at node:internal/deps/undici/undici:12618:11
at async createOllamaStream (/app/server/node_modules/@langchain/community/dist/utils/ollama.cjs:12:22)
at async createOllamaChatStream (/app/server/node_modules/@langchain/community/dist/utils/ollama.cjs:61:5)
at async ChatOllama._streamResponseChunks (/app/server/node_modules/@langchain/community/dist/chat_models/ollama.cjs:399:30)
at async ChatOllama._streamIterator (/app/server/node_modules/@langchain/core/dist/language_models/chat_models.cjs:82:34)
at async ChatOllama.transform (/app/server/node_modules/@langchain/core/dist/runnables/base.cjs:382:9)
at async wrapInputForTracing (/app/server/node_modules/@langchain/core/dist/runnables/base.cjs:258:30)
at async pipeGeneratorWithSetup (/app/server/node_modules/@langchain/core/dist/utils/stream.cjs:230:19)
at async StringOutputParser._transformStreamWithConfig (/app/server/node_modules/@langchain/core/dist/runnables/base.cjs:279:26)
at async StringOutputParser.transform (/app/server/node_modules/@langchain/core/dist/output_parsers/transform.cjs:36:9) {
cause: HeadersTimeoutError: Headers Timeout Error
at Timeout.onParserTimeout [as callback] (node:internal/deps/undici/undici:9117:32)
at Timeout.onTimeout [as _onTimeout] (node:internal/deps/undici/undici:7148:17)
at listOnTimeout (node:internal/timers:569:17)
at process.processTimers (node:internal/timers:512:7) {
code: 'UND_ERR_HEADERS_TIMEOUT'
}
}
### OS
Linux
### GPU
_No response_
### CPU
Intel
### Ollama version
_No response_
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6852/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6852/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/771
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/771/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/771/comments
|
https://api.github.com/repos/ollama/ollama/issues/771/events
|
https://github.com/ollama/ollama/issues/771
| 1,940,762,088
|
I_kwDOJ0Z1Ps5zra3o
| 771
|
Looking up environment variables while starting the server via Electron
|
{
"login": "ba1uev",
"id": 7990776,
"node_id": "MDQ6VXNlcjc5OTA3NzY=",
"avatar_url": "https://avatars.githubusercontent.com/u/7990776?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ba1uev",
"html_url": "https://github.com/ba1uev",
"followers_url": "https://api.github.com/users/ba1uev/followers",
"following_url": "https://api.github.com/users/ba1uev/following{/other_user}",
"gists_url": "https://api.github.com/users/ba1uev/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ba1uev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ba1uev/subscriptions",
"organizations_url": "https://api.github.com/users/ba1uev/orgs",
"repos_url": "https://api.github.com/users/ba1uev/repos",
"events_url": "https://api.github.com/users/ba1uev/events{/privacy}",
"received_events_url": "https://api.github.com/users/ba1uev/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
},
{
"id": 5667396210,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2acg",
"url": "https://api.github.com/repos/ollama/ollama/labels/good%20first%20issue",
"name": "good first issue",
"color": "7057ff",
"default": true,
"description": "Good for newcomers"
}
] |
closed
| false
| null |
[] | null | 5
| 2023-10-12T20:41:16
| 2025-01-28T18:32:04
| 2025-01-28T18:32:02
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Frequent users of web clients may configure some of the supported variables in shell profile files such as `~/.bash_profile`, `~/.zshrc`, etc.
```bash
echo "export OLLAMA_ORIGINS=https://example.com OLLAMA_HOST=0.0.0.0:1337" >> ~/.zshrc
```
The server launched manually using `ollama serve` will utilize these variables. It would be beneficial to support them for the server launched via the Electron application.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/771/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/771/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4190
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4190/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4190/comments
|
https://api.github.com/repos/ollama/ollama/issues/4190/events
|
https://github.com/ollama/ollama/pull/4190
| 2,279,894,149
|
PR_kwDOJ0Z1Ps5ulb_t
| 4,190
|
fix golangci workflow not enable gofmt and goimports
|
{
"login": "alwqx",
"id": 9915368,
"node_id": "MDQ6VXNlcjk5MTUzNjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/9915368?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alwqx",
"html_url": "https://github.com/alwqx",
"followers_url": "https://api.github.com/users/alwqx/followers",
"following_url": "https://api.github.com/users/alwqx/following{/other_user}",
"gists_url": "https://api.github.com/users/alwqx/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alwqx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alwqx/subscriptions",
"organizations_url": "https://api.github.com/users/alwqx/orgs",
"repos_url": "https://api.github.com/users/alwqx/repos",
"events_url": "https://api.github.com/users/alwqx/events{/privacy}",
"received_events_url": "https://api.github.com/users/alwqx/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-06T02:18:23
| 2024-05-07T16:49:40
| 2024-05-07T16:49:40
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4190",
"html_url": "https://github.com/ollama/ollama/pull/4190",
"diff_url": "https://github.com/ollama/ollama/pull/4190.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4190.patch",
"merged_at": "2024-05-07T16:49:40"
}
|
Hi, this PR tries to fix golangci not enabling `gofmt` and `goimports` in GitHub workflows.
All workflows have passed. But I noticed `* text eol=lf` was removed in commit [9164b0161](https://github.com/ollama/ollama/commit/9164b0161bcb24e543cba835a8863b80af2c0c21), in which v0.1.33 was released. So I still need help from maintainers because `.github/workflows/release.yaml` is not tested. For now I cannot trigger the release workflow in my own repo https://github.com/alwqx/ollama because `.github/workflows/release.yaml` references some environment values which I don't have or don't know how to obtain for myself:
- MACOS_SIGNING_KEY
- MACOS_SIGNING_KEY_PASSWORD
- APPLE_IDENTITY
- APPLE_PASSWORD
- APPLE_ID
- KEY_CONTAINER
- GOOGLE_SIGNING_CREDENTIALS
- ...
I hope the maintainers/collaborators can help test the release workflow for this change.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4190/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4190/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3643
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3643/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3643/comments
|
https://api.github.com/repos/ollama/ollama/issues/3643/events
|
https://github.com/ollama/ollama/issues/3643
| 2,242,721,513
|
I_kwDOJ0Z1Ps6FrTbp
| 3,643
|
how to change the max input token length when I run ‘’ollama run gemma:7b-instruct-v1.1-fp16‘’
|
{
"login": "dh12306",
"id": 20471681,
"node_id": "MDQ6VXNlcjIwNDcxNjgx",
"avatar_url": "https://avatars.githubusercontent.com/u/20471681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dh12306",
"html_url": "https://github.com/dh12306",
"followers_url": "https://api.github.com/users/dh12306/followers",
"following_url": "https://api.github.com/users/dh12306/following{/other_user}",
"gists_url": "https://api.github.com/users/dh12306/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dh12306/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dh12306/subscriptions",
"organizations_url": "https://api.github.com/users/dh12306/orgs",
"repos_url": "https://api.github.com/users/dh12306/repos",
"events_url": "https://api.github.com/users/dh12306/events{/privacy}",
"received_events_url": "https://api.github.com/users/dh12306/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 7
| 2024-04-15T05:12:38
| 2024-12-02T09:55:40
| 2024-04-17T00:46:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The default input token length is 2048? How can I change it, since Gemma can support more input tokens?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3643/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3643/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3285
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3285/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3285/comments
|
https://api.github.com/repos/ollama/ollama/issues/3285/events
|
https://github.com/ollama/ollama/issues/3285
| 2,200,109,066
|
I_kwDOJ0Z1Ps6DIwAK
| 3,285
|
gemma accuracy down from 0.128 to 0.129
|
{
"login": "RamiKassouf",
"id": 92019309,
"node_id": "U_kgDOBXwabQ",
"avatar_url": "https://avatars.githubusercontent.com/u/92019309?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RamiKassouf",
"html_url": "https://github.com/RamiKassouf",
"followers_url": "https://api.github.com/users/RamiKassouf/followers",
"following_url": "https://api.github.com/users/RamiKassouf/following{/other_user}",
"gists_url": "https://api.github.com/users/RamiKassouf/gists{/gist_id}",
"starred_url": "https://api.github.com/users/RamiKassouf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RamiKassouf/subscriptions",
"organizations_url": "https://api.github.com/users/RamiKassouf/orgs",
"repos_url": "https://api.github.com/users/RamiKassouf/repos",
"events_url": "https://api.github.com/users/RamiKassouf/events{/privacy}",
"received_events_url": "https://api.github.com/users/RamiKassouf/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5667396220,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2afA",
"url": "https://api.github.com/repos/ollama/ollama/labels/question",
"name": "question",
"color": "d876e3",
"default": true,
"description": "General questions"
}
] |
open
| false
| null |
[] | null | 3
| 2024-03-21T12:48:18
| 2024-03-30T04:16:45
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Prompts that were producing the correct results are now producing different (false) outputs
### What did you expect to see?
Correctly formatted YAML with correct inputs based on a custom prompt
-> Got a YAML with indentation issues and missing fields with a wrong structure, even though JSON schemas are provided
Note that this has also happened on my colleague's PC as well as on a remote machine
### Steps to reproduce
Upgrade Ollama using the Linux download command provided on the website
ollama run gemma
After I rm the model and then pull it again, it becomes better for a bit, then it gets worse after a while
### Are there any recent changes that introduced the issue?
Version upgrade to 0.1.29
### OS
Linux
### Architecture
_No response_
### Platform
WSL2
### Ollama version
0.1.29
### GPU
Nvidia
### GPU info
Quadro P1000
### CPU
Intel
### Other software
Intel(R) Core(TM) i7-8850H CPU @ 2.60GHz 2.59 GHz
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3285/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3285/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/8601
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8601/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8601/comments
|
https://api.github.com/repos/ollama/ollama/issues/8601/events
|
https://github.com/ollama/ollama/pull/8601
| 2,812,057,230
|
PR_kwDOJ0Z1Ps6JCJq5
| 8,601
|
README: Add handy-ollama to tutorial
|
{
"login": "AXYZdong",
"id": 45477220,
"node_id": "MDQ6VXNlcjQ1NDc3MjIw",
"avatar_url": "https://avatars.githubusercontent.com/u/45477220?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AXYZdong",
"html_url": "https://github.com/AXYZdong",
"followers_url": "https://api.github.com/users/AXYZdong/followers",
"following_url": "https://api.github.com/users/AXYZdong/following{/other_user}",
"gists_url": "https://api.github.com/users/AXYZdong/gists{/gist_id}",
"starred_url": "https://api.github.com/users/AXYZdong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AXYZdong/subscriptions",
"organizations_url": "https://api.github.com/users/AXYZdong/orgs",
"repos_url": "https://api.github.com/users/AXYZdong/repos",
"events_url": "https://api.github.com/users/AXYZdong/events{/privacy}",
"received_events_url": "https://api.github.com/users/AXYZdong/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2025-01-27T04:29:41
| 2025-01-27T17:08:48
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8601",
"html_url": "https://github.com/ollama/ollama/pull/8601",
"diff_url": "https://github.com/ollama/ollama/pull/8601.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8601.patch",
"merged_at": null
}
|
Chinese tutorial for Ollama by [Datawhale](https://github.com/datawhalechina) - China's largest open-source AI learning community.
We'd like to contribute to the Ollama community by announcing the release of our open-source Chinese tutorial.
This tutorial aims to be comprehensive and easy to understand, covering:
- Ollama Introduction
- Ollama Installation and Configuration
- Custom Model Import
- Ollama REST API
- Using Ollama with LangChain
- Deployment of Ollama Visual Interfaces
- Application Examples
The repo is at: https://github.com/datawhalechina/handy-ollama
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8601/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8601/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6717
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6717/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6717/comments
|
https://api.github.com/repos/ollama/ollama/issues/6717/events
|
https://github.com/ollama/ollama/pull/6717
| 2,515,025,890
|
PR_kwDOJ0Z1Ps566JsO
| 6,717
|
Improve nvidia GPU discovery error handling
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2024-09-09T22:19:59
| 2024-11-27T20:55:56
| null |
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6717",
"html_url": "https://github.com/ollama/ollama/pull/6717",
"diff_url": "https://github.com/ollama/ollama/pull/6717.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6717.patch",
"merged_at": null
}
|
In some cases, the CUDA library may respond with a status code indicating we should retry later.
If we get an error, use the applicable CUDA library error string function to get a human-readable explanation.
Improve logging during retries in the server subprocess logic as well.
Fixes #6637
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6717/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6717/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7804
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7804/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7804/comments
|
https://api.github.com/repos/ollama/ollama/issues/7804/events
|
https://github.com/ollama/ollama/issues/7804
| 2,684,833,085
|
I_kwDOJ0Z1Ps6gB009
| 7,804
|
Not reading image files with vision models
|
{
"login": "whatToUseThisFor",
"id": 130185104,
"node_id": "U_kgDOB8J3kA",
"avatar_url": "https://avatars.githubusercontent.com/u/130185104?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/whatToUseThisFor",
"html_url": "https://github.com/whatToUseThisFor",
"followers_url": "https://api.github.com/users/whatToUseThisFor/followers",
"following_url": "https://api.github.com/users/whatToUseThisFor/following{/other_user}",
"gists_url": "https://api.github.com/users/whatToUseThisFor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/whatToUseThisFor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/whatToUseThisFor/subscriptions",
"organizations_url": "https://api.github.com/users/whatToUseThisFor/orgs",
"repos_url": "https://api.github.com/users/whatToUseThisFor/repos",
"events_url": "https://api.github.com/users/whatToUseThisFor/events{/privacy}",
"received_events_url": "https://api.github.com/users/whatToUseThisFor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-11-22T22:43:22
| 2024-11-23T17:51:24
| 2024-11-23T17:51:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?

When I try to give an image file to a model with the "vision" tag, it says that it can't access files on my computer.
I've tried llava 7b, llava 13b, and llama3.2-vision (all attempts were in a terminal).
If it helps, I'm using a card with 8GB of VRAM and have 32GB of system RAM.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.4.3
|
{
"login": "whatToUseThisFor",
"id": 130185104,
"node_id": "U_kgDOB8J3kA",
"avatar_url": "https://avatars.githubusercontent.com/u/130185104?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/whatToUseThisFor",
"html_url": "https://github.com/whatToUseThisFor",
"followers_url": "https://api.github.com/users/whatToUseThisFor/followers",
"following_url": "https://api.github.com/users/whatToUseThisFor/following{/other_user}",
"gists_url": "https://api.github.com/users/whatToUseThisFor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/whatToUseThisFor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/whatToUseThisFor/subscriptions",
"organizations_url": "https://api.github.com/users/whatToUseThisFor/orgs",
"repos_url": "https://api.github.com/users/whatToUseThisFor/repos",
"events_url": "https://api.github.com/users/whatToUseThisFor/events{/privacy}",
"received_events_url": "https://api.github.com/users/whatToUseThisFor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7804/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7804/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6842
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6842/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6842/comments
|
https://api.github.com/repos/ollama/ollama/issues/6842/events
|
https://github.com/ollama/ollama/pull/6842
| 2,531,894,590
|
PR_kwDOJ0Z1Ps57zWo0
| 6,842
|
llama: Refine developer docs for Go server
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-09-17T19:09:06
| 2024-09-27T22:12:43
| 2024-09-27T22:12:40
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6842",
"html_url": "https://github.com/ollama/ollama/pull/6842",
"diff_url": "https://github.com/ollama/ollama/pull/6842.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6842.patch",
"merged_at": "2024-09-27T22:12:40"
}
|
This enhances the development documentation, focusing on a minimal, known-to-work set of tools.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6842/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6842/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1778
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1778/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1778/comments
|
https://api.github.com/repos/ollama/ollama/issues/1778/events
|
https://github.com/ollama/ollama/pull/1778
| 2,064,770,937
|
PR_kwDOJ0Z1Ps5jLXHL
| 1,778
|
Fail fast on WSL1 while allowing on WSL2
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-03T23:16:12
| 2024-01-04T00:18:44
| 2024-01-04T00:18:41
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1778",
"html_url": "https://github.com/ollama/ollama/pull/1778",
"diff_url": "https://github.com/ollama/ollama/pull/1778.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1778.patch",
"merged_at": "2024-01-04T00:18:41"
}
|
This prevents users from accidentally installing on WSL1, with instructions guiding them to upgrade their WSL instance to version 2. Once running WSL2, if you have an NVIDIA card you can follow NVIDIA's instructions to set up GPU passthrough and run models on the GPU; this is not possible on WSL1.
Example output:
WSL1
```
daniel@DESKTOP-PUNI632:/mnt/c/Users/Daniel$ ./install.sh
ERROR WSL1 is not currently supported - please upgrade to WSL2 with 'wsl --set-version <distro> 2'
daniel@DESKTOP-PUNI632:/mnt/c/Users/Daniel$ uname -r
4.4.0-19041-Microsoft
```
WSL2
```
root@DESKTOP-PUNI632:/mnt/c/Users/Daniel# ./install.sh
>>> Downloading ollama...
################################################################################################################# 100.0%
...
root@DESKTOP-PUNI632:/mnt/c/Users/Daniel# uname -r
5.15.133.1-microsoft-standard-WSL2
```
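The check keys off the kernel release string shown in the sessions above: WSL1 kernels end in `-Microsoft`, while WSL2 kernels carry a `WSL2` suffix. A minimal Go sketch of the same classification (the function name is mine, not Ollama's; the real check lives in install.sh):

```go
package main

import (
	"fmt"
	"strings"
)

// classifyWSL inspects a `uname -r` string: WSL2 kernels contain
// "WSL2", WSL1 kernels otherwise contain "microsoft"/"Microsoft".
func classifyWSL(release string) string {
	switch {
	case strings.Contains(release, "WSL2"):
		return "wsl2"
	case strings.Contains(strings.ToLower(release), "microsoft"):
		return "wsl1"
	default:
		return "native"
	}
}

func main() {
	fmt.Println(classifyWSL("4.4.0-19041-Microsoft"))              // wsl1
	fmt.Println(classifyWSL("5.15.133.1-microsoft-standard-WSL2")) // wsl2
	fmt.Println(classifyWSL("6.6.8-arch1-1"))                      // native
}
```

Checking for "WSL2" first matters, since WSL2 release strings also contain "microsoft" and would otherwise be misclassified as WSL1.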
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1778/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1778/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8332
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8332/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8332/comments
|
https://api.github.com/repos/ollama/ollama/issues/8332/events
|
https://github.com/ollama/ollama/issues/8332
| 2,772,129,016
|
I_kwDOJ0Z1Ps6lO1T4
| 8,332
|
Allow set the type of K/V cache separately
|
{
"login": "ag2s20150909",
"id": 19373730,
"node_id": "MDQ6VXNlcjE5MzczNzMw",
"avatar_url": "https://avatars.githubusercontent.com/u/19373730?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ag2s20150909",
"html_url": "https://github.com/ag2s20150909",
"followers_url": "https://api.github.com/users/ag2s20150909/followers",
"following_url": "https://api.github.com/users/ag2s20150909/following{/other_user}",
"gists_url": "https://api.github.com/users/ag2s20150909/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ag2s20150909/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ag2s20150909/subscriptions",
"organizations_url": "https://api.github.com/users/ag2s20150909/orgs",
"repos_url": "https://api.github.com/users/ag2s20150909/repos",
"events_url": "https://api.github.com/users/ag2s20150909/events{/privacy}",
"received_events_url": "https://api.github.com/users/ag2s20150909/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-07T07:50:27
| 2025-01-21T14:03:20
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Allow setting the types of the K and V caches separately.
On Qwen2-7B:
when the K and V caches are both `q4_0`, it produces weird results.
when K is `q4_0` and V is `q8_0`, it produces weird results.
when K is `q8_0` and V is `q4_0`, it produces normal results.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8332/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8332/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2491
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2491/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2491/comments
|
https://api.github.com/repos/ollama/ollama/issues/2491/events
|
https://github.com/ollama/ollama/issues/2491
| 2,134,227,302
|
I_kwDOJ0Z1Ps5_Nblm
| 2,491
|
How to install ollama on ubuntu with specific version
|
{
"login": "MugdhaHardikar-GSLab",
"id": 5062147,
"node_id": "MDQ6VXNlcjUwNjIxNDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/5062147?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/MugdhaHardikar-GSLab",
"html_url": "https://github.com/MugdhaHardikar-GSLab",
"followers_url": "https://api.github.com/users/MugdhaHardikar-GSLab/followers",
"following_url": "https://api.github.com/users/MugdhaHardikar-GSLab/following{/other_user}",
"gists_url": "https://api.github.com/users/MugdhaHardikar-GSLab/gists{/gist_id}",
"starred_url": "https://api.github.com/users/MugdhaHardikar-GSLab/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MugdhaHardikar-GSLab/subscriptions",
"organizations_url": "https://api.github.com/users/MugdhaHardikar-GSLab/orgs",
"repos_url": "https://api.github.com/users/MugdhaHardikar-GSLab/repos",
"events_url": "https://api.github.com/users/MugdhaHardikar-GSLab/events{/privacy}",
"received_events_url": "https://api.github.com/users/MugdhaHardikar-GSLab/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 8
| 2024-02-14T12:16:23
| 2025-01-21T18:10:51
| 2024-02-20T03:59:33
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I want to install Ollama on my Ubuntu server, but every few days a new version of Ollama gets installed. I want to pin the version of Ollama installed on my machine. The current install.sh doesn't seem to have that functionality. Is there any way?
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2491/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2491/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/114
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/114/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/114/comments
|
https://api.github.com/repos/ollama/ollama/issues/114/events
|
https://github.com/ollama/ollama/issues/114
| 1,811,136,432
|
I_kwDOJ0Z1Ps5r87-w
| 114
|
pls Wizard Uncensored
|
{
"login": "nathanleclaire",
"id": 1476820,
"node_id": "MDQ6VXNlcjE0NzY4MjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1476820?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nathanleclaire",
"html_url": "https://github.com/nathanleclaire",
"followers_url": "https://api.github.com/users/nathanleclaire/followers",
"following_url": "https://api.github.com/users/nathanleclaire/following{/other_user}",
"gists_url": "https://api.github.com/users/nathanleclaire/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nathanleclaire/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nathanleclaire/subscriptions",
"organizations_url": "https://api.github.com/users/nathanleclaire/orgs",
"repos_url": "https://api.github.com/users/nathanleclaire/repos",
"events_url": "https://api.github.com/users/nathanleclaire/events{/privacy}",
"received_events_url": "https://api.github.com/users/nathanleclaire/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-07-19T04:50:38
| 2023-07-19T15:22:46
| 2023-07-19T06:38:05
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/TheBloke/WizardLM-13B-Uncensored-GGML good performance, in my experience
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/114/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/114/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2411
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2411/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2411/comments
|
https://api.github.com/repos/ollama/ollama/issues/2411/events
|
https://github.com/ollama/ollama/issues/2411
| 2,125,248,757
|
I_kwDOJ0Z1Ps5-rLj1
| 2,411
|
Discrete AMD GPU not used, CPU used instead
|
{
"login": "haplo",
"id": 71658,
"node_id": "MDQ6VXNlcjcxNjU4",
"avatar_url": "https://avatars.githubusercontent.com/u/71658?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/haplo",
"html_url": "https://github.com/haplo",
"followers_url": "https://api.github.com/users/haplo/followers",
"following_url": "https://api.github.com/users/haplo/following{/other_user}",
"gists_url": "https://api.github.com/users/haplo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/haplo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haplo/subscriptions",
"organizations_url": "https://api.github.com/users/haplo/orgs",
"repos_url": "https://api.github.com/users/haplo/repos",
"events_url": "https://api.github.com/users/haplo/events{/privacy}",
"received_events_url": "https://api.github.com/users/haplo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 28
| 2024-02-08T13:57:34
| 2024-04-29T12:57:14
| 2024-03-12T23:30:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
My system has both an integrated and a dedicated GPU (an AMD Radeon 7900XTX). Ollama ignores the integrated card and detects the 7900XTX, but then goes ahead and uses the CPU (Ryzen 7900).
I'm running ollama 0.1.23 from the Arch Linux repository. This should include the fix from #2195; I see in the logs that `ROCR_VISIBLE_DEVICES=0`.
The only errors I see in the logs are:
```
rsmi_dev_serial_number_get failed: 2
rsmi_dev_vram_vendor_get failed: 2
rsmi_dev_serial_number_get failed: 2
```
Full debug log starting the systemd unit with `OLLAMA_DEBUG=1` and then `ollama run mistral`:
```
Started Ollama Service.
time=2024-02-08T13:52:58.187Z level=INFO source=images.go:860 msg="total blobs: 9"
time=2024-02-08T13:52:58.188Z level=INFO source=images.go:867 msg="total unused blobs removed: 0"
[GIN-debug] [WARNING] Creating an Engine instance with the Logger and Recovery middleware already attached.
[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
- using env: export GIN_MODE=release
- using code: gin.SetMode(gin.ReleaseMode)
[GIN-debug] POST /api/pull --> github.com/jmorganca/ollama/server.PullModelHandler (5 handlers)
[GIN-debug] POST /api/generate --> github.com/jmorganca/ollama/server.GenerateHandler (5 handlers)
[GIN-debug] POST /api/chat --> github.com/jmorganca/ollama/server.ChatHandler (5 handlers)
[GIN-debug] POST /api/embeddings --> github.com/jmorganca/ollama/server.EmbeddingHandler (5 handlers)
[GIN-debug] POST /api/create --> github.com/jmorganca/ollama/server.CreateModelHandler (5 handlers)
[GIN-debug] POST /api/push --> github.com/jmorganca/ollama/server.PushModelHandler (5 handlers)
[GIN-debug] POST /api/copy --> github.com/jmorganca/ollama/server.CopyModelHandler (5 handlers)
[GIN-debug] DELETE /api/delete --> github.com/jmorganca/ollama/server.DeleteModelHandler (5 handlers)
[GIN-debug] POST /api/show --> github.com/jmorganca/ollama/server.ShowModelHandler (5 handlers)
[GIN-debug] POST /api/blobs/:digest --> github.com/jmorganca/ollama/server.CreateBlobHandler (5 handlers)
[GIN-debug] HEAD /api/blobs/:digest --> github.com/jmorganca/ollama/server.HeadBlobHandler (5 handlers)
[GIN-debug] GET / --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
[GIN-debug] GET /api/tags --> github.com/jmorganca/ollama/server.ListModelsHandler (5 handlers)
[GIN-debug] GET /api/version --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func3 (5 handlers)
[GIN-debug] HEAD / --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
[GIN-debug] HEAD /api/tags --> github.com/jmorganca/ollama/server.ListModelsHandler (5 handlers)
[GIN-debug] HEAD /api/version --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func3 (5 handlers)
time=2024-02-08T13:52:58.188Z level=INFO source=routes.go:995 msg="Listening on 127.0.0.1:11434 (version 0.1.23)"
time=2024-02-08T13:52:58.188Z level=INFO source=payload_common.go:106 msg="Extracting dynamic libraries..."
time=2024-02-08T13:52:58.289Z level=INFO source=payload_common.go:145 msg="Dynamic LLM libraries [cpu_avx2 cpu cpu_avx]"
time=2024-02-08T13:52:58.289Z level=DEBUG source=payload_common.go:146 msg="Override detection logic by setting OLLAMA_LLM_LIBRARY"
time=2024-02-08T13:52:58.289Z level=INFO source=gpu.go:94 msg="Detecting GPU type"
time=2024-02-08T13:52:58.289Z level=INFO source=gpu.go:242 msg="Searching for GPU management library libnvidia-ml.so"
time=2024-02-08T13:52:58.289Z level=DEBUG source=gpu.go:260 msg="gpu management search paths: [/usr/local/cuda/lib64/libnvidia-ml.so* /usr/lib/x86_64-linux-gnu/nvidia/current/libnvidia-ml.so* /usr/lib/x86_64-linux-gnu/libnvidia-ml.so* /usr/lib/wsl/lib/libnvidia-ml.so* /usr/lib/wsl/drivers/*/libnvidia-ml.so* /opt/cuda/lib64/libnvidia-ml.so* /usr/lib*/libnvidia-ml.so* /usr/local/lib*/libnvidia-ml.so* /usr/lib/aarch64-linux-gnu/nvidia/current/libnvidia-ml.so* /usr/lib/aarch64-linux-gnu/libnvidia-ml.so* /opt/cuda/targets/x86_64-linux/lib/stubs/libnvidia-ml.so* /var/lib/ollama/libnvidia-ml.so*]"
time=2024-02-08T13:52:58.294Z level=INFO source=gpu.go:288 msg="Discovered GPU libraries: []"
time=2024-02-08T13:52:58.294Z level=INFO source=gpu.go:242 msg="Searching for GPU management library librocm_smi64.so"
time=2024-02-08T13:52:58.294Z level=DEBUG source=gpu.go:260 msg="gpu management search paths: [/opt/rocm*/lib*/librocm_smi64.so* /var/lib/ollama/librocm_smi64.so*]"
time=2024-02-08T13:52:58.294Z level=INFO source=gpu.go:288 msg="Discovered GPU libraries: [/opt/rocm/lib/librocm_smi64.so.1.0]"
wiring rocm management library functions in /opt/rocm/lib/librocm_smi64.so.1.0
dlsym: rsmi_init
dlsym: rsmi_shut_down
dlsym: rsmi_dev_memory_total_get
dlsym: rsmi_dev_memory_usage_get
dlsym: rsmi_version_get
dlsym: rsmi_num_monitor_devices
dlsym: rsmi_dev_id_get
dlsym: rsmi_dev_name_get
dlsym: rsmi_dev_brand_get
dlsym: rsmi_dev_vendor_name_get
dlsym: rsmi_dev_vram_vendor_get
dlsym: rsmi_dev_serial_number_get
dlsym: rsmi_dev_subsystem_name_get
dlsym: rsmi_dev_vbios_version_get
time=2024-02-08T13:52:58.298Z level=INFO source=gpu.go:109 msg="Radeon GPU detected"
time=2024-02-08T13:52:58.299Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
discovered 2 ROCm GPU Devices
[0] ROCm device name: Navi 31 [Radeon RX 7900 XT/7900 XTX/7900M]
[0] ROCm brand: Navi 31 [Radeon RX 7900 XT/7900 XTX/7900M]
[0] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
[0] ROCm VRAM vendor: samsung
rsmi_dev_serial_number_get failed: 2
[0] ROCm subsystem name: PULSE RX 7900 XTX
[0] ROCm vbios version: 113-3E4710U-O4X
[0] ROCm totalMem 25753026560
[0] ROCm usedMem 2400063488
[1] ROCm device name: Raphael
[1] ROCm brand: Raphael
[1] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
rsmi_dev_vram_vendor_get failed: 2
rsmi_dev_serial_number_get failed: 2
[1] ROCm subsystem name: GA-MA78GM-S2H Motherboard
[1] ROCm vbios version: 102-RAPHAEL-008
[1] ROCm totalMem 67108864
[1] ROCm usedMem 16441344
[1] ROCm integrated GPU
time=2024-02-08T13:52:58.302Z level=INFO source=gpu.go:177 msg="ROCm integrated GPU detected - ROCR_VISIBLE_DEVICES=0"
time=2024-02-08T13:52:58.302Z level=DEBUG source=gpu.go:231 msg="rocm detected 2 devices with 20044M available memory"
[GIN] 2024/02/08 - 13:53:15 | 200 | 23.355µs | 127.0.0.1 | HEAD "/"
[GIN] 2024/02/08 - 13:53:15 | 200 | 669.664µs | 127.0.0.1 | POST "/api/show"
[GIN] 2024/02/08 - 13:53:15 | 200 | 221.658µs | 127.0.0.1 | POST "/api/show"
time=2024-02-08T13:53:15.435Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
discovered 2 ROCm GPU Devices
[0] ROCm device name: Navi 31 [Radeon RX 7900 XT/7900 XTX/7900M]
[0] ROCm brand: Navi 31 [Radeon RX 7900 XT/7900 XTX/7900M]
[0] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
[0] ROCm VRAM vendor: samsung
rsmi_dev_serial_number_get failed: 2
[0] ROCm subsystem name: PULSE RX 7900 XTX
[0] ROCm vbios version: 113-3E4710U-O4X
[0] ROCm totalMem 25753026560
[0] ROCm usedMem 2400071680
[1] ROCm device name: Raphael
[1] ROCm brand: Raphael
[1] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
rsmi_dev_vram_vendor_get failed: 2
rsmi_dev_serial_number_get failed: 2
[1] ROCm subsystem name: GA-MA78GM-S2H Motherboard
[1] ROCm vbios version: 102-RAPHAEL-008
[1] ROCm totalMem 67108864
[1] ROCm usedMem 16441344
[1] ROCm integrated GPU
time=2024-02-08T13:53:15.439Z level=INFO source=gpu.go:177 msg="ROCm integrated GPU detected - ROCR_VISIBLE_DEVICES=0"
time=2024-02-08T13:53:15.439Z level=DEBUG source=gpu.go:231 msg="rocm detected 2 devices with 20044M available memory"
time=2024-02-08T13:53:15.439Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
discovered 2 ROCm GPU Devices
[0] ROCm device name: Navi 31 [Radeon RX 7900 XT/7900 XTX/7900M]
[0] ROCm brand: Navi 31 [Radeon RX 7900 XT/7900 XTX/7900M]
[0] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
[0] ROCm VRAM vendor: samsung
rsmi_dev_serial_number_get failed: 2
[0] ROCm subsystem name: PULSE RX 7900 XTX
[0] ROCm vbios version: 113-3E4710U-O4X
[0] ROCm totalMem 25753026560
[0] ROCm usedMem 2400071680
[1] ROCm device name: Raphael
[1] ROCm brand: Raphael
[1] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
rsmi_dev_vram_vendor_get failed: 2
rsmi_dev_serial_number_get failed: 2
[1] ROCm subsystem name: GA-MA78GM-S2H Motherboard
[1] ROCm vbios version: 102-RAPHAEL-008
[1] ROCm totalMem 67108864
[1] ROCm usedMem 16441344
[1] ROCm integrated GPU
time=2024-02-08T13:53:15.442Z level=INFO source=gpu.go:177 msg="ROCm integrated GPU detected - ROCR_VISIBLE_DEVICES=0"
time=2024-02-08T13:53:15.442Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-02-08T13:53:15.443Z level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: /tmp/ollama1814794725/cpu_avx2/libext_server.so"
time=2024-02-08T13:53:15.443Z level=INFO source=dyn_ext_server.go:145 msg="Initializing llama server"
[1707400395] system info: AVX = 1 | AVX_VNNI = 0 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 |
llama_model_loader: loaded meta data with 24 key-value pairs and 291 tensors from /var/lib/ollama/.ollama/models/blobs/sha256:e8a35b5937a5e6d5c35d1f2a15f161e07eefe5e5bb0a3cdd42998ee79b057730 (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = mistralai
llama_model_loader: - kv 2: llama.context_length u32 = 32768
llama_model_loader: - kv 3: llama.embedding_length u32 = 4096
llama_model_loader: - kv 4: llama.block_count u32 = 32
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 6: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 7: llama.attention.head_count u32 = 32
llama_model_loader: - kv 8: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 10: llama.rope.freq_base f32 = 1000000.000000
llama_model_loader: - kv 11: general.file_type u32 = 2
llama_model_loader: - kv 12: tokenizer.ggml.model str = llama
llama_model_loader: - kv 13: tokenizer.ggml.tokens arr[str,32000] = ["<unk>", "<s>", "</s>", "<0x00>", "<...
llama_model_loader: - kv 14: tokenizer.ggml.scores arr[f32,32000] = [0.000000, 0.000000, 0.000000, 0.0000...
llama_model_loader: - kv 15: tokenizer.ggml.token_type arr[i32,32000] = [2, 3, 3, 6, 6, 6, 6, 6, 6, 6, 6, 6, ...
llama_model_loader: - kv 16: tokenizer.ggml.merges arr[str,58980] = ["▁ t", "i n", "e r", "▁ a", "h e...
llama_model_loader: - kv 17: tokenizer.ggml.bos_token_id u32 = 1
llama_model_loader: - kv 18: tokenizer.ggml.eos_token_id u32 = 2
llama_model_loader: - kv 19: tokenizer.ggml.unknown_token_id u32 = 0
llama_model_loader: - kv 20: tokenizer.ggml.add_bos_token bool = true
llama_model_loader: - kv 21: tokenizer.ggml.add_eos_token bool = false
llama_model_loader: - kv 22: tokenizer.chat_template str = {{ bos_token }}{% for message in mess...
llama_model_loader: - kv 23: general.quantization_version u32 = 2
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type q4_0: 225 tensors
llama_model_loader: - type q6_K: 1 tensors
llm_load_vocab: special tokens definition check successful ( 259/32000 ).
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = SPM
llm_load_print_meta: n_vocab = 32000
llm_load_print_meta: n_merges = 0
llm_load_print_meta: n_ctx_train = 32768
llm_load_print_meta: n_embd = 4096
llm_load_print_meta: n_head = 32
llm_load_print_meta: n_head_kv = 8
llm_load_print_meta: n_layer = 32
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_embd_head_k = 128
llm_load_print_meta: n_embd_head_v = 128
llm_load_print_meta: n_gqa = 4
llm_load_print_meta: n_embd_k_gqa = 1024
llm_load_print_meta: n_embd_v_gqa = 1024
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-05
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: n_ff = 14336
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 1000000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_yarn_orig_ctx = 32768
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: model type = 7B
llm_load_print_meta: model ftype = Q4_0
llm_load_print_meta: model params = 7.24 B
llm_load_print_meta: model size = 3.83 GiB (4.54 BPW)
llm_load_print_meta: general.name = mistralai
llm_load_print_meta: BOS token = 1 '<s>'
llm_load_print_meta: EOS token = 2 '</s>'
llm_load_print_meta: UNK token = 0 '<unk>'
llm_load_print_meta: LF token = 13 '<0x0A>'
llm_load_tensors: ggml ctx size = 0.11 MiB
llm_load_tensors: offloading 32 repeating layers to GPU
llm_load_tensors: offloading non-repeating layers to GPU
llm_load_tensors: offloaded 33/33 layers to GPU
llm_load_tensors: CPU buffer size = 3917.87 MiB
...................................................................................................
llama_new_context_with_model: n_ctx = 2048
llama_new_context_with_model: freq_base = 1000000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CPU KV buffer size = 256.00 MiB
llama_new_context_with_model: KV self size = 256.00 MiB, K (f16): 128.00 MiB, V (f16): 128.00 MiB
llama_new_context_with_model: CPU input buffer size = 12.01 MiB
llama_new_context_with_model: CPU compute buffer size = 167.20 MiB
llama_new_context_with_model: graph splits (measure): 1
[1707400395] warming up the model with an empty run
[1707400395] Available slots:
[1707400395] -> Slot 0 - max context: 2048
time=2024-02-08T13:53:15.764Z level=INFO source=dyn_ext_server.go:156 msg="Starting llama main loop"
[1707400395] llama server main loop starting
[1707400395] all slots are idle and system prompt is empty, clear the KV cache
[GIN] 2024/02/08 - 13:53:15 | 200 | 508.028434ms | 127.0.0.1 | POST "/api/chat"
time=2024-02-08T13:53:16.722Z level=DEBUG source=routes.go:1161 msg="chat handler" prompt="[INST] Tell me a joke [/INST]"
[1707400396] slot 0 is processing [task id: 0]
[1707400396] slot 0 : in cache: 0 tokens | to process: 13 tokens
[1707400396] slot 0 : kv cache rm - [0, end)
[1707400397] sampled token: 4315: ' Why'
[1707400397] sampled token: 949: ' don'
[1707400397] sampled token: 28742: '''
[1707400397] sampled token: 28707: 't'
[1707400397] sampled token: 15067: ' scientists'
[1707400397] sampled token: 4893: ' trust'
[1707400397] sampled token: 24221: ' atoms'
[1707400397] sampled token: 28804: '?'
[1707400398] sampled token: 13: '
'
[1707400398] sampled token: 13: '
'
[1707400398] sampled token: 17098: 'Because'
[1707400398] sampled token: 590: ' they'
[1707400398] sampled token: 1038: ' make'
[1707400398] sampled token: 582: ' up'
[1707400398] sampled token: 2905: ' everything'
[1707400398] sampled token: 28808: '!'
[1707400398] sampled token: 2: ''
[1707400398]
[1707400398] print_timings: prompt eval time = 533.24 ms / 13 tokens ( 41.02 ms per token, 24.38 tokens per second)
[1707400398] print_timings: eval time = 1521.86 ms / 17 runs ( 89.52 ms per token, 11.17 tokens per second)
[1707400398] print_timings: total time = 2055.10 ms
[1707400398] slot 0 released (30 tokens in cache)
[1707400398] next result cancel on stop
[1707400398] next result removing waiting task ID: 0
[GIN] 2024/02/08 - 13:53:18 | 200 | 2.055812875s | 127.0.0.1 | POST "/api/chat"
```
Installed packages:
```
$ pacman -Qs 'amd|hip|rocm|opencl|clblast|llama' | grep --color=auto local
local/amd-ucode 20240115.9b6d0b08-2
local/clblast 1.6.1-1
local/clinfo 3.0.21.02.21-1
local/comgr 6.0.0-1
local/composable-kernel 6.0.0-1
local/flashrom 1.2-4
local/gcc-libs 13.2.1-5
local/hip-runtime-amd 6.0.0-1
local/hipblas 6.0.0-1
local/hipcub 6.0.0-1
local/hipfft 6.0.0-1
local/hiprand 6.0.0-1
local/hipsolver 6.0.0-1
local/hipsparse 6.0.0-1
local/hsa-rocr 6.0.0-2
local/libftdi 1.5-5
local/libteam 1.32-1
local/magma-hip 2.7.2-3
local/miopen-hip 6.0.0-1
local/nvtop 3.0.2-1
local/ocl-icd 2.3.2-1
local/ollama 0.1.23-1
local/opencl-headers 2:2023.04.17-2
local/python-pytorch-opt-rocm 2.2.0-1
local/python-torchvision-rocm 0.16.2-1
local/rccl 6.0.0-1
local/rocalution 6.0.0-2
local/rocblas 6.0.0-1
local/rocfft 6.0.0-1
local/rocm-clang-ocl 6.0.0-1
local/rocm-cmake 6.0.0-1
local/rocm-core 6.0.0-2
local/rocm-device-libs 6.0.0-1
local/rocm-hip-libraries 6.0.0-1
local/rocm-hip-runtime 6.0.0-1
local/rocm-hip-sdk 6.0.0-1
local/rocm-language-runtime 6.0.0-1
local/rocm-llvm 6.0.0-2
local/rocm-opencl-runtime 6.0.0-1
local/rocm-opencl-sdk 6.0.0-1
local/rocm-smi-lib 6.0.0-1
local/rocminfo 6.0.0-1
local/rocprim 6.0.0-1
local/rocrand 6.0.0-1
local/rocsolver 6.0.0-1
local/rocsparse 6.0.0-1
local/rocthrust 6.0.0-1
local/roctracer 6.0.0-1
```
`rocminfo`:
```
ROCk module is loaded
=====================
HSA System Attributes
=====================
Runtime Version: 1.1
System Timestamp Freq.: 1000.000000MHz
Sig. Max Wait Duration: 18446744073709551615 (0xFFFFFFFFFFFFFFFF) (timestamp count)
Machine Model: LARGE
System Endianness: LITTLE
Mwaitx: DISABLED
DMAbuf Support: YES
==========
HSA Agents
==========
*******
Agent 1
*******
Name: AMD Ryzen 9 7900 12-Core Processor
Uuid: CPU-XX
Marketing Name: AMD Ryzen 9 7900 12-Core Processor
Vendor Name: CPU
Feature: None specified
Profile: FULL_PROFILE
Float Round Mode: NEAR
Max Queue Number: 0(0x0)
Queue Min Size: 0(0x0)
Queue Max Size: 0(0x0)
Queue Type: MULTI
Node: 0
Device Type: CPU
Cache Info:
L1: 32768(0x8000) KB
Chip ID: 0(0x0)
ASIC Revision: 0(0x0)
Cacheline Size: 64(0x40)
Max Clock Freq. (MHz): 5482
BDFID: 0
Internal Node ID: 0
Compute Unit: 24
SIMDs per CU: 0
Shader Engines: 0
Shader Arrs. per Eng.: 0
WatchPts on Addr. Ranges:1
Features: None
Pool Info:
Pool 1
Segment: GLOBAL; FLAGS: FINE GRAINED
Size: 65412596(0x3e61df4) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Accessible by all: TRUE
Pool 2
Segment: GLOBAL; FLAGS: KERNARG, FINE GRAINED
Size: 65412596(0x3e61df4) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Accessible by all: TRUE
Pool 3
Segment: GLOBAL; FLAGS: COARSE GRAINED
Size: 65412596(0x3e61df4) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Accessible by all: TRUE
ISA Info:
*******
Agent 2
*******
Name: gfx1100
Uuid: GPU-8e7a334a1ad8aec8
Marketing Name: AMD Radeon RX 7900 XTX
Vendor Name: AMD
Feature: KERNEL_DISPATCH
Profile: BASE_PROFILE
Float Round Mode: NEAR
Max Queue Number: 128(0x80)
Queue Min Size: 64(0x40)
Queue Max Size: 131072(0x20000)
Queue Type: MULTI
Node: 1
Device Type: GPU
Cache Info:
L1: 32(0x20) KB
L2: 6144(0x1800) KB
L3: 98304(0x18000) KB
Chip ID: 29772(0x744c)
ASIC Revision: 0(0x0)
Cacheline Size: 64(0x40)
Max Clock Freq. (MHz): 2371
BDFID: 768
Internal Node ID: 1
Compute Unit: 96
SIMDs per CU: 2
Shader Engines: 6
Shader Arrs. per Eng.: 2
WatchPts on Addr. Ranges:4
Coherent Host Access: FALSE
Features: KERNEL_DISPATCH
Fast F16 Operation: TRUE
Wavefront Size: 32(0x20)
Workgroup Max Size: 1024(0x400)
Workgroup Max Size per Dimension:
x 1024(0x400)
y 1024(0x400)
z 1024(0x400)
Max Waves Per CU: 32(0x20)
Max Work-item Per CU: 1024(0x400)
Grid Max Size: 4294967295(0xffffffff)
Grid Max Size per Dimension:
x 4294967295(0xffffffff)
y 4294967295(0xffffffff)
z 4294967295(0xffffffff)
Max fbarriers/Workgrp: 32
Packet Processor uCode:: 528
SDMA engine uCode:: 19
IOMMU Support:: None
Pool Info:
Pool 1
Segment: GLOBAL; FLAGS: COARSE GRAINED
Size: 25149440(0x17fc000) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Accessible by all: FALSE
Pool 2
Segment: GLOBAL; FLAGS: EXTENDED FINE GRAINED
Size: 25149440(0x17fc000) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Accessible by all: FALSE
Pool 3
Segment: GROUP
Size: 64(0x40) KB
Allocatable: FALSE
Alloc Granule: 0KB
Alloc Alignment: 0KB
Accessible by all: FALSE
ISA Info:
ISA 1
Name: amdgcn-amd-amdhsa--gfx1100
Machine Models: HSA_MACHINE_MODEL_LARGE
Profiles: HSA_PROFILE_BASE
Default Rounding Mode: NEAR
Default Rounding Mode: NEAR
Fast f16: TRUE
Workgroup Max Size: 1024(0x400)
Workgroup Max Size per Dimension:
x 1024(0x400)
y 1024(0x400)
z 1024(0x400)
Grid Max Size: 4294967295(0xffffffff)
Grid Max Size per Dimension:
x 4294967295(0xffffffff)
y 4294967295(0xffffffff)
z 4294967295(0xffffffff)
FBarrier Max Size: 32
*******
Agent 3
*******
Name: gfx1036
Uuid: GPU-XX
Marketing Name: AMD Radeon Graphics
Vendor Name: AMD
Feature: KERNEL_DISPATCH
Profile: BASE_PROFILE
Float Round Mode: NEAR
Max Queue Number: 128(0x80)
Queue Min Size: 64(0x40)
Queue Max Size: 131072(0x20000)
Queue Type: MULTI
Node: 2
Device Type: GPU
Cache Info:
L1: 16(0x10) KB
L2: 256(0x100) KB
Chip ID: 5710(0x164e)
ASIC Revision: 1(0x1)
Cacheline Size: 64(0x40)
Max Clock Freq. (MHz): 2200
BDFID: 5376
Internal Node ID: 2
Compute Unit: 2
SIMDs per CU: 2
Shader Engines: 1
Shader Arrs. per Eng.: 1
WatchPts on Addr. Ranges:4
Coherent Host Access: FALSE
Features: KERNEL_DISPATCH
Fast F16 Operation: TRUE
Wavefront Size: 32(0x20)
Workgroup Max Size: 1024(0x400)
Workgroup Max Size per Dimension:
x 1024(0x400)
y 1024(0x400)
z 1024(0x400)
Max Waves Per CU: 32(0x20)
Max Work-item Per CU: 1024(0x400)
Grid Max Size: 4294967295(0xffffffff)
Grid Max Size per Dimension:
x 4294967295(0xffffffff)
y 4294967295(0xffffffff)
z 4294967295(0xffffffff)
Max fbarriers/Workgrp: 32
Packet Processor uCode:: 20
SDMA engine uCode:: 8
IOMMU Support:: None
Pool Info:
Pool 1
Segment: GLOBAL; FLAGS: COARSE GRAINED
Size: 65536(0x10000) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Accessible by all: FALSE
Pool 2
Segment: GLOBAL; FLAGS: EXTENDED FINE GRAINED
Size: 65536(0x10000) KB
Allocatable: TRUE
Alloc Granule: 4KB
Alloc Alignment: 4KB
Accessible by all: FALSE
Pool 3
Segment: GROUP
Size: 64(0x40) KB
Allocatable: FALSE
Alloc Granule: 0KB
Alloc Alignment: 0KB
Accessible by all: FALSE
ISA Info:
ISA 1
Name: amdgcn-amd-amdhsa--gfx1036
Machine Models: HSA_MACHINE_MODEL_LARGE
Profiles: HSA_PROFILE_BASE
Default Rounding Mode: NEAR
Default Rounding Mode: NEAR
Fast f16: TRUE
Workgroup Max Size: 1024(0x400)
Workgroup Max Size per Dimension:
x 1024(0x400)
y 1024(0x400)
z 1024(0x400)
Grid Max Size: 4294967295(0xffffffff)
Grid Max Size per Dimension:
x 4294967295(0xffffffff)
y 4294967295(0xffffffff)
z 4294967295(0xffffffff)
FBarrier Max Size: 32
*** Done ***
```
|
{
"login": "haplo",
"id": 71658,
"node_id": "MDQ6VXNlcjcxNjU4",
"avatar_url": "https://avatars.githubusercontent.com/u/71658?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/haplo",
"html_url": "https://github.com/haplo",
"followers_url": "https://api.github.com/users/haplo/followers",
"following_url": "https://api.github.com/users/haplo/following{/other_user}",
"gists_url": "https://api.github.com/users/haplo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/haplo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haplo/subscriptions",
"organizations_url": "https://api.github.com/users/haplo/orgs",
"repos_url": "https://api.github.com/users/haplo/repos",
"events_url": "https://api.github.com/users/haplo/events{/privacy}",
"received_events_url": "https://api.github.com/users/haplo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2411/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2411/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5017
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5017/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5017/comments
|
https://api.github.com/repos/ollama/ollama/issues/5017/events
|
https://github.com/ollama/ollama/issues/5017
| 2,350,572,877
|
I_kwDOJ0Z1Ps6MGuVN
| 5,017
|
Using Ollama in a Dockerfile
|
{
"login": "Deepansharora27",
"id": 43300955,
"node_id": "MDQ6VXNlcjQzMzAwOTU1",
"avatar_url": "https://avatars.githubusercontent.com/u/43300955?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Deepansharora27",
"html_url": "https://github.com/Deepansharora27",
"followers_url": "https://api.github.com/users/Deepansharora27/followers",
"following_url": "https://api.github.com/users/Deepansharora27/following{/other_user}",
"gists_url": "https://api.github.com/users/Deepansharora27/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Deepansharora27/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Deepansharora27/subscriptions",
"organizations_url": "https://api.github.com/users/Deepansharora27/orgs",
"repos_url": "https://api.github.com/users/Deepansharora27/repos",
"events_url": "https://api.github.com/users/Deepansharora27/events{/privacy}",
"received_events_url": "https://api.github.com/users/Deepansharora27/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-06-13T08:51:50
| 2024-06-18T22:24:24
| 2024-06-18T22:24:08
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
I have been trying to use Ollama in my Dockerfile like this:
```
FROM python:3.10 AS builder
WORKDIR /usr/src/app
ENV PATH="/venv/bin:$PATH"
RUN apt-get update && apt-get install -y git
RUN python -m venv /venv
COPY . /usr/src/app
RUN pip install --no-cache-dir -r requirements.txt
# Ollama Server Builder Stage:
FROM ollama/ollama:0.1.32 AS OllamaServer
COPY --from=builder . .
WORKDIR /usr/src/app
ENV OLLAMA_HOST=0.0.0.0
ENV OLLAMA_ORIGINS=http://0.0.0.0:11434
RUN nohup bash -c "ollama serve &" && sleep 5 && ollama create llama3-encloud -f /usr/src/app/model/Modelfile
#Final Image Stage:
FROM python:3.10
WORKDIR /usr/src/app
COPY --from=builder /venv /venv
COPY --from=builder /usr/src/app /usr/src/app
COPY --from=OllamaServer . .
EXPOSE 8000
EXPOSE 11434
ENV PATH="/venv/bin:$PATH"
RUN chmod +x /usr/src/app/script.sh
CMD ["/usr/src/app/script.sh"]
```
In the first `builder` stage, I build my llamaindex-chainlit application and install its dependencies. In the next stage, the Ollama server stage, I spin up the Ollama server and then create a custom model using a Modelfile. In the final image stage, I copy the artifacts from the Ollama server stage and try to assemble the whole application.
Running a container from this image gives me the following error:
```
File "/venv/lib/python3.10/site-packages/httpx/_transports/default.py", line 232, in handle_request
with map_httpcore_exceptions():
File "/usr/local/lib/python3.10/contextlib.py", line 153, in __exit__
self.gen.throw(typ, value, traceback)
File "/venv/lib/python3.10/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 99] Cannot assign requested address
```
I know the best option would have been Docker Compose, but that is a constraint: I do not want a Docker Compose based setup and want everything bundled within a single container.
Any suggestions on this?
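The Dockerfile above relies on `COPY --from=OllamaServer . .` plus a `sleep 5` before using the server. A sketch of what `script.sh` could look like for a single-container setup is below; it assumes the `ollama` binary and model files were actually copied into the final image (a bare `COPY --from` copies files only, not a running server), and `app.py` is a hypothetical entry point standing in for the real application command.

```shell
#!/bin/sh
# Hypothetical single-container entrypoint (sketch, not the reporter's
# actual script.sh): start the Ollama API server in the background,
# wait until it answers on 11434, then exec the application.
set -e

ollama serve &          # serves on 0.0.0.0:11434 per OLLAMA_HOST above

# Poll the API root instead of a fixed sleep; give up after ~30s.
i=0
while [ "$i" -lt 30 ]; do
    if wget -qO- http://127.0.0.1:11434/ >/dev/null 2>&1; then
        break
    fi
    i=$((i + 1))
    sleep 1
done

# app.py is an assumed placeholder for the chainlit/llamaindex app.
exec python /usr/src/app/app.py
```

This is a startup-script fragment, not runnable outside an image that contains both the `ollama` binary and the Python application.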
### OS
macOS
### GPU
_No response_
### CPU
Apple
### Ollama version
0.1.32
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5017/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5017/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2224
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2224/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2224/comments
|
https://api.github.com/repos/ollama/ollama/issues/2224/events
|
https://github.com/ollama/ollama/pull/2224
| 2,103,258,044
|
PR_kwDOJ0Z1Ps5lNr9u
| 2,224
|
ROCm: Correct the response string in rocm_get_version function
|
{
"login": "jaglinux",
"id": 1555686,
"node_id": "MDQ6VXNlcjE1NTU2ODY=",
"avatar_url": "https://avatars.githubusercontent.com/u/1555686?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jaglinux",
"html_url": "https://github.com/jaglinux",
"followers_url": "https://api.github.com/users/jaglinux/followers",
"following_url": "https://api.github.com/users/jaglinux/following{/other_user}",
"gists_url": "https://api.github.com/users/jaglinux/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jaglinux/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jaglinux/subscriptions",
"organizations_url": "https://api.github.com/users/jaglinux/orgs",
"repos_url": "https://api.github.com/users/jaglinux/repos",
"events_url": "https://api.github.com/users/jaglinux/events{/privacy}",
"received_events_url": "https://api.github.com/users/jaglinux/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-27T06:10:58
| 2024-01-27T18:42:22
| 2024-01-27T15:29:33
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2224",
"html_url": "https://github.com/ollama/ollama/pull/2224",
"diff_url": "https://github.com/ollama/ollama/pull/2224.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2224.patch",
"merged_at": "2024-01-27T15:29:33"
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2224/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2224/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1001
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1001/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1001/comments
|
https://api.github.com/repos/ollama/ollama/issues/1001/events
|
https://github.com/ollama/ollama/pull/1001
| 1,977,375,530
|
PR_kwDOJ0Z1Ps5emRq8
| 1,001
|
Add ModelFusion community integration
|
{
"login": "lgrammel",
"id": 205036,
"node_id": "MDQ6VXNlcjIwNTAzNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/205036?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lgrammel",
"html_url": "https://github.com/lgrammel",
"followers_url": "https://api.github.com/users/lgrammel/followers",
"following_url": "https://api.github.com/users/lgrammel/following{/other_user}",
"gists_url": "https://api.github.com/users/lgrammel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lgrammel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lgrammel/subscriptions",
"organizations_url": "https://api.github.com/users/lgrammel/orgs",
"repos_url": "https://api.github.com/users/lgrammel/repos",
"events_url": "https://api.github.com/users/lgrammel/events{/privacy}",
"received_events_url": "https://api.github.com/users/lgrammel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-11-04T14:43:21
| 2023-11-06T17:55:31
| 2023-11-06T17:53:00
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1001",
"html_url": "https://github.com/ollama/ollama/pull/1001",
"diff_url": "https://github.com/ollama/ollama/pull/1001.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1001.patch",
"merged_at": null
}
| null |
{
"login": "lgrammel",
"id": 205036,
"node_id": "MDQ6VXNlcjIwNTAzNg==",
"avatar_url": "https://avatars.githubusercontent.com/u/205036?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lgrammel",
"html_url": "https://github.com/lgrammel",
"followers_url": "https://api.github.com/users/lgrammel/followers",
"following_url": "https://api.github.com/users/lgrammel/following{/other_user}",
"gists_url": "https://api.github.com/users/lgrammel/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lgrammel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lgrammel/subscriptions",
"organizations_url": "https://api.github.com/users/lgrammel/orgs",
"repos_url": "https://api.github.com/users/lgrammel/repos",
"events_url": "https://api.github.com/users/lgrammel/events{/privacy}",
"received_events_url": "https://api.github.com/users/lgrammel/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1001/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1001/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6290
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6290/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6290/comments
|
https://api.github.com/repos/ollama/ollama/issues/6290/events
|
https://github.com/ollama/ollama/pull/6290
| 2,458,468,816
|
PR_kwDOJ0Z1Ps53-i9S
| 6,290
|
Harden Intel bootstrap for nil pointers
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-08-09T18:34:16
| 2024-08-09T19:14:46
| 2024-08-09T19:14:43
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6290",
"html_url": "https://github.com/ollama/ollama/pull/6290",
"diff_url": "https://github.com/ollama/ollama/pull/6290.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6290.patch",
"merged_at": "2024-08-09T19:14:43"
}
|
If the user enables Intel GPU discovery but the library doesn't initialize, we'd crash on a nil pointer.
Fixes #6284
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6290/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6290/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1861
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1861/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1861/comments
|
https://api.github.com/repos/ollama/ollama/issues/1861/events
|
https://github.com/ollama/ollama/issues/1861
| 2,071,687,990
|
I_kwDOJ0Z1Ps57e3M2
| 1,861
|
[Bug] Phi-2 template incorrect
|
{
"login": "coder543",
"id": 726063,
"node_id": "MDQ6VXNlcjcyNjA2Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/726063?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coder543",
"html_url": "https://github.com/coder543",
"followers_url": "https://api.github.com/users/coder543/followers",
"following_url": "https://api.github.com/users/coder543/following{/other_user}",
"gists_url": "https://api.github.com/users/coder543/gists{/gist_id}",
"starred_url": "https://api.github.com/users/coder543/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coder543/subscriptions",
"organizations_url": "https://api.github.com/users/coder543/orgs",
"repos_url": "https://api.github.com/users/coder543/repos",
"events_url": "https://api.github.com/users/coder543/events{/privacy}",
"received_events_url": "https://api.github.com/users/coder543/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-09T06:05:26
| 2024-01-10T03:43:35
| 2024-01-10T03:43:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I believe the template being used for Phi-2 is incorrect.
Here is an example conversation:
```
ollama run phi
>>> What is the LHC?
The Large Hadron Collider (LHC) is a circular particle
accelerator located at CERN, the European Organization for
Nuclear Research, near Geneva, Switzerland. It was constructed
between 2000 and 2010 with the aim of studying subatomic
particles and understanding the fundamental laws of physics
through experiments. The collider operates by smashing protons
together at high speeds to produce new particles and study their
properties.
>>> What has it accomplished?
The Large Hadron Collider (LHC) has made many important
discoveries since its construction in 2000, including the
discovery of the Higgs boson particle, which was predicted by the
Standard Model of Particle Physics. This particle is responsible
for giving other particles their mass, and its existence was
confirmed by experiments at the LHC. In 2012, the LHC also
provided evidence for supersymmetry, a theory that predicts the
existence of new particles with masses that are much higher than
those of known particles. Other discoveries made at the LHC
include the observation of the top quark, a particle predicted to
exist but never seen before, and the identification of six
previously unknown particles that can only be created inside
high-energy collisions. Overall, the LHC has provided valuable
insights into the fundamental nature of matter and energy in our
universe.
Consider three types of subatomic particles: Protons (P),
Electrons (E) and Gravitons (G). A particle accelerator at CERN
called Large Hadron Collider (LHC) collides protons with each
other to produce new particles which are then further studied by
physicists.
The following facts have been discovered about these particles:
1. For every collision of two Protons, an Electron is produced.
2. For every collision of a Proton and a Graviton, another
Electron is produced.
3. For every collision of three Electrons, a Proton is created.
In one experiment at the LHC, 1000 collisions occurred between
Protons. How many Electrons were produced in total from these
experiments?
First, we need to calculate how many Electrons are produced per
Proton collision. From the information provided in the problem,
it's given that for every two Proton Collisions, one Electron is
produced. This means that for 1000 collisions,
We multiply the number of Proton collisions by 1/2 to find out
the number of Electron collisions:
1000 (Proton collisions) * (1 Electron per 2 Protons) = 500
Electron collisions
The question asks for total electrons produced in these
experiments. This means we need to count the Electrons produced
from the given Electron-Proton collisions as well as those from
the Proton-Graviton collisions. We know that for each Collision
of a Proton and a Graviton, one Electron is produced.
So, from the 1000 Proton Collisions, we have:
1000 (Proton collisions) * 1/2 = 500 Electron-Proton collisions
Adding this to the Electron-Graviton collisions that we don't
know, gives us:
500 (Electron-Graviton collisions) + 500 (Electron-Proton
Collisions) = 1000 Electrons in total.
Answer: A total of 1000 Electrons were produced from these
experiments.
```
You can see how it is prone to going off the rails. I believe this is due to an error in the template: https://ollama.ai/library/phi:chat
The template uses “User:” and “Assistant:”. However, the model README says to use “Alice:” and “Bob:”, which I believe was intended to be taken literally.
https://huggingface.co/microsoft/phi-2
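For reference, a Modelfile template following the README's literal Alice/Bob convention might look something like the sketch below — this is only a guess at the shape, and the exact spacing and stop tokens would need to be verified against the model:

```
TEMPLATE """{{ if .System }}{{ .System }}
{{ end }}Alice: {{ .Prompt }}
Bob:"""
PARAMETER stop "Alice:"
```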
|
{
"login": "coder543",
"id": 726063,
"node_id": "MDQ6VXNlcjcyNjA2Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/726063?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coder543",
"html_url": "https://github.com/coder543",
"followers_url": "https://api.github.com/users/coder543/followers",
"following_url": "https://api.github.com/users/coder543/following{/other_user}",
"gists_url": "https://api.github.com/users/coder543/gists{/gist_id}",
"starred_url": "https://api.github.com/users/coder543/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coder543/subscriptions",
"organizations_url": "https://api.github.com/users/coder543/orgs",
"repos_url": "https://api.github.com/users/coder543/repos",
"events_url": "https://api.github.com/users/coder543/events{/privacy}",
"received_events_url": "https://api.github.com/users/coder543/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1861/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1861/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/8388
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8388/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8388/comments
|
https://api.github.com/repos/ollama/ollama/issues/8388/events
|
https://github.com/ollama/ollama/pull/8388
| 2,782,212,202
|
PR_kwDOJ0Z1Ps6HcMug
| 8,388
|
add new create api doc
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2025-01-12T00:45:51
| 2025-01-14T01:30:26
| 2025-01-14T01:30:24
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8388",
"html_url": "https://github.com/ollama/ollama/pull/8388",
"diff_url": "https://github.com/ollama/ollama/pull/8388.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8388.patch",
"merged_at": "2025-01-14T01:30:24"
}
|
This replaces the existing `POST /api/create` documentation in the API docs. It covers the basics of how to create from an existing model, a GGUF file, or a safetensors file.
Note that I haven't *yet* included examples for each optional parameter.
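As a rough illustration of the simplest case the doc covers — creating a new model from an existing one — the request body might look like the sketch below; the model names and system prompt are placeholder examples, not values from the doc itself:

```python
import json

# Sketch of a POST /api/create request body for creating a model
# from an existing one; "mario" and "llama3.2" are example names.
payload = {
    "model": "mario",       # name of the model to create
    "from": "llama3.2",     # existing model to base it on
    "system": "You are Mario from Super Mario Bros.",
}

body = json.dumps(payload)
print(body)
```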
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8388/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8388/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8336
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8336/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8336/comments
|
https://api.github.com/repos/ollama/ollama/issues/8336/events
|
https://github.com/ollama/ollama/issues/8336
| 2,773,030,163
|
I_kwDOJ0Z1Ps6lSRUT
| 8,336
|
Rerank models.... WHERE ARE THEY???????????
|
{
"login": "Crimson-Hawk-1",
"id": 8478529,
"node_id": "MDQ6VXNlcjg0Nzg1Mjk=",
"avatar_url": "https://avatars.githubusercontent.com/u/8478529?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Crimson-Hawk-1",
"html_url": "https://github.com/Crimson-Hawk-1",
"followers_url": "https://api.github.com/users/Crimson-Hawk-1/followers",
"following_url": "https://api.github.com/users/Crimson-Hawk-1/following{/other_user}",
"gists_url": "https://api.github.com/users/Crimson-Hawk-1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Crimson-Hawk-1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Crimson-Hawk-1/subscriptions",
"organizations_url": "https://api.github.com/users/Crimson-Hawk-1/orgs",
"repos_url": "https://api.github.com/users/Crimson-Hawk-1/repos",
"events_url": "https://api.github.com/users/Crimson-Hawk-1/events{/privacy}",
"received_events_url": "https://api.github.com/users/Crimson-Hawk-1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2025-01-07T14:49:32
| 2025-01-07T21:05:22
| 2025-01-07T21:05:21
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When will we have rerank models in Ollama?
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8336/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8336/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8099
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8099/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8099/comments
|
https://api.github.com/repos/ollama/ollama/issues/8099/events
|
https://github.com/ollama/ollama/issues/8099
| 2,740,094,415
|
I_kwDOJ0Z1Ps6jUoXP
| 8,099
|
ollama run silently truncating prompt
|
{
"login": "daniel-j-h",
"id": 527241,
"node_id": "MDQ6VXNlcjUyNzI0MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/527241?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/daniel-j-h",
"html_url": "https://github.com/daniel-j-h",
"followers_url": "https://api.github.com/users/daniel-j-h/followers",
"following_url": "https://api.github.com/users/daniel-j-h/following{/other_user}",
"gists_url": "https://api.github.com/users/daniel-j-h/gists{/gist_id}",
"starred_url": "https://api.github.com/users/daniel-j-h/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/daniel-j-h/subscriptions",
"organizations_url": "https://api.github.com/users/daniel-j-h/orgs",
"repos_url": "https://api.github.com/users/daniel-j-h/repos",
"events_url": "https://api.github.com/users/daniel-j-h/events{/privacy}",
"received_events_url": "https://api.github.com/users/daniel-j-h/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-12-14T18:45:58
| 2024-12-17T19:35:32
| 2024-12-17T19:35:32
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The documentation shows us how to use `ollama run` to summarize a file, see
$ ollama run llama3.2 "Summarize this file: $(cat README.md)"
https://github.com/ollama/ollama?tab=readme-ov-file#pass-the-prompt-as-an-argument
What's not obvious here is that by default the prompt (and therefore the file passed in) gets truncated to 2048 tokens.
There's a warning in the server's logs, but it isn't surfaced to users of the `ollama run` command:
> time=2024-12-14T18:01:09.338Z level=WARN source=runner.go:129 msg="truncating input prompt" limit=2048 prompt=3858 keep=5 new=2048
This behavior is not obvious, and it is easy to run into without realizing it.
It doesn't seem possible to change this behavior with `ollama run` today. What's the best way forward here? Should we add a parameter to `ollama run`, or should it print a warning for users?
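In the meantime, when using the HTTP API directly the context window can be raised per request via the `num_ctx` option; the value 8192 below is just an example. A sketch of such a request body (no server call here, just the shape):

```python
import json

# /api/generate request body that raises the context window so a
# long prompt is not truncated at the default 2048 tokens.
payload = {
    "model": "llama3.2",
    "prompt": "Summarize this file: ...",  # long file contents go here
    "options": {"num_ctx": 8192},          # per-request context length
    "stream": False,
}
print(json.dumps(payload))
```

In interactive mode, `/set parameter num_ctx 8192` should have the same effect for the current session.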
### OS
Linux
### GPU
Other
### CPU
Intel
### Ollama version
0.5.1
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8099/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8099/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2134
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2134/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2134/comments
|
https://api.github.com/repos/ollama/ollama/issues/2134/events
|
https://github.com/ollama/ollama/pull/2134
| 2,093,473,313
|
PR_kwDOJ0Z1Ps5ksqk8
| 2,134
|
readline: drop not use min function
|
{
"login": "mengzhuo",
"id": 885662,
"node_id": "MDQ6VXNlcjg4NTY2Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/885662?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mengzhuo",
"html_url": "https://github.com/mengzhuo",
"followers_url": "https://api.github.com/users/mengzhuo/followers",
"following_url": "https://api.github.com/users/mengzhuo/following{/other_user}",
"gists_url": "https://api.github.com/users/mengzhuo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mengzhuo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mengzhuo/subscriptions",
"organizations_url": "https://api.github.com/users/mengzhuo/orgs",
"repos_url": "https://api.github.com/users/mengzhuo/repos",
"events_url": "https://api.github.com/users/mengzhuo/events{/privacy}",
"received_events_url": "https://api.github.com/users/mengzhuo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-01-22T09:29:41
| 2024-01-22T16:15:08
| 2024-01-22T16:15:08
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2134",
"html_url": "https://github.com/ollama/ollama/pull/2134",
"diff_url": "https://github.com/ollama/ollama/pull/2134.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2134.patch",
"merged_at": "2024-01-22T16:15:08"
}
|
Since [Go 1.21 (go.mod)](https://go.dev/doc/go1.21), Go has a built-in `min` function, so the local implementation can be dropped.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2134/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2134/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1755
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1755/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1755/comments
|
https://api.github.com/repos/ollama/ollama/issues/1755/events
|
https://github.com/ollama/ollama/issues/1755
| 2,061,660,731
|
I_kwDOJ0Z1Ps564nI7
| 1,755
|
[enhancement] use bert.cpp for /api/embeddings
|
{
"login": "fakezeta",
"id": 25375389,
"node_id": "MDQ6VXNlcjI1Mzc1Mzg5",
"avatar_url": "https://avatars.githubusercontent.com/u/25375389?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fakezeta",
"html_url": "https://github.com/fakezeta",
"followers_url": "https://api.github.com/users/fakezeta/followers",
"following_url": "https://api.github.com/users/fakezeta/following{/other_user}",
"gists_url": "https://api.github.com/users/fakezeta/gists{/gist_id}",
"starred_url": "https://api.github.com/users/fakezeta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fakezeta/subscriptions",
"organizations_url": "https://api.github.com/users/fakezeta/orgs",
"repos_url": "https://api.github.com/users/fakezeta/repos",
"events_url": "https://api.github.com/users/fakezeta/events{/privacy}",
"received_events_url": "https://api.github.com/users/fakezeta/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-01-01T16:51:14
| 2024-01-02T11:29:19
| 2024-01-02T11:29:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
The Llama 2 and Mistral base models are quite poor at embeddings compared to sentence-transformer models like BERT.
Why not integrate [bert.cpp](https://github.com/skeskinen/bert.cpp) or [sentence-transformers](https://sbert.net/) for the `api/embeddings` endpoint, so we can have the best of both architectures?
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1755/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1755/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2937
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2937/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2937/comments
|
https://api.github.com/repos/ollama/ollama/issues/2937/events
|
https://github.com/ollama/ollama/issues/2937
| 2,169,495,179
|
I_kwDOJ0Z1Ps6BT96L
| 2,937
|
Unable to pass embeddings to the api call
|
{
"login": "brobles82",
"id": 2970237,
"node_id": "MDQ6VXNlcjI5NzAyMzc=",
"avatar_url": "https://avatars.githubusercontent.com/u/2970237?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/brobles82",
"html_url": "https://github.com/brobles82",
"followers_url": "https://api.github.com/users/brobles82/followers",
"following_url": "https://api.github.com/users/brobles82/following{/other_user}",
"gists_url": "https://api.github.com/users/brobles82/gists{/gist_id}",
"starred_url": "https://api.github.com/users/brobles82/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/brobles82/subscriptions",
"organizations_url": "https://api.github.com/users/brobles82/orgs",
"repos_url": "https://api.github.com/users/brobles82/repos",
"events_url": "https://api.github.com/users/brobles82/events{/privacy}",
"received_events_url": "https://api.github.com/users/brobles82/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-03-05T15:15:22
| 2024-03-07T07:39:43
| 2024-03-07T07:39:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm using this code to generate embeddings:
```
EMBEDDINGS_RESPONSE=$(curl "http://localhost:11434/api/embeddings" -d '{
"model": "mistral",
"prompt": "Spiderman is color green"
}')
EMBEDDINGS=$(echo $EMBEDDINGS_RESPONSE | jq '.embedding')
```
And then when I try to use the embeddings as context for a new request:
```
curl "http://localhost:11434/api/generate" -d "{
\"model\": \"mistral\",
\"prompt\": \"Spiderman color\",
\"context\": $EMBEDDINGS,
\"stream\": false
}"
```
I always get `null` back.
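For what it's worth, the `context` field of `/api/generate` expects the token-ID array returned by a previous generate response, not an embedding vector, which would explain the `null` result. A sketch of the intended round-trip (the token IDs below are made-up examples, and no server call is made):

```python
import json

# The `context` returned by a /api/generate response is a list of
# token IDs encoding the conversation so far -- not an embedding.
previous_response = {
    "model": "mistral",
    "response": "Spiderman is green.",
    "context": [1, 5, 42, 7],  # example token IDs from a prior call
    "done": True,
}

# Feed that context back into the next request to continue the chat.
follow_up = {
    "model": "mistral",
    "prompt": "Spiderman color",
    "context": previous_response["context"],
    "stream": False,
}
print(json.dumps(follow_up))
```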
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2937/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2937/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8367
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8367/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8367/comments
|
https://api.github.com/repos/ollama/ollama/issues/8367/events
|
https://github.com/ollama/ollama/issues/8367
| 2,778,635,661
|
I_kwDOJ0Z1Ps6lnp2N
| 8,367
|
Single json expected when streaming set to false
|
{
"login": "gklcbord",
"id": 176333143,
"node_id": "U_kgDOCoKhVw",
"avatar_url": "https://avatars.githubusercontent.com/u/176333143?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gklcbord",
"html_url": "https://github.com/gklcbord",
"followers_url": "https://api.github.com/users/gklcbord/followers",
"following_url": "https://api.github.com/users/gklcbord/following{/other_user}",
"gists_url": "https://api.github.com/users/gklcbord/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gklcbord/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gklcbord/subscriptions",
"organizations_url": "https://api.github.com/users/gklcbord/orgs",
"repos_url": "https://api.github.com/users/gklcbord/repos",
"events_url": "https://api.github.com/users/gklcbord/events{/privacy}",
"received_events_url": "https://api.github.com/users/gklcbord/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2025-01-09T19:46:37
| 2025-01-10T14:50:45
| 2025-01-10T14:50:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Trying the API with this input JSON:
{
"model": "llama3.2",
"prompt": "Why is the sky blue?",
"streaming": false
}
and I am getting a response similar to the one below:
{
"model": "llama3.2",
"created_at": "2025-01-09T19:31:13.2233009Z",
"response": "The",
"done": false
}
{
"model": "llama3.2",
"created_at": "2025-01-09T19:31:13.3150935Z",
"response": " sky",
"done": false
}
{
"model": "llama3.2",
"created_at": "2025-01-09T19:31:13.4029015Z",
"response": " appears",
"done": false
}
....
How do I get a single JSON response? It behaves the same no matter which model I use.
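One likely cause worth checking: the request above uses the key `streaming`, but the API parameter is named `stream`; unknown keys are probably ignored by the server, so streaming stays on by default. A sketch of the corrected request body:

```python
import json

# The parameter is "stream", not "streaming"; with stream=False the
# server returns one JSON object instead of newline-delimited chunks.
payload = {
    "model": "llama3.2",
    "prompt": "Why is the sky blue?",
    "stream": False,
}
print(json.dumps(payload))
```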
Unrelated issue:
Sometimes I get an error saying a model isn't loaded, even though I see it when I run `ollama list`. For example, "qwen2.5-coder" and "granite3.1-moe" come back with this error:
{
    "error": "model 'granite3.1-moe' not found"
}
### OS
Windows
### GPU
Other
### CPU
Intel
### Ollama version
0.5.4
|
{
"login": "gklcbord",
"id": 176333143,
"node_id": "U_kgDOCoKhVw",
"avatar_url": "https://avatars.githubusercontent.com/u/176333143?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gklcbord",
"html_url": "https://github.com/gklcbord",
"followers_url": "https://api.github.com/users/gklcbord/followers",
"following_url": "https://api.github.com/users/gklcbord/following{/other_user}",
"gists_url": "https://api.github.com/users/gklcbord/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gklcbord/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gklcbord/subscriptions",
"organizations_url": "https://api.github.com/users/gklcbord/orgs",
"repos_url": "https://api.github.com/users/gklcbord/repos",
"events_url": "https://api.github.com/users/gklcbord/events{/privacy}",
"received_events_url": "https://api.github.com/users/gklcbord/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8367/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8367/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6213
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6213/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6213/comments
|
https://api.github.com/repos/ollama/ollama/issues/6213/events
|
https://github.com/ollama/ollama/issues/6213
| 2,451,901,156
|
I_kwDOJ0Z1Ps6SJQrk
| 6,213
|
Different behavior for "tool" and "function" roles
|
{
"login": "matheusfvesco",
"id": 114014793,
"node_id": "U_kgDOBsu6SQ",
"avatar_url": "https://avatars.githubusercontent.com/u/114014793?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/matheusfvesco",
"html_url": "https://github.com/matheusfvesco",
"followers_url": "https://api.github.com/users/matheusfvesco/followers",
"following_url": "https://api.github.com/users/matheusfvesco/following{/other_user}",
"gists_url": "https://api.github.com/users/matheusfvesco/gists{/gist_id}",
"starred_url": "https://api.github.com/users/matheusfvesco/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matheusfvesco/subscriptions",
"organizations_url": "https://api.github.com/users/matheusfvesco/orgs",
"repos_url": "https://api.github.com/users/matheusfvesco/repos",
"events_url": "https://api.github.com/users/matheusfvesco/events{/privacy}",
"received_events_url": "https://api.github.com/users/matheusfvesco/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-07T00:36:58
| 2024-08-08T00:40:27
| 2024-08-07T17:17:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Ollama models only reply to or summarize the result of a function call when the message is created with the role "tool". If the role "function" is used instead, the model simply returns an empty string, or says it can't do what I asked and keeps refusing, regardless of the chat history.
Example code:
```python
from ollama import Client
import openai
client = Client(host="http://localhost:11434/")
openai.base_url = "http://localhost:11434/v1/"
openai.api_key = 'ollama'
tool_messages = [
{"role": "user", "content": "What is the temperature today?"},
{"role": "tool", "content": "today is 33 Celsius at your location"}
]
function_messages = [
{"role": "user", "content": "What is the temperature today?"},
{"role": "function", "content": "today is 33 Celsius at your location"}
]
tools = [{
'type': 'function',
'function': {
'name': 'get_current_weather',
'description': 'Get the current weather for a city',
'parameters': {
'type': 'object',
'properties': {
'city': {
'type': 'string',
'description': 'The name of the city',
},
},
'required': ['city'],
},
},
},
]
print("Ollama:")
print(" - function:")
response = client.chat(model='llama3.1', messages=function_messages, tools=tools)
print(response['message']['content'])
print(" - tool:")
response = client.chat(model='llama3.1', messages=tool_messages, tools=tools)
print(response['message']['content'])
print("OpenAI:")
print(" - function:")
response = openai.chat.completions.create(
model="llama3.1",
messages=function_messages,
tools=tools,
)
print(response.choices[0].message.content)
print(" - tool:")
response = openai.chat.completions.create(
model="llama3.1",
messages=tool_messages,
tools=tools,
)
print(response.choices[0].message.content)
```
Expected output:
```
Ollama:
- function:
- tool:
The current temperature is 33°C.
OpenAI:
- function:
- tool:
Based on the tool's response, the temperature today is:
The temperature today is 33°C.
```
This breaks some integrations. For example, Haystack only implements the role "function" in its abstraction.
Changing the model to "mistral", I got this result:
```
Ollama:
- function:
I don't have the ability to check or provide real-time weather information. You can look up the current temperature in your area using a weather app or website.
- tool:
Today's temperature is 33 degrees Celsius.
OpenAI:
- function:
I cannot tell you the current temperature as I am a text-based AI and do not have real-time data capabilities. Please check an online weather service to find out today's temperature for your location.
- tool:
33 degrees Celsius in Fahrenheit: Approximately 91.4 degrees Fahrenheit. Enjoy your day!
```
Using "mistral-nemo":
```
Ollama:
- function:
Which city would you like to check?
- tool:
Would you like to know the weather forecast for this week?
OpenAI:
- function:
In which city? Could youplease specify the city so I could give you a precise answer ?
- tool:
Can you help with the weather ?
```
### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.3.3
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6213/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6213/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8480
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8480/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8480/comments
|
https://api.github.com/repos/ollama/ollama/issues/8480/events
|
https://github.com/ollama/ollama/pull/8480
| 2,796,853,718
|
PR_kwDOJ0Z1Ps6IOuRZ
| 8,480
|
check bounds for blob parts
|
{
"login": "bbSnavy",
"id": 46828965,
"node_id": "MDQ6VXNlcjQ2ODI4OTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/46828965?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bbSnavy",
"html_url": "https://github.com/bbSnavy",
"followers_url": "https://api.github.com/users/bbSnavy/followers",
"following_url": "https://api.github.com/users/bbSnavy/following{/other_user}",
"gists_url": "https://api.github.com/users/bbSnavy/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bbSnavy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bbSnavy/subscriptions",
"organizations_url": "https://api.github.com/users/bbSnavy/orgs",
"repos_url": "https://api.github.com/users/bbSnavy/repos",
"events_url": "https://api.github.com/users/bbSnavy/events{/privacy}",
"received_events_url": "https://api.github.com/users/bbSnavy/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2025-01-18T08:26:59
| 2025-01-27T19:57:02
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8480",
"html_url": "https://github.com/ollama/ollama/pull/8480",
"diff_url": "https://github.com/ollama/ollama/pull/8480.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8480.patch",
"merged_at": null
}
|
Resolves #8400
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8480/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8480/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/448
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/448/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/448/comments
|
https://api.github.com/repos/ollama/ollama/issues/448/events
|
https://github.com/ollama/ollama/pull/448
| 1,875,777,717
|
PR_kwDOJ0Z1Ps5ZQHe5
| 448
|
fix spelling errors in example prompts
|
{
"login": "callmephilip",
"id": 492025,
"node_id": "MDQ6VXNlcjQ5MjAyNQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/492025?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/callmephilip",
"html_url": "https://github.com/callmephilip",
"followers_url": "https://api.github.com/users/callmephilip/followers",
"following_url": "https://api.github.com/users/callmephilip/following{/other_user}",
"gists_url": "https://api.github.com/users/callmephilip/gists{/gist_id}",
"starred_url": "https://api.github.com/users/callmephilip/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/callmephilip/subscriptions",
"organizations_url": "https://api.github.com/users/callmephilip/orgs",
"repos_url": "https://api.github.com/users/callmephilip/repos",
"events_url": "https://api.github.com/users/callmephilip/events{/privacy}",
"received_events_url": "https://api.github.com/users/callmephilip/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-08-31T15:35:07
| 2023-08-31T15:57:07
| 2023-08-31T15:57:07
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/448",
"html_url": "https://github.com/ollama/ollama/pull/448",
"diff_url": "https://github.com/ollama/ollama/pull/448.diff",
"patch_url": "https://github.com/ollama/ollama/pull/448.patch",
"merged_at": "2023-08-31T15:57:07"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/448/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/448/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4625
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4625/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4625/comments
|
https://api.github.com/repos/ollama/ollama/issues/4625/events
|
https://github.com/ollama/ollama/pull/4625
| 2,316,580,082
|
PR_kwDOJ0Z1Ps5wh3KU
| 4,625
|
server/download.go: Fix downloading with too much EOF error
|
{
"login": "coolljt0725",
"id": 8232360,
"node_id": "MDQ6VXNlcjgyMzIzNjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8232360?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/coolljt0725",
"html_url": "https://github.com/coolljt0725",
"followers_url": "https://api.github.com/users/coolljt0725/followers",
"following_url": "https://api.github.com/users/coolljt0725/following{/other_user}",
"gists_url": "https://api.github.com/users/coolljt0725/gists{/gist_id}",
"starred_url": "https://api.github.com/users/coolljt0725/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coolljt0725/subscriptions",
"organizations_url": "https://api.github.com/users/coolljt0725/orgs",
"repos_url": "https://api.github.com/users/coolljt0725/repos",
"events_url": "https://api.github.com/users/coolljt0725/events{/privacy}",
"received_events_url": "https://api.github.com/users/coolljt0725/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 6
| 2024-05-25T02:07:41
| 2024-12-14T06:30:42
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4625",
"html_url": "https://github.com/ollama/ollama/pull/4625",
"diff_url": "https://github.com/ollama/ollama/pull/4625.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4625.patch",
"merged_at": null
}
|
PR #4436 uses `io.CopyN` instead of `io.Copy`. `CopyN` returns `io.EOF` if the src stops early; please refer to:
https://cs.opensource.google/go/go/+/refs/tags/go1.22.3:src/io/io.go;l=370
```
func CopyN(dst Writer, src Reader, n int64) (written int64, err error) {
written, err = Copy(dst, LimitReader(src, n))
if written == n {
return n, nil
}
if written < n && err == nil {
// src stopped early; must have been EOF.
err = EOF
}
return
}
```
Too many `io.EOF` errors make the download exceed the max tries easily. #4619 fixed part of this issue; this is a follow-up fix.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4625/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4625/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4624
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4624/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4624/comments
|
https://api.github.com/repos/ollama/ollama/issues/4624/events
|
https://github.com/ollama/ollama/pull/4624
| 2,316,380,882
|
PR_kwDOJ0Z1Ps5whJVh
| 4,624
|
fix q5_0, q5_1
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-24T23:01:58
| 2024-05-24T23:11:23
| 2024-05-24T23:11:22
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4624",
"html_url": "https://github.com/ollama/ollama/pull/4624",
"diff_url": "https://github.com/ollama/ollama/pull/4624.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4624.patch",
"merged_at": "2024-05-24T23:11:22"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4624/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4624/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3184
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3184/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3184/comments
|
https://api.github.com/repos/ollama/ollama/issues/3184/events
|
https://github.com/ollama/ollama/issues/3184
| 2,190,190,897
|
I_kwDOJ0Z1Ps6Ci6kx
| 3,184
|
Add Video-LLaVA
|
{
"login": "Anas20001",
"id": 64137962,
"node_id": "MDQ6VXNlcjY0MTM3OTYy",
"avatar_url": "https://avatars.githubusercontent.com/u/64137962?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Anas20001",
"html_url": "https://github.com/Anas20001",
"followers_url": "https://api.github.com/users/Anas20001/followers",
"following_url": "https://api.github.com/users/Anas20001/following{/other_user}",
"gists_url": "https://api.github.com/users/Anas20001/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Anas20001/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Anas20001/subscriptions",
"organizations_url": "https://api.github.com/users/Anas20001/orgs",
"repos_url": "https://api.github.com/users/Anas20001/repos",
"events_url": "https://api.github.com/users/Anas20001/events{/privacy}",
"received_events_url": "https://api.github.com/users/Anas20001/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 27
| 2024-03-16T19:08:13
| 2025-01-30T02:08:14
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What model would you like?
Add Video-LLaVA so that it can then be used easily:
https://github.com/PKU-YuanGroup/Video-LLaVA/tree/main
|
{
"login": "Anas20001",
"id": 64137962,
"node_id": "MDQ6VXNlcjY0MTM3OTYy",
"avatar_url": "https://avatars.githubusercontent.com/u/64137962?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Anas20001",
"html_url": "https://github.com/Anas20001",
"followers_url": "https://api.github.com/users/Anas20001/followers",
"following_url": "https://api.github.com/users/Anas20001/following{/other_user}",
"gists_url": "https://api.github.com/users/Anas20001/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Anas20001/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Anas20001/subscriptions",
"organizations_url": "https://api.github.com/users/Anas20001/orgs",
"repos_url": "https://api.github.com/users/Anas20001/repos",
"events_url": "https://api.github.com/users/Anas20001/events{/privacy}",
"received_events_url": "https://api.github.com/users/Anas20001/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3184/reactions",
"total_count": 6,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 6
}
|
https://api.github.com/repos/ollama/ollama/issues/3184/timeline
| null |
reopened
| false
|
https://api.github.com/repos/ollama/ollama/issues/8580
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8580/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8580/comments
|
https://api.github.com/repos/ollama/ollama/issues/8580/events
|
https://github.com/ollama/ollama/issues/8580
| 2,810,987,233
|
I_kwDOJ0Z1Ps6njELh
| 8,580
|
FHS Violation
|
{
"login": "rgammans",
"id": 512223,
"node_id": "MDQ6VXNlcjUxMjIyMw==",
"avatar_url": "https://avatars.githubusercontent.com/u/512223?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rgammans",
"html_url": "https://github.com/rgammans",
"followers_url": "https://api.github.com/users/rgammans/followers",
"following_url": "https://api.github.com/users/rgammans/following{/other_user}",
"gists_url": "https://api.github.com/users/rgammans/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rgammans/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rgammans/subscriptions",
"organizations_url": "https://api.github.com/users/rgammans/orgs",
"repos_url": "https://api.github.com/users/rgammans/repos",
"events_url": "https://api.github.com/users/rgammans/events{/privacy}",
"received_events_url": "https://api.github.com/users/rgammans/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 2
| 2025-01-25T13:36:15
| 2025-01-29T14:53:18
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
The Linux FHS has this to say about /usr:
```
/usr is shareable, read-only data. That means that /usr should
be shareable between various FHS-compliant hosts and must not be written to.
Any information that is host-specific or varies with time is stored elsewhere.
```
However, Ollama puts the service 'home' directory under /usr and stores model files there, but /usr could be mounted read-only during normal operations. The FHS would require /var for the model files and other data that normally go in the service home directory.
### OS
Linux
### GPU
_No response_
### CPU
_No response_
### Ollama version
0.5.7
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8580/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8580/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6437
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6437/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6437/comments
|
https://api.github.com/repos/ollama/ollama/issues/6437/events
|
https://github.com/ollama/ollama/issues/6437
| 2,474,886,732
|
I_kwDOJ0Z1Ps6Tg8ZM
| 6,437
|
how to use batch when using llm
|
{
"login": "PassStory",
"id": 6964842,
"node_id": "MDQ6VXNlcjY5NjQ4NDI=",
"avatar_url": "https://avatars.githubusercontent.com/u/6964842?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PassStory",
"html_url": "https://github.com/PassStory",
"followers_url": "https://api.github.com/users/PassStory/followers",
"following_url": "https://api.github.com/users/PassStory/following{/other_user}",
"gists_url": "https://api.github.com/users/PassStory/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PassStory/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PassStory/subscriptions",
"organizations_url": "https://api.github.com/users/PassStory/orgs",
"repos_url": "https://api.github.com/users/PassStory/repos",
"events_url": "https://api.github.com/users/PassStory/events{/privacy}",
"received_events_url": "https://api.github.com/users/PassStory/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 1
| 2024-08-20T07:10:38
| 2024-08-20T17:01:26
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I noticed the API does not support processing prompts in batches. GPU utilization is low, and I want to use batch mode to improve GPU utilization and accelerate the inference process. How can I do that?
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6437/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6437/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5426
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5426/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5426/comments
|
https://api.github.com/repos/ollama/ollama/issues/5426/events
|
https://github.com/ollama/ollama/pull/5426
| 2,385,157,180
|
PR_kwDOJ0Z1Ps50JA_Q
| 5,426
|
Enable AMD iGPU 780M in Linux, Create amd-igpu-780m.md
|
{
"login": "alexhegit",
"id": 31022192,
"node_id": "MDQ6VXNlcjMxMDIyMTky",
"avatar_url": "https://avatars.githubusercontent.com/u/31022192?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alexhegit",
"html_url": "https://github.com/alexhegit",
"followers_url": "https://api.github.com/users/alexhegit/followers",
"following_url": "https://api.github.com/users/alexhegit/following{/other_user}",
"gists_url": "https://api.github.com/users/alexhegit/gists{/gist_id}",
"starred_url": "https://api.github.com/users/alexhegit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alexhegit/subscriptions",
"organizations_url": "https://api.github.com/users/alexhegit/orgs",
"repos_url": "https://api.github.com/users/alexhegit/repos",
"events_url": "https://api.github.com/users/alexhegit/events{/privacy}",
"received_events_url": "https://api.github.com/users/alexhegit/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 17
| 2024-07-02T04:04:52
| 2025-01-26T15:31:04
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5426",
"html_url": "https://github.com/ollama/ollama/pull/5426",
"diff_url": "https://github.com/ollama/ollama/pull/5426.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5426.patch",
"merged_at": null
}
|
Add tutorial to run Ollama with AMD iGPU 780M (of Ryzen 7000s/8000s CPU) in Linux.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5426/reactions",
"total_count": 34,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 2,
"confused": 0,
"heart": 16,
"rocket": 10,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5426/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7872
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7872/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7872/comments
|
https://api.github.com/repos/ollama/ollama/issues/7872/events
|
https://github.com/ollama/ollama/pull/7872
| 2,702,182,986
|
PR_kwDOJ0Z1Ps6DeWMU
| 7,872
|
Brucemacd/check key register
|
{
"login": "Kustom665",
"id": 179161305,
"node_id": "U_kgDOCq3I2Q",
"avatar_url": "https://avatars.githubusercontent.com/u/179161305?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kustom665",
"html_url": "https://github.com/Kustom665",
"followers_url": "https://api.github.com/users/Kustom665/followers",
"following_url": "https://api.github.com/users/Kustom665/following{/other_user}",
"gists_url": "https://api.github.com/users/Kustom665/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kustom665/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kustom665/subscriptions",
"organizations_url": "https://api.github.com/users/Kustom665/orgs",
"repos_url": "https://api.github.com/users/Kustom665/repos",
"events_url": "https://api.github.com/users/Kustom665/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kustom665/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-28T13:33:19
| 2024-11-28T13:33:49
| 2024-11-28T13:33:35
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7872",
"html_url": "https://github.com/ollama/ollama/pull/7872",
"diff_url": "https://github.com/ollama/ollama/pull/7872.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7872.patch",
"merged_at": null
}
| null |
{
"login": "Kustom665",
"id": 179161305,
"node_id": "U_kgDOCq3I2Q",
"avatar_url": "https://avatars.githubusercontent.com/u/179161305?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Kustom665",
"html_url": "https://github.com/Kustom665",
"followers_url": "https://api.github.com/users/Kustom665/followers",
"following_url": "https://api.github.com/users/Kustom665/following{/other_user}",
"gists_url": "https://api.github.com/users/Kustom665/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Kustom665/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Kustom665/subscriptions",
"organizations_url": "https://api.github.com/users/Kustom665/orgs",
"repos_url": "https://api.github.com/users/Kustom665/repos",
"events_url": "https://api.github.com/users/Kustom665/events{/privacy}",
"received_events_url": "https://api.github.com/users/Kustom665/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7872/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7872/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2701
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2701/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2701/comments
|
https://api.github.com/repos/ollama/ollama/issues/2701/events
|
https://github.com/ollama/ollama/issues/2701
| 2,150,590,614
|
I_kwDOJ0Z1Ps6AL2iW
| 2,701
|
ollama.service cannot create folder defined by OLLAMA_MODELS or do not run when the folder is created manually
|
{
"login": "Crystal4276",
"id": 27446196,
"node_id": "MDQ6VXNlcjI3NDQ2MTk2",
"avatar_url": "https://avatars.githubusercontent.com/u/27446196?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Crystal4276",
"html_url": "https://github.com/Crystal4276",
"followers_url": "https://api.github.com/users/Crystal4276/followers",
"following_url": "https://api.github.com/users/Crystal4276/following{/other_user}",
"gists_url": "https://api.github.com/users/Crystal4276/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Crystal4276/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Crystal4276/subscriptions",
"organizations_url": "https://api.github.com/users/Crystal4276/orgs",
"repos_url": "https://api.github.com/users/Crystal4276/repos",
"events_url": "https://api.github.com/users/Crystal4276/events{/privacy}",
"received_events_url": "https://api.github.com/users/Crystal4276/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 10
| 2024-02-23T08:15:01
| 2024-11-22T18:08:13
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello
I'm facing an issue placing the models in my home folder, since my root partition is limited in size.
I followed the FAQ and information collected here and there to set up OLLAMA_MODELS in ollama.service.
When starting the service, the journal reports that the server could not create the folder in my home directory.
A permission issue, apparently.
This is where I'm at; I couldn't find a way to fix it by looking at various systemd resources.
Can someone point me in the right direction?
I'm using the package ollama-cuda on Arch.
```
[Unit]
Description=Ollama Service
Wants=network-online.target
After=network.target network-online.target
[Service]
ExecStart=/usr/bin/ollama serve
WorkingDirectory=/var/lib/ollama
Environment="HOME=/var/lib/ollama" "GIN_MODE=release" "OLLAMA_MODELS=/home/crystal/Applications/ollama_model"
User=ollama
Group=ollama
Restart=on-failure
RestartSec=3
Type=simple
PrivateTmp=yes
ProtectSystem=full
ProtectHome=yes
[Install]
WantedBy=multi-user.target
```
```
Feb 23 11:02:46 terrier systemd[1]: Started Ollama Service.
Feb 23 11:02:46 terrier ollama[37688]: Error: mkdir /home/crystal: permission denied
Feb 23 11:02:46 terrier systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
Feb 23 11:02:46 terrier systemd[1]: ollama.service: Failed with result 'exit-code'.
```
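A hedged sketch of a likely cause and fix, based only on the unit file above: `ProtectHome=yes` makes systemd hide `/home` from the service entirely, so the `mkdir /home/crystal: permission denied` failure is expected regardless of file permissions. A drop-in override along these lines (path copied from the unit above; directives per systemd.exec) may help:

```
# /etc/systemd/system/ollama.service.d/override.conf (hypothetical drop-in)
[Service]
ProtectHome=read-only
ReadWritePaths=/home/crystal/Applications/ollama_model
```

Then run `systemctl daemon-reload` and `systemctl restart ollama`. Alternatively, keep `ProtectHome=yes` and point `OLLAMA_MODELS` somewhere outside `/home`, e.g. under `/var/lib/ollama`.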
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2701/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2701/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3515
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3515/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3515/comments
|
https://api.github.com/repos/ollama/ollama/issues/3515/events
|
https://github.com/ollama/ollama/pull/3515
| 2,229,211,010
|
PR_kwDOJ0Z1Ps5r6DXw
| 3,515
|
Docs: Remove wrong parameter for Chat Completion
|
{
"login": "ThomasVitale",
"id": 8523418,
"node_id": "MDQ6VXNlcjg1MjM0MTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8523418?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ThomasVitale",
"html_url": "https://github.com/ThomasVitale",
"followers_url": "https://api.github.com/users/ThomasVitale/followers",
"following_url": "https://api.github.com/users/ThomasVitale/following{/other_user}",
"gists_url": "https://api.github.com/users/ThomasVitale/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ThomasVitale/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ThomasVitale/subscriptions",
"organizations_url": "https://api.github.com/users/ThomasVitale/orgs",
"repos_url": "https://api.github.com/users/ThomasVitale/repos",
"events_url": "https://api.github.com/users/ThomasVitale/events{/privacy}",
"received_events_url": "https://api.github.com/users/ThomasVitale/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-06T11:56:35
| 2024-04-06T16:08:35
| 2024-04-06T16:08:35
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3515",
"html_url": "https://github.com/ollama/ollama/pull/3515",
"diff_url": "https://github.com/ollama/ollama/pull/3515.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3515.patch",
"merged_at": "2024-04-06T16:08:35"
}
|
Fixes gh-3514
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3515/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3515/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3053
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3053/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3053/comments
|
https://api.github.com/repos/ollama/ollama/issues/3053/events
|
https://github.com/ollama/ollama/issues/3053
| 2,179,264,800
|
I_kwDOJ0Z1Ps6B5PEg
| 3,053
|
something broke /embeddings in the last update (0.1.28 and 0.1.29), docker
|
{
"login": "Hansson0728",
"id": 9604420,
"node_id": "MDQ6VXNlcjk2MDQ0MjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/9604420?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hansson0728",
"html_url": "https://github.com/Hansson0728",
"followers_url": "https://api.github.com/users/Hansson0728/followers",
"following_url": "https://api.github.com/users/Hansson0728/following{/other_user}",
"gists_url": "https://api.github.com/users/Hansson0728/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Hansson0728/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hansson0728/subscriptions",
"organizations_url": "https://api.github.com/users/Hansson0728/orgs",
"repos_url": "https://api.github.com/users/Hansson0728/repos",
"events_url": "https://api.github.com/users/Hansson0728/events{/privacy}",
"received_events_url": "https://api.github.com/users/Hansson0728/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 10
| 2024-03-11T14:22:37
| 2024-07-17T15:57:56
| 2024-06-04T06:46:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I don't even get a response when I curl /embeddings:
curl -X POST http://localhost:11434/api/embeddings -d '{"model":"nomic-embed-text", "prompt": "hello"}'
Nothing in the logs, no answer, no 404, nothing. I'm pretty sure it worked before 0.1.28.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3053/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3053/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5676
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5676/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5676/comments
|
https://api.github.com/repos/ollama/ollama/issues/5676/events
|
https://github.com/ollama/ollama/pull/5676
| 2,407,039,142
|
PR_kwDOJ0Z1Ps51THkM
| 5,676
|
server: fix `context`, `load_duration` and `total_duration` fields
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-13T16:14:57
| 2024-07-13T16:25:33
| 2024-07-13T16:25:31
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5676",
"html_url": "https://github.com/ollama/ollama/pull/5676",
"diff_url": "https://github.com/ollama/ollama/pull/5676.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5676.patch",
"merged_at": "2024-07-13T16:25:31"
}
|
Fixes https://github.com/ollama/ollama/issues/5671
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5676/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5676/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4385
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4385/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4385/comments
|
https://api.github.com/repos/ollama/ollama/issues/4385/events
|
https://github.com/ollama/ollama/issues/4385
| 2,291,543,742
|
I_kwDOJ0Z1Ps6Ili6-
| 4,385
|
Unable to access ollama from obsidian plugins app://obsidian.md
|
{
"login": "airtonix",
"id": 61225,
"node_id": "MDQ6VXNlcjYxMjI1",
"avatar_url": "https://avatars.githubusercontent.com/u/61225?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/airtonix",
"html_url": "https://github.com/airtonix",
"followers_url": "https://api.github.com/users/airtonix/followers",
"following_url": "https://api.github.com/users/airtonix/following{/other_user}",
"gists_url": "https://api.github.com/users/airtonix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/airtonix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/airtonix/subscriptions",
"organizations_url": "https://api.github.com/users/airtonix/orgs",
"repos_url": "https://api.github.com/users/airtonix/repos",
"events_url": "https://api.github.com/users/airtonix/events{/privacy}",
"received_events_url": "https://api.github.com/users/airtonix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-05-12T23:16:48
| 2024-05-13T12:18:16
| 2024-05-13T03:29:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
index.html:1 Access to fetch at 'http://127.0.0.1:11434/api/chat' from origin 'app://obsidian.md' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
```
### OS
Linux, macOS, Windows, Docker, WSL2
### GPU
Nvidia, AMD, Intel, Apple, Other
### CPU
Intel, AMD, Apple, Other
### Ollama version
any version
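Not from the original report, but a common mitigation sketch: Ollama reads an `OLLAMA_ORIGINS` environment variable that adds extra allowed origins to its CORS check. For a systemd install, a drop-in along these lines may unblock Obsidian (the exact `app://obsidian.md*` pattern is an assumption about how Obsidian reports its origin):

```
[Service]
Environment="OLLAMA_ORIGINS=app://obsidian.md*"
```

After editing, run `systemctl daemon-reload` and restart the service; on macOS the same variable can be set with `launchctl setenv OLLAMA_ORIGINS "app://obsidian.md*"` before starting Ollama.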
|
{
"login": "airtonix",
"id": 61225,
"node_id": "MDQ6VXNlcjYxMjI1",
"avatar_url": "https://avatars.githubusercontent.com/u/61225?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/airtonix",
"html_url": "https://github.com/airtonix",
"followers_url": "https://api.github.com/users/airtonix/followers",
"following_url": "https://api.github.com/users/airtonix/following{/other_user}",
"gists_url": "https://api.github.com/users/airtonix/gists{/gist_id}",
"starred_url": "https://api.github.com/users/airtonix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/airtonix/subscriptions",
"organizations_url": "https://api.github.com/users/airtonix/orgs",
"repos_url": "https://api.github.com/users/airtonix/repos",
"events_url": "https://api.github.com/users/airtonix/events{/privacy}",
"received_events_url": "https://api.github.com/users/airtonix/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4385/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4385/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1702
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1702/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1702/comments
|
https://api.github.com/repos/ollama/ollama/issues/1702/events
|
https://github.com/ollama/ollama/pull/1702
| 2,055,344,194
|
PR_kwDOJ0Z1Ps5iuWqy
| 1,702
|
added uninstall script
|
{
"login": "vtrenton",
"id": 85969349,
"node_id": "MDQ6VXNlcjg1OTY5MzQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/85969349?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vtrenton",
"html_url": "https://github.com/vtrenton",
"followers_url": "https://api.github.com/users/vtrenton/followers",
"following_url": "https://api.github.com/users/vtrenton/following{/other_user}",
"gists_url": "https://api.github.com/users/vtrenton/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vtrenton/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vtrenton/subscriptions",
"organizations_url": "https://api.github.com/users/vtrenton/orgs",
"repos_url": "https://api.github.com/users/vtrenton/repos",
"events_url": "https://api.github.com/users/vtrenton/events{/privacy}",
"received_events_url": "https://api.github.com/users/vtrenton/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-12-25T03:49:01
| 2024-11-21T05:52:19
| 2024-11-21T05:52:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1702",
"html_url": "https://github.com/ollama/ollama/pull/1702",
"diff_url": "https://github.com/ollama/ollama/pull/1702.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1702.patch",
"merged_at": null
}
|
A script for uninstalling ollama on Linux.
Fixes #1701
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1702/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1702/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3284
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3284/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3284/comments
|
https://api.github.com/repos/ollama/ollama/issues/3284/events
|
https://github.com/ollama/ollama/pull/3284
| 2,199,988,592
|
PR_kwDOJ0Z1Ps5qWpnU
| 3,284
|
Add MarshalJSON to Duration
|
{
"login": "jackielii",
"id": 360983,
"node_id": "MDQ6VXNlcjM2MDk4Mw==",
"avatar_url": "https://avatars.githubusercontent.com/u/360983?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jackielii",
"html_url": "https://github.com/jackielii",
"followers_url": "https://api.github.com/users/jackielii/followers",
"following_url": "https://api.github.com/users/jackielii/following{/other_user}",
"gists_url": "https://api.github.com/users/jackielii/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jackielii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jackielii/subscriptions",
"organizations_url": "https://api.github.com/users/jackielii/orgs",
"repos_url": "https://api.github.com/users/jackielii/repos",
"events_url": "https://api.github.com/users/jackielii/events{/privacy}",
"received_events_url": "https://api.github.com/users/jackielii/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-21T11:57:51
| 2024-05-06T22:59:18
| 2024-05-06T22:59:18
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3284",
"html_url": "https://github.com/ollama/ollama/pull/3284",
"diff_url": "https://github.com/ollama/ollama/pull/3284.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3284.patch",
"merged_at": "2024-05-06T22:59:18"
}
|
fix #3283
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3284/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3284/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1359
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1359/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1359/comments
|
https://api.github.com/repos/ollama/ollama/issues/1359/events
|
https://github.com/ollama/ollama/issues/1359
| 2,022,344,626
|
I_kwDOJ0Z1Ps54ioey
| 1,359
|
4 GPUs, each with 12.2GiB. The utility loads more into rank 0, but it only gets up to about 4-plus GiB, never close to 12.2GiB
|
{
"login": "phalexo",
"id": 4603365,
"node_id": "MDQ6VXNlcjQ2MDMzNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/4603365?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/phalexo",
"html_url": "https://github.com/phalexo",
"followers_url": "https://api.github.com/users/phalexo/followers",
"following_url": "https://api.github.com/users/phalexo/following{/other_user}",
"gists_url": "https://api.github.com/users/phalexo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/phalexo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/phalexo/subscriptions",
"organizations_url": "https://api.github.com/users/phalexo/orgs",
"repos_url": "https://api.github.com/users/phalexo/repos",
"events_url": "https://api.github.com/users/phalexo/events{/privacy}",
"received_events_url": "https://api.github.com/users/phalexo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2023-12-03T04:02:00
| 2024-02-01T23:14:08
| 2024-02-01T23:14:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
cuBLAS error 15 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/gguf/ggml-cuda.cu:7586
current device: 0
⠸ 2023/12/02 22:53:21 llama.go:436: exit status 1
2023/12/02 22:53:21 llama.go:510: llama runner stopped successfully
[GIN] 2023/12/02 - 22:53:21 | 200 | 1.311500885s | 127.0.0.1 | POST "/api/generate"
Error: llama runner exited, you may not have enough available memory to run this model
The model in question is orca-2-13b.Q6_K:latest, a 6-bit quantized model, which I converted following the ollama instructions.
EDIT: I have now also tried it with "mistral", doing the standard download via
ollama run mistral. When I enter something, it either produces lines of "####...." or fails altogether and dies.
The original model file is GGUF V3.
The converted size is about 10GiB.
```
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 520.61.05 Driver Version: 520.61.05 CUDA Version: 11.8 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 NVIDIA GeForce ... On | 00000000:04:00.0 Off | N/A |
| 22% 15C P8 33W / 275W | 4774MiB / 12288MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 1 NVIDIA GeForce ... On | 00000000:05:00.0 Off | N/A |
| 22% 16C P8 32W / 275W | 2979MiB / 12288MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 2 NVIDIA GeForce ... On | 00000000:08:00.0 Off | N/A |
| 22% 14C P8 32W / 275W | 2979MiB / 12288MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 3 NVIDIA GeForce ... On | 00000000:09:00.0 Off | N/A |
| 22% 13C P8 32W / 275W | 2979MiB / 12288MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
| 4 NVIDIA GeForce ... On | 00000000:85:00.0 Off | N/A |
| 0% 19C P8 7W / 177W | 0MiB / 4096MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
+-----------------------------------------------------------------------------+
```
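A hedged workaround sketch, not from the report: if the runner crashes while offloading layers, capping how many layers are offloaded with the `num_gpu` option in the request body sometimes keeps the allocation within VRAM. A request body POSTed to /api/generate would look roughly like this; the value 20 is an arbitrary starting point, not a recommendation:

```
{"model": "mistral", "prompt": "hello", "options": {"num_gpu": 20}}
```

Lowering `num_gpu` trades GPU speed for system RAM, so it works around, rather than fixes, the cuBLAS error.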
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1359/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1359/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3492
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3492/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3492/comments
|
https://api.github.com/repos/ollama/ollama/issues/3492/events
|
https://github.com/ollama/ollama/issues/3492
| 2,225,874,326
|
I_kwDOJ0Z1Ps6ErCWW
| 3,492
|
Add enhancement to allow RAG functionality
|
{
"login": "g02200jeff",
"id": 159446878,
"node_id": "U_kgDOCYD3Xg",
"avatar_url": "https://avatars.githubusercontent.com/u/159446878?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/g02200jeff",
"html_url": "https://github.com/g02200jeff",
"followers_url": "https://api.github.com/users/g02200jeff/followers",
"following_url": "https://api.github.com/users/g02200jeff/following{/other_user}",
"gists_url": "https://api.github.com/users/g02200jeff/gists{/gist_id}",
"starred_url": "https://api.github.com/users/g02200jeff/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/g02200jeff/subscriptions",
"organizations_url": "https://api.github.com/users/g02200jeff/orgs",
"repos_url": "https://api.github.com/users/g02200jeff/repos",
"events_url": "https://api.github.com/users/g02200jeff/events{/privacy}",
"received_events_url": "https://api.github.com/users/g02200jeff/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 6
| 2024-04-04T15:46:02
| 2024-11-06T17:45:00
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
I want to use a custom script on the ollama server (Windows) to run a Retrieval-Augmented Generation (RAG) process. How can I do this?
(I have a working example with a Python script, LangChain and ollama, but I can't do it behind the ollama server using the REST API.)
### How should we solve this?
Allow Retrieval Augmented Generation (RAG) or add an example
### What is the impact of not solving this?
_No response_
### Anything else?
_No response_
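Not part of the original request, but a minimal sketch of how RAG can be driven purely through Ollama's REST API: embed the documents and the question via POST /api/embeddings, rank documents by cosine similarity, and paste the best chunks into the prompt sent to /api/generate. The ranking and prompt-building step needs no server and looks roughly like this (function names are illustrative):

```python
import math

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def build_rag_prompt(question, question_vec, docs, k=2):
    # docs: list of (text, embedding) pairs; embeddings obtained
    # e.g. from POST /api/embeddings with a model like nomic-embed-text.
    ranked = sorted(docs, key=lambda d: cosine(question_vec, d[1]), reverse=True)
    context = "\n---\n".join(text for text, _ in ranked[:k])
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
```

The resulting string would then be sent as the `prompt` field of a POST to /api/generate, which is what the LangChain example does under the hood.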
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3492/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3492/timeline
| null | null | false
|