| url (string, 51–54 chars) | repository_url (string, 1 class) | labels_url (string, 65–68) | comments_url (string, 60–63) | events_url (string, 58–61) | html_url (string, 39–44) | id (int64, 1.78B–2.82B) | node_id (string, 18–19) | number (int64, 1–8.69k) | title (string, 1–382) | user (dict) | labels (list, 0–5) | state (string, 2 classes) | locked (bool, 1 class) | assignee (dict) | assignees (list, 0–2) | milestone (null) | comments (int64, 0–323) | created_at (timestamp[s]) | updated_at (timestamp[s]) | closed_at (timestamp[s]) | author_association (string, 4 classes) | sub_issues_summary (dict) | active_lock_reason (null) | draft (bool, 2 classes) | pull_request (dict) | body (string, 2–118k, nullable) | closed_by (dict) | reactions (dict) | timeline_url (string, 60–63) | performed_via_github_app (null) | state_reason (string, 4 classes) | is_pull_request (bool, 2 classes) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/ollama/ollama/issues/8238
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8238/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8238/comments
|
https://api.github.com/repos/ollama/ollama/issues/8238/events
|
https://github.com/ollama/ollama/issues/8238
| 2,758,668,018
|
I_kwDOJ0Z1Ps6kbe7y
| 8,238
|
vocabulary is larger than expected
|
{
"login": "lx687",
"id": 192780267,
"node_id": "U_kgDOC32X6w",
"avatar_url": "https://avatars.githubusercontent.com/u/192780267?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lx687",
"html_url": "https://github.com/lx687",
"followers_url": "https://api.github.com/users/lx687/followers",
"following_url": "https://api.github.com/users/lx687/following{/other_user}",
"gists_url": "https://api.github.com/users/lx687/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lx687/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lx687/subscriptions",
"organizations_url": "https://api.github.com/users/lx687/orgs",
"repos_url": "https://api.github.com/users/lx687/repos",
"events_url": "https://api.github.com/users/lx687/events{/privacy}",
"received_events_url": "https://api.github.com/users/lx687/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 8
| 2024-12-25T09:26:30
| 2024-12-27T02:05:34
| 2024-12-27T01:45:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
`ollama create model -f ./Modelfile` reports an error: `Error: vocabulary is larger than expected, '128257' instead of '128256'`
### OS
Linux
### GPU
Nvidia
### CPU
_No response_
### Ollama version
0.5.4
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8238/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8238/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2729
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2729/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2729/comments
|
https://api.github.com/repos/ollama/ollama/issues/2729/events
|
https://github.com/ollama/ollama/issues/2729
| 2,152,283,323
|
I_kwDOJ0Z1Ps6ASTy7
| 2,729
|
Poor performance: running large models through Ollama on a local laptop
|
{
"login": "GeYingzhen01",
"id": 155865563,
"node_id": "U_kgDOCUpR2w",
"avatar_url": "https://avatars.githubusercontent.com/u/155865563?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/GeYingzhen01",
"html_url": "https://github.com/GeYingzhen01",
"followers_url": "https://api.github.com/users/GeYingzhen01/followers",
"following_url": "https://api.github.com/users/GeYingzhen01/following{/other_user}",
"gists_url": "https://api.github.com/users/GeYingzhen01/gists{/gist_id}",
"starred_url": "https://api.github.com/users/GeYingzhen01/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/GeYingzhen01/subscriptions",
"organizations_url": "https://api.github.com/users/GeYingzhen01/orgs",
"repos_url": "https://api.github.com/users/GeYingzhen01/repos",
"events_url": "https://api.github.com/users/GeYingzhen01/events{/privacy}",
"received_events_url": "https://api.github.com/users/GeYingzhen01/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-02-24T12:14:58
| 2024-03-05T09:00:59
| 2024-02-25T06:06:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Running large models through Ollama on a local laptop results in significant lag, and the computer's performance is not fully utilized.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2729/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2729/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1634
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1634/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1634/comments
|
https://api.github.com/repos/ollama/ollama/issues/1634/events
|
https://github.com/ollama/ollama/pull/1634
| 2,050,843,436
|
PR_kwDOJ0Z1Ps5ifLBp
| 1,634
|
Typo errors fixed.
|
{
"login": "nepalivai",
"id": 108126089,
"node_id": "U_kgDOBnHfiQ",
"avatar_url": "https://avatars.githubusercontent.com/u/108126089?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nepalivai",
"html_url": "https://github.com/nepalivai",
"followers_url": "https://api.github.com/users/nepalivai/followers",
"following_url": "https://api.github.com/users/nepalivai/following{/other_user}",
"gists_url": "https://api.github.com/users/nepalivai/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nepalivai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nepalivai/subscriptions",
"organizations_url": "https://api.github.com/users/nepalivai/orgs",
"repos_url": "https://api.github.com/users/nepalivai/repos",
"events_url": "https://api.github.com/users/nepalivai/events{/privacy}",
"received_events_url": "https://api.github.com/users/nepalivai/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-12-20T15:52:52
| 2023-12-21T00:56:46
| 2023-12-21T00:56:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1634",
"html_url": "https://github.com/ollama/ollama/pull/1634",
"diff_url": "https://github.com/ollama/ollama/pull/1634.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1634.patch",
"merged_at": null
}
|
A few typos were fixed in the README.md file, including misplaced periods at the end of some sentences and small grammatical errors.
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1634/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1634/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/288
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/288/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/288/comments
|
https://api.github.com/repos/ollama/ollama/issues/288/events
|
https://github.com/ollama/ollama/pull/288
| 1,837,425,825
|
PR_kwDOJ0Z1Ps5XO9EG
| 288
|
embed text document in modelfile
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-08-04T23:01:19
| 2023-08-09T14:26:22
| 2023-08-09T14:26:20
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/288",
"html_url": "https://github.com/ollama/ollama/pull/288",
"diff_url": "https://github.com/ollama/ollama/pull/288.diff",
"patch_url": "https://github.com/ollama/ollama/pull/288.patch",
"merged_at": "2023-08-09T14:26:20"
}
|
Allow embedding information into Modelfiles. This is an initial version that only supports embedding text files; support for other file types will follow.
```
FROM llama2
EMBED /path/to/doc.txt
TEMPLATE """
Context:
{{ .Embed }}
User:
{{ .User }}
"""
```
TODO before merge:
- [x] Test library `FROM` image (local and pull)
- [x] Test `FROM` local bin file
- [x] Update docs
Resolves #237
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/288/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/288/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6655
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6655/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6655/comments
|
https://api.github.com/repos/ollama/ollama/issues/6655/events
|
https://github.com/ollama/ollama/issues/6655
| 2,507,523,727
|
I_kwDOJ0Z1Ps6VdcaP
| 6,655
|
Windows binaries are built without GPU support and ignore available SIMD support
|
{
"login": "mlgitter",
"id": 81476825,
"node_id": "MDQ6VXNlcjgxNDc2ODI1",
"avatar_url": "https://avatars.githubusercontent.com/u/81476825?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mlgitter",
"html_url": "https://github.com/mlgitter",
"followers_url": "https://api.github.com/users/mlgitter/followers",
"following_url": "https://api.github.com/users/mlgitter/following{/other_user}",
"gists_url": "https://api.github.com/users/mlgitter/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mlgitter/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mlgitter/subscriptions",
"organizations_url": "https://api.github.com/users/mlgitter/orgs",
"repos_url": "https://api.github.com/users/mlgitter/repos",
"events_url": "https://api.github.com/users/mlgitter/events{/privacy}",
"received_events_url": "https://api.github.com/users/mlgitter/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-05T11:30:23
| 2024-09-05T16:11:45
| 2024-09-05T16:11:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I've set up Ollama from the Windows installer, and the logs say it was NOT built with GPU support:
**_WARN [server_params_parse] Not compiled with GPU offload support, --n-gpu-layers option will be ignored._**
It also ignores the SSE3 and SSSE3 capabilities of my CPUs (2× 5645):
**_INFO [wmain] system info | n_threads=12 n_threads_batch=-1 system_info="AVX = 0 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 0 | SSSE3 = 0 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | " tid="16916" timestamp=1725530008 total_threads=24_**
So my questions are:
**_1. Is there a fork that actually builds with NVIDIA GPU support for compute capability 6+, as stated in the releases and the wiki?_**
**_2. Why are SIMD features such as SSE3/4, which are on par with AVX/AVX2, being ignored, when it is mostly a matter of compiler flags?_**
**_3. Where is the information on enabling GPU BLAS support? It was said to reside in the main README.md, but it is absent as of the latest release (0.3.9), and Google can't find it anywhere._**
My CUDA is in place,
> $ nvcc --version
> nvcc: NVIDIA (R) Cuda compiler driver
> Copyright (c) 2005-2013 NVIDIA Corporation
> Built on Fri_Mar_14_19:30:01_PDT_2014
> Cuda compilation tools, release 6.0, V6.0.1
>
The server log is as follows:
> 2024/09/05 12:40:01 routes.go:1125: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: _HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\\Users\\xxxxxxx\\.ollama\\models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR:C:\\Users\\xxxxxxx\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\runners OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-09-05T12:40:01.258+03:00 level=INFO source=images.go:753 msg="total blobs: 5"
time=2024-09-05T12:40:01.259+03:00 level=INFO source=images.go:760 msg="total unused blobs removed: 0"
time=2024-09-05T12:40:01.264+03:00 level=INFO source=routes.go:1172 msg="Listening on 127.0.0.1:11434 (version 0.3.9)"
time=2024-09-05T12:40:01.267+03:00 level=INFO source=payload.go:44 **msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11 cuda_v12 rocm_v6.1]"**
time=2024-09-05T12:40:01.268+03:00 level=INFO source=gpu.go:200 msg="looking for compatible GPUs"
time=2024-09-05T12:40:01.268+03:00 level=WARN source=gpu.go:222 **msg="CPU does not have minimum vector extensions, GPU inference disabled" required=avx detected="no vector extensions"**
time=2024-09-05T12:40:01.270+03:00 level=INFO source=types.go:107 msg="inference compute" id=0 library=cpu variant="no vector extensions" compute="" driver=0.0 name="" total="128.0 GiB" available="36.9 GiB"
[GIN] 2024/09/05 - 12:53:23 | 200 | 4.3759ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/09/05 - 12:53:23 | 200 | 1.6545ms | 127.0.0.1 | GET "/api/tags"
[GIN] 2024/09/05 - 12:53:23 | 200 | 509.1µs | 127.0.0.1 | GET "/"
time=2024-09-05T12:53:28.881+03:00 level=INFO source=memory.go:309 msg="offload to cpu" layers.requested=50 layers.model=19 layers.offload=0 layers.split="" memory.available="[37.5 GiB]" memory.required.full="2.3 GiB" memory.required.partial="0 B" memory.required.kv="144.0 MiB" memory.required.allocations="[2.3 GiB]" memory.weights.total="1.2 GiB" memory.weights.repeating="675.9 MiB" memory.weights.nonrepeating="531.5 MiB" memory.graph.full="504.2 MiB" memory.graph.partial="914.6 MiB"
time=2024-09-05T12:53:28.898+03:00 level=INFO source=server.go:391 msg="starting llama server" cmd="C:\\Users\\xxxxxxx\\AppData\\Local\\Programs\\Ollama\\lib\\ollama\\runners\\cpu\\ollama_llama_server.exe --model C:\\Users\\xxxxxxx\\.ollama\\models\\blobs\\sha256-c1864a5eb19305c40519da12cc543519e48a0697ecd30e15d5ac228644957d12 --ctx-size 8192 --batch-size 512 --embedding --log-disable --n-gpu-layers 50 --no-mmap --parallel 4 --port 50237"
time=2024-09-05T12:53:28.939+03:00 level=INFO source=sched.go:450 msg="loaded runners" count=1
time=2024-09-05T12:53:28.940+03:00 level=INFO source=server.go:591 msg="waiting for llama runner to start responding"
time=2024-09-05T12:53:28.948+03:00 level=INFO source=server.go:625 msg="waiting for server to become available" status="llm server error"
WARN [server_params_parse] Not compiled with GPU offload support, --n-gpu-layers option will be ignored. See main README.md for information on enabling GPU BLAS support | n_gpu_layers=-1 tid="16916" timestamp=1725530008
INFO [wmain] build info | build=3535 commit="1e6f6554" tid="16916" timestamp=1725530008
**INFO [wmain] system info | n_threads=12 n_threads_batch=-1 system_info="AVX = 0 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 0 | SSSE3 = 0 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | "** tid="16916" timestamp=1725530008 total_threads=24
INFO [wmain] HTTP server listening | hostname="127.0.0.1" n_threads_http="23" port="50237" tid="16916" timestamp=1725530008
llama_model_loader: loaded meta data with 21 key-value pairs and 164 tensors from C:\Users\xxxxxxx\.ollama\models\blobs\sha256-c1864a5eb19305c40519da12cc543519e48a0697ecd30e15d5ac228644957d12 (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = gemma
llama_model_loader: - kv 1: general.name str = gemma-2b-it
llama_model_loader: - kv 2: gemma.context_length u32 = 8192
llama_model_loader: - kv 3: gemma.block_count u32 = 18
llama_model_loader: - kv 4: gemma.embedding_length u32 = 2048
llama_model_loader: - kv 5: gemma.feed_forward_length u32 = 16384
llama_model_loader: - kv 6: gemma.attention.head_count u32 = 8
llama_model_loader: - kv 7: gemma.attention.head_count_kv u32 = 1
llama_model_loader: - kv 8: gemma.attention.key_length u32 = 256
llama_model_loader: - kv 9: gemma.attention.value_length u32 = 256
llama_model_loader: - kv 10: gemma.attention.layer_norm_rms_epsilon f32 = 0.000001
llama_model_loader: - kv 11: tokenizer.ggml.model str = llama
llama_model_loader: - kv 12: tokenizer.ggml.bos_token_id u32 = 2
llama_model_loader: - kv 13: tokenizer.ggml.eos_token_id u32 = 1
llama_model_loader: - kv 14: tokenizer.ggml.padding_token_id u32 = 0
llama_model_loader: - kv 15: tokenizer.ggml.unknown_token_id u32 = 3
llama_model_loader: - kv 16: tokenizer.ggml.tokens arr[str,256128] = ["<pad>", "<eos>", "<bos>", "<unk>", ...
time=2024-09-05T12:53:29.207+03:00 level=INFO source=server.go:625 msg="waiting for server to become available" status="llm server loading model"
llama_model_loader: - kv 17: tokenizer.ggml.scores arr[f32,256128] = [0.000000, 0.000000, 0.000000, 0.0000...
llama_model_loader: - kv 18: tokenizer.ggml.token_type arr[i32,256128] = [3, 3, 3, 2, 1, 1, 1, 1, 1, 1, 1, 1, ...
llama_model_loader: - kv 19: general.quantization_version u32 = 2
llama_model_loader: - kv 20: general.file_type u32 = 2
llama_model_loader: - type f32: 37 tensors
llama_model_loader: - type q4_0: 126 tensors
llama_model_loader: - type q8_0: 1 tensors
llm_load_vocab: special tokens cache size = 4
llm_load_vocab: token to piece cache size = 1.6014 MB
llm_load_print_meta: format = GGUF V3 (latest)
llm_load_print_meta: arch = gemma
llm_load_print_meta: vocab type = SPM
llm_load_print_meta: n_vocab = 256128
llm_load_print_meta: n_merges = 0
llm_load_print_meta: vocab_only = 0
llm_load_print_meta: n_ctx_train = 8192
llm_load_print_meta: n_embd = 2048
llm_load_print_meta: n_layer = 18
llm_load_print_meta: n_head = 8
llm_load_print_meta: n_head_kv = 1
llm_load_print_meta: n_rot = 256
llm_load_print_meta: n_swa = 0
llm_load_print_meta: n_embd_head_k = 256
llm_load_print_meta: n_embd_head_v = 256
llm_load_print_meta: n_gqa = 8
llm_load_print_meta: n_embd_k_gqa = 256
llm_load_print_meta: n_embd_v_gqa = 256
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-06
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: f_logit_scale = 0.0e+00
llm_load_print_meta: n_ff = 16384
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: causal attn = 1
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 2
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 10000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_ctx_orig_yarn = 8192
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: ssm_d_conv = 0
llm_load_print_meta: ssm_d_inner = 0
llm_load_print_meta: ssm_d_state = 0
llm_load_print_meta: ssm_dt_rank = 0
llm_load_print_meta: model type = 2B
llm_load_print_meta: model ftype = Q4_0
llm_load_print_meta: model params = 2.51 B
llm_load_print_meta: model size = 1.56 GiB (5.34 BPW)
llm_load_print_meta: general.name = gemma-2b-it
llm_load_print_meta: BOS token = 2 '<bos>'
llm_load_print_meta: EOS token = 1 '<eos>'
llm_load_print_meta: UNK token = 3 '<unk>'
llm_load_print_meta: PAD token = 0 '<pad>'
llm_load_print_meta: LF token = 227 '<0x0A>'
llm_load_print_meta: EOT token = 107 '<end_of_turn>'
llm_load_print_meta: max token length = 93
llm_load_tensors: ggml ctx size = 0.08 MiB
llm_load_tensors: CPU buffer size = 2126.45 MiB
[GIN] 2024/09/05 - 12:53:39 | 200 | 1.6532ms | 127.0.0.1 | GET "/api/tags"
llama_new_context_with_model: n_ctx = 8192
llama_new_context_with_model: n_batch = 512
llama_new_context_with_model: n_ubatch = 512
llama_new_context_with_model: flash_attn = 0
llama_new_context_with_model: freq_base = 10000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CPU KV buffer size = 144.00 MiB
llama_new_context_with_model: KV self size = 144.00 MiB, K (f16): 72.00 MiB, V (f16): 72.00 MiB
llama_new_context_with_model: CPU output buffer size = 3.94 MiB
llama_new_context_with_model: CPU compute buffer size = 508.25 MiB
llama_new_context_with_model: graph nodes = 601
llama_new_context_with_model: graph splits = 1
INFO [wmain] model loaded | tid="16916" timestamp=1725530022
time=2024-09-05T12:53:42.108+03:00 level=INFO source=server.go:630 msg="llama runner started in 13.17 seconds"
[GIN] 2024/09/05 - 12:53:54 | 200 | 2.2921ms | 127.0.0.1 | GET "/api/tags"
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
version 0.3.9
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6655/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6655/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2048
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2048/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2048/comments
|
https://api.github.com/repos/ollama/ollama/issues/2048/events
|
https://github.com/ollama/ollama/issues/2048
| 2,088,215,428
|
I_kwDOJ0Z1Ps58d6OE
| 2,048
|
unexpected error in llama server update_slots - exiting main loop
|
{
"login": "mofanke",
"id": 54242816,
"node_id": "MDQ6VXNlcjU0MjQyODE2",
"avatar_url": "https://avatars.githubusercontent.com/u/54242816?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mofanke",
"html_url": "https://github.com/mofanke",
"followers_url": "https://api.github.com/users/mofanke/followers",
"following_url": "https://api.github.com/users/mofanke/following{/other_user}",
"gists_url": "https://api.github.com/users/mofanke/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mofanke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mofanke/subscriptions",
"organizations_url": "https://api.github.com/users/mofanke/orgs",
"repos_url": "https://api.github.com/users/mofanke/repos",
"events_url": "https://api.github.com/users/mofanke/events{/privacy}",
"received_events_url": "https://api.github.com/users/mofanke/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 6
| 2024-01-18T12:45:24
| 2024-04-02T01:57:27
| 2024-03-13T17:41:07
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
[1704891429] sampled token: 29896: '1'
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 256
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 128
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 64
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 32
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 16
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 8
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 4
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 2
[1704891429] update_slots : failed to find free space in the KV cache, retrying with smaller n_batch = 1
[1704891429] update_slots : failed to decode the batch, n_batch = 1, ret = 1
[1704891429] unexpected error in llama server update_slots - exiting main loop
[1704891429]
llama server shutting down
ollama is still running, but it does not respond on the chat API
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2048/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/2048/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3426
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3426/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3426/comments
|
https://api.github.com/repos/ollama/ollama/issues/3426/events
|
https://github.com/ollama/ollama/issues/3426
| 2,217,029,149
|
I_kwDOJ0Z1Ps6EJS4d
| 3,426
|
Ollama serve API response returns nonsense only on subsequent calls or times out
|
{
"login": "YanWittmann",
"id": 37689635,
"node_id": "MDQ6VXNlcjM3Njg5NjM1",
"avatar_url": "https://avatars.githubusercontent.com/u/37689635?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YanWittmann",
"html_url": "https://github.com/YanWittmann",
"followers_url": "https://api.github.com/users/YanWittmann/followers",
"following_url": "https://api.github.com/users/YanWittmann/following{/other_user}",
"gists_url": "https://api.github.com/users/YanWittmann/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YanWittmann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YanWittmann/subscriptions",
"organizations_url": "https://api.github.com/users/YanWittmann/orgs",
"repos_url": "https://api.github.com/users/YanWittmann/repos",
"events_url": "https://api.github.com/users/YanWittmann/events{/privacy}",
"received_events_url": "https://api.github.com/users/YanWittmann/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-03-31T14:50:21
| 2024-03-31T15:37:42
| 2024-03-31T15:37:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Upon making more than one request to the `ollama serve` API server, the server will respond with seemingly garbage text:
```batch
curl -X POST http://localhost:11434/api/generate -d "{\"model\":\"mixtral:8x7b-instruct-v0.1-q3_K_S\",\"prompt\":\"Why is the sky blue? Just a short one-sentance description will be enough.\",\"stream\":false}" -H "Content-Type: application/json"
{"model":"mixtral:8x7b-instruct-v0.1-q3_K_S","created_at":"2024-03-31T14:43:56.3540523Z","response":"\n Question: Let q = -26137985 + 45303185. What is q rounded to the nearest 1000000?\nAnswer: 19000000","done":true,"context":[28705,...],"total_duration":12775145300,"load_duration":528600,"prompt_eval_duration":248002000,"eval_count":56,"eval_duration":12522777000}
```
The first request after startup or after switching models works fine, but subsequent requests return only nonsense. Turning streaming on or off makes no difference.
Sometimes it is even worse, and the API does not respond at all to subsequent calls.
The Ollama WebUI works fine without problems.
<details>
<summary>Here's the server log</summary>
```
time=2024-03-31T16:47:31.235+02:00 level=INFO source=routes.go:79 msg="changing loaded model"
time=2024-03-31T16:47:32.501+02:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-03-31T16:47:32.501+02:00 level=INFO source=gpu.go:119 msg="CUDA Compute Capability detected: 8.6"
time=2024-03-31T16:47:32.502+02:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-03-31T16:47:32.502+02:00 level=INFO source=gpu.go:119 msg="CUDA Compute Capability detected: 8.6"
time=2024-03-31T16:47:32.502+02:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
loading library C:\Users\yan20\AppData\Local\Temp\ollama113621716\runners\cpu_avx2\ext_server.dll
time=2024-03-31T16:47:32.507+02:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: C:\\Users\\yan20\\AppData\\Local\\Temp\\ollama113621716\\runners\\cpu_avx2\\ext_server.dll"time=2024-03-31T16:47:32.508+02:00 level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server"
llama_model_loader: loaded meta data with 19 key-value pairs and 543 tensors from C:\Users\yan20\.ollama\models\blobs\sha256-83b45bda27326a4e5402e61f7ceb67f735729332ae2714f5fe857f117fb63445 (version GGUF V2)
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = lmsys
llama_model_loader: - kv 2: llama.context_length u32 = 2048
llama_model_loader: - kv 3: llama.embedding_length u32 = 6656
llama_model_loader: - kv 4: llama.block_count u32 = 60
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 17920
llama_model_loader: - kv 6: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 7: llama.attention.head_count u32 = 52
llama_model_loader: - kv 8: llama.attention.head_count_kv u32 = 52
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000001
llama_model_loader: - kv 10: general.file_type u32 = 2
llama_model_loader: - kv 11: tokenizer.ggml.model str = llama
llama_model_loader: - kv 12: tokenizer.ggml.tokens arr[str,32000] = ["<unk>", "<s>", "</s>", "<0x00>", "<...
llama_model_loader: - kv 13: tokenizer.ggml.scores arr[f32,32000] = [0.000000, 0.000000, 0.000000, 0.0000...
llama_model_loader: - kv 14: tokenizer.ggml.token_type arr[i32,32000] = [2, 3, 3, 6, 6, 6, 6, 6, 6, 6, 6, 6, ...
llama_model_loader: - kv 15: tokenizer.ggml.bos_token_id u32 = 1
llama_model_loader: - kv 16: tokenizer.ggml.eos_token_id u32 = 2
llama_model_loader: - kv 17: tokenizer.ggml.padding_token_id u32 = 0
llama_model_loader: - kv 18: general.quantization_version u32 = 2
llama_model_loader: - type f32: 121 tensors
llama_model_loader: - type q4_0: 421 tensors
llama_model_loader: - type q6_K: 1 tensors
llm_load_vocab: special tokens definition check successful ( 259/32000 ).
llm_load_print_meta: format = GGUF V2
llm_load_print_meta: arch = llama
llm_load_print_meta: vocab type = SPM
llm_load_print_meta: n_vocab = 32000
llm_load_print_meta: n_merges = 0
llm_load_print_meta: n_ctx_train = 2048
llm_load_print_meta: n_embd = 6656
llm_load_print_meta: n_head = 52
llm_load_print_meta: n_head_kv = 52
llm_load_print_meta: n_layer = 60
llm_load_print_meta: n_rot = 128
llm_load_print_meta: n_embd_head_k = 128
llm_load_print_meta: n_embd_head_v = 128
llm_load_print_meta: n_gqa = 1
llm_load_print_meta: n_embd_k_gqa = 6656
llm_load_print_meta: n_embd_v_gqa = 6656
llm_load_print_meta: f_norm_eps = 0.0e+00
llm_load_print_meta: f_norm_rms_eps = 1.0e-06
llm_load_print_meta: f_clamp_kqv = 0.0e+00
llm_load_print_meta: f_max_alibi_bias = 0.0e+00
llm_load_print_meta: n_ff = 17920
llm_load_print_meta: n_expert = 0
llm_load_print_meta: n_expert_used = 0
llm_load_print_meta: pooling type = 0
llm_load_print_meta: rope type = 0
llm_load_print_meta: rope scaling = linear
llm_load_print_meta: freq_base_train = 10000.0
llm_load_print_meta: freq_scale_train = 1
llm_load_print_meta: n_yarn_orig_ctx = 2048
llm_load_print_meta: rope_finetuned = unknown
llm_load_print_meta: model type = 30B
llm_load_print_meta: model ftype = Q4_0
llm_load_print_meta: model params = 32.53 B
llm_load_print_meta: model size = 17.09 GiB (4.51 BPW)
llm_load_print_meta: general.name = lmsys
llm_load_print_meta: BOS token = 1 '<s>'
llm_load_print_meta: EOS token = 2 '</s>'
llm_load_print_meta: UNK token = 0 '<unk>'
llm_load_print_meta: PAD token = 0 '<unk>'
llm_load_print_meta: LF token = 13 '<0x0A>'
llm_load_tensors: ggml ctx size = 0.21 MiB
llm_load_tensors: CPU buffer size = 17504.89 MiB
....................................................................................................
llama_new_context_with_model: n_ctx = 2048
llama_new_context_with_model: freq_base = 10000.0
llama_new_context_with_model: freq_scale = 1
llama_kv_cache_init: CPU KV buffer size = 3120.00 MiB
llama_new_context_with_model: KV self size = 3120.00 MiB, K (f16): 1560.00 MiB, V (f16): 1560.00 MiB
llama_new_context_with_model: CPU input buffer size = 18.02 MiB
llama_new_context_with_model: CPU compute buffer size = 260.00 MiB
llama_new_context_with_model: graph splits (measure): 1
{"function":"initialize","level":"INFO","line":440,"msg":"initializing slots","n_slots":1,"tid":"19792","timestamp":1711896456}
{"function":"initialize","level":"INFO","line":452,"msg":"new slot","n_ctx_slot":2048,"slot_id":0,"tid":"19792","timestamp":1711896456}
time=2024-03-31T16:47:36.418+02:00 level=INFO source=dyn_ext_server.go:162 msg="Starting llama main loop"
{"function":"launch_slot_with_data","level":"INFO","line":833,"msg":"slot is processing task","slot_id":0,"task_id":0,"tid":"1856","timestamp":1711896456}
{"function":"update_slots","ga_i":0,"level":"INFO","line":1828,"msg":"slot progression","n_past":0,"n_past_se":0,"n_prompt_tokens_processed":58,"slot_id":0,"task_id":0,"tid":"1856","timestamp":1711896456}
{"function":"update_slots","level":"INFO","line":1852,"msg":"kv cache rm [p0, end)","p0":0,"slot_id":0,"task_id":0,"tid":"1856","timestamp":1711896456}
{"function":"print_timings","level":"INFO","line":264,"msg":"prompt eval time = 16803.92 ms / 58 tokens ( 289.72 ms per token, 3.45 tokens per second)","n_prompt_tokens_processed":58,"n_tokens_second":3.4515749685356214,"slot_id":0,"t_prompt_processing":16803.923,"t_token":289.72281034482756,"task_id":0,"tid":"1856","timestamp":1711896498}
{"function":"print_timings","level":"INFO","line":278,"msg":"generation eval time = 25254.01 ms / 41 runs ( 615.95 ms per token, 1.62 tokens per second)","n_decoded":41,"n_tokens_second":1.623504222991869,"slot_id":0,"t_token":615.9515853658536,"t_token_generation":25254.015,"task_id":0,"tid":"1856","timestamp":1711896498}
{"function":"print_timings","level":"INFO","line":287,"msg":" total time = 42057.94 ms","slot_id":0,"t_prompt_processing":16803.923,"t_token_generation":25254.015,"t_total":42057.937999999995,"task_id":0,"tid":"1856","timestamp":1711896498}
{"function":"update_slots","level":"INFO","line":1660,"msg":"slot released","n_cache_tokens":99,"n_ctx":2048,"n_past":98,"n_system_tokens":0,"slot_id":0,"task_id":0,"tid":"1856","timestamp":1711896498,"truncated":false}
{"function":"update_slots","level":"INFO","line":1590,"msg":"all slots are idle and system prompt is empty, clear the KV cache","tid":"1856","timestamp":1711896498}
[GIN] 2024/03/31 - 16:48:18 | 200 | 47.2460699s | 127.0.0.1 | POST "/api/generate"
{"function":"launch_slot_with_data","level":"INFO","line":833,"msg":"slot is processing task","slot_id":0,"task_id":44,"tid":"1856","timestamp":1711896509}
{"function":"update_slots","ga_i":0,"level":"INFO","line":1828,"msg":"slot progression","n_past":58,"n_past_se":0,"n_prompt_tokens_processed":0,"slot_id":0,"task_id":44,"tid":"1856","timestamp":1711896509}
{"function":"update_slots","level":"INFO","line":1839,"msg":"we have to evaluate at least 1 token to generate logits","slot_id":0,"task_id":44,"tid":"1856","timestamp":1711896509}
{"function":"update_slots","level":"INFO","line":1852,"msg":"kv cache rm [p0, end)","p0":57,"slot_id":0,"task_id":44,"tid":"1856","timestamp":1711896509}
[GIN] 2024/03/31 - 16:49:05 | 200 | 36.1949878s | 127.0.0.1 | POST "/api/generate"
{"function":"update_slots","level":"INFO","line":1660,"msg":"slot released","n_cache_tokens":117,"n_ctx":2048,"n_past":116,"n_system_tokens":0,"slot_id":0,"task_id":44,"tid":"1856","timestamp":1711896545,"truncated":false}
{"function":"launch_slot_with_data","level":"INFO","line":833,"msg":"slot is processing task","slot_id":0,"task_id":106,"tid":"1856","timestamp":1711896546}
{"function":"update_slots","ga_i":0,"level":"INFO","line":1828,"msg":"slot progression","n_past":58,"n_past_se":0,"n_prompt_tokens_processed":0,"slot_id":0,"task_id":106,"tid":"1856","timestamp":1711896546}
{"function":"update_slots","level":"INFO","line":1839,"msg":"we have to evaluate at least 1 token to generate logits","slot_id":0,"task_id":106,"tid":"1856","timestamp":1711896546}
{"function":"update_slots","level":"INFO","line":1852,"msg":"kv cache rm [p0, end)","p0":57,"slot_id":0,"task_id":106,"tid":"1856","timestamp":1711896546}
{"function":"print_timings","level":"INFO","line":264,"msg":"prompt eval time = 610.37 ms / 0 tokens ( inf ms per token, 0.00 tokens per second)","n_prompt_tokens_processed":0,"n_tokens_second":0.0,"slot_id":0,"t_prompt_processing":610.374,"t_token":null,"task_id":106,"tid":"1856","timestamp":1711896547}
{"function":"print_timings","level":"INFO","line":278,"msg":"generation eval time = 602.92 ms / 2 runs ( 301.46 ms per token, 3.32 tokens per second)","n_decoded":2,"n_tokens_second":3.3171676695570254,"slot_id":0,"t_token":301.462,"t_token_generation":602.924,"task_id":106,"tid":"1856","timestamp":1711896547}
{"function":"print_timings","level":"INFO","line":287,"msg":" total time = 1213.30 ms","slot_id":0,"t_prompt_processing":610.374,"t_token_generation":602.924,"t_total":1213.298,"task_id":106,"tid":"1856","timestamp":1711896547}
{"function":"update_slots","level":"INFO","line":1660,"msg":"slot released","n_cache_tokens":60,"n_ctx":2048,"n_past":59,"n_system_tokens":0,"slot_id":0,"task_id":106,"tid":"1856","timestamp":1711896547,"truncated":false}
[GIN] 2024/03/31 - 16:49:07 | 200 | 1.2161599s | 127.0.0.1 | POST "/api/generate"
```
</details>
### What did you expect to see?
A response that uses the context provided in the prompt.
```batch
C:\Users\user>curl -X POST http://localhost:11434/api/generate -d "{\"model\":\"mixtral:8x7b-instruct-v0.1-q3_K_S\",\"prompt\":\"Why is the sky blue? Just a short one-sentance description will be enough.\",\"stream\":false}" -H "Content-Type: application/json"
{"model":"mixtral:8x7b-instruct-v0.1-q3_K_S","created_at":"2024-03-31T14:39:56.8355783Z","response":" The sky appears blue because molecules in the Earth's atmosphere scatter sunlight in all directions and blue light is scattered more than other colors due to its shorter wavelength.","done":true,"context":[28705,...],"total_duration":15124835300,"load_duration":3531055100,"prompt_eval_count":28,"prompt_eval_duration":4215357000,"eval_count":35,"eval_duration":7373269000}
```
### Steps to reproduce
- Startup the server using `ollama serve`
- Make a first request to any model; you will get a meaningful response here.
```bash
C:\Users\user>curl -X POST http://localhost:11434/api/generate -d "{\"model\":\"mixtral:8x7b-instruct-v0.1-q3_K_S\",\"prompt\":\"Why is the sky blue? Just a short one-sentance description will be enough.\",\"stream\":false}" -H "Content-Type: application/json"
{"model":"mixtral:8x7b-instruct-v0.1-q3_K_S","created_at":"2024-03-31T14:39:56.8355783Z","response":" The sky appears blue because molecules in the Earth's atmosphere scatter sunlight in all directions and blue light is scattered more than other colors due to its shorter wavelength.","done":true,"context":[28705,...],"total_duration":15124835300,"load_duration":3531055100,"prompt_eval_count":28,"prompt_eval_duration":4215357000,"eval_count":35,"eval_duration":7373269000}
```
- Make a second request; this time the response will be nonsense. It does not matter which model you use, as long as the second request goes to the same model as the first.
```bash
C:\Users\user>curl -X POST http://localhost:11434/api/generate -d "{\"model\":\"mixtral:8x7b-instruct-v0.1-q3_K_S\",\"prompt\":\"Why is the sky blue? Just a short one-sentance description will be enough.\",\"stream\":false}" -H "Content-Type: application/json"
{"model":"mixtral:8x7b-instruct-v0.1-q3_K_S","created_at":"2024-03-31T14:37:20.5472248Z","response":"\n Username: Administrator\n Password: {873f6431-a015-4e98-b86d-93aed073cfc6}\n```\n\nAfter I entered the above information, it works. But the error still appears if I use a new computer without these settings.\n\nAny ideas?\n\nComment: Have you tried installing it from an elevated command prompt?\n\nComment: Yes, I have installed it by using admin user account.\n\n## Answer (0)\n\nThe \"Access is denied\" error is thrown when there is insufficient permissions to access the registry key `HKEY_CURRENT_USER\\Control Panel\\International`. This can happen because of two reasons:\n\n1. The current user doesn't have read/write access to this registry key\n2....","done":true,"context":[28705,...],"total_duration":28009589800,"load_duration":523300,"prompt_eval_duration":79120000,"eval_count":820,"eval_duration":27926267000}
```
or for another model, the same happens:
```bash
C:\Users\user>curl -X POST http://localhost:11434/api/generate -d "{\"model\":\"vicuna:33b\",\"prompt\":\"Why is the sky blue? Just a short one-sentance description will be enough.\",\"stream\":false}" -H "Content-Type: application/json"
{"model":"vicuna:33b","created_at":"2024-03-31T14:32:43.4994468Z","response":"\n1. Name: \"Battle Chef Brigade\"\n2. Genre: Cooking Competition / Action Adventure\n3. Platform: PC, Nintendo Switch, Xbox One, and PlayStation 4\n4. Release Date: May 2018 (Nintendo","done":true,"context":[319,...],"total_duration":5752960100,"load_duration":1083700,"prompt_eval_count":16,"prompt_eval_duration":867577000,"eval_count":62,"eval_duration":4880727000}
```
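The two-step reproduction above can be scripted. This is a sketch only, assuming the default local endpoint; it sends the same prompt twice and uses a crude keyword check (a hypothetical helper, not part of Ollama) to flag off-topic answers:

```python
import json
import urllib.request

HOST = "http://localhost:11434"  # assumed default Ollama address

def ask(model, prompt):
    """Send one non-streaming /api/generate request and return the response text."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        HOST + "/api/generate",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

def looks_on_topic(response, keywords=("sky", "blue", "scatter")):
    """Crude heuristic: does the answer mention any expected keyword?"""
    text = response.lower()
    return any(k in text for k in keywords)

if __name__ == "__main__":
    sample = "The sky appears blue because sunlight is scattered by the atmosphere."
    print("on topic:", looks_on_topic(sample))
    # Against a live server (assumed at HOST), uncomment to reproduce:
    # for i in range(2):
    #     ans = ask("mixtral:8x7b-instruct-v0.1-q3_K_S", "Why is the sky blue?")
    #     print(f"--- attempt {i + 1}: on topic = {looks_on_topic(ans)} ---")
```

Per the report, the first attempt should be flagged on-topic and later attempts should not.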
### Are there any recent changes that introduced the issue?
_No response_
### OS
Windows
### Architecture
amd64
### Platform
_No response_
### Ollama version
0.1.29
### GPU
Nvidia
### GPU info
NVIDIA GeForce RTX 3090
### CPU
AMD
### Other software
AMD Ryzen 7 3800XT 8-Core Processor
|
{
"login": "YanWittmann",
"id": 37689635,
"node_id": "MDQ6VXNlcjM3Njg5NjM1",
"avatar_url": "https://avatars.githubusercontent.com/u/37689635?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YanWittmann",
"html_url": "https://github.com/YanWittmann",
"followers_url": "https://api.github.com/users/YanWittmann/followers",
"following_url": "https://api.github.com/users/YanWittmann/following{/other_user}",
"gists_url": "https://api.github.com/users/YanWittmann/gists{/gist_id}",
"starred_url": "https://api.github.com/users/YanWittmann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/YanWittmann/subscriptions",
"organizations_url": "https://api.github.com/users/YanWittmann/orgs",
"repos_url": "https://api.github.com/users/YanWittmann/repos",
"events_url": "https://api.github.com/users/YanWittmann/events{/privacy}",
"received_events_url": "https://api.github.com/users/YanWittmann/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3426/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3426/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5197
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5197/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5197/comments
|
https://api.github.com/repos/ollama/ollama/issues/5197/events
|
https://github.com/ollama/ollama/issues/5197
| 2,365,620,827
|
I_kwDOJ0Z1Ps6NAIJb
| 5,197
|
run qwen2-instruct-70b error
|
{
"login": "leoHostProject",
"id": 87935281,
"node_id": "MDQ6VXNlcjg3OTM1Mjgx",
"avatar_url": "https://avatars.githubusercontent.com/u/87935281?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/leoHostProject",
"html_url": "https://github.com/leoHostProject",
"followers_url": "https://api.github.com/users/leoHostProject/followers",
"following_url": "https://api.github.com/users/leoHostProject/following{/other_user}",
"gists_url": "https://api.github.com/users/leoHostProject/gists{/gist_id}",
"starred_url": "https://api.github.com/users/leoHostProject/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/leoHostProject/subscriptions",
"organizations_url": "https://api.github.com/users/leoHostProject/orgs",
"repos_url": "https://api.github.com/users/leoHostProject/repos",
"events_url": "https://api.github.com/users/leoHostProject/events{/privacy}",
"received_events_url": "https://api.github.com/users/leoHostProject/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-06-21T03:12:16
| 2024-06-21T03:12:16
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Model URL:
https://hf-mirror.com/Qwen/Qwen2-72B-Instruct-GGUF/blob/main/
Converting the qwen2-70b-instruct-q8_0.gguf model succeeds, but running the model fails with the following error:
llama runner process has terminated: signal: aborted error: failed to create context with model '/usr/share/ollama/.ollama/models/blobs/sha256-431f167fa6cafb36626a7a37f869833
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.44
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5197/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5197/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3584
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3584/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3584/comments
|
https://api.github.com/repos/ollama/ollama/issues/3584/events
|
https://github.com/ollama/ollama/pull/3584
| 2,236,374,171
|
PR_kwDOJ0Z1Ps5sScDI
| 3,584
|
server: provide helpful workaround hint when stalling on pull
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-10T20:28:04
| 2024-04-10T23:24:37
| 2024-04-10T23:24:37
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3584",
"html_url": "https://github.com/ollama/ollama/pull/3584",
"diff_url": "https://github.com/ollama/ollama/pull/3584.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3584.patch",
"merged_at": "2024-04-10T23:24:37"
}
|
This is a quick fix to help users who are stuck on the "pull" step at 99%.
In the near future we're introducing a new registry client that should be smarter. In the meantime, this should unblock the users hitting issue #1736.
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3584/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3584/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2181
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2181/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2181/comments
|
https://api.github.com/repos/ollama/ollama/issues/2181/events
|
https://github.com/ollama/ollama/pull/2181
| 2,099,399,832
|
PR_kwDOJ0Z1Ps5lA1ix
| 2,181
|
stub generate outputs for lint
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-25T01:30:09
| 2024-01-25T19:55:17
| 2024-01-25T19:55:16
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2181",
"html_url": "https://github.com/ollama/ollama/pull/2181",
"diff_url": "https://github.com/ollama/ollama/pull/2181.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2181.patch",
"merged_at": "2024-01-25T19:55:16"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2181/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2181/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8632
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8632/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8632/comments
|
https://api.github.com/repos/ollama/ollama/issues/8632/events
|
https://github.com/ollama/ollama/issues/8632
| 2,815,629,446
|
I_kwDOJ0Z1Ps6n0xiG
| 8,632
|
Ollama unable to download/run deepseek-r1:7b, other models work
|
{
"login": "arjunivor",
"id": 123751821,
"node_id": "U_kgDOB2BNjQ",
"avatar_url": "https://avatars.githubusercontent.com/u/123751821?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/arjunivor",
"html_url": "https://github.com/arjunivor",
"followers_url": "https://api.github.com/users/arjunivor/followers",
"following_url": "https://api.github.com/users/arjunivor/following{/other_user}",
"gists_url": "https://api.github.com/users/arjunivor/gists{/gist_id}",
"starred_url": "https://api.github.com/users/arjunivor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/arjunivor/subscriptions",
"organizations_url": "https://api.github.com/users/arjunivor/orgs",
"repos_url": "https://api.github.com/users/arjunivor/repos",
"events_url": "https://api.github.com/users/arjunivor/events{/privacy}",
"received_events_url": "https://api.github.com/users/arjunivor/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677370291,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCVsw",
"url": "https://api.github.com/repos/ollama/ollama/labels/networking",
"name": "networking",
"color": "0B5368",
"default": false,
"description": "Issues relating to ollama pull and push"
}
] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 40
| 2025-01-28T13:09:40
| 2025-01-30T02:15:17
| 2025-01-30T00:07:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
While trying to run `ollama run deepseek-r1:7b`, the download repeatedly fails at 6%. I tried Llama 3.2 and it downloaded flawlessly, but every time I try to run deepseek I get an error saying `error max retries exceeded: EOF`.
### OS
WSL2
### GPU
Nvidia
### CPU
AMD
### Ollama version
latest
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8632/reactions",
"total_count": 6,
"+1": 6,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8632/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7251
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7251/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7251/comments
|
https://api.github.com/repos/ollama/ollama/issues/7251/events
|
https://github.com/ollama/ollama/issues/7251
| 2,596,677,464
|
I_kwDOJ0Z1Ps6axidY
| 7,251
|
debug modelfile on create
|
{
"login": "belfie13",
"id": 39270867,
"node_id": "MDQ6VXNlcjM5MjcwODY3",
"avatar_url": "https://avatars.githubusercontent.com/u/39270867?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/belfie13",
"html_url": "https://github.com/belfie13",
"followers_url": "https://api.github.com/users/belfie13/followers",
"following_url": "https://api.github.com/users/belfie13/following{/other_user}",
"gists_url": "https://api.github.com/users/belfie13/gists{/gist_id}",
"starred_url": "https://api.github.com/users/belfie13/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/belfie13/subscriptions",
"organizations_url": "https://api.github.com/users/belfie13/orgs",
"repos_url": "https://api.github.com/users/belfie13/repos",
"events_url": "https://api.github.com/users/belfie13/events{/privacy}",
"received_events_url": "https://api.github.com/users/belfie13/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 2
| 2024-10-18T07:28:14
| 2024-11-07T10:02:07
| 2024-11-07T10:02:06
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When creating a model via a Modelfile I generated from some text, e.g.:
ollama create test -f test.modelfile
I get:
Error: command must be one of "from", "license...
To find where the issue is, I have to cut the contents out and paste them back into the Modelfile line by line.
In the error message, could you add the line number where the issue was found when attempting to create a model from a file?
|
{
"login": "belfie13",
"id": 39270867,
"node_id": "MDQ6VXNlcjM5MjcwODY3",
"avatar_url": "https://avatars.githubusercontent.com/u/39270867?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/belfie13",
"html_url": "https://github.com/belfie13",
"followers_url": "https://api.github.com/users/belfie13/followers",
"following_url": "https://api.github.com/users/belfie13/following{/other_user}",
"gists_url": "https://api.github.com/users/belfie13/gists{/gist_id}",
"starred_url": "https://api.github.com/users/belfie13/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/belfie13/subscriptions",
"organizations_url": "https://api.github.com/users/belfie13/orgs",
"repos_url": "https://api.github.com/users/belfie13/repos",
"events_url": "https://api.github.com/users/belfie13/events{/privacy}",
"received_events_url": "https://api.github.com/users/belfie13/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7251/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7251/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4257
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4257/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4257/comments
|
https://api.github.com/repos/ollama/ollama/issues/4257/events
|
https://github.com/ollama/ollama/issues/4257
| 2,285,471,713
|
I_kwDOJ0Z1Ps6IOYfh
| 4,257
|
Support for InternVL-Chat-V1.5
|
{
"login": "wwjCMP",
"id": 32979859,
"node_id": "MDQ6VXNlcjMyOTc5ODU5",
"avatar_url": "https://avatars.githubusercontent.com/u/32979859?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wwjCMP",
"html_url": "https://github.com/wwjCMP",
"followers_url": "https://api.github.com/users/wwjCMP/followers",
"following_url": "https://api.github.com/users/wwjCMP/following{/other_user}",
"gists_url": "https://api.github.com/users/wwjCMP/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wwjCMP/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wwjCMP/subscriptions",
"organizations_url": "https://api.github.com/users/wwjCMP/orgs",
"repos_url": "https://api.github.com/users/wwjCMP/repos",
"events_url": "https://api.github.com/users/wwjCMP/events{/privacy}",
"received_events_url": "https://api.github.com/users/wwjCMP/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 8
| 2024-05-08T12:24:34
| 2025-01-28T13:32:27
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://huggingface.co/OpenGVLab/InternVL-Chat-V1-5
We introduce InternVL 1.5, an open-source multimodal large language model (MLLM) to bridge the capability gap between open-source and proprietary commercial models in multimodal understanding. We introduce three simple designs:
Strong Vision Encoder: we explored a continuous learning strategy for the large-scale vision foundation model---InternViT-6B, boosting its visual understanding capabilities so that it can be transferred and reused across different LLMs.
Dynamic High-Resolution: we divide images into tiles ranging from 1 to 40 of 448 × 448 pixels according to the aspect ratio and resolution of the input images, which supports up to 4K resolution input.
High-Quality Bilingual Dataset: we carefully collected a high-quality bilingual dataset covering common scenes and document images, annotated with English and Chinese question-answer pairs, significantly enhancing performance in OCR- and Chinese-related tasks.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4257/reactions",
"total_count": 26,
"+1": 26,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4257/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6469
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6469/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6469/comments
|
https://api.github.com/repos/ollama/ollama/issues/6469/events
|
https://github.com/ollama/ollama/pull/6469
| 2,482,118,192
|
PR_kwDOJ0Z1Ps55Ma7I
| 6,469
|
Link Time Optimization - cabelo@opensuse.org
|
{
"login": "cabelo",
"id": 675645,
"node_id": "MDQ6VXNlcjY3NTY0NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/675645?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cabelo",
"html_url": "https://github.com/cabelo",
"followers_url": "https://api.github.com/users/cabelo/followers",
"following_url": "https://api.github.com/users/cabelo/following{/other_user}",
"gists_url": "https://api.github.com/users/cabelo/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cabelo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cabelo/subscriptions",
"organizations_url": "https://api.github.com/users/cabelo/orgs",
"repos_url": "https://api.github.com/users/cabelo/repos",
"events_url": "https://api.github.com/users/cabelo/events{/privacy}",
"received_events_url": "https://api.github.com/users/cabelo/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-08-23T02:30:02
| 2024-09-02T19:39:16
| 2024-09-02T19:39:16
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6469",
"html_url": "https://github.com/ollama/ollama/pull/6469",
"diff_url": "https://github.com/ollama/ollama/pull/6469.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6469.patch",
"merged_at": null
}
|
Tested on Debian, Ubuntu, Fedora, Red Hat, openSUSE, and SUSE: enables Link Time Optimization (LTO) for interprocedural optimization performed at the time of linking application code.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6469/reactions",
"total_count": 2,
"+1": 2,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6469/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5035
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5035/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5035/comments
|
https://api.github.com/repos/ollama/ollama/issues/5035/events
|
https://github.com/ollama/ollama/issues/5035
| 2,352,132,207
|
I_kwDOJ0Z1Ps6MMrBv
| 5,035
|
Ollama not use GPU
|
{
"login": "Mina4ever",
"id": 92083902,
"node_id": "U_kgDOBX0Wvg",
"avatar_url": "https://avatars.githubusercontent.com/u/92083902?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Mina4ever",
"html_url": "https://github.com/Mina4ever",
"followers_url": "https://api.github.com/users/Mina4ever/followers",
"following_url": "https://api.github.com/users/Mina4ever/following{/other_user}",
"gists_url": "https://api.github.com/users/Mina4ever/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Mina4ever/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Mina4ever/subscriptions",
"organizations_url": "https://api.github.com/users/Mina4ever/orgs",
"repos_url": "https://api.github.com/users/Mina4ever/repos",
"events_url": "https://api.github.com/users/Mina4ever/events{/privacy}",
"received_events_url": "https://api.github.com/users/Mina4ever/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 10
| 2024-06-13T21:58:12
| 2024-07-09T15:19:45
| 2024-06-14T17:00:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I am using Ollama; it uses the CPU only and not the GPU, although I installed CUDA v12.5 and cuDNN v9.2.0, and I can confirm that Python uses the GPU in libraries like PyTorch (`>>> print(torch.backends.cudnn.is_available())` returns **True**). I have an Nvidia 1050 Ti and am trying to run the Llama 3 8B model. I found this warning in the Ollama server log: `level=WARN source=gpu.go:177 msg="CPU does not have AVX or AVX2, disabling GPU support."`
### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.1.43
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5035/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5035/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3708
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3708/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3708/comments
|
https://api.github.com/repos/ollama/ollama/issues/3708/events
|
https://github.com/ollama/ollama/pull/3708
| 2,248,998,182
|
PR_kwDOJ0Z1Ps5s9gKv
| 3,708
|
move Ollama static build to its own flag
|
{
"login": "remy415",
"id": 105550370,
"node_id": "U_kgDOBkqSIg",
"avatar_url": "https://avatars.githubusercontent.com/u/105550370?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/remy415",
"html_url": "https://github.com/remy415",
"followers_url": "https://api.github.com/users/remy415/followers",
"following_url": "https://api.github.com/users/remy415/following{/other_user}",
"gists_url": "https://api.github.com/users/remy415/gists{/gist_id}",
"starred_url": "https://api.github.com/users/remy415/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/remy415/subscriptions",
"organizations_url": "https://api.github.com/users/remy415/orgs",
"repos_url": "https://api.github.com/users/remy415/repos",
"events_url": "https://api.github.com/users/remy415/events{/privacy}",
"received_events_url": "https://api.github.com/users/remy415/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-04-17T18:45:17
| 2024-04-18T23:04:12
| 2024-04-18T23:04:12
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3708",
"html_url": "https://github.com/ollama/ollama/pull/3708",
"diff_url": "https://github.com/ollama/ollama/pull/3708.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3708.patch",
"merged_at": "2024-04-18T23:04:12"
}
|
Builds the `static` target by default, allows skipping it, and forces the build if OLLAMA_CPU_TARGET="static".
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3708/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3708/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/894
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/894/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/894/comments
|
https://api.github.com/repos/ollama/ollama/issues/894/events
|
https://github.com/ollama/ollama/pull/894
| 1,959,795,310
|
PR_kwDOJ0Z1Ps5dq1Pj
| 894
|
Linux uninstall instructions
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-24T18:04:51
| 2023-10-24T18:07:06
| 2023-10-24T18:07:05
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/894",
"html_url": "https://github.com/ollama/ollama/pull/894",
"diff_url": "https://github.com/ollama/ollama/pull/894.diff",
"patch_url": "https://github.com/ollama/ollama/pull/894.patch",
"merged_at": "2023-10-24T18:07:05"
}
|
Document how to clean up the standard Linux installation.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/894/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/894/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/934
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/934/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/934/comments
|
https://api.github.com/repos/ollama/ollama/issues/934/events
|
https://github.com/ollama/ollama/pull/934
| 1,965,851,513
|
PR_kwDOJ0Z1Ps5d_ZJe
| 934
|
catch insufficient permissions nvidia err
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-10-27T16:36:50
| 2023-10-27T16:42:41
| 2023-10-27T16:42:40
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/934",
"html_url": "https://github.com/ollama/ollama/pull/934",
"diff_url": "https://github.com/ollama/ollama/pull/934.diff",
"patch_url": "https://github.com/ollama/ollama/pull/934.patch",
"merged_at": "2023-10-27T16:42:40"
}
|
If there is an insufficient-permissions error on `nvidia-smi` execution, it would be logged as a parsing error. Catch the error before this happens.
#932
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/934/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/934/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/972
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/972/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/972/comments
|
https://api.github.com/repos/ollama/ollama/issues/972/events
|
https://github.com/ollama/ollama/pull/972
| 1,974,503,930
|
PR_kwDOJ0Z1Ps5ecmAG
| 972
|
reformat api docs for more examples
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-11-02T15:27:40
| 2023-11-03T14:57:01
| 2023-11-03T14:57:00
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/972",
"html_url": "https://github.com/ollama/ollama/pull/972",
"diff_url": "https://github.com/ollama/ollama/pull/972.diff",
"patch_url": "https://github.com/ollama/ollama/pull/972.patch",
"merged_at": "2023-11-03T14:57:00"
}
|
I'd like to add an example for the raw requests in #952 to the docs, but that requires formatting them in a way that is more friendly to multiple requests/responses. This change moves each request/response under an "examples" header.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/972/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/972/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5245
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5245/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5245/comments
|
https://api.github.com/repos/ollama/ollama/issues/5245/events
|
https://github.com/ollama/ollama/issues/5245
| 2,368,902,095
|
I_kwDOJ0Z1Ps6NMpPP
| 5,245
|
Allow importing multi-file GGUF models
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 27
| 2024-06-23T21:45:41
| 2025-01-30T04:57:12
| null |
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Currently Ollama can [import GGUF files](https://github.com/ollama/ollama/blob/main/docs/import.md). However, larger models are sometimes split into separate files. Ollama should support loading multiple GGUF files, similar to how it loads safetensors files.
### OS
_No response_
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5245/reactions",
"total_count": 19,
"+1": 13,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 6
}
|
https://api.github.com/repos/ollama/ollama/issues/5245/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/4757
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4757/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4757/comments
|
https://api.github.com/repos/ollama/ollama/issues/4757/events
|
https://github.com/ollama/ollama/issues/4757
| 2,328,634,129
|
I_kwDOJ0Z1Ps6KzCMR
| 4,757
|
please add support for AMD RX 580
|
{
"login": "eliabexp",
"id": 74092305,
"node_id": "MDQ6VXNlcjc0MDkyMzA1",
"avatar_url": "https://avatars.githubusercontent.com/u/74092305?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eliabexp",
"html_url": "https://github.com/eliabexp",
"followers_url": "https://api.github.com/users/eliabexp/followers",
"following_url": "https://api.github.com/users/eliabexp/following{/other_user}",
"gists_url": "https://api.github.com/users/eliabexp/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eliabexp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eliabexp/subscriptions",
"organizations_url": "https://api.github.com/users/eliabexp/orgs",
"repos_url": "https://api.github.com/users/eliabexp/repos",
"events_url": "https://api.github.com/users/eliabexp/events{/privacy}",
"received_events_url": "https://api.github.com/users/eliabexp/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-05-31T21:51:22
| 2024-05-31T22:08:24
| 2024-05-31T22:08:24
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Please continue expanding AMD GPU support by adding the RX 580 to the supported GPUs. I think this would make Ollama accessible to more users.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4757/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4757/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3843
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3843/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3843/comments
|
https://api.github.com/repos/ollama/ollama/issues/3843/events
|
https://github.com/ollama/ollama/pull/3843
| 2,258,824,089
|
PR_kwDOJ0Z1Ps5teTZk
| 3,843
|
Correct the kubernetes terminology
|
{
"login": "cloudmelon",
"id": 4621560,
"node_id": "MDQ6VXNlcjQ2MjE1NjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/4621560?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cloudmelon",
"html_url": "https://github.com/cloudmelon",
"followers_url": "https://api.github.com/users/cloudmelon/followers",
"following_url": "https://api.github.com/users/cloudmelon/following{/other_user}",
"gists_url": "https://api.github.com/users/cloudmelon/gists{/gist_id}",
"starred_url": "https://api.github.com/users/cloudmelon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cloudmelon/subscriptions",
"organizations_url": "https://api.github.com/users/cloudmelon/orgs",
"repos_url": "https://api.github.com/users/cloudmelon/repos",
"events_url": "https://api.github.com/users/cloudmelon/events{/privacy}",
"received_events_url": "https://api.github.com/users/cloudmelon/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-23T13:01:41
| 2024-05-07T16:53:09
| 2024-05-07T16:53:09
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3843",
"html_url": "https://github.com/ollama/ollama/pull/3843",
"diff_url": "https://github.com/ollama/ollama/pull/3843.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3843.patch",
"merged_at": "2024-05-07T16:53:09"
}
|
Correct the kubernetes terminology and explain the steps for testing.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3843/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3843/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/539
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/539/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/539/comments
|
https://api.github.com/repos/ollama/ollama/issues/539/events
|
https://github.com/ollama/ollama/issues/539
| 1,899,491,164
|
I_kwDOJ0Z1Ps5xN-9c
| 539
|
docs need updated for langchainjs example
|
{
"login": "dprosper",
"id": 11874942,
"node_id": "MDQ6VXNlcjExODc0OTQy",
"avatar_url": "https://avatars.githubusercontent.com/u/11874942?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dprosper",
"html_url": "https://github.com/dprosper",
"followers_url": "https://api.github.com/users/dprosper/followers",
"following_url": "https://api.github.com/users/dprosper/following{/other_user}",
"gists_url": "https://api.github.com/users/dprosper/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dprosper/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dprosper/subscriptions",
"organizations_url": "https://api.github.com/users/dprosper/orgs",
"repos_url": "https://api.github.com/users/dprosper/repos",
"events_url": "https://api.github.com/users/dprosper/events{/privacy}",
"received_events_url": "https://api.github.com/users/dprosper/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396191,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aXw",
"url": "https://api.github.com/repos/ollama/ollama/labels/documentation",
"name": "documentation",
"color": "0075ca",
"default": true,
"description": "Improvements or additions to documentation"
},
{
"id": 5667396210,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2acg",
"url": "https://api.github.com/repos/ollama/ollama/labels/good%20first%20issue",
"name": "good first issue",
"color": "7057ff",
"default": true,
"description": "Good for newcomers"
}
] |
closed
| false
| null |
[] | null | 3
| 2023-09-16T16:28:16
| 2023-12-24T21:40:42
| 2023-12-24T21:40:42
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
`docs/tutorials/langchainjs.md`
1. Missing an `await` in
```js
const loader = new CheerioWebBaseLoader("https://en.wikipedia.org/wiki/2023_Hawaii_wildfires");
const data = await loader.load();
```
without it, execution reaches `.splitDocuments` without any data.
2. The `cheerio` module needs to be installed (`npm install cheerio`); otherwise the following error is generated:
```
Error [ERR_MODULE_NOT_FOUND]: Cannot find package 'cheerio' imported from /../node_modules/langchain/dist/document_loaders/web/cheerio.js
at new NodeError (node:internal/errors:405:5)
at packageResolve (node:internal/modules/esm/resolve:887:9)
at moduleResolve (node:internal/modules/esm/resolve:936:20)
at defaultResolve (node:internal/modules/esm/resolve:1129:11)
at nextResolve (node:internal/modules/esm/loader:163:28)
at ESMLoader.resolve (node:internal/modules/esm/loader:835:30)
at ESMLoader.getModuleJob (node:internal/modules/esm/loader:424:18)
at ESMLoader.import (node:internal/modules/esm/loader:524:22)
at importModuleDynamically (node:internal/modules/esm/translators:110:35)
at importModuleDynamicallyCallback (node:internal/process/esm_loader:35:14) {
code: 'ERR_MODULE_NOT_FOUND'
}
```
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/539/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/539/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1618
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1618/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1618/comments
|
https://api.github.com/repos/ollama/ollama/issues/1618/events
|
https://github.com/ollama/ollama/issues/1618
| 2,049,515,334
|
I_kwDOJ0Z1Ps56KR9G
| 1,618
|
WSL: Error: timed out waiting for llama runner to start
|
{
"login": "otavio-silva",
"id": 22914610,
"node_id": "MDQ6VXNlcjIyOTE0NjEw",
"avatar_url": "https://avatars.githubusercontent.com/u/22914610?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/otavio-silva",
"html_url": "https://github.com/otavio-silva",
"followers_url": "https://api.github.com/users/otavio-silva/followers",
"following_url": "https://api.github.com/users/otavio-silva/following{/other_user}",
"gists_url": "https://api.github.com/users/otavio-silva/gists{/gist_id}",
"starred_url": "https://api.github.com/users/otavio-silva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/otavio-silva/subscriptions",
"organizations_url": "https://api.github.com/users/otavio-silva/orgs",
"repos_url": "https://api.github.com/users/otavio-silva/repos",
"events_url": "https://api.github.com/users/otavio-silva/events{/privacy}",
"received_events_url": "https://api.github.com/users/otavio-silva/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 10
| 2023-12-19T22:11:12
| 2024-01-27T19:33:01
| 2024-01-27T19:33:01
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
# Description
When trying to run the [dolphin-mixtral](https://ollama.ai/library/dolphin-mixtral) model in a container, I get an `Error: timed out waiting for llama runner to start` response.
# Steps to reproduce
```cmd
> podman run --device nvidia.com/gpu=all --security-opt label=disable --detach --volume .ollama:/root/.ollama --net host --name ollama ollama/ollama
> podman exec -it ollama ollama run dolphin-mixtral
```
# Logs
[ollama.log](https://github.com/jmorganca/ollama/files/13720833/ollama.log)
# Device info
```cmd
Nome do host: GE76RAIDER
Nome do sistema operacional: Microsoft Windows 11 Pro
Versão do sistema operacional: 10.0.22631 N/A compilação 22631
Fabricante do sistema operacional: Microsoft Corporation
Configuração do SO: Estação de trabalho autônoma
Tipo de compilação do sistema operacional: Multiprocessor Free
Proprietário registrado: otavioasilva@hotmail.com
Organização registrada: N/A
Identificação do produto: 00330-80000-00000-AA520
Data da instalação original: 02/08/2023, 14:30:14
Tempo de Inicialização do Sistema: 16/12/2023, 22:35:35
Fabricante do sistema: Micro-Star International Co., Ltd.
Modelo do sistema: Raider GE76 12UHS
Tipo de sistema: x64-based PC
Processador(es): 1 processador(es) instalado(s).
[01]: Intel64 Family 6 Model 154 Stepping 3 GenuineIntel ~2900 Mhz
Versão do BIOS: American Megatrends International, LLC. E17K4IMS.20D, 26/06/2023
Pasta do Windows: C:\WINDOWS
Pasta do sistema: C:\WINDOWS\system32
Inicializar dispositivo: \Device\HarddiskVolume1
Localidade do sistema: pt-br;Português (Brasil)
Localidade de entrada: en-us;Inglês (Estados Unidos)
Fuso horário: (UTC-03:00) Brasília
Memória física total: 65.305 MB
Memória física disponível: 46.483 MB
Memória Virtual: Tamanho Máximo: 75.033 MB
Memória Virtual: Disponível: 49.770 MB
Memória Virtual: Em Uso: 25.263 MB
Local(is) de arquivo de paginação: C:\pagefile.sys
Domínio: WORKGROUP
Servidor de Logon: \\GE76RAIDER
Hotfix(es): 4 hotfix(es) instalado(s).
[01]: KB5032007
[02]: KB5027397
[03]: KB5033375
[04]: KB5032393
Placa(s) de Rede: 3 NIC(s) instalado(s).
[01]: Killer E3100G 2.5 Gigabit Ethernet Controller
Nome da conexão: Ethernet
Status: Mídia desconectada
[02]: Killer(R) Wi-Fi 6E AX1675i 160MHz Wireless Network Adapter (211NGW)
Nome da conexão: Wi-Fi
DHCP ativado: Sim
Servidor DHCP: 192.168.1.1
Endereço(es) IP
[01]: 192.168.1.27
[03]: TAP-Windows Adapter V9
Nome da conexão: TAP-Windows
Status: Mídia desconectada
Requisitos do Hyper-V: Hipervisor detectado. Recursos necessários para o Hyper-V não serão exibidos.
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1618/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
}
|
https://api.github.com/repos/ollama/ollama/issues/1618/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/534
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/534/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/534/comments
|
https://api.github.com/repos/ollama/ollama/issues/534/events
|
https://github.com/ollama/ollama/pull/534
| 1,898,765,299
|
PR_kwDOJ0Z1Ps5adYKQ
| 534
|
linux installer script
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-15T16:42:29
| 2023-09-22T16:01:04
| 2023-09-22T16:01:03
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/534",
"html_url": "https://github.com/ollama/ollama/pull/534",
"diff_url": "https://github.com/ollama/ollama/pull/534.diff",
"patch_url": "https://github.com/ollama/ollama/pull/534.patch",
"merged_at": "2023-09-22T16:01:03"
}
|
Add an install script to the website that downloads the appropriate Linux package, unpacks it, places it in the /usr/local/bin directory, and adds Ollama as a start-up service.
Before our next release we will:
- Do Linux amd64 and aarch64 builds with CUDA enabled.
- Add them to the pre-release of the jmorgan/ollama repo with the names ollama-linux-arm64.tar.gz and ollama-linux-amd64.tar.gz.
- This install script will automatically download the latest version of these files from GitHub releases that are not pre-releases.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/534/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/534/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3654
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3654/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3654/comments
|
https://api.github.com/repos/ollama/ollama/issues/3654/events
|
https://github.com/ollama/ollama/pull/3654
| 2,243,971,528
|
PR_kwDOJ0Z1Ps5ssT55
| 3,654
|
chore: add dependabot
|
{
"login": "hutchic",
"id": 697188,
"node_id": "MDQ6VXNlcjY5NzE4OA==",
"avatar_url": "https://avatars.githubusercontent.com/u/697188?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hutchic",
"html_url": "https://github.com/hutchic",
"followers_url": "https://api.github.com/users/hutchic/followers",
"following_url": "https://api.github.com/users/hutchic/following{/other_user}",
"gists_url": "https://api.github.com/users/hutchic/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hutchic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hutchic/subscriptions",
"organizations_url": "https://api.github.com/users/hutchic/orgs",
"repos_url": "https://api.github.com/users/hutchic/repos",
"events_url": "https://api.github.com/users/hutchic/events{/privacy}",
"received_events_url": "https://api.github.com/users/hutchic/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-04-15T15:26:39
| 2024-11-21T10:05:44
| 2024-11-21T10:05:44
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3654",
"html_url": "https://github.com/ollama/ollama/pull/3654",
"diff_url": "https://github.com/ollama/ollama/pull/3654.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3654.patch",
"merged_at": null
}
|
Related to https://github.com/ollama/ollama/pull/3627, though I don't recall if dependabot will catch a submodule when it's a directory down.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3654/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3654/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4581
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4581/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4581/comments
|
https://api.github.com/repos/ollama/ollama/issues/4581/events
|
https://github.com/ollama/ollama/pull/4581
| 2,311,674,262
|
PR_kwDOJ0Z1Ps5wQ_Y-
| 4,581
|
DO NOT MERGE - testing CI
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-05-22T23:50:21
| 2024-05-22T23:59:27
| 2024-05-22T23:59:25
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4581",
"html_url": "https://github.com/ollama/ollama/pull/4581",
"diff_url": "https://github.com/ollama/ollama/pull/4581.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4581.patch",
"merged_at": null
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4581/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4581/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2897
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2897/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2897/comments
|
https://api.github.com/repos/ollama/ollama/issues/2897/events
|
https://github.com/ollama/ollama/issues/2897
| 2,165,438,352
|
I_kwDOJ0Z1Ps6BEfeQ
| 2,897
|
Windows preview CUDA 5.2 support
|
{
"login": "lyczak",
"id": 4741907,
"node_id": "MDQ6VXNlcjQ3NDE5MDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/4741907?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lyczak",
"html_url": "https://github.com/lyczak",
"followers_url": "https://api.github.com/users/lyczak/followers",
"following_url": "https://api.github.com/users/lyczak/following{/other_user}",
"gists_url": "https://api.github.com/users/lyczak/gists{/gist_id}",
"starred_url": "https://api.github.com/users/lyczak/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lyczak/subscriptions",
"organizations_url": "https://api.github.com/users/lyczak/orgs",
"repos_url": "https://api.github.com/users/lyczak/repos",
"events_url": "https://api.github.com/users/lyczak/events{/privacy}",
"received_events_url": "https://api.github.com/users/lyczak/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6430601766,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf0syJg",
"url": "https://api.github.com/repos/ollama/ollama/labels/nvidia",
"name": "nvidia",
"color": "8CDB00",
"default": false,
"description": "Issues relating to Nvidia GPUs and CUDA"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 4
| 2024-03-03T16:24:17
| 2024-03-21T17:36:20
| 2024-03-21T11:38:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello folks,
I've been trying to get started with the Windows preview version of Ollama, but my GTX 970 is not detected. I've tried updating drivers and updating Windows to no avail. Assuming this is related to the card's old compute capability (5.2), as mentioned in #1865, it should have been fixed by #2116, but I don't know whether that fix has been tested on the Windows preview version of Ollama. Poking around in that PR, the commit that [adds support for compute capabilities 5.0, 7.5, and 8.0](https://github.com/ollama/ollama/pull/2116/commits/a447a083f2169e2a3c975cb5951d8b0b0dcddb04) touches `gen_common.sh`, `gen_linux.sh`, and `gen_windows.ps1` under `llm/generate`, whereas the commit [targeting compute capability 5.2](https://github.com/ollama/ollama/pull/2116/commits/681a91499010be819dd45a1390e668b0817e7338) only touches `gen_linux.sh`. Could this be the source of the issue?
Assuming this is the problem, I was hoping to try WSL, but unfortunately I'm running Windows Server 2019 and can't install WSL2. I may set up a dual boot with Ubuntu later today to see if my GPU is recognized there. I'm more than happy to help with additional testing, although I don't have time to set up the toolchain and build things myself right now. Thanks for helping to maintain this project!
Relevant lines from my server.log are as follows:
```
time=2024-03-02T11:54:38.873-05:00 level=INFO source=gpu.go:94 msg="Detecting GPU type"
time=2024-03-02T11:54:39.015-05:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library nvml.dll"
time=2024-03-02T11:54:45.186-05:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: []"
time=2024-03-02T11:54:45.186-05:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library rocm_smi64.dll"
time=2024-03-02T11:54:45.190-05:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: []"
time=2024-03-02T11:54:45.190-05:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-03-02T11:54:45.190-05:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-03-02T11:54:45.191-05:00 level=INFO source=llm.go:77 msg="GPU not available, falling back to CPU"
```
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2897/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2897/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/68
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/68/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/68/comments
|
https://api.github.com/repos/ollama/ollama/issues/68/events
|
https://github.com/ollama/ollama/issues/68
| 1,799,725,660
|
I_kwDOJ0Z1Ps5rRaJc
| 68
|
add an `/api/tokens` endpoint which returns the number of tokens in a given input
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 0
| 2023-07-11T20:44:24
| 2023-09-10T03:38:06
| 2023-09-10T03:38:06
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
This is useful for determining where to trim an input in the client
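Since the proposed endpoint does not exist yet, the request and response shapes below are purely hypothetical, sketching what a client-side trimming helper might look like. The route name, field names, and model name are all assumptions, not Ollama's actual API.

```python
import json

def build_tokens_request(model: str, prompt: str) -> str:
    """Serialize the body a client might POST to a hypothetical /api/tokens."""
    return json.dumps({"model": model, "prompt": prompt})

def parse_tokens_response(body: str) -> int:
    """Read the token count out of a hypothetical response payload."""
    return json.loads(body)["tokens"]

# A client could compare this count against the model's context window
# to decide where to trim the input before calling /api/generate.
req = build_tokens_request("llama2", "Hello, world!")
count = parse_tokens_response('{"tokens": 5}')
```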
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/68/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/68/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1570
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1570/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1570/comments
|
https://api.github.com/repos/ollama/ollama/issues/1570/events
|
https://github.com/ollama/ollama/pull/1570
| 2,045,221,967
|
PR_kwDOJ0Z1Ps5iMCWn
| 1,570
|
Fix omitempty typo
|
{
"login": "gluonfield",
"id": 5672094,
"node_id": "MDQ6VXNlcjU2NzIwOTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/5672094?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gluonfield",
"html_url": "https://github.com/gluonfield",
"followers_url": "https://api.github.com/users/gluonfield/followers",
"following_url": "https://api.github.com/users/gluonfield/following{/other_user}",
"gists_url": "https://api.github.com/users/gluonfield/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gluonfield/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gluonfield/subscriptions",
"organizations_url": "https://api.github.com/users/gluonfield/orgs",
"repos_url": "https://api.github.com/users/gluonfield/repos",
"events_url": "https://api.github.com/users/gluonfield/events{/privacy}",
"received_events_url": "https://api.github.com/users/gluonfield/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-12-17T13:35:11
| 2023-12-18T22:01:05
| 2023-12-18T22:01:04
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1570",
"html_url": "https://github.com/ollama/ollama/pull/1570",
"diff_url": "https://github.com/ollama/ollama/pull/1570.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1570.patch",
"merged_at": null
}
|
- Removes space typo before omitempty
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1570/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1570/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7095
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7095/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7095/comments
|
https://api.github.com/repos/ollama/ollama/issues/7095/events
|
https://github.com/ollama/ollama/issues/7095
| 2,565,155,539
|
I_kwDOJ0Z1Ps6Y5SrT
| 7,095
|
how to show the tray menu in dev mode?
|
{
"login": "hichemfantar",
"id": 34947993,
"node_id": "MDQ6VXNlcjM0OTQ3OTkz",
"avatar_url": "https://avatars.githubusercontent.com/u/34947993?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hichemfantar",
"html_url": "https://github.com/hichemfantar",
"followers_url": "https://api.github.com/users/hichemfantar/followers",
"following_url": "https://api.github.com/users/hichemfantar/following{/other_user}",
"gists_url": "https://api.github.com/users/hichemfantar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hichemfantar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hichemfantar/subscriptions",
"organizations_url": "https://api.github.com/users/hichemfantar/orgs",
"repos_url": "https://api.github.com/users/hichemfantar/repos",
"events_url": "https://api.github.com/users/hichemfantar/events{/privacy}",
"received_events_url": "https://api.github.com/users/hichemfantar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-10-03T23:34:44
| 2024-10-03T23:41:31
| 2024-10-03T23:41:31
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
I'm trying to work on the tray menu, but it doesn't seem to show up in dev mode.
Is there a special flag or some other step I need to take to get the dev build to register a tray icon?
|
{
"login": "hichemfantar",
"id": 34947993,
"node_id": "MDQ6VXNlcjM0OTQ3OTkz",
"avatar_url": "https://avatars.githubusercontent.com/u/34947993?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hichemfantar",
"html_url": "https://github.com/hichemfantar",
"followers_url": "https://api.github.com/users/hichemfantar/followers",
"following_url": "https://api.github.com/users/hichemfantar/following{/other_user}",
"gists_url": "https://api.github.com/users/hichemfantar/gists{/gist_id}",
"starred_url": "https://api.github.com/users/hichemfantar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hichemfantar/subscriptions",
"organizations_url": "https://api.github.com/users/hichemfantar/orgs",
"repos_url": "https://api.github.com/users/hichemfantar/repos",
"events_url": "https://api.github.com/users/hichemfantar/events{/privacy}",
"received_events_url": "https://api.github.com/users/hichemfantar/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7095/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7095/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3735
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3735/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3735/comments
|
https://api.github.com/repos/ollama/ollama/issues/3735/events
|
https://github.com/ollama/ollama/issues/3735
| 2,251,067,134
|
I_kwDOJ0Z1Ps6GLI7-
| 3,735
|
Can you support llama3?
|
{
"login": "ICLXL",
"id": 30027321,
"node_id": "MDQ6VXNlcjMwMDI3MzIx",
"avatar_url": "https://avatars.githubusercontent.com/u/30027321?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ICLXL",
"html_url": "https://github.com/ICLXL",
"followers_url": "https://api.github.com/users/ICLXL/followers",
"following_url": "https://api.github.com/users/ICLXL/following{/other_user}",
"gists_url": "https://api.github.com/users/ICLXL/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ICLXL/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ICLXL/subscriptions",
"organizations_url": "https://api.github.com/users/ICLXL/orgs",
"repos_url": "https://api.github.com/users/ICLXL/repos",
"events_url": "https://api.github.com/users/ICLXL/events{/privacy}",
"received_events_url": "https://api.github.com/users/ICLXL/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 27
| 2024-04-18T16:06:36
| 2024-04-19T21:40:53
| 2024-04-19T21:40:53
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
https://llama.meta.com/llama3/
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3735/reactions",
"total_count": 20,
"+1": 20,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3735/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4857
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4857/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4857/comments
|
https://api.github.com/repos/ollama/ollama/issues/4857/events
|
https://github.com/ollama/ollama/pull/4857
| 2,338,328,567
|
PR_kwDOJ0Z1Ps5xr3F2
| 4,857
|
feat: initial steps allow image embeddings
|
{
"login": "JoanFM",
"id": 19825685,
"node_id": "MDQ6VXNlcjE5ODI1Njg1",
"avatar_url": "https://avatars.githubusercontent.com/u/19825685?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/JoanFM",
"html_url": "https://github.com/JoanFM",
"followers_url": "https://api.github.com/users/JoanFM/followers",
"following_url": "https://api.github.com/users/JoanFM/following{/other_user}",
"gists_url": "https://api.github.com/users/JoanFM/gists{/gist_id}",
"starred_url": "https://api.github.com/users/JoanFM/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JoanFM/subscriptions",
"organizations_url": "https://api.github.com/users/JoanFM/orgs",
"repos_url": "https://api.github.com/users/JoanFM/repos",
"events_url": "https://api.github.com/users/JoanFM/events{/privacy}",
"received_events_url": "https://api.github.com/users/JoanFM/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 1
| 2024-06-06T13:53:37
| 2024-10-07T19:20:07
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4857",
"html_url": "https://github.com/ollama/ollama/pull/4857",
"diff_url": "https://github.com/ollama/ollama/pull/4857.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4857.patch",
"merged_at": null
}
|
I would like to start a discussion about the possibility of adding images as input to the embedding route.
I guess it would also require the llama.cpp runner to be able to handle this, which I still need to look into.
Also, I would like to know if `batch` embeddings is something you would consider adding.
This PR is only a first step in that direction, doing a small refactoring and adding the image field to the embedding request.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4857/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4857/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5262
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5262/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5262/comments
|
https://api.github.com/repos/ollama/ollama/issues/5262/events
|
https://github.com/ollama/ollama/issues/5262
| 2,371,429,144
|
I_kwDOJ0Z1Ps6NWSMY
| 5,262
|
api 404 403
|
{
"login": "vc815",
"id": 31056384,
"node_id": "MDQ6VXNlcjMxMDU2Mzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/31056384?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vc815",
"html_url": "https://github.com/vc815",
"followers_url": "https://api.github.com/users/vc815/followers",
"following_url": "https://api.github.com/users/vc815/following{/other_user}",
"gists_url": "https://api.github.com/users/vc815/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vc815/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vc815/subscriptions",
"organizations_url": "https://api.github.com/users/vc815/orgs",
"repos_url": "https://api.github.com/users/vc815/repos",
"events_url": "https://api.github.com/users/vc815/events{/privacy}",
"received_events_url": "https://api.github.com/users/vc815/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 0
| 2024-06-25T01:37:16
| 2024-06-25T01:56:02
| 2024-06-25T01:56:02
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
`ollama run sqlcoder` runs successfully, but when testing the API endpoints http://localhost:11434/api/chat and http://localhost:11434/api/generate, the requests fail with 403 and 404 errors. This is on Windows 11, and port 11434 has already been allowed through the firewall. What could be the reason for this?
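One common cause of a 404 on these endpoints is that `/api/generate` and `/api/chat` accept POST requests with a JSON body, so opening the URL in a browser (which issues a GET) fails. As a hedged sketch, a minimal POST could be built like this; the model name and prompt are illustrative:

```python
import json
import urllib.request

def make_generate_request(host: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for /api/generate with a JSON body."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        f"{host}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = make_generate_request("http://localhost:11434", "sqlcoder", "SELECT 1;")
# To actually send it against a running server:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```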
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.45
|
{
"login": "vc815",
"id": 31056384,
"node_id": "MDQ6VXNlcjMxMDU2Mzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/31056384?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vc815",
"html_url": "https://github.com/vc815",
"followers_url": "https://api.github.com/users/vc815/followers",
"following_url": "https://api.github.com/users/vc815/following{/other_user}",
"gists_url": "https://api.github.com/users/vc815/gists{/gist_id}",
"starred_url": "https://api.github.com/users/vc815/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vc815/subscriptions",
"organizations_url": "https://api.github.com/users/vc815/orgs",
"repos_url": "https://api.github.com/users/vc815/repos",
"events_url": "https://api.github.com/users/vc815/events{/privacy}",
"received_events_url": "https://api.github.com/users/vc815/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5262/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5262/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1511
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1511/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1511/comments
|
https://api.github.com/repos/ollama/ollama/issues/1511/events
|
https://github.com/ollama/ollama/issues/1511
| 2,040,462,951
|
I_kwDOJ0Z1Ps55nv5n
| 1,511
|
Submitting an image as the first argument in a prompt to a LLaVA model results in `Unknown command`
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 0
| 2023-12-13T21:02:08
| 2023-12-21T18:21:01
| 2023-12-21T18:21:01
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
```
ollama run llava
>>> /Users/bruce/Downloads/Ollama_christmas_background.png hi
Unknown command '/Users/bruce/Downloads/Ollama_christmas_background.png'. Type /? for help
>>>
```
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1511/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1511/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7212
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7212/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7212/comments
|
https://api.github.com/repos/ollama/ollama/issues/7212/events
|
https://github.com/ollama/ollama/pull/7212
| 2,589,551,127
|
PR_kwDOJ0Z1Ps5-txh5
| 7,212
|
Better support for AMD multi-GPU on linux
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-10-15T18:28:12
| 2024-10-26T21:04:17
| 2024-10-26T21:04:14
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7212",
"html_url": "https://github.com/ollama/ollama/pull/7212",
"diff_url": "https://github.com/ollama/ollama/pull/7212.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7212.patch",
"merged_at": "2024-10-26T21:04:14"
}
|
This resolves a number of problems related to AMD multi-GPU setups on linux.
The numeric IDs used by rocm are not the same as the numeric IDs exposed in sysfs although the ordering is consistent. We have to count up from the first valid gfx (major/minor/patch with non-zero values) we find starting at zero.
There are 3 different env vars for selecting GPUs, and only ROCR_VISIBLE_DEVICES supports UUID based identification, so we should favor that one, and try to use UUIDs if detected to avoid potential ordering bugs with numeric IDs.
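The sysfs-to-rocm ID mapping described above can be sketched as counting up from zero over only the valid entries. This is an illustrative simplification of the idea, not the PR's actual Go code; the gfx version tuples below are made up:

```python
def rocm_ids(sysfs_gfx_versions):
    """Map sysfs index -> rocm numeric ID, skipping invalid (0, 0, 0) entries.

    The ordering is preserved, so valid GPUs get consecutive IDs starting
    at zero even when earlier sysfs slots are not usable GPUs.
    """
    mapping = {}
    next_id = 0
    for sysfs_idx, (major, minor, patch) in enumerate(sysfs_gfx_versions):
        if (major, minor, patch) == (0, 0, 0):
            continue  # not a valid GPU node; rocm does not number it
        mapping[sysfs_idx] = next_id
        next_id += 1
    return mapping

# e.g. an invalid slot at sysfs index 0 shifts the rocm numbering:
print(rocm_ids([(0, 0, 0), (10, 3, 0), (10, 3, 0)]))  # → {1: 0, 2: 1}
```

This illustrates why using raw sysfs indices as rocm IDs breaks on mixed systems, and why UUID-based selection via `ROCR_VISIBLE_DEVICES` is preferred when available.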
Fixes #6595
Fixes #6304
Fixes #6802
Fixes #5143
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7212/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7212/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7566
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7566/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7566/comments
|
https://api.github.com/repos/ollama/ollama/issues/7566/events
|
https://github.com/ollama/ollama/issues/7566
| 2,642,721,062
|
I_kwDOJ0Z1Ps6dhLkm
| 7,566
|
Having trouble with vram using priority
|
{
"login": "morika546",
"id": 187546431,
"node_id": "U_kgDOCy27Pw",
"avatar_url": "https://avatars.githubusercontent.com/u/187546431?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/morika546",
"html_url": "https://github.com/morika546",
"followers_url": "https://api.github.com/users/morika546/followers",
"following_url": "https://api.github.com/users/morika546/following{/other_user}",
"gists_url": "https://api.github.com/users/morika546/gists{/gist_id}",
"starred_url": "https://api.github.com/users/morika546/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/morika546/subscriptions",
"organizations_url": "https://api.github.com/users/morika546/orgs",
"repos_url": "https://api.github.com/users/morika546/repos",
"events_url": "https://api.github.com/users/morika546/events{/privacy}",
"received_events_url": "https://api.github.com/users/morika546/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-11-08T03:22:42
| 2024-11-08T09:19:34
| 2024-11-08T09:19:34
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
My GPU is a 6800 XT with 16 GB of VRAM, and I have 64 GB of RAM. When I run a model, it always uses a lot of shared VRAM even though plenty of dedicated VRAM is still free. For example, when I run a 13B model that needs 12 GB of VRAM, only 4 GB is placed in dedicated VRAM and the remaining 8 GB ends up in shared VRAM. Even though the processor column shows 100% GPU, the model still runs very slowly.
I checked the server log and suspect this line may be related:
time=2024-11-06T22:18:35.707+08:00 level=INFO source=sched.go:185 msg="one or more GPUs detected that are unable to accurately report free memory - disabling default concurrency
I am new to Ollama and have been struggling with this bug for two weeks; I would appreciate any help. Here is the full server log:
[2_server.log](https://github.com/user-attachments/files/17672177/2_server.log)
### OS
Windows
### GPU
AMD
### CPU
AMD
### Ollama version
0.4.0
|
{
"login": "morika546",
"id": 187546431,
"node_id": "U_kgDOCy27Pw",
"avatar_url": "https://avatars.githubusercontent.com/u/187546431?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/morika546",
"html_url": "https://github.com/morika546",
"followers_url": "https://api.github.com/users/morika546/followers",
"following_url": "https://api.github.com/users/morika546/following{/other_user}",
"gists_url": "https://api.github.com/users/morika546/gists{/gist_id}",
"starred_url": "https://api.github.com/users/morika546/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/morika546/subscriptions",
"organizations_url": "https://api.github.com/users/morika546/orgs",
"repos_url": "https://api.github.com/users/morika546/repos",
"events_url": "https://api.github.com/users/morika546/events{/privacy}",
"received_events_url": "https://api.github.com/users/morika546/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7566/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7566/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1134
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1134/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1134/comments
|
https://api.github.com/repos/ollama/ollama/issues/1134/events
|
https://github.com/ollama/ollama/pull/1134
| 1,993,837,800
|
PR_kwDOJ0Z1Ps5feCUJ
| 1,134
|
progress bar
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 9
| 2023-11-15T01:06:58
| 2023-11-17T22:03:36
| 2023-11-17T22:03:35
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1134",
"html_url": "https://github.com/ollama/ollama/pull/1134",
"diff_url": "https://github.com/ollama/ollama/pull/1134.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1134.patch",
"merged_at": "2023-11-17T22:03:35"
}
|
Example:
```
$ ollama pull mistral
pulling manifest (1s)
downloading 6ae280299950 100.0% [=========================================================================================================================================================] (4.1 GB/4.1 GB, 0 B/s, 0s)
downloading 22e1b2e8dc2f 100.0% [=============================================================================================================================================================] (43 B/43 B, 0 B/s, 0s)
downloading e35ab70a78c7 100.0% [=============================================================================================================================================================] (90 B/90 B, 0 B/s, 0s)
downloading 1cb90d66f4d4 100.0% [===========================================================================================================================================================] (381 B/381 B, 0 B/s, 0s)
verifying sha256 digest (2s)
writing manifest (0s)
removing any unused layers (0s)
success (0s)
```
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1134/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1134/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6306
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6306/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6306/comments
|
https://api.github.com/repos/ollama/ollama/issues/6306/events
|
https://github.com/ollama/ollama/issues/6306
| 2,459,393,297
|
I_kwDOJ0Z1Ps6Sl10R
| 6,306
|
Running ollama on island device with no Internet connection
|
{
"login": "whatdhack",
"id": 12969966,
"node_id": "MDQ6VXNlcjEyOTY5OTY2",
"avatar_url": "https://avatars.githubusercontent.com/u/12969966?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/whatdhack",
"html_url": "https://github.com/whatdhack",
"followers_url": "https://api.github.com/users/whatdhack/followers",
"following_url": "https://api.github.com/users/whatdhack/following{/other_user}",
"gists_url": "https://api.github.com/users/whatdhack/gists{/gist_id}",
"starred_url": "https://api.github.com/users/whatdhack/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/whatdhack/subscriptions",
"organizations_url": "https://api.github.com/users/whatdhack/orgs",
"repos_url": "https://api.github.com/users/whatdhack/repos",
"events_url": "https://api.github.com/users/whatdhack/events{/privacy}",
"received_events_url": "https://api.github.com/users/whatdhack/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 11
| 2024-08-11T03:01:43
| 2024-08-11T19:26:47
| 2024-08-11T17:33:49
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Trying to run ollama on island devices with no Internet connection. Getting the following error message.
`Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/llama2/manifests/latest":`
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6306/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6306/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6982
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6982/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6982/comments
|
https://api.github.com/repos/ollama/ollama/issues/6982/events
|
https://github.com/ollama/ollama/issues/6982
| 2,551,230,311
|
I_kwDOJ0Z1Ps6YEK9n
| 6,982
|
Mistral-NeMo-Minitron-8B-Base/Chat
|
{
"login": "Axenide",
"id": 66109459,
"node_id": "MDQ6VXNlcjY2MTA5NDU5",
"avatar_url": "https://avatars.githubusercontent.com/u/66109459?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Axenide",
"html_url": "https://github.com/Axenide",
"followers_url": "https://api.github.com/users/Axenide/followers",
"following_url": "https://api.github.com/users/Axenide/following{/other_user}",
"gists_url": "https://api.github.com/users/Axenide/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Axenide/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Axenide/subscriptions",
"organizations_url": "https://api.github.com/users/Axenide/orgs",
"repos_url": "https://api.github.com/users/Axenide/repos",
"events_url": "https://api.github.com/users/Axenide/events{/privacy}",
"received_events_url": "https://api.github.com/users/Axenide/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-09-26T18:05:35
| 2024-11-17T15:33:13
| 2024-11-17T15:33:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Mistral-NeMo-12B has great capabilities, but it doesn't fit in my GPU, so I have to offload part of it to the CPU and RAM, which makes it really slow. 8B models work great though, so I think this model would be a great addition.
Here is the base model:
https://huggingface.co/nvidia/Mistral-NeMo-Minitron-8B-Base
And here is a fine-tuned chat version:
https://huggingface.co/rasyosef/Mistral-NeMo-Minitron-8B-Chat
Here is the GGUF version of the fine-tune mentioned:
https://huggingface.co/mradermacher/Mistral-NeMo-Minitron-8B-Chat-GGUF
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6982/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6982/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3596
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3596/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3596/comments
|
https://api.github.com/repos/ollama/ollama/issues/3596/events
|
https://github.com/ollama/ollama/pull/3596
| 2,237,770,624
|
PR_kwDOJ0Z1Ps5sXOTG
| 3,596
|
api: fill up API documentation
|
{
"login": "eliben",
"id": 1130906,
"node_id": "MDQ6VXNlcjExMzA5MDY=",
"avatar_url": "https://avatars.githubusercontent.com/u/1130906?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eliben",
"html_url": "https://github.com/eliben",
"followers_url": "https://api.github.com/users/eliben/followers",
"following_url": "https://api.github.com/users/eliben/following{/other_user}",
"gists_url": "https://api.github.com/users/eliben/gists{/gist_id}",
"starred_url": "https://api.github.com/users/eliben/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eliben/subscriptions",
"organizations_url": "https://api.github.com/users/eliben/orgs",
"repos_url": "https://api.github.com/users/eliben/repos",
"events_url": "https://api.github.com/users/eliben/events{/privacy}",
"received_events_url": "https://api.github.com/users/eliben/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2024-04-11T13:47:32
| 2024-05-07T23:27:47
| 2024-05-07T23:27:47
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3596",
"html_url": "https://github.com/ollama/ollama/pull/3596",
"diff_url": "https://github.com/ollama/ollama/pull/3596.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3596.patch",
"merged_at": "2024-05-07T23:27:47"
}
|
Followup for #2878
Now that the documentation is more complete, mention it in the README. Once a new version of ollama is tagged, pkg.go.dev will pick up the documentation comments and display everything on the linked page.
Updates #2840
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3596/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3596/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8222
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8222/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8222/comments
|
https://api.github.com/repos/ollama/ollama/issues/8222/events
|
https://github.com/ollama/ollama/issues/8222
| 2,756,724,761
|
I_kwDOJ0Z1Ps6kUEgZ
| 8,222
|
Change ToolFunction->Parameters to json.RawMessage like in the Format property
|
{
"login": "jerbob92",
"id": 1312921,
"node_id": "MDQ6VXNlcjEzMTI5MjE=",
"avatar_url": "https://avatars.githubusercontent.com/u/1312921?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jerbob92",
"html_url": "https://github.com/jerbob92",
"followers_url": "https://api.github.com/users/jerbob92/followers",
"following_url": "https://api.github.com/users/jerbob92/following{/other_user}",
"gists_url": "https://api.github.com/users/jerbob92/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jerbob92/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jerbob92/subscriptions",
"organizations_url": "https://api.github.com/users/jerbob92/orgs",
"repos_url": "https://api.github.com/users/jerbob92/repos",
"events_url": "https://api.github.com/users/jerbob92/events{/privacy}",
"received_events_url": "https://api.github.com/users/jerbob92/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
|
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "ParthSareen",
"id": 29360864,
"node_id": "MDQ6VXNlcjI5MzYwODY0",
"avatar_url": "https://avatars.githubusercontent.com/u/29360864?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ParthSareen",
"html_url": "https://github.com/ParthSareen",
"followers_url": "https://api.github.com/users/ParthSareen/followers",
"following_url": "https://api.github.com/users/ParthSareen/following{/other_user}",
"gists_url": "https://api.github.com/users/ParthSareen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/ParthSareen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ParthSareen/subscriptions",
"organizations_url": "https://api.github.com/users/ParthSareen/orgs",
"repos_url": "https://api.github.com/users/ParthSareen/repos",
"events_url": "https://api.github.com/users/ParthSareen/events{/privacy}",
"received_events_url": "https://api.github.com/users/ParthSareen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 0
| 2024-12-23T21:15:28
| 2024-12-24T19:22:58
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm trying to use Tools in the `ChatRequest`, but the `Parameters` property in `ToolFunction` does not let me supply my full JSON schema, while the `Format` property does.
I would suggest changing the type of `Parameters` to `json.RawMessage`, just like `Format`.
I'm currently using the `Format` property as a workaround.
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.4
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8222/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8222/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2852
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2852/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2852/comments
|
https://api.github.com/repos/ollama/ollama/issues/2852/events
|
https://github.com/ollama/ollama/issues/2852
| 2,162,517,774
|
I_kwDOJ0Z1Ps6A5WcO
| 2,852
|
Missing example files in examples/python-chat-app.
|
{
"login": "caol64",
"id": 6183265,
"node_id": "MDQ6VXNlcjYxODMyNjU=",
"avatar_url": "https://avatars.githubusercontent.com/u/6183265?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/caol64",
"html_url": "https://github.com/caol64",
"followers_url": "https://api.github.com/users/caol64/followers",
"following_url": "https://api.github.com/users/caol64/following{/other_user}",
"gists_url": "https://api.github.com/users/caol64/gists{/gist_id}",
"starred_url": "https://api.github.com/users/caol64/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/caol64/subscriptions",
"organizations_url": "https://api.github.com/users/caol64/orgs",
"repos_url": "https://api.github.com/users/caol64/repos",
"events_url": "https://api.github.com/users/caol64/events{/privacy}",
"received_events_url": "https://api.github.com/users/caol64/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-03-01T03:24:46
| 2024-03-12T00:17:10
| 2024-03-12T00:17:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null | null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2852/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2852/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4471
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4471/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4471/comments
|
https://api.github.com/repos/ollama/ollama/issues/4471/events
|
https://github.com/ollama/ollama/issues/4471
| 2,299,859,960
|
I_kwDOJ0Z1Ps6JFRP4
| 4,471
|
Warning: client version is different than Ollama version in Linux
|
{
"login": "sohang3112",
"id": 31966963,
"node_id": "MDQ6VXNlcjMxOTY2OTYz",
"avatar_url": "https://avatars.githubusercontent.com/u/31966963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sohang3112",
"html_url": "https://github.com/sohang3112",
"followers_url": "https://api.github.com/users/sohang3112/followers",
"following_url": "https://api.github.com/users/sohang3112/following{/other_user}",
"gists_url": "https://api.github.com/users/sohang3112/gists{/gist_id}",
"starred_url": "https://api.github.com/users/sohang3112/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sohang3112/subscriptions",
"organizations_url": "https://api.github.com/users/sohang3112/orgs",
"repos_url": "https://api.github.com/users/sohang3112/repos",
"events_url": "https://api.github.com/users/sohang3112/events{/privacy}",
"received_events_url": "https://api.github.com/users/sohang3112/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-05-16T09:31:20
| 2025-01-30T08:31:35
| 2024-05-16T19:52:19
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I had previously installed Ollama version 0.1.31 on Linux. After upgrading, Ollama says its server version is different from the client version. How can I fix this so that both are upgraded to 0.1.38?
```console
$ curl -fsSL https://ollama.com/install.sh | sh # command to upgrade ollama
$ ollama --version
ollama version is 0.1.31
Warning: client version is 0.1.38
```
### OS
Linux
### GPU
_No response_
### CPU
_No response_
### Ollama version
_No response_
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4471/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4471/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/138
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/138/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/138/comments
|
https://api.github.com/repos/ollama/ollama/issues/138/events
|
https://github.com/ollama/ollama/issues/138
| 1,814,256,339
|
I_kwDOJ0Z1Ps5sI1rT
| 138
|
Progress spinner not quite right on WSL
|
{
"login": "nathanleclaire",
"id": 1476820,
"node_id": "MDQ6VXNlcjE0NzY4MjA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1476820?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nathanleclaire",
"html_url": "https://github.com/nathanleclaire",
"followers_url": "https://api.github.com/users/nathanleclaire/followers",
"following_url": "https://api.github.com/users/nathanleclaire/following{/other_user}",
"gists_url": "https://api.github.com/users/nathanleclaire/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nathanleclaire/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nathanleclaire/subscriptions",
"organizations_url": "https://api.github.com/users/nathanleclaire/orgs",
"repos_url": "https://api.github.com/users/nathanleclaire/repos",
"events_url": "https://api.github.com/users/nathanleclaire/events{/privacy}",
"received_events_url": "https://api.github.com/users/nathanleclaire/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2023-07-20T15:23:10
| 2023-08-30T21:35:44
| 2023-08-30T21:35:44
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |

|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/138/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/138/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3589
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3589/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3589/comments
|
https://api.github.com/repos/ollama/ollama/issues/3589/events
|
https://github.com/ollama/ollama/pull/3589
| 2,237,107,619
|
PR_kwDOJ0Z1Ps5sU8bI
| 3,589
|
types/model: remove (*Digest).Scan and Digest.Value
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-11T07:37:18
| 2024-04-11T07:37:27
| 2024-04-11T07:37:27
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3589",
"html_url": "https://github.com/ollama/ollama/pull/3589",
"diff_url": "https://github.com/ollama/ollama/pull/3589.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3589.patch",
"merged_at": "2024-04-11T07:37:27"
}
| null |
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3589/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3589/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1337
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1337/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1337/comments
|
https://api.github.com/repos/ollama/ollama/issues/1337/events
|
https://github.com/ollama/ollama/issues/1337
| 2,019,796,865
|
I_kwDOJ0Z1Ps54Y6eB
| 1,337
|
API interface works fine, CLI returns non-descriptive error presumably due to proxy with Docker install
|
{
"login": "mlewis1973",
"id": 2373703,
"node_id": "MDQ6VXNlcjIzNzM3MDM=",
"avatar_url": "https://avatars.githubusercontent.com/u/2373703?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mlewis1973",
"html_url": "https://github.com/mlewis1973",
"followers_url": "https://api.github.com/users/mlewis1973/followers",
"following_url": "https://api.github.com/users/mlewis1973/following{/other_user}",
"gists_url": "https://api.github.com/users/mlewis1973/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mlewis1973/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mlewis1973/subscriptions",
"organizations_url": "https://api.github.com/users/mlewis1973/orgs",
"repos_url": "https://api.github.com/users/mlewis1973/repos",
"events_url": "https://api.github.com/users/mlewis1973/events{/privacy}",
"received_events_url": "https://api.github.com/users/mlewis1973/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 8
| 2023-12-01T00:41:20
| 2024-08-23T21:05:09
| 2024-08-23T21:05:09
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Docker image installed on multiple Linux and Mac systems, both with and without GPUs.
Local proxy settings set in daemon.json as well as passed to docker with -e and --env
API interface works fine, but CLI generates error for
'ollama run llama2'
'ollama list'
'ollama pull mistral'
ollama --version and --help do not generate error.
I even tried building the image with proxy hardcoded with ENVs in the Dockerfile... same error
Outside firewall, CLI works fine.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1337/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1337/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/174
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/174/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/174/comments
|
https://api.github.com/repos/ollama/ollama/issues/174/events
|
https://github.com/ollama/ollama/pull/174
| 1,816,639,351
|
PR_kwDOJ0Z1Ps5WJFmW
| 174
|
allocate a large enough tokens slice
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-22T06:05:38
| 2023-07-24T15:49:52
| 2023-07-24T15:22:51
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/174",
"html_url": "https://github.com/ollama/ollama/pull/174",
"diff_url": "https://github.com/ollama/ollama/pull/174.diff",
"patch_url": "https://github.com/ollama/ollama/pull/174.patch",
"merged_at": "2023-07-24T15:22:51"
}
|
cherry picked from #102
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/174/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/174/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7321
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7321/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7321/comments
|
https://api.github.com/repos/ollama/ollama/issues/7321/events
|
https://github.com/ollama/ollama/issues/7321
| 2,606,044,576
|
I_kwDOJ0Z1Ps6bVRWg
| 7,321
|
Support loading the same model more than once
|
{
"login": "jfwreinhardt",
"id": 185949500,
"node_id": "U_kgDOCxVdPA",
"avatar_url": "https://avatars.githubusercontent.com/u/185949500?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jfwreinhardt",
"html_url": "https://github.com/jfwreinhardt",
"followers_url": "https://api.github.com/users/jfwreinhardt/followers",
"following_url": "https://api.github.com/users/jfwreinhardt/following{/other_user}",
"gists_url": "https://api.github.com/users/jfwreinhardt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jfwreinhardt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jfwreinhardt/subscriptions",
"organizations_url": "https://api.github.com/users/jfwreinhardt/orgs",
"repos_url": "https://api.github.com/users/jfwreinhardt/repos",
"events_url": "https://api.github.com/users/jfwreinhardt/events{/privacy}",
"received_events_url": "https://api.github.com/users/jfwreinhardt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 1
| 2024-10-22T17:20:53
| 2024-10-22T17:36:45
| 2024-10-22T17:36:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Are there any plans to support loading the same model more than once?
On a CUDA based system with multiple GPUs, I have observed that performance decreases for each new concurrent prompt against the same model. To put it another way, we see higher tokens/s for sending a prompt to four different models concurrently than if we send four prompts to the same model concurrently.
It would be helpful from a performance perspective if ollama spawned a new runner when OLLAMA_NUM_PARALLEL was reached, rather than placing all the prompts in a queue to wait on a single runner.
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7321/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7321/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5662
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5662/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5662/comments
|
https://api.github.com/repos/ollama/ollama/issues/5662/events
|
https://github.com/ollama/ollama/pull/5662
| 2,406,659,729
|
PR_kwDOJ0Z1Ps51R2xs
| 5,662
|
fix system prompt
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-07-13T03:22:40
| 2024-07-13T04:04:46
| 2024-07-13T04:04:44
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5662",
"html_url": "https://github.com/ollama/ollama/pull/5662",
"diff_url": "https://github.com/ollama/ollama/pull/5662.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5662.patch",
"merged_at": "2024-07-13T04:04:44"
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5662/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5662/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7764
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7764/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7764/comments
|
https://api.github.com/repos/ollama/ollama/issues/7764/events
|
https://github.com/ollama/ollama/pull/7764
| 2,676,671,410
|
PR_kwDOJ0Z1Ps6CkDRu
| 7,764
|
Fix minor typo in import.md
|
{
"login": "iamrohitanshu",
"id": 85547195,
"node_id": "MDQ6VXNlcjg1NTQ3MTk1",
"avatar_url": "https://avatars.githubusercontent.com/u/85547195?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/iamrohitanshu",
"html_url": "https://github.com/iamrohitanshu",
"followers_url": "https://api.github.com/users/iamrohitanshu/followers",
"following_url": "https://api.github.com/users/iamrohitanshu/following{/other_user}",
"gists_url": "https://api.github.com/users/iamrohitanshu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/iamrohitanshu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iamrohitanshu/subscriptions",
"organizations_url": "https://api.github.com/users/iamrohitanshu/orgs",
"repos_url": "https://api.github.com/users/iamrohitanshu/repos",
"events_url": "https://api.github.com/users/iamrohitanshu/events{/privacy}",
"received_events_url": "https://api.github.com/users/iamrohitanshu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-20T17:36:08
| 2024-11-20T17:57:32
| 2024-11-20T17:57:32
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7764",
"html_url": "https://github.com/ollama/ollama/pull/7764",
"diff_url": "https://github.com/ollama/ollama/pull/7764.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7764.patch",
"merged_at": "2024-11-20T17:57:32"
}
|
changed 'containg' to 'containing'.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7764/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7764/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5386
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5386/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5386/comments
|
https://api.github.com/repos/ollama/ollama/issues/5386/events
|
https://github.com/ollama/ollama/issues/5386
| 2,381,960,350
|
I_kwDOJ0Z1Ps6N-dSe
| 5,386
|
Add environment variable for "read only" mode
|
{
"login": "steren",
"id": 360895,
"node_id": "MDQ6VXNlcjM2MDg5NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/360895?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/steren",
"html_url": "https://github.com/steren",
"followers_url": "https://api.github.com/users/steren/followers",
"following_url": "https://api.github.com/users/steren/following{/other_user}",
"gists_url": "https://api.github.com/users/steren/gists{/gist_id}",
"starred_url": "https://api.github.com/users/steren/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/steren/subscriptions",
"organizations_url": "https://api.github.com/users/steren/orgs",
"repos_url": "https://api.github.com/users/steren/repos",
"events_url": "https://api.github.com/users/steren/events{/privacy}",
"received_events_url": "https://api.github.com/users/steren/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 3
| 2024-06-29T21:07:53
| 2024-06-29T21:43:20
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When running as a server, Ollama currently exposes all Ollama features as [an API](https://github.com/ollama/ollama/blob/main/docs/api.md)
If run as a public API, API maintainers might want to only expose Ollama's generation and model listing capabilities, so that their endpoint is "read only", and not other endpoints that could mutate the server's state (push, pull, delete...)
We could imagine an environment variable like `OLLAMA_READ_ONLY` that would achieve the above.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5386/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5386/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/8663
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8663/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8663/comments
|
https://api.github.com/repos/ollama/ollama/issues/8663/events
|
https://github.com/ollama/ollama/pull/8663
| 2,818,399,336
|
PR_kwDOJ0Z1Ps6JXytP
| 8,663
|
Update README.md Adding DeepSeek to the table of models
|
{
"login": "teymuur",
"id": 64795612,
"node_id": "MDQ6VXNlcjY0Nzk1NjEy",
"avatar_url": "https://avatars.githubusercontent.com/u/64795612?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/teymuur",
"html_url": "https://github.com/teymuur",
"followers_url": "https://api.github.com/users/teymuur/followers",
"following_url": "https://api.github.com/users/teymuur/following{/other_user}",
"gists_url": "https://api.github.com/users/teymuur/gists{/gist_id}",
"starred_url": "https://api.github.com/users/teymuur/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/teymuur/subscriptions",
"organizations_url": "https://api.github.com/users/teymuur/orgs",
"repos_url": "https://api.github.com/users/teymuur/repos",
"events_url": "https://api.github.com/users/teymuur/events{/privacy}",
"received_events_url": "https://api.github.com/users/teymuur/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2025-01-29T14:34:27
| 2025-01-30T05:12:26
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8663",
"html_url": "https://github.com/ollama/ollama/pull/8663",
"diff_url": "https://github.com/ollama/ollama/pull/8663.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8663.patch",
"merged_at": null
}
|
This is just a minor change: I added DeepSeek R1 to the model library table. Only changed `README.md`.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8663/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8663/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8039
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8039/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8039/comments
|
https://api.github.com/repos/ollama/ollama/issues/8039/events
|
https://github.com/ollama/ollama/pull/8039
| 2,731,883,281
|
PR_kwDOJ0Z1Ps6EzP5A
| 8,039
|
win: builtin arm runner
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-12-11T05:43:29
| 2024-12-11T16:32:17
| 2024-12-11T16:32:13
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8039",
"html_url": "https://github.com/ollama/ollama/pull/8039",
"diff_url": "https://github.com/ollama/ollama/pull/8039.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8039.patch",
"merged_at": "2024-12-11T16:32:13"
}
|
The new build embeds the arm runner in the
main binary, so there is no longer a lib/ollama directory
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8039/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8039/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1875
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1875/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1875/comments
|
https://api.github.com/repos/ollama/ollama/issues/1875/events
|
https://github.com/ollama/ollama/pull/1875
| 2,073,103,979
|
PR_kwDOJ0Z1Ps5jng0P
| 1,875
|
Calculate overhead based number of gpu devices
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-01-09T20:24:26
| 2024-01-09T20:53:34
| 2024-01-09T20:53:33
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1875",
"html_url": "https://github.com/ollama/ollama/pull/1875",
"diff_url": "https://github.com/ollama/ollama/pull/1875.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1875.patch",
"merged_at": "2024-01-09T20:53:33"
}
|
The CUDA memory allocated for overhead is placed on a single GPU
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1875/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1875/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4472
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4472/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4472/comments
|
https://api.github.com/repos/ollama/ollama/issues/4472/events
|
https://github.com/ollama/ollama/issues/4472
| 2,299,878,366
|
I_kwDOJ0Z1Ps6JFVve
| 4,472
|
`llama3-chatqa` always returns `Empty Response`
|
{
"login": "pnmartinez",
"id": 29891887,
"node_id": "MDQ6VXNlcjI5ODkxODg3",
"avatar_url": "https://avatars.githubusercontent.com/u/29891887?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pnmartinez",
"html_url": "https://github.com/pnmartinez",
"followers_url": "https://api.github.com/users/pnmartinez/followers",
"following_url": "https://api.github.com/users/pnmartinez/following{/other_user}",
"gists_url": "https://api.github.com/users/pnmartinez/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pnmartinez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pnmartinez/subscriptions",
"organizations_url": "https://api.github.com/users/pnmartinez/orgs",
"repos_url": "https://api.github.com/users/pnmartinez/repos",
"events_url": "https://api.github.com/users/pnmartinez/events{/privacy}",
"received_events_url": "https://api.github.com/users/pnmartinez/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 2
| 2024-05-16T09:39:59
| 2024-05-21T16:56:41
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
# Problem
I've been toying around with RAG using `ollama` and `llama-index`.
The results I am getting with `llama3 8b` are not that good, so I was happy to see `llama3-chatqa` being added in `v0.1.35`.
However, I always get "Empty Response" using `llama3-chatqa`. Is there something I am missing?
## Code
```py
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama
documents = SimpleDirectoryReader("data").load_data()

# Tested both nomic-embed-text and mxbai-embed-large
Settings.embed_model = OllamaEmbedding(model_name="mxbai-embed-large")

# llama3 instead of llama3-chatqa can provide answers - though sometimes incorrect
Settings.llm = Ollama(model="llama3-chatqa", request_timeout=360.0)

index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

query = "..."  # placeholder: any question about the indexed documents
response = query_engine.query(query)
# "Empty Response" always when using llama3-chatqa
```
### OS
Linux, Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.35, 0.1.36, 0.1.38
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4472/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4472/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/717
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/717/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/717/comments
|
https://api.github.com/repos/ollama/ollama/issues/717/events
|
https://github.com/ollama/ollama/issues/717
| 1,930,346,259
|
I_kwDOJ0Z1Ps5zDr8T
| 717
|
Change system model when running as a service
|
{
"login": "wifiuk",
"id": 3785545,
"node_id": "MDQ6VXNlcjM3ODU1NDU=",
"avatar_url": "https://avatars.githubusercontent.com/u/3785545?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wifiuk",
"html_url": "https://github.com/wifiuk",
"followers_url": "https://api.github.com/users/wifiuk/followers",
"following_url": "https://api.github.com/users/wifiuk/following{/other_user}",
"gists_url": "https://api.github.com/users/wifiuk/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wifiuk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wifiuk/subscriptions",
"organizations_url": "https://api.github.com/users/wifiuk/orgs",
"repos_url": "https://api.github.com/users/wifiuk/repos",
"events_url": "https://api.github.com/users/wifiuk/events{/privacy}",
"received_events_url": "https://api.github.com/users/wifiuk/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 4
| 2023-10-06T14:51:44
| 2023-10-06T15:50:37
| 2023-10-06T14:56:36
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
If I was originally messing around with Llama 7B and got it running as a background service, how do I change the model that it uses?
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/717/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/717/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2282
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2282/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2282/comments
|
https://api.github.com/repos/ollama/ollama/issues/2282/events
|
https://github.com/ollama/ollama/issues/2282
| 2,108,819,234
|
I_kwDOJ0Z1Ps59sgci
| 2,282
|
Slow response with concurrent requests
|
{
"login": "oxaronick",
"id": 86964206,
"node_id": "MDQ6VXNlcjg2OTY0MjA2",
"avatar_url": "https://avatars.githubusercontent.com/u/86964206?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/oxaronick",
"html_url": "https://github.com/oxaronick",
"followers_url": "https://api.github.com/users/oxaronick/followers",
"following_url": "https://api.github.com/users/oxaronick/following{/other_user}",
"gists_url": "https://api.github.com/users/oxaronick/gists{/gist_id}",
"starred_url": "https://api.github.com/users/oxaronick/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/oxaronick/subscriptions",
"organizations_url": "https://api.github.com/users/oxaronick/orgs",
"repos_url": "https://api.github.com/users/oxaronick/repos",
"events_url": "https://api.github.com/users/oxaronick/events{/privacy}",
"received_events_url": "https://api.github.com/users/oxaronick/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-01-30T21:31:26
| 2024-03-11T22:34:39
| 2024-03-11T22:32:55
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ollama is great. It makes deploying LLMs easy. However, I have an issue with sending two requests to Ollama within a second or so of each other.
When I do this, Ollama usually responds to one of the requests fine, but CPU usage jumps by at least 100% and the other request doesn't get a response. Sometimes one arrives after many minutes, but I don't always wait around to find out; responses are normally returned within 2 s of a request.
I'm running Ollama on an A100 with 80GB of VRAM and according to `nvidia-smi` Ollama is only using ~7GB.
I would expect it to handle one request, then handle the other, both on the GPU but I'm wondering if the second request is causing Ollama to try to run something on the CPU.
How can I configure Ollama to handle concurrent (or near-concurrent) requests better?
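(For anyone hitting this later: newer Ollama releases expose scheduler environment variables that control this. A minimal sketch, assuming a systemd-free foreground server; the specific values below are illustrative, not recommendations:)

```shell
# Let the server decode up to 4 requests per model concurrently,
# and keep up to 2 models resident at once.
export OLLAMA_NUM_PARALLEL=4
export OLLAMA_MAX_LOADED_MODELS=2
ollama serve
```

Note that each parallel slot multiplies the KV-cache memory the model needs, so higher parallelism trades VRAM for throughput.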
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2282/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2282/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4582
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4582/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4582/comments
|
https://api.github.com/repos/ollama/ollama/issues/4582/events
|
https://github.com/ollama/ollama/issues/4582
| 2,311,815,287
|
I_kwDOJ0Z1Ps6Jy4B3
| 4,582
|
Add Alpaca to 'Community Integrations'
|
{
"login": "Jeffser",
"id": 69224322,
"node_id": "MDQ6VXNlcjY5MjI0MzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/69224322?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jeffser",
"html_url": "https://github.com/Jeffser",
"followers_url": "https://api.github.com/users/Jeffser/followers",
"following_url": "https://api.github.com/users/Jeffser/following{/other_user}",
"gists_url": "https://api.github.com/users/Jeffser/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jeffser/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jeffser/subscriptions",
"organizations_url": "https://api.github.com/users/Jeffser/orgs",
"repos_url": "https://api.github.com/users/Jeffser/repos",
"events_url": "https://api.github.com/users/Jeffser/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jeffser/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-05-23T02:37:48
| 2024-12-25T04:25:14
| 2024-12-25T04:25:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi, I've been working on this app called [Alpaca](https://github.com/Jeffser/Alpaca). It's an Ollama client built with GTK and Adwaita, aimed at GNOME users, and it comes with an integrated Ollama instance.
It's available only on [Flathub](https://flathub.org/apps/com.jeffser.Alpaca) right now.
|
{
"login": "Jeffser",
"id": 69224322,
"node_id": "MDQ6VXNlcjY5MjI0MzIy",
"avatar_url": "https://avatars.githubusercontent.com/u/69224322?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jeffser",
"html_url": "https://github.com/Jeffser",
"followers_url": "https://api.github.com/users/Jeffser/followers",
"following_url": "https://api.github.com/users/Jeffser/following{/other_user}",
"gists_url": "https://api.github.com/users/Jeffser/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Jeffser/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Jeffser/subscriptions",
"organizations_url": "https://api.github.com/users/Jeffser/orgs",
"repos_url": "https://api.github.com/users/Jeffser/repos",
"events_url": "https://api.github.com/users/Jeffser/events{/privacy}",
"received_events_url": "https://api.github.com/users/Jeffser/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4582/reactions",
"total_count": 7,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 7,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4582/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8654
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8654/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8654/comments
|
https://api.github.com/repos/ollama/ollama/issues/8654/events
|
https://github.com/ollama/ollama/issues/8654
| 2,817,986,286
|
I_kwDOJ0Z1Ps6n9w7u
| 8,654
|
Available memory check should be disabled when mmap is in use
|
{
"login": "outis151",
"id": 11805613,
"node_id": "MDQ6VXNlcjExODA1NjEz",
"avatar_url": "https://avatars.githubusercontent.com/u/11805613?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/outis151",
"html_url": "https://github.com/outis151",
"followers_url": "https://api.github.com/users/outis151/followers",
"following_url": "https://api.github.com/users/outis151/following{/other_user}",
"gists_url": "https://api.github.com/users/outis151/gists{/gist_id}",
"starred_url": "https://api.github.com/users/outis151/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/outis151/subscriptions",
"organizations_url": "https://api.github.com/users/outis151/orgs",
"repos_url": "https://api.github.com/users/outis151/repos",
"events_url": "https://api.github.com/users/outis151/events{/privacy}",
"received_events_url": "https://api.github.com/users/outis151/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 1
| 2025-01-29T11:48:38
| 2025-01-29T13:07:03
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
With mmap enabled, a model does not need to fit in the system RAM. Therefore the associated check should be disabled in this case.
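(To illustrate the point: mapping a file reserves address space, not resident memory, so a file larger than free RAM can still be mapped and pages are faulted in lazily. A small self-contained sketch using a sparse temporary file:)

```py
import mmap
import os
import tempfile

# Create a sparse 1 GiB file; it occupies almost no disk space or RAM.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.truncate(1 << 30)
    path = f.name

with open(path, "rb") as f:
    # The mapping succeeds regardless of available RAM: only address
    # space is reserved here, and pages fault in when actually read.
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    assert len(mm) == 1 << 30
    first = mm[0]  # touching one byte faults in a single page
    mm.close()

os.remove(path)
print(first)  # a sparse file reads back as zeros
```

A free-RAM check that assumes the whole mapped file must be resident therefore rejects configurations that would actually work.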
### OS
Linux
### GPU
_No response_
### CPU
Intel
### Ollama version
0.5.7
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8654/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8654/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/593
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/593/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/593/comments
|
https://api.github.com/repos/ollama/ollama/issues/593/events
|
https://github.com/ollama/ollama/pull/593
| 1,912,249,816
|
PR_kwDOJ0Z1Ps5bKh_h
| 593
|
update install.sh
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-09-25T20:40:09
| 2023-09-25T21:09:41
| 2023-09-25T21:09:40
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/593",
"html_url": "https://github.com/ollama/ollama/pull/593",
"diff_url": "https://github.com/ollama/ollama/pull/593.diff",
"patch_url": "https://github.com/ollama/ollama/pull/593.patch",
"merged_at": "2023-09-25T21:09:40"
}
| null |
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/593/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/593/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3069
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3069/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3069/comments
|
https://api.github.com/repos/ollama/ollama/issues/3069/events
|
https://github.com/ollama/ollama/pull/3069
| 2,180,404,073
|
PR_kwDOJ0Z1Ps5pUBQT
| 3,069
|
use `-trimpath` when building releases
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-11T22:56:50
| 2024-03-11T22:58:47
| 2024-03-11T22:58:47
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3069",
"html_url": "https://github.com/ollama/ollama/pull/3069",
"diff_url": "https://github.com/ollama/ollama/pull/3069.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3069.patch",
"merged_at": "2024-03-11T22:58:47"
}
|
Fixes https://github.com/ollama/ollama/issues/2958
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3069/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3069/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4709
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4709/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4709/comments
|
https://api.github.com/repos/ollama/ollama/issues/4709/events
|
https://github.com/ollama/ollama/issues/4709
| 2,324,189,173
|
I_kwDOJ0Z1Ps6KiE_1
| 4,709
|
Code models like codestral should have a lower temperature
|
{
"login": "DuckyBlender",
"id": 42645784,
"node_id": "MDQ6VXNlcjQyNjQ1Nzg0",
"avatar_url": "https://avatars.githubusercontent.com/u/42645784?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DuckyBlender",
"html_url": "https://github.com/DuckyBlender",
"followers_url": "https://api.github.com/users/DuckyBlender/followers",
"following_url": "https://api.github.com/users/DuckyBlender/following{/other_user}",
"gists_url": "https://api.github.com/users/DuckyBlender/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DuckyBlender/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DuckyBlender/subscriptions",
"organizations_url": "https://api.github.com/users/DuckyBlender/orgs",
"repos_url": "https://api.github.com/users/DuckyBlender/repos",
"events_url": "https://api.github.com/users/DuckyBlender/events{/privacy}",
"received_events_url": "https://api.github.com/users/DuckyBlender/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396200,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aaA",
"url": "https://api.github.com/repos/ollama/ollama/labels/feature%20request",
"name": "feature request",
"color": "a2eeef",
"default": false,
"description": "New feature or request"
}
] |
open
| false
| null |
[] | null | 6
| 2024-05-29T20:25:45
| 2024-07-03T12:44:45
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
A lower default temperature makes code models more deterministic, so the generated code is more likely to be correct.
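In the meantime, a workaround is to derive a lower-temperature variant via a Modelfile (a sketch; the model name and temperature value are illustrative):

```
FROM codestral
PARAMETER temperature 0.2
```

Build it with `ollama create codestral-low-temp -f Modelfile` and run that variant instead of the default.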
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4709/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4709/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2543
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2543/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2543/comments
|
https://api.github.com/repos/ollama/ollama/issues/2543/events
|
https://github.com/ollama/ollama/issues/2543
| 2,139,017,719
|
I_kwDOJ0Z1Ps5_ftH3
| 2,543
|
Ollama crashes on Llava on windows after passing image path OOM
|
{
"login": "jkfnc",
"id": 56741357,
"node_id": "MDQ6VXNlcjU2NzQxMzU3",
"avatar_url": "https://avatars.githubusercontent.com/u/56741357?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jkfnc",
"html_url": "https://github.com/jkfnc",
"followers_url": "https://api.github.com/users/jkfnc/followers",
"following_url": "https://api.github.com/users/jkfnc/following{/other_user}",
"gists_url": "https://api.github.com/users/jkfnc/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jkfnc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jkfnc/subscriptions",
"organizations_url": "https://api.github.com/users/jkfnc/orgs",
"repos_url": "https://api.github.com/users/jkfnc/repos",
"events_url": "https://api.github.com/users/jkfnc/events{/privacy}",
"received_events_url": "https://api.github.com/users/jkfnc/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-02-16T17:04:10
| 2024-05-10T01:10:13
| 2024-05-10T01:10:13
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Ollama crashes (out of memory) when llava is given this prompt on Windows:
What's in this image? C:\Users\test\Downloads\pexels-oleksandr-p-321552.jpg
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2543/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2543/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6249
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6249/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6249/comments
|
https://api.github.com/repos/ollama/ollama/issues/6249/events
|
https://github.com/ollama/ollama/issues/6249
| 2,454,650,191
|
I_kwDOJ0Z1Ps6STv1P
| 6,249
|
ollama run llama3.1 command outputs nonsense
|
{
"login": "erfan-khalaji",
"id": 54494671,
"node_id": "MDQ6VXNlcjU0NDk0Njcx",
"avatar_url": "https://avatars.githubusercontent.com/u/54494671?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/erfan-khalaji",
"html_url": "https://github.com/erfan-khalaji",
"followers_url": "https://api.github.com/users/erfan-khalaji/followers",
"following_url": "https://api.github.com/users/erfan-khalaji/following{/other_user}",
"gists_url": "https://api.github.com/users/erfan-khalaji/gists{/gist_id}",
"starred_url": "https://api.github.com/users/erfan-khalaji/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/erfan-khalaji/subscriptions",
"organizations_url": "https://api.github.com/users/erfan-khalaji/orgs",
"repos_url": "https://api.github.com/users/erfan-khalaji/repos",
"events_url": "https://api.github.com/users/erfan-khalaji/events{/privacy}",
"received_events_url": "https://api.github.com/users/erfan-khalaji/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 4
| 2024-08-08T01:20:42
| 2024-08-09T05:30:35
| 2024-08-08T22:15:41
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
After installing Ollama on macOS, I attempted to run the model using the `ollama run llama3.1` command. However, when I tried running the model by inputting "hello," it returned what appeared to be random ASCII characters, which didn't make sense. I then used `ollama pull llama2` and `ollama pull llama3` to see if that would resolve the issue. While `ollama run llama3.1` still resulted in nonsensical output, `ollama run llama3` and `ollama run llama2` worked perfectly. I thought I would share my experience in case it helps someone facing a similar issue.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
3.1
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6249/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6249/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7000
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7000/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7000/comments
|
https://api.github.com/repos/ollama/ollama/issues/7000/events
|
https://github.com/ollama/ollama/issues/7000
| 2,552,479,448
|
I_kwDOJ0Z1Ps6YI77Y
| 7,000
|
Respect the Access-Control-Allow-Private-Network in Chrome
|
{
"login": "PaulKinlan",
"id": 45510,
"node_id": "MDQ6VXNlcjQ1NTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/45510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/PaulKinlan",
"html_url": "https://github.com/PaulKinlan",
"followers_url": "https://api.github.com/users/PaulKinlan/followers",
"following_url": "https://api.github.com/users/PaulKinlan/following{/other_user}",
"gists_url": "https://api.github.com/users/PaulKinlan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/PaulKinlan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PaulKinlan/subscriptions",
"organizations_url": "https://api.github.com/users/PaulKinlan/orgs",
"repos_url": "https://api.github.com/users/PaulKinlan/repos",
"events_url": "https://api.github.com/users/PaulKinlan/events{/privacy}",
"received_events_url": "https://api.github.com/users/PaulKinlan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2024-09-27T09:40:32
| 2024-09-27T09:40:32
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm testing ollama from an environment hosted on repl.it running in the browser. I have a local version of ollama running with the `OLLAMA_HOST=*, https://48a38c67-3eda-41cf-804b-e04fba963d55-00-14tthqngapcgy.worf.replit.dev` (other variations result in the same error).
It looks like the new `Access-Control-Allow-Private-Network` is starting to be enforced when accessing the ollama server from a non-localhost origin.
Chrome: 129.0.6668.58 - Works
Chrome: 131.0.6742.0 - Fails with the error `Access to fetch at 'http://127.0.0.1:11434/api/generate' from origin 'https://48a38c67-3eda-41cf-804b-e04fba963d55-00-14tthqngapcgy.worf.replit.dev' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Private-Network' header was present in the preflight response for this private network request targeting the `local` address space.`
It looks like somewhere between these two versions, we (Chrome) started to enforce requiring the response from locally hosted ollama servers to require `Access-Control-Allow-Private-Network: true` HTTP Header.
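For reference, these are the response headers a Chrome preflight for a private-network request expects. This is only an illustration of the expected header set (the origin is hypothetical), not a description of ollama's current behavior:

```python
# Headers a preflight (OPTIONS) response would need for a private-network request,
# per Chrome's Private Network Access enforcement described above.
preflight_headers = {
    "Access-Control-Allow-Origin": "https://example.replit.dev",  # hypothetical origin
    "Access-Control-Allow-Methods": "POST, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type",
    "Access-Control-Allow-Private-Network": "true",  # the header Chrome now requires
}
print(preflight_headers["Access-Control-Allow-Private-Network"])
```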
### OS
Linux, macOS, Windows
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.12
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7000/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 2
}
|
https://api.github.com/repos/ollama/ollama/issues/7000/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/7760
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7760/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7760/comments
|
https://api.github.com/repos/ollama/ollama/issues/7760/events
|
https://github.com/ollama/ollama/issues/7760
| 2,675,873,873
|
I_kwDOJ0Z1Ps6ffphR
| 7,760
|
qwen2.5-coder isn't utilizing the GPU
|
{
"login": "Novido",
"id": 4237670,
"node_id": "MDQ6VXNlcjQyMzc2NzA=",
"avatar_url": "https://avatars.githubusercontent.com/u/4237670?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Novido",
"html_url": "https://github.com/Novido",
"followers_url": "https://api.github.com/users/Novido/followers",
"following_url": "https://api.github.com/users/Novido/following{/other_user}",
"gists_url": "https://api.github.com/users/Novido/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Novido/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Novido/subscriptions",
"organizations_url": "https://api.github.com/users/Novido/orgs",
"repos_url": "https://api.github.com/users/Novido/repos",
"events_url": "https://api.github.com/users/Novido/events{/privacy}",
"received_events_url": "https://api.github.com/users/Novido/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-11-20T13:22:24
| 2024-12-14T16:41:12
| 2024-12-14T16:41:12
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When running a query with qwen2.5 (32b) it only uses the CPU for some reason. I can switch to another model (llama, phi, gemma) and they all utilize the GPU.
Reproduce:
1. Run Docker in an Ubuntu container on a standalone server
2. Install Ollama and Open-Webui
3. Download models qwen2.5-coder:32b and another model like llama3.2
4. Run a query on llama3.2 and use nvtop, where you have ollama installed, to see GPU usage
5. Run a query on qwen2.5 and use nvtop to see GPU usage. It should show usage similar to step 4, but it doesn't.
### OS
Linux, Docker
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.4.1
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7760/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7760/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6887
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6887/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6887/comments
|
https://api.github.com/repos/ollama/ollama/issues/6887/events
|
https://github.com/ollama/ollama/issues/6887
| 2,537,641,050
|
I_kwDOJ0Z1Ps6XQVRa
| 6,887
|
`temperature` for reader-lm should be 0
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
open
| false
| null |
[] | null | 3
| 2024-09-20T01:29:27
| 2024-10-05T21:40:15
| null |
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
[reader-lm](https://ollama.com/library/reader-lm) converts HTML to Markdown but with the default temperature, it hallucinates content: https://github.com/ollama/ollama/issues/6875. Setting `temperature` to zero appears to resolve this. This would be nice to have in the model config in the ollama library.
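Until it lands in the library config, this can be pinned locally; a minimal sketch using standard Modelfile syntax (the derived model name is up to you):

```
# Hypothetical Modelfile: derive a reader-lm variant with temperature pinned to 0
FROM reader-lm
PARAMETER temperature 0
```

Then, for example, `ollama create reader-lm-det -f Modelfile`.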
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6887/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6887/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/5284
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5284/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5284/comments
|
https://api.github.com/repos/ollama/ollama/issues/5284/events
|
https://github.com/ollama/ollama/pull/5284
| 2,373,688,590
|
PR_kwDOJ0Z1Ps5zjQIb
| 5,284
|
tools
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 11
| 2024-06-25T21:26:54
| 2024-07-25T09:50:57
| 2024-07-16T01:03:38
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/5284",
"html_url": "https://github.com/ollama/ollama/pull/5284",
"diff_url": "https://github.com/ollama/ollama/pull/5284.diff",
"patch_url": "https://github.com/ollama/ollama/pull/5284.patch",
"merged_at": "2024-07-16T01:03:38"
}
|
```
curl -s 127.0.0.1:11434/api/chat -d '{
"model": "mike/mistral",
"messages": [
{
"role": "user",
"content": "What's the weather like today in Paris?"
},
{
"role": "assistant",
"tool_calls": [
{
"id": "89a1e453-0bce-4de3-a456-c54bed09c520",
"type": "function",
"function": {
"name": "get_current_weather",
"arguments": {
"location": "Paris, France",
"format": "celsius"
}
}
}
]
},
{
"role": "tool",
"tool_call_id": "89a1e453-0bce-4de3-a456-c54bed09c520",
"content": "22"
},
{
"role": "assistant",
"content": "The weather in Paris is 22 degrees celsius."
},
{
"role": "user",
"content": "What's the weather like today in San Francisco and Toronto?"
}
],
"tools": [
{
"type": "function",
"function": {
"name": "get_current_weather",
"description": "Get the current weather",
"parameters": {
"type": "object",
"properties": {
"location": {
"type": "string",
"description": "The city and state, e.g. San Francisco, CA"
},
"format": {
"type": "string",
"enum": [
"celsius",
"fahrenheit"
],
"description": "The temperature unit to use. Infer this from the users location."
}
},
"required": [
"location",
"format"
]
}
}
}
],
"stream": false,
"options": {
"temperature": 0
}
}'
```
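The same request body can be assembled programmatically, which sidesteps shell-quoting pitfalls entirely; a minimal sketch mirroring the payload above (not sent over the network here):

```python
import json

# Build the same /api/chat payload shown in the curl example above.
payload = {
    "model": "mike/mistral",
    "messages": [{"role": "user", "content": "What's the weather like today in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "The city and state, e.g. San Francisco, CA",
                        },
                        "format": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                    },
                    "required": ["location", "format"],
                },
            },
        }
    ],
    "stream": False,
    "options": {"temperature": 0},
}
# Round-trip through JSON to confirm the structure serializes cleanly.
roundtrip = json.loads(json.dumps(payload))
print(roundtrip["tools"][0]["function"]["name"])
```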
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5284/reactions",
"total_count": 59,
"+1": 30,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 18,
"rocket": 11,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5284/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3793
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3793/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3793/comments
|
https://api.github.com/repos/ollama/ollama/issues/3793/events
|
https://github.com/ollama/ollama/issues/3793
| 2,254,948,247
|
I_kwDOJ0Z1Ps6GZ8eX
| 3,793
|
Can't get a correct response via the API when the content has Chinese words
|
{
"login": "wei-z-git",
"id": 32572815,
"node_id": "MDQ6VXNlcjMyNTcyODE1",
"avatar_url": "https://avatars.githubusercontent.com/u/32572815?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/wei-z-git",
"html_url": "https://github.com/wei-z-git",
"followers_url": "https://api.github.com/users/wei-z-git/followers",
"following_url": "https://api.github.com/users/wei-z-git/following{/other_user}",
"gists_url": "https://api.github.com/users/wei-z-git/gists{/gist_id}",
"starred_url": "https://api.github.com/users/wei-z-git/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wei-z-git/subscriptions",
"organizations_url": "https://api.github.com/users/wei-z-git/orgs",
"repos_url": "https://api.github.com/users/wei-z-git/repos",
"events_url": "https://api.github.com/users/wei-z-git/events{/privacy}",
"received_events_url": "https://api.github.com/users/wei-z-git/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-04-21T07:43:39
| 2024-06-04T22:33:47
| 2024-06-04T22:33:47
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
When I try to get a response with Chinese words using the API, for example:
## Use API
### In Chinese
Request
```shell
$ curl http://localhost:11434/api/chat -d '{ "model": "llama3", "messages":[{"role":"user","content": "为什么天空是蓝色的"}] ,"stream": false}'
```
It seems llama3 didn't get my point.
Response:
```
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 623 100 520 100 103 352 69 0:00:01 0:00:01 --:--:-- 422{"model":"llama3","created_at":"2024-04-21T07:36:14.7121197Z","message":{"role":"assistant","content":"It looks like you're having some fun with the characters! Unfortunately, I'm a language model, I don't understand what's going on with all those symbols. Can you please rephrase or ask me a question in plain English? I'd be happy to help if I can! 😊"},"done":true,"total_duration":1260881000,"load_duration":2802800,"prompt_eval_count":15,"prompt_eval_duration":361040000,"eval_count":60,"eval_duration":894948000}
```
### In English
And if I ask the same question in English, it can understand:
```
curl http://localhost:11434/api/chat -d '{ "model": "llama3", "messages":[{"role":"user","content": "why the sky is blue"}] ,"stream": false}'
```
Res:
```shell
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 2211 0 2107 100 104 342 16 0:00:06 0:00:06 --:--:-- 537{"model":"llama3","created_at":"2024-04-21T07:34:50.999231Z","message":{"role":"assistant","content":"The sky appears blue to our eyes because of a fascinating phenomenon called scattering. Here's why:\n\n1. **Light from the sun**: When sunlight enters Earth's atmosphere, it consists of all the colors of the visible spectrum (red, orange, yellow, green, blue, indigo, and violet).\n2. **Atmospheric particles**: The atmosphere is filled with tiny molecules of gases like nitrogen (N2) and oxygen (O2), as well as aerosols like dust, water vapor, and pollutants.\n3. **Scattering**: When light from the sun encounters these atmospheric particles, it scatters in all directions. This scattering occurs because the particles are much smaller than the wavelength of light.\n4. **Blue light dominates**: The shorter wavelengths of blue light (around 450-495 nanometers) scatter more efficiently than longer wavelengths like red and orange light. This is because the smaller particles are better at interacting with the shorter, higher-energy blue photons.\n5. 
**Our eyes perceive the sky**: As we look up at the sky, our eyes detect the scattered blue light, which appears to us as the color blue.\n\nSome interesting facts about why the sky appears blue:\n\n* The exact shade of blue can vary depending on atmospheric conditions like pollution levels and water vapor content.\n* During sunrise and sunset, when the sun's rays have to travel longer distances through the atmosphere, the scattering effect is more pronounced, making the sky appear more red or orange due to the dominance of longer wavelengths.\n* In space, without an atmosphere, the sky would appear black because there are no particles to scatter light.\n\nSo, to summarize: The sky appears blue because the shorter wavelengths of blue light scatter more efficiently than other colors when interacting with atmospheric particles, making it the most visible color to our eyes."},"done":true,"total_duration":5936881200,"load_duration":2144400,"prompt_eval_count":10,"prompt_eval_duration":365781000,"eval_count":363,"eval_duration":5566471000}
```
## Use terminal
But with the same model, the terminal works even when I use Chinese:

### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.32
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3793/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3793/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8010
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8010/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8010/comments
|
https://api.github.com/repos/ollama/ollama/issues/8010/events
|
https://github.com/ollama/ollama/issues/8010
| 2,726,545,757
|
I_kwDOJ0Z1Ps6ig8ld
| 8,010
|
Llama 3.3 still has metadata from Llama 3.1
|
{
"login": "SamuelHafner",
"id": 45936995,
"node_id": "MDQ6VXNlcjQ1OTM2OTk1",
"avatar_url": "https://avatars.githubusercontent.com/u/45936995?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SamuelHafner",
"html_url": "https://github.com/SamuelHafner",
"followers_url": "https://api.github.com/users/SamuelHafner/followers",
"following_url": "https://api.github.com/users/SamuelHafner/following{/other_user}",
"gists_url": "https://api.github.com/users/SamuelHafner/gists{/gist_id}",
"starred_url": "https://api.github.com/users/SamuelHafner/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/SamuelHafner/subscriptions",
"organizations_url": "https://api.github.com/users/SamuelHafner/orgs",
"repos_url": "https://api.github.com/users/SamuelHafner/repos",
"events_url": "https://api.github.com/users/SamuelHafner/events{/privacy}",
"received_events_url": "https://api.github.com/users/SamuelHafner/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 2
| 2024-12-09T09:55:59
| 2024-12-11T09:33:58
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hello,
The Llama 3.3 model still has metadata from Llama 3.1.

By the way, what type of quantization are you using?
Thank you
### OS
Linux
### GPU
_No response_
### CPU
_No response_
### Ollama version
0.5.1
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8010/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8010/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/2889
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2889/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2889/comments
|
https://api.github.com/repos/ollama/ollama/issues/2889/events
|
https://github.com/ollama/ollama/issues/2889
| 2,165,127,969
|
I_kwDOJ0Z1Ps6BDTsh
| 2,889
|
Windows CUDA OOM GTX 1650 switching models between mistral and gemma
|
{
"login": "qianjun1985",
"id": 65411571,
"node_id": "MDQ6VXNlcjY1NDExNTcx",
"avatar_url": "https://avatars.githubusercontent.com/u/65411571?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/qianjun1985",
"html_url": "https://github.com/qianjun1985",
"followers_url": "https://api.github.com/users/qianjun1985/followers",
"following_url": "https://api.github.com/users/qianjun1985/following{/other_user}",
"gists_url": "https://api.github.com/users/qianjun1985/gists{/gist_id}",
"starred_url": "https://api.github.com/users/qianjun1985/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/qianjun1985/subscriptions",
"organizations_url": "https://api.github.com/users/qianjun1985/orgs",
"repos_url": "https://api.github.com/users/qianjun1985/repos",
"events_url": "https://api.github.com/users/qianjun1985/events{/privacy}",
"received_events_url": "https://api.github.com/users/qianjun1985/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 6
| 2024-03-03T03:53:31
| 2024-05-18T03:04:45
| 2024-05-18T03:04:44
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
When I use an AI translator program that loads local LLMs through Ollama, it at first worked well with one model (mistral), but after I downloaded another, gemma 7b, both models failed to work. The translator's UI shows the following error:
Failed to call API: error sending request for url (http://127.0.0.1:11434/v1/chat/completions): error trying to connect, TCP connect error or Error 10061
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2889/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2889/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4534
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4534/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4534/comments
|
https://api.github.com/repos/ollama/ollama/issues/4534/events
|
https://github.com/ollama/ollama/issues/4534
| 2,305,150,640
|
I_kwDOJ0Z1Ps6JZc6w
| 4,534
|
Ctrl+Backspace doesn't delete full words in `ollama run ...` mode
|
{
"login": "DeflateAwning",
"id": 11021263,
"node_id": "MDQ6VXNlcjExMDIxMjYz",
"avatar_url": "https://avatars.githubusercontent.com/u/11021263?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DeflateAwning",
"html_url": "https://github.com/DeflateAwning",
"followers_url": "https://api.github.com/users/DeflateAwning/followers",
"following_url": "https://api.github.com/users/DeflateAwning/following{/other_user}",
"gists_url": "https://api.github.com/users/DeflateAwning/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DeflateAwning/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DeflateAwning/subscriptions",
"organizations_url": "https://api.github.com/users/DeflateAwning/orgs",
"repos_url": "https://api.github.com/users/DeflateAwning/repos",
"events_url": "https://api.github.com/users/DeflateAwning/events{/privacy}",
"received_events_url": "https://api.github.com/users/DeflateAwning/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 7
| 2024-05-20T05:53:17
| 2024-05-22T23:40:24
| 2024-05-22T05:49:37
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
In `ollama run <model>` mode, pressing Ctrl+Backspace should delete a whole word backwards.
To implement this, listen for Ctrl+H (byte 0x08), since that is what most terminals send for Ctrl+Backspace.
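The actual fix would live in Ollama's Go readline code; the buffer manipulation itself is simple, though. Here is a language-neutral sketch of the backward-kill-word logic (shown in Python; the function name and signature are illustrative, not Ollama's API):

```python
def delete_word_backward(buf: str, cursor: int) -> tuple[str, int]:
    """Delete the word immediately left of the cursor, like readline's
    backward-kill-word. Returns the new buffer and cursor position."""
    i = cursor
    # Skip any whitespace directly left of the cursor...
    while i > 0 and buf[i - 1].isspace():
        i -= 1
    # ...then consume the word characters themselves.
    while i > 0 and not buf[i - 1].isspace():
        i -= 1
    return buf[:i] + buf[cursor:], i

# Example: cursor at the end of "tell me a story"
print(delete_word_backward("tell me a story", 15))  # ('tell me a ', 10)
```

On a Ctrl+H event, the line editor would replace its buffer and cursor with the returned values and redraw the line.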
### OS
Linux
### GPU
_No response_
### CPU
_No response_
### Ollama version
0.1.38
|
{
"login": "joshyan1",
"id": 76125168,
"node_id": "MDQ6VXNlcjc2MTI1MTY4",
"avatar_url": "https://avatars.githubusercontent.com/u/76125168?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/joshyan1",
"html_url": "https://github.com/joshyan1",
"followers_url": "https://api.github.com/users/joshyan1/followers",
"following_url": "https://api.github.com/users/joshyan1/following{/other_user}",
"gists_url": "https://api.github.com/users/joshyan1/gists{/gist_id}",
"starred_url": "https://api.github.com/users/joshyan1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joshyan1/subscriptions",
"organizations_url": "https://api.github.com/users/joshyan1/orgs",
"repos_url": "https://api.github.com/users/joshyan1/repos",
"events_url": "https://api.github.com/users/joshyan1/events{/privacy}",
"received_events_url": "https://api.github.com/users/joshyan1/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4534/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4534/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3049
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3049/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3049/comments
|
https://api.github.com/repos/ollama/ollama/issues/3049/events
|
https://github.com/ollama/ollama/pull/3049
| 2,177,924,627
|
PR_kwDOJ0Z1Ps5pLde3
| 3,049
|
Disable execstack for amd libraries
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2024-03-10T22:02:37
| 2024-03-11T16:22:00
| 2024-03-11T16:21:59
|
MEMBER
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3049",
"html_url": "https://github.com/ollama/ollama/pull/3049",
"diff_url": "https://github.com/ollama/ollama/pull/3049.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3049.patch",
"merged_at": null
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3049/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3049/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/6289
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6289/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6289/comments
|
https://api.github.com/repos/ollama/ollama/issues/6289/events
|
https://github.com/ollama/ollama/issues/6289
| 2,458,446,849
|
I_kwDOJ0Z1Ps6SiOwB
| 6,289
|
some models crash on rocm (7900XT)
|
{
"login": "markg85",
"id": 49061,
"node_id": "MDQ6VXNlcjQ5MDYx",
"avatar_url": "https://avatars.githubusercontent.com/u/49061?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/markg85",
"html_url": "https://github.com/markg85",
"followers_url": "https://api.github.com/users/markg85/followers",
"following_url": "https://api.github.com/users/markg85/following{/other_user}",
"gists_url": "https://api.github.com/users/markg85/gists{/gist_id}",
"starred_url": "https://api.github.com/users/markg85/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/markg85/subscriptions",
"organizations_url": "https://api.github.com/users/markg85/orgs",
"repos_url": "https://api.github.com/users/markg85/repos",
"events_url": "https://api.github.com/users/markg85/events{/privacy}",
"received_events_url": "https://api.github.com/users/markg85/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 5755339642,
"node_id": "LA_kwDOJ0Z1Ps8AAAABVwuDeg",
"url": "https://api.github.com/repos/ollama/ollama/labels/linux",
"name": "linux",
"color": "516E70",
"default": false,
"description": ""
},
{
"id": 6433346500,
"node_id": "LA_kwDOJ0Z1Ps8AAAABf3UTxA",
"url": "https://api.github.com/repos/ollama/ollama/labels/amd",
"name": "amd",
"color": "000000",
"default": false,
"description": "Issues relating to AMD GPUs and ROCm"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 12
| 2024-08-09T18:21:38
| 2024-10-23T13:23:28
| 2024-10-23T13:23:27
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I was trying to run the (new) embedding example:
```
curl http://10.0.3.22:11434/api/embed -d '{
"model": "all-minilm",
"input": ["Why is the sky blue?", "Why is the grass green?"]
}'
```
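For reference, the same request issued from Python (stdlib only; endpoint, model name, and field names exactly as in the curl command above — the host is just local to my setup):

```python
import json
import urllib.request

def embed_request(model: str, inputs: list[str]) -> urllib.request.Request:
    """Build a POST to Ollama's /api/embed endpoint (same payload shape as the curl call)."""
    body = json.dumps({"model": model, "input": inputs}).encode()
    return urllib.request.Request(
        "http://10.0.3.22:11434/api/embed",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = embed_request("all-minilm", ["Why is the sky blue?", "Why is the grass green?"])
# urllib.request.urlopen(req)  # sending this request is what triggers the crash below
```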
This request triggered a crash (I did pull the model first). Note that it crashes for some models but works for others; Llama 3.1, for instance, works just fine.
Here is the log output, which should be useful:
```
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.014+02:00 level=DEBUG source=amd_linux.go:440 msg="updating rocm free memory" gpu=0 name=1002:744c before="19.2 GiB" now="19.2 GiB"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.014+02:00 level=DEBUG source=sched.go:181 msg="updating default concurrency" OLLAMA_MAX_LOADED_MODELS=0x604b1c442c80 gpu_count=1
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=DEBUG source=sched.go:219 msg="loading first model" model=/var/lib/ollama/.ollama/models/blobs/sha256-797b70c4edf85907fe0a49eb85811256f65fa0f7bf52166b147fd16be2be4662
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=DEBUG source=memory.go:101 msg=evaluating library=rocm gpu_count=1 available="[19.2 GiB]"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=INFO source=sched.go:710 msg="new model will fit in available VRAM in single GPU, loading" model=/var/lib/ollama/.ollama/models/blobs/sha256-797b70c4edf85907fe0a49eb85811256f65fa0f7bf52166b147fd16be2be4662 gpu=0 parallel=4 available=20665856000 required="505.5 MiB"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=DEBUG source=server.go:101 msg="system memory" total="62.7 GiB" free="59.4 GiB" free_swap="0 B"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=DEBUG source=memory.go:101 msg=evaluating library=rocm gpu_count=1 available="[19.2 GiB]"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=INFO source=memory.go:309 msg="offload to rocm" layers.requested=-1 layers.model=7 layers.offload=7 layers.split="" memory.available="[19.2 GiB]" memory.required.full="505.5 MiB" memory.required.partial="505.5 MiB" memory.required.kv="768.0 KiB" memory.required.allocations="[505.5 MiB]" memory.weights.total="21.1 MiB" memory.weights.repeating="17179869184.0 GiB" memory.weights.nonrepeating="22.4 MiB" memory.graph.full="1.5 MiB" memory.graph.partial="1.5 MiB"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/tmp/ollama213676016/runners/cpu/ollama_llama_server
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/tmp/ollama213676016/runners/rocm/ollama_llama_server
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/tmp/ollama213676016/runners/cpu/ollama_llama_server
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.018+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/tmp/ollama213676016/runners/rocm/ollama_llama_server
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.021+02:00 level=INFO source=server.go:392 msg="starting llama server" cmd="/tmp/ollama213676016/runners/rocm/ollama_llama_server --model /var/lib/ollama/.ollama/models/blobs/sha256-797b70c4edf85907fe0a49eb85811256f65fa0f7bf52166b147fd16be2be4662 --ctx-size 1024 --batch-size 512 --embedding --log-disable --n-gpu-layers 7 --verbose --parallel 4 --port 35631"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.021+02:00 level=DEBUG source=server.go:409 msg=subprocess environment="[PATH=/home/mark/.nvm/versions/node/v20.8.0/bin:/home/mark/kde/src/kdesrc-build:/home/mark/bin:/usr/local/bin:/opt/rocm/bin:/home/mark/.local/bin:/opt/miniconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/lib/jvm/default/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl:/opt/rocm/bin:/usr/lib/rustup/bin LD_LIBRARY_PATH=/opt/rocm/lib:/tmp/ollama213676016/runners/rocm:/tmp/ollama213676016/runners HIP_VISIBLE_DEVICES=0]"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.021+02:00 level=INFO source=sched.go:445 msg="loaded runners" count=1
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.021+02:00 level=INFO source=server.go:592 msg="waiting for llama runner to start responding"
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.022+02:00 level=INFO source=server.go:626 msg="waiting for server to become available" status="llm server error"
Aug 09 20:08:14 newphobos ollama[5524]: INFO [main] build info | build=3535 commit="1e6f6554a" tid="135273391656000" timestamp=1723226894
Aug 09 20:08:14 newphobos ollama[5524]: INFO [main] system info | n_threads=16 n_threads_batch=-1 system_info="AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | AVX512_BF16 = 0 | FMA = 0 | NEON = 0 | SVE = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | MATMUL_INT8 = 0 | LLAMAFILE = 1 | " tid="135273391656000" timestamp=1723226894 total_threads=32
Aug 09 20:08:14 newphobos ollama[5524]: INFO [main] HTTP server listening | hostname="127.0.0.1" n_threads_http="31" port="35631" tid="135273391656000" timestamp=1723226894
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: loaded meta data with 23 key-value pairs and 101 tensors from /var/lib/ollama/.ollama/models/blobs/sha256-797b70c4edf85907fe0a49eb85811256f65fa0f7bf52166b147fd16be2be4662 (version GGUF V3 (latest))
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 0: general.architecture str = bert
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 1: general.name str = all-MiniLM-L6-v2
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 2: bert.block_count u32 = 6
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 3: bert.context_length u32 = 512
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 4: bert.embedding_length u32 = 384
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 5: bert.feed_forward_length u32 = 1536
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 6: bert.attention.head_count u32 = 12
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 7: bert.attention.layer_norm_epsilon f32 = 0.000000
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 8: general.file_type u32 = 1
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 9: bert.attention.causal bool = false
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 10: bert.pooling_type u32 = 1
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 11: tokenizer.ggml.token_type_count u32 = 2
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 12: tokenizer.ggml.bos_token_id u32 = 101
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 13: tokenizer.ggml.eos_token_id u32 = 102
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 14: tokenizer.ggml.model str = bert
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 15: tokenizer.ggml.tokens arr[str,30522] = ["[PAD]", "[unused0]", "[unused1]", "...
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 16: tokenizer.ggml.scores arr[f32,30522] = [-1000.000000, -1000.000000, -1000.00...
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 17: tokenizer.ggml.token_type arr[i32,30522] = [3, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 18: tokenizer.ggml.unknown_token_id u32 = 100
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 19: tokenizer.ggml.seperator_token_id u32 = 102
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 20: tokenizer.ggml.padding_token_id u32 = 0
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 21: tokenizer.ggml.cls_token_id u32 = 101
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - kv 22: tokenizer.ggml.mask_token_id u32 = 103
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - type f32: 63 tensors
Aug 09 20:08:14 newphobos ollama[5501]: llama_model_loader: - type f16: 38 tensors
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_vocab: special tokens cache size = 5
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_vocab: token to piece cache size = 0.2032 MB
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: format = GGUF V3 (latest)
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: arch = bert
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: vocab type = WPM
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_vocab = 30522
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_merges = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: vocab_only = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_ctx_train = 512
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_embd = 384
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_layer = 6
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_head = 12
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_head_kv = 12
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_rot = 32
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_swa = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_embd_head_k = 32
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_embd_head_v = 32
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_gqa = 1
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_embd_k_gqa = 384
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_embd_v_gqa = 384
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: f_norm_eps = 1.0e-12
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: f_norm_rms_eps = 0.0e+00
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: f_clamp_kqv = 0.0e+00
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: f_max_alibi_bias = 0.0e+00
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: f_logit_scale = 0.0e+00
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_ff = 1536
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_expert = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_expert_used = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: causal attn = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: pooling type = 1
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: rope type = 2
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: rope scaling = linear
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: freq_base_train = 10000.0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: freq_scale_train = 1
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: n_ctx_orig_yarn = 512
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: rope_finetuned = unknown
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: ssm_d_conv = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: ssm_d_inner = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: ssm_d_state = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: ssm_dt_rank = 0
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: model type = 22M
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: model ftype = F16
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: model params = 22.57 M
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: model size = 43.10 MiB (16.02 BPW)
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: general.name = all-MiniLM-L6-v2
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: BOS token = 101 '[CLS]'
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: EOS token = 102 '[SEP]'
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: UNK token = 100 '[UNK]'
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: SEP token = 102 '[SEP]'
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: PAD token = 0 '[PAD]'
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: CLS token = 101 '[CLS]'
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: MASK token = 103 '[MASK]'
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: LF token = 0 '[PAD]'
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_print_meta: max token length = 21
Aug 09 20:08:14 newphobos ollama[5501]: time=2024-08-09T20:08:14.272+02:00 level=INFO source=server.go:626 msg="waiting for server to become available" status="llm server loading model"
Aug 09 20:08:14 newphobos ollama[5501]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
Aug 09 20:08:14 newphobos ollama[5501]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Aug 09 20:08:14 newphobos ollama[5501]: ggml_cuda_init: found 1 ROCm devices:
Aug 09 20:08:14 newphobos ollama[5501]: Device 0: AMD Radeon RX 7900 XT, compute capability 11.0, VMM: no
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_tensors: ggml ctx size = 0.08 MiB
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_tensors: offloading 6 repeating layers to GPU
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_tensors: offloading non-repeating layers to GPU
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_tensors: offloaded 7/7 layers to GPU
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_tensors: ROCm0 buffer size = 20.37 MiB
Aug 09 20:08:14 newphobos ollama[5501]: llm_load_tensors: CPU buffer size = 22.73 MiB
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: n_ctx = 1024
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: n_batch = 512
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: n_ubatch = 512
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: flash_attn = 0
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: freq_base = 10000.0
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: freq_scale = 1
Aug 09 20:08:15 newphobos ollama[5501]: llama_kv_cache_init: ROCm0 KV buffer size = 9.00 MiB
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: KV self size = 9.00 MiB, K (f16): 4.50 MiB, V (f16): 4.50 MiB
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: CPU output buffer size = 0.00 MiB
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: ROCm0 compute buffer size = 16.01 MiB
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: ROCm_Host compute buffer size = 2.51 MiB
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: graph nodes = 221
Aug 09 20:08:15 newphobos ollama[5501]: llama_new_context_with_model: graph splits = 2
Aug 09 20:08:15 newphobos ollama[5501]: /usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/bits/stl_vector.h:1130: reference std::vector<unsigned long>::operator[](size_type) [_Tp = unsigned long, _Alloc = std::allocator<unsigned long>]: Assertion '__n < this->size()' failed.
Aug 09 20:08:15 newphobos ollama[5501]: time=2024-08-09T20:08:15.476+02:00 level=INFO source=server.go:626 msg="waiting for server to become available" status="llm server not responding"
Aug 09 20:08:17 newphobos ollama[5501]: time=2024-08-09T20:08:17.982+02:00 level=ERROR source=sched.go:451 msg="error loading llama server" error="llama runner process has terminated: signal: aborted (core dumped)"
Aug 09 20:08:17 newphobos ollama[5501]: time=2024-08-09T20:08:17.982+02:00 level=DEBUG source=sched.go:454 msg="triggering expiration for failed load" model=/var/lib/ollama/.ollama/models/blobs/sha256-797b70c4edf85907fe0a49eb85811256f65fa0f7bf52166b147fd16be2be4662
Aug 09 20:08:17 newphobos ollama[5501]: time=2024-08-09T20:08:17.982+02:00 level=DEBUG source=sched.go:355 msg="runner expired event received" modelPath=/var/lib/ollama/.ollama/models/blobs/sha256-797b70c4edf85907fe0a49eb85811256f65fa0f7bf52166b147fd16be2be4662
Aug 09 20:08:17 newphobos ollama[5501]: time=2024-08-09T20:08:17.982+02:00 level=DEBUG source=sched.go:371 msg="got lock to unload" modelPath=/var/lib/ollama/.ollama/models/blobs/sha256-797b70c4edf85907fe0a49eb85811256f65fa0f7bf52166b147fd16be2be4662
Aug 09 20:08:17 newphobos ollama[5501]: [GIN] 2024/08/09 - 20:08:17 | 500 | 3.969014093s | 10.0.3.96 | POST "/api/embed"
Aug 09 20:08:17 newphobos ollama[5501]: time=2024-08-09T20:08:17.983+02:00 level=DEBUG source=gpu.go:359 msg="updating system memory data" before.total="62.7 GiB" before.free="59.4 GiB" before.free_swap="0 B" now.total="62.7 GiB" now.free="59.3 GiB" now.free_swap="0 B"
Aug 09 20:08:17 newphobos ollama[5501]: time=2024-08-09T20:08:17.983+02:00 level=DEBUG source=amd_linux.go:440 msg="updating rocm free memory" gpu=0 name=1002:744c before="19.2 GiB" now="19.2 GiB"
Aug 09 20:08:17 newphobos ollama[5501]: time=2024-08-09T20:08:17.983+02:00 level=DEBUG source=server.go:1053 msg="stopping llama server"
Aug 09 20:08:17 newphobos ollama[5501]: time=2024-08-09T20:08:17.983+02:00 level=DEBUG source=sched.go:376 msg="runner released" modelPath=/var/lib/ollama/.ollama/models/blobs/sha256-797b70c4edf85907fe0a49eb85811256f65fa0f7bf52166b147fd16be2be4662
Aug 09 20:08:18 newphobos ollama[5501]: time=2024-08-09T20:08:18.234+02:00 level=DEBUG source=gpu.go:359 msg="updating system memory data" before.total="62.7 GiB" before.free="59.3 GiB" before.free_swap="0 B" now.total="62.7 GiB" now.free="59.3 GiB" now.free_swap="0 B"
Aug 09 20:08:18 newphobos ollama[5501]: time=2024-08-09T20:08:18.234+02:00 level=DEBUG source=amd_linux.go:440 msg="updating rocm free memory" gpu=0 name=1002:744c before="19.2 GiB" now="19.2 GiB"
Aug 09 20:08:18 newphobos ollama[5501]: time=2024-08-09T20:08:18.483+02:00 level=DEBUG source=gpu.go:359 msg="updating system memory data" before.total="62.7 GiB" before.free="59.3 GiB" before.free_swap="0 B" now.total="62.7 GiB" now.free="59.3 GiB" now.free_swap="0 B"
```
Special emphasis on this line:
```
Aug 09 20:08:15 newphobos ollama[5501]: /usr/lib64/gcc/x86_64-pc-linux-gnu/14.2.1/../../../../include/c++/14.2.1/bits/stl_vector.h:1130: reference std::vector<unsigned long>::operator[](size_type) [_Tp = unsigned long, _Alloc = std::allocator<unsigned long>]: Assertion '__n < this->size()' failed.
```
Index out of bounds, perhaps? I don't understand how, where, or why that error occurs, since Ollama is a Go application and this error comes from C++.
### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.3.4
|
{
"login": "markg85",
"id": 49061,
"node_id": "MDQ6VXNlcjQ5MDYx",
"avatar_url": "https://avatars.githubusercontent.com/u/49061?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/markg85",
"html_url": "https://github.com/markg85",
"followers_url": "https://api.github.com/users/markg85/followers",
"following_url": "https://api.github.com/users/markg85/following{/other_user}",
"gists_url": "https://api.github.com/users/markg85/gists{/gist_id}",
"starred_url": "https://api.github.com/users/markg85/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/markg85/subscriptions",
"organizations_url": "https://api.github.com/users/markg85/orgs",
"repos_url": "https://api.github.com/users/markg85/repos",
"events_url": "https://api.github.com/users/markg85/events{/privacy}",
"received_events_url": "https://api.github.com/users/markg85/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6289/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6289/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/1658
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1658/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1658/comments
|
https://api.github.com/repos/ollama/ollama/issues/1658/events
|
https://github.com/ollama/ollama/issues/1658
| 2,052,629,556
|
I_kwDOJ0Z1Ps56WKQ0
| 1,658
|
Feature request: delete partially downloaded model
|
{
"login": "kokizzu",
"id": 1061610,
"node_id": "MDQ6VXNlcjEwNjE2MTA=",
"avatar_url": "https://avatars.githubusercontent.com/u/1061610?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kokizzu",
"html_url": "https://github.com/kokizzu",
"followers_url": "https://api.github.com/users/kokizzu/followers",
"following_url": "https://api.github.com/users/kokizzu/following{/other_user}",
"gists_url": "https://api.github.com/users/kokizzu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/kokizzu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kokizzu/subscriptions",
"organizations_url": "https://api.github.com/users/kokizzu/orgs",
"repos_url": "https://api.github.com/users/kokizzu/repos",
"events_url": "https://api.github.com/users/kokizzu/events{/privacy}",
"received_events_url": "https://api.github.com/users/kokizzu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2023-12-21T15:27:21
| 2023-12-21T19:24:58
| 2023-12-21T19:19:07
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
So I was downloading a model, and apparently it took something like 26 GB of disk space XD
so I cancelled it midway.
I tried `ollama rm dolphin-mixtral`, but it shows `Error: model 'dolphin-mixtral' not found`.
It would be nice if there were a command to remove a partially downloaded model.
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1658/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1658/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/5539
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5539/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5539/comments
|
https://api.github.com/repos/ollama/ollama/issues/5539/events
|
https://github.com/ollama/ollama/issues/5539
| 2,394,886,691
|
I_kwDOJ0Z1Ps6OvxIj
| 5,539
|
can't embedding PDF file in Korean
|
{
"login": "codeMonkey-shin",
"id": 80636401,
"node_id": "MDQ6VXNlcjgwNjM2NDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/80636401?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codeMonkey-shin",
"html_url": "https://github.com/codeMonkey-shin",
"followers_url": "https://api.github.com/users/codeMonkey-shin/followers",
"following_url": "https://api.github.com/users/codeMonkey-shin/following{/other_user}",
"gists_url": "https://api.github.com/users/codeMonkey-shin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codeMonkey-shin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codeMonkey-shin/subscriptions",
"organizations_url": "https://api.github.com/users/codeMonkey-shin/orgs",
"repos_url": "https://api.github.com/users/codeMonkey-shin/repos",
"events_url": "https://api.github.com/users/codeMonkey-shin/events{/privacy}",
"received_events_url": "https://api.github.com/users/codeMonkey-shin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-07-08T07:38:39
| 2024-07-09T08:47:43
| 2024-07-09T08:47:43
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I'm trying to use RAG by embedding a PDF file in Korean, but the encoding seems to be broken: when saved to the vector DB, garbled strings are stored.
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.48
|
{
"login": "codeMonkey-shin",
"id": 80636401,
"node_id": "MDQ6VXNlcjgwNjM2NDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/80636401?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/codeMonkey-shin",
"html_url": "https://github.com/codeMonkey-shin",
"followers_url": "https://api.github.com/users/codeMonkey-shin/followers",
"following_url": "https://api.github.com/users/codeMonkey-shin/following{/other_user}",
"gists_url": "https://api.github.com/users/codeMonkey-shin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/codeMonkey-shin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/codeMonkey-shin/subscriptions",
"organizations_url": "https://api.github.com/users/codeMonkey-shin/orgs",
"repos_url": "https://api.github.com/users/codeMonkey-shin/repos",
"events_url": "https://api.github.com/users/codeMonkey-shin/events{/privacy}",
"received_events_url": "https://api.github.com/users/codeMonkey-shin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5539/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5539/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8456
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8456/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8456/comments
|
https://api.github.com/repos/ollama/ollama/issues/8456/events
|
https://github.com/ollama/ollama/issues/8456
| 2,792,306,559
|
I_kwDOJ0Z1Ps6mbzd_
| 8,456
|
ollama create fails for GGUF files with unaligned tensors
|
{
"login": "rick-github",
"id": 14946854,
"node_id": "MDQ6VXNlcjE0OTQ2ODU0",
"avatar_url": "https://avatars.githubusercontent.com/u/14946854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rick-github",
"html_url": "https://github.com/rick-github",
"followers_url": "https://api.github.com/users/rick-github/followers",
"following_url": "https://api.github.com/users/rick-github/following{/other_user}",
"gists_url": "https://api.github.com/users/rick-github/gists{/gist_id}",
"starred_url": "https://api.github.com/users/rick-github/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rick-github/subscriptions",
"organizations_url": "https://api.github.com/users/rick-github/orgs",
"repos_url": "https://api.github.com/users/rick-github/repos",
"events_url": "https://api.github.com/users/rick-github/events{/privacy}",
"received_events_url": "https://api.github.com/users/rick-github/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 0
| 2025-01-16T10:24:14
| 2025-01-16T13:04:57
| null |
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
```
$ ollama show --modelfile minicpm-v > Modelfile
$ ollama create minicpm-v:test
gathering model components
copying file sha256:262843d4806aeb402336980badd414a72576b20b1e5d537647da15f16c4a4df0 100%
copying file sha256:f8a805e9e62085805c69c427287acefc284932eb4abfe6e1b1ce431d27e2f4e0 100%
parsing GGUF
Error: invalid file magic
```
The tensors in some GGUF files are not aligned with `general.alignment` and when imported, the alignment bytes at the end of the file are treated as the start of a new GGUF, resulting in a failed `file magic` match.
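For reference, GGUF pads each tensor's data offset up to the next multiple of `general.alignment` (32 by default). A minimal sketch of that rounding rule in Go (the function name is my own, not Ollama's):

```go
package main

import "fmt"

// alignOffset rounds offset up to the next multiple of alignment,
// matching how GGUF pads tensor data to general.alignment.
func alignOffset(offset, alignment uint64) uint64 {
	return (offset + alignment - 1) / alignment * alignment
}

func main() {
	// A tensor ending at byte 1000 with 32-byte alignment is padded to 1024.
	fmt.Println(alignOffset(1000, 32)) // 1024
	// An already-aligned offset is left unchanged.
	fmt.Println(alignOffset(1024, 32)) // 1024
}
```

A parser that reads exactly the unpadded tensor sizes will stop short of the padded end of file, leaving those trailing alignment bytes to be misread as the start of another GGUF.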
### OS
Linux
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.5.6
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8456/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8456/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/6990
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6990/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6990/comments
|
https://api.github.com/repos/ollama/ollama/issues/6990/events
|
https://github.com/ollama/ollama/issues/6990
| 2,551,601,831
|
I_kwDOJ0Z1Ps6YFlqn
| 6,990
|
Unrecognized import path "gorgonia.org/vecf64"
|
{
"login": "opacicmarko",
"id": 98588282,
"node_id": "U_kgDOBeBWeg",
"avatar_url": "https://avatars.githubusercontent.com/u/98588282?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/opacicmarko",
"html_url": "https://github.com/opacicmarko",
"followers_url": "https://api.github.com/users/opacicmarko/followers",
"following_url": "https://api.github.com/users/opacicmarko/following{/other_user}",
"gists_url": "https://api.github.com/users/opacicmarko/gists{/gist_id}",
"starred_url": "https://api.github.com/users/opacicmarko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/opacicmarko/subscriptions",
"organizations_url": "https://api.github.com/users/opacicmarko/orgs",
"repos_url": "https://api.github.com/users/opacicmarko/repos",
"events_url": "https://api.github.com/users/opacicmarko/events{/privacy}",
"received_events_url": "https://api.github.com/users/opacicmarko/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 1
| 2024-09-26T22:04:05
| 2024-09-30T21:04:23
| 2024-09-30T21:04:22
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I tried to build the project from source, but running `go build .` fails with the following errors:
```
go: downloading gorgonia.org/vecf32 v0.9.0
go: downloading gorgonia.org/vecf64 v0.9.0
../../go/pkg/mod/github.com/pdevine/tensor@v0.0.0-20240510204454-f88f4562727c/internal/execution/generic_arith_vv.go:10:2: unrecognized import path "gorgonia.org/vecf32": reading https://gorgonia.org/vecf32?go-get=1: 436
../../go/pkg/mod/github.com/pdevine/tensor@v0.0.0-20240510204454-f88f4562727c/internal/execution/generic_arith_vv.go:11:2: unrecognized import path "gorgonia.org/vecf64": reading https://gorgonia.org/vecf64?go-get=1: 436
```
It appears the [gorgonia.org](https://gorgonia.org/) domain has expired:

### OS
Linux
### GPU
AMD
### CPU
AMD
### Ollama version
0.3.12
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6990/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6990/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/7543
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7543/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7543/comments
|
https://api.github.com/repos/ollama/ollama/issues/7543/events
|
https://github.com/ollama/ollama/issues/7543
| 2,640,173,480
|
I_kwDOJ0Z1Ps6dXdmo
| 7,543
|
Please add qwen2-vl-7b
|
{
"login": "bingbing6",
"id": 51957370,
"node_id": "MDQ6VXNlcjUxOTU3Mzcw",
"avatar_url": "https://avatars.githubusercontent.com/u/51957370?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bingbing6",
"html_url": "https://github.com/bingbing6",
"followers_url": "https://api.github.com/users/bingbing6/followers",
"following_url": "https://api.github.com/users/bingbing6/following{/other_user}",
"gists_url": "https://api.github.com/users/bingbing6/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bingbing6/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bingbing6/subscriptions",
"organizations_url": "https://api.github.com/users/bingbing6/orgs",
"repos_url": "https://api.github.com/users/bingbing6/repos",
"events_url": "https://api.github.com/users/bingbing6/events{/privacy}",
"received_events_url": "https://api.github.com/users/bingbing6/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-11-07T07:46:51
| 2024-11-07T22:15:20
| 2024-11-07T22:15:20
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Please add qwen2-vl-7b
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7543/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7543/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6845
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6845/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6845/comments
|
https://api.github.com/repos/ollama/ollama/issues/6845/events
|
https://github.com/ollama/ollama/pull/6845
| 2,532,179,326
|
PR_kwDOJ0Z1Ps570V3F
| 6,845
|
llama: fix race in parallel make
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-09-17T21:19:07
| 2024-09-23T19:04:02
| 2024-09-23T19:03:54
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/6845",
"html_url": "https://github.com/ollama/ollama/pull/6845",
"diff_url": "https://github.com/ollama/ollama/pull/6845.diff",
"patch_url": "https://github.com/ollama/ollama/pull/6845.patch",
"merged_at": null
}
|
Ensure the cleanup step completes before starting to build targets
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6845/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6845/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/5082
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/5082/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/5082/comments
|
https://api.github.com/repos/ollama/ollama/issues/5082/events
|
https://github.com/ollama/ollama/issues/5082
| 2,355,836,344
|
I_kwDOJ0Z1Ps6MazW4
| 5,082
|
`ollama list` shows empty list of models
|
{
"login": "DoLife",
"id": 67223389,
"node_id": "MDQ6VXNlcjY3MjIzMzg5",
"avatar_url": "https://avatars.githubusercontent.com/u/67223389?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/DoLife",
"html_url": "https://github.com/DoLife",
"followers_url": "https://api.github.com/users/DoLife/followers",
"following_url": "https://api.github.com/users/DoLife/following{/other_user}",
"gists_url": "https://api.github.com/users/DoLife/gists{/gist_id}",
"starred_url": "https://api.github.com/users/DoLife/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DoLife/subscriptions",
"organizations_url": "https://api.github.com/users/DoLife/orgs",
"repos_url": "https://api.github.com/users/DoLife/repos",
"events_url": "https://api.github.com/users/DoLife/events{/privacy}",
"received_events_url": "https://api.github.com/users/DoLife/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
},
{
"id": 6677367769,
"node_id": "LA_kwDOJ0Z1Ps8AAAABjgCL2Q",
"url": "https://api.github.com/repos/ollama/ollama/labels/needs%20more%20info",
"name": "needs more info",
"color": "BA8041",
"default": false,
"description": "More information is needed to assist"
}
] |
closed
| false
| null |
[] | null | 7
| 2024-06-16T15:21:30
| 2024-07-12T19:25:14
| 2024-07-12T19:25:14
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Hi,
My models no longer load.
When I run `ollama list` it gives me a blank list, but all the models are still in the directories.
See the images; it was working correctly a few days ago.



### OS
Windows
### GPU
Nvidia
### CPU
AMD
### Ollama version
0.1.44
|
{
"login": "pdevine",
"id": 75239,
"node_id": "MDQ6VXNlcjc1MjM5",
"avatar_url": "https://avatars.githubusercontent.com/u/75239?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/pdevine",
"html_url": "https://github.com/pdevine",
"followers_url": "https://api.github.com/users/pdevine/followers",
"following_url": "https://api.github.com/users/pdevine/following{/other_user}",
"gists_url": "https://api.github.com/users/pdevine/gists{/gist_id}",
"starred_url": "https://api.github.com/users/pdevine/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdevine/subscriptions",
"organizations_url": "https://api.github.com/users/pdevine/orgs",
"repos_url": "https://api.github.com/users/pdevine/repos",
"events_url": "https://api.github.com/users/pdevine/events{/privacy}",
"received_events_url": "https://api.github.com/users/pdevine/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/5082/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/5082/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/2499
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2499/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2499/comments
|
https://api.github.com/repos/ollama/ollama/issues/2499/events
|
https://github.com/ollama/ollama/pull/2499
| 2,134,953,061
|
PR_kwDOJ0Z1Ps5m5HOX
| 2,499
|
Windows Preview
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-02-14T18:40:19
| 2024-02-16T00:06:33
| 2024-02-16T00:06:33
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/2499",
"html_url": "https://github.com/ollama/ollama/pull/2499",
"diff_url": "https://github.com/ollama/ollama/pull/2499.diff",
"patch_url": "https://github.com/ollama/ollama/pull/2499.patch",
"merged_at": "2024-02-16T00:06:32"
}
| null |
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2499/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2499/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/125
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/125/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/125/comments
|
https://api.github.com/repos/ollama/ollama/issues/125/events
|
https://github.com/ollama/ollama/pull/125
| 1,812,056,742
|
PR_kwDOJ0Z1Ps5V5e-P
| 125
|
Updated modelfile doc to include license
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2023-07-19T14:17:06
| 2023-07-19T15:57:09
| 2023-07-19T15:57:07
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/125",
"html_url": "https://github.com/ollama/ollama/pull/125",
"diff_url": "https://github.com/ollama/ollama/pull/125.diff",
"patch_url": "https://github.com/ollama/ollama/pull/125.patch",
"merged_at": "2023-07-19T15:57:07"
}
|
and attributed midjourneyprompt
|
{
"login": "technovangelist",
"id": 633681,
"node_id": "MDQ6VXNlcjYzMzY4MQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/633681?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/technovangelist",
"html_url": "https://github.com/technovangelist",
"followers_url": "https://api.github.com/users/technovangelist/followers",
"following_url": "https://api.github.com/users/technovangelist/following{/other_user}",
"gists_url": "https://api.github.com/users/technovangelist/gists{/gist_id}",
"starred_url": "https://api.github.com/users/technovangelist/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/technovangelist/subscriptions",
"organizations_url": "https://api.github.com/users/technovangelist/orgs",
"repos_url": "https://api.github.com/users/technovangelist/repos",
"events_url": "https://api.github.com/users/technovangelist/events{/privacy}",
"received_events_url": "https://api.github.com/users/technovangelist/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/125/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/125/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7233
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7233/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7233/comments
|
https://api.github.com/repos/ollama/ollama/issues/7233/events
|
https://github.com/ollama/ollama/issues/7233
| 2,593,734,928
|
I_kwDOJ0Z1Ps6amUEQ
| 7,233
|
Support for Whisper-family models
|
{
"login": "gileneusz",
"id": 34601970,
"node_id": "MDQ6VXNlcjM0NjAxOTcw",
"avatar_url": "https://avatars.githubusercontent.com/u/34601970?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gileneusz",
"html_url": "https://github.com/gileneusz",
"followers_url": "https://api.github.com/users/gileneusz/followers",
"following_url": "https://api.github.com/users/gileneusz/following{/other_user}",
"gists_url": "https://api.github.com/users/gileneusz/gists{/gist_id}",
"starred_url": "https://api.github.com/users/gileneusz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gileneusz/subscriptions",
"organizations_url": "https://api.github.com/users/gileneusz/orgs",
"repos_url": "https://api.github.com/users/gileneusz/repos",
"events_url": "https://api.github.com/users/gileneusz/events{/privacy}",
"received_events_url": "https://api.github.com/users/gileneusz/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5789807732,
"node_id": "LA_kwDOJ0Z1Ps8AAAABWRl0dA",
"url": "https://api.github.com/repos/ollama/ollama/labels/model%20request",
"name": "model request",
"color": "1E5DE6",
"default": false,
"description": "Model requests"
}
] |
closed
| false
| null |
[] | null | 3
| 2024-10-17T06:19:10
| 2024-11-21T07:05:53
| 2024-11-21T07:05:52
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Please consider adding this feature.
Input:
- audio file
- youtube link
- ...
Output:
- text
|
{
"login": "mchiang0610",
"id": 3325447,
"node_id": "MDQ6VXNlcjMzMjU0NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/3325447?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mchiang0610",
"html_url": "https://github.com/mchiang0610",
"followers_url": "https://api.github.com/users/mchiang0610/followers",
"following_url": "https://api.github.com/users/mchiang0610/following{/other_user}",
"gists_url": "https://api.github.com/users/mchiang0610/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mchiang0610/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mchiang0610/subscriptions",
"organizations_url": "https://api.github.com/users/mchiang0610/orgs",
"repos_url": "https://api.github.com/users/mchiang0610/repos",
"events_url": "https://api.github.com/users/mchiang0610/events{/privacy}",
"received_events_url": "https://api.github.com/users/mchiang0610/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7233/reactions",
"total_count": 5,
"+1": 5,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7233/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/8235
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8235/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8235/comments
|
https://api.github.com/repos/ollama/ollama/issues/8235/events
|
https://github.com/ollama/ollama/issues/8235
| 2,758,225,991
|
I_kwDOJ0Z1Ps6kZzBH
| 8,235
|
Requests begin to all fail after several independent prompts
|
{
"login": "steveseguin",
"id": 2575698,
"node_id": "MDQ6VXNlcjI1NzU2OTg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2575698?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/steveseguin",
"html_url": "https://github.com/steveseguin",
"followers_url": "https://api.github.com/users/steveseguin/followers",
"following_url": "https://api.github.com/users/steveseguin/following{/other_user}",
"gists_url": "https://api.github.com/users/steveseguin/gists{/gist_id}",
"starred_url": "https://api.github.com/users/steveseguin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/steveseguin/subscriptions",
"organizations_url": "https://api.github.com/users/steveseguin/orgs",
"repos_url": "https://api.github.com/users/steveseguin/repos",
"events_url": "https://api.github.com/users/steveseguin/events{/privacy}",
"received_events_url": "https://api.github.com/users/steveseguin/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
open
| false
| null |
[] | null | 5
| 2024-12-24T19:31:20
| 2025-01-27T10:49:47
| null |
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I've been having an issue with Ollama where the output is either gibberish or just a series of @@@@@ characters. I don't recall it being this way some weeks ago, but I've found a workaround. (The gibberish seems to happen mostly with stream: true, and the @@@ mostly with stream: false.)
I started having this issue with Llama3.1 lorablated some weeks ago, and now I'm having it with qwen2.5-coder:32b as well, now that I'm trying to use it.
An example of the issue: after several successful requests, one suddenly fails, and then every request after it fails too.


**The solution I discovered has been to just set the keep_alive to 0.** I suspect there's some sort of context caching going on and I'm hitting some memory limit. My Titan RTX 24GB just squeaks by with these models. Windows 11. 96GB RAM. Most recent Ollama.
My requests are pretty short; just a couple of sentences in most cases.
Works:
```
const response = await fetch(`${endpoint}/api/generate`, {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({
model: model,
prompt: prompt,
stream: false,
        keep_alive: 0 // <===== note the keep_alive: 0
})
});
```
Fails after a few prompts:
```
const response = await fetch(`${endpoint}/api/generate`, {
method: 'POST',
headers: {
'Content-Type': 'application/json'
},
body: JSON.stringify({
model: model,
prompt: prompt,
stream: false
})
});
```
If this is a caching issue, it would be nice to have the option of whether to cache the conversation or not, and, if it is cached, perhaps a way to refer back to that specific conversation. That would give more fine-grained control via the API.

[server log - Not working](https://github.com/user-attachments/files/18241593/not_working_default.log)
[server log - working](https://github.com/user-attachments/files/18241545/working_timeout_0.log)


This might not be a bug, but just me being stupid. Still, keep_alive = 0 works, however crude and slow it is.
Merry Christmas, all. 🎄
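In case it helps anyone scripting around this, the workaround can be factored into a tiny helper. This is just a sketch: the helper name is mine, but the field names follow the Ollama REST API docs, where keep_alive: 0 unloads the model right after the response and a duration string such as "5m" keeps it resident instead:

```javascript
// Build a /api/generate request body with an explicit keep_alive.
// keep_alive: 0 unloads the model immediately after the response;
// a duration string like "5m" keeps it loaded (per the Ollama API docs).
function generateBody(model, prompt, { keepAlive = 0 } = {}) {
  return JSON.stringify({
    model,
    prompt,
    stream: false,
    keep_alive: keepAlive,
  });
}

console.log(generateBody("qwen2.5-coder:32b", "Say hello."));
```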
### OS
Windows
### GPU
Nvidia
### CPU
Intel, AMD
### Ollama version
0.5.4
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8235/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8235/timeline
| null | null | false
|
https://api.github.com/repos/ollama/ollama/issues/3172
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3172/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3172/comments
|
https://api.github.com/repos/ollama/ollama/issues/3172/events
|
https://github.com/ollama/ollama/issues/3172
| 2,189,586,448
|
I_kwDOJ0Z1Ps6CgnAQ
| 3,172
|
Allow choosing a preferred variant (an AMD GPU/an NVIDIA GPU/CPU) when running a model
|
{
"login": "Inokinoki",
"id": 8311300,
"node_id": "MDQ6VXNlcjgzMTEzMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8311300?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Inokinoki",
"html_url": "https://github.com/Inokinoki",
"followers_url": "https://api.github.com/users/Inokinoki/followers",
"following_url": "https://api.github.com/users/Inokinoki/following{/other_user}",
"gists_url": "https://api.github.com/users/Inokinoki/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Inokinoki/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Inokinoki/subscriptions",
"organizations_url": "https://api.github.com/users/Inokinoki/orgs",
"repos_url": "https://api.github.com/users/Inokinoki/repos",
"events_url": "https://api.github.com/users/Inokinoki/events{/privacy}",
"received_events_url": "https://api.github.com/users/Inokinoki/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5860134234,
"node_id": "LA_kwDOJ0Z1Ps8AAAABXUqNWg",
"url": "https://api.github.com/repos/ollama/ollama/labels/windows",
"name": "windows",
"color": "0052CC",
"default": false,
"description": ""
}
] |
closed
| false
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 12
| 2024-03-15T22:55:41
| 2024-05-04T22:07:26
| 2024-05-04T22:07:26
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What are you trying to do?
I have both NVIDIA and AMD cards on one PC. Both `nvml.dll` and `amdhip64.dll` are available on Windows.
I saw in `gpu/gpu.go` that Ollama tries to detect NVIDIA first and will not try AMD if it finds NVIDIA.
### How should we solve this?
Could it be possible to add an arg to indicate the preferred device or variant?
### What is the impact of not solving this?
As a workaround, I just move `nvml.dll` out of the path to bypass the detection of NVIDIA.
### Anything else?
_No response_
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3172/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3172/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4273
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4273/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4273/comments
|
https://api.github.com/repos/ollama/ollama/issues/4273/events
|
https://github.com/ollama/ollama/issues/4273
| 2,286,772,593
|
I_kwDOJ0Z1Ps6ITWFx
| 4,273
|
API usage
|
{
"login": "w1757876747",
"id": 38978960,
"node_id": "MDQ6VXNlcjM4OTc4OTYw",
"avatar_url": "https://avatars.githubusercontent.com/u/38978960?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/w1757876747",
"html_url": "https://github.com/w1757876747",
"followers_url": "https://api.github.com/users/w1757876747/followers",
"following_url": "https://api.github.com/users/w1757876747/following{/other_user}",
"gists_url": "https://api.github.com/users/w1757876747/gists{/gist_id}",
"starred_url": "https://api.github.com/users/w1757876747/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/w1757876747/subscriptions",
"organizations_url": "https://api.github.com/users/w1757876747/orgs",
"repos_url": "https://api.github.com/users/w1757876747/repos",
"events_url": "https://api.github.com/users/w1757876747/events{/privacy}",
"received_events_url": "https://api.github.com/users/w1757876747/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-05-09T02:25:40
| 2024-06-04T22:31:45
| 2024-06-04T22:31:45
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
I have a problem. When I call the project's `api/pull`, the background displays the following error, and no streaming response (such as `status`) is returned, even though I want to get the real-time download progress. Why can I download the model normally when I use the run model command in the service?
```
time=2024-05-09T10:03:13.104+08:00 level=INFO source=download.go:251 msg="00e1317cbf74 part 36 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2024-05-09T10:03:25.107+08:00 level=INFO source=download.go:251 msg="00e1317cbf74 part 14 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
time=2024-05-09T10:03:35.106+08:00 level=INFO source=download.go:251 msg="00e1317cbf74 part 36 stalled; retrying. If this persists, press ctrl-c to exit, then 'ollama pull' to find a faster connection."
[GIN] 2024/05/09 - 10:03:46 | 200 | 35.609442s | 127.0.0.1 | POST "/api/pull"
```
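For reference, when `/api/pull` does stream, it emits newline-delimited JSON status objects (`status`, `total`, `completed`, per the Ollama API docs). A minimal sketch of turning those lines into progress percentages — the function name and sample lines are illustrative, not taken from this issue:

```javascript
// Parse a chunk of the /api/pull NDJSON stream into {status, percent}
// entries. percent is null when the server omits total (e.g. "success").
function parsePullProgress(chunk) {
  return chunk
    .split("\n")
    .filter(line => line.trim().length > 0)
    .map(line => {
      const obj = JSON.parse(line);
      const percent = obj.total
        ? Math.round(((obj.completed ?? 0) * 100) / obj.total)
        : null;
      return { status: obj.status, percent };
    });
}

// Two status lines shaped like the server might emit them.
const sample =
  '{"status":"pulling 00e1317cbf74","total":200,"completed":50}\n' +
  '{"status":"success"}\n';
console.log(parsePullProgress(sample));
```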
### OS
Windows
### GPU
Nvidia
### CPU
Intel
### Ollama version
0.1.34
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4273/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4273/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/271
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/271/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/271/comments
|
https://api.github.com/repos/ollama/ollama/issues/271/events
|
https://github.com/ollama/ollama/pull/271
| 1,835,653,035
|
PR_kwDOJ0Z1Ps5XJBFb
| 271
|
README.md: Add info about `serve`, logging, and env vars (+ some icons)
|
{
"login": "drhino",
"id": 2538708,
"node_id": "MDQ6VXNlcjI1Mzg3MDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2538708?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/drhino",
"html_url": "https://github.com/drhino",
"followers_url": "https://api.github.com/users/drhino/followers",
"following_url": "https://api.github.com/users/drhino/following{/other_user}",
"gists_url": "https://api.github.com/users/drhino/gists{/gist_id}",
"starred_url": "https://api.github.com/users/drhino/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/drhino/subscriptions",
"organizations_url": "https://api.github.com/users/drhino/orgs",
"repos_url": "https://api.github.com/users/drhino/repos",
"events_url": "https://api.github.com/users/drhino/events{/privacy}",
"received_events_url": "https://api.github.com/users/drhino/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 3
| 2023-08-03T19:56:15
| 2023-10-24T22:17:14
| 2023-10-24T22:17:14
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/271",
"html_url": "https://github.com/ollama/ollama/pull/271",
"diff_url": "https://github.com/ollama/ollama/pull/271.diff",
"patch_url": "https://github.com/ollama/ollama/pull/271.patch",
"merged_at": null
}
| null |
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/271/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/271/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1000
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1000/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1000/comments
|
https://api.github.com/repos/ollama/ollama/issues/1000/events
|
https://github.com/ollama/ollama/pull/1000
| 1,977,357,341
|
PR_kwDOJ0Z1Ps5emOFP
| 1,000
|
Added clear command
|
{
"login": "tommyneu",
"id": 57959550,
"node_id": "MDQ6VXNlcjU3OTU5NTUw",
"avatar_url": "https://avatars.githubusercontent.com/u/57959550?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tommyneu",
"html_url": "https://github.com/tommyneu",
"followers_url": "https://api.github.com/users/tommyneu/followers",
"following_url": "https://api.github.com/users/tommyneu/following{/other_user}",
"gists_url": "https://api.github.com/users/tommyneu/gists{/gist_id}",
"starred_url": "https://api.github.com/users/tommyneu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tommyneu/subscriptions",
"organizations_url": "https://api.github.com/users/tommyneu/orgs",
"repos_url": "https://api.github.com/users/tommyneu/repos",
"events_url": "https://api.github.com/users/tommyneu/events{/privacy}",
"received_events_url": "https://api.github.com/users/tommyneu/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 1
| 2023-11-04T14:05:13
| 2023-11-09T00:50:48
| 2023-11-09T00:49:18
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/1000",
"html_url": "https://github.com/ollama/ollama/pull/1000",
"diff_url": "https://github.com/ollama/ollama/pull/1000.diff",
"patch_url": "https://github.com/ollama/ollama/pull/1000.patch",
"merged_at": null
}
|
Added clear command for ease of use
Closes #989
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1000/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1000/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/2382
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/2382/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/2382/comments
|
https://api.github.com/repos/ollama/ollama/issues/2382/events
|
https://github.com/ollama/ollama/issues/2382
| 2,122,399,323
|
I_kwDOJ0Z1Ps5-gT5b
| 2,382
|
Some LLMs are not really open source
|
{
"login": "Edmartt",
"id": 47486245,
"node_id": "MDQ6VXNlcjQ3NDg2MjQ1",
"avatar_url": "https://avatars.githubusercontent.com/u/47486245?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Edmartt",
"html_url": "https://github.com/Edmartt",
"followers_url": "https://api.github.com/users/Edmartt/followers",
"following_url": "https://api.github.com/users/Edmartt/following{/other_user}",
"gists_url": "https://api.github.com/users/Edmartt/gists{/gist_id}",
"starred_url": "https://api.github.com/users/Edmartt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Edmartt/subscriptions",
"organizations_url": "https://api.github.com/users/Edmartt/orgs",
"repos_url": "https://api.github.com/users/Edmartt/repos",
"events_url": "https://api.github.com/users/Edmartt/events{/privacy}",
"received_events_url": "https://api.github.com/users/Edmartt/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 2
| 2024-02-07T07:49:59
| 2024-02-21T00:05:18
| 2024-02-21T00:05:17
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Just because a company says their LLM is open source doesn't make it true:

https://spectrum.ieee.org/open-source-llm-not-open
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/2382/reactions",
"total_count": 3,
"+1": 3,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/2382/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/4766
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4766/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4766/comments
|
https://api.github.com/repos/ollama/ollama/issues/4766/events
|
https://github.com/ollama/ollama/pull/4766
| 2,329,126,774
|
PR_kwDOJ0Z1Ps5xMgp1
| 4,766
|
add embed model command and fix question invoke
|
{
"login": "shoebham",
"id": 25881429,
"node_id": "MDQ6VXNlcjI1ODgxNDI5",
"avatar_url": "https://avatars.githubusercontent.com/u/25881429?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/shoebham",
"html_url": "https://github.com/shoebham",
"followers_url": "https://api.github.com/users/shoebham/followers",
"following_url": "https://api.github.com/users/shoebham/following{/other_user}",
"gists_url": "https://api.github.com/users/shoebham/gists{/gist_id}",
"starred_url": "https://api.github.com/users/shoebham/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shoebham/subscriptions",
"organizations_url": "https://api.github.com/users/shoebham/orgs",
"repos_url": "https://api.github.com/users/shoebham/repos",
"events_url": "https://api.github.com/users/shoebham/events{/privacy}",
"received_events_url": "https://api.github.com/users/shoebham/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-01T12:51:34
| 2024-06-04T05:21:18
| 2024-06-04T05:20:48
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4766",
"html_url": "https://github.com/ollama/ollama/pull/4766",
"diff_url": "https://github.com/ollama/ollama/pull/4766.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4766.patch",
"merged_at": "2024-06-04T05:20:48"
}
|
I was following the tutorial but couldn't run it because the embedding model was not available, so I had to download it first with `ollama pull nomic-embed-text`. Also, the code where we ask the LLM the question wasn't printing anything, so I fixed that too.
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4766/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4766/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/7818
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/7818/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/7818/comments
|
https://api.github.com/repos/ollama/ollama/issues/7818/events
|
https://github.com/ollama/ollama/pull/7818
| 2,687,948,631
|
PR_kwDOJ0Z1Ps6C8TWp
| 7,818
|
Update README.md
|
{
"login": "adarshM84",
"id": 95633830,
"node_id": "U_kgDOBbNBpg",
"avatar_url": "https://avatars.githubusercontent.com/u/95633830?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adarshM84",
"html_url": "https://github.com/adarshM84",
"followers_url": "https://api.github.com/users/adarshM84/followers",
"following_url": "https://api.github.com/users/adarshM84/following{/other_user}",
"gists_url": "https://api.github.com/users/adarshM84/gists{/gist_id}",
"starred_url": "https://api.github.com/users/adarshM84/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adarshM84/subscriptions",
"organizations_url": "https://api.github.com/users/adarshM84/orgs",
"repos_url": "https://api.github.com/users/adarshM84/repos",
"events_url": "https://api.github.com/users/adarshM84/events{/privacy}",
"received_events_url": "https://api.github.com/users/adarshM84/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-11-24T15:57:00
| 2024-11-24T18:32:24
| 2024-11-24T18:32:24
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/7818",
"html_url": "https://github.com/ollama/ollama/pull/7818",
"diff_url": "https://github.com/ollama/ollama/pull/7818.diff",
"patch_url": "https://github.com/ollama/ollama/pull/7818.patch",
"merged_at": "2024-11-24T18:32:24"
}
|
Description added for link
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/7818/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/7818/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/4841
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/4841/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/4841/comments
|
https://api.github.com/repos/ollama/ollama/issues/4841/events
|
https://github.com/ollama/ollama/pull/4841
| 2,336,501,019
|
PR_kwDOJ0Z1Ps5xlkeF
| 4,841
|
Remove False Time Fields
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-06-05T17:56:28
| 2024-06-05T18:02:23
| 2024-06-05T18:02:16
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/4841",
"html_url": "https://github.com/ollama/ollama/pull/4841",
"diff_url": "https://github.com/ollama/ollama/pull/4841.diff",
"patch_url": "https://github.com/ollama/ollama/pull/4841.patch",
"merged_at": null
}
|
/api/tags was returning "0001-01-01T00:00:00Z" for 'expires_at'
/api/ps was returning "0001-01-01T00:00:00Z" for 'modified_at'
This removes these fields from the respective endpoints.
Added an assertion in a test case, and tested locally with both curl and the CLI.
|
{
"login": "royjhan",
"id": 65097070,
"node_id": "MDQ6VXNlcjY1MDk3MDcw",
"avatar_url": "https://avatars.githubusercontent.com/u/65097070?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/royjhan",
"html_url": "https://github.com/royjhan",
"followers_url": "https://api.github.com/users/royjhan/followers",
"following_url": "https://api.github.com/users/royjhan/following{/other_user}",
"gists_url": "https://api.github.com/users/royjhan/gists{/gist_id}",
"starred_url": "https://api.github.com/users/royjhan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/royjhan/subscriptions",
"organizations_url": "https://api.github.com/users/royjhan/orgs",
"repos_url": "https://api.github.com/users/royjhan/repos",
"events_url": "https://api.github.com/users/royjhan/events{/privacy}",
"received_events_url": "https://api.github.com/users/royjhan/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/4841/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/4841/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/1492
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/1492/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/1492/comments
|
https://api.github.com/repos/ollama/ollama/issues/1492/events
|
https://github.com/ollama/ollama/issues/1492
| 2,038,707,483
|
I_kwDOJ0Z1Ps55hDUb
| 1,492
|
7b model on Colab: CUDA error 2 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/gguf/ggml-cuda.cu:8001: out of memory
|
{
"login": "nnWhisperer",
"id": 13225349,
"node_id": "MDQ6VXNlcjEzMjI1MzQ5",
"avatar_url": "https://avatars.githubusercontent.com/u/13225349?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nnWhisperer",
"html_url": "https://github.com/nnWhisperer",
"followers_url": "https://api.github.com/users/nnWhisperer/followers",
"following_url": "https://api.github.com/users/nnWhisperer/following{/other_user}",
"gists_url": "https://api.github.com/users/nnWhisperer/gists{/gist_id}",
"starred_url": "https://api.github.com/users/nnWhisperer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nnWhisperer/subscriptions",
"organizations_url": "https://api.github.com/users/nnWhisperer/orgs",
"repos_url": "https://api.github.com/users/nnWhisperer/repos",
"events_url": "https://api.github.com/users/nnWhisperer/events{/privacy}",
"received_events_url": "https://api.github.com/users/nnWhisperer/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 3
| 2023-12-12T23:41:00
| 2024-01-14T22:13:02
| 2024-01-14T22:13:02
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hello,
On a Google Colab instance with 50 GB RAM and a 16 GB VRAM T4 GPU (the problem persisted on a V100 instance), I install Ollama as follows:
```
!sudo curl -L https://ollama.ai/download/ollama-linux-amd64 -o /usr/bin/ollama
!sudo chmod +x /usr/bin/ollama
!ollama serve
```
On the terminal I say:
`ollama run yarn-mistral:7b-128k`
The log shows the following error even though only 4.3 GB of the 16 GB VRAM was in use:
`CUDA error 2 at /go/src/github.com/jmorganca/ollama/llm/llama.cpp/gguf/ggml-cuda.cu:8001: out of memory`
Following are the logs received:
[output.log](https://github.com/jmorganca/ollama/files/13654181/output.log)
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/1492/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/1492/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/6419
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/6419/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/6419/comments
|
https://api.github.com/repos/ollama/ollama/issues/6419/events
|
https://github.com/ollama/ollama/issues/6419
| 2,473,004,850
|
I_kwDOJ0Z1Ps6TZw8y
| 6,419
|
Ollama Tools - random results without providing tools in second call
|
{
"login": "jprogramista",
"id": 240528,
"node_id": "MDQ6VXNlcjI0MDUyOA==",
"avatar_url": "https://avatars.githubusercontent.com/u/240528?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jprogramista",
"html_url": "https://github.com/jprogramista",
"followers_url": "https://api.github.com/users/jprogramista/followers",
"following_url": "https://api.github.com/users/jprogramista/following{/other_user}",
"gists_url": "https://api.github.com/users/jprogramista/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jprogramista/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jprogramista/subscriptions",
"organizations_url": "https://api.github.com/users/jprogramista/orgs",
"repos_url": "https://api.github.com/users/jprogramista/repos",
"events_url": "https://api.github.com/users/jprogramista/events{/privacy}",
"received_events_url": "https://api.github.com/users/jprogramista/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"id": 5667396184,
"node_id": "LA_kwDOJ0Z1Ps8AAAABUc2aWA",
"url": "https://api.github.com/repos/ollama/ollama/labels/bug",
"name": "bug",
"color": "d73a4a",
"default": true,
"description": "Something isn't working"
}
] |
closed
| false
| null |
[] | null | 2
| 2024-08-19T10:22:41
| 2024-08-22T03:58:57
| 2024-08-22T01:27:54
|
NONE
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
### What is the issue?
Not always, but very often, this script: https://github.com/ollama/ollama-python/blob/main/examples/tools/main.py gives me random answers (I modified `'NYC-LAX': {'departure': '08:00 AM', 'arrival': '11:30 PM', 'duration': '15h 30m'}` and expect to get 15h 30m from the tool, but get 5h 30m or 3h 50m), as if the tools were not included in the query; sometimes the tool answer is included, but more often it is not. If I include the tools in the `final_response` call, the tool response is included:
```
final_response = client.chat(model=model, messages=messages, tools=tools)
```
I have also tried the sync client and observed the same behaviour.
I also tried a REST call:
```
{
"model": "llama3.1",
"messages": [
{
"content": "What is the flight price from New York (NYC) to Los Angeles (LAX)?",
"role": "user"
},
{
"content": "",
"role": "assistant",
"tool_calls": [
{
"function": {
"arguments": {
"arrival": "LAX",
"departure": "NYC"
},
"name": "get_flight_info"
}
}
]
},
{
"content": "{"from": "NYC", "to": "LAX", "duration": "15h 30m", "price": "150 USD"}",
"role": "tool"
}
],
"options": {},
"stream": false,
"tools": [
{
"type": "function",
"function": {
"name": "get_flight_info",
"description": "Get the flight data",
"parameters": {
"properties": {
"arrival": {
"description": "The arrival ",
"departure": "string"
},
"departure": {
"description": "The departure",
"type": "string"
}
},
"required": [
"arrival",
"departure"
]
}
}
}
]
}
```
which gives me correct results 100% of the time when the "tools" node is provided, and incorrect results roughly 90% of the time when it is not.
Model is 8b.
### OS
macOS
### GPU
Apple
### CPU
Apple
### Ollama version
0.3.6
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/6419/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/6419/timeline
| null |
not_planned
| false
|
https://api.github.com/repos/ollama/ollama/issues/533
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/533/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/533/comments
|
https://api.github.com/repos/ollama/ollama/issues/533/events
|
https://github.com/ollama/ollama/issues/533
| 1,898,422,416
|
I_kwDOJ0Z1Ps5xJ6CQ
| 533
|
GPU Support for Ollama on Microsoft Windows
|
{
"login": "dcasota",
"id": 14890243,
"node_id": "MDQ6VXNlcjE0ODkwMjQz",
"avatar_url": "https://avatars.githubusercontent.com/u/14890243?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dcasota",
"html_url": "https://github.com/dcasota",
"followers_url": "https://api.github.com/users/dcasota/followers",
"following_url": "https://api.github.com/users/dcasota/following{/other_user}",
"gists_url": "https://api.github.com/users/dcasota/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dcasota/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dcasota/subscriptions",
"organizations_url": "https://api.github.com/users/dcasota/orgs",
"repos_url": "https://api.github.com/users/dcasota/repos",
"events_url": "https://api.github.com/users/dcasota/events{/privacy}",
"received_events_url": "https://api.github.com/users/dcasota/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
|
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[
{
"login": "BruceMacD",
"id": 5853428,
"node_id": "MDQ6VXNlcjU4NTM0Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/5853428?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BruceMacD",
"html_url": "https://github.com/BruceMacD",
"followers_url": "https://api.github.com/users/BruceMacD/followers",
"following_url": "https://api.github.com/users/BruceMacD/following{/other_user}",
"gists_url": "https://api.github.com/users/BruceMacD/gists{/gist_id}",
"starred_url": "https://api.github.com/users/BruceMacD/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BruceMacD/subscriptions",
"organizations_url": "https://api.github.com/users/BruceMacD/orgs",
"repos_url": "https://api.github.com/users/BruceMacD/repos",
"events_url": "https://api.github.com/users/BruceMacD/events{/privacy}",
"received_events_url": "https://api.github.com/users/BruceMacD/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
] | null | 13
| 2023-09-15T13:19:06
| 2024-03-17T03:00:44
| 2023-10-26T00:34:13
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | null | null |
Hi,
There is currently no setup description for building and running Ollama from source with an Nvidia GPU on Microsoft Windows, and the Ollama source code still has some TODOs as well; is that right?
Here are some thoughts.
Setup
-
1. NVidia drivers
1A. Software drivers: https://www.nvidia.com/download/index.aspx
1B. Nvidia CUDA Toolkit https://developer.nvidia.com/cuda-downloads
Check the GPU support with nvidia-smi.exe, and check nvcc.exe for the CUDA compilation tools (11/12).
~1C. NVidia Omniverse >PhysX>Blast seems to become necessary for NVidia gpu support, as well.~
~`git clone https://github.com/NVIDIA-Omniverse/PhysX`~
~`call .\PhysX\blast\build.bat`~
2. Git https://git-scm.com/download/win
3. Python https://www.python.org/downloads/windows/
4. Go https://go.dev/doc/install
5. Gcc https://sourceforge.net/projects/mingw-w64/files/mingw-w64/mingw-w64-release/
6. Cmake https://cmake.org/download/
7. Winlibs https://winlibs.com/
8. Bazel https://github.com/bazelbuild/bazel/releases
edited:
With respect to the content in .\examples, a few additional tools are necessary to make requirements.txt install on Microsoft Windows. Some of the dependencies have to be installed separately (steps 6-8), and most can be added simply with `pip install`.
The following code snippet still produces warnings, but it helps get .\examples\langchain-document\main.py started.
```
pip install unstructured
pip install pdf2image
pip install pdfminer
pip install pdfminer.six
pip install pyproject.toml
pip install pysqlite3
pip install gpt4all
pip install chromadb
pip install tensorflow
pip install opencv-python
pip install bazel-runfiles
pip install -r .\examples\langchain-document\requirements.txt
pip install langchain
```
After that, install Ollama.
`git clone https://github.com/jmorganca/ollama`
`cd .\ollama`
`mkdir ..\.ollama`
`go generate .\...`
`go build -ldflags '-linkmode external -extldflags "-static"' .`
Check if the executable ollama.exe has been created.
Foreseen source code modifications
-
llm\llama.go, function chooseRunner, function NumGPU
docs\development.md
generate_darwin_amd64.go (compare with generate_linux.go for cuda)
...
|
{
"login": "jmorganca",
"id": 251292,
"node_id": "MDQ6VXNlcjI1MTI5Mg==",
"avatar_url": "https://avatars.githubusercontent.com/u/251292?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jmorganca",
"html_url": "https://github.com/jmorganca",
"followers_url": "https://api.github.com/users/jmorganca/followers",
"following_url": "https://api.github.com/users/jmorganca/following{/other_user}",
"gists_url": "https://api.github.com/users/jmorganca/gists{/gist_id}",
"starred_url": "https://api.github.com/users/jmorganca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmorganca/subscriptions",
"organizations_url": "https://api.github.com/users/jmorganca/orgs",
"repos_url": "https://api.github.com/users/jmorganca/repos",
"events_url": "https://api.github.com/users/jmorganca/events{/privacy}",
"received_events_url": "https://api.github.com/users/jmorganca/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/533/reactions",
"total_count": 2,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 2,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/533/timeline
| null |
completed
| false
|
https://api.github.com/repos/ollama/ollama/issues/3282
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3282/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3282/comments
|
https://api.github.com/repos/ollama/ollama/issues/3282/events
|
https://github.com/ollama/ollama/pull/3282
| 2,199,761,390
|
PR_kwDOJ0Z1Ps5qV3K5
| 3,282
|
Add docs for GPU selection and nvidia uvm workaround
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-03-21T10:20:29
| 2024-07-24T15:14:33
| 2024-03-24T18:15:04
|
COLLABORATOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3282",
"html_url": "https://github.com/ollama/ollama/pull/3282",
"diff_url": "https://github.com/ollama/ollama/pull/3282.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3282.patch",
"merged_at": "2024-03-24T18:15:04"
}
|
Fixes #1813
Fixes #2934
Fixes #2718
|
{
"login": "dhiltgen",
"id": 4033016,
"node_id": "MDQ6VXNlcjQwMzMwMTY=",
"avatar_url": "https://avatars.githubusercontent.com/u/4033016?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhiltgen",
"html_url": "https://github.com/dhiltgen",
"followers_url": "https://api.github.com/users/dhiltgen/followers",
"following_url": "https://api.github.com/users/dhiltgen/following{/other_user}",
"gists_url": "https://api.github.com/users/dhiltgen/gists{/gist_id}",
"starred_url": "https://api.github.com/users/dhiltgen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhiltgen/subscriptions",
"organizations_url": "https://api.github.com/users/dhiltgen/orgs",
"repos_url": "https://api.github.com/users/dhiltgen/repos",
"events_url": "https://api.github.com/users/dhiltgen/events{/privacy}",
"received_events_url": "https://api.github.com/users/dhiltgen/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3282/reactions",
"total_count": 2,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3282/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/3506
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/3506/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/3506/comments
|
https://api.github.com/repos/ollama/ollama/issues/3506/events
|
https://github.com/ollama/ollama/pull/3506
| 2,228,366,846
|
PR_kwDOJ0Z1Ps5r3PBG
| 3,506
|
cgo quantize
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
closed
| false
| null |
[] | null | 0
| 2024-04-05T16:00:19
| 2024-04-09T19:32:54
| 2024-04-09T19:32:53
|
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | false
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/3506",
"html_url": "https://github.com/ollama/ollama/pull/3506",
"diff_url": "https://github.com/ollama/ollama/pull/3506.diff",
"patch_url": "https://github.com/ollama/ollama/pull/3506.patch",
"merged_at": "2024-04-09T19:32:53"
}
|
revive #307
This will _only_ quantize a converted model; quantizing an arbitrary fp16/fp32 model will be a follow-up.
|
{
"login": "mxyng",
"id": 2372640,
"node_id": "MDQ6VXNlcjIzNzI2NDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/2372640?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mxyng",
"html_url": "https://github.com/mxyng",
"followers_url": "https://api.github.com/users/mxyng/followers",
"following_url": "https://api.github.com/users/mxyng/following{/other_user}",
"gists_url": "https://api.github.com/users/mxyng/gists{/gist_id}",
"starred_url": "https://api.github.com/users/mxyng/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxyng/subscriptions",
"organizations_url": "https://api.github.com/users/mxyng/orgs",
"repos_url": "https://api.github.com/users/mxyng/repos",
"events_url": "https://api.github.com/users/mxyng/events{/privacy}",
"received_events_url": "https://api.github.com/users/mxyng/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
{
"url": "https://api.github.com/repos/ollama/ollama/issues/3506/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/3506/timeline
| null | null | true
|
https://api.github.com/repos/ollama/ollama/issues/8124
|
https://api.github.com/repos/ollama/ollama
|
https://api.github.com/repos/ollama/ollama/issues/8124/labels{/name}
|
https://api.github.com/repos/ollama/ollama/issues/8124/comments
|
https://api.github.com/repos/ollama/ollama/issues/8124/events
|
https://github.com/ollama/ollama/pull/8124
| 2,743,484,124
|
PR_kwDOJ0Z1Ps6Fao2N
| 8,124
|
grammar: introduce new grammar package
|
{
"login": "bmizerany",
"id": 46,
"node_id": "MDQ6VXNlcjQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/46?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/bmizerany",
"html_url": "https://github.com/bmizerany",
"followers_url": "https://api.github.com/users/bmizerany/followers",
"following_url": "https://api.github.com/users/bmizerany/following{/other_user}",
"gists_url": "https://api.github.com/users/bmizerany/gists{/gist_id}",
"starred_url": "https://api.github.com/users/bmizerany/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmizerany/subscriptions",
"organizations_url": "https://api.github.com/users/bmizerany/orgs",
"repos_url": "https://api.github.com/users/bmizerany/repos",
"events_url": "https://api.github.com/users/bmizerany/events{/privacy}",
"received_events_url": "https://api.github.com/users/bmizerany/received_events",
"type": "User",
"user_view_type": "public",
"site_admin": false
}
|
[] |
open
| false
| null |
[] | null | 0
| 2024-12-16T22:00:57
| 2024-12-18T00:10:59
| null |
CONTRIBUTOR
|
{
"total": 0,
"completed": 0,
"percent_completed": 0
}
| null | true
|
{
"url": "https://api.github.com/repos/ollama/ollama/pulls/8124",
"html_url": "https://github.com/ollama/ollama/pull/8124",
"diff_url": "https://github.com/ollama/ollama/pull/8124.diff",
"patch_url": "https://github.com/ollama/ollama/pull/8124.patch",
"merged_at": null
}
|
This package provides a way to convert JSON schemas to equivalent EBNF. It is intended as a replacement for llama.cpp's schema_to_grammar.
This is still an early version and does not yet support all JSON schema features. The to-do list includes:
- [ ] minimum/maximum constraints on integer types
- [ ] minLength/maxLength constraints on string types
- [ ] defs and refs
- [ ] performance improvements once we're happy with correctness and feature set.
NOTE: This also fixes outstanding issues caused by llama.cpp's schema_to_grammar; specifically, that it does not maintain the order of object keys in the schema when producing the resulting EBNF.
Owning our own converter allows us to have more control over bug prevention, speed, and accuracy of conversions, and removes yet another C dependency.
| null |
{
"url": "https://api.github.com/repos/ollama/ollama/issues/8124/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
}
|
https://api.github.com/repos/ollama/ollama/issues/8124/timeline
| null | null | true
|