Replace llama with Granite
README.md
CHANGED
```diff
@@ -11,25 +11,20 @@ Our flagship products focus on answering hard questions about code:
 
 ---
 
-
 ## Available Models
 
 | Model Family | Parameters | Model | Quantization | Direct Download |
-
-
-
-| Llama 3.2 | 8 Billion | Llama-3.2-8B-Instruct | Q5_K_M | [Download](https://huggingface.co/SciTools/Llama3.2/resolve/main/Llama-3.2-8B-Instruct-Q5_K_M.gguf?download=true) |
-| Qwen3 | .6 Billion | Qwen3-0.6B | Q4_K_M | [Download](https://huggingface.co/SciTools/Qwen3/resolve/main/Qwen3-0.6B-Q4_K_M.gguf?download=true) |
+|-------------|------------|-------|--------------|-----------------|
+| Granite 4 | 1 Billion | Granite-4.0-1B | Q4 (GGUF) | [Download](https://huggingface.co/ibm-granite/granite-4.0-1b-GGUF) |
+| Granite 4 | 3 Billion | Granite-4.0-Micro | Q4_K_M | [Download](https://huggingface.co/unsloth/granite-4.0-micro-Q4_K_M.gguf) |
 | Qwen3 | 1.7 Billion | Qwen3-1.7B | Q4_K_M | [Download](https://huggingface.co/SciTools/Qwen3/resolve/main/Qwen3-1.7B-Q4_K_M.gguf?download=true) |
 | Qwen3 | 4 Billion | Qwen3-4B | Q4_K_M | [Download](https://huggingface.co/SciTools/Qwen3/resolve/main/Qwen3-4B-Q4_K_M.gguf?download=true) |
 | Qwen3 | 8 Billion | Qwen3-8B | Q4_K_M | [Download](https://huggingface.co/SciTools/Qwen3/resolve/main/Qwen3-8B-Q4_K_M.gguf?download=true) |
-| Qwen3 | 14 Billion | Qwen3-14B | Q4_K_M | [Download](https://huggingface.co/SciTools/Qwen3/resolve/main/Qwen3-14B-Q4_K_M.gguf?download=true) |
 
 [Help me choose a local LLM for Understand](https://support.scitools.com/support/solutions/articles/70000680303-choosing-a-local-llm-for-understand)
 
 ---
 
-
 ## What We Build
 
 **Understand**
@@ -52,6 +47,7 @@ We focus on **local, transparent, and developer-controlled AI**.
 The models hosted here are provided to support SciTools products as well as other GGUF-compatible tools.
 
 ---
+
 ## Who Uses SciTools
 
 SciTools products are used by:
```
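The direct-download links for the SciTools-hosted GGUF files above all follow Hugging Face's `resolve/main` URL pattern (the Granite rows instead link to repo pages, which follow a different scheme). As a minimal sketch of how such a link is formed, assuming a repo id and filename taken from the Qwen3-4B row of the table:

```python
# Sketch: build a direct-download URL for a GGUF file hosted on Hugging Face,
# following the resolve/main?download=true pattern used by the table's links.

def gguf_download_url(repo: str, filename: str) -> str:
    """Return a direct-download URL for a file in a Hugging Face repo."""
    return f"https://huggingface.co/{repo}/resolve/main/{filename}?download=true"

# Repo and filename here come from the Qwen3-4B row; any SciTools row works the same way.
url = gguf_download_url("SciTools/Qwen3", "Qwen3-4B-Q4_K_M.gguf")
print(url)
```

The resulting URL matches the Qwen3-4B Download link in the table verbatim.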