Update README.md
Each branch contains an individual bits per weight, with the main one containing …
Original model: https://huggingface.co/google/codegemma-7b-it

## Prompt format

```
<bos><start_of_turn>user
{prompt}<end_of_turn>
<start_of_turn>model

```

## Available sizes

No GQA - VRAM requirements will be higher

| Branch | Bits | lm_head bits | Size (4k) | Size (16k) | Description |
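As a rough sketch, the prompt format above can be assembled with plain string formatting (the `build_prompt` helper below is illustrative only, not part of this repo or of any library API):

```python
# Illustrative helper (not from this repo): builds a single-turn prompt
# in the codegemma-7b-it chat format shown above.
def build_prompt(user_message: str) -> str:
    return (
        "<bos><start_of_turn>user\n"
        f"{user_message}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

print(build_prompt("Write a function that reverses a string."))
```

In practice, `tokenizer.apply_chat_template` from `transformers` should produce this same layout from the model's bundled chat template, which is the safer option if the template ever changes.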