Runtime error
Unfortunately, my guess was incorrect, and we are missing 13B due to a minor RAM insufficiency. :(
app.py
CHANGED
@@ -92,7 +92,7 @@ You can learn more about LoRa here:
 
 This space is loading the model to RAM without performing any quantization, so the required RAM is high.
 
-You can merge models up to
+You can merge models up to 7B. (If your adapter weights are too large, it might not work.)
 """
 
 
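As a rough back-of-the-envelope check on why 13B misses the cut while 7B fits (a sketch only: it assumes unquantized fp16/bf16 weights at 2 bytes per parameter, and ignores the adapter weights and any runtime overhead, so real peak usage is higher):

```python
def fp16_weights_gib(n_params: float) -> float:
    """GiB needed just to hold the weights at 2 bytes per parameter (fp16/bf16)."""
    return n_params * 2 / 2**30

# A 7B model needs ~13 GiB for its weights alone; 13B needs ~24 GiB,
# which is where a modest RAM shortfall pushes the merge over the limit.
print(round(fp16_weights_gib(7e9), 1))   # ~13.0
print(round(fp16_weights_gib(13e9), 1))  # ~24.2
```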