SS-350M-SQL-Strict-GGUF
This repository contains the GGUF quantization of SS-350M-SQL-Strict.
Model Summary
SS-350M-SQL-Strict-GGUF is a specialized, ultra-lightweight Small Language Model (SLM) optimized for Text-to-SQL translation on edge devices. Built upon the LiquidAI LFM2.5-350M architecture, this model is engineered for "Strict" output: it generates only raw SQL code, with no conversational filler, explanations, or Markdown formatting.
Technical Specifications
- Architecture: Liquid Foundation Model (LFM) 2.5
- Parameters: 350 Million
- Quantization: Q8_0 (8-bit)
- Model Size: ~370 MB
- Context Length: 32,768 tokens
- Inference Engine: Optimized for llama.cpp
Key Features
- Zero Filler: Returns raw SQL queries immediately (no "Sure, here is your code").
- High Speed: Leverages LFM's linear-complexity architecture for near-instantaneous generation on CPUs.
- Low Footprint: Runs comfortably on devices with < 1GB RAM, making it ideal for mobile or embedded database interfaces.
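Even with strict training, downstream pipelines may want a defensive post-processing step. The helper below is a hypothetical consumer-side sketch (not part of the model or this repository) that strips stray Markdown fences or chatty prefixes if any ever slip through:

```python
import re

def extract_sql(response: str) -> str:
    """Defensively strip Markdown fences and chatty prefixes from model output.

    The model is trained to emit raw SQL, so this is normally a no-op;
    it exists only as a safety net for downstream pipelines.
    """
    text = response.strip()
    # Remove a surrounding ```sql ... ``` fence if one slipped through.
    fence = re.match(r"^```(?:sql)?\s*(.*?)\s*```$", text, re.DOTALL | re.IGNORECASE)
    if fence:
        text = fence.group(1).strip()
    # Drop a conversational prefix such as "Sure, here is your code:".
    text = re.sub(r"^(sure|here)[^\n]*?:\s*", "", text, flags=re.IGNORECASE)
    return text
```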
Prompting Specification (ChatML)
To ensure the "Strict" behavior and prevent hallucinations, you must follow the ChatML prompt format.
Template
```
<|im_start|>system
You are a SQL translation engine. Return ONLY raw SQL. Schema: {YOUR_SCHEMA}<|im_end|>
<|im_start|>user
{YOUR_QUESTION}<|im_end|>
<|im_start|>assistant
```
Example Input
System: Table 'employees' (id, name, department, salary)
User: Find the total salary of the 'Sales' department.
Example Output
SELECT SUM(salary) FROM employees WHERE department = 'Sales';
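The template above can be assembled programmatically. The helper name below is our own; the string layout simply follows the ChatML format shown in the template:

```python
def build_prompt(schema: str, question: str) -> str:
    """Assemble the strict ChatML prompt expected by the model."""
    return (
        "<|im_start|>system\n"
        f"You are a SQL translation engine. Return ONLY raw SQL. Schema: {schema}<|im_end|>\n"
        "<|im_start|>user\n"
        f"{question}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = build_prompt(
    "Table 'employees' (id, name, department, salary)",
    "Find the total salary of the 'Sales' department.",
)
```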
Local Deployment with llama.cpp
You can run this model locally using the following command:
```bash
./llama-cli -m SS-350M-SQL-Strict.Q8_0.gguf \
  -p "<|im_start|>system\nYou are a SQL engine. Return ONLY raw SQL. Schema: Table 'inventory' (item, quantity)<|im_end|>\n<|im_start|>user\nHow many items are in stock?<|im_end|>\n<|im_start|>assistant\n" \
  --temp 0 \
  -n 128
```
Training Logic
The base model was fine-tuned using 4-bit QLoRA on the Gretel Synthetic SQL dataset. A key differentiator in its training was Completion-Only Loss masking: loss is computed only on the assistant's SQL completion, with prompt tokens masked out, so the training signal is spent entirely on SQL generation rather than on reproducing prompt structure.
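Completion-only loss masking can be sketched as follows: labels for prompt tokens are set to the ignore index (-100, the convention used by PyTorch-style cross-entropy), so only the SQL completion contributes to the loss. This is an illustrative, framework-free sketch, not the repository's training code:

```python
IGNORE_INDEX = -100  # ignore_index convention used by PyTorch cross-entropy loss

def mask_prompt_labels(input_ids: list[int], prompt_len: int) -> list[int]:
    """Return labels where prompt tokens are ignored, so only the
    completion (the SQL answer) contributes to the training loss."""
    labels = list(input_ids)
    for i in range(min(prompt_len, len(labels))):
        labels[i] = IGNORE_INDEX
    return labels

# Toy example: 4 prompt tokens followed by 3 completion tokens.
tokens = [101, 102, 103, 104, 7, 8, 9]
labels = mask_prompt_labels(tokens, prompt_len=4)
```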
Limitations & Dialect
- Dialect: Defaults to Standard SQL.
- Complexity: Best suited for schemas with fewer than 20 tables.
- Reasoning: This is a translation engine; it does not "think" step-by-step or explain its logic. If the input is ambiguous, it will attempt the most likely SQL translation.
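Because the model emits SQL without explanation, callers may want to validate output before executing it. A minimal consumer-side check (our own sketch, using Python's stdlib sqlite3; note that SQLite will not catch dialect features it does not itself support):

```python
import sqlite3

def is_valid_sql(query: str, schema_ddl: str) -> bool:
    """Check that a generated query parses and plans against the schema.

    EXPLAIN asks SQLite to compile the statement without running it,
    so syntax errors and unknown tables/columns are caught cheaply.
    """
    conn = sqlite3.connect(":memory:")
    try:
        conn.executescript(schema_ddl)
        conn.execute(f"EXPLAIN {query}")
        return True
    except sqlite3.Error:
        return False
    finally:
        conn.close()
```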
Citation
If you use this model or the underlying LFM architecture, please cite:
```bibtex
@article{saadsalman2026sqlstrict,
  author = {Saad Salman},
  title  = {SS-350M-SQL-Strict: Edge-Optimized Text-to-SQL},
  year   = {2026}
}
```
Base Model
- LiquidAI/LFM2.5-350M-Base