Part of the Model Depot - ONNX collection: leading models packaged in ONNX format, optimized for use with AI PCs.
slim-sql-onnx is a small, specialized function-calling model. It takes a table schema and a natural-language query as input, and outputs a SQL statement that corresponds to the query and can be run against a database table. It is a very small text-to-SQL model designed for reasonable accuracy on single tables and relatively straightforward queries, and for easy integration into multi-step processes.

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("llmware/slim-sql-onnx")
model = AutoModelForCausalLM.from_pretrained("llmware/slim-sql-onnx")
```
This is an ONNX int4 quantized version of slim-sql-1b-v0, providing a very fast, very small inference implementation optimized for AI PCs using Intel GPU, CPU, and NPU.
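Since the model expects both a table schema and a natural-language question in a single prompt, here is a minimal sketch of how the two inputs might be combined before generation. The `<human>:`/`<bot>:` wrapper and the `build_sql_prompt` helper are assumptions for illustration (modeled on prompt formats used by other llmware SLIM models), not part of this repository; check the model card for the exact format.

```python
# Hypothetical helper illustrating the model's input shape: a CREATE TABLE
# schema plus a natural-language question, joined into one prompt string.
# The "<human>:"/"<bot>:" wrapper is an assumption, not confirmed by this repo.

def build_sql_prompt(table_schema: str, question: str) -> str:
    """Combine a table schema and a question into a single text-to-SQL prompt."""
    return f"<human>: {table_schema}\n{question}\n<bot>:"

schema = "CREATE TABLE customers (customer_name text, annual_spend integer)"
question = "What are the names of customers with annual spend over 1000?"

prompt = build_sql_prompt(schema, question)
print(prompt)
```

The resulting string would then be passed to the tokenizer (or the pipeline shown below), and the model's completion after the prompt is the SQL statement.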
Base model: llmware/slim-sql-1b-v0
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="llmware/slim-sql-onnx")
```