Instructions to use defog/sqlcoder-7b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use defog/sqlcoder-7b with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="defog/sqlcoder-7b")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("defog/sqlcoder-7b")
model = AutoModelForCausalLM.from_pretrained("defog/sqlcoder-7b")
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use defog/sqlcoder-7b with vLLM:
Install from pip and serve the model:
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "defog/sqlcoder-7b"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "defog/sqlcoder-7b",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
Use Docker:
```shell
docker model run hf.co/defog/sqlcoder-7b
```
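The vLLM server started above exposes an OpenAI-compatible API, so you can also call it from Python instead of curl. Below is a minimal sketch using only the standard library; the URL matches the default `vllm serve` port from the example, and `complete()` should only be called while the server is actually running.

```python
import json
import urllib.request

API_URL = "http://localhost:8000/v1/completions"  # vLLM default port

def build_payload(prompt: str, max_tokens: int = 512, temperature: float = 0.5) -> dict:
    # Mirrors the fields used in the curl example above.
    return {
        "model": "defog/sqlcoder-7b",
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

def complete(prompt: str) -> str:
    """POST to the running vLLM server and return the generated text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["text"]

payload = build_payload("Once upon a time,")
# complete("Once upon a time,")  # requires the server from the step above
```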
- SGLang
How to use defog/sqlcoder-7b with SGLang:
Install from pip and serve the model:
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "defog/sqlcoder-7b" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "defog/sqlcoder-7b",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
Use Docker images:
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "defog/sqlcoder-7b" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "defog/sqlcoder-7b",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
- Docker Model Runner
How to use defog/sqlcoder-7b with Docker Model Runner:
```shell
docker model run hf.co/defog/sqlcoder-7b
```
The prompt below adds comments and normalizes your schema script to match the example format. I tested it with Google Bard to normalize a script into the format the model expects; edit the prompt to your needs.
Prompt:
"
Given the following SQL database CREATE TABLE scripts, add a small comment that describes the meaning of each column. Note that the column names are in PT-BR, but the comments should be in English. The scripts represent the database schema of a Profit and Loss data warehouse in a star schema model.
Input Example:
CREATE TABLE products (
product_id INTEGER PRIMARY KEY,
name VARCHAR(50),
price DECIMAL(10,2),
quantity INTEGER
);
Here is an expected response example:
CREATE TABLE products (
product_id INTEGER PRIMARY KEY, -- Unique ID for each product
name VARCHAR(50), -- Name of the product
price DECIMAL(10,2), -- Price of each unit of the product
quantity INTEGER -- Current quantity in stock
);
The response should contain only the original script, with each column description beside the column declaration.
"
After sending this first instruction, paste your SQL script and the LLM should return the script with brief comments.
Add this to retrieve the relationships using the column names:
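Once the schema is commented, you can assemble the text-to-SQL prompt for sqlcoder programmatically. The section layout below is an illustrative assumption, not the official template (check the defog/sqlcoder-7b model card for the recommended format); the helper only builds the prompt string, which you would then pass to the Transformers pipeline or the server endpoints above.

```python
def build_sqlcoder_prompt(schema: str, question: str) -> str:
    """Assemble a text-to-SQL prompt from a commented schema and a question.

    The section headers below are an assumed layout -- adapt them to the
    exact template recommended in the model card.
    """
    return (
        "### Task\n"
        f"Generate a SQL query to answer the following question: {question}\n\n"
        "### Database Schema\n"
        f"{schema}\n\n"
        "### SQL\n"
    )

schema = """CREATE TABLE products (
  product_id INTEGER PRIMARY KEY, -- Unique ID for each product
  name VARCHAR(50) -- Name of the product
);"""

prompt = build_sqlcoder_prompt(schema, "How many products are there?")
# Pass `prompt` to the model, e.g.: pipe(prompt, max_new_tokens=200)
```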
"
Give the relationships between tables like the example below. Maintain the same pattern "table_name.column_name can be joined with table_name.column_name":
OUTPUT EXPECTED (do not use this in the output, use for reference only):
-- sales.product_id can be joined with products.product_id
-- sales.customer_id can be joined with customers.customer_id
-- sales.salesperson_id can be joined with salespeople.salesperson_id
-- product_suppliers.product_id can be joined with products.product_id
Use the column names to identify foreign keys. A column that matches a column name from another table is considered a relationship.
"