---
language:
- en
license: mit
tags:
- text-to-sql
- t5
- nlp2sql
- agentic-ai
- postgresql
datasets:
- spider
metrics:
- accuracy
model-index:
- name: T5-NL2SQL-Gen
  results: []
---

# 🤖 T5-NL2SQL-Gen Specialist

This model is a fine-tuned **T5-Small** specialized for converting natural-language questions into precise PostgreSQL queries. It serves as the primary "Reasoning Specialist" within the **NLP2SQL Autonomous Intelligence Layer**.

## 🚀 Model Details
- **Architecture**: T5-Small (Text-to-Text Transfer Transformer)
- **Specialization**: PostgreSQL query generation
- **Training Data**: Fine-tuned on SQL-specific datasets (Spider/WikiSQL) and custom schema-mapped samples.
- **Project Role**: Acts as the initial SQL generator in a hybrid agentic loop.

## 🔄 Hybrid Agentic Flow
This model is designed to work in tandem with large language models (such as Gemini 2.0 Flash) in a structured multi-agent workflow:
1. **Local ML (T5)**: Generates the initial high-speed SQL draft.
2. **Gemini Auditor**: Validates the draft against the actual schema, double-quotes identifiers, and fixes hallucinated tables or columns.
3. **Self-Healing Loop**: If execution fails, the agents use this model's logic to refine the plan.

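The three steps above amount to a generate-audit-retry loop. A minimal sketch, where `draft_sql`, `audit_sql`, and `execute_sql` are hypothetical stand-ins (not part of this repository) for the T5 generator, the Gemini auditor, and the database executor:

```python
# Sketch of the hybrid agentic flow: local draft -> audit -> self-healing retry.
# The three helpers below are hypothetical stand-ins, not real project APIs.

def draft_sql(question: str) -> str:
    # Stand-in for the local T5 model's fast first draft.
    return "SELECT COUNT(*) FROM users"

def audit_sql(sql: str, schema: dict) -> str:
    # Stand-in for the LLM auditor: double-quote table names found in the schema.
    for table in schema:
        sql = sql.replace(f" {table}", f' "{table}"')
    return sql

def execute_sql(sql: str) -> bool:
    # Stand-in for real execution; here, succeed once identifiers are quoted.
    return '"users"' in sql

def answer(question: str, schema: dict, max_retries: int = 2) -> str:
    sql = draft_sql(question)              # 1. Local ML (T5) draft
    for _ in range(max_retries + 1):
        sql = audit_sql(sql, schema)       # 2. Auditor validates and repairs
        if execute_sql(sql):               # 3. Self-healing loop on failure
            return sql
    raise RuntimeError("query could not be repaired")

print(answer("How many users signed up?", {"users": ["id", "username"]}))
```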
## 🔗 Project Context
This model is the engine for the **NLP2SQL Platform**.
- **GitHub Repository**: [sumit08099/NLP_SQL_GEN](https://github.com/sumit08099/NLP_SQL_GEN)
- **Frontend**: React 19 / Vite
- **Backend**: FastAPI / LangGraph

## 🛠 Usage (Hugging Face Transformers)
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = "Karan6124/t5-nl2sql-gen"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Prefix the question with the task instruction and append the table schema.
input_text = "translate English to SQL: How many users signed up in the last 30 days? \n Context: Table users (id, username, created_at)"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=512)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

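The `Context:` part of the input above can be assembled programmatically from a schema description. A minimal sketch, where `build_prompt` is a hypothetical helper (not part of this repository) that follows the `translate English to SQL: ... Context: ...` pattern shown above:

```python
# Hypothetical helper that builds the model's input string from a schema dict,
# matching the "translate English to SQL: ... Context: ..." prompt format.

def build_prompt(question: str, schema: dict) -> str:
    # Render each table as "Table name (col1, col2, ...)".
    context = "; ".join(
        f"Table {table} ({', '.join(columns)})"
        for table, columns in schema.items()
    )
    return f"translate English to SQL: {question} \n Context: {context}"

prompt = build_prompt(
    "How many users signed up in the last 30 days?",
    {"users": ["id", "username", "created_at"]},
)
print(prompt)
```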
## 👥 Lead Engineers
Developed and optimized by **Sumit** & **Karan**.