Instructions to use juierror/flan-t5-text2sql-with-schema with libraries, inference providers, notebooks, and local apps. Follow these links to get started.

- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use juierror/flan-t5-text2sql-with-schema with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("juierror/flan-t5-text2sql-with-schema")
model = AutoModelForSeq2SeqLM.from_pretrained("juierror/flan-t5-text2sql-with-schema")
```
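The loading snippet above can be extended into a small inference helper. A minimal sketch, with two assumptions: the `question: ... table: ...` prompt template below may differ from the exact format the model was fine-tuned on, and `build_prompt` is a hypothetical helper introduced here for illustration.

```python
from typing import List


def build_prompt(question: str, table: List[str]) -> str:
    # Assumed prompt template; the format used for fine-tuning may differ.
    return f"question: {question} table: {', '.join(table)}"


def inference(question: str, table: List[str]) -> str:
    # Imported lazily so build_prompt() stays usable without the dependency.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    model_id = "juierror/flan-t5-text2sql-with-schema"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(build_prompt(question, table), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


print(build_prompt("get people name with age equal 25", ["id", "name", "age"]))
# -> question: get people name with age equal 25 table: id, name, age
```

Calling `inference(...)` downloads the model weights on first use; `build_prompt` alone has no heavy dependencies.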
Update README.md
README.md CHANGED:

````diff
@@ -33,4 +33,6 @@ def inference(question: str, table: List[str]) -> str:
 print(inference(question="get people name with age equal 25", table=["id", "name", "age"]))
 ```
 
-PS. From this [discussion](https://huggingface.co/juierror/flan-t5-text2sql-with-schema/discussions/5), I think the base model that I use for finetune did not support the token `<`, so this might not be a good model to do this tasks.
+PS. From this [discussion](https://huggingface.co/juierror/flan-t5-text2sql-with-schema/discussions/5), I think the base model that I use for finetune did not support the token `<`, so this might not be a good model to do this tasks.
+
+However, you might consider to use work around method from [vonjack](https://huggingface.co/juierror/flan-t5-text2sql-with-schema/discussions/5#64743e462a74fb43ccec0a69).
````
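One generic shape such a workaround can take — purely illustrative, and not necessarily the method from the linked discussion — is post-processing: assume the model has been prompted or fine-tuned to emit a stand-in token (here `&lt;`, a hypothetical choice) wherever a literal `<` belongs, then map it back after generation.

```python
# Hypothetical illustration of working around a tokenizer that cannot emit "<".
# Assumption: the model produces the stand-in "&lt;" in place of "<"; we
# restore the literal character as a post-processing step on the generated SQL.

def restore_lt(sql: str, stand_in: str = "&lt;") -> str:
    """Replace the stand-in token with a literal '<' in generated SQL."""
    return sql.replace(stand_in, "<")


print(restore_lt("SELECT name FROM people WHERE age &lt; 25"))
# -> SELECT name FROM people WHERE age < 25
```

This only works if the model reliably emits the stand-in; see the linked discussion for the approach actually proposed there.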