How to use aswin1906/llama-7b-sql-2k with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline

# The model is loaded below with AutoModelForCausalLM, so the matching
# pipeline task is "text-generation" (not "question-answering").
pipe = pipeline("text-generation", model="aswin1906/llama-7b-sql-2k")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("aswin1906/llama-7b-sql-2k")
model = AutoModelForCausalLM.from_pretrained("aswin1906/llama-7b-sql-2k")
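The snippet above loads the model but does not show generation. Below is a minimal sketch of how one might wrap loading and generation into a helper; the function name `generate_sql`, the example prompt, and the plain-instruction prompt style are assumptions here, since the exact prompt template used during fine-tuning is not documented in this snippet.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM


def generate_sql(prompt: str, model_id: str = "aswin1906/llama-7b-sql-2k") -> str:
    """Generate a SQL completion for an instruction prompt.

    The plain-instruction prompt style is an assumption; the exact
    template used during fine-tuning may differ.
    """
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    # Greedy decoding keeps the SQL output deterministic.
    outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt (hypothetical schema):
    print(generate_sql("List the names of all employees hired after 2020."))
```

The model weights are only downloaded when the function is called, so importing this module is cheap; for repeated calls you would want to hoist the tokenizer and model out of the function.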