# SireIQ

SireIQ is a chat-based AI assistant built to give accurate, well-structured responses.
## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("itstalmeez/sireiq")
model = AutoModelForCausalLM.from_pretrained("itstalmeez/sireiq")

# Tokenize a prompt and generate a continuation
inputs = tokenizer("Hello", return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
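Note that for a decoder-only causal LM, `generate()` returns the prompt tokens followed by the newly generated tokens, so decoding `outputs[0]` echoes the prompt. A minimal, model-free sketch of the slicing step that keeps only the continuation (the token IDs below are illustrative, not real vocabulary entries):

```python
# Sketch: generate() output = prompt tokens + new tokens for causal LMs,
# so slice off the prompt before decoding. IDs here are placeholders.
prompt_ids = [101, 7592, 102]            # stands in for the encoded prompt
output_ids = prompt_ids + [2054, 2003]   # stands in for generate()'s output

new_ids = output_ids[len(prompt_ids):]   # keep only the continuation
```

With real tensors the same slice is `outputs[0][inputs["input_ids"].shape[1]:]`, which can then be passed to `tokenizer.decode`.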
A GGUF quantization is also available in the `itstalmeez/sireIQ_model_2` repo for use with `llama-cpp-python`:

```python
# !pip install llama-cpp-python
from llama_cpp import Llama

# Download and load the quantized GGUF weights from the Hub
llm = Llama.from_pretrained(
    repo_id="itstalmeez/sireIQ_model_2",
    filename="tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf",
)

llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "Hello"}
    ]
)
```
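`create_chat_completion` takes an OpenAI-style list of role/content dicts. A minimal, model-free sketch of assembling a multi-turn message list (the `build_messages` helper is an illustration of ours, not part of llama-cpp-python):

```python
def build_messages(system_prompt, turns):
    """Build an OpenAI-style message list.

    turns: list of (user_text, assistant_text) pairs; assistant_text may be
    None for the turn that is still awaiting a reply.
    """
    messages = [{"role": "system", "content": system_prompt}]
    for user_text, assistant_text in turns:
        messages.append({"role": "user", "content": user_text})
        if assistant_text is not None:
            messages.append({"role": "assistant", "content": assistant_text})
    return messages

msgs = build_messages("You are SireIQ, a helpful assistant.", [("Hello", None)])
```

The resulting list can be passed directly as `llm.create_chat_completion(messages=msgs)`.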