How to use with the llama-cpp-python library
# !pip install llama-cpp-python

from llama_cpp import Llama

# Download the GGUF weights from the Hub and load them with llama.cpp
llm = Llama.from_pretrained(
	repo_id="itstalmeez/sireIQ_model_2",
	filename="tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf",
)

# Run a chat completion; messages is a list of role/content dicts
response = llm.create_chat_completion(
	messages=[
		{"role": "user", "content": "Hello! What can you help me with?"}
	]
)
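
create_chat_completion returns an OpenAI-style dictionary. A minimal sketch of reading the generated text from the non-streaming call above (the example prompt is illustrative, not from the original card):

# The first choice holds the assistant's reply
print(response["choices"][0]["message"]["content"])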


Check out the llama-cpp-python documentation for more information.

SireIQ

SireIQ is a chat-based AI assistant built for accurate and structured responses.

Usage

from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hub
tokenizer = AutoTokenizer.from_pretrained("itstalmeez/sireiq")
model = AutoModelForCausalLM.from_pretrained("itstalmeez/sireiq")

# Tokenize a prompt and generate a response
inputs = tokenizer("Hello", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
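
Since SireIQ is a chat-based assistant (and the GGUF file above is a TinyLlama chat variant), a chat-formatted prompt may give better results than a raw string. A minimal sketch, assuming the repo's tokenizer defines a chat template; the message content is only an example:

# Format a conversation with the tokenizer's chat template
# (assumes tokenizer_config.json in the repo defines one)
messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
prompt_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(prompt_ids, max_new_tokens=100)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][prompt_ids.shape[-1]:], skip_special_tokens=True))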