```python
# Generation example (assumes `llm` has been loaded as shown
# in the snippet at the end of this card).
output = llm(
    "Once upon a time,",
    max_tokens=512,
    echo=True,
)
print(output)
```

UwU_tinyllama-unsloth_V2.Q4_K_M.gguf
is trained on the https://huggingface.co/datasets/superdrew100/UwU_Alpaca_data_V2 dataset for 6 epochs and saved in GGUF format.
Due to an error on my part, every prompt must begin with:

Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
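Since the model was trained on Alpaca-formatted data, a small helper that wraps a raw instruction in that template can save retyping the preamble. This is only a sketch: the card quotes just the preamble sentence, so the `### Instruction:` / `### Input:` / `### Response:` section markers below are assumed from the standard Alpaca template, and `build_prompt` is a hypothetical helper name.

```python
# The preamble the card says every prompt must begin with.
ALPACA_PREAMBLE = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request."
)

def build_prompt(instruction: str, input_text: str = "") -> str:
    """Wrap a raw instruction in the (assumed) Alpaca template.

    The section headers below follow the standard Alpaca format; the card
    itself only specifies the preamble sentence.
    """
    prompt = f"{ALPACA_PREAMBLE}\n\n### Instruction:\n{instruction}\n"
    if input_text:
        prompt += f"\n### Input:\n{input_text}\n"
    prompt += "\n### Response:\n"
    return prompt
```

The formatted string can then be passed straight to `llm(...)` in place of a bare prompt.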
UwU_tinyllama-unsloth_V1.Q4_K_M.gguf
- unknown; test output is garbage
```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="superdrew100/UwU_tinyllama",
    # The filename was left blank in the original snippet; the GGUF file
    # named at the top of this card is filled in here.
    filename="UwU_tinyllama-unsloth_V2.Q4_K_M.gguf",
)
```