---
license: mit
language:
- ru
- en
- uk
---
# 🔥 PyroNet-mini
PyroNet-mini is a lightweight variant of microsoft/Phi-4-mini-instruct,
fine-tuned on a custom multilingual instruction dataset (English, Ukrainian, Russian).
It belongs to the PyroNet family, serving as the mini version of the main PyroNet.
## ✨ Features
- 🔹 Based on Phi-4-mini-instruct
- 🔹 Fine-tuned with LoRA adapters
- 🔹 Multilingual (English / Українська / Русский)
- 🔹 Softer safety alignment for more natural answers
- 🔹 Optimized for dialogue, coding help, and educational purposes
## 🚀 Usage

### Inference with Transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Kenan023214/PyroNet-mini"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain quantum entanglement in simple words."},
]

# Format the conversation with the model's chat template
input_text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer(input_text, return_tensors="pt").to(model.device)

# Generate the response
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
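For reference, `apply_chat_template` turns the message list into a single tagged prompt string before tokenization. The standalone sketch below only illustrates that structure; the actual special tokens come from this model's tokenizer config, and the `<|system|>` / `<|user|>` / `<|end|>` markers shown here are assumed from the Phi model family, not verified against this checkpoint:

```python
# Hypothetical sketch of the prompt layout produced by apply_chat_template.
# The real special tokens are defined by the tokenizer; the markers below
# are an assumption based on the Phi model family.
def build_prompt(messages):
    parts = []
    for m in messages:
        # Each turn is wrapped in role and end-of-turn markers
        parts.append(f"<|{m['role']}|>{m['content']}<|end|>")
    # add_generation_prompt=True appends the assistant header so the
    # model continues from there
    parts.append("<|assistant|>")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain quantum entanglement in simple words."},
]
print(build_prompt(messages))
```

Always prefer the tokenizer's own `apply_chat_template` in real code; hand-built prompts break silently if the template changes.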
## 📚 Intended Use

- Educational Q&A
- Coding assistance
- Multilingual chatbots
- Creative writing and storytelling
## ⚡ Limitations

- As a mini model, it is less capable than larger PyroNet variants
- May produce factual inaccuracies
- Softer safety alignment may allow sensitive content
## 👤 Author & Contact

- Author: IceL1ghtning
- Maintainer: KenanAI
- Contact: