---
license: apache-2.0
language:
- en
- ru
- uk
base_model:
- openai/gpt-oss-20b
---
# 🔥 PyroNet
**PyroNet** is a fine-tuned and customized open-source large language model with a unique system identity.
Originally based on **[gpt-oss-20b](https://huggingface.co/openai/gpt-oss-20b)**, this model has been **further trained and specialized** to embody the **PyroNet persona**.
Created and maintained by **IceL1ghtning** from **Ukraine** 🇺🇦.
---
## ✨ Features
- 🧠 Fine-tuned on custom datasets to define the **PyroNet identity**
- 🎭 Optimized for **chat, reasoning, coding, and explanation tasks**
- 🔗 Fully compatible with the Hugging Face `transformers` ecosystem
- 📦 Includes a custom **chat template** and structured **system prompt**
---
## 🚀 Usage
### Install requirements
```bash
pip install transformers accelerate bitsandbytes
```
### Quick inference
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "Kenan023214/PyroNet"

# Load the tokenizer and model; device_map="auto" places weights on available GPUs
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

prompt = "Hello, PyroNet! Can you introduce yourself?"
result = pipe(prompt, max_new_tokens=200, do_sample=True, temperature=0.8)
print(result[0]["generated_text"])
```
---
### 💡 Recommendations
Runs best on a GPU with ≥24 GB VRAM (e.g. RTX 3090, A100).
For smaller GPUs, load the model in 8-bit (the `load_in_8bit=True` kwarg is deprecated in recent `transformers`; pass a `BitsAndBytesConfig` instead):
```python
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# 8-bit quantization roughly halves memory versus fp16 (requires bitsandbytes)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
)
```
Adjust `temperature` and `top_p` to trade off creativity against determinism.
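To build intuition for what these knobs do, here is a minimal, self-contained sketch (plain Python, not the model's actual sampler): temperature rescales the logits before the softmax, and top-p (nucleus) filtering keeps only the smallest set of tokens whose cumulative probability reaches `top_p`.

```python
import math

def softmax(logits, temperature=1.0):
    # Lower temperature sharpens the distribution; higher temperature flattens it
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_filter(probs, top_p=0.9):
    # Keep the smallest set of tokens whose cumulative probability reaches top_p,
    # then renormalize over that set
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}

logits = [2.0, 1.0, 0.1]            # toy logits for a 3-token vocabulary
sharp = softmax(logits, temperature=0.5)  # near-deterministic
flat = softmax(logits, temperature=1.5)   # more varied
print([round(p, 3) for p in sharp])
print([round(p, 3) for p in flat])
print(top_p_filter(flat, top_p=0.7))      # drops the least likely token
```

With low temperature the top token dominates; with high temperature the probabilities even out, and top-p then trims the long tail before sampling.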
---
[💬 Telegram](https://t.me/LogovoOfEngineer)
📧 Contact: engineerglab@gmail.com
---
### 📜 License & Disclaimer
License: Apache 2.0
Based on gpt-oss-20b
For research purposes only. Not intended for production without further alignment and safety checks.
Responsibility for usage lies with the end-user.
---
🔥 PyroNet — Where logic meets creativity.