---
license: mit
language:
- ru
- en
- uk
- zh
---

# PyroNet-v2

**PyroNet-v2** is a fine-tuned conversational AI model based on [Qwen2.5-3B-Instruct](https://huggingface.co/Qwen/Qwen2.5-3B-Instruct).
It is the successor to **PyroNet-v1.5**, which was built on top of [phi-2](https://huggingface.co/microsoft/phi-2).

Created by **IceL1ghtning (Artyom, Ukraine)**.

---

## 🔧 Model Details
- **Base model:** Qwen2.5-3B-Instruct
- **Parameters:** ~3B
- **Previous version:** PyroNet-v1.5 (phi-2)
- **Input format:** ChatML (`<|im_start|>role ... <|im_end|>`)
- **Multilingual support:** English, Russian, Ukrainian, and more
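The ChatML layout can be sketched in a few lines (a minimal illustration, assuming PyroNet-v2 inherits the Qwen2.5 chat template; in practice `tokenizer.apply_chat_template` builds this string for you, as shown in the usage example):

```python
# Minimal sketch of the ChatML format (assumption: same template as Qwen2.5).
# In real use, tokenizer.apply_chat_template handles this automatically.
def to_chatml(messages, add_generation_prompt=True):
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    if add_generation_prompt:
        parts.append("<|im_start|>assistant\n")  # cue the model to reply
    return "\n".join(parts)

print(to_chatml([{"role": "user", "content": "Hi!"}]))
```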

---

## 🚀 Quick Start

### Installation
```bash
pip install transformers accelerate
```
### Usage Example
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "Kenan023214/PyroNet-v2"

# Load tokenizer and model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
)

# Example conversation
messages = [
    {"role": "user", "content": "Hi! Can you solve the equation x^2 - 5x + 6 = 0?"}
]

# Apply chat template (builds the ChatML prompt and tokenizes it)
inputs = tokenizer.apply_chat_template(
    messages,
    tokenize=True,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Generate output
outputs = model.generate(
    inputs,
    max_new_tokens=256,
    temperature=0.7,
    do_sample=True,
)

# Decode only the newly generated tokens (skip the echoed prompt) and print
response = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
print(response)
```
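For multi-turn conversations, each reply should be appended to the message history so the model sees the full context on the next turn. A minimal sketch of that loop (here `generate_reply` is a hypothetical stand-in for the tokenize/generate/decode steps shown above):

```python
# Sketch of a multi-turn chat loop. `generate_reply` is a placeholder for
# the apply_chat_template -> generate -> decode pipeline from the example.
def chat_turn(history, user_text, generate_reply):
    history.append({"role": "user", "content": user_text})
    reply = generate_reply(history)
    history.append({"role": "assistant", "content": reply})  # keep context
    return reply

history = []
stub = lambda h: f"(stub reply to: {h[-1]['content']})"  # placeholder backend
print(chat_turn(history, "Hello!", stub))
```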
## 📂 Version History

* **PyroNet-v1.5** — based on Microsoft phi-2

* **PyroNet-v2** — upgraded to Qwen2.5-3B-Instruct with improved accuracy and longer context handling

## ⚠️ License & Limitations

This model is released under the MIT license and is provided as is, without warranty of any kind.
It must **not** be used for:

- harmful or malicious activities
- generating unsafe or illegal content

✦ Created by IceL1ghtning (Artyom, Ukraine)
