---
library_name: peft
base_model: Qwen/Qwen2.5-7B-Instruct
tags:
- lora
- qwen2
- echo-omega-prime
- cybersecurity
- threat-analysis
- vulnerability
- incident-response
- security
license: apache-2.0
language:
- en
pipeline_tag: text-generation
---
# Echo Cybersecurity Adapter
> Part of the **Echo Omega Prime** AI engine collection — domain-specialized LoRA adapters built on Qwen2.5-7B-Instruct.
## Overview
This adapter specializes in cybersecurity threat analysis, vulnerability assessment, incident response, and security architecture review.
**Domain:** Cybersecurity & Threat Intelligence
## Training Details
| Parameter | Value |
|-----------|-------|
| **Base Model** | [Qwen/Qwen2.5-7B-Instruct](https://huggingface.co/Qwen/Qwen2.5-7B-Instruct) |
| **Method** | QLoRA (4-bit NF4 quantization + LoRA) |
| **LoRA Rank (r)** | 16 |
| **LoRA Alpha** | 32 |
| **Target Modules** | q_proj, k_proj, v_proj, o_proj, gate_proj, up_proj, down_proj |
| **Training Data** | Security doctrine blocks covering MITRE ATT&CK, CVE analysis, NIST frameworks, and incident response playbooks |
| **Epochs** | 3 |
| **Loss** | converged |
| **Adapter Size** | ~38 MB |
| **Framework** | PEFT + Transformers + bitsandbytes |
| **Precision** | bf16 (adapter) / 4-bit NF4 (base during training) |
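
As a rough sanity check on the configuration above, the trainable parameter count of a LoRA adapter can be estimated from the rank and the shapes of the target modules. The sketch below assumes the published Qwen2.5-7B-Instruct dimensions (hidden size 3584, intermediate size 18944, 28 layers, grouped-query attention with a 512-dim KV projection); treat the numbers as illustrative, and note that on-disk adapter size also depends on the serialization dtype.

```python
# Illustrative estimate of LoRA trainable parameters for r=16 on the
# seven target modules listed above, assuming Qwen2.5-7B dimensions.
r = 16
hidden, intermediate, layers = 3584, 18944, 28
kv_dim = 512  # grouped-query attention: 4 KV heads x 128 head dim (assumed)

# Each adapted Linear(in, out) adds an A (r x in) and a B (out x r) matrix.
def lora_params(in_dim, out_dim, rank=r):
    return rank * (in_dim + out_dim)

per_layer = (
    lora_params(hidden, hidden)          # q_proj
    + lora_params(hidden, kv_dim)        # k_proj
    + lora_params(hidden, kv_dim)        # v_proj
    + lora_params(hidden, hidden)        # o_proj
    + lora_params(hidden, intermediate)  # gate_proj
    + lora_params(hidden, intermediate)  # up_proj
    + lora_params(intermediate, hidden)  # down_proj
)
total = per_layer * layers
print(f"~{total / 1e6:.1f}M trainable parameters")  # ~40.4M trainable parameters
```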
## Usage with PEFT
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel
import torch

# Load the base model
base_model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-7B-Instruct",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct")

# Attach the LoRA adapter
model = PeftModel.from_pretrained(base_model, "Bmcbob76/echo-cyber-adapter")

# Generate
messages = [
    {"role": "system", "content": "You are a domain expert in Cybersecurity & Threat Intelligence."},
    {"role": "user", "content": "Analyze this network traffic log for indicators of compromise using MITRE ATT&CK framework and recommend containment actions."},
]
text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(text, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=1024, temperature=0.3)

# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```
## vLLM Multi-Adapter Serving
```bash
python -m vllm.entrypoints.openai.api_server \
    --model Qwen/Qwen2.5-7B-Instruct \
    --enable-lora \
    --lora-modules echo-cyber-adapter=Bmcbob76/echo-cyber-adapter
```
Then query via OpenAI-compatible API:
```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="token")

response = client.chat.completions.create(
    model="echo-cyber-adapter",
    messages=[
        {"role": "system", "content": "You are a domain expert in Cybersecurity & Threat Intelligence."},
        {"role": "user", "content": "Analyze this network traffic log for indicators of compromise using MITRE ATT&CK framework and recommend containment actions."},
    ],
    temperature=0.3,
    max_tokens=1024,
)
print(response.choices[0].message.content)
```
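
Downstream pipelines often need structured indicators rather than free text. The helper below is a hypothetical post-processing sketch (not part of the adapter) that pulls MITRE ATT&CK technique IDs, which follow the `Txxxx` / `Txxxx.xxx` pattern, out of a generated analysis.

```python
import re

# ATT&CK technique IDs look like T1059 or, with a sub-technique, T1059.001.
TECHNIQUE_RE = re.compile(r"\bT\d{4}(?:\.\d{3})?\b")

def extract_techniques(text: str) -> list[str]:
    """Return unique ATT&CK technique IDs in order of first appearance."""
    seen = []
    for match in TECHNIQUE_RE.findall(text):
        if match not in seen:
            seen.append(match)
    return seen

# Example on a hypothetical model response:
analysis = (
    "Observed PowerShell execution (T1059.001) followed by C2 beaconing "
    "over HTTPS (T1071.001); initial access likely via phishing (T1566)."
)
print(extract_techniques(analysis))  # ['T1059.001', 'T1071.001', 'T1566']
```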
## Echo Omega Prime Collection
This adapter is part of the **Echo Omega Prime** intelligence engine system — 2,600+ domain-specialized engines spanning law, engineering, medicine, cybersecurity, oil & gas, and more.
| Adapter | Domain |
|---------|--------|
| [echo-titlehound-lora](https://huggingface.co/Bmcbob76/echo-titlehound-lora) | Oil & Gas Title Examination |
| [echo-doctrine-generator-qlora](https://huggingface.co/Bmcbob76/echo-doctrine-generator-qlora) | AI Doctrine Generation |
| [echo-landman-adapter](https://huggingface.co/Bmcbob76/echo-landman-adapter) | Landman Operations |
| [echo-taxlaw-adapter](https://huggingface.co/Bmcbob76/echo-taxlaw-adapter) | Tax Law & IRC |
| [echo-legal-adapter](https://huggingface.co/Bmcbob76/echo-legal-adapter) | Legal Analysis |
| [echo-realestate-adapter](https://huggingface.co/Bmcbob76/echo-realestate-adapter) | Real Estate Law |
| [echo-cyber-adapter](https://huggingface.co/Bmcbob76/echo-cyber-adapter) | Cybersecurity |
| [echo-engineering-adapter](https://huggingface.co/Bmcbob76/echo-engineering-adapter) | Engineering Analysis |
| [echo-medical-adapter](https://huggingface.co/Bmcbob76/echo-medical-adapter) | Medical & Clinical |
| [echo-software-adapter](https://huggingface.co/Bmcbob76/echo-software-adapter) | Software & DevOps |
## License
Apache 2.0