---
language: en
tags:
- dga
- cybersecurity
- domain-generation-algorithm
- text-classification
- pytorch
license: mit
---

# DGA-BiLSTM: Bidirectional LSTM + Self-Attention for DGA Detection

BiLSTM with self-attention (Namgung et al., 2021) trained on 54 DGA families.
Part of the **DGA Multi-Family Benchmark** (Reynier et al., 2026).
## Model Description

- **Architecture:** Embedding → BiLSTM(128×2) → Self-Attention → FC(64) → sigmoid
- **Input:** Character-level encoding, right-padded to 75 chars
- **Output:** Binary classification: `legit` (0) or `dga` (1)
- **Framework:** PyTorch
- **Reference:** Namgung et al., Security and Communication Networks, 2021
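The architecture line above can be sketched as a small PyTorch module. This is an illustrative reconstruction under stated assumptions, not the shipped implementation: the vocabulary size, embedding dimension, and the exact attention formulation are guesses, and the "×2" is read as bidirectional doubling of the 128-unit hidden state. The authoritative definition is the `model.py` distributed with the weights.

```python
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    """Sketch of Embedding -> BiLSTM(128x2) -> self-attention -> FC(64) -> sigmoid.

    Vocab size, embedding dim, and the attention form are assumptions;
    the real definition ships as model.py alongside the weights.
    """

    def __init__(self, vocab_size: int = 40, embed_dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Bidirectional, 128 hidden units per direction -> 256-dim outputs.
        self.lstm = nn.LSTM(embed_dim, 128, bidirectional=True, batch_first=True)
        self.attn = nn.Linear(256, 1)  # scores each timestep of the BiLSTM output
        self.fc = nn.Linear(256, 64)
        self.out = nn.Linear(64, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.lstm(self.embed(x))          # (B, T, 256)
        w = torch.softmax(self.attn(h), dim=1)   # attention weights over time
        ctx = (w * h).sum(dim=1)                 # weighted sum -> (B, 256)
        z = torch.relu(self.fc(ctx))
        return torch.sigmoid(self.out(z)).squeeze(-1)  # P(dga) per domain

model = BiLSTMAttention()
probs = model(torch.randint(1, 40, (2, 75)))  # two dummy 75-char domains
print(probs.shape)  # torch.Size([2])
```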
## Performance (54 DGA families, 30 runs each)

| Metric     | Value                 |
|------------|-----------------------|
| Accuracy   | 0.8916                |
| F1         | 0.8556                |
| Precision  | 0.9134                |
| Recall     | 0.8433                |
| FPR        | 0.0600                |
| Query time | 0.067 ms/domain (CPU) |

## Usage

```python
from huggingface_hub import hf_hub_download
import importlib.util
import torch

# Download the trained weights and the model definition from the Hub.
weights = hf_hub_download("Reynier/dga-bilstm", "bilstm_best.pth")
model_py = hf_hub_download("Reynier/dga-bilstm", "model.py")

# Import model.py dynamically so its load_model/predict helpers are available.
spec = importlib.util.spec_from_file_location("bilstm_model", model_py)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)

model = mod.load_model(weights)
results = mod.predict(model, ["google.com", "xkr3f9mq.ru"])
print(results)
```

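The `predict` helper in `model.py` handles preprocessing internally. For illustration only, a character-level encoding of the kind described under Model Description (integer ids, right-padded to 75 chars) might be sketched as follows; the vocabulary and padding index here are assumptions, not the model's actual mapping:

```python
# Sketch of character-level encoding, right-padded to 75 chars.
# The vocabulary and padding index are assumptions; the actual
# mapping is defined in model.py shipped with the weights.
MAX_LEN = 75
VOCAB = {c: i + 1 for i, c in enumerate("abcdefghijklmnopqrstuvwxyz0123456789-._")}
PAD = 0  # index reserved for padding (and unknown characters, in this sketch)

def encode(domain: str, max_len: int = MAX_LEN) -> list[int]:
    """Map each character to an integer id, then right-pad to max_len."""
    ids = [VOCAB.get(c, PAD) for c in domain.lower()[:max_len]]
    return ids + [PAD] * (max_len - len(ids))

print(len(encode("google.com")))  # 75
```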
## Citation

```bibtex
@article{reynier2026dga,
  title={DGA Multi-Family Benchmark: Comparing Classical and Transformer-based Detectors},
  author={Reynier and others},
  year={2026}
}
```