DGA-BiLSTM: Bidirectional LSTM + Self-Attention for DGA Detection

BiLSTM with Self-Attention (Namgung et al., 2021) trained on 54 DGA families. Part of the DGA Multi-Family Benchmark (Reynier et al., 2026).

Model Description

  • Architecture: Embedding → BiLSTM(128×2) → Self-Attention → FC(64) → sigmoid
  • Input: Character-level encoding, right-padded to 75 chars
  • Output: Binary classification, legit (0) or dga (1)
  • Framework: PyTorch
  • Reference: Namgung et al., Security and Communication Networks, 2021
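
The architecture above can be sketched in PyTorch. This is a hypothetical reconstruction from the card's description, not the authors' code: the embedding size, attention formulation, and layer names are assumptions (the shipped `model.py` is authoritative).

```python
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    """Sketch of Embedding -> BiLSTM(128x2) -> self-attention -> FC(64) -> sigmoid."""

    def __init__(self, vocab_size=40, embed_dim=64, hidden=128, max_len=75):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # 128 hidden units per direction -> 256-dim outputs per time step
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)   # scalar attention score per step
        self.fc = nn.Linear(2 * hidden, 64)
        self.out = nn.Linear(64, 1)

    def forward(self, x):
        # x: (batch, 75) int64 character indices, right-padded with 0
        h, _ = self.bilstm(self.embed(x))            # (batch, 75, 256)
        w = torch.softmax(self.attn(h), dim=1)       # attention weights over time
        ctx = (w * h).sum(dim=1)                     # weighted sum -> (batch, 256)
        return torch.sigmoid(self.out(torch.relu(self.fc(ctx)))).squeeze(-1)
```

The output is a per-domain probability in [0, 1]; thresholding at 0.5 yields the binary legit/dga label.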

Performance (54 DGA families, 30 runs each)

| Metric     | Value                  |
|------------|------------------------|
| Accuracy   | 0.8916                 |
| F1         | 0.8556                 |
| Precision  | 0.9134                 |
| Recall     | 0.8433                 |
| FPR        | 0.0600                 |
| Query time | 0.067 ms/domain (CPU)  |
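
For capacity planning, the reported per-domain query time translates into a rough single-core throughput (a back-of-the-envelope figure, ignoring batching and I/O):

```python
# Reported CPU latency from the benchmark table
ms_per_domain = 0.067

# 1000 ms per second / latency per domain = domains per second
domains_per_second = 1000 / ms_per_domain
print(f"{domains_per_second:.0f} domains/s")  # roughly 15,000 per core
```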

Usage

from huggingface_hub import hf_hub_download
import importlib.util

# Download the trained weights and the model definition from the Hub
weights = hf_hub_download("Reynier/dga-bilstm", "bilstm_best.pth")
model_py = hf_hub_download("Reynier/dga-bilstm", "model.py")

# Import the downloaded model.py as a Python module
spec = importlib.util.spec_from_file_location("bilstm_model", model_py)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)

# Load the model and classify a benign and a DGA-looking domain
model = mod.load_model(weights)
results = mod.predict(model, ["google.com", "xkr3f9mq.ru"])
print(results)
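
The character-level encoding described under Model Description can be sketched as follows. The exact vocabulary and index assignment are assumptions; `model.py` performs the real preprocessing.

```python
# Hypothetical vocabulary: index 0 is reserved for padding / unknown characters
CHARS = "abcdefghijklmnopqrstuvwxyz0123456789-._"
CHAR2IDX = {c: i + 1 for i, c in enumerate(CHARS)}

def encode(domain, max_len=75):
    """Lowercase, map chars to indices, truncate and right-pad to max_len."""
    idx = [CHAR2IDX.get(c, 0) for c in domain.lower()[:max_len]]
    return idx + [0] * (max_len - len(idx))

print(len(encode("google.com")))  # 75
```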

Citation

@article{reynier2026dga,
  title={DGA Multi-Family Benchmark: Comparing Classical and Transformer-based Detectors},
  author={Reynier and others},
  year={2026}
}