---
language: en
tags:
  - dga
  - cybersecurity
  - domain-generation-algorithm
  - text-classification
  - keras
license: mit
---

# DGA-LABin: BiLSTM + Attention (Keras) for DGA Detection

A bidirectional LSTM with weighted self-attention (Keras), trained on 54 DGA families. Part of the DGA Multi-Family Benchmark (Reynier et al., 2026).

## Model Description

- **Architecture:** Embedding(128) → BiLSTM(128) → SeqWeightedAttention → Dense → sigmoid
- **Input:** character-level encoding, left-padded to 64 characters
- **Output:** binary classification, `legit` (0) or `dga` (1)
- **Framework:** Keras (TensorFlow)
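As a rough illustration of the input format described above, the sketch below maps characters to integer indices and left-pads to 64 positions. The vocabulary and exact index assignment here are assumptions for illustration; the released preprocessing lives in the repo's `model.py` and may differ.

```python
import string

MAX_LEN = 64

# Assumed vocabulary: lowercase letters, digits, and common domain symbols.
# Index 0 is reserved for padding; this mapping is illustrative only.
VOCAB = {ch: i + 1 for i, ch in enumerate(string.ascii_lowercase + string.digits + "-._")}

def encode_domain(domain: str, max_len: int = MAX_LEN) -> list[int]:
    """Map each character to an integer and left-pad with zeros to max_len."""
    ids = [VOCAB.get(ch, 0) for ch in domain.lower()[-max_len:]]
    return [0] * (max_len - len(ids)) + ids

encoded = encode_domain("google.com")
print(len(encoded))  # 64, with padding zeros at the front
```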

## Usage

```python
# Install the attention-layer dependency first:
#   pip install keras-self-attention

from huggingface_hub import hf_hub_download
import importlib.util

# Download the trained weights and the model-definition module from the Hub
weights = hf_hub_download("Reynier/dga-labin", "LABin_best_model.keras")
model_py = hf_hub_download("Reynier/dga-labin", "model.py")

# Import model.py dynamically so its custom layers and helpers are available
spec = importlib.util.spec_from_file_location("labin_model", model_py)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)

model = mod.load_model(weights)
results = mod.predict(model, ["google.com", "xkr3f9mq.ru"])
print(results)
```
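Since the model ends in a sigmoid, a common way to turn scores into labels is a 0.5 threshold. The sketch below assumes `predict` returns one probability per domain; that return format is an assumption, so check `model.py` for the actual output shape.

```python
def label_domains(domains, probs, threshold=0.5):
    """Pair each domain with 'dga' or 'legit' based on its sigmoid score."""
    return [(d, "dga" if p >= threshold else "legit") for d, p in zip(domains, probs)]

# Example with made-up scores for illustration
print(label_domains(["google.com", "xkr3f9mq.ru"], [0.02, 0.97]))
# [('google.com', 'legit'), ('xkr3f9mq.ru', 'dga')]
```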

## Citation

```bibtex
@article{reynier2026dga,
  title={DGA Multi-Family Benchmark: Comparing Classical and Transformer-based Detectors},
  author={Reynier et al.},
  year={2026}
}
```