DGA-LABin: BiLSTM + Attention (Keras) for DGA Detection

Bidirectional LSTM with Weighted Self-Attention (Keras) trained on 54 DGA families. Part of the DGA Multi-Family Benchmark (Reynier et al., 2026).

Model Description

  • Architecture: Embedding(128) → BiLSTM(128) → SeqWeightedAttention → Dense → sigmoid
  • Input: Character-level encoding, left-padded to 64 characters
  • Output: Binary classification, legit (0) or dga (1)
  • Framework: Keras (TensorFlow)
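
A minimal Keras sketch of this stack is shown below. The character vocabulary, the encode helper, and the function names are illustrative assumptions; the authoritative definition ships in the repository's model.py.

# Illustrative sketch only; model.py in the repo is authoritative.
from tensorflow import keras
from keras_self_attention import SeqWeightedAttention

MAX_LEN = 64
# Hypothetical character vocabulary (index 0 reserved for padding/unknown)
CHARS = {c: i + 1 for i, c in enumerate("abcdefghijklmnopqrstuvwxyz0123456789-._")}

def encode(domain, max_len=MAX_LEN):
    # Character-level encoding, left-padded with zeros to max_len
    ids = [CHARS.get(c, 0) for c in domain.lower()[-max_len:]]
    return [0] * (max_len - len(ids)) + ids

def build_labin(vocab_size=len(CHARS) + 1, max_len=MAX_LEN):
    inp = keras.layers.Input(shape=(max_len,))
    x = keras.layers.Embedding(vocab_size, 128)(inp)              # Embedding(128)
    x = keras.layers.Bidirectional(
        keras.layers.LSTM(128, return_sequences=True))(x)         # BiLSTM(128)
    x = SeqWeightedAttention()(x)                                 # weighted self-attention pooling
    out = keras.layers.Dense(1, activation="sigmoid")(x)          # P(dga)
    return keras.Model(inp, out)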

Usage

# Install the custom attention-layer dependency first:
#   pip install keras-self-attention

from huggingface_hub import hf_hub_download
import importlib.util

# Download the trained weights and the loader module from the Hub
weights = hf_hub_download("Reynier/dga-labin", "LABin_best_model.keras")
model_py = hf_hub_download("Reynier/dga-labin", "model.py")

# Import model.py dynamically so its load_model/predict helpers are available
spec = importlib.util.spec_from_file_location("labin_model", model_py)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)

model = mod.load_model(weights)
results = mod.predict(model, ["google.com", "xkr3f9mq.ru"])
print(results)
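
The exact return format of predict depends on model.py. As a hedged post-processing sketch, assuming it yields one dga probability per input domain:

# Hypothetical post-processing, assuming one probability per domain
for domain, p in zip(["google.com", "xkr3f9mq.ru"], results):
    label = "dga" if float(p) >= 0.5 else "legit"
    print(f"{domain}: {label} (p={float(p):.3f})")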

Citation

@article{reynier2026dga,
  title={DGA Multi-Family Benchmark: Comparing Classical and Transformer-based Detectors},
  author={Reynier and others},
  year={2026}
}