# DGA Multi-Family Benchmark (Collection)

8 DGA detection models (CNN, BiLSTM, Bilbo, LABin, Logit, FANCI, DomURLsBERT, ModernBERT) trained on 54 malware families.
Bidirectional LSTM with Weighted Self-Attention (Keras) trained on 54 DGA families. Part of the DGA Multi-Family Benchmark (Reynier et al., 2026).
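Character-level detectors of this kind typically map each domain to a fixed-length integer sequence before it is fed to the BiLSTM. The sketch below illustrates that preprocessing step only; the vocabulary, padding scheme, and maximum length are illustrative assumptions, not the values used by this model (those live in its `model.py`).

```python
# Hypothetical character-level encoder showing the usual input format for
# character-based DGA detectors. The alphabet and MAX_LEN below are assumed
# for illustration; the actual model defines its own preprocessing.
MAX_LEN = 63  # longest DNS label length; an assumed cap, not the model's setting
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789-_."
CHAR2IDX = {c: i + 1 for i, c in enumerate(ALPHABET)}  # index 0 reserved for padding

def encode_domain(domain: str) -> list[int]:
    """Lowercase, map characters to indices (unknown -> 0), pad to MAX_LEN."""
    idxs = [CHAR2IDX.get(c, 0) for c in domain.lower()[:MAX_LEN]]
    return idxs + [0] * (MAX_LEN - len(idxs))

print(encode_domain("google.com")[:10])
```

A batch of such sequences is what an embedding layer followed by a BiLSTM would consume.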
Labels: `legit` (0) or `dga` (1).

## Usage

```python
# Install the attention-layer dependency required by the model file
# !pip install keras-self-attention
import importlib.util

from huggingface_hub import hf_hub_download

# Download the trained weights and the model-definition module
weights = hf_hub_download("Reynier/dga-labin", "LABin_best_model.keras")
model_py = hf_hub_download("Reynier/dga-labin", "model.py")

# Import model.py as a Python module
spec = importlib.util.spec_from_file_location("labin_model", model_py)
mod = importlib.util.module_from_spec(spec)
spec.loader.exec_module(mod)

# Load the model and classify domains
model = mod.load_model(weights)
results = mod.predict(model, ["google.com", "xkr3f9mq.ru"])
print(results)
```
## Citation

```bibtex
@article{reynier2026dga,
  title={DGA Multi-Family Benchmark: Comparing Classical and Transformer-based Detectors},
  author={Reynier et al.},
  year={2026}
}
```