---
pretty_name: CICIDS2017 (mirror)
language:
- en
task_categories:
- other
tags:
- cybersecurity
- intrusion-detection
- network-traffic
- pcap
size_categories:
- 10B<n<100B
license: other
---
# CICIDS2017 (Unofficial mirror on Hugging Face)

## Dataset Summary
This repository provides a mirrored copy of the CICIDS2017 dataset files (PCAPs and accompanying archives) for easier access and reproducibility in ML/security research workflows.
**Important:** This is not the original distribution. Please refer to the official source for authoritative documentation, updates, and terms.
## Source / Origin
- Original dataset name: CICIDS2017
- Original publisher: Canadian Institute for Cybersecurity (CIC), University of New Brunswick (UNB)
- Official page: https://www.unb.ca/cic/datasets/ids-2017.html
- Original paper / description: Iman Sharafaldin, Arash Habibi Lashkari, and Ali A. Ghorbani, “Toward Generating a New Intrusion Detection Dataset and Intrusion Traffic Characterization”, 4th International Conference on Information Systems Security and Privacy (ICISSP), Portugal, January 2018.
## What is included here
Files mirrored from the original distribution (as provided by CIC), e.g.:
- Monday-WorkingHours.pcap
- Tuesday-WorkingHours.pcap
- Wednesday-workingHours.pcap
- Thursday-WorkingHours.pcap
- Friday-WorkingHours.pcap
- MachineLearningCSV.zip
- GeneratedLabelledFlows.zip
Notes:
- File list may differ depending on the version you mirrored.
- If you require a specific release, pin to a commit hash in this repo.
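For reproducibility, the `revision` argument of `hf_hub_download` accepts a commit hash (as well as a branch or tag). A minimal sketch, assuming the `bencorn/CICIDS2017` repo id used elsewhere in this card; `download_pinned` is a hypothetical helper, and the commit hash in the example call is a placeholder, not a real commit of this repo:

```python
from huggingface_hub import hf_hub_download


def download_pinned(filename: str, revision: str) -> str:
    """Fetch one mirrored file at a pinned revision (commit hash, tag, or branch).

    Returns the local cache path of the downloaded file.
    """
    return hf_hub_download(
        repo_id="bencorn/CICIDS2017",
        repo_type="dataset",
        filename=filename,
        revision=revision,  # e.g. a full 40-character commit SHA
    )


# Example call (placeholder revision, requires network):
# path = download_pinned("MachineLearningCSV.zip", revision="<commit-sha>")
```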
## Intended Use
Common use cases include (but are not limited to):
- Network intrusion detection research
- Traffic classification
- Feature extraction from PCAPs
- Benchmarking and reproduction of published results
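To give a sense of what working with the raw captures involves: a classic pcap file is a 24-byte global header followed by 16-byte per-packet record headers, each carrying a timestamp and the captured length. The sketch below round-trips one dummy packet using only the standard library; `build_pcap` and `iter_packets` are hypothetical helpers written for illustration, not part of this dataset or any library.

```python
import struct

PCAP_MAGIC = 0xA1B2C3D4  # classic pcap, microsecond-resolution timestamps


def build_pcap(packets):
    """Serialize (ts_sec, ts_usec, payload) tuples into a classic pcap byte stream."""
    # Global header: magic, version 2.4, thiszone, sigfigs, snaplen, linktype (1 = Ethernet)
    out = [struct.pack("<IHHiIII", PCAP_MAGIC, 2, 4, 0, 0, 65535, 1)]
    for ts_sec, ts_usec, payload in packets:
        # Per-packet record: ts_sec, ts_usec, included length, original length
        out.append(struct.pack("<IIII", ts_sec, ts_usec, len(payload), len(payload)))
        out.append(payload)
    return b"".join(out)


def iter_packets(blob):
    """Yield (ts_sec, ts_usec, payload) tuples back out of a classic pcap byte stream."""
    magic, = struct.unpack_from("<I", blob, 0)
    assert magic == PCAP_MAGIC, "not a classic little-endian pcap"
    off = 24  # skip the global header
    while off < len(blob):
        ts_sec, ts_usec, incl_len, _orig_len = struct.unpack_from("<IIII", blob, off)
        off += 16
        yield ts_sec, ts_usec, blob[off:off + incl_len]
        off += incl_len


blob = build_pcap([(1499072400, 0, b"\x00" * 60)])
print(len(list(iter_packets(blob))))  # prints 1
```

In practice you would hand the mirrored `.pcap` files to a dedicated library (e.g. scapy or dpkt) rather than parse records by hand, but the container layout above is what those tools read.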
## Data Access / How to Use
### Using the `datasets` library (metadata-only listing)

This card is a metadata-only listing: the raw PCAP and ZIP files are not structured for `datasets.load_dataset`. To work with the data, download the repository files from the Hub directly.
### Download with `huggingface_hub`
```python
from huggingface_hub import hf_hub_download

repo_id = "bencorn/CICIDS2017"
filename = "Monday-WorkingHours.pcap"  # example; any file listed above works

# Downloads the file into the local Hugging Face cache and returns its path.
local_path = hf_hub_download(
    repo_id=repo_id,
    repo_type="dataset",
    filename=filename,
)
print(local_path)
```
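To fetch several files at once, `snapshot_download` mirrors the whole repository into the local cache; its `allow_patterns` argument narrows the download. A sketch, assuming the same `bencorn/CICIDS2017` repo id; `mirror_repo` is a hypothetical wrapper, not part of `huggingface_hub`:

```python
from huggingface_hub import snapshot_download


def mirror_repo(allow_patterns=None) -> str:
    """Download the mirrored files into the local HF cache; returns the snapshot folder.

    allow_patterns narrows the download, e.g. ["*.zip"] to skip the multi-GB PCAPs.
    """
    return snapshot_download(
        repo_id="bencorn/CICIDS2017",
        repo_type="dataset",
        allow_patterns=allow_patterns,
    )


# Example (requires network):
# path = mirror_repo(allow_patterns=["*.zip"])
```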