calmodovar committed on
Commit fe8d2ee · verified · 1 Parent(s): e4dc76a

Update dataset card

Files changed (1):
  1. README.md +54 -27
README.md CHANGED
@@ -1,29 +1,56 @@
  ---
- dataset_info:
-   features:
-   - name: date
-     dtype: string
-   - name: time
-     dtype: string
-   - name: pid
-     dtype: int32
-   - name: level
-     dtype: string
-   - name: component
-     dtype: string
-   - name: content
-     dtype: string
-   - name: anomaly
-     dtype: int8
-   splits:
-   - name: train
-     num_bytes: 1729634475
-     num_examples: 11175629
-   download_size: 286024691
-   dataset_size: 1729634475
- configs:
- - config_name: default
-   data_files:
-   - split: train
-     path: data/train-*
  ---
  ---
+ pretty_name: HDFS_v1
+ dataset_name: logfit-project/hdfs_v1
+ task_categories:
+ - anomaly-detection
+ language:
+ - en
+ size_categories:
+ - 10M<n<15M
+ annotations_creators:
+ - logfit-project
+ license: other
  ---
+
+ # Dataset Card for logfit-project/hdfs_v1
+
+ ## Dataset Summary
+ The HDFS v1 log dataset contains Hadoop Distributed File System (HDFS) console logs collected from a
+ private cloud deployment while benchmark workloads were executed. Each log line can be associated with
+ one or more block identifiers; block-level anomaly labels were generated by manually crafted rules. The
+ dataset preserves the raw line structure while attaching a binary anomaly flag for downstream
+ anomaly-detection research.
+
+ ## Supported Tasks and Leaderboards
+ - anomaly-detection: binary classification of log lines based on the attached anomaly label.
+
+ ## Dataset Structure
+ - `date`: Six-digit date stamp from the original console output (`YYMMDD`).
+ - `time`: Six-digit time stamp (`HHMMSS`).
+ - `pid`: Process identifier extracted from the log line.
+ - `level`: Log level (e.g., `INFO`).
+ - `component`: Java/daemon component emitting the log entry.
+ - `content`: Verbose message content for the event.
+ - `anomaly`: Binary indicator derived from block-level labels (`1` = anomalous block).
+
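As a rough illustration of how one raw console line maps onto the fields above — a sketch only: the sample line and the regular expression are assumptions based on the published HDFS_v1 line layout, not taken from this card:

```python
import re

# Hypothetical pattern for the layout described above:
#   date time pid level component: content
LINE_RE = re.compile(
    r"^(?P<date>\d{6}) (?P<time>\d{6}) (?P<pid>\d+) "
    r"(?P<level>\w+) (?P<component>[^:]+): (?P<content>.*)$"
)

# Sample line in the published HDFS_v1 style (an assumption, not from this card).
sample = (
    "081109 203615 148 INFO dfs.DataNode$PacketResponder: "
    "PacketResponder 1 for block blk_38865049064139660 terminating"
)

fields = LINE_RE.match(sample).groupdict()
fields["pid"] = int(fields["pid"])  # the card stores pid as int32
```

Each named group corresponds to one feature in the schema; `anomaly` is not parsed from the line itself but joined in from the block-level labels.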
+ ## Source Data
+ - **Homepage:** https://github.com/logpai/loghub/tree/master/HDFS#hdfs_v1
+ - **Original Maintainers:** The LogPAI team (https://logpai.com/).
+
+ ## Dataset Creation
+ The raw logs were parsed in a streaming fashion with a deterministic regular expression, so large-scale
+ HDFS deployments can be transformed without exhausting memory. Block-level labels are joined on the fly
+ by searching each line for block identifiers.
+
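The on-the-fly label join described above can be sketched as follows; the `blk_` identifier pattern reflects standard HDFS block naming, while the label table and helper function are illustrative assumptions:

```python
import re

# HDFS block identifiers look like "blk_<signed integer>".
BLOCK_RE = re.compile(r"blk_-?\d+")

# Hypothetical block-level label table: 1 = anomalous block.
block_labels = {
    "blk_38865049064139660": 0,
    "blk_-1608999687919862906": 1,
}

def line_anomaly(content: str) -> int:
    # Flag a line as anomalous if any block it mentions is labeled 1.
    return int(any(block_labels.get(blk, 0) == 1
                   for blk in BLOCK_RE.findall(content)))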
+ ## Uses
+ Suitable for supervised and semi-supervised anomaly detection on distributed-system logs, log template
+ mining, and benchmarking log representation learning.
+
+ ## Citation
+ - Wei Xu, Ling Huang, Armando Fox, David Patterson, Michael I. Jordan. "Detecting Large-Scale System
+   Problems by Mining Console Logs", SOSP 2009.
+ - Jieming Zhu, Shilin He, Pinjia He, Jinyang Liu, Michael R. Lyu. "Loghub: A Large Collection of System
+   Log Datasets for AI-driven Log Analytics", ISSRE 2023.
+
+ ## Dataset Statistics
+ - Number of log lines: 11,175,629