---
task_categories:
- graph-ml
language:
- en
size_categories:
- n<1K
tags:
- hypergraph
---
# email-Enron
<div align="center">
<table>
<tbody>
</table>
</div>
email-Enron is an undirected hypergraph built from the Enron email corpus, designed for higher-order network / hypergraph machine learning. In email communication, a single message can involve more than two people; this dataset captures that group interaction by modeling each email as a hyperedge connecting the sender and all recipients, while nodes represent Enron email addresses (restricted to a core set of employees).
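
As a sketch of that modeling step, the snippet below turns each message into one hyperedge over the sender and all recipients. The email records here are hypothetical, for illustration only, not actual dataset contents:

```python
# Each email yields one hyperedge containing the sender and all recipients.
# The records below are made up for illustration.
emails = [
    {"from": "kenneth.lay@enron.com",
     "to": ["jeff.skilling@enron.com", "andrew.fastow@enron.com"]},
    {"from": "jeff.skilling@enron.com",
     "to": ["kenneth.lay@enron.com"]},
]

# A hyperedge is the set of all addresses on one message.
hyperedges = [frozenset([e["from"], *e["to"]]) for e in emails]
nodes = set().union(*hyperedges)

print(len(nodes), len(hyperedges))  # prints "3 2"
```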
The hypergraph is stored in HIF (Hypergraph Interchange Format) as a JSON object, following the schema used to exchange higher-order network data across tools. Concretely, the dataset provides the canonical HIF fields (network-type, metadata, nodes, edges, and incidences), so you can reconstruct the full incidence structure without additional processing.
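
A minimal sketch of reading such a file, assuming the standard HIF layout in which each incidence pairs an `edge` id with a `node` id. The inline JSON is a hand-written toy object, not part of this dataset:

```python
import json
from collections import defaultdict

# Hand-written toy HIF object with the canonical top-level fields
# described above (not taken from the dataset).
hif_text = """
{
  "network-type": "undirected",
  "metadata": {"name": "toy"},
  "nodes": [{"node": 0}, {"node": 1}, {"node": 2}],
  "edges": [{"edge": "e0"}, {"edge": "e1"}],
  "incidences": [
    {"edge": "e0", "node": 0}, {"edge": "e0", "node": 1},
    {"edge": "e1", "node": 1}, {"edge": "e1", "node": 2}
  ]
}
"""

data = json.loads(hif_text)

# Rebuild each hyperedge as the set of nodes incident to it.
members = defaultdict(set)
for inc in data["incidences"]:
    members[inc["edge"]].add(inc["node"])

print(dict(members))  # → {'e0': {0, 1}, 'e1': {1, 2}}
```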
In addition to the raw hypergraph topology, vector features are provided for both nodes and hyperedges (in their attribute dictionaries), enabling out-of-the-box experimentation with representation learning and downstream tasks:
- Spectral features: eigenvectors of the (hypergraph) Laplacian, computed via sparse eigensolvers.
- Node2Vec embeddings: random-walk-based structural embeddings.
- VilLain embeddings: self-supervised hypergraph representation learning via virtual label propagation.
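
As an illustration of the spectral features, the toy sketch below builds a clique-expansion Laplacian from an incidence matrix and keeps its lowest eigenvectors as node features. It uses a dense `numpy.linalg.eigh` for simplicity, whereas the packaged features come from sparse eigensolvers, and the tiny hypergraph is made up for the example:

```python
import numpy as np

# Toy hypergraph: 4 nodes, 3 hyperedges (hypothetical, for illustration).
n, hyperedges = 4, [{0, 1, 2}, {1, 2}, {2, 3}]

# Incidence matrix B: B[v, e] = 1 iff node v belongs to hyperedge e.
B = np.zeros((n, len(hyperedges)))
for e, members in enumerate(hyperedges):
    for v in members:
        B[v, e] = 1.0

# Clique-expansion Laplacian: A = B @ B.T with the diagonal zeroed,
# then L = D - A. At this toy scale a dense solver is fine; the
# dataset's features use sparse eigensolvers instead.
A = B @ B.T
np.fill_diagonal(A, 0.0)
L = np.diag(A.sum(axis=1)) - A

# eigh returns eigenvalues in ascending order; the columns of `vecs`
# for the smallest eigenvalues serve as spectral node features.
vals, vecs = np.linalg.eigh(L)
features = vecs[:, :2]  # keep the 2 lowest-eigenvalue eigenvectors
print(features.shape)   # prints "(4, 2)"
```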
Basic statistics (as packaged here): 143 nodes, 1,512 hyperedges, 1 connected component.
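
Those three statistics can be recomputed from the incidence structure. The sketch below does so for a toy hyperedge list using a small union-find; the real numbers above come from the packaged file, not from this example:

```python
# Toy data; on the real file you would build `hyperedges` from the
# HIF "incidences" list instead.
hyperedges = [{0, 1, 2}, {1, 2}, {3, 4}]
nodes = set().union(*hyperedges)

# Union-find over nodes: every hyperedge merges all of its members.
parent = {v: v for v in nodes}

def find(v):
    while parent[v] != v:
        parent[v] = parent[parent[v]]  # path halving
        v = parent[v]
    return v

for edge in hyperedges:
    first, *rest = edge
    for v in rest:
        parent[find(v)] = find(first)

components = len({find(v) for v in nodes})
print(len(nodes), len(hyperedges), components)  # prints "5 3 2"
```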
```
@article{Benson-2018-simplicial,
author = {Benson, Austin R. and Abebe, Rediet and Schaub, Michael T. and Jadbabaie, Ali and Kleinberg, Jon},