---
license: cc-by-nc-sa-4.0
task_categories:
  - graph-ml
tags:
  - histopathology
  - graph-classification
  - breast-cancer
  - pytorch-geometric
pretty_name: Graph-BRACS
size_categories:
  - 1K<n<10K
---

# Graph-BRACS: A Cell-Graph Dataset for Breast Cancer from BRACS

![Graph-BRACS teaser – cell-graph construction from a histopathology ROI](animation.gif)

Graph-BRACS is a graph-level classification dataset derived from the BRACS (BReAst Carcinoma Subtyping) histopathology dataset. Each region-of-interest (ROI) image is converted into a cell-graph in which nodes represent detected cell nuclei and edges encode spatial proximity, enabling graph-based learning for fine-grained breast lesion subtyping across 7 clinically relevant classes. Node features describe cell morphology, texture, and color intensity, while the single edge feature is the Euclidean distance between connected nuclei in micrometers.

This dataset is part of the paper GrapHist: Graph Self-Supervised Learning for Histopathology.

⚠️ Edge Weight Note: While the architecture in GrapHist supports both positive and negative edge weights, by default edge features represent Euclidean distances—meaning farther nodes have larger, positive values. This can be counterintuitive for many graph neural network models. We recommend experimenting with edge weights, such as using their inverse (e.g., 1/distance) or negative distance (e.g., -distance), to better capture proximity and benefit learning.
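As a concrete illustration, the transforms suggested above can be applied directly to `edge_attr` before training. This is a minimal sketch, not part of the dataset itself; the function names and the `eps` guard are our own choices:

```python
import torch

def inverse_distance(edge_attr: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Map distances to proximity weights: closer nodes get larger weights."""
    return 1.0 / (edge_attr + eps)  # eps guards against division by zero

def negative_distance(edge_attr: torch.Tensor) -> torch.Tensor:
    """Flip the sign so closer nodes get larger (less negative) weights."""
    return -edge_attr

# Example on dummy distances; apply the same call to graph.edge_attr.
distances = torch.tensor([[1.0], [2.0], [4.0]])
print(inverse_distance(distances))  # closer edges -> larger weights
```

Either transform keeps the edge feature shape `[num_edges, 1]` unchanged, so it can be applied in place when loading each graph.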

## Dataset Summary

| Property | Value |
|---|---|
| Total graphs | 4493 |
| Classes | 7 |
| Train / Test split | 3594 / 899 |
| Node feature dim | 96 |
| Edge feature dim | 1 |

## Classes

| Label | Full Name | Count |
|---|---|---|
| N | Normal | 474 |
| PB | Pathological Benign | 833 |
| UDH | Usual Ductal Hyperplasia | 515 |
| FEA | Flat Epithelial Atypia | 752 |
| ADH | Atypical Ductal Hyperplasia | 505 |
| DCIS | Ductal Carcinoma In Situ | 768 |
| IC | Invasive Carcinoma | 646 |

## Data Structure

```
graph-bracs/
├── README.md
├── metadata.csv                          # sample_id, label, split, graph_path
├── animation.gif
└── data/
    ├── BRACS_1003670_IC_1.pt
    ├── BRACS_1003660_UDH_1.pt
    └── ...
```

Each `.pt` file is a PyTorch Geometric `Data` object with the following attributes:

| Attribute | Shape | Description |
|---|---|---|
| `x` | [num_nodes, 96] | Node feature matrix |
| `edge_index` | [2, num_edges] | Graph connectivity in COO format |
| `edge_attr` | [num_edges, 1] | Edge features |
| `label` | str | Class label |
| `sample_id` | str | Unique sample identifier |
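Since `label` is stored as a string, a training pipeline typically needs integer targets. A minimal sketch of such a mapping; the ordering below follows the class table in this README and is our own assumption, not an official encoding shipped with the dataset:

```python
# Assumed class ordering (taken from the class table above);
# not an official encoding shipped with the dataset.
CLASS_TO_IDX = {"N": 0, "PB": 1, "UDH": 2, "FEA": 3, "ADH": 4, "DCIS": 5, "IC": 6}

def label_to_target(label: str) -> int:
    """Convert a string class label (e.g. 'IC') to an integer target."""
    return CLASS_TO_IDX[label]
```

Any fixed ordering works, as long as it is used consistently across train and test.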

## metadata.csv

A CSV file mapping each sample to its label, train/test split, and file path:

```csv
sample_id,label,split,graph_path
BRACS_1003670_IC_1,IC,train,graph-bracs/data/BRACS_1003670_IC_1.pt
BRACS_1003660_UDH_1,UDH,train,graph-bracs/data/BRACS_1003660_UDH_1.pt
...
```

## Quick Start

```python
import torch
from torch_geometric.data import Data

# Load a single graph
graph = torch.load("data/BRACS_1003670_IC_1.pt", weights_only=False)

print(graph)
# Data(x=[156, 96], edge_index=[2, 382], edge_attr=[382, 1], label='IC', sample_id='BRACS_1003670_IC_1')

print(f"Nodes: {graph.x.shape[0]}, Edges: {graph.edge_index.shape[1]}")
```

## Citation

If you use this dataset, please cite both our work and the original BRACS dataset:

**GrapHist (this dataset):**

```bibtex
@misc{ogut2026graphist,
  title  = {GrapHist: Graph Self-Supervised Learning for Histopathology},
  author = {Sevda Öğüt and Cédric Vincent-Cuaz and Natalia Dubljevic and Carlos Hurtado and Vaishnavi Subramanian and Pascal Frossard and Dorina Thanou},
  year   = {2026},
  eprint = {2603.00143},
  url    = {https://arxiv.org/abs/2603.00143},
}
```

**BRACS (source images):**

```bibtex
@article{brancati2022bracs,
  title     = {{BRACS}: A Dataset for BReAst Carcinoma Subtyping in {H\&E} Histology Images},
  author    = {Brancati, Nadia and Anniciello, Anna Maria and Pati, Pushpak and Riccio, Daniel and Scognamiglio, Giosu{\`e} and Jaume, Guillaume and De Pietro, Giuseppe and Di Bonito, Maurizio and Foncubierta, Antonio and Botti, Gerardo and others},
  journal   = {Database},
  volume    = {2022},
  pages     = {baac093},
  year      = {2022},
  publisher = {Oxford University Press UK}
}
```

## License

This dataset is released under the CC BY-NC-SA 4.0 license.