---
license: cc-by-4.0
task_categories:
- tabular-regression
- tabular-classification
language:
- en
tags:
- gpu
- memory-estimation
- utilization-estimation
- deep-learning
- resource-management
- mlp
- cnn
- transformer
pretty_name: GPUMemNet, GPUUtilNet Dataset
paper: https://arxiv.org/abs/2602.17817
arxiv: 2602.17817
repo: https://github.com/itu-rad/GPUMemNet
---

# GPUMemNet Dataset

This dataset accompanies the paper **"GPU Memory and Utilization Estimation for Training-Aware Resource Management: Opportunities and Limitations"** and is released as part of the [GPUMemNet repository](https://github.com/itu-rad/GPUMemNet).

## Description

A synthetic, extensible dataset for GPU memory and utilization estimation across three neural network architecture families: MLPs, CNNs, and Transformers. Each sample captures architectural properties (layer counts, depth, batch size, number of parameters, number of activations) alongside measured GPU memory consumption and hardware utilization metrics (SMACT, SMOCC, DRAMA), collected under controlled training conditions.

The dataset is designed to support the development and evaluation of training-aware GPU resource management systems, with a focus on pre-execution memory estimation and interference-aware scheduling through utilization prediction.

## Repository

Code, models, and reproducibility artifacts are available at: [https://github.com/itu-rad/GPUMemNet](https://github.com/itu-rad/GPUMemNet)
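## Example: First-Order Memory Estimate

To illustrate the kind of pre-execution memory estimation this dataset targets, the sketch below computes a back-of-envelope training-memory estimate from the same architectural features recorded per sample (parameter count, activation count, batch size). The formula and its constants (fp32 values, Adam-style optimizer states) are illustrative assumptions for intuition only, not the estimation model from the paper.

```python
def estimate_training_memory_gib(num_params: int,
                                 num_activations: int,
                                 batch_size: int,
                                 bytes_per_value: int = 4,
                                 optimizer_state_copies: int = 2) -> float:
    """First-order training-memory estimate in GiB.

    Illustrative assumption: fp32 training where memory is dominated by
    weights, gradients, optimizer states, and batch-scaled activations.
    """
    # Weights + gradients + optimizer states, each holding num_params values.
    model_bytes = num_params * bytes_per_value * (2 + optimizer_state_copies)
    # Activations cached for the backward pass scale with batch size.
    activation_bytes = num_activations * batch_size * bytes_per_value
    return (model_bytes + activation_bytes) / 1024**3

# Example: ~100M parameters, 50M activations per sample, batch size 32.
print(round(estimate_training_memory_gib(100_000_000, 50_000_000, 32), 2))  # → 7.45
```

Estimates like this ignore framework overhead, caching-allocator fragmentation, and kernel workspace buffers, which is part of why learned estimators trained on measured data such as this dataset can outperform analytic formulas.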