Upload folder using huggingface_hub
- README.md +213 -0
- camera-distribution.png +3 -0
- download_all.py +128 -0
- manifest.json +114 -0
- train_all.py +351 -0
README.md
ADDED
@@ -0,0 +1,213 @@
---
license: cc0-1.0
task_categories:
- image-to-3d
- depth-estimation
tags:
- nerf
- 3d-gaussian-splatting
- 3dgs
- nerfstudio
- multi-view
- depth-maps
- normal-maps
- point-cloud
- computer-vision
- 3d-reconstruction
pretty_name: "DX.GL Multi-View Datasets"
size_categories:
- 1K<n<10K
---

# DX.GL Multi-View Datasets for NeRF & 3D Gaussian Splatting

Multi-view training datasets rendered from CC0 3D models via [DX.GL](https://dx.gl). Each dataset includes calibrated camera poses, depth maps, normal maps, binary masks, and point clouds, ready for [nerfstudio](https://docs.nerf.studio/) out of the box.

**10 objects × 196 views × 1024×1024 resolution × full sphere coverage.**

## Quick Start

```bash
# Download a dataset (Apple, 196 views, 1024x1024)
wget https://dx.gl/api/v/EJbs8npt2RVM/vCHDLxjWG65d/dataset -O apple.zip
unzip apple.zip -d apple

# Train with nerfstudio
pip install nerfstudio
ns-train splatfacto --data ./apple \
  --max-num-iterations 20000 \
  --pipeline.model.sh-degree 3 \
  --pipeline.model.background-color white
```

Or use the download script:

```bash
pip install requests
python download_all.py
```

## What's in Each Dataset ZIP

```
dataset/
├── images/           # RGB frames (PNG, transparent background)
│   ├── frame_00000.png
│   └── ...
├── depth/            # 8-bit grayscale depth maps
├── depth_16bit/      # 16-bit depth maps (higher precision)
├── normals/          # World-space normal maps
├── masks/            # Binary alpha masks
├── transforms.json   # Camera poses (nerfstudio / instant-ngp format)
└── points3D.ply      # Sparse point cloud for initialization
```

### transforms.json Format

Compatible with both **nerfstudio** and **instant-ngp**:
```json
{
  "camera_angle_x": 0.857,
  "camera_angle_y": 0.857,
  "fl_x": 1120.9,
  "fl_y": 1120.9,
  "cx": 512,
  "cy": 512,
  "w": 1024,
  "h": 1024,
  "depth_near": 0.85,
  "depth_far": 2.35,
  "ply_file_path": "points3D.ply",
  "frames": [
    {
      "file_path": "images/frame_00000.png",
      "depth_file_path": "depth/frame_00000.png",
      "normal_file_path": "normals/frame_00000.png",
      "mask_file_path": "masks/frame_00000.png",
      "transform_matrix": [[...], [...], [...], [0, 0, 0, 1]]
    }
  ]
}
```
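
The per-dataset `depth_near`/`depth_far` values let you convert the 8-bit depth maps back to metric depth. Below is a minimal sketch: it assumes the 8-bit maps are linearly normalized (0 maps to `depth_near`, 255 to `depth_far`), and the function names are illustrative, not part of the dataset tooling:

```python
import json
import numpy as np

def load_frames(transforms_path):
    """Parse transforms.json into intrinsics, depth range, and per-frame entries."""
    with open(transforms_path) as f:
        meta = json.load(f)
    intrinsics = {k: meta[k] for k in ("fl_x", "fl_y", "cx", "cy", "w", "h")}
    frames = [
        {
            "image": fr["file_path"],
            "depth": fr["depth_file_path"],
            # 4x4 camera-to-world matrix
            "c2w": np.array(fr["transform_matrix"], dtype=np.float64),
        }
        for fr in meta["frames"]
    ]
    return intrinsics, meta["depth_near"], meta["depth_far"], frames

def depth_to_metric(depth_u8, near, far):
    """Map an 8-bit depth image (0..255) linearly onto [near, far]."""
    t = depth_u8.astype(np.float32) / 255.0
    return near + t * (far - near)
```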

## Specs

| Property | Value |
|---|---|
| **Views** | 196 per object |
| **Resolution** | 1024×1024 |
| **Coverage** | Full sphere (±89° elevation) |
| **Point cloud** | ~200k points |
| **Camera distribution** | Fibonacci golden-angle spiral |
| **Background** | Transparent (RGBA) |
| **Lighting** | Studio HDRI + directional lights |

## Camera Distribution

Views are distributed over the full sphere (±89° elevation) using a golden-angle Fibonacci spiral. The distribution is uniform in solid angle: equatorial latitude bands cover more area than polar ones, so they receive proportionally more views, which gives even coverage for NeRF/3DGS training.

![Camera distribution](camera-distribution.png)

## Objects

| # | Object | Category | Download | Browse |
|---|---|---|---|---|
| 1 | Apple | organic | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/vCHDLxjWG65d/dataset) | [View](https://dx.gl/datasets/vCHDLxjWG65d) |
| 2 | Cash Register | electronics | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/JfjLRexr6J7z/dataset) | [View](https://dx.gl/datasets/JfjLRexr6J7z) |
| 3 | Drill | tool | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/A0dcsk7HHgAg/dataset) | [View](https://dx.gl/datasets/A0dcsk7HHgAg) |
| 4 | Fire Extinguisher | metallic | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/cLgyqM5mhQoq/dataset) | [View](https://dx.gl/datasets/cLgyqM5mhQoq) |
| 5 | LED Lightbulb | glass | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/ZuYmv3K9xN7u/dataset) | [View](https://dx.gl/datasets/ZuYmv3K9xN7u) |
| 6 | Measuring Tape | tool | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/qqvDYx7RtHZd/dataset) | [View](https://dx.gl/datasets/qqvDYx7RtHZd) |
| 7 | Modern Arm Chair | furniture | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/KLBJAuie9JaB/dataset) | [View](https://dx.gl/datasets/KLBJAuie9JaB) |
| 8 | Multi Cleaner 5L | product | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/79gDW15Gw9Ft/dataset) | [View](https://dx.gl/datasets/79gDW15Gw9Ft) |
| 9 | Potted Plant | organic | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/o4c5zRyGuT7W/dataset) | [View](https://dx.gl/datasets/o4c5zRyGuT7W) |
| 10 | Wet Floor Sign | plastic | [ZIP](https://dx.gl/api/v/EJbs8npt2RVM/tHdRul1GzzoU/dataset) | [View](https://dx.gl/datasets/tHdRul1GzzoU) |

All source models are from [Polyhaven](https://polyhaven.com) (CC0).

## Pre-trained 3DGS Splats

We include pre-trained Gaussian Splat `.ply` files (nerfstudio splatfacto, 20k iterations, SH degree 3) for each object. Download them with:

```bash
python download_all.py --splats
```

Or view them directly:

- [DX.GL Splat Viewer](https://dx.gl/splat/index.html) (all 10 models, use ← → to browse)
- [SuperSplat Editor](https://superspl.at/editor) (drag and drop the .ply)
- nerfstudio viewer: `ns-viewer --load-config outputs/*/config.yml`

### Training Parameters

```bash
ns-train splatfacto --data ./dataset \
  --max-num-iterations 20000 \
  --pipeline.model.sh-degree 3 \
  --pipeline.model.background-color white \
  --pipeline.model.cull-alpha-thresh 0.2 \
  --pipeline.model.densify-size-thresh 0.005 \
  --pipeline.model.use-scale-regularization True \
  --pipeline.model.max-gauss-ratio 5.0
```

Training time: ~10 minutes per object on an RTX 4000 Pro Ada (70W) at the 196-view, 1024×1024 tier.

## Rendering Pipeline

Datasets are rendered using [DX.GL](https://dx.gl)'s cloud GPU rendering pipeline:

- **Lighting**: Studio HDRI environment with PBR materials
- **Camera**: Fibonacci golden-angle sphere distribution
- **Depth**: Tight near/far planes from the model's bounding sphere for maximum precision
- **Point cloud**: Back-projected from depth maps, ~1,000 points per view
- **Background**: Transparent (RGBA)
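
The back-projection step can be sketched as follows. This is an illustrative reconstruction, not the pipeline's code; it assumes metric depth, pinhole intrinsics from `transforms.json`, and the nerfstudio/OpenGL camera convention (camera looks down -Z, +Y up in camera space):

```python
import numpy as np

def backproject(depth, mask, fl_x, fl_y, cx, cy, c2w, stride=8):
    """Lift masked depth pixels into world-space 3D points."""
    h, w = depth.shape
    ys, xs = np.mgrid[0:h:stride, 0:w:stride]
    keep = mask[ys, xs] > 0                # only foreground pixels
    xs, ys = xs[keep], ys[keep]
    d = depth[ys, xs]
    x_cam = (xs - cx) / fl_x * d
    y_cam = -(ys - cy) / fl_y * d          # image y grows down, camera +Y is up
    z_cam = -d                             # camera looks along -Z
    pts_cam = np.column_stack([x_cam, y_cam, z_cam, np.ones(len(d))])
    return (pts_cam @ c2w.T)[:, :3]        # homogeneous transform to world space
```

With the default `stride=8` on 1024×1024 frames this yields on the order of a thousand points per view, consistent with the ~1,000-points-per-view figure above.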

## Modalities

| Modality | Format | Notes |
|---|---|---|
| **RGB** | PNG, RGBA | Transparent background, PBR-lit |
| **Depth (8-bit)** | PNG, grayscale | Normalized to the near/far range |
| **Depth (16-bit)** | PNG | RG-encoded, higher precision |
| **Normals** | PNG, RGB | World-space, MeshNormalMaterial |
| **Masks** | PNG, grayscale | Binary, from the RGB alpha channel |
| **Point Cloud** | PLY, binary | XYZ + RGB, ~200k points (~1,000 per view × 196 views) |
| **Camera Poses** | JSON | 4×4 camera-to-world matrices |
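
If you need to decode the higher-precision depth yourself, a sketch like the one below should work. The exact encoding is an assumption on our part (the directory listing describes the 16-bit maps as grayscale, the table as RG-encoded), so this handles both a single 16-bit channel and a high-byte-in-R / low-byte-in-G split:

```python
import numpy as np

def decode_depth16(img, near, far):
    """Decode a higher-precision depth image to metric depth.

    ASSUMED layout: either a single 16-bit grayscale channel, or an
    8-bit RGB image with the high byte in R and the low byte in G.
    """
    img = np.asarray(img)
    if img.ndim == 2 and img.dtype == np.uint16:
        t = img.astype(np.float32) / 65535.0
    elif img.ndim == 3:
        hi = img[..., 0].astype(np.uint16)
        lo = img[..., 1].astype(np.uint16)
        t = ((hi << 8) | lo).astype(np.float32) / 65535.0
    else:
        raise ValueError(f"Unexpected depth image: shape={img.shape}, dtype={img.dtype}")
    return near + t * (far - near)
```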

## License

All source 3D models are **CC0** (public domain) from [Polyhaven](https://polyhaven.com). The rendered datasets inherit this license: use them for anything, no attribution required.

## Citation

```bibtex
@misc{dxgl_multiview_2026,
  title = {DX.GL Multi-View Datasets for NeRF and 3D Gaussian Splatting},
  author = {DXGL},
  year = {2026},
  url = {https://huggingface.co/datasets/dxgl/multiview-datasets},
  note = {Multi-view datasets with depth, normals, masks, and point clouds. Rendered via DX.GL.}
}
```

## Links

- **This collection**: [dx.gl/datasets/polyhaven-10](https://dx.gl/datasets/polyhaven-10)
- **Browse all datasets**: [dx.gl/datasets](https://dx.gl/datasets)
- **Pipeline details**: [dx.gl/for-research](https://dx.gl/for-research)
- **API documentation**: [dx.gl/portal/docs](https://dx.gl/portal/docs)
- **Generate your own**: [dx.gl/signup](https://dx.gl/signup) (2 free renders included)

## Feedback

We're actively improving the rendering pipeline. If you find issues with depth accuracy, mask quality, camera calibration, or view distribution, please open a Discussion on this repo. Specific feedback we're looking for:

- Depth map accuracy at object edges
- Mask quality for transparent/reflective materials
- Point cloud alignment with RGB views
- View distribution quality for your training method
- Missing modalities or metadata
- Any other issues or suggestions
camera-distribution.png
ADDED
Git LFS Details
download_all.py
ADDED
@@ -0,0 +1,128 @@
#!/usr/bin/env python3
"""
Download DX.GL multi-view datasets from the manifest.

Usage:
    python download_all.py                      # download all datasets
    python download_all.py --object apple       # specific object
    python download_all.py --output ./datasets  # custom output directory
    python download_all.py --splats             # also download pre-trained .ply splats

Requires: pip install requests
"""

import argparse
import json
import os
import sys

try:
    import requests
except ImportError:
    print("Please install requests: pip install requests")
    sys.exit(1)

MANIFEST_URL = "https://huggingface.co/datasets/dxgl/multiview-datasets/resolve/main/manifest.json"
MANIFEST_LOCAL = os.path.join(os.path.dirname(os.path.abspath(__file__)), "manifest.json")


def load_manifest():
    """Load manifest from local file or download from HuggingFace."""
    if os.path.exists(MANIFEST_LOCAL):
        with open(MANIFEST_LOCAL) as f:
            return json.load(f)
    print(f"Downloading manifest from {MANIFEST_URL} ...")
    resp = requests.get(MANIFEST_URL)
    resp.raise_for_status()
    return resp.json()


def download_file(url, dest_path):
    """Download a file with progress display."""
    resp = requests.get(url, stream=True)
    resp.raise_for_status()
    total = int(resp.headers.get("content-length", 0))
    downloaded = 0
    with open(dest_path, "wb") as f:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            f.write(chunk)
            downloaded += len(chunk)
            if total > 0:
                pct = downloaded / total * 100
                mb = downloaded / 1e6
                print(f"\r  {mb:.1f} MB ({pct:.0f}%)", end="", flush=True)
    print()


def main():
    parser = argparse.ArgumentParser(description="Download DX.GL multi-view datasets")
    parser.add_argument("--object", default=None,
                        help="Download only a specific object (by name, case-insensitive)")
    parser.add_argument("--output", default="./dxgl-datasets",
                        help="Output directory (default: ./dxgl-datasets)")
    parser.add_argument("--splats", action="store_true",
                        help="Also download pre-trained 3DGS .ply files")
    parser.add_argument("--extract", action="store_true", default=True,
                        help="Extract ZIPs after download (default: true)")
    parser.add_argument("--no-extract", action="store_true",
                        help="Keep ZIPs without extracting")
    args = parser.parse_args()

    manifest = load_manifest()
    objects = manifest["objects"]

    # Filter by object name
    if args.object:
        objects = [o for o in objects if args.object.lower() in o["name"].lower()]
        if not objects:
            print(f"No object matching '{args.object}' found in manifest.")
            sys.exit(1)

    os.makedirs(args.output, exist_ok=True)

    total_downloads = 0
    for obj in objects:
        name_slug = obj["name"].lower().replace(" ", "_")
        filename = f"{name_slug}.zip"
        dest_path = os.path.join(args.output, filename)

        if os.path.exists(dest_path):
            print(f"  Skipping {filename} (already exists)")
            continue

        print(f"Downloading {obj['name']} ...")
        download_file(obj["download_url"], dest_path)
        total_downloads += 1

        # Extract
        if args.extract and not args.no_extract:
            import zipfile
            extract_dir = os.path.join(args.output, name_slug)
            os.makedirs(extract_dir, exist_ok=True)
            with zipfile.ZipFile(dest_path) as zf:
                zf.extractall(extract_dir)
            print(f"  Extracted to {extract_dir}")

    # Download splats
    if args.splats:
        splats_dir = os.path.join(args.output, "splats")
        os.makedirs(splats_dir, exist_ok=True)
        for obj in objects:
            if "splat_url" not in obj or not obj["splat_url"]:
                continue
            name_slug = obj["name"].lower().replace(" ", "_")
            dest_path = os.path.join(splats_dir, f"{name_slug}.ply")
            if os.path.exists(dest_path):
                print(f"  Skipping {name_slug}.ply (already exists)")
                continue
            print(f"Downloading splat: {obj['name']} ...")
            download_file(obj["splat_url"], dest_path)
            total_downloads += 1

    print(f"\nDone. Downloaded {total_downloads} files to {args.output}")
    if total_downloads == 0:
        print("(All files already existed; delete them to re-download)")


if __name__ == "__main__":
    main()
manifest.json
ADDED
@@ -0,0 +1,114 @@
{
  "name": "dxgl-multiview-datasets",
  "version": "1.0",
  "description": "Multi-view datasets rendered from CC0 3D models via DX.GL. Calibrated cameras with RGB, depth, normals, masks, and point clouds.",
  "pipeline": "DX.GL GPU Renderer",
  "format": "nerfstudio / instant-ngp (transforms.json)",
  "coverage": "sphere",
  "modalities": ["rgb", "depth_8bit", "depth_16bit", "normals", "masks", "pointcloud", "camera_poses"],
  "license": "CC0-1.0",
  "source": "Polyhaven (polyhaven.com)",
  "tier": { "views": 196, "resolution": "1024x1024", "points": 200000 },
  "objects": [
    {
      "name": "Apple",
      "source_url": "https://polyhaven.com/a/food_apple_01",
      "license": "CC0",
      "category": "organic",
      "render_id": "vCHDLxjWG65d",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/vCHDLxjWG65d/dataset",
      "dataset_url": "https://dx.gl/datasets/vCHDLxjWG65d",
      "splat_url": "https://dx.gl/splat/apple.ply"
    },
    {
      "name": "Cash Register",
      "source_url": "https://polyhaven.com/a/cash_register",
      "license": "CC0",
      "category": "electronics",
      "render_id": "JfjLRexr6J7z",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/JfjLRexr6J7z/dataset",
      "dataset_url": "https://dx.gl/datasets/JfjLRexr6J7z",
      "splat_url": "https://dx.gl/splat/cash_register.ply"
    },
    {
      "name": "Drill",
      "source_url": "https://polyhaven.com/a/drill",
      "license": "CC0",
      "category": "tool",
      "render_id": "A0dcsk7HHgAg",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/A0dcsk7HHgAg/dataset",
      "dataset_url": "https://dx.gl/datasets/A0dcsk7HHgAg",
      "splat_url": "https://dx.gl/splat/drill.ply"
    },
    {
      "name": "Fire Extinguisher",
      "source_url": "https://polyhaven.com/a/korean_fire_extinguisher",
      "license": "CC0",
      "category": "metallic",
      "render_id": "cLgyqM5mhQoq",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/cLgyqM5mhQoq/dataset",
      "dataset_url": "https://dx.gl/datasets/cLgyqM5mhQoq",
      "splat_url": "https://dx.gl/splat/fire_extinguisher.ply"
    },
    {
      "name": "LED Lightbulb",
      "source_url": "https://polyhaven.com/a/lightbulb_led",
      "license": "CC0",
      "category": "glass",
      "render_id": "ZuYmv3K9xN7u",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/ZuYmv3K9xN7u/dataset",
      "dataset_url": "https://dx.gl/datasets/ZuYmv3K9xN7u",
      "splat_url": "https://dx.gl/splat/led_lightbulb.ply"
    },
    {
      "name": "Measuring Tape",
      "source_url": "https://polyhaven.com/a/measuring_tape",
      "license": "CC0",
      "category": "tool",
      "render_id": "qqvDYx7RtHZd",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/qqvDYx7RtHZd/dataset",
      "dataset_url": "https://dx.gl/datasets/qqvDYx7RtHZd",
      "splat_url": "https://dx.gl/splat/measuring_tape.ply"
    },
    {
      "name": "Modern Arm Chair",
      "source_url": "https://polyhaven.com/a/modern_arm_chair",
      "license": "CC0",
      "category": "furniture",
      "render_id": "KLBJAuie9JaB",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/KLBJAuie9JaB/dataset",
      "dataset_url": "https://dx.gl/datasets/KLBJAuie9JaB",
      "splat_url": "https://dx.gl/splat/modern_arm_chair.ply"
    },
    {
      "name": "Multi Cleaner 5L",
      "source_url": "https://polyhaven.com/a/multi_cleaner_5_litre",
      "license": "CC0",
      "category": "product",
      "render_id": "79gDW15Gw9Ft",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/79gDW15Gw9Ft/dataset",
      "dataset_url": "https://dx.gl/datasets/79gDW15Gw9Ft",
      "splat_url": "https://dx.gl/splat/multi_cleaner_5l.ply"
    },
    {
      "name": "Potted Plant",
      "source_url": "https://polyhaven.com/a/potted_plant",
      "license": "CC0",
      "category": "organic",
      "render_id": "o4c5zRyGuT7W",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/o4c5zRyGuT7W/dataset",
      "dataset_url": "https://dx.gl/datasets/o4c5zRyGuT7W",
      "splat_url": "https://dx.gl/splat/potted_plant.ply"
    },
    {
      "name": "Wet Floor Sign",
      "source_url": "https://polyhaven.com/a/wet_floor_sign",
      "license": "CC0",
      "category": "plastic",
      "render_id": "tHdRul1GzzoU",
      "download_url": "https://dx.gl/api/v/EJbs8npt2RVM/tHdRul1GzzoU/dataset",
      "dataset_url": "https://dx.gl/datasets/tHdRul1GzzoU",
      "splat_url": "https://dx.gl/splat/wet_floor_sign.ply"
    }
  ]
}
train_all.py
ADDED
@@ -0,0 +1,351 @@
#!/usr/bin/env python3
"""
Train 3D Gaussian Splats for all DX.GL multi-view datasets.

Downloads datasets (if needed), trains each with nerfstudio splatfacto,
exports the PLY, and converts to .splat for web viewers.

Usage:
    python train_all.py                             # train all objects
    python train_all.py --object apple              # train specific object
    python train_all.py --data-dir ./dxgl-datasets  # custom dataset location
    python train_all.py --output ./splats           # custom output directory
    python train_all.py --iterations 30000          # custom iteration count
    python train_all.py --dry-run                   # show what would be trained

Requires:
    pip install nerfstudio requests plyfile numpy
"""

import argparse
import glob
import json
import os
import struct
import subprocess
import sys
import time

try:
    import numpy as np
    from plyfile import PlyData
except ImportError:
    print("Please install dependencies: pip install plyfile numpy requests")
    sys.exit(1)

MANIFEST_LOCAL = os.path.join(os.path.dirname(os.path.abspath(__file__)), "manifest.json")

# Validated training params (from tuning on RTX 4000 Pro Ada)
DEFAULT_ITERATIONS = 20000
TRAIN_PARAMS = [
    "--pipeline.model.sh-degree", "3",
    "--pipeline.model.background-color", "white",
    "--pipeline.model.cull-alpha-thresh", "0.2",
    "--pipeline.model.densify-size-thresh", "0.005",
    "--pipeline.model.use-scale-regularization", "True",
    "--pipeline.model.max-gauss-ratio", "5.0",
]


# ── PLY → .splat conversion (from scripts/ply-to-splat.py) ──────────────

SH_C0 = 0.28209479177387814  # 1 / (2 * sqrt(pi))


def _ply_field(v, *names):
    """Find the first matching field name in PLY vertex data."""
    available = v.data.dtype.names if hasattr(v.data, "dtype") else v.dtype.names
    for name in names:
        if name in available:
            return v[name]
    raise KeyError(f"No field matching: {names}. Available: {available}")


def ply_to_splat(input_path: str, output_path: str):
    """Convert a nerfstudio Gaussian Splatting PLY to .splat format."""
    ply = PlyData.read(input_path)
    v = ply["vertex"]
    n = len(v)

    xyz = np.column_stack([v["x"], v["y"], v["z"]]).astype(np.float32)

    s0 = _ply_field(v, "f_scale_0", "scale_0", "sx")
    s1 = _ply_field(v, "f_scale_1", "scale_1", "sy")
    s2 = _ply_field(v, "f_scale_2", "scale_2", "sz")
    scales = np.exp(np.column_stack([s0, s1, s2])).astype(np.float32)

    raw_opacity = _ply_field(v, "opacity", "f_opacity")
    opacity = 1.0 / (1.0 + np.exp(-raw_opacity.astype(np.float64)))  # sigmoid

    dc0 = _ply_field(v, "f_dc_0", "f_rest_0", "red")
    dc1 = _ply_field(v, "f_dc_1", "f_rest_1", "green")
    dc2 = _ply_field(v, "f_dc_2", "f_rest_2", "blue")
    if dc0.max() > 10:
        # Already 0-255 color values
        r = np.clip(dc0, 0, 255).astype(np.uint8)
        g = np.clip(dc1, 0, 255).astype(np.uint8)
        b = np.clip(dc2, 0, 255).astype(np.uint8)
    else:
        # SH DC coefficients: convert to linear color
        r = np.clip((0.5 + SH_C0 * dc0) * 255, 0, 255).astype(np.uint8)
        g = np.clip((0.5 + SH_C0 * dc1) * 255, 0, 255).astype(np.uint8)
        b = np.clip((0.5 + SH_C0 * dc2) * 255, 0, 255).astype(np.uint8)
    a = np.clip(opacity * 255, 0, 255).astype(np.uint8)

    qw = _ply_field(v, "rot_0", "qw", "f_rot_0").astype(np.float64)
    qx = _ply_field(v, "rot_1", "qx", "f_rot_1").astype(np.float64)
    qy = _ply_field(v, "rot_2", "qy", "f_rot_2").astype(np.float64)
    qz = _ply_field(v, "rot_3", "qz", "f_rot_3").astype(np.float64)
    norm = np.sqrt(qw * qw + qx * qx + qy * qy + qz * qz)
    qw /= norm
    qx /= norm
    qy /= norm
    qz /= norm
    rot_x = np.clip(qx * 128 + 128, 0, 255).astype(np.uint8)
    rot_y = np.clip(qy * 128 + 128, 0, 255).astype(np.uint8)
    rot_z = np.clip(qz * 128 + 128, 0, 255).astype(np.uint8)
    rot_w = np.clip(qw * 128 + 128, 0, 255).astype(np.uint8)

    # Sort by opacity (most opaque first), as web viewers expect
    order = np.argsort(-opacity)

    # 32 bytes per splat: 3f position, 3f scale, 4B RGBA, 4B quaternion
    buf = bytearray(n * 32)
    for i in range(n):
        idx = order[i]
        off = i * 32
        struct.pack_into("3f", buf, off, xyz[idx, 0], xyz[idx, 1], xyz[idx, 2])
        struct.pack_into("3f", buf, off + 12, scales[idx, 0], scales[idx, 1], scales[idx, 2])
        buf[off + 24] = r[idx]
        buf[off + 25] = g[idx]
        buf[off + 26] = b[idx]
        buf[off + 27] = a[idx]
        buf[off + 28] = rot_w[idx]
        buf[off + 29] = rot_x[idx]
        buf[off + 30] = rot_y[idx]
        buf[off + 31] = rot_z[idx]

    with open(output_path, "wb") as f:
        f.write(buf)

    ply_mb = os.path.getsize(input_path) / 1e6
    splat_mb = len(buf) / 1e6
    return n, ply_mb, splat_mb


# ── Nerfstudio helpers ───────────────────────────────────────────────────

def find_latest_config(output_base: str, experiment_name: str):
    """Find the most recent config.yml from nerfstudio outputs."""
    pattern = os.path.join(output_base, experiment_name, "splatfacto", "*", "config.yml")
    configs = sorted(glob.glob(pattern))
    if not configs:
        return None
    return configs[-1]  # latest timestamp


def train_splatfacto(data_dir: str, experiment_name: str, output_base: str,
                     max_iterations: int):
    """Run ns-train splatfacto for a single dataset."""
    cmd = [
        "ns-train", "splatfacto",
        "--data", data_dir,
        "--output-dir", output_base,
        "--experiment-name", experiment_name,
        "--max-num-iterations", str(max_iterations),
        *TRAIN_PARAMS,
    ]
    print(f"  Command: {' '.join(cmd)}")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        raise RuntimeError(f"ns-train failed with exit code {result.returncode}")


def export_ply(config_path: str, export_dir: str):
    """Run ns-export gaussian-splat to get the PLY file."""
    os.makedirs(export_dir, exist_ok=True)
    cmd = [
        "ns-export", "gaussian-splat",
        "--load-config", config_path,
        "--output-dir", export_dir,
    ]
    print(f"  Export command: {' '.join(cmd)}")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        raise RuntimeError(f"ns-export failed with exit code {result.returncode}")

    # ns-export writes splat.ply in the output dir
    ply_path = os.path.join(export_dir, "splat.ply")
    if not os.path.exists(ply_path):
        # Some versions write point_cloud.ply
        alt = os.path.join(export_dir, "point_cloud.ply")
        if os.path.exists(alt):
            return alt
        raise FileNotFoundError(f"No PLY found in {export_dir}")
    return ply_path


# ── Main ─────────────────────────────────────────────────────────────────

def load_manifest():
    if os.path.exists(MANIFEST_LOCAL):
        with open(MANIFEST_LOCAL) as f:
            return json.load(f)
    try:
        import requests
        url = "https://huggingface.co/datasets/dxgl/multiview-datasets/resolve/main/manifest.json"
        print(f"Downloading manifest from {url} ...")
        resp = requests.get(url)
        resp.raise_for_status()
        return resp.json()
    except ImportError:
        print("manifest.json not found locally and requests not installed.")
+
sys.exit(1)
|
| 197 |
+
|
| 198 |
+
|
| 199 |
+
def main():
|
| 200 |
+
parser = argparse.ArgumentParser(
|
| 201 |
+
description="Train 3DGS splats for all DX.GL multi-view datasets"
|
| 202 |
+
)
|
| 203 |
+
parser.add_argument("--object", default=None,
|
| 204 |
+
help="Train only a specific object (by name, case-insensitive)")
|
| 205 |
+
parser.add_argument("--data-dir", default="./dxgl-datasets",
|
| 206 |
+
help="Directory containing extracted datasets (default: ./dxgl-datasets)")
|
| 207 |
+
parser.add_argument("--output", default="./dxgl-splats",
|
| 208 |
+
help="Output directory for .ply and .splat files (default: ./dxgl-splats)")
|
| 209 |
+
parser.add_argument("--ns-output", default="./ns-outputs",
|
| 210 |
+
help="Nerfstudio outputs/checkpoints directory (default: ./ns-outputs)")
|
| 211 |
+
parser.add_argument("--iterations", type=int, default=DEFAULT_ITERATIONS,
|
| 212 |
+
help=f"Max training iterations (default: {DEFAULT_ITERATIONS})")
|
| 213 |
+
parser.add_argument("--dry-run", action="store_true",
|
| 214 |
+
help="Show what would be trained without running")
|
| 215 |
+
parser.add_argument("--export-only", action="store_true",
|
| 216 |
+
help="Skip training, only export/convert from existing ns-outputs")
|
| 217 |
+
args = parser.parse_args()
|
| 218 |
+
|
| 219 |
+
manifest = load_manifest()
|
| 220 |
+
objects = manifest["objects"]
|
| 221 |
+
|
| 222 |
+
if args.object:
|
| 223 |
+
objects = [o for o in objects if args.object.lower() in o["name"].lower()]
|
| 224 |
+
if not objects:
|
| 225 |
+
print(f"No object matching '{args.object}' found in manifest.")
|
| 226 |
+
sys.exit(1)
|
| 227 |
+
|
| 228 |
+
os.makedirs(args.output, exist_ok=True)
|
| 229 |
+
os.makedirs(args.ns_output, exist_ok=True)
|
| 230 |
+
|
| 231 |
+
results = []
|
| 232 |
+
total_time = 0
|
| 233 |
+
|
| 234 |
+
for i, obj in enumerate(objects, 1):
|
| 235 |
+
name = obj["name"]
|
| 236 |
+
slug = name.lower().replace(" ", "_")
|
| 237 |
+
splat_out = os.path.join(args.output, f"{slug}.splat")
|
| 238 |
+
ply_out = os.path.join(args.output, f"{slug}.ply")
|
| 239 |
+
|
| 240 |
+
print(f"\n{'='*60}")
|
| 241 |
+
print(f"[{i}/{len(objects)}] {name}")
|
| 242 |
+
print(f"{'='*60}")
|
| 243 |
+
|
| 244 |
+
# Check if already done
|
| 245 |
+
if os.path.exists(splat_out) and not args.export_only:
|
| 246 |
+
size_mb = os.path.getsize(splat_out) / 1e6
|
| 247 |
+
print(f" ✓ Already done ({size_mb:.1f} MB .splat) — skipping")
|
| 248 |
+
results.append({"name": name, "status": "skipped"})
|
| 249 |
+
continue
|
| 250 |
+
|
| 251 |
+
# Find dataset
|
| 252 |
+
data_dir = os.path.join(args.data_dir, slug)
|
| 253 |
+
transforms = os.path.join(data_dir, "transforms.json")
|
| 254 |
+
if not os.path.exists(transforms):
|
| 255 |
+
# Try nested: dataset might be inside a subdirectory
|
| 256 |
+
nested = os.path.join(data_dir, "dataset", "transforms.json")
|
| 257 |
+
if os.path.exists(nested):
|
| 258 |
+
data_dir = os.path.join(data_dir, "dataset")
|
| 259 |
+
else:
|
| 260 |
+
print(f" ✗ Dataset not found at {data_dir}")
|
| 261 |
+
print(f" Run download_all.py first, or specify --data-dir")
|
| 262 |
+
results.append({"name": name, "status": "missing_data"})
|
| 263 |
+
continue
|
| 264 |
+
|
| 265 |
+
if args.dry_run:
|
| 266 |
+
print(f" Would train: {data_dir}")
|
| 267 |
+
print(f" Iterations: {args.iterations}")
|
| 268 |
+
print(f" Output: {splat_out}")
|
| 269 |
+
results.append({"name": name, "status": "dry_run"})
|
| 270 |
+
continue
|
| 271 |
+
|
| 272 |
+
t0 = time.time()
|
| 273 |
+
experiment = slug
|
| 274 |
+
|
| 275 |
+
# Step 1: Train (unless export-only)
|
| 276 |
+
if not args.export_only:
|
| 277 |
+
print(f" Training splatfacto ({args.iterations} iterations) ...")
|
| 278 |
+
try:
|
| 279 |
+
train_splatfacto(data_dir, experiment, args.ns_output, args.iterations)
|
| 280 |
+
except RuntimeError as e:
|
| 281 |
+
print(f" ✗ Training failed: {e}")
|
| 282 |
+
results.append({"name": name, "status": "train_error", "error": str(e)})
|
| 283 |
+
continue
|
| 284 |
+
|
| 285 |
+
# Step 2: Find config and export PLY
|
| 286 |
+
config_path = find_latest_config(args.ns_output, experiment)
|
| 287 |
+
if not config_path:
|
| 288 |
+
print(f" ✗ No config.yml found in {args.ns_output}/{experiment}/")
|
| 289 |
+
results.append({"name": name, "status": "no_config"})
|
| 290 |
+
continue
|
| 291 |
+
|
| 292 |
+
print(f" Exporting PLY from {config_path} ...")
|
| 293 |
+
export_dir = os.path.join(args.ns_output, experiment, "export")
|
| 294 |
+
try:
|
| 295 |
+
exported_ply = export_ply(config_path, export_dir)
|
| 296 |
+
except (RuntimeError, FileNotFoundError) as e:
|
| 297 |
+
print(f" ✗ Export failed: {e}")
|
| 298 |
+
results.append({"name": name, "status": "export_error", "error": str(e)})
|
| 299 |
+
continue
|
| 300 |
+
|
| 301 |
+
# Step 3: Copy PLY to output
|
| 302 |
+
import shutil
|
| 303 |
+
shutil.copy2(exported_ply, ply_out)
|
| 304 |
+
ply_mb = os.path.getsize(ply_out) / 1e6
|
| 305 |
+
print(f" PLY: {ply_mb:.1f} MB → {ply_out}")
|
| 306 |
+
|
| 307 |
+
# Step 4: Convert to .splat
|
| 308 |
+
print(f" Converting to .splat ...")
|
| 309 |
+
try:
|
| 310 |
+
n_gaussians, _, splat_mb = ply_to_splat(ply_out, splat_out)
|
| 311 |
+
except Exception as e:
|
| 312 |
+
print(f" ✗ Conversion failed: {e}")
|
| 313 |
+
results.append({"name": name, "status": "convert_error", "error": str(e)})
|
| 314 |
+
continue
|
| 315 |
+
|
| 316 |
+
elapsed = time.time() - t0
|
| 317 |
+
total_time += elapsed
|
| 318 |
+
print(f" ✓ Done: {n_gaussians:,} gaussians, {splat_mb:.1f} MB .splat ({elapsed:.0f}s)")
|
| 319 |
+
results.append({
|
| 320 |
+
"name": name, "status": "done",
|
| 321 |
+
"gaussians": n_gaussians, "ply_mb": round(ply_mb, 1),
|
| 322 |
+
"splat_mb": round(splat_mb, 1), "seconds": round(elapsed),
|
| 323 |
+
})
|
| 324 |
+
|
| 325 |
+
# Summary
|
| 326 |
+
print(f"\n{'='*60}")
|
| 327 |
+
print("SUMMARY")
|
| 328 |
+
print(f"{'='*60}")
|
| 329 |
+
done = [r for r in results if r["status"] == "done"]
|
| 330 |
+
skipped = [r for r in results if r["status"] == "skipped"]
|
| 331 |
+
errors = [r for r in results if r["status"] not in ("done", "skipped", "dry_run")]
|
| 332 |
+
|
| 333 |
+
if done:
|
| 334 |
+
print(f"\n Trained: {len(done)}")
|
| 335 |
+
for r in done:
|
| 336 |
+
print(f" {r['name']}: {r['gaussians']:,} gaussians, "
|
| 337 |
+
f"{r['splat_mb']} MB, {r['seconds']}s")
|
| 338 |
+
if skipped:
|
| 339 |
+
print(f"\n Skipped (already done): {len(skipped)}")
|
| 340 |
+
if errors:
|
| 341 |
+
print(f"\n Errors: {len(errors)}")
|
| 342 |
+
for r in errors:
|
| 343 |
+
print(f" {r['name']}: {r['status']} — {r.get('error', '')}")
|
| 344 |
+
|
| 345 |
+
if total_time > 0:
|
| 346 |
+
print(f"\n Total training time: {total_time/60:.1f} minutes")
|
| 347 |
+
print(f" Output: {os.path.abspath(args.output)}")
|
| 348 |
+
|
| 349 |
+
|
| 350 |
+
if __name__ == "__main__":
|
| 351 |
+
main()
|