fix(arch): Config SSOT, reproducible builds, and data pipeline documentation (#41)
* fix(arch): consolidate config and enforce reproducible builds
Consolidates API config into core/config.py (BUG-009) and enforces reproducible builds in Dockerfile using uv and pinned base image (BUG-012).
* chore(deps): migrate from CloseChoice/datasets to neuroimaging-go-brrrr
Replaces the hard-forked dependency with the maintained extension package from `The-Obstacle-Is-The-Way/neuroimaging-go-brrrr`. This aligns with the new domain-specific extension architecture while preserving BIDS/NIfTI lazy-loading capabilities.
* chore: add results/ to .gitignore
* docs: update NEXT-CONCERNS.md with validated dependency fixes
* docs: add comprehensive documentation for data pipeline and refactor plans
Introduces detailed specifications for the neuroimaging-go-brrrr data pipeline, addressing the integration of NIfTI/BIDS formats with HuggingFace datasets. The documentation outlines the current architecture, proposed refactor for data loading, and the async job queue for long-running ML inference to mitigate timeout issues. Additionally, it includes a postmortem on the NiiVue and Gradio integration attempts, highlighting the challenges faced and lessons learned. This commit aims to enhance clarity and guide future development efforts.
* docs: correct specs per senior review feedback
CORRECTIONS:
- Nifti() returns nibabel.Nifti1Image, NOT numpy array
- Nifti1ImageWrapper calls get_fdata() in constructor (eager load)
- CaseFiles contract requires Paths, wrapper must materialize to temp files
- staging.py already handles nibabel via hasattr(source, "to_filename")
ADDED TO SPEC:
- Tests to delete/rewrite: test_hf_adapter.py, test_loader.py
- Risk matrix: full dataset download, all modalities decoded, eager RAM
- Performance considerations and mitigations
- Open questions to validate during implementation
* style: fix markdown linting in refactor spec
Add language identifiers to fenced code blocks (text for ASCII diagrams)
- .gitignore +1 -0
- DATA-PIPELINE.md +177 -0
- Dockerfile +20 -13
- NEXT-CONCERNS.md +19 -197
- docs/bugs/BUGS-HF-SPACES-INTEGRATION.md +6 -10
- docs/specs/00-data-loading-refactor.md +340 -0
- docs/specs/{frontend → archive/frontend}/36-frontend-without-gradio-hf-spaces.md +2 -0
- docs/specs/{frontend → archive/frontend}/37-0-project-setup.md +0 -0
- docs/specs/{frontend → archive/frontend}/37-1-foundation-components.md +0 -0
- docs/specs/{frontend → archive/frontend}/37-2-api-layer.md +0 -0
- docs/specs/{frontend → archive/frontend}/37-3-interactive-components.md +0 -0
- docs/specs/{frontend → archive/frontend}/37-4-app-integration.md +0 -0
- docs/specs/{frontend → archive/frontend}/37-5-e2e-and-ci.md +0 -0
- docs/specs/{NIIVUE-GRADIO-POSTMORTEM.md → archive/frontend/NIIVUE-GRADIO-POSTMORTEM.md} +0 -0
- docs/specs/{async-job-queue.md → archive/frontend/async-job-queue.md} +0 -0
- frontend/README.md +5 -2
- pyproject.toml +2 -2
- requirements.txt +2 -3
- src/stroke_deepisles_demo/api/config.py +0 -16
- src/stroke_deepisles_demo/api/files.py +4 -3
- src/stroke_deepisles_demo/api/job_store.py +2 -2
- src/stroke_deepisles_demo/api/main.py +9 -22
- src/stroke_deepisles_demo/api/routes.py +7 -8
- src/stroke_deepisles_demo/core/config.py +13 -1
- uv.lock +38 -3
```diff
@@ -217,3 +217,4 @@ data/scratch/
 .DS_Store
 # Auto-generated at runtime (path is environment-specific)
 src/stroke_deepisles_demo/ui/assets/niivue-loader.html
+results/
```
@@ -0,0 +1,177 @@
# Data Pipeline

> **The Problem:** HuggingFace `datasets` doesn't natively support NIfTI/BIDS neuroimaging formats.
> **The Solution:** `neuroimaging-go-brrrr` extends `datasets` with a `Nifti()` feature type.

---

## What is neuroimaging-go-brrrr?

```text
┌──────────────────────────────────────────────────────────────────────────────────┐
│              neuroimaging-go-brrrr EXTENDS HUGGINGFACE DATASETS                  │
├──────────────────────────────────────────────────────────────────────────────────┤
│                                                                                  │
│  pip install datasets            pip install neuroimaging-go-brrrr               │
│  ─────────────────────           ─────────────────────────────────               │
│  Standard HuggingFace            EXTENDS datasets with:                          │
│  • Images, text, audio           • Nifti() feature type for .nii.gz              │
│  • Parquet/Arrow storage         • BIDS directory parsing                        │
│  • Hub integration               • Upload utilities (BIDS→Hub)                   │
│                                  • Validation utilities                          │
│                                  • Bug workarounds for upstream issues           │
│                                                                                  │
│  When you install neuroimaging-go-brrrr, you get:                                │
│  • A patched datasets library with Nifti() support (pinned git commit)           │
│  • bids_hub module for upload/validation                                         │
│  • All upstream bug workarounds in one place                                     │
│                                                                                  │
└──────────────────────────────────────────────────────────────────────────────────┘
```

**Key insight:** `neuroimaging-go-brrrr` pins to a specific commit of `datasets` that includes `Nifti()` support:

```toml
# From neuroimaging-go-brrrr/pyproject.toml
[tool.uv.sources]
datasets = { git = "https://github.com/huggingface/datasets.git", rev = "004a5bf4..." }
```

---

## The Two Pipelines

### Pipeline 1: UPLOAD (How Data Gets to HuggingFace)

```text
┌───────────────────┐      ┌────────────────────────┐      ┌───────────────────────┐
│  Local BIDS       │      │  neuroimaging-go-      │      │  HuggingFace Hub      │
│  Directory        │ ───▶ │  brrrr (bids_hub)      │ ───▶ │  hugging-science/     │
│  (Zenodo)         │      │                        │      │  isles24-stroke       │
└───────────────────┘      │  • build_isles24_      │      └───────────────────────┘
                           │    file_table()        │
                           │  • Nifti() features    │
                           │  • push_to_hub()       │
                           └────────────────────────┘
```

### Pipeline 2: CONSUMPTION (How This Demo Loads Data)

**THE CORRECT PATTERN:**

```python
from datasets import load_dataset

# neuroimaging-go-brrrr provides the patched datasets with Nifti() support
ds = load_dataset("hugging-science/isles24-stroke", split="train")

# Access data - Nifti() returns nibabel.Nifti1Image objects
example = ds[0]
dwi = example["dwi"]                  # nibabel.Nifti1Image (NOT a numpy array)
adc = example["adc"]                  # nibabel.Nifti1Image
lesion_mask = example["lesion_mask"]  # nibabel.Nifti1Image

# To get a numpy array: dwi.get_fdata()
# To save to a file:   dwi.to_filename("dwi.nii.gz")
```

This is the **intended consumption pattern**. It should just work because:

1. `neuroimaging-go-brrrr` provides the patched `datasets` with `Nifti()` support
2. The dataset was uploaded with `Nifti()` features
3. `Nifti(decode=True)` returns nibabel images with affine/header preserved

---
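For download-size or RAM-sensitive paths, the companion refactor spec notes that `Nifti(decode=False)` yields a `{"path": ..., "bytes": ...}` dict instead of a decoded image. A minimal stdlib sketch of turning such a record into an on-disk file; `materialize_undecoded` and the `fake` record are hypothetical stand-ins, not project code:

```python
import tempfile
from pathlib import Path

def materialize_undecoded(example_field: dict, dest_dir: str) -> Path:
    """Write a Nifti(decode=False)-style {"path": ..., "bytes": ...} dict to disk."""
    name = Path(example_field["path"]).name if example_field.get("path") else "volume.nii.gz"
    out = Path(dest_dir) / name
    out.write_bytes(example_field["bytes"])
    return out

# Stand-in for a real decode=False record (illustrative bytes, not a valid .nii.gz)
fake = {"path": "sub-001/dwi.nii.gz", "bytes": b"\x1f\x8b..."}
tmp = tempfile.mkdtemp()
p = materialize_undecoded(fake, tmp)
```

The filename is taken from the record's `path` field so downstream tools that key on BIDS-style names keep working.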
## Current State: REFACTOR NEEDED

**Problem:** stroke-deepisles-demo currently has a hand-rolled workaround in `data/adapter.py` that bypasses `datasets.load_dataset()`. This workaround uses `HfFileSystem` + `pyarrow` directly to download individual parquet files.

**Why this is wrong:**

1. Duplicates bug workarounds that should live in `neuroimaging-go-brrrr`
2. Doesn't use the `Nifti()` feature type properly
3. Harder to maintain - fixes need to happen in multiple places

**The fix:**

1. Delete the custom `HuggingFaceDataset` adapter in `data/adapter.py`
2. Use the standard `datasets.load_dataset()` consumption pattern
3. If there are bugs, fix them in `neuroimaging-go-brrrr`, not locally

---

## Dependency Relationship

```text
stroke-deepisles-demo (this repo)
│
└── neuroimaging-go-brrrr @ v0.2.1
    │
    ├── datasets @ git commit 004a5bf4... (patched with Nifti())
    ├── huggingface-hub
    └── bids_hub module (upload + validation utilities)
```

**The consumption should flow through the standard pattern:**

```text
stroke-deepisles-demo
│
│  from datasets import load_dataset
│  ds = load_dataset("hugging-science/isles24-stroke")
▼
neuroimaging-go-brrrr (provides patched datasets)
│
│  Nifti() feature type handles lazy loading
▼
HuggingFace Hub (isles24-stroke dataset)
```

---

## Dataset Info

| Property | Value |
|----------|-------|
| Dataset ID | `hugging-science/isles24-stroke` |
| Subjects | 149 |
| Modalities | DWI, ADC, Lesion Mask, NCCT, CTA, CTP, Perfusion Maps |
| Source | [Zenodo 17652035](https://zenodo.org/records/17652035) |

---

## What bids_hub Provides

```text
┌──────────────────────────────────────────────────────────────────────────────────┐
│                        neuroimaging-go-brrrr (bids_hub)                          │
├──────────────────────────────────────────────────────────────────────────────────┤
│                                                                                  │
│  FOR UPLOADING:                          FOR CONSUMING:                          │
│  ──────────────                          ──────────────                          │
│  build_isles24_file_table()              Patched datasets with Nifti()           │
│  get_isles24_features()            ───▶  Use standard load_dataset()             │
│  push_dataset_to_hub()                                                           │
│  validate_isles24_download()                                                     │
│                                                                                  │
│  We DON'T use these in this demo.  ───▶  ISLES24_EXPECTED_COUNTS                 │
│  Dataset already uploaded.         ───▶  Can use for sanity checking             │
│                                                                                  │
└──────────────────────────────────────────────────────────────────────────────────┘
```

---

## Related Documentation

- [neuroimaging-go-brrrr](https://github.com/The-Obstacle-Is-The-Way/neuroimaging-go-brrrr)
- [isles24-stroke dataset card](https://huggingface.co/datasets/hugging-science/isles24-stroke)

---

## TODO: Refactor Data Loading

The current hand-rolled adapter in `data/adapter.py` should be replaced with standard `datasets.load_dataset()` consumption. This refactor should:

1. Remove the `HuggingFaceDataset` class from `data/adapter.py`
2. Update `data/loader.py` to use `datasets.load_dataset()`
3. Remove pre-computed constants in `data/constants.py` (no longer needed)
4. Test that `Nifti()` lazy loading works correctly
5. If bugs are found, report/fix them in `neuroimaging-go-brrrr`
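The bids_hub diagram above notes that `ISLES24_EXPECTED_COUNTS` can be used for sanity checking. A minimal sketch of that idea; `validate_case_count` and the stub `rows` are hypothetical, and the expected count of 149 comes from the Dataset Info table:

```python
# Hypothetical sanity check mirroring the "use ISLES24_EXPECTED_COUNTS for
# sanity checking" idea; `rows` stands in for loaded dataset records.
EXPECTED_SUBJECTS = 149  # from the Dataset Info table above

def validate_case_count(rows: list, expected: int = EXPECTED_SUBJECTS) -> None:
    case_ids = {r["case_id"] for r in rows}
    if len(case_ids) != expected:
        raise ValueError(f"expected {expected} subjects, found {len(case_ids)}")

rows = [{"case_id": f"sub-{i:03d}"} for i in range(149)]
validate_case_count(rows)  # passes silently for a complete download
```

A check like this would catch a partial download before inference starts, which is cheaper than failing mid-pipeline.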
```diff
@@ -13,7 +13,7 @@
 # FROM isleschallenge/deepisles@sha256:<digest>
 # Check https://hub.docker.com/r/isleschallenge/deepisles/tags for updates.
 # Current base: DeepISLES v1.1 (as of Dec 2025)
-FROM isleschallenge/deepisles:latest
+FROM isleschallenge/deepisles@sha256:848c9eceb67dbc585bcb37f093389d142caeaa98878bd31039af04ef297a5af4

 # Set environment variables for non-interactive installation
 ENV DEBIAN_FRONTEND=noninteractive
@@ -28,16 +28,21 @@
 # /app contains DeepISLES code (main.py, src/, weights/) that we must NOT overwrite
 WORKDIR /home/user/demo

-# Copy ...
-COPY --chown=1000:1000 ...
-
-# Install ...
-RUN pip install --no-cache-dir -r requirements.txt
+# Copy dependency files for reproducible installs
+COPY --chown=1000:1000 pyproject.toml uv.lock /home/user/demo/
+
+# Install uv for reproducible dependency management
+RUN pip install --no-cache-dir uv
+
+# Create virtual environment and add to PATH
+ENV VIRTUAL_ENV=/home/user/demo/.venv
+ENV PATH="$VIRTUAL_ENV/bin:$PATH"
+
+# Install Python dependencies from lock file (frozen = fail if lock stale)
+# This ensures CI, local dev, and production use IDENTICAL versions
+RUN uv sync --frozen --no-dev --no-install-project

 # Copy application source code and package files
-COPY --chown=1000:1000 pyproject.toml /home/user/demo/pyproject.toml
 COPY --chown=1000:1000 README.md /home/user/demo/README.md
 COPY --chown=1000:1000 src/ /home/user/demo/src/
@@ -45,9 +50,8 @@
 # This script runs in the conda env (Py3.8) and is called via subprocess
 COPY --chown=1000:1000 scripts/deepisles_adapter.py /app/deepisles_adapter.py

-# Install the package itself
-RUN pip install --no-cache-dir --no-deps -e .
+# Install the package itself (dependencies already installed from lock)
+RUN uv pip install --no-deps -e .

 # Set environment variable to indicate we're running in HF Spaces
 # This allows the app to detect runtime environment and use direct invocation
@@ -76,11 +80,14 @@
 ENTRYPOINT []

 # Explicit frontend origin for CORS
-ENV FRONTEND_ORIGIN=...
+ENV STROKE_DEMO_FRONTEND_ORIGINS='["https://vibecodermcswaggins-stroke-viewer-frontend.hf.space"]'

 # Explicit backend public URL for constructing file URLs
 # This ensures correct https:// URLs even if proxy headers aren't forwarded correctly
-ENV BACKEND_PUBLIC_URL=...
+ENV STROKE_DEMO_BACKEND_PUBLIC_URL=https://vibecodermcswaggins-stroke-deepisles-demo.hf.space
+
+# Results directory (matches default in code, but explicit is better)
+ENV STROKE_DEMO_RESULTS_DIR=/tmp/stroke-results

 # Run FastAPI with uvicorn (module path: stroke_deepisles_demo.api.main:app)
 # --proxy-headers: Trust X-Forwarded-Proto from HF Spaces proxy (ensures https:// in request.base_url)
```
@@ -5,212 +5,34 @@

**Removed** (old analysis):

## PART 1: CONFIG DRIFT (BUG-009)

### The Two Systems

**System A: Pydantic Settings** (`core/config.py`)

```python
# Line 93
results_dir: Path = Path("./results")
```

- Uses `STROKE_DEMO_*` env prefix
- Supports `.env` file loading
- Used by: `inference/deepisles.py`, `ui/components.py`, `ui/app.py`

**System B: Module Constants** (`api/config.py`)

```python
# Line 10
RESULTS_DIR = Path("/tmp/stroke-results")
```

- Raw constants, no env var support
- Used by: `routes.py`, `files.py`, `main.py`, `job_store.py`

### The Problem

These define **DIFFERENT VALUES** for the same concept:

- `./results` vs `/tmp/stroke-results`

Production uses System B (API), but System A exists and could cause confusion.

### Env Var Fragmentation

| Where | Var Name | System |
|-------|----------|--------|
| Dockerfile:79 | `FRONTEND_ORIGIN` | Raw env var |
| Dockerfile:83 | `BACKEND_PUBLIC_URL` | Raw env var |
| core/config.py:68 | `STROKE_DEMO_*` | Pydantic Settings |

**These don't talk to each other.** The Dockerfile sets raw vars that Settings doesn't read.

### Recommended Fix

**Option B from audit: Pydantic Settings as SSOT**

1. Add API-relevant settings to `core/config.py`:
   - `results_dir` (already exists; fix default to `/tmp/stroke-results`)
   - `max_concurrent_jobs`
   - `frontend_origin`
   - `backend_public_url`
2. Delete `api/config.py` or convert it to a thin shim that imports from Settings
3. Update Dockerfile env vars to use the `STROKE_DEMO_*` prefix
4. Update all API modules to import from `core/config.py`

---

## PART 2: DEPENDENCY PINNING (BUG-012)

### Current State: Production Build is NOT Reproducible

**What we have:**

- `uv.lock` ✅ Committed and used by CI
- `requirements.txt` ⚠️ Has loose `>=` ranges
- Dockerfile uses `pip install -r requirements.txt` → **IGNORES uv.lock**

**Evidence:**

```dockerfile
# Dockerfile:37 - IGNORES uv.lock!
RUN pip install --no-cache-dir -r requirements.txt
```

```txt
# requirements.txt - LOOSE PINS
fastapi>=0.115.0
pydantic>=2.5.0
uvicorn[standard]>=0.32.0
```

### Base Image Unpinned

```dockerfile
# Dockerfile:16 - NO SHA DIGEST
FROM isleschallenge/deepisles:latest
```

`:latest` can change anytime. A rebuild tomorrow could get different dependencies.

### The Problem

| Environment | Install Method | Reproducible? |
|-------------|----------------|---------------|
| CI | `uv sync` | ✅ Yes |
| Local dev | `uv sync` | ✅ Yes |
| **Production Docker** | `pip install -r requirements.txt` | ❌ **NO** |

### Recommended Fix

**Option A: Make Docker use uv.lock**

```dockerfile
# Install uv
RUN pip install uv

# Copy lock file
COPY uv.lock pyproject.toml ./

# Install from lock (frozen = fail if lock is stale)
RUN uv sync --frozen --no-dev
```

**Option B: Pin requirements.txt exactly**

```txt
# requirements.txt - EXACT PINS
fastapi==0.115.6
pydantic==2.10.3
uvicorn[standard]==0.32.1
```

**Either way: Pin the base image**

```dockerfile
# Get digest from: docker pull isleschallenge/deepisles:latest && docker images --digests
FROM isleschallenge/deepisles@sha256:<actual-digest>
```

---

## PART 3: FRONTEND CONFIG

### Current State

- `frontend/.env.production` hardcodes the API URL at build time
- Works, but not flexible for different deployments

### HF Static Spaces Alternative

HF exposes runtime variables via `window.huggingface.variables`. Currently unused.

### Recommendation

Keep the current approach (build-time `.env.production`) unless multi-deployment flexibility is needed.

---

## MIGRATION PLAN

### Phase 1: Config Consolidation (BUG-009)

1. Add to the `core/config.py` Settings class:

   ```python
   # API settings
   results_dir: Path = Path("/tmp/stroke-results")  # Fix default
   max_concurrent_jobs: int = 10
   frontend_origins: list[str] = ["http://localhost:5173"]
   backend_public_url: str | None = None
   ```

2. Update Dockerfile env vars:

   ```dockerfile
   ENV STROKE_DEMO_FRONTEND_ORIGINS='["https://...hf.space"]'
   ENV STROKE_DEMO_BACKEND_PUBLIC_URL=https://...hf.space
   ```

3. Update imports in API modules:
   - `routes.py`, `files.py`, `main.py`, `job_store.py`
   - Change from `api.config` to `core.config.get_settings()`

4. Delete `api/config.py` or convert it to a compatibility shim

### Phase 2: Dependency Reproducibility (BUG-012)

1. Pin the base image with a SHA digest in the Dockerfile
2. Choose ONE lock strategy:
   - **Recommended:** Make the Dockerfile use `uv sync --frozen`
   - Alternative: Generate a pinned requirements.txt from uv.lock
3. Update CI to use the `--frozen` flag (fail if the lock is stale)

### Phase 3: Validation

---

## References

- Docker pinning: https://docs.docker.com/build/building/best-practices/
- 12-Factor Config: https://12factor.net/config

---

**Validated:**
**Status:**
**Added** (resolution summary):

## PART 1: CONFIG DRIFT (BUG-009) - RESOLVED

### Status

- **Consolidated:** `api/config.py` deleted.
- **SSOT:** `core/config.py` now holds all API settings.
- **Env Vars:** `Dockerfile` updated to use the `STROKE_DEMO_*` prefix.
- **Validation:** Tests pass; env var overrides work.

---

## PART 2: DEPENDENCY PINNING (BUG-012) - RESOLVED

### Status

- **Base Image:** Pinned to `sha256:848c9eceb67dbc585bcb37f093389d142caeaa98878bd31039af04ef297a5af4`.
- **Lock File:** Dockerfile now uses `uv sync --frozen` to respect `uv.lock`.
- **Path:** Dockerfile adds `.venv/bin` to `PATH` for correct execution.
- **Dependency Migration:** Migrated from the hard-forked `datasets` to the maintained `neuroimaging-go-brrrr` extension (v0.2.1) plus the standard `datasets` library. Validated end-to-end in Docker.
- **Validation:** Docker build succeeds; runtime verifies settings load and modules are importable.

---

## PART 3: FRONTEND CONFIG - NO ACTION NEEDED

### Status

- Keeping the current build-time `.env.production` approach.
- No immediate need for runtime variables via `window.huggingface.variables`.

---

**Validated:** 2025-12-12
**Status:** COMPLETED
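The consolidated settings read `STROKE_DEMO_*`-prefixed environment variables, with list-valued settings (like the CORS origins) supplied as JSON. A stdlib-only sketch of that override mechanism; the real `core/config.py` uses Pydantic Settings, and the names here mirror the migration plan rather than the actual class:

```python
import json
import os
from pathlib import Path

# Stdlib-only sketch of the STROKE_DEMO_* override mechanism; the real
# core/config.py uses Pydantic Settings, not this code.
PREFIX = "STROKE_DEMO_"
DEFAULTS = {
    "results_dir": Path("/tmp/stroke-results"),
    "frontend_origins": ["http://localhost:5173"],
    "backend_public_url": None,
}

def load_settings(environ=os.environ) -> dict:
    settings = dict(DEFAULTS)
    for key, default in DEFAULTS.items():
        raw = environ.get(PREFIX + key.upper())
        if raw is None:
            continue                        # keep the default
        if isinstance(default, list):
            settings[key] = json.loads(raw)  # list values are JSON-encoded
        elif isinstance(default, Path):
            settings[key] = Path(raw)
        else:
            settings[key] = raw
    return settings

env = {"STROKE_DEMO_FRONTEND_ORIGINS": '["https://example.hf.space"]'}
cfg = load_settings(env)
```

This is why the Dockerfile's `ENV STROKE_DEMO_FRONTEND_ORIGINS='[...]'` value is quoted JSON: the settings layer parses list fields, so a bare comma-separated string would not round-trip.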
@@ -357,24 +357,20 @@ Document in README:

**Removed** (open bug entry):

### BUG-009: FRONTEND_ORIGIN Env Var Not Explicitly Set

**File**: `Dockerfile`

#### Problem

Code supports `FRONTEND_ORIGIN` but the Dockerfile doesn't set it:

```python
FRONTEND_ORIGIN = os.environ.get("FRONTEND_ORIGIN", "")
```

Add to Dockerfile:

```dockerfile
ENV FRONTEND_ORIGIN=...
```

**Added** (resolved entry):

### BUG-009: FRONTEND_ORIGIN Env Var Not Explicitly Set (RESOLVED)

**Resolved:** 2025-12-12 via config consolidation.
Backend now uses `STROKE_DEMO_FRONTEND_ORIGINS` (JSON list) via Pydantic Settings.

Previous state: code supported `FRONTEND_ORIGIN` but the Dockerfile didn't set it:

```python
FRONTEND_ORIGIN = os.environ.get("FRONTEND_ORIGIN", "")
```

Fixed state:

```dockerfile
ENV STROKE_DEMO_FRONTEND_ORIGINS='["https://vibecodermcswaggins-stroke-viewer-frontend.hf.space"]'
```

---
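Since the allowed origins now travel as a JSON list inside a single env var, consuming code has to parse that value before use. A small sketch of the parse plus an allow-list check; in the real app the parsed list is presumably handed to FastAPI's CORS middleware rather than checked by hand:

```python
import json

# The env var value is a JSON array, so json.loads() gives a Python list.
raw = '["https://vibecodermcswaggins-stroke-viewer-frontend.hf.space"]'
allowed_origins = json.loads(raw)

def origin_allowed(origin: str) -> bool:
    """Exact-match allow-list check, as CORS origin comparison requires."""
    return origin in allowed_origins
```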
|
@@ -0,0 +1,340 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
# SPEC-00: Data Loading Refactor
|
| 2 |
+
|
| 3 |
+
> **Status:** Draft (Updated per Senior Review)
|
| 4 |
+
> **Priority:** Critical
|
| 5 |
+
> **Estimated Scope:** Delete ~350 lines, add ~100 lines, rewrite 2 test files
|
| 6 |
+
|
| 7 |
+
---
|
| 8 |
+
|
| 9 |
+
## Problem Statement
|
| 10 |
+
|
| 11 |
+
`stroke-deepisles-demo` has a hand-rolled data loading workaround that:
|
| 12 |
+
|
| 13 |
+
1. **Bypasses `datasets.load_dataset()`** - Uses `HfFileSystem + pyarrow` directly
|
| 14 |
+
2. **Duplicates bug workarounds** - Should live in `neuroimaging-go-brrrr`
|
| 15 |
+
3. **Doesn't use `Nifti()` feature type** - Manually extracts bytes from parquet
|
| 16 |
+
4. **Pre-computes 149 case IDs** - Static list that could drift from source
|
| 17 |
+
|
| 18 |
+
This defeats the purpose of depending on `neuroimaging-go-brrrr`.
|
| 19 |
+
|
| 20 |
+
---
|
| 21 |
+
|
| 22 |
+
## Root Cause
|
| 23 |
+
|
| 24 |
+
The workaround was created due to:
|
| 25 |
+
- PyArrow streaming bug (apache/arrow#45214) - hangs on parquet iteration
|
| 26 |
+
- Memory concerns about downloading 99GB
|
| 27 |
+
|
| 28 |
+
**But:** These should be solved in `neuroimaging-go-brrrr`, not locally.
|
| 29 |
+
|
| 30 |
+
---
|

## Current Architecture (WRONG)

```text
stroke-deepisles-demo
│
├── data/adapter.py (379 lines)
│   ├── HuggingFaceDataset class
│   │   ├── _download_case_from_parquet() - manual parquet reading
│   │   └── Uses HfFileSystem + pyarrow directly
│   └── build_huggingface_dataset() - bypasses load_dataset()
│
├── data/constants.py (182 lines)
│   └── ISLES24_CASE_IDS - pre-computed 149 case IDs
│
├── data/loader.py
│   └── load_isles_dataset() - dispatches to adapter
│
└── tests/data/
    ├── test_hf_adapter.py - tests HuggingFaceDataset (DELETE/REWRITE)
    └── test_loader.py - imports HuggingFaceDataset (UPDATE)
```

---

## What Nifti() Actually Returns

**CRITICAL CORRECTION:** The original spec incorrectly stated `Nifti()` returns numpy arrays.

Per the `datasets` source code:

- **`Nifti(decode=True)`** (default): Returns a `Nifti1ImageWrapper` (subclass of `nibabel.nifti1.Nifti1Image`)
  - The wrapper calls `get_fdata()` in its constructor (eager load to RAM)
  - Preserves affine and header
- **`Nifti(decode=False)`**: Returns a dict `{"path": ..., "bytes": ...}`

This means:

```python
ds = load_dataset("hugging-science/isles24-stroke", split="train")
example = ds[0]
dwi = example["dwi"]  # nibabel.Nifti1Image, NOT a numpy array
```

---

## CaseFiles Contract

The existing `CaseFiles` TypedDict expects **Paths**, not nibabel images:

```python
# core/types.py:12
class CaseFiles(TypedDict):
    dwi: Path
    adc: Path
    flair: NotRequired[Path]
    ground_truth: NotRequired[Path]
```

Downstream code (`pipeline.py:124`) uses:

```python
shutil.copy2(case_files["dwi"], dwi_dest)  # Expects a Path!
```

**The new wrapper MUST continue materializing to temp files to preserve this contract.**

---

## Target Architecture (CORRECT)

```text
stroke-deepisles-demo
│
└── data/loader.py
    └── load_isles_dataset()
          │
          │  from datasets import load_dataset
          │  ds = load_dataset("hugging-science/isles24-stroke")
          ▼
     HuggingFaceDatasetWrapper (thin wrapper)
          │
          │  get_case() materializes nibabel → temp file → Path
          ▼
     CaseFiles (Paths to temp files, same contract as before)
```

---

## Files to Modify

### DELETE

| File | Lines | Reason |
|------|-------|--------|
| `data/constants.py` | 182 | Pre-computed IDs no longer needed |
| `data/adapter.py` (partial) | ~250 | `HuggingFaceDataset` class + `build_huggingface_dataset()` |

**Keep in `adapter.py`:**

- `LocalDataset` class (for local BIDS directories)
- `build_local_dataset()` function
- `parse_subject_id()` helper

### DELETE/REWRITE (Tests)

| File | Reason |
|------|--------|
| `tests/data/test_hf_adapter.py` | Tests the deleted `HuggingFaceDataset` class |
| `tests/data/test_loader.py` | Imports `HuggingFaceDataset` |

### MODIFY

| File | Change |
|------|--------|
| `data/loader.py` | Add `HuggingFaceDatasetWrapper`, use `load_dataset()` |
| `data/__init__.py` | Update exports |

### NO CHANGE NEEDED

| File | Reason |
|------|--------|
| `data/staging.py` | Already handles nibabel via `hasattr(source, "to_filename")` |
| `pipeline.py` | Will work unchanged if the wrapper returns Paths |
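Because `staging.py` dispatches on `hasattr(source, "to_filename")`, any nibabel-like object can be staged to a Path with no staging changes. A stdlib-only sketch of that duck-typing contract (`FakeNifti` and `stage` are hypothetical stand-ins for illustration, not project code):

```python
import tempfile
from pathlib import Path


class FakeNifti:
    """Hypothetical stand-in for nibabel.Nifti1Image (illustration only)."""

    def __init__(self, payload: bytes) -> None:
        self._payload = payload

    def to_filename(self, path: str) -> None:
        # Real nibabel images expose the same method; here we just dump bytes
        Path(path).write_bytes(self._payload)


def stage(source, dest: Path) -> Path:
    # Mirrors the duck-typing branch data/staging.py already uses
    if hasattr(source, "to_filename"):
        source.to_filename(str(dest))
        return dest
    raise TypeError(f"cannot stage {type(source).__name__}")


tmp = Path(tempfile.mkdtemp(prefix="staging_demo_"))
dwi = stage(FakeNifti(b"\x00" * 16), tmp / "sub-001_dwi.nii.gz")
```

Anything without `to_filename` falls through to the other staging branches (or raises), which is why the wrapper can hand either Paths or images downstream.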

---

## Implementation Plan

### Phase 1: Add HuggingFaceDatasetWrapper

```python
# data/loader.py

import shutil
import tempfile
from pathlib import Path
from typing import Self

from stroke_deepisles_demo.core.types import CaseFiles


class HuggingFaceDatasetWrapper:
    """Thin wrapper matching the Dataset protocol.

    Uses datasets.load_dataset() for consumption.
    Materializes nibabel images to temp files to preserve the CaseFiles contract.
    """

    def __init__(self, hf_dataset, dataset_id: str):
        self._ds = hf_dataset
        self._dataset_id = dataset_id
        self._temp_dir: Path | None = None
        self._cached_cases: dict[str, CaseFiles] = {}
        # Build subject_id index once (avoid repeated iteration)
        self._subject_index: dict[str, int] = {
            row["subject_id"]: i for i, row in enumerate(self._ds)
        }

    def __len__(self) -> int:
        return len(self._ds)

    def __enter__(self) -> Self:
        return self

    def __exit__(self, *args: object) -> None:
        self.cleanup()

    def list_case_ids(self) -> list[str]:
        """Return sorted list of subject IDs (uses cached index)."""
        return sorted(self._subject_index.keys())

    def get_case(self, case_id: str | int) -> CaseFiles:
        """Get case - materializes nibabel images to temp files."""
        # Resolve case_id
        if isinstance(case_id, int):
            subject_id = list(self._subject_index.keys())[case_id]
            idx = case_id
        else:
            subject_id = case_id
            idx = self._subject_index[subject_id]

        # Return cached if available
        if subject_id in self._cached_cases:
            return self._cached_cases[subject_id]

        # Create temp dir on first use
        if self._temp_dir is None:
            self._temp_dir = Path(tempfile.mkdtemp(prefix="isles24_hf_"))

        # Get row from dataset (this triggers Nifti() decode)
        row = self._ds[idx]

        # Materialize nibabel images to temp files
        case_dir = self._temp_dir / subject_id
        case_dir.mkdir(exist_ok=True)

        dwi_path = case_dir / f"{subject_id}_dwi.nii.gz"
        adc_path = case_dir / f"{subject_id}_adc.nii.gz"

        # row["dwi"] is a nibabel.Nifti1Image
        row["dwi"].to_filename(str(dwi_path))
        row["adc"].to_filename(str(adc_path))

        case_files: CaseFiles = {
            "dwi": dwi_path,
            "adc": adc_path,
        }

        # Handle lesion_mask if present
        if row.get("lesion_mask") is not None:
            mask_path = case_dir / f"{subject_id}_lesion-msk.nii.gz"
            row["lesion_mask"].to_filename(str(mask_path))
            case_files["ground_truth"] = mask_path

        self._cached_cases[subject_id] = case_files
        return case_files

    def cleanup(self) -> None:
        """Remove temp directory."""
        if self._temp_dir and self._temp_dir.exists():
            shutil.rmtree(self._temp_dir)
            self._temp_dir = None
        self._cached_cases.clear()
```

### Phase 2: Update load_isles_dataset()

```python
def load_isles_dataset(
    source: str | Path | None = None,
    *,
    local_mode: bool | None = None,
) -> Dataset:
    """Load the ISLES24 dataset."""
    if local_mode:
        from stroke_deepisles_demo.data.adapter import build_local_dataset

        return build_local_dataset(Path(source or "data/isles24"))

    # HuggingFace mode - USE STANDARD CONSUMPTION
    from datasets import load_dataset

    dataset_id = str(source) if source else "hugging-science/isles24-stroke"
    ds = load_dataset(dataset_id, split="train")

    return HuggingFaceDatasetWrapper(ds, dataset_id)
```

### Phase 3: Delete Dead Code

1. Delete `data/constants.py` entirely
2. Remove from `adapter.py`:
   - `HuggingFaceDataset` class (lines 143-337)
   - `build_huggingface_dataset()` function (lines 339-378)
3. Delete `tests/data/test_hf_adapter.py`
4. Rewrite `tests/data/test_loader.py` so it no longer imports the deleted classes

### Phase 4: Test

1. Verify `load_isles_dataset()` works with HuggingFace
2. Verify nibabel → temp file materialization works
3. Verify `pipeline.py` still works (`shutil.copy2` on Paths)
4. Run inference end-to-end
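The materialization step above can be smoke-tested without any HuggingFace download by substituting stub images for decoded rows. A pytest-style sketch (all names hypothetical; `StubImage` stands in for a decoded nibabel image):

```python
import shutil
import tempfile
from pathlib import Path


class StubImage:
    """Stands in for a decoded nibabel image (illustration only)."""

    def __init__(self, payload: bytes) -> None:
        self._payload = payload

    def to_filename(self, path: str) -> None:
        Path(path).write_bytes(self._payload)


def materialize_case(row: dict, temp_dir: Path) -> dict[str, Path]:
    """Minimal re-statement of get_case()'s materialization step."""
    subject_id = row["subject_id"]
    case_dir = temp_dir / subject_id
    case_dir.mkdir(parents=True, exist_ok=True)
    files: dict[str, Path] = {}
    for key in ("dwi", "adc"):
        dest = case_dir / f"{subject_id}_{key}.nii.gz"
        row[key].to_filename(str(dest))
        files[key] = dest
    return files


def test_materialization_roundtrip() -> None:
    temp_dir = Path(tempfile.mkdtemp(prefix="isles24_test_"))
    row = {
        "subject_id": "sub-stroke0001",
        "dwi": StubImage(b"dwi"),
        "adc": StubImage(b"adc"),
    }
    files = materialize_case(row, temp_dir)
    assert files["dwi"].read_bytes() == b"dwi"
    assert files["adc"].read_bytes() == b"adc"
    shutil.rmtree(temp_dir)  # mirrors cleanup()
    assert not files["dwi"].exists()


test_materialization_roundtrip()
```

The same stub pattern is what the rewritten `test_loader.py` can use, keeping the slow end-to-end HF path for a separate marked test.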

---

## Risks and Mitigations

| Risk | Severity | Mitigation |
|------|----------|------------|
| **Full dataset download** | HIGH | `load_dataset()` may download all 27GB. Test if streaming or column selection works. May need to file an issue in `neuroimaging-go-brrrr`. |
| **All modalities decoded** | MEDIUM | Accessing `ds[i]` may decode ALL Nifti columns (ncct, cta, ctp, tmax...). Consider `ds.select_columns(["subject_id", "dwi", "adc", "lesion_mask"])` |
| **Eager RAM load** | MEDIUM | `Nifti1ImageWrapper` calls `get_fdata()` in its constructor. For large 4D volumes this could be GBs per modality. |
| **Byte-for-byte fidelity** | LOW | `to_filename()` may re-encode differently than the original bytes. Verify inference results are equivalent. |
| **O(n) index build** | LOW | Building `_subject_index` iterates the full dataset once. Acceptable for 149 rows. |

---

## Performance Considerations

The current adapter downloads ~50MB per case on demand. The new approach may:

1. Download more data upfront (all parquet shards)
2. Decode more modalities than needed

**If this regresses performance significantly:**

1. File an issue in `neuroimaging-go-brrrr` for selective loading
2. Consider `streaming=True` mode (if supported with Nifti)
3. Consider column selection before access

---

## Success Criteria

1. `data/constants.py` deleted
2. `HuggingFaceDataset` class deleted
3. `load_isles_dataset()` uses `datasets.load_dataset()`
4. All tests pass (with rewritten HF tests)
5. Inference works end-to-end
6. No regression in single-case load time (verify <30s)

---

## Dependencies

- `neuroimaging-go-brrrr @ v0.2.1` (already installed)
- Patched `datasets` with `Nifti()` support (provided by `neuroimaging-go-brrrr`)

---

## Open Questions (To Validate During Implementation)

1. Does `load_dataset()` download all shards or lazy-load?
2. Does `ds.select_columns()` prevent unwanted Nifti decodes?
3. Is `streaming=True` compatible with `Nifti()` features?
4. Any byte-for-byte differences when re-encoding via nibabel?

These should be answered by testing, not by adding local workarounds.

@@ -4,6 +4,8 @@
 **Date**: 2025-12-11
 **Goal**: Replace Gradio with React frontend for NiiVue, FastAPI backend for DeepISLES
 
+**UPDATE (2025-12-12):** See `NEXT-CONCERNS.md` for the latest architecture fixes regarding config consolidation (BUG-009) and dependency reproducibility (BUG-012). The env var `FRONTEND_ORIGIN` is now `STROKE_DEMO_FRONTEND_ORIGINS`.
+
 ---
 
 ## Security Note: CVE-2025-55182 Does NOT Affect This App
@@ -98,8 +98,11 @@ If you fork this repository, update these files before deploying:
 VITE_API_URL=https://{YOUR_HF_USERNAME}-stroke-deepisles-demo.hf.space
 ```
 
-2. **Backend CORS** (`src/stroke_deepisles_demo/
-
+2. **Backend CORS** (`src/stroke_deepisles_demo/core/config.py`):
+   Set the `STROKE_DEMO_FRONTEND_ORIGINS` env var (JSON list) on the backend Space:
+   ```bash
+   STROKE_DEMO_FRONTEND_ORIGINS='["https://{YOUR_HF_USERNAME}-stroke-viewer-frontend.hf.space"]'
+   ```
 
 3. **Rebuild frontend**:
 ```bash
@@ -19,8 +19,8 @@ classifiers = [
 keywords = ["stroke", "neuroimaging", "segmentation", "BIDS", "NIfTI", "deep-learning"]
 
 dependencies = [
-    # Core -
-    "
+    # Core - BIDS + NIfTI lazy loading (maintained fork)
+    "neuroimaging-go-brrrr @ git+https://github.com/The-Obstacle-Is-The-Way/neuroimaging-go-brrrr.git@v0.2.1",
     "huggingface-hub>=0.25.0",
 
     # NIfTI handling
@@ -2,9 +2,8 @@
 # Generated: December 2025
 # See: docs/specs/07-hf-spaces-deployment.md
 
-# Core -
-
-git+https://github.com/CloseChoice/datasets.git@c1c15aaa4f00f28f1916f3a896283494162eac49
+# Core - BIDS + NIfTI lazy loading (maintained fork)
+neuroimaging-go-brrrr @ git+https://github.com/The-Obstacle-Is-The-Way/neuroimaging-go-brrrr.git@v0.2.1
 
 # HuggingFace
 huggingface-hub>=0.25.0
@@ -1,16 +0,0 @@
-"""API configuration constants.
-
-Single source of truth for API configuration values.
-"""
-
-from pathlib import Path
-
-# Results directory for job outputs (must be in /tmp for HF Spaces)
-# CRITICAL: This is the single source of truth. Import this instead of hardcoding.
-RESULTS_DIR = Path("/tmp/stroke-results")
-
-# Maximum active jobs (pending + running) accepted by the API
-# This limits how many jobs can be queued/running at once, NOT serialized GPU execution
-# T4 GPU (16GB) can handle ~1-2 concurrent DeepISLES inferences safely
-# Value of 10 allows reasonable queue depth while preventing unbounded accumulation
-MAX_CONCURRENT_JOBS = 10
@@ -18,7 +18,7 @@ Reference: https://github.com/fastapi/fastapi/discussions/7319
 from fastapi import APIRouter, HTTPException
 from fastapi.responses import FileResponse
 
-from stroke_deepisles_demo.
+from stroke_deepisles_demo.core.config import get_settings
 from stroke_deepisles_demo.core.logging import get_logger
 
 logger = get_logger(__name__)

@@ -45,13 +45,14 @@ async def get_result_file(job_id: str, case_id: str, filename: str) -> FileRespo
     404: File not found (job expired, invalid path, or doesn't exist)
     """
     # Construct file path
-
+    results_dir = get_settings().results_dir
+    file_path = results_dir / job_id / case_id / filename
 
     # Security: Ensure path doesn't escape RESULTS_DIR (path traversal protection)
     # Using is_relative_to() instead of startswith() to prevent prefix-collision bypass
     # e.g., /tmp/stroke-results-evil/file.txt would pass startswith but fail is_relative_to
     try:
-        base_dir =
+        base_dir = results_dir.resolve()
         resolved = file_path.resolve()
         if not resolved.is_relative_to(base_dir):
             logger.warning("Path traversal attempt blocked: %s", filename)
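The `is_relative_to()` choice in the hunk above matters because a plain string-prefix check is bypassable by a sibling directory that shares the prefix. A stdlib-only illustration:

```python
from pathlib import Path

base = Path("/tmp/stroke-results")
evil = Path("/tmp/stroke-results-evil/out.nii.gz")  # sibling dir, shared prefix
inside = Path("/tmp/stroke-results/job1/case1/mask.nii.gz")

prefix_check = str(evil).startswith(str(base))  # True - would wrongly allow
robust_check = evil.is_relative_to(base)        # False - correctly blocks
```

The real handler also calls `resolve()` first so that `..` segments and symlinks are collapsed before the containment check runs.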
@@ -25,7 +25,7 @@ from datetime import datetime, timedelta
 from enum import Enum
 from typing import TYPE_CHECKING, Any
 
-from stroke_deepisles_demo.
+from stroke_deepisles_demo.core.config import get_settings
 from stroke_deepisles_demo.core.logging import get_logger
 
 if TYPE_CHECKING:

@@ -138,7 +138,7 @@ class JobStore:
         self._jobs: dict[str, Job] = {}
         self._lock = threading.RLock()
         self._ttl = ttl
-        self._results_dir = results_dir or
+        self._results_dir = results_dir or get_settings().results_dir
         self._cleanup_thread: threading.Thread | None = None
         self._shutdown = threading.Event()
@@ -13,7 +13,6 @@ Architecture designed to work within HuggingFace Spaces constraints:
 - /tmp writable only (results stored there)
 """
 
-import os
 from collections.abc import AsyncIterator
 from contextlib import asynccontextmanager
 from typing import Any

@@ -22,10 +21,10 @@ from fastapi import FastAPI, Request, Response
 from fastapi.middleware.cors import CORSMiddleware
 from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint
 
-from stroke_deepisles_demo.api.config import RESULTS_DIR
 from stroke_deepisles_demo.api.files import files_router
 from stroke_deepisles_demo.api.job_store import init_job_store
 from stroke_deepisles_demo.api.routes import router
+from stroke_deepisles_demo.core.config import get_settings
 from stroke_deepisles_demo.core.logging import get_logger
 
 logger = get_logger(__name__)

@@ -44,6 +43,7 @@ async def lifespan(_app: FastAPI) -> AsyncIterator[None]:
     """
     # Startup
     logger.info("Starting stroke segmentation API...")
+    settings = get_settings()
 
     # Check for GPU availability (DeepISLES requires GPU)
     try:

@@ -58,10 +58,10 @@ async def lifespan(_app: FastAPI) -> AsyncIterator[None]:
         pass  # torch may not be available in all environments
 
     # Create results directory
-
+    settings.results_dir.mkdir(parents=True, exist_ok=True)
 
     # Initialize job store with cleanup scheduler
-    job_store = init_job_store(results_dir=
+    job_store = init_job_store(results_dir=settings.results_dir)
     logger.info("Job store initialized with %d jobs", len(job_store))
 
     yield

@@ -94,21 +94,7 @@ class CORPMiddleware(BaseHTTPMiddleware):
         return response
 
 
-# CORS configuration - Single source of truth
-# Production HF Space frontend origin
-HF_SPACE_FRONTEND = "https://vibecodermcswaggins-stroke-viewer-frontend.hf.space"
-
-CORS_ORIGINS: list[str] = [
-    "http://localhost:5173",  # Vite dev server
-    "http://localhost:3000",  # Alternative local port
-    HF_SPACE_FRONTEND,  # Production HF Space frontend
-]
-
-# Allow override via environment variable (for custom deployments)
-FRONTEND_ORIGIN = os.environ.get("FRONTEND_ORIGIN", "")
-if FRONTEND_ORIGIN and FRONTEND_ORIGIN not in CORS_ORIGINS:
-    CORS_ORIGINS.append(FRONTEND_ORIGIN)
-
+# CORS configuration - Single source of truth from Settings
 # Add CORP middleware first (for COEP compatibility)
 app.add_middleware(CORPMiddleware)

@@ -117,7 +103,7 @@ app.add_middleware(CORPMiddleware)
 # This eliminates regex security concerns while maintaining single source of truth
 app.add_middleware(
     CORSMiddleware,
-    allow_origins=
+    allow_origins=get_settings().frontend_origins,
     allow_credentials=False,  # Not needed - no cookies/auth
     allow_methods=["GET", "POST", "HEAD"],  # HEAD for preflight checks
     allow_headers=["Content-Type", "Range"],  # Range needed for partial content requests

@@ -150,9 +136,10 @@ async def health() -> dict[str, Any]:
     from stroke_deepisles_demo.api.job_store import get_job_store
 
     store = get_job_store()
+    settings = get_settings()
     return {
         "status": "healthy",
         "jobs_in_memory": len(store),
-        "results_dir": str(
-        "results_dir_exists":
+        "results_dir": str(settings.results_dir),
+        "results_dir_exists": settings.results_dir.exists(),
     }
@@ -10,12 +10,10 @@ This pattern avoids HuggingFace Spaces' ~60s gateway timeout.
 
 from __future__ import annotations
 
-import os
 import uuid
 
 from fastapi import APIRouter, BackgroundTasks, HTTPException, Request
 
-from stroke_deepisles_demo.api.config import MAX_CONCURRENT_JOBS, RESULTS_DIR
 from stroke_deepisles_demo.api.job_store import JobStatus, get_job_store
 from stroke_deepisles_demo.api.schemas import (
     CasesResponse,

@@ -24,6 +22,7 @@ from stroke_deepisles_demo.api.schemas import (
     SegmentRequest,
     SegmentResponse,
 )
+from stroke_deepisles_demo.core.config import get_settings
 from stroke_deepisles_demo.core.logging import get_logger
 from stroke_deepisles_demo.data import list_case_ids
 from stroke_deepisles_demo.metrics import compute_volume_ml

@@ -38,12 +37,12 @@ def get_backend_base_url(request: Request) -> str:
     """Get the backend's public URL for building absolute file URLs.
 
     Priority:
-    1. BACKEND_PUBLIC_URL env var
+    1. BACKEND_PUBLIC_URL setting (from env var or config)
     2. Request's base URL (for local development)
     """
-
-    if
-        return
+    settings_url = get_settings().backend_public_url
+    if settings_url:
+        return settings_url.rstrip("/")
     return str(request.base_url).rstrip("/")

@@ -92,7 +91,7 @@ def create_segment_job(
     try:
         # Concurrency limit to prevent GPU memory exhaustion (BUG-006 fix)
         store = get_job_store()
-        if store.get_active_job_count() >=
+        if store.get_active_job_count() >= get_settings().max_concurrent_jobs:
             raise HTTPException(
                 status_code=503,
                 detail="Server busy: too many active jobs. Please try again later.",

@@ -223,7 +222,7 @@ def run_segmentation_job(
     store.update_progress(job_id, 10, "Loading case data...")
 
     # Set up output directory
-    output_dir =
+    output_dir = get_settings().results_dir / job_id
 
     store.update_progress(job_id, 20, "Staging files for DeepISLES...")
@@ -90,7 +90,19 @@ class Settings(BaseSettings):
 
     # Paths
     temp_dir: Path | None = None
-
+    # Results directory - MUST be /tmp for HF Spaces (only /tmp is writable)
+    results_dir: Path = Path("/tmp/stroke-results")
+
+    # API Settings
+    # Concurrency control
+    max_concurrent_jobs: int = 10
+
+    # CORS - frontend origins allowed to call this API
+    frontend_origins: list[str] = Field(default=["http://localhost:5173", "http://localhost:3000"])
+
+    # Public URL for constructing absolute file URLs in responses
+    # If not set, uses request.base_url (works for local dev)
+    backend_public_url: str | None = None
 
     # UI
     gradio_server_name: str = "0.0.0.0"
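The hunks above call `get_settings()` throughout but its definition is not shown in this diff. A typical cached accessor looks like this (a sketch: the dataclass is a stand-in for the pydantic `Settings` in `core/config.py`, not the project's exact code):

```python
from dataclasses import dataclass
from functools import lru_cache
from pathlib import Path


@dataclass(frozen=True)
class Settings:
    """Stand-in for the pydantic Settings in core/config.py (illustration)."""

    results_dir: Path = Path("/tmp/stroke-results")
    max_concurrent_jobs: int = 10


@lru_cache(maxsize=1)
def get_settings() -> Settings:
    # Construct once per process so every call site shares one config object
    return Settings()
```

Caching matters here: env vars are read once at startup, and every module (files, routes, job store, CORS) sees the same values, which is the "single source of truth" this PR is enforcing.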
@@ -582,7 +582,7 @@ wheels = [
 [[package]]
 name = "datasets"
 version = "4.4.2.dev0"
-source = { git = "https://github.com/
+source = { git = "https://github.com/huggingface/datasets.git?rev=004a5bf4addd9293d6d40f43360c03c8f7e42b28#004a5bf4addd9293d6d40f43360c03c8f7e42b28" }
 dependencies = [
     { name = "dill" },
     { name = "filelock" },

@@ -618,6 +618,15 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/33/6b/e0547afaf41bf2c42e52430072fa5658766e3d65bd4b03a563d1b6336f57/distlib-0.4.0-py2.py3-none-any.whl", hash = "sha256:9659f7d87e46584a30b5780e43ac7a2143098441670ff0a49d5f9034c54a6c16", size = 469047 },
 ]
 
+[[package]]
+name = "et-xmlfile"
+version = "2.0.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d3/38/af70d7ab1ae9d4da450eeec1fa3918940a5fafb9055e934af8d6eb0c2313/et_xmlfile-2.0.0.tar.gz", hash = "sha256:dab3f4764309081ce75662649be815c4c9081e88f0837825f90fd28317d4da54", size = 17234 }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/c1/8b/5fe2cc11fee489817272089c4203e679c63b570a5aaeb18d852ae3cbba6a/et_xmlfile-2.0.0-py3-none-any.whl", hash = "sha256:7a91720bc756843502c3b7504c77b8fe44217c85c537d85037f0f536151b2caa", size = 18059 },
 [[package]]
 name = "fastapi"
 version = "0.123.8"

@@ -1567,6 +1576,20 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963 },
 ]
 
 [[package]]
 name = "nibabel"
 version = "5.3.2"

@@ -1615,6 +1638,18 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/16/2e/86f24451c2d530c88daf997cb8d6ac622c1d40d19f5a031ed68a4b73a374/numpy-1.26.4-cp312-cp312-win_amd64.whl", hash = "sha256:08beddf13648eb95f8d867350f6a018a4be2e5ad54c8d8caed89ebca558b2818", size = 15517754 },
 ]
 
 [[package]]
 name = "orjson"
 version = "3.11.4"

@@ -2437,8 +2472,8 @@ name = "stroke-deepisles-demo"
 version = "0.1.0"
 source = { editable = "." }
 dependencies = [
-    { name = "datasets" },
     { name = "huggingface-hub" },
     { name = "nibabel" },
     { name = "numpy" },
     { name = "pydantic" },

@@ -2471,12 +2506,12 @@ dev = [
 
 [package.metadata]
 requires-dist = [
-    { name = "datasets", git = "https://github.com/CloseChoice/datasets.git?rev=c1c15aaa4f00f28f1916f3a896283494162eac49" },
     { name = "fastapi", marker = "extra == 'api'", specifier = ">=0.115.0" },
     { name = "gradio", marker = "extra == 'gradio'", specifier = ">=6.0.0,<7.0.0" },
     { name = "gradio-niivueviewer", marker = "extra == 'gradio'", editable = "packages/niivueviewer" },
     { name = "huggingface-hub", specifier = ">=0.25.0" },
     { name = "matplotlib", marker = "extra == 'gradio'", specifier = ">=3.8.0" },
     { name = "nibabel", specifier = ">=5.2.0" },
     { name = "numpy", specifier = ">=1.26.0,<2.0.0" },
     { name = "pydantic", specifier = ">=2.5.0" },
|
| 628 |
+
]
|
| 629 |
+
|
| 630 |
[[package]]
|
| 631 |
name = "fastapi"
|
| 632 |
version = "0.123.8"
|
|
|
|
| 1576 |
{ url = "https://files.pythonhosted.org/packages/79/7b/2c79738432f5c924bef5071f933bcc9efd0473bac3b4aa584a6f7c1c8df8/mypy_extensions-1.1.0-py3-none-any.whl", hash = "sha256:1be4cccdb0f2482337c4743e60421de3a356cd97508abadd57d47403e94f5505", size = 4963 },
|
| 1577 |
]
|
| 1578 |
|
| 1579 |
+
[[package]]
|
| 1580 |
+
name = "neuroimaging-go-brrrr"
|
| 1581 |
+
version = "0.2.0"
|
| 1582 |
+
source = { git = "https://github.com/The-Obstacle-Is-The-Way/neuroimaging-go-brrrr.git?rev=v0.2.1#97445127210269f1d59fd0bf96fa8127585a7fc1" }
|
| 1583 |
+
dependencies = [
|
| 1584 |
+
{ name = "datasets" },
|
| 1585 |
+
{ name = "hf-xet" },
|
| 1586 |
+
{ name = "huggingface-hub" },
|
| 1587 |
+
{ name = "nibabel" },
|
| 1588 |
+
{ name = "openpyxl" },
|
| 1589 |
+
{ name = "pandas" },
|
| 1590 |
+
{ name = "typer" },
|
| 1591 |
+
]
|
| 1592 |
+
|
| 1593 |
[[package]]
|
| 1594 |
name = "nibabel"
|
| 1595 |
version = "5.3.2"
|
|
|
|
| 1638 |
{ url = "https://files.pythonhosted.org/packages/16/2e/86f24451c2d530c88daf997cb8d6ac622c1d40d19f5a031ed68a4b73a374/numpy-1.26.4-cp312-cp312-win_amd64.whl", hash = "sha256:08beddf13648eb95f8d867350f6a018a4be2e5ad54c8d8caed89ebca558b2818", size = 15517754 },
|
| 1639 |
]
|
| 1640 |
|
| 1641 |
+
[[package]]
|
| 1642 |
+
name = "openpyxl"
|
| 1643 |
+
version = "3.1.5"
|
| 1644 |
+
source = { registry = "https://pypi.org/simple" }
|
| 1645 |
+
dependencies = [
|
| 1646 |
+
{ name = "et-xmlfile" },
|
| 1647 |
+
]
|
| 1648 |
+
sdist = { url = "https://files.pythonhosted.org/packages/3d/f9/88d94a75de065ea32619465d2f77b29a0469500e99012523b91cc4141cd1/openpyxl-3.1.5.tar.gz", hash = "sha256:cf0e3cf56142039133628b5acffe8ef0c12bc902d2aadd3e0fe5878dc08d1050", size = 186464 }
|
| 1649 |
+
wheels = [
|
| 1650 |
+
{ url = "https://files.pythonhosted.org/packages/c0/da/977ded879c29cbd04de313843e76868e6e13408a94ed6b987245dc7c8506/openpyxl-3.1.5-py2.py3-none-any.whl", hash = "sha256:5282c12b107bffeef825f4617dc029afaf41d0ea60823bbb665ef3079dc79de2", size = 250910 },
|
| 1651 |
+
]
|
| 1652 |
+
|
| 1653 |
[[package]]
|
| 1654 |
name = "orjson"
|
| 1655 |
version = "3.11.4"
|
|
|
|
| 2472 |
version = "0.1.0"
|
| 2473 |
source = { editable = "." }
|
| 2474 |
dependencies = [
|
|
|
|
| 2475 |
{ name = "huggingface-hub" },
|
| 2476 |
+
{ name = "neuroimaging-go-brrrr" },
|
| 2477 |
{ name = "nibabel" },
|
| 2478 |
{ name = "numpy" },
|
| 2479 |
{ name = "pydantic" },
|
|
|
|
| 2506 |
|
| 2507 |
[package.metadata]
|
| 2508 |
requires-dist = [
|
|
|
|
| 2509 |
{ name = "fastapi", marker = "extra == 'api'", specifier = ">=0.115.0" },
|
| 2510 |
{ name = "gradio", marker = "extra == 'gradio'", specifier = ">=6.0.0,<7.0.0" },
|
| 2511 |
{ name = "gradio-niivueviewer", marker = "extra == 'gradio'", editable = "packages/niivueviewer" },
|
| 2512 |
{ name = "huggingface-hub", specifier = ">=0.25.0" },
|
| 2513 |
{ name = "matplotlib", marker = "extra == 'gradio'", specifier = ">=3.8.0" },
|
| 2514 |
+
{ name = "neuroimaging-go-brrrr", git = "https://github.com/The-Obstacle-Is-The-Way/neuroimaging-go-brrrr.git?rev=v0.2.1" },
|
| 2515 |
{ name = "nibabel", specifier = ">=5.2.0" },
|
| 2516 |
{ name = "numpy", specifier = ">=1.26.0,<2.0.0" },
|
| 2517 |
{ name = "pydantic", specifier = ">=2.5.0" },
|
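The lock changes above are what `uv lock` would emit after swapping the hard-forked `CloseChoice/datasets` git dependency for `neuroimaging-go-brrrr` in `pyproject.toml`. A minimal sketch of the corresponding declaration, assuming standard `[project]`/`[tool.uv.sources]` layout (the repo's actual `pyproject.toml` is not shown in this diff):

```toml
[project]
dependencies = [
    "huggingface-hub>=0.25.0",
    "neuroimaging-go-brrrr",  # replaces the datasets hard fork
    "nibabel>=5.2.0",
    "numpy>=1.26.0,<2.0.0",
    "pydantic>=2.5.0",
]

[tool.uv.sources]
# Pinning to the v0.2.1 tag gives uv a stable rev; uv.lock then records
# the resolved commit (97445127...) so builds stay reproducible.
neuroimaging-go-brrrr = { git = "https://github.com/The-Obstacle-Is-The-Way/neuroimaging-go-brrrr.git", rev = "v0.2.1" }
```

Note the lock also pulls in the package's transitive dependencies (`openpyxl`, `et-xmlfile`, etc.) automatically; they need no entry in `pyproject.toml`.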