---
title: MindGrab WebGPU
emoji: 🧠
colorFrom: purple
colorTo: indigo
sdk: static
app_file: dist/index.html
app_build_command: npm run build
license: apache-2.0
pinned: false
---

## Overview

This Space bundles the MindGrab skull-stripping model (and its three TTA variants) exported from brainchop via tinygrad. Everything runs in the browser with WebGPU; MRI data never leaves the client. The UI embeds Niivue for visualization and uses the same conform/normalization steps as the CLI.
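Because inference runs entirely over WebGPU, the page needs a working GPU adapter before it can load the model. A minimal feature-detection sketch, assuming a navigator-shaped parameter so it is testable outside a browser (the helper name and types are illustrative, not the app's actual code):

```typescript
// Illustrative only: the Space's real entry point is not shown in this README.
// Accepts a navigator-like object instead of using the global `navigator`
// directly, so the check can run (and be tested) outside a browser.
type NavigatorLike = {
  gpu?: { requestAdapter(): Promise<unknown | null> };
};

async function webgpuAvailable(nav: NavigatorLike): Promise<boolean> {
  if (!nav.gpu) return false;              // browser exposes no WebGPU API
  const adapter = await nav.gpu.requestAdapter();
  return adapter !== null;                 // null means no usable GPU adapter
}
```

In the browser you would pass the global `navigator` and show an error message when the check fails, since nothing here falls back to CPU inference.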

## Local Development

```sh
cd hf
npm install          # once
npm run dev          # hot reload, http://localhost:5173
npm run build        # produces dist/
npx serve dist       # static preview (or npm run preview)
```

## Deploying to Hugging Face Spaces

Follow the latest Spaces overview docs / Static HTML guide:

1. **Create or open the Space** – visit https://huggingface.co/spaces/neuroneural/mindgrab, choose the `static` SDK, and ensure the README metadata matches the block above (it declares `app_file` and `app_build_command` so Hugging Face runs the same `npm run build` step you run locally).
2. **Authenticate once** – `pip install -U huggingface_hub` if needed, then run `huggingface-cli login` with a write-enabled token.
3. **Clone the Space repo** (instead of initializing a fresh Git repo):

   ```sh
   git clone https://huggingface.co/spaces/neuroneural/mindgrab mindgrab-space
   cd mindgrab-space
   git lfs install  # .gitattributes already tracks .safetensors + .nii.gz
   ```

4. **Sync this project into the clone** (from the brainchop-cli root, run something like):

   ```sh
   rsync -av --delete hf/ /path/to/mindgrab-space/
   ```

   or copy the files manually; the key is to keep `.gitattributes`, `public/`, and the built `dist/` folder together.
5. **Build locally before pushing** (the same command Hugging Face runs server-side):

   ```sh
   npm install
   npm run build
   ```

6. **Commit and push** – every push to `main` triggers an automatic Space rebuild:

   ```sh
   git add -A
   git commit -m "Update MindGrab Space assets"
   git push origin main
   ```
    
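Before the final commit, a quick sanity check that the clone actually contains what the Space needs can save a broken deploy. A hedged sketch (the function name is made up; the file names come from the steps above):

```sh
#!/usr/bin/env sh
# Illustrative pre-push check, not part of the repo's tooling.
# Verifies the files the Space depends on exist in the cloned directory.
check_space_clone() {
  dir="$1"
  for f in .gitattributes dist/index.html; do
    if [ ! -e "$dir/$f" ]; then
      echo "missing: $f" >&2
      return 1
    fi
  done
  echo "ok"
}
```

Run it as `check_space_clone /path/to/mindgrab-space` before `git push`; a non-zero exit means a required file is missing.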

Because .gitattributes tracks .safetensors and .nii.gz, Git LFS handles the large weights/sample volumes automatically. Once pushed, the Space serves the built dist/ bundle and fetches weights from /public, so inference continues entirely client-side over WebGPU.
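For reference, the standard Git LFS tracking rules for those two extensions look like the fragment below (the repo's actual `.gitattributes` may contain additional entries):

```
*.safetensors filter=lfs diff=lfs merge=lfs -text
*.nii.gz filter=lfs diff=lfs merge=lfs -text
```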