<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<title>EEGDash — open catalog of EEG / MEG datasets</title>
<meta name="description" content="Open catalog of 736 EEG/MEG datasets indexed, described, and loadable with one line of Python." />
<link rel="preconnect" href="https://fonts.googleapis.com" />
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
<link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&family=JetBrains+Mono:wght@400;500&display=swap" rel="stylesheet" />
<style>
:root {
--brand:#0072B2; --brand-strong:#005A8F; --brand-soft:#e6f1fa;
--accent:#E69F00; --ok:#009E73; --warn:#D55E00;
--ink:#0F172A; --muted:#64748B; --outline:#E2E8F0;
--surface:#FFFFFF; --subtle:#F8FAFC; --code-bg:#0F172A; --code-ink:#E2E8F0;
}
@media (prefers-color-scheme: dark) {
:root {
--ink:#E2E8F0; --muted:#94A3B8; --outline:#1F2937;
--surface:#0F172A; --subtle:#020617; --brand-soft:#102A42;
}
}
*, *::before, *::after { box-sizing: border-box; }
html, body { margin: 0; padding: 0; }
body {
font-family: Inter, system-ui, -apple-system, sans-serif;
color: var(--ink);
background: var(--subtle);
line-height: 1.55;
font-size: 15px;
-webkit-font-smoothing: antialiased;
}
main {
max-width: 980px;
margin: 0 auto;
padding: 40px 24px 64px;
}
h1 { font-size: 28px; line-height: 1.15; letter-spacing: -0.02em; margin: 32px 0 12px; }
h2 { font-size: 20px; letter-spacing: -0.01em; margin: 40px 0 14px; color: var(--ink); }
p { margin: 0 0 14px; }
em { color: var(--muted); font-style: italic; }
a { color: var(--brand); text-decoration: none; border-bottom: 1px solid transparent; transition: border-color 150ms ease; }
a:hover { border-bottom-color: var(--brand); }
strong { color: var(--ink); }
hr { border: 0; height: 1px; background: var(--outline); margin: 40px 0 24px; }

.hero {
background: var(--surface);
border: 1px solid var(--outline);
border-radius: 16px;
padding: 20px;
box-shadow: 0 1px 2px rgba(15,23,42,.04);
margin-bottom: 20px;
}
.hero svg { width: 100%; height: auto; display: block; }

.tagline { color: var(--muted); font-size: 15px; margin: 18px 0 14px; }

.badges { display: flex; flex-wrap: wrap; gap: 6px; margin: 16px 0 20px; }
.badges img { height: 20px; display: block; }

.intro { font-size: 15px; color: var(--ink); margin: 14px 0; }
.intro a { font-weight: 500; }

.ctas { display: flex; flex-wrap: wrap; gap: 12px; margin: 20px 0 24px; font-size: 14px; }
.ctas a {
padding: 8px 14px; border-radius: 10px; border: 1px solid var(--outline);
background: var(--surface); color: var(--ink); font-weight: 500;
}
.ctas a:hover { border-color: var(--brand); color: var(--brand); }

.shape {
background: var(--surface); border: 1px solid var(--outline);
border-radius: 14px; padding: 18px; margin: 12px 0 24px;
box-shadow: 0 1px 2px rgba(15,23,42,.04);
}
.shape svg { width: 100%; height: auto; display: block; }

.stats-line { font-size: 14.5px; color: var(--ink); margin: 14px 0 28px; line-height: 1.6; }

table {
width: 100%; border-collapse: collapse; margin: 16px 0 24px;
background: var(--surface); border: 1px solid var(--outline); border-radius: 14px;
overflow: hidden; font-size: 13.5px;
}
th {
text-align: left; padding: 12px 14px; background: var(--subtle);
font-weight: 600; font-size: 13px; color: var(--ink);
border-bottom: 1px solid var(--outline);
}
td {
padding: 14px; vertical-align: top; border-top: 1px solid var(--outline);
color: var(--ink);
}
td a { font-weight: 500; }
td code {
font-family: "JetBrains Mono", ui-monospace, monospace;
font-size: 12px; background: var(--brand-soft); color: var(--brand);
padding: 2px 6px; border-radius: 4px;
}

pre {
background: var(--code-bg); color: var(--code-ink);
padding: 16px 18px; border-radius: 10px; overflow-x: auto;
font-family: "JetBrains Mono", ui-monospace, monospace;
font-size: 13px; line-height: 1.6; margin: 14px 0 20px;
}
p code, li code {
font-family: "JetBrains Mono", ui-monospace, monospace;
font-size: 0.9em; background: var(--brand-soft); color: var(--brand);
padding: 1px 6px; border-radius: 4px;
}

ul { padding-left: 22px; }
li { margin: 4px 0; }

.browse-cta {
display: inline-block; margin: 8px 0 24px; padding: 10px 16px;
background: var(--brand); color: #fff; border-radius: 10px;
font-weight: 600; font-size: 14px; border-bottom: none;
}
.browse-cta:hover { background: var(--brand-strong); }

footer {
margin-top: 56px; padding-top: 24px; border-top: 1px solid var(--outline);
color: var(--muted); font-size: 12.5px; text-align: center;
}
</style>
</head>
<body>
<main>


<section class="hero">
<svg viewBox="0 0 900 150" xmlns="http://www.w3.org/2000/svg" role="img" aria-label="EEGDash catalog banner: 736 datasets, 40k subjects, 85k hours">
<rect x="0" y="0" width="900" height="150" fill="#F8FAFC"/>
<rect x="0" y="0" width="900" height="150" fill="#0072B2" fill-opacity="0.035"/>
<text x="28" y="48" font-family="Inter, system-ui, sans-serif" font-size="28" font-weight="700" fill="#0F172A">EEGDash</text>
<text x="28" y="74" font-family="Inter, system-ui, sans-serif" font-size="13" font-weight="400" fill="#64748B">open catalog of EEG / MEG datasets · load with one line of Python</text>
<text x="28" y="118" font-family="Inter, system-ui, sans-serif" font-size="14" fill="#334155"><tspan font-weight="700" fill="#0F172A" font-size="22">736</tspan> datasets · <tspan font-weight="700" fill="#0F172A" font-size="22">40,361</tspan> subjects · <tspan font-weight="700" fill="#0F172A" font-size="22">85,298</tspan> hours</text>
<g stroke-width="1.6" fill="none" opacity="0.9">
<path stroke="#0072B2" d="M560 38 L580 42 L595 30 L612 48 L628 26 L644 40 L662 34 L680 46 L700 30 L720 42 L738 34 L756 44 L776 30 L796 44 L816 34 L836 42 L856 30 L876 40"/>
<path stroke="#009E73" d="M560 62 L582 66 L600 58 L618 72 L636 54 L654 66 L672 60 L692 72 L712 58 L732 68 L750 60 L770 70 L788 58 L810 68 L828 60 L848 66 L868 54 L888 64"/>
<path stroke="#E69F00" d="M560 86 L580 92 L598 82 L616 96 L634 80 L652 92 L670 86 L688 96 L708 82 L726 92 L744 86 L764 94 L782 84 L802 96 L820 86 L840 94 L860 82 L880 92"/>
<path stroke="#D55E00" d="M560 110 L582 114 L600 106 L620 118 L638 100 L656 112 L674 108 L694 116 L714 102 L734 112 L752 106 L772 116 L790 104 L812 114 L830 106 L850 112 L870 100 L890 110"/>
</g>
</svg>
</section>


<p class="tagline"><em>The open catalog of EEG / MEG datasets — indexed, described, and loadable with one line of Python.</em></p>


<div class="badges">
<a href="https://pypi.org/project/eegdash/"><img src="https://img.shields.io/pypi/v/eegdash?style=flat-square&logo=pypi&logoColor=white&color=0072B2" alt="PyPI" /></a>
<a href="https://pypi.org/project/eegdash/"><img src="https://img.shields.io/pypi/pyversions/eegdash?style=flat-square&color=0072B2" alt="Python" /></a>
<a href="https://github.com/eegdash/EEGDash/blob/main/LICENSE"><img src="https://img.shields.io/badge/license-BSD--3--Clause-009E73?style=flat-square" alt="License" /></a>
<a href="https://pepy.tech/project/eegdash"><img src="https://static.pepy.tech/badge/eegdash" alt="Downloads" /></a>
<a href="https://github.com/eegdash/EEGDash"><img src="https://img.shields.io/github/stars/eegdash/EEGDash?style=flat-square&logo=github&color=E69F00" alt="GitHub stars" /></a>
</div>


<p class="intro">Welcome to the official Hugging Face org for <strong><a href="https://eegdash.org">EEGDash</a></strong>. Raw EEG/MEG recordings are never rehosted here: each dataset card is a <strong>pointer</strong> to its canonical source (OpenNeuro, NEMAR, or the lab that collected it), and <code>EEGDashDataset</code> handles download, caching, and conversion to a PyTorch-ready <a href="https://braindecode.org">braindecode</a> object. One CSV drives the whole catalog; every card you see here is regenerated from it automatically.</p>


<div class="ctas">
<a href="https://huggingface.co/spaces/EEGDash/catalog">🗺️ Browse the interactive catalog</a>
<a href="https://eegdash.org">📚 Docs</a>
<a href="https://github.com/eegdash/EEGDash">💻 GitHub</a>
<a href="https://pypi.org/project/eegdash/">📦 PyPI</a>
</div>


<h2>Catalog shape</h2>


<div class="shape">
<svg viewBox="0 0 900 60" xmlns="http://www.w3.org/2000/svg" role="img" aria-label="Datasets by experimental paradigm">
<text x="0" y="12" font-family="Inter, system-ui, sans-serif" font-size="11" font-weight="600" fill="#64748B" letter-spacing="1.2">EXPERIMENTAL PARADIGM</text>
<g>
<rect x="0" y="22" width="510" height="18" fill="#0072B2"/>
<rect x="510" y="22" width="100" height="18" fill="#009E73"/>
<rect x="610" y="22" width="60" height="18" fill="#E69F00"/>
<rect x="670" y="22" width="38" height="18" fill="#D55E00"/>
<rect x="708" y="22" width="30" height="18" fill="#CC79A7"/>
<rect x="738" y="22" width="28" height="18" fill="#56B4E9"/>
<rect x="766" y="22" width="23" height="18" fill="#F0E442"/>
<rect x="789" y="22" width="44" height="18" fill="#999999"/>
<rect x="833" y="22" width="67" height="18" fill="#E2E8F0"/>
</g>
<g font-family="Inter, system-ui, sans-serif" font-size="11" fill="#334155">
<text x="0" y="56">■ Visual 300</text>
<text x="110" y="56">■ Auditory 59</text>
<text x="205" y="56">■ Multi. 35</text>
<text x="280" y="56">■ Other 26</text>
<text x="345" y="56">■ Rest. 22</text>
<text x="408" y="56">■ Motor 17</text>
<text x="468" y="56">■ Tactile 16</text>
<text x="535" y="56">■ Sleep 13</text>
<text x="595" y="56">■ Anesth. 4</text>
<text x="670" y="56" fill="#94A3B8">+ 207 unclassified</text>
</g>
</svg>
</div>


<p class="stats-line"><strong>In numbers:</strong> the archive indexes <strong>736</strong> EEG / MEG datasets totalling <strong>40,361</strong> subjects, <strong>222,750</strong> recordings, and <strong>85,298 hours</strong> of signal. <strong>600+</strong> are already mirrored on 🤗 and growing daily, sourced from <a href="https://openneuro.org"><strong>OpenNeuro</strong></a> (546) and <a href="https://nemar.org"><strong>NEMAR</strong></a> (190). By recording type: <strong>571 EEG · 73 iEEG · 55 MEG · 22 fNIRS</strong>, plus a handful of multimodal combos.</p>


<h2>Featured datasets</h2>


<p>A handful of representative entries, grouped by population. Every slug links to its HF card; every card links back to the canonical source.</p>


<table>
<thead>
<tr>
<th>🟢 Healthy / neurotypical</th>
<th>🟠 Clinical populations</th>
<th>🟡 Developmental (HBN)</th>
</tr>
</thead>
<tbody>
<tr>
<td><strong>ds002718</strong> · Visual, 18 subj<br>Face processing (Wakeman &amp; Henson)<br><a href="https://huggingface.co/datasets/EEGDash/ds002718">HF</a> · <code>Wakeman2015</code></td>
<td><strong>ds003800</strong> · Resting, PD<br>EEG in Parkinson’s disease<br><a href="https://huggingface.co/datasets/EEGDash/ds003800">HF</a></td>
<td><strong>EEG2025r1</strong> · 10 paradigms, 136 subj<br>Healthy Brain Network release 1<br><a href="https://huggingface.co/datasets/EEGDash/eeg2025r1">HF</a> · <code>HBN_r1_bdf</code></td>
</tr>
<tr>
<td><strong>ds000117</strong> · Visual, MEG + EEG<br>Multimodal face processing<br><a href="https://huggingface.co/datasets/EEGDash/ds000117">HF</a> · <code>WakemanHenson_MEEG</code></td>
<td><strong>ds002799</strong> · Clinical monitoring<br>Patient-day recording, dementia<br><a href="https://huggingface.co/datasets/EEGDash/ds002799">HF</a></td>
<td><strong>EEG2025r10</strong> · 8 paradigms, 533 subj<br>HBN release 10 — 32 GB<br><a href="https://huggingface.co/datasets/EEGDash/eeg2025r10">HF</a></td>
</tr>
<tr>
<td><strong>ds000246</strong> · Auditory, MEG<br>CTF 275-channel MEG<br><a href="https://huggingface.co/datasets/EEGDash/ds000246">HF</a></td>
<td><strong>ds004551</strong> · iEEG<br>Intracranial recordings, surgical<br><a href="https://huggingface.co/datasets/EEGDash/ds004551">HF</a></td>
<td><strong>EEG2025r10mini</strong> · 20 subj<br>HBN mini release for tutorials<br><a href="https://huggingface.co/datasets/EEGDash/eeg2025r10mini">HF</a></td>
</tr>
<tr>
<td><strong>ds003061</strong> · Auditory<br>Speech / naturalistic listening<br><a href="https://huggingface.co/datasets/EEGDash/ds003061">HF</a></td>
<td><strong>ds004598</strong> · Motor<br>Motor paradigm study<br><a href="https://huggingface.co/datasets/EEGDash/ds004598">HF</a></td>
<td>… 22 HBN releases total<br><a href="https://huggingface.co/EEGDash?search_datasets=eeg2025">browse all HBN</a></td>
</tr>
</tbody>
</table>


<a class="browse-cta" href="https://huggingface.co/EEGDash">Browse all 736 datasets →</a>


<h2>Get started in 30 seconds</h2>


<pre><code>pip install eegdash</code></pre>


<pre><code>from eegdash import EEGDashDataset

# Load any dataset in the catalog by its ID…
ds = EEGDashDataset(dataset="ds002718", cache_dir="./cache")

# …or by canonical alias — every known name is a registered class:
from eegdash.dataset import Wakeman2015
ds = Wakeman2015(cache_dir="./cache")

# …or pull a Hub-mirrored, pre-windowed Zarr copy:
from braindecode.datasets import BaseConcatDataset
ds = BaseConcatDataset.pull_from_hub("EEGDash/ds002718")

# EEGDash datasets ARE braindecode datasets — plug into PyTorch unchanged.
from torch.utils.data import DataLoader
loader = DataLoader(ds, batch_size=32, shuffle=True)</code></pre>
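<p>The <code>DataLoader</code> step works on anything that follows the standard PyTorch dataset contract of fixed-size windows plus labels, which is all the last two lines above rely on. A self-contained sketch with synthetic data (every class name and shape here is illustrative, not EEGDash’s real schema):</p>

<pre><code>import torch
from torch.utils.data import Dataset, DataLoader

class SyntheticEEGWindows(Dataset):
    """Hypothetical stand-in for a windowed EEG dataset:
    float windows of shape (channels, times) plus an integer label."""
    def __init__(self, n_windows=128, n_channels=32, n_times=200, n_classes=2):
        g = torch.Generator().manual_seed(0)  # reproducible fake signals
        self.X = torch.randn(n_windows, n_channels, n_times, generator=g)
        self.y = torch.randint(0, n_classes, (n_windows,), generator=g)

    def __len__(self):
        return len(self.X)

    def __getitem__(self, i):
        return self.X[i], self.y[i]

loader = DataLoader(SyntheticEEGWindows(), batch_size=32, shuffle=True)
xb, yb = next(iter(loader))
print(tuple(xb.shape), tuple(yb.shape))  # (32, 32, 200) (32,)</code></pre>

<p>Swap the synthetic class for a real windowed dataset from the catalog and the training loop around the loader stays unchanged.</p>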
<h2>Contribute</h2>


<p>Missing a dataset? Wrong metadata? The whole catalog regenerates from one CSV — fix once, propagate everywhere. <strong><a href="https://github.com/eegdash/EEGDash/issues">Open an issue</a></strong> or see <a href="https://github.com/eegdash/EEGDash/blob/main/CONTRIBUTING.md">CONTRIBUTING.md</a>.</p>
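<p>In spirit, that regeneration step is just a loop over CSV rows. A minimal sketch, assuming hypothetical column names (the real schema lives in the EEGDash repo):</p>

<pre><code>import csv
import io

# Illustrative rows only; the real catalog CSV ships with EEGDash
# and its columns may differ.
CATALOG_CSV = """dataset,modality,n_subjects,source
ds002718,EEG,18,OpenNeuro
ds000117,MEG+EEG,19,OpenNeuro
"""

def load_catalog(text):
    """Parse the catalog and validate numeric fields early,
    so a bad row fails the build instead of shipping a bad card."""
    rows = list(csv.DictReader(io.StringIO(text)))
    for row in rows:
        row["n_subjects"] = int(row["n_subjects"])
    return rows

def render_card(row):
    # One generated card per row: fix the CSV once, every card updates.
    return f"{row['dataset']} · {row['modality']} · {row['n_subjects']} subj · {row['source']}"

for row in load_catalog(CATALOG_CSV):
    print(render_card(row))</code></pre>

<p>Because a fix to one field propagates to the generated card on the next build, metadata corrections are best filed against the CSV rather than individual cards.</p>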
<h2>Cite</h2>


<p>If you use EEGDash in your research, please cite the software entry below (and the companion paper once it’s available):</p>


<pre><code>@software{eegdash,
  title = {EEG-DaSh: an open data, tool, and compute resource for machine learning on neuroelectromagnetic data},
  author = {Aristimunha, Bruno and Dotan, Aviv and Guetschel, Pierre and Truong, Dung
            and Kokate, Kuntal and Jaiswal, Aman and Majumdar, Amitrava
            and Shirazi, Seyed Yahya and Shriki, Oren and Delorme, Arnaud},
  year = {2026},
  version = {0.6.0},
  license = {BSD-3-Clause},
  url = {https://eegdash.org},
  howpublished = {\url{https://github.com/eegdash/EEGDash}}
}</code></pre>


<p>When you use a specific dataset, always follow its upstream citation policy — the link lives in every dataset’s HF card under <em>How to cite</em>.</p>


<footer>
EEGDash code is BSD-3-Clause. Each dataset retains its upstream license — always check the card before redistribution. <em>Open, indexed, loadable.</em>
</footer>


</main>
</body>
</html>