---
license: cc-by-4.0
task_categories:
- other
tags:
- physics
- high-energy-physics
- particle-physics
- tracking
- calorimetry
- machine-learning
- simulation
pretty_name: ColliderML Top-Quark Pair Production (No Pileup)
size_categories:
- 10K<n<100K
configs:
- config_name: particles
  data_files: "https://portal.nersc.gov/cfs/m4958/ColliderML/hard_scatter/ttbar/v1/parquet/truth/particles/*.parquet"
- config_name: tracker_hits
  data_files: "https://portal.nersc.gov/cfs/m4958/ColliderML/hard_scatter/ttbar/v1/parquet/reco/tracker_hits/*.parquet"
- config_name: calo_hits
  data_files: "https://portal.nersc.gov/cfs/m4958/ColliderML/hard_scatter/ttbar/v1/parquet/reco/calo_hits/*.parquet"
- config_name: tracks
  data_files: "https://portal.nersc.gov/cfs/m4958/ColliderML/hard_scatter/ttbar/v1/parquet/reco/tracks/*.parquet"
---

# ColliderML: Top-Quark Pair Production Dataset (ttbar, No Pileup)

## Dataset Description

This dataset contains simulated high-energy physics collision events for top-quark pair (ttbar) production with **no pileup** (single interaction per event). The data is generated using the **Open Data Detector (ODD)** geometry within the **ACTS (A Common Tracking Software)** framework, representing a generic collider detector similar to those at the LHC.

### Dataset Summary

- **Campaign**: `hard_scatter`
- **Process**: Top-quark pair production (ttbar)
- **Version**: `v1`
- **Number of Events**: ~29,000 events (29 files × 1000 events per file)
- **Pileup**: 0 (no additional interactions)
- **Detector**: Open Data Detector (ODD)
- **Format**: Apache Parquet with list columns for variable-length data
- **License**: CC-BY-4.0

### Supported Tasks

This dataset is designed for machine learning tasks in high-energy physics, including:

- **Particle tracking**: Reconstruct charged-particle trajectories from detector hits
- **Track-to-particle matching**: Associate reconstructed tracks with truth particles
- **Jet tagging**: Identify jets originating from top quarks, b-quarks, or light quarks
- **Energy reconstruction**: Predict particle energies from calorimeter deposits
- **Physics analysis**: Event classification (signal vs. background discrimination)
- **Representation learning**: Study hierarchical information at different detector levels

### Languages

N/A (physics data)

## Dataset Structure

### Data Instances

Each row in the Parquet files represents a single collision event. Variable-length quantities (e.g., lists of particles, hits, tracks) are stored as Parquet list columns.

Example event structure:

```python
{
    'event_id': 42,
    'particle_id': [0, 1, 2, 3, ...],   # List of particle IDs
    'pdg_id': [11, -11, 211, ...],      # Particle type codes
    'px': [1.2, -0.5, 3.4, ...],        # Momentum components (GeV)
    'py': [0.8, 1.1, -0.3, ...],
    'pz': [5.2, -2.1, 10.5, ...],
    'energy': [5.5, 2.3, 11.2, ...],
    # ... additional fields
}
```

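Because variable-length fields arrive as list columns, a common first step is flattening one event into a per-particle table. A minimal sketch with pandas and a toy event mimicking the structure above (the values are illustrative, not from the dataset):

```python
import numpy as np
import pandas as pd

# Toy event mimicking the structure above (illustrative values)
event = {
    'event_id': 42,
    'particle_id': [0, 1, 2],
    'pdg_id': [11, -11, 211],
    'px': [1.2, -0.5, 3.4],
    'py': [0.8, 1.1, -0.3],
}

# Expand the list columns into one row per particle
df = pd.DataFrame({k: v for k, v in event.items() if isinstance(v, list)})
df['event_id'] = event['event_id']              # broadcast the scalar column
df['pt'] = np.sqrt(df['px']**2 + df['py']**2)   # transverse momentum (GeV)
print(df[['event_id', 'pdg_id', 'pt']])
```

The same pattern applies to any of the four configs, since they all share the one-row-per-event, list-column layout.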
### Data Fields

The dataset contains four data types organized by detector hierarchy:

#### 1. `particles` (Truth-level)

Truth information about generated particles before detector simulation.

| Field | Type | Description |
|-------|------|-------------|
| `event_id` | int64 | Unique event identifier |
| `particle_id` | list\<int64\> | Unique particle ID within event |
| `pdg_id` | list\<int64\> | PDG particle code (e.g., 11=electron, 13=muon, 211=pion) |
| `mass` | list\<float64\> | Particle rest mass (GeV/c²) |
| `energy` | list\<float64\> | Particle total energy (GeV) |
| `charge` | list\<float64\> | Electric charge (in units of e) |
| `px`, `py`, `pz` | list\<float64\> | Momentum components (GeV/c) |
| `vx`, `vy`, `vz` | list\<float64\> | Vertex position (mm) |
| `time` | list\<float64\> | Production time (ns) |
| `num_tracker_hits` | list\<int64\> | Number of hits in tracker |
| `num_calo_hits` | list\<int64\> | Number of hits in calorimeter |
| `vertex_primary` | list\<int64\> | Primary vertex flag (1=primary, 0=secondary) |
| `parent_id` | list\<float64\> | ID of parent particle |

**Typical event**: ~200-300 particles per event

#### 2. `tracker_hits` (Detector-level)

Digitized spatial measurements from the tracking detector (silicon sensors).

| Field | Type | Description |
|-------|------|-------------|
| `event_id` | int64 | Unique event identifier |
| `x`, `y`, `z` | list\<float64\> | Measured hit position (mm) |
| `true_x`, `true_y`, `true_z` | list\<float64\> | True (simulated) hit position before digitization (mm) |
| `time` | list\<float64\> | Hit time (ns) |
| `particle_id` | list\<int64\> | Truth particle that created this hit |
| `volume_id` | list\<int64\> | Detector volume identifier |
| `layer_id` | list\<int64\> | Detector layer number |
| `surface_id` | list\<int64\> | Sensor surface identifier |
| `cell_id` | list\<int64\> | Cell/pixel identifier |
| `detector` | list\<int64\> | Detector subsystem code |

**Typical event**: ~2,000-3,000 hits per event

#### 3. `calo_hits` (Calorimeter-level)

Energy deposits in the calorimeter system (electromagnetic + hadronic).

| Field | Type | Description |
|-------|------|-------------|
| `event_id` | int64 | Unique event identifier |
| `detector` | list\<string\> | Calorimeter subsystem name |
| `cell_id` | list\<string\> | Calorimeter cell identifier |
| `total_energy` | list\<float64\> | Total energy deposited in cell (GeV) |
| `x`, `y`, `z` | list\<float64\> | Cell center position (mm) |
| `contrib_particle_ids` | list\<list\<int64\>\> | IDs of particles contributing to this cell |
| `contrib_energies` | list\<list\<float64\>\> | Energy contribution from each particle (GeV) |
| `contrib_times` | list\<list\<float64\>\> | Time of each contribution (ns) |

**Note**: Contributions are stored as nested lists because one cell can receive deposits from multiple particles.

**Typical event**: ~500-1,000 calorimeter cells with deposits

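Because the `contrib_*` columns are nested lists (one inner list per cell), per-cell bookkeeping takes two levels of iteration. A minimal sketch with toy values, checking that the contributions in each cell sum to its `total_energy`:

```python
# Toy calo_hits event: two cells with nested contribution lists (illustrative)
cells = {
    'total_energy': [1.5, 0.7],
    'contrib_particle_ids': [[3, 7], [12]],
    'contrib_energies': [[1.0, 0.5], [0.7]],
}

# One sum per cell, taken over that cell's inner contribution list
per_cell_sums = [sum(c) for c in cells['contrib_energies']]
consistent = all(
    abs(s - t) < 1e-9
    for s, t in zip(per_cell_sums, cells['total_energy'])
)
print(per_cell_sums, consistent)
```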
#### 4. `tracks` (Reconstruction-level)

Reconstructed particle tracks from pattern recognition and track fitting algorithms.

| Field | Type | Description |
|-------|------|-------------|
| `event_id` | int64 | Unique event identifier |
| `track_id` | list\<int64\> | Unique track identifier within event |
| `majority_particle_id` | list\<int64\> | Truth particle with most hits on this track |
| `d0` | list\<float64\> | Transverse impact parameter (mm) |
| `z0` | list\<float64\> | Longitudinal impact parameter (mm) |
| `phi` | list\<float64\> | Azimuthal angle (radians) |
| `theta` | list\<float64\> | Polar angle (radians) |
| `qop` | list\<float64\> | Charge divided by momentum (e/GeV) |
| `hit_ids` | list\<list\<int32\>\> | List of tracker hit IDs assigned to this track |

**Track parameters**: Standard ACTS track representation (perigee parameters at origin).

**Derived quantities**:
- Transverse momentum: `pt = abs(1/qop) * sin(theta)`
- Pseudorapidity: `eta = -ln(tan(theta/2))`
- Total momentum: `p = abs(1/qop)`

**Typical event**: ~100-150 reconstructed tracks per event

### Data Splits

Currently, the dataset does not have predefined train/validation/test splits. Users should implement their own splitting strategy based on their use case. Recommended approach:

```python
from sklearn.model_selection import train_test_split

# Example: 70% train, 15% validation, 15% test
all_events = list(range(29000))
train_val, test = train_test_split(all_events, test_size=0.15, random_state=42)
train, val = train_test_split(train_val, test_size=0.176, random_state=42)  # 0.176 * 0.85 ≈ 0.15
```

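Once the event IDs are split, the split can be applied as a filter predicate. A minimal sketch over toy events (with `datasets` in streaming mode you would pass the same predicate to `.filter`):

```python
# Toy stand-in for an event stream; with streaming you would write
# e.g. particles_ds.filter(lambda e: e['event_id'] in train_ids)
train_ids = {0, 2, 4}
events = [{'event_id': i} for i in range(6)]

train_events = [e for e in events if e['event_id'] in train_ids]
print([e['event_id'] for e in train_events])
```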
## Dataset Creation

### Curation Rationale

This dataset was created to support machine learning research in high-energy physics, specifically for:

1. **Benchmarking tracking algorithms**: Compare traditional and ML-based track reconstruction methods
2. **Hierarchical representation learning**: Study information flow from detector hits → tracks → particles
3. **Physics analysis**: Develop ML models for event classification and particle identification
4. **Open science**: Provide publicly accessible, realistic detector simulation data

The ttbar process is chosen because:

- It produces complex final states with many particles
- It is a key signature at hadron colliders such as the LHC
- Top quarks decay to b-quarks and W bosons, and ultimately to jets and leptons
- It is relevant for searches for new physics beyond the Standard Model

### Source Data

#### Initial Data Collection and Normalization

The data is generated through the following simulation chain:

1. **Event Generation**: ttbar events generated using a Monte Carlo event generator
2. **Detector Simulation**: Particle propagation through the Open Data Detector using ACTS
3. **Digitization**: Conversion of energy deposits to realistic detector signals
4. **Reconstruction**: Track finding and fitting using ACTS tracking algorithms
5. **Format Conversion**: EDM4HEP → Parquet using the ColliderML data pipeline

#### Who are the source data producers?

The data is produced by the **ColliderML collaboration** as part of the **ATLAS ITk ML Reconstruction** project at NERSC (National Energy Research Scientific Computing Center).

### Annotations

#### Annotation process

The dataset includes truth-level annotations automatically generated during the simulation:

- **Particle-level truth**: Generator-level particle information
- **Hit-to-particle associations**: Which particle created each detector hit
- **Track-to-particle matching**: `majority_particle_id` links reconstructed tracks to truth particles

These annotations enable supervised learning for tasks such as measuring:

- Track efficiency (was this particle reconstructed?)
- Track purity (how many hits belong to the correct particle?)
- Fake rate (how many tracks are not matched to real particles?)

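The metrics above can be computed directly from `majority_particle_id`. A minimal sketch with toy ID arrays (using -1 to mark a track with no dominant truth particle is an assumption for illustration, not a convention stated by the dataset):

```python
import numpy as np

# Toy truth particles and per-track majority_particle_id (illustrative;
# -1 marks a track without a dominant truth particle)
truth_ids = np.array([0, 1, 2, 3, 4])
track_majority_ids = np.array([0, 1, 1, 3, -1])

# Efficiency: fraction of truth particles matched by at least one track
efficiency = np.isin(truth_ids, track_majority_ids).mean()

# Fake rate: fraction of tracks not matched to any truth particle
fake_rate = np.mean(~np.isin(track_majority_ids, truth_ids))

print(f"efficiency = {efficiency:.2f}, fake rate = {fake_rate:.2f}")
```

Purity would additionally need the per-track `hit_ids` together with the hit-level `particle_id` associations.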
#### Who are the annotators?

N/A (annotations come from simulation ground truth)

### Personal and Sensitive Information

This dataset contains only simulated physics data. No personal or sensitive information is included.

## Considerations for Using the Data

### Social Impact of Dataset

This dataset supports fundamental physics research and ML algorithm development. It has no direct social impact but contributes to:

- Open science and reproducible research
- Education in HEP and ML
- Development of algorithms that may have broader applications (e.g., pattern recognition, tracking in medical imaging)

### Discussion of Biases

As a simulated dataset, biases may arise from:

1. **Generator-level biases**: The event generator's modeling of ttbar production
2. **Detector simulation biases**: Approximations in material interactions and detector response
3. **Reconstruction biases**: Algorithm choices in track finding and fitting
4. **No pileup**: Real LHC data has 20-60 simultaneous collisions; this dataset has only 1

Users should be aware that models trained on this data may not generalize to:

- Real detector data (which requires calibration and alignment)
- Different detector geometries
- Events with pileup

### Other Known Limitations

- **Limited statistics**: ~29,000 events is moderate for ML training (consider data augmentation)
- **Single physics process**: Only ttbar; does not include background processes
- **Idealized detector**: ODD is a generic detector, not an exact replica of ATLAS/CMS
- **No detector inefficiencies**: Assumes 100% hit efficiency (real detectors have dead regions)

## Additional Information

### Dataset Curators

This dataset is maintained by the ColliderML team:

- Primary contact: [danieltm@lbl.gov](mailto:danieltm@lbl.gov)
- Collaboration: ATLAS ITk ML Reconstruction working group
- Infrastructure: NERSC (National Energy Research Scientific Computing Center)

### Licensing Information

This dataset is released under the **Creative Commons Attribution 4.0 International (CC BY 4.0)** license.

You are free to:

- **Share**: Copy and redistribute the material
- **Adapt**: Remix, transform, and build upon the material

Under the following terms:

- **Attribution**: You must give appropriate credit and indicate if changes were made

### Citation Information

If you use this dataset in your research, please cite:

```bibtex
@dataset{colliderml_ttbar_pu0_2024,
  title={ColliderML: Top-Quark Pair Production Dataset (No Pileup)},
  author={ColliderML Collaboration},
  year={2024},
  publisher={NERSC},
  howpublished={\url{https://huggingface.co/datasets/OpenDataDetector/ColliderML_ttbar_pu0}},
  note={Simulation performed using ACTS and the Open Data Detector}
}
```

### Contributions

This dataset was produced using:

- **ACTS (A Common Tracking Software)**: https://acts.readthedocs.io/
- **Open Data Detector**: https://acts.readthedocs.io/en/latest/examples/open_data_detector.html
- **EDM4HEP**: https://edm4hep.web.cern.ch/
- **ColliderML Pipeline**: https://github.com/ATLAS-ITk-ML/colliderml

## How to Use This Dataset

### Loading the Dataset

The dataset is hosted on the NERSC public portal and can be streamed directly without downloading:

```python
from datasets import load_dataset

# Load particles (truth-level)
particles_ds = load_dataset(
    "OpenDataDetector/ColliderML_ttbar_pu0",
    "particles",
    split="train",
    streaming=True,
)

# Load tracker hits
tracker_hits_ds = load_dataset(
    "OpenDataDetector/ColliderML_ttbar_pu0",
    "tracker_hits",
    split="train",
    streaming=True,
)

# Load calorimeter hits
calo_hits_ds = load_dataset(
    "OpenDataDetector/ColliderML_ttbar_pu0",
    "calo_hits",
    split="train",
    streaming=True,
)

# Load reconstructed tracks
tracks_ds = load_dataset(
    "OpenDataDetector/ColliderML_ttbar_pu0",
    "tracks",
    split="train",
    streaming=True,
)
```

### Example: Iterating Over Events

```python
import numpy as np

# Iterate over the first 10 events
for event in particles_ds.take(10):
    event_id = event['event_id']
    n_particles = len(event['particle_id'])

    print(f"Event {event_id}: {n_particles} particles")

    # Access list columns as numpy arrays
    px = np.array(event['px'])
    py = np.array(event['py'])
    pz = np.array(event['pz'])

    # Compute transverse momentum
    pt = np.sqrt(px**2 + py**2)
    print(f"  Mean pt: {pt.mean():.2f} GeV")
```

### Example: Computing Track Features

```python
import numpy as np

for event in tracks_ds.take(5):
    # Get track parameters
    qop = np.array(event['qop'])
    theta = np.array(event['theta'])
    phi = np.array(event['phi'])

    # Compute derived quantities
    pt = np.abs(1.0 / qop) * np.sin(theta)
    eta = -np.log(np.tan(theta / 2.0))

    print(f"Event {event['event_id']}: {len(qop)} tracks")
    print(f"  pt range: [{pt.min():.2f}, {pt.max():.2f}] GeV")
    print(f"  eta range: [{eta.min():.2f}, {eta.max():.2f}]")
```

### Example: Matching Tracks to Particles

```python
import numpy as np
from datasets import load_dataset

# Load both datasets
particles = load_dataset("OpenDataDetector/ColliderML_ttbar_pu0", "particles", split="train", streaming=True)
tracks = load_dataset("OpenDataDetector/ColliderML_ttbar_pu0", "tracks", split="train", streaming=True)

# Process event-by-event
for particle_event, track_event in zip(particles, tracks):
    assert particle_event['event_id'] == track_event['event_id']

    # Create particle ID lookup
    particle_ids = np.array(particle_event['particle_id'])
    particle_pt = np.sqrt(
        np.array(particle_event['px'])**2 +
        np.array(particle_event['py'])**2
    )

    # Get track associations
    track_particle_ids = np.array(track_event['majority_particle_id'])

    # Find matched particles
    for track_idx, pid in enumerate(track_particle_ids):
        if pid in particle_ids:
            particle_idx = np.where(particle_ids == pid)[0][0]
            truth_pt = particle_pt[particle_idx]

            # Compute reconstructed pt
            qop = track_event['qop'][track_idx]
            theta = track_event['theta'][track_idx]
            reco_pt = abs(1.0 / qop) * np.sin(theta)

            print(f"Track {track_idx}: truth pt = {truth_pt:.2f}, reco pt = {reco_pt:.2f} GeV")
```

### Data Location

The Parquet files are hosted at:

```
https://portal.nersc.gov/cfs/m4958/ColliderML/hard_scatter/ttbar/v1/parquet/
├── truth/
│   └── particles/
│       ├── hard_scatter.ttbar.v1.truth.particles.events0-9.parquet
│       ├── hard_scatter.ttbar.v1.truth.particles.events2000-2999.parquet
│       └── ... (29 files total, ~29,000 events)
├── reco/
│   ├── tracker_hits/
│   │   ├── hard_scatter.ttbar.v1.reco.tracker_hits.events0-9.parquet
│   │   └── ... (29 files)
│   ├── calo_hits/
│   │   ├── hard_scatter.ttbar.v1.reco.calo_hits.events0-9.parquet
│   │   └── ... (29 files)
│   └── tracks/
│       ├── hard_scatter.ttbar.v1.reco.tracks.events0-9.parquet
│       └── ... (29 files)
```

### File Naming Convention

Files follow the pattern:

```
<campaign>.<dataset>.<version>.<category>.<object>.<event_range>.parquet
```

Example: `hard_scatter.ttbar.v1.reco.tracks.events0-9.parquet`

- Campaign: `hard_scatter`
- Dataset: `ttbar`
- Version: `v1`
- Category: `reco` (or `truth`)
- Object: `tracks`
- Event range: `events0-9` (inclusive)

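The convention above can be parsed mechanically. A minimal sketch (the helper name is ours; the field names come from the pattern above):

```python
def parse_colliderml_filename(name: str) -> dict:
    """Split a ColliderML file name into its dot-separated fields."""
    stem = name.removesuffix('.parquet')
    campaign, dataset, version, category, obj, event_range = stem.split('.')
    return {
        'campaign': campaign,
        'dataset': dataset,
        'version': version,
        'category': category,
        'object': obj,
        'event_range': event_range,
    }

info = parse_colliderml_filename('hard_scatter.ttbar.v1.reco.tracks.events0-9.parquet')
print(info)
```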
### Performance Tips

1. **Streaming**: Use `streaming=True` to avoid downloading the entire dataset
2. **Batch processing**: Process events in chunks for better memory efficiency
3. **Parallel loading**: Use the `num_proc` parameter for multi-process data loading (non-streaming mode)
4. **Selective loading**: Only load the data types you need (particles, hits, tracks)

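Tips 1 and 2 combine naturally: stream events and process them in fixed-size chunks. A minimal generic sketch (the `batched` helper is plain Python, not a ColliderML or `datasets` API):

```python
from itertools import islice

def batched(iterable, batch_size):
    """Yield lists of up to batch_size items from any iterable."""
    it = iter(iterable)
    while batch := list(islice(it, batch_size)):
        yield batch

# Works on any event iterator, including a streaming split
events = ({'event_id': i} for i in range(10))
for batch in batched(events, 4):
    print(len(batch))  # process the chunk, then let it be garbage-collected
```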
### Related Datasets

- **ColliderML_ttbar_pu200** (coming soon): Same process with 200 pileup interactions
- **ColliderML_higgs_pu0** (coming soon): Higgs boson production without pileup

### Support

For questions, issues, or feature requests:

- Open an issue on GitHub: https://github.com/ATLAS-ITk-ML/colliderml/issues
- Email: danieltm@lbl.gov

### Acknowledgments

This work was supported by:

- ATLAS ITk ML Reconstruction project
- NERSC computing resources
- U.S. Department of Energy, Office of Science

---

**Last updated**: October 2024
**Dataset version**: v1