QSBench committed on
Commit
4629295
·
verified ·
1 Parent(s): bf47ebc

Update GUIDE.md

Files changed (1)
  1. GUIDE.md +43 -68
GUIDE.md CHANGED
@@ -1,120 +1,95 @@
  # 🌌 Quantum Noise Robustness Benchmark Guide

  Welcome to the **Quantum Noise Robustness Benchmark**.
- This tool shows how well Machine Learning can **predict the impact of noise** on quantum circuits using only structural and topological features — without running any noisy simulation.

  ---

  ## ⚠️ Important: Demo Dataset Notice

- This Space uses the **demo shard** of the QSBench Amplitude Damping dataset.

- - **Limited size**: Only a small subset of circuits is loaded for fast demonstration.
- - **Impact**: Results may vary depending on the random split and selected features.
- - **Goal**: Showcase how circuit structure correlates with noise-induced errors — not achieve production-level accuracy.

  ---

  ## 🎯 1. What is Being Predicted?

- The model performs **multi-output regression** and predicts **three error values** simultaneously:

- ### Targets
  - **`error_Z_global`** — deviation in Z-basis expectation value
  - **`error_X_global`** — deviation in X-basis expectation value
  - **`error_Y_global`** — deviation in Y-basis expectation value

- These errors are calculated as:
- **`error = noisy_expval - ideal_expval`**

- The goal is to estimate **how much noise distorts** the observable without actually applying the noise channel.

  ---

  ## 🧩 2. How the Model “Sees” a Circuit

- The model never simulates quantum states or noise.
- It only uses **structural proxies**:

  ### 🔹 Topology Features
- - `adj_density` — how densely qubits are connected
- - `adj_degree_mean` / `adj_degree_std` — average and variability of connectivity

- ### 🔹 Gate Structure & Complexity
- - `depth`, `total_gates`, `cx_count`, `two_qubit_gates`
- - `gate_entropy` — measure of randomness vs regularity

- ### 🔹 QASM-derived Signals
- - `qasm_length`, `qasm_line_count`, `qasm_gate_keyword_count`
-
- These features capture **entanglement potential** and **circuit complexity** — the main factors that determine noise sensitivity.
-
- ---
-
- ## 🤖 3. Model Overview
-
- The system uses:
-
- ### MultiOutput HistGradientBoostingRegressor
- - Fast and accurate gradient boosting
- - Predicts all three errors (`Z`, `X`, `Y`) at once
- - Pipeline includes median imputation + standard scaling
-
- This is currently the strongest and fastest model for tabular quantum data.

  ---

- ## 📊 4. Understanding the Results
-
- After clicking **"Train Multi-Output Regressor"** you get:

- ### A. Predicted vs True Error (three plots)
- - Each point = one circuit
- - Red dashed line = perfect prediction
- - The tighter the points around the line → the better the model

- ### B. Residual Distribution
- - Shows prediction errors (`True - Predicted`)
- - Centered around zero = unbiased model
- - Narrow spread = high precision

  ---

- ## 📉 5. Metrics Explained

- For each basis (**Z**, **X**, **Y**) the model reports:

- - **MAE** — average absolute error (in expectation value units)
- - **RMSE** — root mean squared error (penalizes large mistakes)
- - **R²** — coefficient of determination (1.0 = perfect fit, 0 = no better than mean)

- Higher R² and lower MAE/RMSE = better noise robustness prediction.

  ---

- ## 🧪 6. Experimentation Tips

- Try these experiments to understand the model better:
-
- - Use **only topology features** (`adj_*`) → isolate structural effect
- - Remove `cx_count` → see how much two-qubit gates matter
- - Increase **Max Iterations** to 600–800 for more stable predictions
- - Change **Test Split** and re-train several times → check robustness
- - Compare results with and without `gate_entropy`

  ---

- ## 🔬 7. Key Insight
-
- > Noise does not appear randomly — it leaves clear fingerprints in circuit structure.
- Even without running expensive noisy simulations, features like connectivity, depth, and gate counts already contain enough signal to predict how much the expectation values will be distorted.

- This demonstrates the power of **structure-aware** quantum machine learning for noise benchmarking.

  ---

- ## 🔗 8. Project Resources

- - 🤗 **Hugging Face**: [https://huggingface.co/QSBench](https://huggingface.co/QSBench)
- - 💻 **GitHub**: [https://github.com/QSBench](https://github.com/QSBench)
- - 🌐 **Website**: [https://qsbench.github.io](https://qsbench.github.io)
  # 🌌 Quantum Noise Robustness Benchmark Guide

  Welcome to the **Quantum Noise Robustness Benchmark**.
+ This tool demonstrates how Machine Learning can **predict the impact of noise** on quantum circuits using only structural and topological features — without running any expensive noisy simulations.

  ---

  ## ⚠️ Important: Demo Dataset Notice

+ This Hub uses **v1.0.0-demo shards** of the QSBench dataset family.

+ - **Limited Scale**: Only a small subset of circuits is loaded for fast demonstration.
+ - **Complexity**: Predicting quantum observables from pure structure is a **non-trivial mapping**.
+ - **Goal**: Showcase the correlation between circuit topology and noise sensitivity — not to achieve production-level $R^2$ on a limited sample.

  ---

  ## 🎯 1. What is Being Predicted?

+ The model performs **multi-target regression** to estimate how much noise distorts the final signal.

+ ### Targets (The Error Vector)
  - **`error_Z_global`** — deviation in Z-basis expectation value
  - **`error_X_global`** — deviation in X-basis expectation value
  - **`error_Y_global`** — deviation in Y-basis expectation value

+ **Formula:** `error = noisy_expval - ideal_expval`

+ Unlike predicting the state itself, predicting the **error shift** allows us to understand the "noise fingerprint" left by the circuit's architecture.
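The targets above can be sketched numerically. The expectation values below are hypothetical placeholders, not numbers from the dataset:

```python
# Hypothetical ideal vs. noisy expectation values for one circuit.
ideal_expval = {"Z": 0.92, "X": 0.10, "Y": -0.05}
noisy_expval = {"Z": 0.71, "X": 0.08, "Y": -0.04}

# The three regression targets: error = noisy_expval - ideal_expval
targets = {
    f"error_{basis}_global": noisy_expval[basis] - ideal_expval[basis]
    for basis in ("Z", "X", "Y")
}
print(targets)
```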
  ---

  ## 🧩 2. How the Model “Sees” a Circuit

+ The model never simulates quantum states. It uses **structural proxies** to guess the noise impact:

  ### 🔹 Topology Features
+ - `adj_density` — how densely qubits are connected (proxy for crosstalk risk).
+ - `adj_degree_mean` — average connectivity (proxy for entanglement speed).
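As a rough sketch, such topology proxies can be derived from a qubit-coupling adjacency matrix. The 4-qubit line topology and the exact formulas here are illustrative; QSBench's own extraction may differ:

```python
import numpy as np

# Symmetric 0/1 adjacency matrix of qubit couplings (zero diagonal).
# Example: a 4-qubit line topology 0-1-2-3.
adj = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])
n = adj.shape[0]
degrees = adj.sum(axis=1)  # couplings per qubit

features = {
    "adj_density": adj.sum() / (n * (n - 1)),  # fraction of possible directed edges
    "adj_degree_mean": degrees.mean(),
    "adj_degree_std": degrees.std(),
}
print(features)
```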
+ ### 🔹 Complexity & Entanglement
+ - `depth` / `total_gates` — length of the decoherence window.
+ - `cx_count` / `two_qubit_gates` — the most noise-sensitive components in NISQ hardware.
+ - `gate_entropy` — measures circuit regularity vs. randomness.

+ ### 🔹 QASM Signals
+ - `qasm_length` & `gate_keyword_count` — capture the overall "instruction weight".
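A minimal sketch of how such QASM-derived signals could be extracted. The parsing rules (and the crude first-token gate tokenizer) are illustrative assumptions, not QSBench's actual implementation:

```python
import math
from collections import Counter

def structural_features(qasm: str) -> dict:
    """Illustrative QASM-derived proxies; names follow the guide."""
    lines = [ln.strip() for ln in qasm.splitlines() if ln.strip()]
    # Crude gate tokenization: first word of each instruction line,
    # skipping headers and register declarations.
    gates = [ln.split()[0] for ln in lines
             if not ln.startswith(("OPENQASM", "include", "qreg", "creg"))]
    counts = Counter(gates)
    total = sum(counts.values())
    # Shannon entropy of the gate-type distribution (a gate_entropy proxy).
    entropy = -sum((c / total) * math.log2(c / total)
                   for c in counts.values()) if total else 0.0
    return {
        "qasm_length": len(qasm),
        "qasm_line_count": len(lines),
        "qasm_gate_keyword_count": total,
        "gate_entropy": entropy,
    }

demo = """OPENQASM 2.0;
include "qelib1.inc";
qreg q[2];
h q[0];
cx q[0],q[1];
"""
print(structural_features(demo))
```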
  ---

+ ## 🤖 3. Technical Overview: The ML Pipeline

+ To handle the non-linear nature of quantum data, we use:

+ - **HistGradientBoostingRegressor**: A high-performance boosting algorithm designed for large tabular data.
+ - **MultiOutput Wrapper**: Ensures all three axes ($X, Y, Z$) are learned in a unified context.
+ - **Robust Preprocessing**: Median imputation for missing values and Standard Scaling for feature parity.
 
  ---

+ ## 📊 4. Interpreting the Analytics

+ ### A. Physics Emulation Plot (Crucial!)
+ - **Gray Points**: Actual simulated noisy values.
+ - **Red Points**: ML-predicted noisy values (`ideal_expval` + predicted error).
+ - **Insight**: If the red points follow the trend of the gray points, the model has successfully "learned" the physics of the noise channel without a simulator.

+ ### B. Why is my $R^2$ near Zero?
+ Even with 200,000+ samples, structural metrics alone (like `depth` or `gate_entropy`) provide a "complexity baseline" but do not capture specific gate rotation angles.

+ 1. **The Result:** Standard regressors (Random Forest/XGBoost) will hit a performance ceiling near $R^2 \approx 0$, as they see the circuit's skeleton but not its parameters.
+ 2. **The Opportunity:** This makes QSBench the perfect playground for **Graph Neural Networks (GNNs)** and **Geometric Deep Learning**, where models can integrate gate parameters as node features to break this "structural ceiling."
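For reference, the per-basis numbers behind this discussion (MAE, RMSE, $R^2$) are computed as below; the true/predicted values here are made up for illustration:

```python
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

# Hypothetical true vs. predicted error_Z_global values for a few circuits.
y_true = np.array([-0.21, -0.05, 0.12, -0.30])
y_pred = np.array([-0.18, -0.07, 0.10, -0.25])

mae = mean_absolute_error(y_true, y_pred)           # average absolute error
rmse = mean_squared_error(y_true, y_pred) ** 0.5    # penalizes large mistakes
r2 = r2_score(y_true, y_pred)                       # 1.0 = perfect fit
print(mae, rmse, r2)
```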
  ---

+ ## 🧪 5. Experimentation Tips

+ - **Isolate Topology**: Select only `adj_*` features to see how much qubit mapping alone affects noise.
+ - **The "CX" Test**: Remove `cx_count` and see how much the MAE increases. This quantifies the "cost" of entanglement in your noise model.
+ - **Iteration Scaling**: Increase **Max Iterations** (400 → 800) to see if the model can find deeper patterns in the demo data.
 
 
 
 
  ---

+ ## 🔬 6. Key Insight

+ > **Noise is not random.** It is a deterministic function of circuit complexity and hardware topology. Even without a quantum simulator, ML can "guess" the fidelity of a result just by looking at the circuit diagram.

  ---

+ ## 🔗 7. Project Resources

+ - 🤗 **Hugging Face**: [Datasets & Shards](https://huggingface.co/QSBench)
+ - 💻 **GitHub**: [Source Code & Tools](https://github.com/QSBench)
+ - 🌐 **Official Store**: [Get Full-Scale Datasets](https://qsbench.bgng.io)