📘 Quantum Noise Robustness Benchmark Guide
Welcome to the Quantum Noise Robustness Benchmark.
This tool demonstrates how Machine Learning can predict the impact of noise on quantum circuits using only structural and topological features, without running any expensive noisy simulations.
⚠️ Important: Demo Dataset Notice
This Hub uses v1.0.0-demo shards of the QSBench dataset family.
- Limited Scale: Only a small subset of circuits is loaded for fast demonstration.
- Complexity: Predicting quantum observables from pure structure is a non-trivial mapping.
- Goal: To showcase the correlation between circuit topology and noise sensitivity, not to achieve production-level $R^2$ on a limited sample.
🎯 1. What is Being Predicted?
The model performs multi-target regression to estimate how much noise distorts the final signal.
Targets (The Error Vector)
- `error_Z_global` → deviation in the Z-basis expectation value
- `error_X_global` → deviation in the X-basis expectation value
- `error_Y_global` → deviation in the Y-basis expectation value

Formula: `error = noisy_expval - ideal_expval`
Unlike predicting the state itself, predicting the error shift allows us to understand the "noise fingerprint" left by the circuit's architecture.
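In code, computing this target is a one-liner: the signed gap between the noisy and ideal expectation values. A minimal sketch with illustrative numbers (the array names and values are made up, not from the dataset):

```python
import numpy as np

# Illustrative expectation values for three circuits (names and values are made up)
ideal_expvals = np.array([0.98, -0.41, 0.07])   # noiseless simulation
noisy_expvals = np.array([0.83, -0.35, 0.02])   # noisy simulation

# The regression target: the signed deviation the noise channel introduces
error = noisy_expvals - ideal_expvals
print(error)  # ≈ [-0.15, 0.06, -0.05]
```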
🧩 2. How the Model "Sees" a Circuit
The model never simulates quantum states. It uses structural proxies to guess the noise impact:
🔹 Topology Features
- `adj_density` → how densely qubits are connected (proxy for crosstalk risk).
- `adj_degree_mean` → average connectivity (proxy for entanglement speed).
🔹 Complexity & Entanglement
- `depth` / `total_gates` → length of the decoherence window.
- `cx_count` / `two_qubit_gates` → the most noise-sensitive components in NISQ hardware.
- `gate_entropy` → measures circuit regularity vs. randomness.
🔹 QASM Signals
- `qasm_length` & `gate_keyword_count` → capture the overall "instruction weight".
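As an illustration of how such proxies can be computed, here is a rough sketch that derives a few of the listed features from a raw OpenQASM 2.0 string. This is a simplified stand-in, not the actual QSBench extraction pipeline:

```python
import math
import re
from collections import Counter

def structural_features(qasm: str) -> dict:
    """Toy extraction of structural proxies from an OpenQASM 2.0 string."""
    # Keep only instruction lines (drop header, includes, register declarations, comments)
    lines = [
        ln.strip() for ln in qasm.splitlines()
        if ln.strip() and not ln.strip().startswith(("OPENQASM", "include", "qreg", "creg", "//"))
    ]
    gate_names = [m.group(0) for ln in lines if (m := re.match(r"[a-z][a-z0-9]*", ln))]
    counts = Counter(gate_names)
    total = sum(counts.values())
    # gate_entropy: Shannon entropy of the gate-type distribution (regular vs. random circuits)
    entropy = -sum((c / total) * math.log2(c / total) for c in counts.values()) if total else 0.0
    return {
        "qasm_length": len(qasm),
        "gate_keyword_count": total,
        "cx_count": counts.get("cx", 0),
        "gate_entropy": entropy,
    }

qasm = """OPENQASM 2.0;
include "qelib1.inc";
qreg q[2];
h q[0];
cx q[0],q[1];
cx q[1],q[0];"""
print(structural_features(qasm))
```

A circuit dominated by one gate type yields low `gate_entropy`, while a varied gate mix pushes it higher.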
🤖 3. Technical Overview: The ML Pipeline
To handle the non-linear nature of quantum data, we use:
- HistGradientBoostingRegressor: A high-performance boosting algorithm designed for large tabular data.
- MultiOutput Wrapper: Trains one booster per axis ($X, Y, Z$) so all three error components come from a single model interface.
- Robust Preprocessing: Median imputation for missing values and Standard Scaling for feature parity.
📊 4. Interpreting the Analytics
A. Physics Emulation Plot (Crucial!)
- Gray Points: Actual simulated noisy values.
- Red Points: ML-predicted noisy values ($Ideal + Predicted Error$).
- Insight: If red points follow the trend of gray points, the model has successfully "learned" the physics of the noise channel without a simulator.
B. Why is my $R^2$ near Zero?
Even with 200,000+ samples, structural metrics alone (like depth or entropy) provide a "complexity baseline" but do not capture specific gate rotation angles.
The Result: Standard regressors (Random Forest/XGBoost) will hit a performance ceiling near $R^2 \approx 0$, as they see the circuit's skeleton but not its parameters.
The Opportunity: This makes QSBench the perfect playground for Graph Neural Networks (GNN) and Geometric Deep Learning, where models can integrate gate parameters as node features to break this "structural ceiling."
🧪 5. Experimentation Tips
- Isolate Topology: Select only `adj_*` features to see how much qubit mapping alone affects noise.
- The "CX" Test: Remove `cx_count` and see how much the MAE increases. This quantifies the "cost" of entanglement in your noise model.
- Iteration Scaling: Increase Max Iterations (400 → 800) to see if the model can find deeper patterns in the demo data.
🔬 6. Key Insight
Noise is not random. It is a deterministic function of circuit complexity and hardware topology. Even without a quantum simulator, ML can "guess" the fidelity of a result just by looking at the circuit diagram.
📚 7. Project Resources
- 🤗 Hugging Face: Datasets & Shards
- 💻 GitHub: Source Code & Tools
- 🛒 Official Store: Get Full-Scale Datasets