FelixzeroSun committed · verified
Commit 85a5b9d · Parent(s): 19c1f58

Upload folder using huggingface_hub

Files changed (1): README.md (+68 −0)
# SynthRAD2025 – Task 1 & Task 2 Solutions

[![MICCAI](https://img.shields.io/badge/MICCAI-Grand%20Challenge-blue)](https://grand-challenge.org/)
[![Docker](https://img.shields.io/badge/Docker-ready-brightgreen)](https://www.docker.com/)
[![License](https://img.shields.io/badge/license-MIT-lightgrey)](LICENSE)

This repository contains our solutions for the **MICCAI Grand Challenge – SynthRAD2025**, focusing on **Task 1** and **Task 2**.
Our team achieved **1st place** on the **post-challenge leaderboard** for both tasks (with **Task 1 also ranking 1st during the official test phase**).
9
---

## 🏆 Challenge Overview

- **Task 1**: MRI-to-CT synthesis (**MR → sCT**)
- **Task 2**: CBCT-to-CT synthesis (**CBCT → sCT**)

Our methods emphasize **robust image synthesis**, **reproducible pipelines**, and **multi-region generalization**.

---
20
### File Descriptions

- **`docker_task_1/`**
  Contains all files needed to build and run the Docker image for **Task 1** (MR → sCT).
  - `process.py`: Inference script that converts MR images into synthetic CT (sCT).

- **`docker_task_2/`**
  Contains all files needed to build and run the Docker image for **Task 2** (CBCT → sCT).
  - `process.py`: Inference script that converts CBCT images into synthetic CT (sCT).

- **Normalization config files**
  - `260_gt_nnUNetResEncUNetLPlans.json`: Normalization configuration for the **Abdomen** region.
  - `262_gt_nnUNetResEncUNetLPlans.json`: Normalization configuration for the **Head & Neck** region.
  - `266_gt_nnUNetResEncUNetLPlans.json`: Normalization configuration for the **Thorax** region.

  These files are essential for inverse normalization, ensuring that the synthesized CT intensities are mapped back to their correct clinical ranges.

- **`nnunet_results/`**
  Place your **trained models** under this directory so that inference can locate and load them.

- **`Dockerfile`**
  Defines the steps and dependencies needed to build the Docker image, ensuring reproducibility and consistency across environments.

- **`base_algorithm/`**
  Baseline algorithm files provided by the official **SynthRAD2025 template**, serving as the foundation for our solution.

- **`build.sh`**
  Shell script that automates the Docker build.

- **`export.sh`**
  Shell script that exports the built Docker image into a compressed archive for submission or deployment.

- **`requirements.txt`**
  Python dependencies required to run the code.

- **`revert_normalisation.py`**
  Applies **inverse normalization** to synthesized CT outputs, restoring the correct intensity distributions for downstream evaluation.

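Conceptually, the inverse-normalization step undoes the z-score scaling applied during preprocessing. The sketch below is illustrative only: `revert_zscore` is a stand-in for the repository's `revert_normalisation.py`, and the mean/std values are made up, since the real statistics come from the region-specific `*_nnUNetResEncUNetLPlans.json` files.

```python
import numpy as np

def revert_zscore(sct: np.ndarray, mean: float, std: float) -> np.ndarray:
    """Invert z-score normalization: map network outputs back to CT-like intensities."""
    return sct * std + mean

# Illustrative statistics only -- the real values are read from the
# region-specific plans JSON (e.g. 260_gt_nnUNetResEncUNetLPlans.json).
normalized = np.array([-1.0, 0.0, 1.0])
restored = revert_zscore(normalized, mean=40.0, std=350.0)
# restored: -310.0, 40.0, 390.0
```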
60
## 🚀 Getting Started

### 1. Build the Docker Image

```bash
cd docker_task_1
bash build.sh
```

### 2. Test the Algorithm Locally

To test the algorithm locally, run the Docker container with GPU support, a memory limit, and an enlarged shared-memory (`/dev/shm`) size (e.g., 8 GB).
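As a sketch, such a local test run could look like the command below. The image tag `synthrad_task1`, the memory limit, and the mount paths are placeholders, not values from this repository — substitute whatever tag your `build.sh` produces and your actual input/output directories.

```shell
# Placeholder invocation: adjust image tag, memory limit, and mount paths.
docker run --rm \
    --gpus all \
    --memory=32g \
    --shm-size=8g \
    -v "$(pwd)/test_input:/input:ro" \
    -v "$(pwd)/test_output:/output" \
    synthrad_task1
```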