# SynthRAD2025 – Task 1 & Task 2 Solutions

[![MICCAI](https://img.shields.io/badge/MICCAI-Grand%20Challenge-blue)](https://grand-challenge.org/)
[![Docker](https://img.shields.io/badge/Docker-ready-brightgreen)](https://www.docker.com/)
[![License](https://img.shields.io/badge/license-MIT-lightgrey)](LICENSE)

This repository contains our solutions for the **MICCAI Grand Challenge – SynthRAD2025**, focusing on **Task 1** and **Task 2**.  
Our team achieved **1st place** on the **post-challenge leaderboard** for both tasks (with **Task 1 also ranking 1st during the official test phase**).

---

## πŸ† Challenge Overview

- **Task 1**: MRI-to-CT synthesis (**MR β†’ sCT**)  
- **Task 2**: CBCT-to-CT synthesis (**CBCT β†’ sCT**)  

Our methods emphasize **robust image synthesis**, **reproducible pipelines**, and **multi-region generalization**.

---


### File Descriptions

- **`docker_task_1/`**  
  Contains all necessary files to build and run the Docker image for **Task 1** (MR β†’ sCT).  
  - `process.py`: Script that performs inference, converting MR images into synthetic CT (sCT).  

- **`docker_task_2/`**  
  Contains all necessary files to build and run the Docker image for **Task 2** (CBCT β†’ sCT).  
  - `process.py`: Script that performs inference, converting CBCT images into synthetic CT (sCT).  

- **Normalization config files**  
  - `260_gt_nnUNetResEncUNetLPlans.json` / `540_gt_nnUNetResEncUNetLPlans.json`: normalization configuration for the **Abdomen** region.  
  - `262_gt_nnUNetResEncUNetLPlans.json` / `542_gt_nnUNetResEncUNetLPlans.json`: normalization configuration for the **Head & Neck** region.  
  - `264_gt_nnUNetResEncUNetLPlans.json` / `544_gt_nnUNetResEncUNetLPlans.json`: normalization configuration for the **Thorax** region.  

  These files are essential for inverse normalization, ensuring that the synthesized CT intensities are mapped back to their correct clinical ranges.  

- **`Dockerfile`**  
  Defines all steps and dependencies needed to build the Docker image. It ensures reproducibility and consistency across environments.  

- **`base_algorithm/`**  
  Contains the baseline algorithm files provided by the official **SynthRAD2025 template**, serving as the foundation for our solution.  

- **`build.sh`**  
  Shell script for automating the Docker build process.  

- **`export.sh`**  
  Shell script for exporting the built Docker image into a compressed archive for submission or deployment.  
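  For reference, a typical export step looks like the sketch below; the actual image and archive names are defined in `export.sh`, so treat these as placeholders:

  ```bash
  # Sketch only — "synthrad_task1" and the archive name are assumptions;
  # export.sh defines the real image tag and output file.
  docker save synthrad_task1 | gzip -c > synthrad_task1.tar.gz
  ```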

- **`requirements.txt`**  
  Lists all Python dependencies required to run the code.  

- **`revert_normalisation.py`**  
  Script to apply **inverse normalization** to synthesized CT outputs, restoring them to the correct intensity distributions for downstream evaluation.  


## πŸš€ Getting Started
- **`nnunet_results`**  
  Before starting inference, you also need to create a folder called `nnunet_results/` in docker_task_1/docker_task_2 and place your **trained models** under the `nnunet_results/` directory so that inference can correctly locate and load them.  
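For example (this only creates the folders; the model subfolder names inside `nnunet_results/` depend on your own training runs):

```bash
# Create the expected model directories inside each Docker build context.
mkdir -p docker_task_1/nnunet_results
mkdir -p docker_task_2/nnunet_results
# Copy your trained nnU-Net result folders into these directories before building.
```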
### 1. Build the Docker Image
```bash
cd docker_task_1
bash build.sh
```

### 2. Run the Container Locally

To test the algorithm locally, run the Docker container with GPU support, a memory limit, and a larger shared memory (`/dev/shm`) size (e.g., 8 GB).
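A sketch of such an invocation is shown below; the image name `synthrad_task1`, the memory cap, and the mount paths are assumptions, so adjust them to whatever `build.sh` tags the image as and to your local data layout:

```bash
# Hypothetical image name, limits, and mounts — adapt to your build and hardware.
docker run --rm \
    --gpus all \
    --memory=16g \
    --shm-size=8g \
    -v "$PWD/input:/input" \
    -v "$PWD/output:/output" \
    synthrad_task1
```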