Improve dataset card: add task categories, paper link and usage instructions
#1
by nielsr HF Staff - opened
README.md CHANGED
---
license: mit
task_categories:
- image-segmentation
tags:
- medical
- cancer-screening
- pan-cancer
---

# Pancancer Dataset

[**Paper**](https://huggingface.co/papers/2601.19103) | [**Code**](https://github.com/Luffy03/GF-Screen)

This repository contains the dataset for the paper **"Glance and Focus Reinforcement for Pan-cancer Screening"**, accepted at ICLR 2026.

GF-Screen is a Glance-and-Focus reinforcement learning framework for pan-cancer screening in large-scale CT scans. It employs a Glance model to localize diseased regions and a Focus model to precisely segment lesions, addressing the challenge of localizing diverse types of tiny lesions in large CT volumes.

## Download Dataset

You can download the dataset using the Hugging Face CLI:

```bash
huggingface-cli download linshangmail/Pancancer --repo-type dataset --local-dir . --cache-dir ./cache
```

## Data Preparation

Following the official repository structure, the data should be organized as follows:

```
./                  # project root
├── data
│   ├── imagesTr
│   ├── labelsTr
│   ├── external
│   └── ...
├── jsons
├── models
└── ...
```
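
As a quick sanity check before training, a minimal Python sketch can verify that the expected folders are in place. The directory names come from the tree above; the script itself is not part of the official repository.

```python
import os

# Sub-directories expected under the project root, per the layout above.
# "external" holds external-validation data; "jsons" and "models" sit
# next to "data".
EXPECTED = [
    "data/imagesTr",
    "data/labelsTr",
    "data/external",
    "jsons",
    "models",
]

def missing_dirs(root="."):
    """Return the expected sub-directories that do not exist under root."""
    return [d for d in EXPECTED if not os.path.isdir(os.path.join(root, d))]

if __name__ == "__main__":
    missing = missing_dirs(".")
    if missing:
        print("Missing directories:", ", ".join(missing))
    else:
        print("Layout looks good.")
```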

## Usage

### Training

Training requires a single 80 GB GPU:

```bash
bash GF_RL_Screen.sh
```

### Evaluation

```bash
# Internal validation:
python val_GF_internal.py --trained_pth $YOUR_MODEL_PATH

# External validation:
python val_GF_external.py --trained_pth $YOUR_MODEL_PATH --dataset_name FLARE23

# FLARE prediction:
python val_GF_internal.py --trained_pth $YOUR_MODEL_PATH --test_data_path $YOUR_PATH_TO_FLARE_VALIDATION_IMAGES
```
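
The evaluation commands differ only in the script name and flags. A small helper (hypothetical, not part of the repository) shows how such argv lists can be assembled, e.g. for use with `subprocess.run`:

```python
def eval_command(script, trained_pth, **flags):
    """Build a `python <script> --trained_pth <path> [--flag value ...]` argv list."""
    cmd = ["python", script, "--trained_pth", trained_pth]
    for name, value in sorted(flags.items()):
        cmd += [f"--{name}", str(value)]
    return cmd

# Internal validation
print(" ".join(eval_command("val_GF_internal.py", "model.pth")))
# External validation on FLARE23
print(" ".join(eval_command("val_GF_external.py", "model.pth", dataset_name="FLARE23")))
```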

## Citation

If you find this repo or dataset useful for your research, please consider citing the paper as follows:

```bibtex
@inproceedings{GF-Screen,
  booktitle={ICLR},
  year={2026},
}

@article{ma2024flare,
  title={Unleashing the strengths of unlabelled data in deep learning-assisted pan-cancer abdominal organ quantification: the FLARE22 challenge},
  author={Ma, Jun and Zhang, Yao and Gu, Song and Ge, Cheng and Ma, Shihao and Young, Adamo and Zhu, Cheng and Yang, Xin and Meng, Kangkang and Huang, Ziyan and others},
  year={2024},
  publisher={Elsevier}
}
```