YongchengYAO committed
Commit 038903a · 1 parent: fe50be6

[doc] chore: update doc on how to redownload and rebuild dataset

README.md CHANGED
@@ -525,6 +525,156 @@ We only share the annotations (https://huggingface.co/datasets/YongchengYAO/MedV
  ```
  </details>
 
+
+ ## Dataset Building Workflow
+
+ ### Workflow
+
+ <details>
+ <summary> MedVision Dataset Building Workflow (Black) </summary>
+ <img src="fig/medvision-dataset-flow-b.png" alt="MedVision Dataset Building Workflow (Black)" /><br>
+ </details>
+
+ <details>
+ <summary> MedVision Dataset Building Workflow (White) </summary>
+ <img src="fig/medvision-dataset-flow-w.png" alt="MedVision Dataset Building Workflow (White)" /><br>
+ </details>
+
+ <br/>
+
+ There are a few ways to control the dataset loading and building behavior:
+
+ - **Rebuild Dataset (Arrow files)**: Use the `download_mode` argument of `load_dataset()` ([docs](https://huggingface.co/docs/datasets/v3.6.0/en/package_reference/builder_classes#datasets.DownloadMode)).
+   - **[1]** Set `download_mode="force_redownload"` to ignore the cached Arrow files and trigger the data loading script `MedVision.py` to rebuild the dataset.
+ - **Redownload Raw Data**:
+   - **[2]** `MedVision_FORCE_DOWNLOAD_DATA`: Set this environment variable to `"True"` to force re-downloading of raw images and annotations.
+   - **[3]** `.downloaded_datasets.json`: This tracker file records the download status of each dataset. Removing a dataset's entry from it triggers a re-download of that dataset's raw data.
+
+ > [!NOTE]
+ > **⚠️ How to properly update/redownload raw data**
+ >
+ > If you need to update raw data (images, masks, landmarks) via [2] or [3], you **MUST ALSO** use [1] (`download_mode="force_redownload"`).
+ >
+ > Why? If Hugging Face finds a valid cached dataset (Arrow files), it loads it directly and **skips running the script entirely**. Without running the script, the environment variable [2] and the tracker file [3] are never checked.
+ >
+ > **Summary:**
+ > - Update Arrow files/fields only: use [1].
+ > - Update raw data: use [1] **AND** ([2] or [3]).
+ >
+ > 🔥 We maintain a [change log](https://huggingface.co/datasets/YongchengYAO/MedVision/blob/main/doc/changelog.md) for essential updates.
+
+
+ ### Examples
+
+ <details>
+ <summary> Running this for the first time downloads the raw data and builds the dataset </summary>
+
+ ```python
+ import os
+ from datasets import load_dataset
+
+ # Set data folder
+ wd = os.path.join(os.getcwd(), "Data-testing")
+ os.makedirs(wd, exist_ok=True)
+ os.environ["MedVision_DATA_DIR"] = wd
+
+ # Pick a dataset config name and split
+ config = "OAIZIB-CM_BoxSize_Task01_Axial_Test"
+ split_name = "test"  # use "test" for testing-set configs; use "train" for training-set configs
+
+ # Get dataset
+ ds = load_dataset(
+     "YongchengYAO/MedVision",
+     name=config,
+     trust_remote_code=True,
+     split=split_name,
+ )
+ ```
+ </details>
+
+ <details>
+ <summary> Running the same script again uses the cached dataset </summary>
+
+ ```python
+ import os
+ from datasets import load_dataset
+
+ # Set data folder
+ wd = os.path.join(os.getcwd(), "Data-testing")
+ os.makedirs(wd, exist_ok=True)
+ os.environ["MedVision_DATA_DIR"] = wd
+
+ # Pick a dataset config name and split
+ config = "OAIZIB-CM_BoxSize_Task01_Axial_Test"
+ split_name = "test"  # use "test" for testing-set configs; use "train" for training-set configs
+
+ # Get dataset
+ ds = load_dataset(
+     "YongchengYAO/MedVision",
+     name=config,
+     trust_remote_code=True,
+     split=split_name,
+ )
+ ```
+ </details>
+
+ <details>
+ <summary> Adding `download_mode="force_redownload"` rebuilds the dataset without re-downloading the raw data </summary>
+
+ ```python
+ import os
+ from datasets import load_dataset
+
+ # Set data folder
+ wd = os.path.join(os.getcwd(), "Data-testing")
+ os.makedirs(wd, exist_ok=True)
+ os.environ["MedVision_DATA_DIR"] = wd
+
+ # Pick a dataset config name and split
+ config = "OAIZIB-CM_BoxSize_Task01_Axial_Test"
+ split_name = "test"  # use "test" for testing-set configs; use "train" for training-set configs
+
+ # Get dataset
+ ds = load_dataset(
+     "YongchengYAO/MedVision",
+     name=config,
+     trust_remote_code=True,
+     split=split_name,
+     download_mode="force_redownload",
+ )
+ ```
+ </details>
+
+ <details>
+ <summary> Adding `download_mode="force_redownload"` and setting `os.environ["MedVision_FORCE_DOWNLOAD_DATA"] = "True"` redownloads the raw data and rebuilds the dataset </summary>
+
+ ```python
+ import os
+ from datasets import load_dataset
+
+ # Set data folder
+ wd = os.path.join(os.getcwd(), "Data-testing")
+ os.makedirs(wd, exist_ok=True)
+ os.environ["MedVision_DATA_DIR"] = wd
+
+ # Pick a dataset config name and split
+ config = "OAIZIB-CM_BoxSize_Task01_Axial_Test"
+ split_name = "test"  # use "test" for testing-set configs; use "train" for training-set configs
+
+ # Force re-download of raw data
+ os.environ["MedVision_FORCE_DOWNLOAD_DATA"] = "True"
+
+ # Get dataset
+ ds = load_dataset(
+     "YongchengYAO/MedVision",
+     name=config,
+     trust_remote_code=True,
+     split=split_name,
+     download_mode="force_redownload",
+ )
+ ```
+ </details>
+
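Control [3] has no example above. The sketch below removes one dataset's entry from the tracker file so that only that dataset's raw data is re-downloaded on the next `force_redownload` load. Note the tracker's exact location (assumed here to sit directly under `MedVision_DATA_DIR`) and its top-level JSON layout (assumed to be an object keyed by dataset name) are assumptions; check the file actually written by `MedVision.py` before relying on them.

```python
import json
import os

def drop_tracker_entry(tracker_path: str, dataset_name: str) -> bool:
    """Remove one dataset's entry from the download tracker.

    Assumes the tracker is a JSON object keyed by dataset name; adjust
    to the actual schema written by MedVision.py if it differs.
    Returns True if an entry was removed, False otherwise.
    """
    if not os.path.exists(tracker_path):
        return False
    with open(tracker_path) as f:
        tracker = json.load(f)
    if dataset_name not in tracker:
        return False
    del tracker[dataset_name]
    with open(tracker_path, "w") as f:
        json.dump(tracker, f, indent=2)
    return True

# Assumed location: directly under the MedVision data folder
data_dir = os.environ.get("MedVision_DATA_DIR", os.getcwd())
drop_tracker_entry(os.path.join(data_dir, ".downloaded_datasets.json"), "OAIZIB-CM")
```

After removing the entry, load with `download_mode="force_redownload"` (as in the example above) so the script runs and notices the missing entry; per the note, editing the tracker alone has no effect while a valid Arrow cache exists.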
  <br/>
 
  # Advanced Usage
fig/medvision-dataset-flow-b.png ADDED

Git LFS Details

  • SHA256: f4dbc11b10d2840667b9e0e52c4dab8553ef0c8501d75bedfa55dd70eaa0a6cd
  • Pointer size: 131 Bytes
  • Size of remote file: 148 kB
fig/medvision-dataset-flow-w.png ADDED

Git LFS Details

  • SHA256: 40ac4cc09c6bf98447bb9d1b5815a83c27cd0263f2395792f433c23d558d6646
  • Pointer size: 131 Bytes
  • Size of remote file: 158 kB