Summary
OAIZIB-CM: 507 knee MRIs and segmentation masks of 5 ROIs
Data
| Source | Link |
|---|---|
| Huggingface | main, load_dataset-support |
| Zenodo | here |
| Google Drive | here |
- Huggingface Dataset Branches:
  - `main`: contains the same files as those in Zenodo and Google Drive
  - `load_dataset-support`: adds HF `load_dataset()` support (ref: Intended Usage 2)
About
This is the official release of the OAIZIB-CM dataset.
- OAIZIB-CM is based on the OAIZIB dataset
- In OAIZIB-CM, tibial cartilage is split into medial and lateral tibial cartilages.
- OAIZIB-CM includes CLAIR-Knee-103R, consisting of
- a template image learned from 103 MR images of subjects without radiographic OA
- corresponding 5-ROI segmentation mask for cartilages and bones
- corresponding 20-ROI atlas for articular cartilages
- Citing the paper is compulsory if you use the dataset
Changelog
- [10 Oct, 2025] This dataset is integrated into MedVision
- [09 Jul, 2025] `trust_remote_code` is no longer supported in `datasets==4.0.0`; install with `pip install datasets==3.6.0`
- [22 Mar, 2025] Add HF `load_dataset()` support in the `load_dataset-support` branch
- [27 Feb, 2025] Add the template and atlas CLAIR-Knee-103R
- [26 Feb, 2025] Update compulsory citation (CartiMorph) for the dataset
- [15 Feb, 2025] Update file `imagesTs/oaizib_454_0000.nii.gz`
- [14 Feb, 2025] Identify corrupted files: case 454
Files
Images & Labels
- `imagesTr`: training images (n=404)
- `labelsTr`: training segmentation masks (n=404)
- `imagesTs`: testing images (n=103)
- `labelsTs`: testing segmentation masks (n=103)
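The images and masks are NIfTI (`.nii.gz`) volumes. As a rough illustration of the on-disk format, the sketch below reads the image dimensions directly from a NIfTI-1 header (348 bytes; little-endian byte order is assumed). The demo file and its 160×384×384 shape are synthetic stand-ins for illustration, not actual OAIZIB-CM dimensions; in practice a library such as `nibabel` would be used instead.

```python
import gzip
import struct

def nifti_dims(path):
    # NIfTI-1 header: 348 bytes; dim[0..7] are int16 values at byte offset 40,
    # with dim[0] holding the number of dimensions (little-endian assumed).
    with gzip.open(path, "rb") as f:
        header = f.read(348)
    dim = struct.unpack("<8h", header[40:56])
    return dim[1:1 + dim[0]]

# Demo with a synthetic header; a real path would be a file under imagesTr/
header = bytearray(348)
header[0:4] = struct.pack("<i", 348)                               # sizeof_hdr
header[40:56] = struct.pack("<8h", 3, 160, 384, 384, 1, 1, 1, 1)   # dim array
with gzip.open("demo.nii.gz", "wb") as f:
    f.write(bytes(header))
print(nifti_dims("demo.nii.gz"))  # (160, 384, 384)
```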
Data Split & Info
- `subInfo_train`: list of training data
- `subInfo_test`: list of testing data
- `kneeSideInfo`: a file containing knee side information, used in CartiMorph Toolbox
Intended Usage
1. Download Files from the `main` or `load_dataset-support` Branch

```bash
pip install --upgrade "huggingface-hub[cli]"
huggingface-cli login --token $HF_TOKEN
```

```python
# Download the main branch
from huggingface_hub import snapshot_download
snapshot_download(repo_id="YongchengYAO/OAIZIB-CM", repo_type="dataset", local_dir="/your/local/folder")
```

```python
# Download the load_dataset-support branch
from huggingface_hub import snapshot_download
snapshot_download(repo_id="YongchengYAO/OAIZIB-CM", repo_type="dataset", revision="load_dataset-support", local_dir="/your/local/folder")
```
2. Load Dataset or IterableDataset from the `load_dataset-support` Branch

Requires `datasets<=3.6.0`:
```python
>>> from datasets import load_dataset

# Load Dataset
>>> dataset_test = load_dataset("YongchengYAO/OAIZIB-CM", revision="load_dataset-support", trust_remote_code=True, split="test")
>>> type(dataset_test)
<class 'datasets.arrow_dataset.Dataset'>

# Convert Dataset to IterableDataset: use .to_iterable_dataset()
>>> iterdataset_test = dataset_test.to_iterable_dataset()
>>> type(iterdataset_test)
<class 'datasets.iterable_dataset.IterableDataset'>

# Load IterableDataset: add streaming=True
>>> iterdataset_train = load_dataset("YongchengYAO/OAIZIB-CM", revision="load_dataset-support", trust_remote_code=True, streaming=True, split="train")
>>> type(iterdataset_train)
<class 'datasets.iterable_dataset.IterableDataset'>
```
Segmentation Labels
```python
labels_map = {
    "1": "Femur",
    "2": "Femoral Cartilage",
    "3": "Tibia",
    "4": "Medial Tibial Cartilage",
    "5": "Lateral Tibial Cartilage",
}
```
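With the label map above, per-ROI voxel counts (and hence approximate volumes, given the voxel spacing) can be read off a segmentation mask with plain NumPy. In the sketch below the mask is a small synthetic stand-in for a real `labelsTr` volume, the voxel spacing is a made-up example value, and label `0` is assumed to be background:

```python
import numpy as np

labels_map = {
    "1": "Femur",
    "2": "Femoral Cartilage",
    "3": "Tibia",
    "4": "Medial Tibial Cartilage",
    "5": "Lateral Tibial Cartilage",
}

# Synthetic 5-ROI mask standing in for a real labelsTr volume (0 assumed background)
mask = np.zeros((8, 8, 8), dtype=np.uint8)
mask[0:2] = 1
mask[2:3] = 2
mask[3:5] = 3
mask[5:6] = 4
mask[6:7] = 5

voxel_mm3 = 0.36 * 0.36 * 0.7  # hypothetical voxel spacing in mm, not the dataset's

# Voxel count per ROI, then scale by the voxel volume to get mm^3
counts = {name: int((mask == int(k)).sum()) for k, name in labels_map.items()}
volumes_mm3 = {name: n * voxel_mm3 for name, n in counts.items()}
print(counts["Femur"])  # 128
```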
Citations
The dataset originates from these projects:
- CartiMorph: https://github.com/YongchengYAO/CartiMorph
- CartiMorph Toolbox:
```bibtex
@article{YAO2024103035,
  title = {CartiMorph: A framework for automated knee articular cartilage morphometrics},
  journal = {Medical Image Analysis},
  author = {Yongcheng Yao and Junru Zhong and Liping Zhang and Sheheryar Khan and Weitian Chen},
  volume = {91},
  pages = {103035},
  year = {2024},
  issn = {1361-8415},
  doi = {https://doi.org/10.1016/j.media.2023.103035}
}

@InProceedings{10.1007/978-3-031-82007-6_16,
  author = "Yao, Yongcheng and Chen, Weitian",
  editor = "Wu, Shandong and Shabestari, Behrouz and Xing, Lei",
  title = "Quantifying Knee Cartilage Shape and Lesion: From Image to Metrics",
  booktitle = "Applications of Medical Artificial Intelligence",
  year = "2025",
  publisher = "Springer Nature Switzerland",
  address = "Cham",
  pages = "162--172"
}
```
License
This dataset is released under the CC BY-NC 4.0 license.