Corrupted .npy file: shape–size mismatch when loading image

#4
by Arvind69 - opened

I am encountering what appears to be a corrupted .npy file in the dataset.
The .npy header declares shape (1, 512, 512, 58), but the file only contains 1,048,544 elements, so NumPy cannot reshape the stored data to match the header.

Expected elements:
1 × 512 × 512 × 58 = 15,204,352

Actual elements:
1,048,544

This results in a ValueError when loading the file with NumPy.
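
For anyone who wants to confirm the mismatch without triggering the full load, here is a small sketch that reads only the .npy header and compares the byte count it promises against the bytes actually present in the file. The helper name `check_npy` is mine, not part of the dataset; it assumes the file was written with `np.save` (header version 1.0 or 2.0):

```python
import os
import numpy as np

def check_npy(path):
    """Compare the element count declared in a .npy header with the
    bytes actually stored in the file, without loading the array."""
    with open(path, "rb") as f:
        version = np.lib.format.read_magic(f)  # e.g. (1, 0)
        if version == (1, 0):
            shape, fortran, dtype = np.lib.format.read_array_header_1_0(f)
        else:
            shape, fortran, dtype = np.lib.format.read_array_header_2_0(f)
        data_start = f.tell()  # array data begins right after the header
    expected = int(np.prod(shape)) * dtype.itemsize  # bytes the header promises
    actual = os.path.getsize(path) - data_start      # bytes actually on disk
    return shape, expected, actual
```

For the affected file, `expected` and `actual` should disagree by roughly a factor of 14, matching the element counts above.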

Minimal Reproducible Example:

import numpy as np

image_path = ".../M3D-Seg/M3D_Seg/0017/0017/pancreas_case10-1/image.npy"
image_data = np.load(image_path)[0]
print(f"Image data shape: {image_data.shape}")

Output:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[13], line 3
      1 import numpy as np
      2 image_path = ".../M3D-Seg/M3D_Seg/0017/0017/pancreas_case10-1/image.npy"
----> 3 image_data = np.load(image_path)[0]
      4 print(f"Image data shape: {image_data.shape}")

File ~/.local/share/mamba/envs/mmm/lib/python3.11/site-packages/numpy/lib/npyio.py:456, in load(file, mmap_mode, allow_pickle, fix_imports, encoding, max_header_size)
    453         return format.open_memmap(file, mode=mmap_mode,
    454                                   max_header_size=max_header_size)
    455     else:
--> 456         return format.read_array(fid, allow_pickle=allow_pickle,
    457                                  pickle_kwargs=pickle_kwargs,
    458                                  max_header_size=max_header_size)
    459 else:
    460     # Try a pickle
    461     if not allow_pickle:

File ~/.local/share/mamba/envs/mmm/lib/python3.11/site-packages/numpy/lib/format.py:839, in read_array(fp, allow_pickle, pickle_kwargs, max_header_size)
    837         array = array.transpose()
    838     else:
--> 839         array.shape = shape
    841 return array

ValueError: cannot reshape array of size 1048544 into shape (1,512,512,58)

Please verify and re-upload the affected files if needed.
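
In case it helps with verification, a quick sketch for finding any other truncated files in the dataset: walk a directory, attempt to load each .npy, and collect the ones that raise the same ValueError. The function name `scan_dataset` is mine; loading every array in full is slow on a large dataset, so this is only a one-off check:

```python
from pathlib import Path
import numpy as np

def scan_dataset(root):
    """Try to load every .npy file under `root`; return the paths that
    fail with a load error (e.g. the shape/size mismatch above)."""
    bad = []
    for p in sorted(Path(root).rglob("*.npy")):
        try:
            np.load(p)
        except ValueError as e:
            bad.append((str(p), str(e)))
    return bad
```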
