
Dataset Card for Synthetic Inline Holographical Images

This dataset provides synthetic image triplets representing inline holographical imaging in a simulated environment. Each data sample consists of:

  1. An object-domain field (ground truth),
  2. Its corresponding forward-propagated hologram (the inline holographic pattern at the sensor plane, intensity),
  3. The numerically reconstructed image (via angular spectrum method).

The dataset is intended to facilitate research in computational imaging, holographic reconstruction, phase retrieval, and machine learning-based hologram analysis.

It is primarily used in conjunction with the open-source project Hologen v2, which provides simulation and learning tools for inline holography.

Example Data


Dataset Details

The Synthetic Inline Holographical Images dataset contains triplets of images generated from numerically simulated optical propagation.
The synthetic nature of the data enables large-scale, controllable experiments without the need for physical holographic recording setups.
This makes the dataset especially suitable for deep learning research in holographic imaging, where paired data (object ↔ hologram ↔ reconstruction) are rarely available.

The dataset consists of 8 noise configurations, each containing 1500 samples:

  • no_noise: No noise added.
  • speckle_noise: Only speckle noise is added.
  • shot_noise: Only shot noise is added.
  • read_noise: Only read noise is added.
  • dark_current_noise: Only dark current noise is added.
  • speckle_shot_noise: Both speckle and shot noise are added.
  • speckle_shot_read_noise: Speckle, shot, and read noise are added.
  • speckle_shot_read_dark_noise: All noise types are added.

The configuration name of a sample is stored in its config_name attribute.
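As a minimal sketch (with hypothetical in-memory records, not the Hologen API), samples of a single noise configuration can be selected by filtering on the config_name attribute:

```python
# Hypothetical records carrying the attributes described above:
# config_name, global_idx, and sample_idx.
samples = [
    {"config_name": "no_noise", "global_idx": 0, "sample_idx": 0},
    {"config_name": "shot_noise", "global_idx": 1500, "sample_idx": 0},
    {"config_name": "shot_noise", "global_idx": 1501, "sample_idx": 1},
]

# Keep only the shot-noise configuration.
shot_only = [s for s in samples if s["config_name"] == "shot_noise"]
print(f"{len(shot_only)} shot-noise samples")
```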

Simulation Settings

One can reproduce the results using the HoloGen Toolkit with the following simulation settings.

Simulation parameters:

  Parameter                Description                                  Value
  Simulation seed          Random number generator seed                 42
  Object height            Height of both object and sensor planes     512 pixels
  Object width             Width of both object and sensor planes      512 pixels
  Pixel pitch              Physical spacing between adjacent pixels    4.65e-6 meters
  Illumination wavelength  Monochromatic light wavelength               532e-9 meters
  Propagation distance     Distance between object and sensor planes   0.02 meters

Noise parameters:

  Parameter                  Value
  Speckle noise strength     0.15
  Speckle noise roughness    1.0
  Read noise sigma           10.0
  Dark current noise mean    20.0
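For convenience, the settings above can be collected into plain Python mappings (the values are copied from the tables; the dictionaries and their key names are illustrative, not a HoloGen API):

```python
# Simulation settings from the tables above (key names are illustrative).
simulation = {
    "seed": 42,
    "height_px": 512,
    "width_px": 512,
    "pixel_pitch_m": 4.65e-6,
    "wavelength_m": 532e-9,
    "propagation_distance_m": 0.02,
}

noise = {
    "speckle_strength": 0.15,
    "speckle_roughness": 1.0,
    "read_sigma": 10.0,
    "dark_mean": 20.0,
}
```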

Uses

The dataset is designed for:

  • Training and evaluating neural networks that reconstruct objects from inline holograms.
  • Developing models for phase retrieval, complex field estimation, and denoising.
  • Exploring signal transformation relationships between object and propagation domains.
  • Benchmarking holographic forward and inverse modelling algorithms.

This dataset is not suitable for:

  • Real-world holography generalisation studies without domain adaptation.
  • Tasks requiring physical measurements or phase-accurate calibration data.
  • Medical, biometric, or personal data analysis (no human-related content is included).

Dataset Structure

Each data file is named ground_truth, hologram, or reconstructed, corresponding to a stage of the holography pipeline.

  • Each sample includes a config_name attribute indicating the noise configuration, a global_idx attribute identifying the sample across all configurations, and a sample_idx attribute identifying it within its configuration.
  • ground_truth and reconstructed samples include real and imag attributes.
  • hologram samples include an intensity attribute.
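The attributes can be turned back into 2D image arrays with NumPy. This is a sketch assuming the per-sample pixel data are stored flattened with length resolution², as the loader's resolution argument suggests; the zero/one placeholder arrays stand in for real sample data:

```python
import numpy as np

# Placeholder flattened pixel data; the real samples use resolution 512.
resolution = 4
real = np.zeros(resolution * resolution)
imag = np.zeros(resolution * resolution)
intensity = np.ones(resolution * resolution)

# ground_truth / reconstructed: complex field rebuilt from real and imag.
field = (real + 1j * imag).reshape(resolution, resolution)

# hologram: sensor-plane intensity image.
hologram = intensity.reshape(resolution, resolution)
```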

Example:

from pathlib import Path
from hologen.utils import load_samples_from_parquet

dataset_dir = Path("inline-digital-holography")
resolution = 512  # Needed to reshape the flattened pixel arrays back to 2D.

# Load the full dataset (multiple parquet files are handled automatically)
holograms = load_samples_from_parquet(dataset_dir, "hologram", resolution=resolution)
ground_truths = load_samples_from_parquet(dataset_dir, "ground_truth", resolution=resolution)
reconstructeds = load_samples_from_parquet(dataset_dir, "reconstructed", resolution=resolution)

print(f"Loaded {len(holograms)} hologram samples")

Dataset Creation

Curation Rationale

Inline holography involves recording the interference pattern between an object wave and a reference wave. However, collecting large, labelled datasets in laboratory conditions is impractical due to optical setup complexity and noise factors. This dataset provides a synthetic, physically consistent alternative that mimics realistic propagation physics using scalar diffraction models.

Source Data

Data Collection and Processing

Images were generated using numerical wave propagation based on the Angular Spectrum Method (ASM), as implemented in the Hologen framework. Objects were synthetically generated using shape primitives, textures, and random phase and amplitude patterns. Each object was propagated through a simulated inline holography setup to produce hologram and reconstruction pairs.
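For intuition, the forward step can be sketched with a textbook angular spectrum propagator (written from the standard formulation, not from the Hologen source) using the simulation parameters listed above:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex field over `distance` via the angular spectrum method."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel_pitch)
    fx2, fy2 = np.meshgrid(fx**2, fx**2)
    # Squared longitudinal spatial frequency; negative values are evanescent.
    arg = 1.0 / wavelength**2 - fx2 - fy2
    transfer = np.where(
        arg > 0,
        np.exp(2j * np.pi * distance * np.sqrt(np.maximum(arg, 0.0))),
        0.0,
    )
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

# Unit plane wave through an empty object plane, with the card's parameters.
obj = np.ones((512, 512), dtype=complex)
sensor = angular_spectrum_propagate(obj, wavelength=532e-9, pixel_pitch=4.65e-6, distance=0.02)
hologram = np.abs(sensor) ** 2  # recorded sensor-plane intensity
```

Numerical reconstruction follows the same transfer function evaluated at a negative distance (back-propagation to the object plane).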

Who are the source data producers?

  • All data were generated algorithmically by Gökhan Koçmarlı using simulation code in Hologen.
  • No external or third-party datasets were used.

Annotations

No manual annotations are included. Each triplet is automatically labelled by filename correspondence.

Annotation process

Not applicable (fully synthetic, self-labelled data).

Who are the annotators?

All data is generated programmatically.

Personal and Sensitive Information

This dataset contains no personal, identifiable, or sensitive information.
All images are synthetic and algorithmically generated.


Bias, Risks, and Limitations

  • As the dataset is fully synthetic, it lacks real-world optical aberrations, noise, and coherence effects that occur in experimental holography.
  • Models trained purely on this dataset may require fine-tuning on physical hologram data to generalise effectively.
  • The dataset assumes ideal optical parameters (e.g., monochromatic light, planar sensor).

Recommendations

Users should consider:

  • Augmenting with noise or real holograms for domain adaptation.
  • Interpreting reconstruction metrics (e.g., PSNR, SSIM) relative to synthetic references.
  • Avoiding conclusions about physical accuracy without experimental validation.
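For the metric point above, a minimal PSNR helper against the synthetic ground truth might look like the following (this function is illustrative, not part of Hologen):

```python
import numpy as np

def psnr(reference, estimate, peak=1.0):
    """Peak signal-to-noise ratio in dB against a synthetic reference."""
    mse = np.mean((reference - estimate) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

gt = np.zeros((8, 8))
noisy = gt + 0.01  # uniform perturbation -> MSE of 1e-4
print(f"PSNR: {psnr(gt, noisy):.1f} dB")  # -> 40.0 dB for peak=1.0
```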

Citation

BibTeX:

@dataset{kochmarla2025synthetic_inline_holographical_images_v2,
  author       = {Gökhan Koçmarlı},
  title        = {Synthetic Inline Holographical Images v2},
  year         = {2025},
  url          = {https://huggingface.co/datasets/electricalgorithm/inline-digital-holography-v2},
  note         = {Synthetic dataset for inline holography simulation and reconstruction.}
}