
FAMAIL Temporal Source Data

Source datasets, raw GPS traces, and the pre-trained Siamese discriminator checkpoint for the FAMAIL Temporal (Fairness-Aware Trajectory Modification) algorithm.

Contents

  • source_data/ — 12 derived files consumed by famail_temporal.preprocess and famail_temporal.data.loader. Produced by the unified source-generation tool.
  • raw_data/ — 3 raw taxi GPS pickle files (taxi_record_{07,08,09}_50drivers.pkl). Input to the source-generation tool. Provided so reviewers can re-run the full pipeline.
  • discriminator_checkpoints/default/best.pt — pre-trained Multi-Stream Siamese discriminator used for the F_fidelity term.
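The raw GPS files are plain Python pickles. A minimal sketch of loading one is below; note that the record layout shown (driver id mapping to a list of timestamped points) is an assumption for illustration, since the actual schema of the `taxi_record_*_50drivers.pkl` files is not documented here. The demo round-trips a toy record through a temporary file standing in for `raw_data/`.

```python
import os
import pickle
import tempfile

# Toy stand-in for one raw file. The real schema is an ASSUMPTION here:
# driver id -> list of (unix_timestamp, latitude, longitude) points.
toy_records = {
    "driver_001": [(1184900000, 30.66, 104.07), (1184900060, 30.67, 104.08)],
}

def load_gps_pickle(path):
    """Load a raw GPS pickle file from raw_data/."""
    with open(path, "rb") as f:
        return pickle.load(f)

# Round-trip demo with a temporary directory standing in for raw_data/.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "taxi_record_07_50drivers.pkl")
    with open(path, "wb") as f:
        pickle.dump(toy_records, f)
    records = load_gps_pickle(path)
    print(len(records["driver_001"]))  # → 2
```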

Usage

Download with the project's fetch script:

export HF_TOKEN=<your-read-token>
python -m famail_temporal.fetch_data --repo-id <this-repo>

Or with the Hugging Face CLI directly:

hf download <this-repo> --repo-type dataset --local-dir ./famail-data
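For programmatic access, the `huggingface_hub` library's `snapshot_download` can fetch the whole repo from Python. This is a sketch, not the project's own fetch script; `fetch_dataset` is a hypothetical helper name, and the actual call requires network access (and a read token in `HF_TOKEN` for private repos).

```python
from huggingface_hub import snapshot_download

def fetch_dataset(repo_id, local_dir="./famail-data"):
    """Download the full dataset repo (source_data/, raw_data/,
    discriminator_checkpoints/) to local_dir and return its path."""
    return snapshot_download(
        repo_id=repo_id,
        repo_type="dataset",
        local_dir=local_dir,
    )

# Example (requires network; set HF_TOKEN first if the repo is gated):
# local_path = fetch_dataset("nthPerson/famail-temporal-data")
```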

Provenance

The raw taxi GPS datasets were provided to researchers at SDSU for the FAMAIL project. All other files in this repository are derived from those raw datasets.

Citation

@misc{famail_temporal_data,
  title  = {FAMAIL Temporal Source Data},
  author = {Ashe, Robert and collaborators},
  year   = {2026},
  url    = {https://huggingface.co/datasets/nthPerson/famail-temporal-data},
}
