# Overview

The embodied intelligence industry is currently facing significant development challenges. The most critical issue is the lack of high-quality data, particularly omnimodal data that integrates force and tactile sensing. PaXini introduces the PX OmniSharing Dataset, built on the PaXini Super EID Factory, enabling large-scale, high-fidelity human data collection across diverse tasks and scenarios. The dataset includes multi-dimensional tactile data, multi-view visual data, voice, text, proprioception, and spatial trajectories, comprehensively addressing the challenge of rapid generalization for embodied agents across diverse scenarios. Together with the PX OmniSharing Toolkit, it provides an end-to-end pipeline for efficient data processing and model development. The following tasks are included in this update:
| Scenario | Task | Quantity |
| --- | --- | --- |
| Office Scene | Cleaning Computer Keyboard | 100 |
| Office Scene | Assembling Triple File Basket | 100 |
| Office Scene | Disassembling Triple File Basket | 100 |
| Restaurant Scene | Beer Storage | 100 |
| Automotive Scene | Cable Insertion - Cable Connection | 100 |
| Home Scene | Building Block Storage | 100 |
| Industrial Scene | Bolt Kit Assembly | 100 |
| Supermarket Scene | Glasses Cleaning | 100 |
| Primitive Action Scene | Insert and Remove | 100 |
---

# Get Started

## Download the Dataset

To download the full dataset, use the following commands. If you encounter any issues, please refer to the official Hugging Face documentation.

```bash
# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install

# When prompted for a password, use an access token with write permissions.
# Generate one from your settings: https://huggingface.co/settings/tokens
git clone https://huggingface.co/datasets/paxini/Omnisharing_DB_SampleData
```

To download a subfolder only (e.g., part_07), use Git sparse checkout:

```bash
# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install

# Initialize an empty Git repository
git init Omnisharing_DB_SampleData
cd Omnisharing_DB_SampleData

# Set the remote repository
git remote add origin https://huggingface.co/datasets/paxini/Omnisharing_DB_SampleData

# Enable sparse-checkout
git sparse-checkout init

# Specify the folders and files to fetch
git sparse-checkout set data/part_07

# Pull the data
git pull origin main
```

The whole dataset covers 10 different tasks. You can find the relevant task information in the `meta` group inside each HDF5 file.

---

# Dataset Structure

The PX OmniSharing Toolkit processing workflow involves four primary data categories: DF-1, DF-2, DF-2R, and DF-3. The corresponding data structure formats are illustrated below:
| Data Format Naming | Description | Custom Format | Suffix | Data File Name Example |
| --- | --- | --- | --- | --- |
| DF-1 | The overall input: raw data after preprocessing and quality inspection | Yes (HDF5) | No suffix | episode_11_111219_112_120024.hdf5 |
| DF-2 | 1st output: DF-1 with encoder and tactile data parsed; adds bimanual and object poses; includes both action and observation | Yes (HDF5) | `_glove` | episode_11_111219_112_120024_glove.hdf5 |
| DF-2R | 2nd output: DF-2 retargeted to a dexterous hand model | Yes (HDF5) | `_{MODEL}` | episode_11_111219_112_120024_dh13.hdf5 (retargeting to DexH13)<br>episode_11_111219_112_120024_mano.hdf5 (retargeting to MANO)<br>... |
| DF-3 | 3rd output: converts DF-2R to the LeRobot dataset format; can be used for VLA model training | No | No suffix | - |
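The suffix convention above makes it easy to tell processing stages apart from filenames alone. A minimal sketch in Python: the function name `classify_episode` and the set of retargeting-model suffixes are illustrative assumptions, not part of the PX OmniSharing Toolkit.

```python
from pathlib import Path

# Retargeting-model suffixes seen in the examples above (extend as needed).
RETARGET_MODELS = {"dh13", "mano"}

def classify_episode(filename: str) -> str:
    """Infer the data-format stage (DF-1 / DF-2 / DF-2R) from the filename suffix."""
    stem = Path(filename).stem        # drop the .hdf5 extension
    tail = stem.rsplit("_", 1)[-1]    # last underscore-separated token
    if tail == "glove":
        return "DF-2"
    if tail in RETARGET_MODELS:
        return "DF-2R"
    return "DF-1"                     # no recognized suffix: raw input

print(classify_episode("episode_11_111219_112_120024.hdf5"))        # DF-1
print(classify_episode("episode_11_111219_112_120024_glove.hdf5"))  # DF-2
print(classify_episode("episode_11_111219_112_120024_dh13.hdf5"))   # DF-2R
```

DF-3 is excluded here because it is a LeRobot-format directory rather than a single suffixed HDF5 file.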
The sample data has undergone pose estimation and parsing, and is provided in the standardized DF-2 format. For access to raw data, please contact us at omnisharingdb@paxini.com, or visit our dataset marketplace: https://dataset-mall.paxini.com/. For advanced formats such as DF-2R or DF-3, users can perform conversion and further processing through the PX OmniSharing Toolkit, available on GitHub: https://github.com/px-DataCollection/px_omnisharing_dataprocess_kit.

## DF-2 (Data Format-2)

```text
/dataset
├── attributes                  # e.g., generated_time, data_id (compressed error info)
├── action                      # Action signals (no tactile)
│   ├── lefthand
│   │   ├── attributes          # description, etc.
│   │   ├── joints
│   │   │   ├── data            # (n, 29) joint angles in URDF joint order
│   │   │   └── attributes      # joint_names = [...]
│   │   └── handpose
│   │       ├── data            # (n, 7)
│   │       └── attributes      # order = [x, y, z, qw, qx, qy, qz]
│   └── righthand
│       ├── attributes          # description, hand_name, urdf, etc.
│       ├── joints
│       │   ├── data            # (n, 29)
│       │   └── attributes      # joint_names = [...]
│       └── handpose
│           ├── data            # (n, 7)
│           └── attributes      # order = [x, y, z, qw, qx, qy, qz]
└── observation                 # Episode state
    ├── audio                   # Compressed audio stream (includes text)
    ├── image
    │   ├── RGB_CameraXXX
    │   │   ├── data            # 1D compressed payload
    │   │   ├── extrinsics
    │   │   └── intrinsics      # attrs include width/height
    │   ├── RGBD_XXX
    │   │   ├── data            # 1D compressed payload
    │   │   ├── extrinsics
    │   │   ├── intrinsics
    │   │   └── attributes      # width/height
    │   └── [...]
    ├── lefthand
    │   ├── attributes          # description, etc.
    │   ├── joints
    │   │   ├── data            # (n, 29)
    │   │   └── attributes      # joint_names = [...]
    │   ├── handpose
    │   │   ├── data            # (n, 7)
    │   │   └── attributes      # order = [x, y, z, qw, qx, qy, qz]
    │   └── tactile
    │       ├── data            # (n, 3465)
    │       └── attributes      # sensor_names, sensor_lengths, etc.
    ├── righthand
    │   ├── attributes
    │   ├── joints
    │   │   ├── data            # (n, 29)
    │   │   └── attributes
    │   ├── handpose
    │   │   ├── data            # (n, 7)
    │   │   └── attributes
    │   └── tactile
    │       ├── data            # (n, 3465)
    │       └── attributes
    ├── obj1
    │   ├── data                # (n, 17)
    │   └── attributes          # obj_name, obj_id, order/detail
    ├── obj2
    └── [...]
```

---

# License and Citation

All the data within this repo is licensed under CC BY-NC-SA 4.0. Please consider citing our project if it contributes to your research.

```bibtex
@misc{PXOmniSharingDB,
  title        = {PX OmniSharing DB},
  author       = {{PX OmniSharing DB}},
  howpublished = {\url{https://huggingface.co/datasets/paxini/Omnisharing_DB_SampleData}},
  year         = {2026}
}
```
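The DF-2 layout can be explored programmatically with `h5py`. A minimal sketch, assuming `h5py` and `numpy` are installed: the file below is a tiny in-memory mock that mirrors part of the DF-2 tree (not real episode data), and the group paths and shapes follow the structure documented above.

```python
import h5py
import numpy as np

n = 5  # number of frames (mock value)

# Build a tiny in-memory mock mirroring part of the DF-2 tree.
f = h5py.File("mock_df2.hdf5", "w", driver="core", backing_store=False)
f.create_dataset("dataset/action/lefthand/joints/data", data=np.zeros((n, 29)))
f.create_dataset("dataset/action/lefthand/handpose/data", data=np.zeros((n, 7)))
f.create_dataset("dataset/observation/lefthand/tactile/data", data=np.zeros((n, 3465)))
f["dataset/action/lefthand/handpose"].attrs["order"] = ["x", "y", "z", "qw", "qx", "qy", "qz"]

# Walk the tree and print every dataset path with its shape.
def show(name, obj):
    if isinstance(obj, h5py.Dataset):
        print(name, obj.shape)

f.visititems(show)

# Datasets read back as NumPy arrays, e.g. joint angles in URDF joint order.
joints = f["dataset/action/lefthand/joints/data"][:]
print(joints.shape)  # (5, 29)
f.close()
```

The `driver="core", backing_store=False` options keep the mock file entirely in memory; for real episodes, open the downloaded `*_glove.hdf5` file in read-only mode (`h5py.File(path, "r")`) instead.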