## hf_space_mini_tug Requirements

This directory contains the Streamlit application that runs on Hugging Face Spaces and visualizes each episode.

### Files

- `app.py` – Streamlit entrypoint. Downloads the dataset via `snapshot_download`, exposes an episode selector, video player, and Plotly charts.
- `requirements.txt` – Python dependencies (`streamlit`, `plotly`, `pandas`, `huggingface_hub`).
- `README.md` – Hub metadata (SDK = Streamlit).

### Data Contract

`app.py` expects the dataset `raffaelkultyshev/mini_tug_tape_to_bowl` to have:

- `meta/info.json` describing `episodes`, `data_path`, and `video_path`.
- Parquet files with either:
  1. LeRobot-style `observation.state` columns (`list<float>`) **or**
  2. Long-form `frame_idx`, `joint_name`, `x_cm`, etc., produced by `hand_pose_pipeline.py` (see the schema check sketched after this list).
- RGB MP4s encoded as H.264 / yuv420p / faststart (`videos/chunk-000/rgb/...`).
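
As a hedged illustration of that schema branch (the column names come from the contract above; `load_episode_frame` is a hypothetical helper, not necessarily the one in `app.py`):

```python
import pandas as pd

def load_episode_frame(parquet_path: str) -> pd.DataFrame:
    """Accept either of the two contract schemas, reject everything else."""
    df = pd.read_parquet(parquet_path)
    if "observation.state" in df.columns:
        # Schema 1, LeRobot-style: one list<float> state vector per frame.
        return df
    # Schema 2, long-form: one row per (frame, joint) pair.
    missing = {"frame_idx", "joint_name", "x_cm"} - set(df.columns)
    if missing:
        raise ValueError(f"Episode violates the data contract; missing: {missing}")
    return df
```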

### Internal Logic

1. `get_dataset_revision()` uses `HfApi.repo_info` so that Streamlit's caches auto-invalidate whenever the dataset is updated (see the first sketch below).
2. `load_episode()` reads the Parquet file and, via `build_state_dataframe`, pivots the long-form table back into the `[wrist_x_cm, ...]` layout used by the charts (second sketch below).
   - This function enforces the accuracy bounds by reindexing frames and rejecting missing joint rows.
3. Plots are generated with Plotly (`build_plot`) and share the same axis scaling as the offline PNGs.
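
Item 1 in code form, as a minimal sketch assuming the standard `st.cache_data` pattern; the `ttl` value and the `download_dataset` helper are illustrative assumptions, not necessarily what `app.py` does:

```python
import streamlit as st
from huggingface_hub import HfApi, snapshot_download

DATASET_ID = "raffaelkultyshev/mini_tug_tape_to_bowl"

@st.cache_data(ttl=300)  # re-check the Hub at most every five minutes
def get_dataset_revision() -> str:
    # repo_info returns the latest commit sha of the dataset repo.
    return HfApi().repo_info(DATASET_ID, repo_type="dataset").sha

@st.cache_data
def download_dataset(revision: str) -> str:
    # `revision` participates in the cache key, so a new sha on the Hub
    # invalidates this entry and triggers a fresh snapshot_download.
    return snapshot_download(DATASET_ID, repo_type="dataset", revision=revision)
```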
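
And a sketch of the pivot from item 2, restricted to the `x_cm` value column for brevity (the real `build_state_dataframe` also carries the other coordinates, and `n_frames` is assumed to come from the episode metadata):

```python
import pandas as pd

def build_state_dataframe(long_df: pd.DataFrame, n_frames: int) -> pd.DataFrame:
    # One column per joint, one row per frame.
    wide = long_df.pivot(index="frame_idx", columns="joint_name", values="x_cm")
    wide.columns = [f"{joint}_x_cm" for joint in wide.columns]
    # Reindex to the full frame range: frames with no joint rows become
    # NaN instead of silently disappearing from the charts.
    return wide.reindex(range(n_frames))
```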

### Accuracy Guarantees

- Streamlit charts display the exact values computed by `hand_pose_pipeline.py`. No down-sampling is performed.
- If gaps exist, the app surfaces the `NaN` segments directly (see the sketch below), keeping the measurement uncertainty transparent (<3 mm XYZ, <5° angles, as validated in `scripts/Requirements.md`).
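
For illustration, this is how Plotly leaves a visible gap at `NaN` values (a generic sketch, not code lifted from `app.py`):

```python
import numpy as np
import plotly.graph_objects as go

y = [1.0, 1.2, np.nan, np.nan, 1.1]  # two missing frames
# connectgaps=False (Plotly's default) breaks the line at the NaNs
# instead of interpolating across them.
fig = go.Figure(go.Scatter(y=y, mode="lines", connectgaps=False))
```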

### Deployment

- Upload with `huggingface_hub.HfApi.upload_folder`; note that a `repo_id` (the Space id) is required alongside `folder_path='hf_space_mini_tug'` and `repo_type='space'` (see the sketch below).
- Before deploying, either regenerate `dataset_cache/` by running the app locally or delete the cache outright so that stale data is not shipped.
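
A hedged deployment sketch: the Space `repo_id` is a placeholder, and the `ignore_patterns` argument is one way (an assumption, not a documented step) to keep the local cache out of the upload:

```python
from huggingface_hub import HfApi

api = HfApi()  # assumes a token from HF_TOKEN or `huggingface-cli login`
api.upload_folder(
    folder_path="hf_space_mini_tug",
    repo_id="<username>/<space-name>",  # placeholder: the actual Space id
    repo_type="space",
    ignore_patterns=["dataset_cache/*"],  # assumption: keep the local cache out
)
```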