ContentVault

Content packs for Lucky Robots. Each pack contains a metadata.yaml, a thumbnail.png, and a .zip with the pack contents.
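
Before uploading a new pack, it can help to sanity-check the folder layout. The sketch below is a hypothetical helper (not part of any published tooling), assuming a local pack folder mirrors the structure described above: a metadata.yaml, a thumbnail.png, and a .zip.

```python
from pathlib import Path

REQUIRED = ("metadata.yaml", "thumbnail.png")

def validate_pack(pack_dir):
    """Return the list of expected files missing from a local pack folder:
    metadata.yaml, thumbnail.png, and at least one .zip. An empty list
    means the pack looks complete."""
    pack = Path(pack_dir)
    missing = [name for name in REQUIRED if not (pack / name).is_file()]
    if not any(pack.glob("*.zip")):
        missing.append("<pack>.zip")
    return missing
```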

Content Packs

| Pack | Description |
|------|-------------|
| ExamplePack | Example content pack |
| Go1VelocityTracking | Unitree Go1 velocity tracking |
| MujocoExample | MuJoCo simulation example |
| Oscillator | Oscillator environment |
| Panda | Franka Panda robot |
| Piper | Piper robot |
| Piper-Pick-Place | Piper pick and place task |
| PiperUnscrewCap | Piper cap unscrewing task |
| Skies-Vol-1 | Sky HDRIs volume 1 |
| TheBungalow | Bungalow environment |
| TheLoft | Loft environment |
| TheOffice | Office environment |
| UnitreeG1 | Unitree G1 humanoid |
| UnitreeGo1 | Unitree Go1 quadruped |
| Welcome | Welcome / intro pack |

CDN

All content packs are served globally via Cloudflare CDN:

https://contentvault.luckyrobots.com/{PackName}/{filename}

For example:

https://contentvault.luckyrobots.com/ExamplePack/metadata.yaml
https://contentvault.luckyrobots.com/TheLoft/TheLoft.zip
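
For scripting against the CDN, the URL pattern can be wrapped in a tiny helper; `pack_url` is a hypothetical name, not part of any published client.

```python
CDN_BASE = "https://contentvault.luckyrobots.com"

def pack_url(pack, filename):
    """Build the CDN URL for a file inside a content pack."""
    return f"{CDN_BASE}/{pack}/{filename}"
```

For example, `pack_url("TheLoft", "TheLoft.zip")` yields the second URL above.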

How sync works

A Cloudflare Worker (worker/) listens for webhooks from HuggingFace. When a push is made to the HuggingFace dataset repo, the Worker:

  1. Lists all files in the repo via the HuggingFace API
  2. Streams each file directly to Cloudflare R2 (no buffering, handles large zip files)
  3. R2 then serves the files through Cloudflare's CDN, cached for 30 days at 300+ edge locations worldwide
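
The Worker itself lives in worker/, but the same list-and-stream pattern can be sketched in Python for a one-off manual mirror. This is a stdlib-only sketch, not the Worker's code: `mirror` is a hypothetical helper that streams to a local folder instead of R2, and the `resolve/main` path is the standard HuggingFace download URL for dataset files.

```python
import shutil
import urllib.request
from pathlib import Path

def resolve_url(repo_id, filename):
    """Download URL for a file in a HuggingFace dataset repo."""
    return f"https://huggingface.co/datasets/{repo_id}/resolve/main/{filename}"

def mirror(repo_id, filenames, dest):
    """Stream each repo file to dest/ in chunks, never buffering a whole
    file in memory -- the same pattern the Worker uses when writing to R2."""
    for name in filenames:
        target = Path(dest) / name
        target.parent.mkdir(parents=True, exist_ok=True)
        with urllib.request.urlopen(resolve_url(repo_id, name)) as resp, \
             open(target, "wb") as out:
            shutil.copyfileobj(resp, out)  # chunked copy, constant memory
```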

Worker setup

cd worker
npm install

Secrets (set via wrangler)

npx wrangler secret put HF_WEBHOOK_SECRET
npx wrangler secret put HF_TOKEN

Deploy

npx wrangler deploy

Commands

Setup

# Install huggingface_hub with fast upload backend
pip install -U "huggingface_hub[hf_xet]"

# Login (requires write-access token from https://huggingface.co/settings/tokens)
hf auth login

CLI Commands

Upload entire folder

hf upload <repo-id> <local-folder> . --repo-type dataset --commit-message "Your message"

Upload a single pack/subfolder

hf upload <repo-id> <local-folder>/<pack-name> <pack-name> --repo-type dataset --commit-message "Add <pack-name>"

Upload a single file

hf upload <repo-id> <local-file> <path-in-repo> --repo-type dataset --commit-message "Add file"

Upload large folder (resumable, multi-threaded, auto-retry)

hf upload-large-folder <repo-id> <local-folder> --repo-type=dataset

Delete files remotely (without re-uploading everything)

hf upload <repo-id> . . --repo-type dataset --include="" --delete="<folder-name>/*" --commit-message "Remove <folder-name>"

Create a new dataset repo

hf repo create <repo-name> --type dataset

Create under an organization

hf repo create <repo-name> --type dataset --organization <org-name>

Python One-Liners

Upload large folder (resumable)

python -c "from huggingface_hub import HfApi; HfApi().upload_large_folder(repo_id='<repo-id>', repo_type='dataset', folder_path='<local-folder>')"

Upload a single folder/pack

python -c "from huggingface_hub import HfApi; HfApi().upload_folder(folder_path='<local-folder>', path_in_repo='<path-in-repo>', repo_id='<repo-id>', repo_type='dataset', commit_message='Your message')"

Upload a single file

python -c "from huggingface_hub import HfApi; HfApi().upload_file(path_or_fileobj='<local-file>', path_in_repo='<path-in-repo>', repo_id='<repo-id>', repo_type='dataset', commit_message='Your message')"

Delete a folder

python -c "from huggingface_hub import HfApi; HfApi().delete_folder('<folder-name>', repo_id='<repo-id>', repo_type='dataset', commit_message='Remove <folder-name>')"

Delete a single file

python -c "from huggingface_hub import HfApi; HfApi().delete_file('<file-path>', repo_id='<repo-id>', repo_type='dataset', commit_message='Remove <file-path>')"

Create a repo

python -c "from huggingface_hub import HfApi; HfApi().create_repo(repo_id='<repo-id>', repo_type='dataset')"

Python Script Examples

Upload large folder (full script with options)

from huggingface_hub import HfApi

api = HfApi()
api.upload_large_folder(
    repo_id="<repo-id>",
    repo_type="dataset",
    folder_path="<local-folder>",
)

Upload a folder to a specific path in the repo

from huggingface_hub import HfApi

api = HfApi()
api.upload_folder(
    folder_path="<local-folder>",
    path_in_repo="<path-in-repo>",
    repo_id="<repo-id>",
    repo_type="dataset",
    commit_message="Your message",
    ignore_patterns=["*.cache", ".git/*"],
)

Batch delete + upload in one commit

from huggingface_hub import HfApi, CommitOperationAdd, CommitOperationDelete

api = HfApi()
operations = [
    CommitOperationDelete(path_in_repo="<old-folder>/"),
    CommitOperationAdd(path_in_repo="<file-path>", path_or_fileobj="<local-file>"),
]
api.create_commit(
    repo_id="<repo-id>",
    repo_type="dataset",
    operations=operations,
    commit_message="Your message",
)

Tips

  • Resumable uploads: upload_large_folder and hf upload-large-folder are resumable. If interrupted, run the same command again.
  • Don't use raw git push for large files -- use the CLI/API instead. They handle LFS/Xet automatically.
  • Write token required: Tokens default to read-only. Enable write access at https://huggingface.co/settings/tokens.
  • 50 GB max per file on HuggingFace. Split larger files before uploading.
  • Ignore patterns: Use --include and --exclude (CLI) or ignore_patterns (Python) to filter what gets uploaded.
  • Reset upload state: If something goes wrong with upload_large_folder, delete the .cache/huggingface/ folder inside your local folder and rerun.
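
For the 50 GB per-file limit, a minimal splitting sketch (hypothetical helper; after download, the parts can be reassembled with `cat file.part* > file`):

```python
from pathlib import Path

CHUNK = 64 * 1024 * 1024      # read in 64 MiB pieces
PART_SIZE = 45 * 1024 ** 3    # stay safely under the 50 GB per-file limit

def split_file(path, part_size=PART_SIZE):
    """Split a file into path.part000, path.part001, ... of at most
    part_size bytes each, and return the list of part paths."""
    src = Path(path)
    parts = []
    with open(src, "rb") as f:
        i = 0
        while True:
            written = 0
            part = src.with_name(f"{src.name}.part{i:03d}")
            with open(part, "wb") as out:
                while written < part_size:
                    chunk = f.read(min(CHUNK, part_size - written))
                    if not chunk:
                        break
                    out.write(chunk)
                    written += len(chunk)
            if written == 0:
                part.unlink()  # last read hit EOF; drop the empty part
                break
            parts.append(part)
            i += 1
    return parts
```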