---
license: mit
tags:
- content-packs
- robotics
- 3d-environments
- luckyrobots
---
# ContentVault
Content packs for [Lucky Robots](https://luckyrobots.com). Each pack contains a `metadata.yaml`, a `thumbnail.png`, and a `.zip` with the pack contents.
## Clone
```bash
git clone https://huggingface.co/luckyrobots/ContentVault
```
## Content Packs
| Pack | Description |
|------|-------------|
| Bungalow | Bungalow environment |
| Go2VelocityTracking | Unitree Go2 velocity tracking |
| Loft | Loft environment |
| MujocoExample | MuJoCo simulation example |
| Office | Office environment |
| Oscillator | Oscillator environment |
| Panda | Franka Panda robot |
| Piper | Piper robot |
| PiperBlockStacking | Piper block stacking task |
| PiperPickPlace | Piper pick and place task |
| PiperUnscrewCap | Piper cap unscrewing task |
| SO100PickAndPlace | SO-100 pick and place task |
| Skies-Vol-1 | Sky HDRIs volume 1 |
| UnitreeG1 | Unitree G1 humanoid |
| UnitreeGo2 | Unitree Go2 quadruped |
| Welcome | Welcome / intro pack |
## CDN
All content packs are served globally via Cloudflare CDN:
```
https://contentvault.luckyrobots.com/{PackName}/{FileName}
```
For example:
```
https://contentvault.luckyrobots.com/Loft/metadata.yaml
https://contentvault.luckyrobots.com/Loft/Loft.zip
```
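The URL pattern above can be expressed as a small helper. This is a minimal sketch; the `cdn_url` function name is hypothetical, but the URL layout matches the examples shown.

```python
# Minimal sketch: build a CDN URL for a file inside a content pack.
# The helper name `cdn_url` is illustrative, not part of any official API.
BASE = "https://contentvault.luckyrobots.com"

def cdn_url(pack: str, filename: str) -> str:
    """Return the public CDN URL for a file inside a content pack."""
    return f"{BASE}/{pack}/{filename}"

print(cdn_url("Loft", "metadata.yaml"))
# -> https://contentvault.luckyrobots.com/Loft/metadata.yaml
```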
## How sync works
A Cloudflare Worker (`worker/`) listens for webhooks from HuggingFace. When a push is made to this repo, the Worker:
1. Parses the commit diff to find changed files
2. Streams each changed file directly to Cloudflare R2 (no buffering, so large zip files are handled)
3. Deletes removed files from R2

R2 then serves the files through Cloudflare's CDN with a 30-day cache at 300+ edge locations worldwide.
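The diff-to-R2 mapping above can be sketched as a pure function. This is a hypothetical illustration of the decision the Worker makes, not its actual code; the function and field names are assumptions.

```python
# Hypothetical sketch: map a commit diff onto R2 operations.
# Files added or modified in the push become "put" operations (stream to R2);
# files removed from the repo become "delete" operations.
from typing import Iterable

def plan_sync(added_or_modified: Iterable[str], removed: Iterable[str]) -> list[dict]:
    """Return the list of R2 operations implied by a commit diff."""
    puts = [{"op": "put", "key": path} for path in added_or_modified]
    deletes = [{"op": "delete", "key": path} for path in removed]
    return puts + deletes

ops = plan_sync(["Loft/Loft.zip", "Loft/metadata.yaml"], ["OldPack/OldPack.zip"])
```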
## Worker setup
```bash
cd worker
npm install
```
### Secrets (set via wrangler)
```bash
npx wrangler secret put HF_WEBHOOK_SECRET
npx wrangler secret put HF_TOKEN
```
### Deploy
```bash
npx wrangler deploy
```
## Setup
```bash
# Install huggingface_hub with fast upload backend
pip install -U "huggingface_hub[hf_xet]"
# Login (requires write-access token from https://huggingface.co/settings/tokens)
hf auth login
```
---
## CLI Commands
### Upload entire folder
```bash
hf upload luckyrobots/ContentVault <local-folder> . --commit-message "Your message"
```
### Upload a single pack/subfolder
```bash
hf upload luckyrobots/ContentVault <local-folder>/<pack-name> <pack-name> --commit-message "Add <pack-name>"
```
### Upload a single file
```bash
hf upload luckyrobots/ContentVault <local-file> <path-in-repo> --commit-message "Add file"
```
### Upload large folder (resumable, multi-threaded, auto-retry)
```bash
hf upload-large-folder luckyrobots/ContentVault <local-folder>
```
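The same upload can be driven from a script. A side-effect-free sketch, assuming the CLI above is on `PATH`; `./packs` is a placeholder path and the actual invocation is left commented out.

```python
# Sketch: invoke the resumable large-folder upload from Python via subprocess.
# The argument list mirrors the `hf upload-large-folder` command above.
import subprocess

def build_upload_cmd(repo_id: str, local_folder: str) -> list[str]:
    """Build the argv list for a resumable large-folder upload."""
    return ["hf", "upload-large-folder", repo_id, local_folder]

cmd = build_upload_cmd("luckyrobots/ContentVault", "./packs")
# subprocess.run(cmd, check=True)  # resumable: rerun the same command if interrupted
```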
### Delete files remotely
```bash
hf upload luckyrobots/ContentVault . . --include="" --delete="<folder-name>/*" --commit-message "Remove <folder-name>"
```
---
## Tips
- **Resumable uploads**: `upload_large_folder` and `hf upload-large-folder` are resumable. If interrupted, run the same command again.
- **Don't use raw `git push`** for large files — use the CLI/API instead. They handle LFS/Xet automatically.
- **Write token required**: Tokens default to read-only. Enable write access at https://huggingface.co/settings/tokens.
- **50 GB max per file** on HuggingFace. Split larger files before uploading.
- **Ignore patterns**: Use `--include` and `--exclude` to filter what gets uploaded.
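The "split larger files" tip can be sketched as follows. This is an illustrative helper, not part of any HuggingFace tooling; the `.partNNN` naming is an assumption, and in practice the chunk size would be just under the 50 GB limit.

```python
# Sketch: split a file into fixed-size chunks (file.part000, file.part001, ...)
# so each piece stays under the per-file upload limit.
from pathlib import Path

def split_file(path: str, chunk_size: int) -> list[str]:
    """Write sequential chunks of `path` and return the part filenames."""
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            part = f"{path}.part{index:03d}"
            Path(part).write_bytes(chunk)
            parts.append(part)
            index += 1
    return parts
```

Reassembly on the other side is a straight concatenation of the parts in order (e.g. `cat file.part* > file`).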