somehand maps human hand motion to configurable robot dexterous-hand models — from webcam input all the way to real hardware control.
| Source | Description |
|---|---|
| MediaPipe webcam | Live hand tracking from a camera |
| MediaPipe video | Offline tracking from a video file |
| PICO VR | Live hand tracking via PICO Bridge |
| hc_mocap UDP | Live hand data over UDP |
| Saved recordings | Replay `.pkl` recordings from any source above |
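The table above mentions that saved recordings are plain `.pkl` files. As a minimal sketch of what replay could look like (the per-frame schema here is an assumption for illustration, not somehand's documented format):

```python
import pathlib
import pickle

def load_recording(path):
    """Load a saved hand-tracking recording.

    Assumes the recording is a pickled list of per-frame dicts;
    the actual somehand payload schema may differ.
    """
    with open(path, "rb") as f:
        return pickle.load(f)

# Hypothetical example: write and replay a tiny two-frame recording.
frames = [
    {"t": 0.00, "joints": [0.0] * 21},  # 21 landmarks, MediaPipe-style
    {"t": 0.05, "joints": [0.1] * 21},
]
path = pathlib.Path("demo_recording.pkl")
path.write_bytes(pickle.dumps(frames))

for frame in load_recording(path):
    print(f"t={frame['t']:.2f}s, {len(frame['joints'])} joint values")
```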
| Backend | Description |
|---|---|
| viewer | MuJoCo visualization (single or bi-hand) |
| sim | MuJoCo simulation with physics |
| real | Real robot-hand hardware control (single-hand only) |
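The three backends can be viewed as interchangeable sinks for retargeted joint targets: the same stream of joint values goes to visualization, physics simulation, or hardware. A minimal sketch of that dispatch pattern, with all function names hypothetical rather than somehand's actual API:

```python
from typing import Callable, Dict, List

def run_viewer(q: List[float]) -> str:
    # Would push joint angles to a MuJoCo visualization (no physics).
    return f"viewer: posed {len(q)} joints"

def run_sim(q: List[float]) -> str:
    # Would step a MuJoCo physics simulation toward the targets.
    return f"sim: stepped toward {len(q)} targets"

def run_real(q: List[float]) -> str:
    # Would send joint commands to single-hand hardware via its SDK.
    return f"real: commanded {len(q)} joints"

BACKENDS: Dict[str, Callable[[List[float]], str]] = {
    "viewer": run_viewer,
    "sim": run_sim,
    "real": run_real,
}

def dispatch(backend: str, q: List[float]) -> str:
    """Route one frame of joint targets to the selected backend."""
    if backend not in BACKENDS:
        raise ValueError(f"unknown backend {backend!r}")
    return BACKENDS[backend](q)

print(dispatch("viewer", [0.0] * 16))  # → viewer: posed 16 joints
```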
| Doc | What it covers |
|---|---|
| Getting Started | Install, asset download, optional SDK setup, first run |
| Runtime Modes | What each CLI mode does, when to use it, all options |
| Configuration | YAML config layout, schema, and which file to edit |
| Assets & Models | Asset groups, external repos, 20+ supported hand models |
| Troubleshooting | Common setup and runtime issues with fixes |
| Maintainer Guide | Update workflow, verification, maintenance rules |
This repository is optimized for:
- Single-hand retargeting with configurable YAML models
- Bi-hand visualization and replay in `viewer` mode
- Asset-light source control — large runtime assets hosted externally
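The "configurable YAML models" bullet can be pictured with a short config fragment. Every key below is a hypothetical illustration of the idea, not somehand's actual schema; see the Configuration doc for the real layout:

```yaml
# Hypothetical single-hand model entry — not the actual somehand schema.
hand:
  name: example_hand
  model: assets/hands/example_hand.xml   # runtime asset hosted externally
  joints: 16
  retarget:
    method: position        # assumed retargeting method name
    scale: 1.0
```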
Read Runtime Modes before assuming a specific input + backend combination is supported.