David Woollard committed · Commit c01605a · Parent(s): 4144632

initial datacard w/ visual

Files changed: README.md (+91 −1), assets/visual.mp4 (+3 −0)

README.md CHANGED
---
pretty_name: Tracks
license: cc-by-nc-4.0
tags:
- computer-vision
- human-motion
- robotics
- trajectory
- pose-estimation
- navigation
- retail
task_categories:
- human-pose-estimation
- object-tracking
- motion-prediction
- reinforcement-learning
size_categories:
- 1M<n<100M
---

# Tracks Dataset

**Real Human Motion for Robotics Planning and Simulation**

---

## Overview

<video src="assets/visual.mp4"
       autoplay
       loop
       muted
       playsinline
       style="width:100%;height:auto;border-radius:8px;">
  Your browser does not support the video tag.
</video>

The **Tracks Dataset** captures continuous, real-world human movement in retail environments, providing one of the largest and most structured pose-based trajectory corpora available for **robotics** and **embodied AI** research.

Each record represents **3D pose sequences** sampled at 10 Hz in normalized store coordinates, enabling research in motion planning, human-aware navigation, and humanoid gait learning derived directly from real behavior.

---

## Key Specifications

| Field | Description |
|:------|:------------|
| **Source** | Anonymized in-store multi-camera captures (10 retail sites) |
| **Scope** | ≈ 60,000 hours of human trajectory data (plus a 1-hour evaluation subset) |
| **Format** | CSV schema, ROS 2–compatible via playback plug-in |
| **Sampling Frequency** | 10 Hz (10 FPS) |
| **Pose Structure** | 26 keypoints per person per frame (3D coordinates) |
| **Environment** | Real retail environments with normalized floor layouts |
| **Evaluation Subset** | One-hour segment including trajectories and store layout |
| **Key Metrics** | ≈ 2.3 M unique shoppers |
| **Anonymization** | Face and body suppression; coordinate-only representation |
| **Governance** | Managed under Standard AI's data governance policies, aligned with GDPR/CCPA and Responsible AI principles |

---
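
The specifications above describe a CSV schema with 26 three-dimensional keypoints per person per frame. As a minimal sketch of working with such data (the column names here are assumptions for illustration; the actual names come from the dataset's schema documentation), one frame for one person can be reshaped into a pose array:

```python
import numpy as np
import pandas as pd

# Hypothetical schema: one row per (person, frame, keypoint) with
# normalized 3D coordinates. A full frame would carry 26 keypoints;
# this toy example uses 3 for brevity.
df = pd.DataFrame({
    "person_id": [0, 0, 0],
    "frame":     [0, 0, 0],
    "keypoint":  [0, 1, 2],
    "x": [0.12, 0.13, 0.11],
    "y": [0.45, 0.46, 0.44],
    "z": [1.02, 1.30, 1.55],
})

# At 10 Hz, the frame index converts to seconds by dividing by 10.
df["t_sec"] = df["frame"] / 10.0

# Stack one person-frame into an (n_keypoints, 3) pose array.
pose = df.sort_values("keypoint")[["x", "y", "z"]].to_numpy()
print(pose.shape)  # (3, 3) here; (26, 3) for a real 26-keypoint frame
```

In practice you would load the distributed CSV files with `pd.read_csv` and group by person and frame before reshaping.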

## Integration & Applications

- Distributed as **CSV** with schema documentation and import notebooks.
- Ready for **ROS 2** integration for **path planning** and **human–robot interaction** simulation.
- Compatible with **Python**, **PyTorch**, and standard **reinforcement-learning** frameworks.

### Example Research Uses

- Motion prediction and trajectory planning
- Reinforcement learning for humanoid gait and control
- Human-aware navigation and avoidance behavior
- Simulation of human–robot interaction environments

---
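
For motion prediction at the dataset's 10 Hz sampling rate, a common preprocessing step is slicing each trajectory into observation/future windows. A minimal sketch on synthetic data (the window lengths are illustrative choices, not part of the dataset):

```python
import numpy as np

def make_windows(traj, obs_len=20, pred_len=10):
    """Slice a trajectory sampled at 10 Hz into (observation, target) pairs:
    with these defaults, 2 s of observed motion -> 1 s of future motion."""
    pairs = []
    total = obs_len + pred_len
    for start in range(len(traj) - total + 1):
        obs = traj[start:start + obs_len]
        fut = traj[start + obs_len:start + total]
        pairs.append((obs, fut))
    return pairs

# Synthetic stand-in: 50 frames (5 s at 10 Hz) of one keypoint's (x, y, z).
traj = np.linspace(0.0, 1.0, 50 * 3).reshape(50, 3)
pairs = make_windows(traj)
print(len(pairs), pairs[0][0].shape, pairs[0][1].shape)
```

Each `(obs, fut)` pair can then feed a standard PyTorch `Dataset` for training sequence models.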

## Access

The **Tracks Dataset** is available now for evaluation and licensing.

- **Evaluation subset:** 1-hour sample under a 30-day Evaluation Agreement (private Hugging Face repo).
- **Full dataset:** 60,000-hour commercial dataset, available by request.

For inquiries or licensing:

> ✉️ [[email protected]](mailto:[email protected])

---

## Citation

```bibtex
@dataset{standardlabs_tracks_2025,
  title     = {Tracks Dataset: Real Human Motion for Robotics Planning and Simulation},
  author    = {Standard Labs},
  year      = {2025},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/datasets/standard-labs/tracks}
}
```

assets/visual.mp4 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:83499751763bf4ab47320c4fc588b416374e7ccb0e95a7e23d4fbece3107cd0f
size 28121845