# TimeLens-Bench
📑 Paper | 💻 Code | 🏠 Project Page | 🤗 Model & Data | 🏆 TimeLens-Bench Leaderboard
## ✨ Dataset Description
TimeLens-Bench is a comprehensive, high-quality evaluation benchmark for video temporal grounding, proposed in our paper TimeLens: Rethinking Video Temporal Grounding with Multimodal LLMs.
During annotation, we identified critical quality issues in the existing datasets and performed extensive manual corrections. Model rankings on TimeLens-Bench differ dramatically from those on the legacy benchmarks, indicating that TimeLens-Bench provides a more reliable evaluation of video temporal grounding. See our paper and project page for more details.

## 📊 Dataset Statistics
The benchmark consists of manually refined versions of three widely used evaluation datasets for video temporal grounding:
| Refined Dataset | # Videos | Avg. Duration (s) | # Annotations | Source Dataset | Source Dataset Link |
|---|---|---|---|---|---|
| Charades-TimeLens | 1313 | 29.6 | 3363 | Charades-STA | https://github.com/jiyanggao/TALL |
| ActivityNet-TimeLens | 1455* | 134.9 | 4500 | ActivityNet-Captions | https://cs.stanford.edu/people/ranjaykrishna/densevid/ |
| QVHighlights-TimeLens | 1511 | 149.6 | 1541 | QVHighlights | https://github.com/jayleicn/moment_detr |
\* To reduce the high evaluation cost of the excessively large ActivityNet-Captions dataset, we uniformly sampled videos across duration bins when curating ActivityNet-TimeLens (a sketch of this sampling scheme follows below).
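
For intuition, here is a minimal sketch of duration-binned uniform sampling. It is our own illustration, not the actual curation script; the bin width, per-bin quota, and all names are assumptions.

```python
import random
from collections import defaultdict

def sample_across_duration_bins(videos, bin_width=60.0, per_bin=100, seed=0):
    """Uniformly sample videos across duration bins (illustrative only).

    `videos` is a list of (video_id, duration_seconds) pairs; the bin width
    and per-bin quota are hypothetical, not the values used for the benchmark.
    """
    rng = random.Random(seed)
    bins = defaultdict(list)
    for vid, dur in videos:
        bins[int(dur // bin_width)].append(vid)  # assign each video to a duration bin
    sampled = []
    for _, members in sorted(bins.items()):
        rng.shuffle(members)               # shuffle within each bin
        sampled.extend(members[:per_bin])  # take the same quota from every bin
    return sampled
```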
## 🚀 Usage
To download and use the benchmark for evaluation, please refer to the instructions in our GitHub Repository.
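
If you just need the raw files, the sketch below shows one way to fetch them with the `huggingface_hub` client. The `repo_id` is a placeholder for this dataset's actual id, and the GitHub repository remains the authoritative source for the full evaluation pipeline.

```python
# Minimal sketch: download the benchmark files from the Hugging Face Hub.
# The repo_id below is a placeholder — replace it with this dataset's id.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="<org>/TimeLens-Bench",  # placeholder, not verified
    repo_type="dataset",
)
print(f"Benchmark files downloaded to: {local_path}")
```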
## 📝 Citation
If you find our work helpful for your research and applications, please cite our paper:
```bibtex
@article{zhang2025timelens,
  title={TimeLens: Rethinking Video Temporal Grounding with Multimodal LLMs},
  author={Zhang, Jun and Wang, Teng and Ge, Yuying and Ge, Yixiao and Li, Xinhao and Shan, Ying and Wang, Limin},
  journal={arXiv preprint arXiv:2512.14698},
  year={2025}
}
```