SEVAR: a stereo event camera dataset for virtual and augmented reality

  • Correspondence
  • Published in Frontiers of Information Technology & Electronic Engineering

Conclusions

In this paper, we present a precisely synchronized event-based dataset designed for multi-sensor fusion in SLAM applications, with a particular emphasis on VR and AR scenarios. In addition to the commonly used stereo standard cameras and an IMU, we integrate stereo event cameras. The recorded sequences imitate real-life scenarios and include challenging conditions such as low light and fast motion. We hope that this dataset will serve as a valuable resource for advancing research on event-based multi-sensor fusion algorithms.
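
Because all sensors are synchronized against a shared hardware clock, a fusion front end can consume the recordings as a single time-ordered stream. The following is a minimal, hypothetical sketch of that idea; the stream names and record layouts are placeholders for illustration and are not the dataset's actual API.

import heapq
from typing import Iterable, Iterator, Tuple

# (timestamp in seconds, sensor name, payload) -- hypothetical record layout
Sample = Tuple[float, str, object]

def merge_streams(*streams: Iterable[Sample]) -> Iterator[Sample]:
    # Each input stream must already be sorted by timestamp; with a common
    # hardware clock, measurements can simply be interleaved in time order.
    return heapq.merge(*streams, key=lambda s: s[0])

# Example with dummy data: 1000 Hz IMU, 30 Hz frames, asynchronous events.
imu    = ((i * 1e-3, "imu",   None) for i in range(5))
frames = ((i / 30.0, "frame", None) for i in range(2))
events = ((t, "event", None) for t in (0.0004, 0.0012, 0.0031))

for ts, sensor, _ in merge_streams(imu, frames, events):
    print(f"{ts:.4f}  {sensor}")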

Abstract

In recent years, event cameras have attracted increasing attention for their low latency, high dynamic range, and high temporal resolution. These characteristics make them particularly well suited to virtual and augmented reality (VR/AR) applications. To promote the development of 3D perception and localization algorithms using event cameras in VR/AR applications, we introduce the stereo event camera dataset for virtual and augmented reality scenarios (SEVAR). The dataset is recorded with a head-mounted device and covers several common indoor sequences, including scenarios that are challenging for event cameras, such as fast motion and high dynamic range. We release the first set of perception and localization sequences for VR/AR scenarios, captured by a stereo event camera, a 30 Hz stereo standard camera, and a 1000 Hz inertial measurement unit. The placement, field of view, and resolution of the cameras are similar to those of a commercial head-mounted device such as the Meta Quest Pro. All sensors are time-synchronized in hardware. To facilitate evaluation of localization accuracy and trajectories, ground-truth poses captured by a motion capture system are provided. The dataset is available at https://github.com/sevar-dataset/sevar.
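
The ground-truth poses from the motion capture system are intended for trajectory evaluation. As an illustration only (not taken from the paper or the repository), the sketch below computes the root-mean-square absolute trajectory error (ATE) after a rigid alignment, assuming both the estimate and the ground truth are stored as TUM-format text files (timestamp tx ty tz qx qy qz qw); the file names estimate.txt and groundtruth.txt are hypothetical placeholders, and the actual file layout in the repository may differ.

import numpy as np

def load_tum(path):
    # Each row: timestamp, position (3), orientation quaternion (4).
    data = np.loadtxt(path)
    return data[:, 0], data[:, 1:4]

def associate(t_est, t_gt, max_dt=0.02):
    # Match each estimated timestamp to the nearest ground-truth timestamp.
    idx = np.searchsorted(t_gt, t_est)
    idx = np.clip(idx, 1, len(t_gt) - 1)
    prev_closer = np.abs(t_gt[idx - 1] - t_est) < np.abs(t_gt[idx] - t_est)
    idx[prev_closer] -= 1
    valid = np.abs(t_gt[idx] - t_est) < max_dt
    return np.nonzero(valid)[0], idx[valid]

def umeyama_align(src, dst):
    # Closed-form rigid alignment of src onto dst (scale fixed to 1).
    mu_s, mu_d = src.mean(0), dst.mean(0)
    cov = (dst - mu_d).T @ (src - mu_s) / len(src)
    U, _, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        S[2, 2] = -1
    R = U @ S @ Vt
    t = mu_d - R @ mu_s
    return R, t

def ate_rmse(est_file, gt_file):
    t_est, p_est = load_tum(est_file)
    t_gt, p_gt = load_tum(gt_file)
    i_est, i_gt = associate(t_est, t_gt)
    R, t = umeyama_align(p_est[i_est], p_gt[i_gt])
    err = (p_est[i_est] @ R.T + t) - p_gt[i_gt]
    return np.sqrt((err ** 2).sum(1).mean())

# Example usage: print(ate_rmse("estimate.txt", "groundtruth.txt"))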

Data availability

The dataset can be found at https://github.com/sevar-dataset/sevar.

Author information

Contributions

Yuda DONG designed the research. Yuda DONG, Junchi FENG, and Yinong CAO processed the data. Zichao SHU contributed to hand–eye calibration. Yuda DONG and Zetao CHEN drafted the paper. Xin HE, Jianyu WANG, and Lijun LI helped organize the paper. Yuda DONG, Chunlai LI, and Shijie LIU revised and finalized the paper. Xin HE provided research funding.

Corresponding authors

Correspondence to Zetao Chen (陈泽涛) or Xin He (何欣).

Ethics declarations

All the authors declare that they have no conflict of interest.

Additional information

Project supported by the Zhejiang Provincial Natural Science Foundation of China (No. 2023C03012), the Postdoctoral Preferential Funding Project of Zhejiang Province, China (No. ZJ2022116), and the Independent Project of Hangzhou Institute for Advanced Study, China (No. B02006C019014).

About this article

Cite this article

Dong, Y., Chen, Z., He, X. et al. SEVAR: a stereo event camera dataset for virtual and augmented reality. Front Inform Technol Electron Eng 25, 755–762 (2024). https://doi.org/10.1631/FITEE.2400011

Keywords