Out of the Room: Generalizing Event-Based Dynamic Motion Segmentation for Complex Scenes

Bibliographic Details
Title: Out of the Room: Generalizing Event-Based Dynamic Motion Segmentation for Complex Scenes
Authors: Georgoulis, Stamatios, Ren, Weining, Bochicchio, Alfredo, Eckert, Daniel, Li, Yuanyou, Gawel, Abel
Publication Year: 2024
Collection: Computer Science
Subject Terms: Computer Science - Computer Vision and Pattern Recognition
Description: Rapid and reliable identification of dynamic scene parts, also known as motion segmentation, is a key challenge for mobile sensors. Contemporary RGB camera-based methods rely on modeling camera and scene properties; however, they are often under-constrained and fall short in unknown categories. Event cameras have the potential to overcome these limitations, but corresponding methods have only been demonstrated in smaller-scale indoor environments with simplified dynamic objects. This work presents an event-based method for class-agnostic motion segmentation that can also be deployed successfully in complex, large-scale outdoor environments. To this end, we introduce a novel divide-and-conquer pipeline that combines: (a) ego-motion-compensated events, computed via a scene understanding module that predicts monocular depth and camera pose as auxiliary tasks, and (b) optical flow from a dedicated optical flow module. These intermediate representations are then fed into a segmentation module that predicts motion segmentation masks. A novel transformer-based temporal attention module in the segmentation module builds correlations across adjacent 'frames' to obtain temporally consistent segmentation masks. Our method sets a new state of the art on the classic EV-IMO benchmark (indoors), where we achieve improvements of 2.19 moving object IoU (2.22 mIoU) and 4.52 point IoU, as well as on a newly generated motion segmentation and tracking benchmark (outdoors) based on the DSEC event dataset, termed DSEC-MOTS, where we show an improvement of 12.91 moving object IoU. (An illustrative pipeline sketch is provided after this record.)
Comment: 3DV 2024; the first two authors contributed equally
Document Type: Working Paper
Open Access: http://arxiv.org/abs/2403.04562
Accession Number: edsarx.2403.04562
Database: arXiv
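
The description above outlines a three-part pipeline: a scene-understanding module (monocular depth and camera pose, used to ego-motion-compensate the events), a dedicated optical-flow module, and a segmentation module with transformer-based temporal attention across adjacent 'frames'. Below is a minimal, illustrative PyTorch sketch of how such a pipeline could be wired together. It is not the authors' implementation; all module names, channel counts, tensor shapes, and the placeholder ego-motion compensation step are assumptions made purely for illustration.

```python
# Illustrative sketch (not the authors' code) of the divide-and-conquer pipeline
# described in the abstract: a scene-understanding module predicts monocular depth
# and camera pose (auxiliary tasks) for ego-motion compensation, a flow module
# predicts optical flow, and a segmentation module with transformer-style temporal
# attention fuses both into per-"frame" motion masks. All shapes are assumptions.

import torch
import torch.nn as nn


class SceneUnderstandingModule(nn.Module):
    """Predicts monocular depth and 6-DoF camera pose from an event representation."""

    def __init__(self, in_ch: int = 5, feat: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, feat, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(),
        )
        self.depth_head = nn.Conv2d(feat, 1, 3, padding=1)  # per-pixel depth
        self.pose_head = nn.Linear(feat, 6)                 # 6-DoF camera pose

    def forward(self, events: torch.Tensor):
        f = self.encoder(events)
        depth = self.depth_head(f)
        pose = self.pose_head(f.mean(dim=(2, 3)))           # global pooling -> pose
        return depth, pose


class FlowModule(nn.Module):
    """Dedicated optical-flow module producing a 2-channel flow field."""

    def __init__(self, in_ch: int = 5, feat: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, feat, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat, 2, 3, padding=1),
        )

    def forward(self, events: torch.Tensor) -> torch.Tensor:
        return self.net(events)


class TemporalAttentionSegmenter(nn.Module):
    """Fuses compensated events and flow, attends over adjacent 'frames' for
    temporal consistency, and predicts a per-frame motion mask."""

    def __init__(self, in_ch: int = 5 + 2, feat: int = 32, heads: int = 4):
        super().__init__()
        self.fuse = nn.Conv2d(in_ch, feat, 3, padding=1)
        self.attn = nn.MultiheadAttention(feat, heads, batch_first=True)
        self.mask_head = nn.Conv2d(feat, 1, 3, padding=1)

    def forward(self, comp_events: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
        # comp_events: (B, T, C, H, W), flow: (B, T, 2, H, W)
        b, t, _, h, w = comp_events.shape
        x = torch.cat([comp_events, flow], dim=2).flatten(0, 1)   # (B*T, C+2, H, W)
        f = self.fuse(x).view(b, t, -1, h, w)
        # Attend over the temporal axis independently at every spatial location.
        tokens = f.permute(0, 3, 4, 1, 2).reshape(b * h * w, t, -1)
        tokens, _ = self.attn(tokens, tokens, tokens)
        f = tokens.reshape(b, h, w, t, -1).permute(0, 3, 4, 1, 2)
        return torch.sigmoid(self.mask_head(f.flatten(0, 1))).view(b, t, 1, h, w)


if __name__ == "__main__":
    B, T, C, H, W = 1, 3, 5, 32, 32                 # tiny toy sizes for a smoke test
    events = torch.randn(B, T, C, H, W)
    scene, flow_net = SceneUnderstandingModule(C), FlowModule(C)
    seg = TemporalAttentionSegmenter()
    flat = events.flatten(0, 1)
    depth, pose = scene(flat)                       # would drive ego-motion compensation
    comp_events = events                            # placeholder: event warping omitted
    flow = flow_net(flat).view(B, T, 2, H, W)
    masks = seg(comp_events, flow)
    print(masks.shape)                              # torch.Size([1, 3, 1, 32, 32])
```

The sketch reflects the divide-and-conquer idea only at the interface level: depth, pose, and flow are produced by separate modules and only their outputs reach the segmentation stage, where attention runs along the temporal axis so that masks for adjacent 'frames' stay consistent.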