Accurate depth information is essential for many computer vision applications. Yet, no available dataset recording method allows for fully dense, accurate depth estimation in large-scale dynamic environments. In this paper, we introduce DOC-Depth, a novel, efficient and easy-to-deploy approach for dense depth generation from any LiDAR sensor. After reconstructing a consistent, dense 3D environment using LiDAR odometry, we address occlusions caused by dynamic objects automatically thanks to DOC, our state-of-the-art dynamic object classification method. Additionally, DOC-Depth is fast and scalable, allowing for the creation of datasets unbounded in size and duration. We demonstrate the effectiveness of our approach on the KITTI dataset, improving its density from 16.1% to 71.2%, and release this new fully dense depth annotation to facilitate future research in the domain. We also showcase results using various LiDAR sensors and in multiple environments.
@article{deMoreau2024doc,
  title   = {DOC-Depth: A novel approach for dense depth ground truth generation},
  author  = {De Moreau, Simon and Corsia, Mathias and Bouchiba, Hassan and Almehio, Yasser and Bursuc, Andrei and El-Idrissi, Hafid and Moutarde, Fabien},
  journal = {IEEE Intelligent Vehicles Symposium},
  year    = {2025},
}
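For context on the density numbers above, the following is a minimal, illustrative sketch (not the released DOC-Depth code) of how an aggregated LiDAR point cloud, already expressed in the camera frame, can be rasterized into a depth map with a simple z-buffer, and how pixel density (the 16.1% vs. 71.2% figure) can be measured. The function and variable names (render_depth, points_cam, K) are assumptions for illustration; the actual method additionally handles dynamic-object occlusions via DOC.

```python
import numpy as np

def render_depth(points_cam: np.ndarray, K: np.ndarray, H: int, W: int) -> np.ndarray:
    """Rasterize a point cloud into a depth map.

    points_cam: (N, 3) points in camera coordinates (x right, y down, z forward).
    K:          (3, 3) pinhole intrinsics.
    Returns an (H, W) depth map where 0 means "no measurement".
    """
    pts = points_cam[points_cam[:, 2] > 0.1]          # discard points behind the camera
    proj = (K @ pts.T).T                              # pinhole projection
    u = np.round(proj[:, 0] / proj[:, 2]).astype(int)
    v = np.round(proj[:, 1] / proj[:, 2]).astype(int)
    z = pts[:, 2]
    ok = (u >= 0) & (u < W) & (v >= 0) & (v < H)      # keep points falling inside the image
    u, v, z = u[ok], v[ok], z[ok]

    depth = np.zeros((H, W))
    order = np.argsort(-z)                            # write far points first ...
    depth[v[order], u[order]] = z[order]              # ... so nearer points overwrite them (z-buffer)
    return depth

def density(depth: np.ndarray) -> float:
    """Fraction of pixels carrying a valid depth value."""
    return float((depth > 0).mean())
```

With a single LiDAR sweep the resulting density stays low; aggregating many registered sweeps before projection is what raises the pixel coverage, at the cost of having to detect and remove dynamic objects that would otherwise leave trails in the map.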
2024
LED: Light Enhanced Depth Estimation at Night
Simon De Moreau, Yasser Almehio, Andrei Bursuc, and 3 more authors
Nighttime camera-based depth estimation is a highly challenging task, especially for autonomous driving applications, where accurate depth perception is essential for safe navigation. We aim to improve the reliability of perception systems at night, where models trained on daytime data often fail in the absence of precise but costly LiDAR sensors. In this work, we introduce Light Enhanced Depth (LED), a novel, cost-effective approach that significantly improves depth estimation in low-light environments by harnessing a pattern projected by the high-definition headlights available in modern vehicles. LED leads to significant performance boosts across multiple depth-estimation architectures (encoder-decoder, Adabins, DepthFormer) on both synthetic and real datasets. Furthermore, improved performance beyond the illuminated areas reveals a holistic enhancement in scene understanding. Finally, we release the Nighttime Synthetic Drive Dataset, a new synthetic and photo-realistic nighttime dataset comprising 49,990 comprehensively annotated images.
@article{deMoreau2024led,
  title   = {LED: Light Enhanced Depth Estimation at Night},
  author  = {De Moreau, Simon and Almehio, Yasser and Bursuc, Andrei and El-Idrissi, Hafid and Stanciulescu, Bogdan and Moutarde, Fabien},
  journal = {arXiv preprint arXiv:2409.08031},
  year    = {2024},
}
2021
Development of agricultural robot platform with virtual laboratory capabilities
German Monsalve, Oriane Thiery, Simon De Moreau, and 1 more author
In IECON 2021–47th Annual Conference of the IEEE Industrial Electronics Society, 2021
Agricultural robots are called upon to help with many tasks in emerging clean and sustainable agriculture. These complex electro-mechanical systems can integrate artificial intelligence (AI), the Internet of Things (IoT), sensors, actuators, and advanced control methods to accomplish functions autonomously or collaboratively. Before deploying such techniques in the field, it is convenient to carry out laboratory validations. These can be performed at the sub-system level, e.g., sensor or servo operation, or at the whole-system level. This paper proposes the development of the hardware and software parts of an agricultural robot platform. The proposed system, strongly motivated by the restrictions imposed by the COVID-19 context, enables the virtualization of laboratory tests while keeping real-time functionality.
@inproceedings{monsalve2021development,
  title        = {Development of agricultural robot platform with virtual laboratory capabilities},
  author       = {Monsalve, German and Thiery, Oriane and De Moreau, Simon and Cardenas, Alben},
  booktitle    = {IECON 2021--47th Annual Conference of the IEEE Industrial Electronics Society},
  pages        = {1--6},
  year         = {2021},
  organization = {IEEE},
}