ESLAM: Efficient Dense SLAM System Based on Hybrid Representation of Signed Distance Fields

Mohammad Mahdi Johari 1,2, Camilla Carta 3, François Fleuret 4,2,1
1 Idiap Research Institute    2 EPFL    3 ams Osram    4 University of Geneva   

CVPR 2023 Highlight



Abstract

We present ESLAM, an efficient implicit neural representation method for Simultaneous Localization and Mapping (SLAM). ESLAM reads RGB-D frames with unknown camera poses in a sequential manner and incrementally reconstructs the scene representation while estimating the current camera position in the scene. We incorporate the latest advances in Neural Radiance Fields (NeRF) into a SLAM system, resulting in an efficient and accurate dense visual SLAM method. Our scene representation consists of multi-scale axis-aligned perpendicular feature planes and shallow decoders that, for each point in the continuous space, decode the interpolated features into Truncated Signed Distance Field (TSDF) and RGB values. Our extensive experiments on three standard datasets, Replica, ScanNet, and TUM RGB-D, show that ESLAM improves the accuracy of 3D reconstruction and camera localization of state-of-the-art dense visual SLAM methods by more than 50%, while running up to 10× faster and requiring no pre-training.

[Overview figure]
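
To make the representation described in the abstract concrete, below is a minimal PyTorch sketch (not the authors' implementation) of one way such a hybrid scene representation could look: axis-aligned feature planes at a coarse and a fine resolution are queried by bilinear interpolation, the per-plane features are summed at each scale and concatenated across scales, and two shallow MLPs decode them into a TSDF value and an RGB color. All class names, plane resolutions, feature dimensions, and decoder widths are illustrative assumptions, not the paper's exact hyper-parameters.

```python
# Minimal sketch of a tri-plane + shallow-decoder scene representation.
# Hyper-parameters below are illustrative, not ESLAM's exact configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AxisAlignedPlanes(nn.Module):
    """Three axis-aligned feature planes (xy, xz, yz) at one resolution."""

    def __init__(self, resolution: int, feat_dim: int):
        super().__init__()
        # One learnable 2D feature grid per canonical plane.
        self.planes = nn.ParameterList(
            [nn.Parameter(0.01 * torch.randn(1, feat_dim, resolution, resolution))
             for _ in range(3)]
        )

    def forward(self, pts: torch.Tensor) -> torch.Tensor:
        # pts: (N, 3) coordinates normalized to [-1, 1].
        # Project each point onto the three planes and bilinearly sample.
        coords = [pts[:, [0, 1]], pts[:, [0, 2]], pts[:, [1, 2]]]
        feats = 0.0
        for plane, uv in zip(self.planes, coords):
            grid = uv.view(1, -1, 1, 2)                        # (1, N, 1, 2)
            sampled = F.grid_sample(plane, grid, mode="bilinear",
                                    align_corners=True)        # (1, C, N, 1)
            feats = feats + sampled.squeeze(0).squeeze(-1).t()  # (N, C)
        return feats  # per-plane features are summed, not concatenated


class HybridSDFRepresentation(nn.Module):
    """Coarse + fine planes for geometry and appearance, with shallow decoders."""

    def __init__(self, feat_dim: int = 32, hidden: int = 32):
        super().__init__()
        self.geo_coarse = AxisAlignedPlanes(resolution=24, feat_dim=feat_dim)
        self.geo_fine = AxisAlignedPlanes(resolution=96, feat_dim=feat_dim)
        self.app_coarse = AxisAlignedPlanes(resolution=24, feat_dim=feat_dim)
        self.app_fine = AxisAlignedPlanes(resolution=96, feat_dim=feat_dim)
        # Shallow two-layer decoders for TSDF and RGB.
        self.tsdf_decoder = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )
        self.rgb_decoder = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden), nn.ReLU(), nn.Linear(hidden, 3)
        )

    def forward(self, pts: torch.Tensor):
        geo = torch.cat([self.geo_coarse(pts), self.geo_fine(pts)], dim=-1)
        app = torch.cat([self.app_coarse(pts), self.app_fine(pts)], dim=-1)
        tsdf = torch.tanh(self.tsdf_decoder(geo)).squeeze(-1)  # truncated SDF in (-1, 1)
        rgb = torch.sigmoid(self.rgb_decoder(app))             # color in (0, 1)
        return tsdf, rgb


if __name__ == "__main__":
    model = HybridSDFRepresentation()
    pts = torch.rand(1024, 3) * 2.0 - 1.0   # random query points in [-1, 1]^3
    tsdf, rgb = model(pts)
    print(tsdf.shape, rgb.shape)            # torch.Size([1024]) torch.Size([1024, 3])
```

Keeping the features on 2D planes rather than in a dense 3D grid is what keeps the memory footprint roughly quadratic in resolution, which is the main efficiency argument of this family of representations.
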



Qualitative Analysis with Textured Meshes

[Interactive image comparison: Reference, iMAP*, NICE-SLAM, and ESLAM (Ours)]

iMAP* is the reimplementation of iMAP by NICE-SLAM.



Qualitative Analysis with Untextured Meshes

[Interactive image comparison: Reference, iMAP*, NICE-SLAM, and ESLAM (Ours)]


Visualization on Replica

[Video comparisons: ESLAM vs. iMAP* and ESLAM vs. NICE-SLAM]

Visualization on ScanNet

[Video comparisons: ESLAM vs. iMAP* and ESLAM vs. NICE-SLAM]

BibTeX


@inproceedings{johari-et-al-2023,
  author = {Johari, M. M. and Carta, C. and Fleuret, F.},
  title = {{ESLAM}: Efficient Dense SLAM System Based on Hybrid Representation of Signed Distance Fields},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2023},
  type = {Highlight}
}