
LSD-SLAM: Large-Scale Direct Monocular SLAM

LSD-SLAM is a novel approach to real-time monocular SLAM. It is fully direct (i.e. it does not use keypoints / features) and creates large-scale, semi-dense maps in real-time on a laptop. For more information see http://vision.in.tum.de/lsdslam, where you can also find the corresponding publications and YouTube videos, as well as some example input datasets and the generated output as a rosbag or .ply point cloud.

This fork contains a version that relieves the user of the horrors of a ROS dependency and uses the much nicer lightweight Pangolin framework instead.

Related Papers

  • LSD-SLAM: Large-Scale Direct Monocular SLAM, J. Engel, T. Schöps, D. Cremers, ECCV '14

  • Semi-Dense Visual Odometry for a Monocular Camera, J. Engel, J. Sturm, D. Cremers, ICCV '13

1. Quickstart / Minimal Setup

For the setup steps, please refer to this post. I have successfully set up the project on a MacBook Pro (Early 2015, El Capitan, Xcode 7.2.1) and an iMac (Late 2012, El Capitan, Xcode 7.3). Let me know if you have any issues.

2. Running

Raw PNG images are supported. For example, you can download any dataset from here in PNG format and run it like:

./LSD -c ~/Mono_Logs/LSD_machine/cameraCalibration.cfg -f ~/Mono_Logs/LSD_machine/images/
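Here -c points at the camera calibration file and -f at the folder of PNG frames. As a minimal sketch, assuming the calibration format of the upstream LSD-SLAM release (FOV camera model: normalized fx fy cx cy plus one distortion parameter, then the input image size, the rectification mode, and the output size), a calibration file can look like the following; the numbers are illustrative placeholders for a 640x480 camera, not values for any real device:

0.54 0.67 0.49 0.50 0.90
640 480
crop
640 480

If your images are already rectified, the upstream documentation uses 0 for the distortion parameter and none for the rectification mode.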

3. License

LSD-SLAM is licensed under the GNU General Public License Version 3 (GPLv3), see http://www.gnu.org/licenses/gpl.html.