DepthSLAM on a Mobile Device
Bachelor Thesis
About This Topic
Simultaneous localization and mapping (SLAM) using depth data from structured-light sensors (e.g. the Microsoft Kinect) has become very popular in recent years, enabling markerless tracking, 3D mapping, scene understanding and visual odometry.
In this thesis, upcoming structured-light sensors are used to perform depth SLAM on an off-the-shelf mobile device. To this end, the sensor's depth data are accessed directly on the mobile device and processed with GPU computing on upcoming mobile graphics chips. The resulting research prototype is then evaluated in indoor environments with respect to robustness, performance and the size of the mappable area.
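As a minimal sketch of the sensor-access step (not the thesis implementation itself), the following C++ snippet grabs a single depth frame through the OpenNI 2 C++ API. It assumes the sensor is exposed as an OpenNI device and that the OpenNI library has been cross-compiled for the target, e.g. with the Android NDK.

```cpp
// Minimal sketch: acquire one depth frame via the OpenNI 2 C++ API.
// Assumes libOpenNI2 is available on the target platform.
#include <OpenNI.h>
#include <cstdio>

int main()
{
    if (openni::OpenNI::initialize() != openni::STATUS_OK) {
        std::printf("OpenNI init failed: %s\n", openni::OpenNI::getExtendedError());
        return 1;
    }

    openni::Device device;
    if (device.open(openni::ANY_DEVICE) != openni::STATUS_OK) {
        std::printf("No depth sensor found: %s\n", openni::OpenNI::getExtendedError());
        return 1;
    }

    openni::VideoStream depth;
    depth.create(device, openni::SENSOR_DEPTH);
    depth.start();

    openni::VideoFrameRef frame;
    if (depth.readFrame(&frame) == openni::STATUS_OK) {
        // Raw depth values in millimetres, one 16-bit sample per pixel.
        const openni::DepthPixel* pixels =
            static_cast<const openni::DepthPixel*>(frame.getData());
        int centre = frame.getHeight() / 2 * frame.getWidth() + frame.getWidth() / 2;
        std::printf("frame %dx%d, centre depth: %u mm\n",
                    frame.getWidth(), frame.getHeight(), (unsigned)pixels[centre]);
    }

    depth.stop();
    depth.destroy();
    device.close();
    openni::OpenNI::shutdown();
    return 0;
}
```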
Programming: Java, C/C++, CUDA
SDKs/Platforms: Android SDK and NDK, OpenNI
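To illustrate the GPU-computing step mentioned above, here is a minimal CUDA sketch (again, not the thesis implementation) that back-projects a raw depth frame into a metric point cloud, a typical first stage of a depth-SLAM pipeline. The camera intrinsics fx, fy, cx and cy are placeholder values in the spirit of common Kinect-class calibrations.

```cpp
// Minimal sketch: back-project a depth frame into a 3D point cloud on the GPU.
// One thread per pixel applies the pinhole camera model.
#include <cuda_runtime.h>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Point3 { float x, y, z; };

__global__ void backProject(const uint16_t* depth, Point3* cloud,
                            int width, int height,
                            float fx, float fy, float cx, float cy)
{
    int u = blockIdx.x * blockDim.x + threadIdx.x;
    int v = blockIdx.y * blockDim.y + threadIdx.y;
    if (u >= width || v >= height) return;

    int idx = v * width + u;
    float z = depth[idx] * 0.001f;          // mm -> m; 0 means "no reading"
    cloud[idx].x = (u - cx) * z / fx;
    cloud[idx].y = (v - cy) * z / fy;
    cloud[idx].z = z;
}

int main()
{
    const int width = 640, height = 480;    // typical structured-light resolution
    std::vector<uint16_t> hostDepth(width * height, 1000);  // dummy 1 m depth

    uint16_t* dDepth;  Point3* dCloud;
    cudaMalloc(&dDepth, hostDepth.size() * sizeof(uint16_t));
    cudaMalloc(&dCloud, hostDepth.size() * sizeof(Point3));
    cudaMemcpy(dDepth, hostDepth.data(),
               hostDepth.size() * sizeof(uint16_t), cudaMemcpyHostToDevice);

    dim3 block(16, 16);
    dim3 grid((width + block.x - 1) / block.x, (height + block.y - 1) / block.y);
    // Placeholder intrinsics (fx, fy, cx, cy), not taken from a real calibration.
    backProject<<<grid, block>>>(dDepth, dCloud, width, height,
                                 525.0f, 525.0f, 319.5f, 239.5f);
    cudaDeviceSynchronize();

    std::printf("back-projected %d depth pixels\n", width * height);
    cudaFree(dDepth);
    cudaFree(dCloud);
    return 0;
}
```

In a full pipeline, the resulting point cloud would feed the tracking and mapping stages (e.g. pose estimation against the current map), which is where the robustness and performance questions of this thesis arise.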