About this Project
Accurate navigation systems for large indoor environments that use visual features to provide real-time 3D positions and orientations are currently under development. However, research on indoor navigation, and especially on augmented reality (AR) indoor path generation and visualization, is sparse.
Recent methods focus mainly on outdoor environments and are not suitable for indoor scenarios: they do not handle the complexity of floor plans or the limited field of view of AR devices, and they do not consider fast changes in viewing direction while the user is being guided. Therefore, new methods for path planning, obstacle avoidance, and navigation visualization have to be developed.
In this project, we propose novel navigation methods to assist mobile indoor navigation by utilizing AR. We present a new dynamic path planning algorithm with real-time obstacle avoidance that reacts to changing environments. We also propose three new AR navigation visualization methods: particles, object-following, and realistic human avatars as guides. In addition, we plan to research real-world light estimation from shadows using a moving monocular RGB-D camera, so that an avatar guide can be embedded with realistic lighting. For navigation visualization, different stages of realism will be developed and evaluated on mobile devices. Finally, we will research and integrate haptic feedback to aid navigation.
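To illustrate the idea of dynamic path planning with real-time obstacle avoidance (this is a minimal sketch, not the project's actual algorithm), one common pattern is A* search on an occupancy grid that replans whenever the device senses a new obstacle on the current path. All names here (`plan`, `navigate`, `sense`) are hypothetical:

```python
import heapq

def plan(blocked, w, h, start, goal):
    """A* on a 4-connected w x h grid; `blocked` is a set of impassable cells."""
    def hcost(c):  # Manhattan-distance heuristic
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    frontier = [(hcost(start), 0, start, None)]
    came_from, best = {}, {start: 0}
    while frontier:
        _, cost, cur, parent = heapq.heappop(frontier)
        if cur in came_from:
            continue  # already expanded via a cheaper route
        came_from[cur] = parent
        if cur == goal:  # reconstruct path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < w and 0 <= nxt[1] < h and nxt not in blocked:
                ncost = cost + 1
                if ncost < best.get(nxt, float("inf")):
                    best[nxt] = ncost
                    heapq.heappush(frontier, (ncost + hcost(nxt), ncost, nxt, cur))
    return None  # goal unreachable

def navigate(blocked, w, h, start, goal, sense):
    """Follow the planned path, replanning when `sense` reports new obstacles
    (e.g. from depth sensing on the AR device)."""
    pos, path = start, plan(blocked, w, h, start, goal)
    trace = [pos]
    while path and pos != goal:
        new_obstacles = sense(pos)
        if new_obstacles - blocked:
            blocked |= new_obstacles
            path = plan(blocked, w, h, pos, goal)  # dynamic replan from here
            if path is None:
                return None
        pos = path[path.index(pos) + 1]  # advance one step along the path
        trace.append(pos)
    return trace

# Usage: a 5x5 grid where an obstacle at (2, 0) is only discovered
# once the user reaches (1, 0); the route is replanned around it.
sense = lambda p: {(2, 0)} if p == (1, 0) else set()
trace = navigate(set(), 5, 5, (0, 0), (4, 0), sense)
```

In practice the occupancy grid would come from the floor plan and live sensor data, and an incremental planner (e.g. D* Lite) would reuse previous search effort instead of replanning from scratch, but the sense-replan-advance loop is the same.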
We plan to evaluate the developed algorithms in comprehensive user studies with respect to navigation efficiency, sense of presence, and user comfort. We expect our approaches to advance the theory and practice of personal indoor navigation.
Funding provided by
Vienna Science and Technology Fund (WWTF)