This diploma thesis discusses touchless interaction techniques on handheld devices for intuitively manipulating a virtual 3D scene presented on the handheld display. The robustness and performance of different solutions, and combinations thereof, for detecting the user's hand and fingertips without markers are examined. The first methods rely solely on the built-in RGB camera, while later methods combine it with an additional depth sensor. Hand position and finger gestures are used to select and move objects in the virtual scene. Furthermore, the user's head position is tracked to adapt the perspective of the virtual scene and thereby create a 3D impression on the device display.

The approaches use RGB data for hand segmentation and gesture detection, and either the hand size or the maximum grayscale value of the hand for relative depth estimation. In addition, RGBD data is used for improved hand segmentation and absolute depth estimation. To detect two different finger gestures or the palm of the hand, Haar-like feature-based cascade classifiers were trained. When the classifier recognizes the palm, the finger gesture is determined by the number of detected fingertips: several image processing operations segment the hand, and its contour is used for fingertip detection. A pre-trained Haar-like feature cascade classifier detects the user's face and obtains its 3D position via relative depth estimation based on the face size.

Within the thesis, an Android application was developed using OpenCV, Unity3D and OpenNI. The hardware prototype rigidly connects the handheld device with the depth sensor (Microsoft Kinect) to enable correct calibration and mapping of the received RGBD data. The performance of the gesture recognition approaches was systematically evaluated by comparing their accuracy under varying illumination and background conditions.
Based on this study, guidelines were derived to help developers choose the appropriate technique for their mobile interaction task. Furthermore, an experimental study was conducted in which the detected finger gestures were used to perform the two canonical 3D interaction tasks, selection and positioning, and to demonstrate the different characteristics of the depth estimation methods. Overall, the best results were obtained using RGBD data for finger gesture detection and absolute depth estimation, at the expense of latency.
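The relative depth estimation from hand or face size mentioned in the abstract can be illustrated under a pinhole-camera assumption: apparent width in pixels scales inversely with distance, so depth relative to a calibration frame follows from the ratio of widths. The function and the calibration values below are hypothetical, not taken from the thesis.

```python
def relative_depth(ref_width_px, ref_depth, width_px):
    """Relative depth under a pinhole model: an object appearing
    half as wide as in the calibration frame is twice as far away.
    z = z_ref * w_ref / w
    """
    return ref_depth * ref_width_px / width_px

# Example with made-up numbers: face detected at 200 px at 1.0 (arbitrary
# unit) during calibration; a later detection at 100 px implies depth 2.0.
print(relative_depth(200, 1.0, 100))  # → 2.0
```

This only yields depth up to the unknown calibration scale, which is why the abstract distinguishes it from the absolute depth obtained with the RGBD sensor.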
D. Fritz: "Intuitive und markerlose Interaktion in einer mobilen Virtual Reality Anwendung auf Basis von RGBD-Daten" (Intuitive and Markerless Interaction in a Mobile Virtual Reality Application Based on RGBD Data); Supervisors: H. Kaufmann, A. Mossel; Institut für Softwaretechnik und Interaktive Systeme, 2014; final examination: 11-2014.