
Interactive Media Systems, TU Wien

Ubiquitous Animated Agents for Augmented Reality

Thesis by István Barakonyi

Supervision by Dieter Schmalstieg and Andreas Butz

Abstract

A growing spectrum of Ubiquitous Computing (UbiComp) applications suggests that interaction with computers should be as natural and effortless as using pen, paper, and language when writing. Unlike current computer environments, which require a considerable amount of adaptation from users for smooth interaction, future digital interfaces are envisioned to act unobtrusively and intelligently in our environment. This dissertation describes a novel user interface approach combining Augmented Reality (AR), UbiComp, and Autonomous Animated Agents into a single coherent human-computer interface paradigm that takes steps toward this vision.

A significant challenge for the UbiComp community is to create efficient, natural, and user-friendly interfaces, since there are no standards or best practices to follow yet. Typical UbiComp scenarios include numerous mobile users roaming a large area while interacting with various stationary and mobile devices. Since the location and behavior of users and devices change frequently, an enormous number of events describing these changes is generated in the environment. Processing such large data sets can quickly overwhelm humans; therefore, an interface to a UbiComp system is expected to possess a certain degree of autonomy in order to filter and interpret relevant events and react proactively without constant user guidance and explicit instructions. By relieving users from dealing with low-level details and allowing computers to make decisions by themselves, these interfaces appear to be "smart".

This thesis presents software solutions that employ reactive, autonomous, and social digital assistants in UbiComp environments. These systems rely on software agent technology tailored to the needs of AR applications, where system behavior is visualized by virtual animated characters appearing on top of the real world. We discuss how autonomous animated agents can be employed to mediate communication between humans and computers in AR environments while exploiting real-world attributes as input and output communication channels. The agents maintain a model of the real world by analyzing data from sensors that measure physical properties such as pose, velocity, sound, or light, and autonomously react to changes in the environment in accordance with the users' perception. Autonomous, emergent behavior is a novel feature in UbiComp, while awareness of real-world attributes remains unexploited by autonomous agents.
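To make the sensing-and-reaction loop described above concrete, the following minimal Python sketch shows one way an agent could maintain an internal model of real-world attributes from sensor readings and react only to noticeable changes. All names here (SensorReading, WorldModel, AnimatedAgent) and the threshold-based filtering are illustrative assumptions, not the actual API or behavior model of the system described in the thesis.

```python
# Minimal sketch of an agent's sensing/reaction loop (hypothetical names, not the thesis' API).
from dataclasses import dataclass, field


@dataclass
class SensorReading:
    """One physical measurement, e.g. pose, velocity, sound, or light level."""
    source: str     # which sensor produced the reading
    attribute: str  # e.g. "pose", "light", "sound"
    value: float


@dataclass
class WorldModel:
    """The agent's internal model of real-world attributes."""
    attributes: dict = field(default_factory=dict)

    def update(self, reading: SensorReading) -> float:
        """Store the new value and return how much the attribute changed."""
        old = self.attributes.get(reading.attribute, reading.value)
        self.attributes[reading.attribute] = reading.value
        return abs(reading.value - old)


class AnimatedAgent:
    """Reacts autonomously when a tracked attribute changes noticeably."""

    def __init__(self, threshold: float = 0.1):
        self.world = WorldModel()
        self.threshold = threshold

    def on_sensor_event(self, reading: SensorReading) -> None:
        change = self.world.update(reading)
        if change > self.threshold:  # filter out irrelevant events
            self.react(reading)

    def react(self, reading: SensorReading) -> None:
        # In an AR system this would trigger a character animation overlaid
        # on the physical scene; here we just log the decision.
        print(f"Agent reacts to {reading.attribute} change from {reading.source}")


if __name__ == "__main__":
    agent = AnimatedAgent()
    agent.on_sensor_event(SensorReading("tracker-1", "pose", 0.0))
    agent.on_sensor_event(SensorReading("tracker-1", "pose", 0.5))  # large change -> reaction
```

The key design point illustrated is that the agent, not the user, decides which sensor events are relevant: the world model absorbs every reading, but a reaction is triggered only when a change exceeds a threshold.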

This dissertation explores the requirements for context-aware animated agents concerning visualization, appearance, and behavior as well as associated technologies, application areas, and implementation details. Several application scenarios illustrate design and implementation concepts.

Reference

I. Barakonyi: "Ubiquitous Animated Agents for Augmented Reality"; Supervisor, Reviewer: D. Schmalstieg, A. Butz; Institut für Softwaretechnik und Interaktive Systeme 188/2, 2006; oral examination: 10-31-2006.
