ATTENTION: This is a web archive! The IMS Group was split up in 2018 and does not exist anymore. Recent work of former members can be found at the VR/AR Group and the Computer Vision Group.

Interactive Media Systems, TU Wien

Projects


Research projects and lab projects. Click a project title or picture to learn more about
a particular project, or click the button on the right to subscribe to the projects feed.

Completed Projects

Virtual and Augmented Reality

Vienna Virtual Reality Framework for Unreal   (“V2RFUn”)
H. Kaufmann — November 2017 to October 2022
Development of a framework with flexible tracking options, device integration, robot and avatar interaction and much more for VR applications in…

Robot Supported Virtual and Augmented Reality   (“RobotVR”)
E. Vonach — October 2013 to June 2020 • Keywords: Immersive Virtual Reality, Encounter-type haptics, Prop-based haptic feedback, Robotics
The issue of inadequate haptic feedback in VR/AR setups is an open problem. Haptic or tactile gloves as well as force-feedback devices approach the…

Learn VR: Introduction to Virtual and Augmented Reality   (“VR course”)
H. Kaufmann — June 2016 to December 2019 • Keywords: Teaching, Immersive Virtual Reality, Augmented Reality
Creating a high-quality virtual reality (VR) experience takes time and requires extensive practice. Although multiple virtual and augmented reality…

Bridges in VR   (“VR Bridges”)
K. Vasylevska — January to August 2019
The Virtual Reality group at TU Wien is looking for study participants!

Immersive Interaction in Streamed Dense 3D Surface Reconstructions   (“ImmersivePointClouds”)
A. Mossel — March 2015 to July 2018 • Keywords: Virtual Reality, Immersive, Selection, Navigation, Dense 3D Surface Reconstructions, Point Clouds, Perception, Occlusion Management
Within the project, we investigate novel methods to understand and interact with large dense 3D surface reconstructions while being immersed within…

ImmersiveDeck: A large-scale wireless VR system for multiple users
H. Kaufmann — January 2015 to August 2017
We present a low-cost multi-user immersive Virtual Reality system that enables collaborative experiences in large virtual environments. In the…

Stereo video see-through head mounted display   (“SteVSees”)
C. Schönauer — September 2014 to March 2017 • Keywords: Stereo, video see-through, head mounted display, stereo camera
Stereoscopic augmented reality applications require a see-through headset/head-mounted display. The real world is then seen either…

Assessing the Sensory Functionality of Children with Autism Spectrum Disorder   (“Autism”)
C. Schönauer — June 2013 to February 2017 • Keywords: Autism, Motion Tracking, Sensors, Depth Sensor
The causes and mechanisms of Autism Spectrum Disorder (ASD) are not yet fully understood. One line of research that should translate into more…

Virtual Reality Menu Interaction with a Smartwatch   (“Smartwatch”)
I. Podkosova — December 2015 to September 2016 • Keywords: Virtual Reality, VR Menus, Smartwatch
In immersive virtual environments users are placed inside virtual reality simulations. In such systems, menus are needed to change the system state,…

SmartCopter - Autonomous Flight with a Smartphone as On-Board Processing Unit
A. Mossel — January 2012 to December 2015 • Keywords: Unmanned Aerial Vehicles, Autonomous Localization, Mixed Reality, Autonomous Flight
This project combines a number of research activities to design and develop UAVs based on off-the-shelf hardware. The major goals of our research…

The Virtualizer - A New Locomotion Device   (“Virtualizer”)
H. Kaufmann — May 2013 to December 2015 • Keywords: Locomotion Device, Virtualizer, Cyberith
The Virtualizer is a locomotion device with integrated sensors for motion detection. With this device you can move, run, jump, crouch, strafe in your…

Multimodal Motion Guidance: Techniques for Adaptive and Dynamic Feedback   (“Motion Guidance”)
C. Schönauer — November 2011 to December 2015
The ability to guide human motion through automatically generated feedback has significant potential for applications in areas such as motor…

Wide Area Indoor Tracking   (“WideTracker”)
A. Mossel — January 2013 to August 2015 • Keywords: Wide Area Tracking, Real-Time Optical Tracking, Indoor, Unconstrained Environments, Harsh Environments
In this project, a real-time tracking system is developed that tracks multiple active infrared targets within large volumes. A minimum…

MediCubes: Design of a Health Monitoring Toy for Children
E. Vonach — December 2013 to February 2015 • Keywords: Health monitoring, home monitoring, pervasive healthcare, storytelling toy, children, Tangible User Interface, sensors.
Especially for children, health monitoring can be a stressful situation, even when conducted at home by their parents. Current advances…

ARTiFICe - Augmented Reality Framework for Distributed Collaboration
A. Mossel — January 2011 to December 2014 • Keywords: Virtual Reality Framework, 3D Interaction Techniques, Collaboration, Mobile Augmented Reality, Mobile Interaction, Low-cost Tracking Devices
This project aims at developing a flexible and powerful VR/AR framework built around an off-the-shelf game engine (Unity3D). It offers rapid…

Natural User Interfaces for Mobile Mixed Reality   (“Mobile Interaction”)
A. Mossel — January 2012 to December 2014 • Keywords: Handheld Mixed Reality, 3D Interaction, Human Computer Interface, Natural User Interface, Interaction Metaphors, Gesture Recognition
This project combines a number of research activities to create intuitive ways to interact with 3D content in a mobile mixed reality environment. Our…

OpenTracker v2.0 - An Open Software Framework for Virtual Reality Input   (“OpenTracker2”)
H. Kaufmann — January 2011 to August 2014 • Keywords: Virtual Reality, Augmented Reality, Tracking, 3D Interaction, ARTiFICe
OpenTracker was originally developed in 2002/2003 @ IMS (see OpenTracker project page @ TU Graz) to be a generic solution to the different tasks…

MoveOnPC - Supporting the PlayStation Move controller on the PC
H. Kaufmann — November 2010 to December 2013 • Keywords: Playstation Move Controller, 3D Input, OpenCV, Sensor Fusion
In September/October 2010 Sony released the PlayStation Move – a motion-sensing game controller for the PlayStation 3. The Move controller is an…

Vertical Navigation Metaphors: Virtual Elevator   (“Virtual Elevator”)
K. Vasylevska — April to September 2013 • Keywords: Virtual Reality, Redirected Walking, Spatial Presence, Haptic Feedback, Visual Realism
A sense of spatial presence as a feeling of “being there” is an important part of the virtual experience. Navigation is a fundamental task in virtual…

iOrb - Unifying Command and 3D Input for Mobile Augmented Reality
G. Reitmayr — January 2004 to December 2005 • Keywords: Studierstube, Mobile Computing, Augmented Reality
Input for mobile augmented reality systems is notoriously difficult. Three-dimensional visualization would ideally be accompanied by 3D…

The Invisible Train: A Handheld Augmented Reality Game   (“Invisible Train”)
T. Pintaric — January 2004 to December 2005 • Keywords: Augmented Reality, Mobile Devices, Wearable Devices, PDA, Games
The Invisible Train is the first real multi-user Augmented Reality application for handheld devices (PDAs). Unlike other projects, in which wearable…

OpenTracker - An Open Software Framework for Virtual Reality Input
H. Kaufmann — January 2001 to December 2004 • Keywords: Studierstube, Augmented Reality, Tracking, Distributed Systems
OpenTracker is developed to be a generic solution to the different tasks involved in tracking input devices and processing tracking data for virtual…

Augmented Presentation and Interaction Language   (“APRIL”)
F. Ledermann — January 2002 to December 2003 • Keywords: Studierstube
An XML language for authoring content of story-based augmented reality presentations.

Outdoor Collaborative Augmented Reality   (“OCAR”)
G. Reitmayr — January to December 2003 • Keywords: Studierstube, Augmented Reality, Mobile Computing, GPS
The aim of this project is to investigate how two or more users can collaborate on tasks such as navigation and information browsing/display with the…

Pivy
T. Fahmy — January 2002 to December 2003 • Keywords: Studierstube
Pivy allows you to write Open Inventor applications in Python. It will be possible to interactively edit Open Inventor programs from within the…

Signpost
D. Schmalstieg — January 2001 to December 2003 • Keywords: Studierstube, Augmented Reality, Tracking, Mobile Computing, Applications
SignPost is an augmented reality application that is able to guide a person through an unfamiliar building.

Studierstube Render Array   (“Stuberena”)
D. Schmalstieg — January to December 2003 • Keywords: Studierstube
The Studierstube Render Array creates a seamless tiled display using multiple overlapping projectors and a cluster of PCs.

Virtual Showcase
F. Ledermann — January 2001 to December 2003 • Keywords: Studierstube, Augmented Reality, Visualization
The project aims at developing the knowledge and technology for Virtual Showcases to become standard equipment for museums and other public…

Media Processing

Semantic Multimodal TU Information System   (“mplTIS2”)
H. Eidenberger — December 2015 to December 2020
Enhance the existing TU Wien information system TISS with semantic and multimodal components, including, but not limited to, semantic search over all…

Audio Analysis and Authoring   (“mplAudio”)
H. Eidenberger — July 2000 to December 2019 • Keywords: Audio Retrieval, Speech Processing, Sound Recognition, Music Analysis
The audio analysis and authoring project comprises all our efforts in the area of audio signal processing and categorization. Projects range from…

Biosignal Understanding   (“mplBiosignal”)
H. Eidenberger — January 2004 to December 2019 • Keywords: Brain Computer Interface, EEG, EMG, Noise Measurement
Biosignal processing activities in the IMS group range from mental cursor control through various applications for the brain-computer interface to the…

Geographical Media Systems   (“mplGeomedia”)
H. Eidenberger — January 2000 to December 2019 • Keywords: Location-Based Services, Geomedia Referencing, Map-Based Sonification
The Geomedia project summarizes our efforts in the area of location-based media presentation, summarization and referencing. Tasks range from classic…

Machine Learning for Multimedia Understanding   (“mplLearning”)
H. Eidenberger — January 1998 to December 2019 • Keywords: Limits of Categorization, Psychophysics, Similarity Modeling, Kernel-Based Learning
The machine learning activities of the media processing group derive from the necessity to classify media objects described by signal processing in…

Media Authoring and Production   (“mplAuthoring”)
H. Eidenberger — January 2002 to December 2019 • Keywords: Mobile Apps, Media Storage, Media Applications, Media Production
In this project, we endeavor to frame media content with application design, thus adding interactivity to the otherwise passive content. Typical…

Media Programming and Transport   (“mplProgramming”)
H. Eidenberger — July 2000 to December 2019 • Keywords: Mobile Media Programming, Sensor Control, Evaluation Databases, Programming Environments, Streaming
The media processing team of the IMS group has collected expertise in the area of media programming and transport for almost two decades. Programming…

Multimedia Gaming   (“mplGaming”)
H. Eidenberger — January 2003 to December 2019 • Keywords: Gaming Apps, Strategy Gaming, Media Games for Children, Multimedia Storytelling
Multimedia and gaming are two areas as closely linked in science as in practical application. The IMS has always endeavored to provide multimedia in…

Text Understanding   (“mplText”)
H. Eidenberger — January 2004 to December 2019 • Keywords: Text Retrieval, Automated Translation, Natural Language Processing
Text is a traditional source for information retrieval applications. At the IMS we develop solutions for selected (domain-specific) text…

Visual Media Analysis   (“mplVisual”)
H. Eidenberger — January 1998 to December 2019 • Keywords: Video Summarization, Scene Classification, Visual Retrieval, Image Understanding, Big Media Data
Visual media analysis is the central activity of the media processing subgroup at the IMS. Expertise has been gained since the mid-1990s. Today,…

Forensic Media Analysis   (“mplForensic”)
H. Eidenberger — January 2002 to January 2019 • Keywords: Biometric Face Analysis, Voice Identification, Forensic Video Recording, Limits of Biometric Recognition
The media processing group of the IMS is highly active in the application and further development of solutions for forensic audio and video analysis.…

TU Fly Into the Future   (“mplTUFly”)
H. Eidenberger — April 2015 to October 2016 • Keywords: Virtual Reality, Immersion, Virtual Flying, Immersive Virtual Reality, Virtual Space Walk, Virtual Diving
The focus of this project is real flying through virtual worlds, where the applications include not just bird flight but also diving along virtual…

TU Jump Into The Future   (“VirtualJumpSimulator”)
H. Eidenberger — September 2014 to April 2015 • Keywords: Virtual Reality, Immersion, Virtual Parachute Jump, Immersive Virtual Reality, 3D City Model
Within the project "TU Jump Into The Future", a Virtual Jump Simulator (VJS) was developed that allows users to perform a virtual parachute jump over…

Others

TraitsDB: Online Platform to Collect Trait Observations
M. Ceric — April to December 2016
The TraitsDB software provides a collaborative online platform to efficiently and reliably collect and exchange fuzzy coded trait observation data.…

imsNUKE - PostNuke and Document Juggler Combined
H. Platzer — January 2001 to December 2002 • Keywords: Content Management, Portal, PHP, MySQL
imsNUKE is a content management system concentrating on structured, document-based content. It combines the web-portal features of PostNuke with the…