Media Files
- Fig._Hardware Overview [JPEG | 64 KB]
- Fig._AR Life_Distributed Architecture [JPEG | 73 KB]
- Fig._Transition from AR Guide to AR Life [JPEG | 43 KB]
- Questionaire_LifePlus_Technical Details & Innovative Aspects [PDF | 137 KB]
- Assistance and Experience [JPEG | 39 KB]
- Assistance_AR Guide [JPEG | 41 KB]
- Fig._AR Life_Functional Elements [JPEG | 28 KB]
Technical Description
The goal of LIFEPLUS is to push the limits of current Augmented Reality (AR) technologies by exploring the narrative design of fictional spaces (e.g. frescoes and paintings) in which users can experience a high degree of realistic, interactive immersion. Based on a captured, real-time video of a real scene, the project aims to enhance these scenes by rendering realistic 3D simulations of virtual flora and fauna (humans, animals and plants) in real time. With its key mobile AR technology, visitors are equipped with a see-through Head-Mounted Display (HMD), earphones and mobile computing equipment. A tracking system determines their location within the site, and audio-visual information is presented to them in the context of their exploration, superimposed on their current view of the site. LIFEPLUS will extend that system with key new technologies for rendering lively, real-time animations and simulations of ancient virtual life (3D human groups, animals and plants).
LIFEPLUS Real-Time Systems Architecture
HARDWARE REQUIREMENTS OVERVIEW
This section presents an overview of the main functional requirements of the proposed LIFEPLUS system, followed by a short overview of the hardware components necessary to meet them.
Mobility:
- The overall hardware unit must be highly mobile
- Its weight should be limited
- It must be easy to use
- It must have low power consumption and run on batteries
Responsiveness:
- The overall tracking, rendering and AR image composition time must be kept within certain limits (about 200 ms) to ensure the quality of the immersive experience
Performance:
- The system must deliver a sufficient frame rate (not less than 10 fps)
Robustness:
- The boot and run phases of the AR mode must be easy to initialise and robust in operation, so that the visitor can interact with the site as freely as possible
Camera:
- Head mounted
- Reasonable resolution (~800x600)
- FireWire interface for the best digital quality and high transmission bandwidth (limiting lag)
- Monoscopic video see-through AR, which avoids critical problems in compositing synthetic and real images
DGPS and Digital Compass:
- On-site localization of the visitor
- Optional support for vision-based real camera tracking
HMD display:
- Light, compact and low cost
Two mobile workstations:
- Separation and parallelization of the two main heavyweight system tasks: real-time camera tracking and real-time 3D rendering (ideally a dual-processor mobile workstation)
- The 3D rendering workstation must feature a state-of-the-art Graphics Processing Unit (GPU), allowing real-time generation of high-quality VR images
(see Fig. Hardware Overview)
SOFTWARE ARCHITECTURE AND CONTENT COMPONENTS OVERVIEW
The overall LIFEPLUS system architecture is designed on top of the VHD++ real-time development framework, a proprietary middleware solution of the MIRALab and VRlab laboratories. VHD++ is a highly flexible and extendible real-time framework supporting component-based development of interactive audio-visual simulation applications in the domain of AR and VR, with a particular focus on virtual character simulation technologies. C++ has been chosen as the main implementation language. The most important features and functionalities of the VHD++ framework are:
1) support for real-time audio-visual applications,
2) extendible spectrum of technologies,
3) middleware portability, extendibility and scalability,
4) runtime flexibility: XML based system and content configuration,
5) complexity curbing: multiple design patterns improve clarity, and abstraction levels simplify the description of problems,
6) fundamental and pluggable components hide implementation-level details, allowing work at the required level of abstraction and simplifying implementation constructs (a schematic code sketch follows this list),
7) large-scale code reuse: fundamental components and ready-made components encapsulating heterogeneous technologies.
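To make the component model above concrete, the following is a minimal, hypothetical C++ sketch of a pluggable service and a runtime engine in a VHD++-style framework. All class and method names are illustrative assumptions, not the actual VHD++ API.

// Hypothetical sketch of a pluggable service in a VHD++-style framework.
// Class and method names below are assumptions made for illustration.
#include <memory>
#include <string>
#include <vector>

class vhdIService {                        // assumed base class for pluggable services
public:
    virtual ~vhdIService() = default;
    virtual void init() = 0;               // called once at engine start-up
    virtual void update(double dt) = 0;     // called every simulation frame
    virtual std::string name() const = 0;
};

class vhdCameraTrackingService : public vhdIService {   // hypothetical service
public:
    void init() override { /* load calibration, open FireWire stream, ... */ }
    void update(double) override { /* estimate camera matrix for the current frame */ }
    std::string name() const override { return "CameraTracking"; }
};

class vhdRuntimeEngine {                    // assumed engine that owns the services
public:
    void plug(std::unique_ptr<vhdIService> s) { services_.push_back(std::move(s)); }
    void run_frame(double dt) { for (auto& s : services_) s->update(dt); }
private:
    std::vector<std::unique_ptr<vhdIService>> services_;
};

// Illustrative usage: one engine per portable computer, each plugging in the
// services relevant to its role (tracking on one machine, rendering on the other).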
The LIFEPLUS system architecture is designed around two VHD++ runtime engines, active software elements running on two separate portable computers. Each runtime engine powers a set of pluggable VHD++ services that encapsulate the required application-level technologies. This architecture allows the computationally heavy real-time tracking and synthetic image rendering tasks to be separated.
As shown in Fig. AR Life_Distributed Architecture, the TRACK runtime engine features relatively lightweight services that take care of DGPS-based on-site positioning of the visitor and provide additional multimedia information related to the current position at the site. These services play the major role during the sightseeing phase of the visit. Once the visitor reaches a point of interest where AR simulation is possible, the DGPS services become secondary and play a supportive role for the computationally heavyweight, real-time, vision-based camera tracking service, which calculates camera matrices and sends them to the VR/AR runtime engine.
The VR/AR runtime engine features multiple services encapsulating the VR simulation technologies required for real-time generation of realistic synthetic 3D images and sound effects, driven by the camera matrices obtained from the tracking side. It also hosts the services responsible for buffering the real images and compositing them with the synthetic ones.
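As an illustration of the data exchanged between the two engines, the sketch below shows a possible per-frame message from the TRACK side to the VR/AR side. The field layout is an assumption made for this description, not the actual LIFEPLUS wire format.

// Minimal sketch of the per-frame data sent from TRACK to VR/AR.
#include <array>
#include <cstdint>

struct CameraPoseMessage {
    std::uint64_t frame_id;               // ID of the FireWire video frame the pose belongs to
    std::array<float, 16> camera_matrix;  // 4x4 camera (extrinsic) matrix, row-major
    double timestamp_s;                   // capture time, useful for checking the latency budget
};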
SYSTEM IN OPERATION
The LIFEPLUS mobile system is required to operate in two main modes. The first supports the visitor with location-based multimedia information, facilitating sightseeing of the area by providing both practical and historical information in the form of text, images and short movies overlaid on the head-mounted display. In this "sight-seeing" mode, DGPS technology is mainly used to track the visitor's current position in the area relatively coarsely. Once the visitor reaches a spot where AR simulation is possible, (s)he is informed about it and allowed to enter AR simulation mode.
In AR simulation mode, the visitor is exposed to a VR simulation scenario blended into the real imagery of the site. The visitor can walk and look around within a spatially limited space, usually naturally constrained by the walls of the particular site. In this mode DGPS technology plays a secondary role, supporting the real-time, vision-based, computationally heavyweight camera tracking module, which is designed to deliver a precise camera matrix for each simulation frame so that the corresponding 3D synthetic images can be generated and blended with the real camera images.
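The transition between the two modes can be pictured with the following small sketch; the activation radius and function names are hypothetical and serve only to illustrate the coarse, DGPS-driven switch described above.

// Illustrative sketch of the two operating modes; not LIFEPLUS source code.
enum class GuideMode { SightSeeing, ARSimulation };

// Coarse DGPS position drives the switch: once the visitor is within an assumed
// activation radius of an AR spot and accepts the offer, the AR mode is entered.
GuideMode select_mode(double distance_to_ar_spot_m, bool visitor_accepted_ar)
{
    const double kActivationRadiusM = 5.0;   // assumed threshold
    if (distance_to_ar_spot_m < kActivationRadiusM && visitor_accepted_ar)
        return GuideMode::ARSimulation;
    return GuideMode::SightSeeing;
}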
The high-quality FireWire digital video signal, carrying both image and frame ID information, is split between the two portable computers running the TRACK and VR/AR runtime engines respectively. For each frame, the real-time, vision-based tracking module, optionally supported by directional data from the DGPS, calculates the real camera matrix and sends it, together with the ID of the received FireWire frame, to the VR/AR side. Since real camera tracking takes a certain amount of time, the VR/AR side needs to buffer the video images obtained from the FireWire link together with their respective IDs. Once the real camera matrix is ready for a real image stored in the cyclic buffer, the VR simulation module generates the corresponding 3D synthetic image, which is then blended with the real image and sent to the HMD by the AR image blending module. It is important to note that the VR simulation module is responsible not only for 3D image generation but also for generating the proper 3D sound effects accompanying the simulated scenario.
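A minimal sketch of such a cyclic frame buffer is given below, assuming a fixed-size ring indexed by frame ID; the class name, buffer size and image representation are assumptions made for illustration, not the LIFEPLUS implementation.

// Sketch of a cyclic video-frame buffer: the VR/AR side stores FireWire frames
// by ID until the matching camera matrix arrives from the TRACK side.
#include <array>
#include <cstdint>
#include <optional>
#include <vector>

struct VideoFrame {
    std::uint64_t id = 0;
    std::vector<std::uint8_t> pixels;      // e.g. 800x600 RGB from the FireWire camera
};

class CyclicFrameBuffer {
public:
    void push(VideoFrame f) {
        slots_[f.id % slots_.size()] = std::move(f);    // overwrite the oldest entry
    }
    // Called when the camera matrix for frame `id` arrives; returns the buffered
    // real image so it can be composited with the freshly rendered synthetic one.
    std::optional<VideoFrame> take(std::uint64_t id) {
        VideoFrame& slot = slots_[id % slots_.size()];
        if (slot.id != id || slot.pixels.empty()) return std::nullopt;  // frame already gone
        return std::move(slot);
    }
private:
    std::array<VideoFrame, 16> slots_{};   // assumed size; ample headroom for the tracking delay at 10 fps
};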
(see Fig. Transition from AR Guide to AR Life)
netzspannung.org was provided with the information above by MIRALab. However, large parts have been published previously:
George Papagiannakis, Michael Ponder, Tom Molet, Sumedha Kshirsagar, Frederic Cordier, Nadia Magnenat-Thalmann, Daniel Thalmann: LIFEPLUS: Revival of life in ancient Pompeii. In: Proceedings of Virtual Systems and Multimedia (VSMM 2002), invited paper, October 2002. http://www.miralab.unige.ch/papers/128.pdf
Hardware / Software
LIFEPLUS software components
Automatic Real-time Camera Tracking S/W
Need addressed: Urgent need from the movie/TV industry for pre-visualisation of special effects, and the need for real-time AR environments with virtual flora and fauna for the cultural industries, tourism and edutainment
Prime developer: VMSL
Partners that expressed commercialisation intentions: VMSL, FORTH
Alternative Real-time camera tracking Algorithms
Need addressed: Need in the academic community for alternative, specialised, application-based methodologies (robotics, etc.)
Prime developer: IGD, FORTH
Partners that expressed commercialisation intentions: VMSL, IGD, FORTH
Real-time virtual fauna simulation SDK S/W
Need addressed: Need for an application independent realistic integrated solution for virtual human and animal simulation
Prime developer: UNIGE, EPFL
Partners that expressed commercialisation intentions: UNIGE, noDna
Real-time virtual flora simulation SDK S/W
Need addressed: Need for an integrated solution for virtual plant generation, representation and simulation
Prime developer: Bionatics
Partners that expressed commercialisation intentions: Bionatics
Behavioural animation of virtual characters Algorithms
Need addressed: Important component of the virtual fauna simulation SDK for autonomous virtual agents
Prime developer: EPFL
Partners that expressed commercialisation intentions: EPFL, Bionatics
Real-time Hair, Cloth, Facial simulation Algorithms
Need addressed: Optimised solution for pre-visualisation and behaviour simulation (cosmetic, apparel industry) as well as for heightened edutainment experiences
Prime developer: UNIGE
Partners that expressed commercialisation intentions: UNIGE, noDna
AR Authoring Tools Suite S/W
Need addressed: An integrated, extensible authoring solution with SDKs for Virtual life in AR situations
Prime developer: IGD
Partners that expressed commercialisation intentions: INTRACOM
Mobile On-Site AR Guide S/W
Need addressed: To enhance the visitor's site experience with AR audiovisual information for new edutainment interactions
Prime developer: INTRACOM, IGD
Partners that expressed commercialisation intentions: INTRACOM