Special Session on Immersive Experiences in AR and VR

  • Simon Gunkel (TNO, The Hague, Netherlands)
  • Christian Timmerer (Alpen-Adria-Universität Klagenfurt and Bitmovin, Austria)
  • Jacob Chakareski (University of Alabama, Tuscaloosa, USA)
  • Daisuke Iwai (Osaka University, Osaka, Japan)
Scope & Goals

With a new wave of Augmented Reality (AR) and Virtual Reality (VR) devices approaching the consumer market to offer immersive experiences to their users, the topic is attracting ever more attention in research and industry. However, the challenges remain considerable, as these immersive experiences impose a complete paradigm shift in how multimedia content is consumed: content perceived through a head-mounted display differs fundamentally from content on a traditional 2D screen, both in its visuals and in its interaction models. Thus, any existing content and experience has to be redesigned to match the requirements of AR and VR, and the technology has to catch up with these new requirements. Overall, immersive experiences in AR and VR are more sensitive to delay and synchronization, and generally demand more resources from end devices and within the system (CPU, GPU, storage, and network bandwidth). This results in many challenges across the whole delivery pipeline, from capturing and processing a multitude of sensor information, to manipulating, streaming, and visualizing different multimedia streams, to estimating performance with new AR/VR QoE and QoS metrics. In this session, we would like to offer a forum to discuss these issues, connecting a broad and interdisciplinary field of research areas (including computer vision, computer graphics, mobile and embedded systems, displays and optics, and user interface design) as well as applications from the entertainment, industrial, military, and commercial sectors.

Topics of Interest:

  • Innovative immersive applications, software architectures, and systems design
  • Web-based AR and VR
  • Networking and distributed systems for AR/VR
  • Over-the-top streaming of 360-degree and 3D content
  • Multimedia compression for visual search, AR and VR
  • Real-time systems and resource-constrained implementations
  • Mobile and embedded computing for AR/VR
  • System-level energy management for mobile AR/VR systems
  • Display technologies for mixed and augmented reality
  • User interface designs for immersive applications in AR/VR
  • QoE assessment of 360-degree and 3D immersive experiences and media content
  • Metadata and mapping of media and objects into AR/VR scenes
  • Security and privacy concerns in AR/VR systems
  • Sensor fusion and ego-motion estimation
  • Active and passive stereo systems
  • 3D modeling and image-based localization
  • Standardization in Virtual/Augmented/Mixed Reality

Please refer to this page for an overview of important dates, submission guidelines, and procedures for MMSys'17 tracks, special sessions, and co-located workshops.

The submission site for the Immersive Experiences in AR and VR session is available here.