
Dynamic Aberration Correction Enables Users to See High Resolution in VR Displays

Jun 3

09:00 AM - 09:25 AM

Description

In this session we will explore how dynamic aberration correction can increase apparent resolution, eliminate color fringing and pupil swim effects, and enlarge the eye-box in the highest-end displays, such as the Oculus Quest, Varjo VR-1, and HP Reverb G2.

To achieve high resolution, a wide field of view, and a large eye-box, VR/AR head-mounted display makers face challenges that hardware design alone cannot overcome. Even the latest and greatest devices retain common flaws that spoil the user experience: blur and color fringing outside of the small “sweet spot,” picture-quality degradation and geometry distortion at wide gaze angles, and a tiny eye-box.

To achieve realistic picture quality and a natural visual experience, the rendering pipeline has to include advanced image pre-processing beyond the standard geometry warp and channel scaling. Almalence created the Digital Lens, a computational solution that combines a precise characterization of the HMD’s optical properties with a dynamic aberration correction technique that adjusts on the fly to eye-tracking data.
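As a rough illustration of the idea (hypothetical names throughout; not Almalence’s actual Digital Lens API), dynamic correction of lateral chromatic aberration can be pictured as warping each color channel through a pupil-position-dependent lookup that is re-derived every frame from the eye tracker:

```python
import numpy as np

def bilinear_sample(img, xs, ys):
    """Sample a single-channel image at float coordinates (bilinear)."""
    h, w = img.shape
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    fx, fy = np.clip(xs - x0, 0, 1), np.clip(ys - y0, 0, 1)
    top = img[y0, x0] * (1 - fx) + img[y0, x0 + 1] * fx
    bot = img[y0 + 1, x0] * (1 - fx) + img[y0 + 1, x0 + 1] * fx
    return top * (1 - fy) + bot * fy

def correct_frame(frame, pupil_xy, maps):
    """Pre-warp each color channel for the current pupil position.

    `maps[c](pupil_xy)` returns (src_x, src_y) lookup grids for channel
    `c`, interpolated from an offline characterization of the optics
    sampled over pupil positions. Warping R, G and B separately cancels
    lateral chromatic aberration (color fringing); re-deriving the maps
    every frame from the tracked pupil is what makes the correction
    "dynamic" and suppresses pupil swim across the eye-box.
    """
    out = np.empty_like(frame)
    for c in range(3):
        src_x, src_y = maps[c](pupil_xy)
        out[..., c] = bilinear_sample(frame[..., c], src_x, src_y)
    return out
```

In practice such a warp would run on the GPU as a final pass before display scan-out, but the structure is the same.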

Speakers

CEO, Almalence

Related Sessions

Jun 1

11:00 AM - 11:25 AM

Description

Whether for hands-free mobile displays or in-situ data overlay, head-mounted augmented reality offers much to improve productivity and reduce human error in space. Unfortunately, existing solutions for tracking and holographic overlay alignment tend to rely on, or at least assume, Earth’s gravity. Nothing inherent to a microgravity environment makes AR tracking impossible, but several factors need to be taken into account. First, high-frequency camera pose estimation uses SLAM, which relies on data from IMU sensors that by default accommodate the acceleration of gravity in their base measurements. Second, most holographic alignment strategies assume a consistent down direction. This session will explore strategies to mitigate these limitations.
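For intuition, here is a minimal sketch (illustrative names only; not tied to any specific SLAM library) of where the gravity assumption enters standard strapdown IMU propagation, and what has to change in microgravity:

```python
import numpy as np

GRAVITY_EARTH = np.array([0.0, 0.0, -9.81])  # terrestrial default, m/s^2
GRAVITY_MICRO = np.zeros(3)                  # on orbit there is no fixed "down"

def propagate(pos, vel, R, accel_meas, gyro_meas, dt, g=GRAVITY_EARTH):
    """One strapdown integration step of position, velocity, orientation.

    Accelerometers measure specific force in the body frame, so a
    terrestrial pipeline rotates the reading into the world frame and
    adds `g` back. In microgravity that term must be (near) zero, and
    any alignment heuristic built on a consistent down direction
    breaks along with it.
    """
    accel_world = R @ accel_meas + g          # gravity is restored here
    pos = pos + vel * dt + 0.5 * accel_world * dt ** 2
    vel = vel + accel_world * dt
    # First-order orientation update from the gyro rate (small-angle).
    wx, wy, wz = gyro_meas * dt
    R = R @ np.array([[1.0, -wz,  wy],
                      [ wz, 1.0, -wx],
                      [-wy,  wx, 1.0]])
    return pos, vel, R
```

Passing `g=GRAVITY_MICRO` captures the orbital case; a real filter would instead estimate the small residual acceleration online.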

Speakers

Chief Technology Officer, Argyle.build
Jun 1

11:30 AM - 11:55 AM

Description

Digital Twins, the Metaverse, game engines, 3D maps, and computer vision are all merging toward a “Sim City” vision in which the city is our real city. When will this happen? How will it happen? What will it mean for AR and VR? What will we use it for? Why will web3 and crypto be important?

Matt will explore what comes after the AR Cloud is built and we can finally connect the Metaverse to the real world, showing technically ground-breaking, world-first demos of his new startup, which is making all this possible.

Speakers

CEO, Stealth
Jun 1

12:00 PM - 12:25 PM

Description

Volumetric video technology captures full-body, dynamic human performance in four dimensions. An array of 100+ cameras points inward at a living subject (a person, an animal, a group of people) and records its movement from every possible angle. Processed and compressed video data from each camera becomes a single 3D file – a digital twin of the exact performance that transpired on stage – for use on virtual platforms. Finished volcap assets are small enough to stream on mobile devices yet deliver the visual detail of 100+ cameras, making them a go-to solution for bringing humans into the Metaverse.
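As a loose sketch of what a finished asset looks like at playback time (the layout and names below are assumptions, not any particular volcap product’s format), the streamed file reduces to a timed sequence of compressed mesh-plus-texture frames:

```python
from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class VolcapFrame:
    mesh: bytes       # compressed geometry for this time step
    texture: bytes    # texture atlas baked from the 100+ camera views
    timestamp: float  # seconds from the start of the performance

def play(frames: Iterable[VolcapFrame],
         present: Callable[[bytes, bytes, float], None]) -> None:
    """Hand each decoded frame to the renderer in capture order.

    The heavy lifting -- fusing all camera views into one mesh per
    frame -- happens offline; playback only decodes one small
    mesh/texture pair at a time, which is why finished assets can
    stream to mobile devices.
    """
    for f in frames:
        present(f.mesh, f.texture, f.timestamp)
```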

The volumetric video market is expected to grow from US$1.5B in 2021 to US$4.9B by 2026 as holographic imaging becomes increasingly crucial to the development of compelling, human-centric immersive content and Metaverse creators strive to solve the “uncanny valley” problem.

The session dives into the latest and greatest applications of volcap in augmented reality across multiple sectors – including fashion, entertainment, AR marketing and branding, enterprise training, and more…

We’ll examine the ground-breaking potential this technology holds for augmented and mixed reality as well as some of the challenges that may face this burgeoning industry.

Speakers

Stage Hand, Departure Lounge Inc.
General Manager, Metastage