High-Accuracy Diffractometer for Augmented Reality Waveguide Characterization

Jun 2

05:10 PM - 05:35 PM

Description

Due to the demanding optical architectures of diffractive waveguide gratings used for augmented reality applications, the gratings must be manufactured to extremely high accuracies, or image quality will suffer. The grating period has to match the design within tens of picometers, and the tolerances for the relative orientation of the gratings are in the arcsecond range. Both the production masters and the replicated gratings need to be characterized nondestructively, and the grating areas scanned to ensure uniformity. The measurement system should work for surface relief and volume holographic gratings in various material systems. We describe a Littrow diffractometer that can perform this challenging task. A narrow-band, highly stable laser source illuminates a spot on the sample. Mechanical stages with high-accuracy encoders rotate and tilt the sample until the laser beam is diffracted back toward the laser. This so-called Littrow condition is detected through a feedback loop with a beam splitter and a machine vision camera. The grating period and relative orientation can then be calculated from the stage orientation data. With the system properly constructed, and with custom software algorithms performing an optimized measurement sequence, it is possible to reach repeatability in the picometer and arcsecond range for the grating period and relative orientation, respectively. By carefully calibrating the stages or by using golden samples, absolute accuracy for the grating period can also reach the picometer range.
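In the Littrow geometry the abstract describes, the diffracted beam retraces the incident beam, so the grating equation reduces to m·λ = 2·Λ·sin θ and the period Λ follows directly from the measured Littrow angle. A minimal sketch of that calculation (the wavelength, angle, and order values below are illustrative, not from the talk):

```python
import math

def grating_period(wavelength_m: float, littrow_angle_deg: float, order: int = 1) -> float:
    """Grating period from the Littrow condition m*lambda = 2*period*sin(theta)."""
    theta = math.radians(littrow_angle_deg)
    return order * wavelength_m / (2.0 * math.sin(theta))

# Illustrative example: 632.8 nm HeNe line, first order, Littrow angle ~49 degrees
period = grating_period(632.8e-9, 49.0)
```

Because Λ depends on sin θ, picometer-level period repeatability implies arcsecond-level requirements on the stage encoders, which is consistent with the tolerances quoted above.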

Speakers

Team Lead, Optics, OptoFidelity

Related Sessions

Jun 1

11:00 AM - 11:25 AM

Description

Whether for hands-free mobile displays or in situ data overlay, head-mounted augmented reality offers much to improve productivity and reduce human error in space. Unfortunately, existing solutions for tracking and holographic overlay alignment tend to rely on, or at least assume, Earth gravity. Nothing inherent to a microgravity environment makes AR tracking impossible, but several factors need to be taken into account. First, high-frequency camera pose estimation uses SLAM, which relies on data from IMU sensors that, by default, accommodate the acceleration of gravity in their base measurements. Second, most holographic alignment strategies assume a consistent down direction. This session will explore strategies to mitigate these limitations.
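The gravity assumption mentioned above enters through the accelerometer model: an IMU measures specific force f = a − g, and SLAM pipelines typically add a fixed Earth-gravity vector back when recovering true acceleration. A minimal sketch of that step, assuming a known body-to-world rotation (the function and frame names are illustrative, not from the talk):

```python
import numpy as np

EARTH_GRAVITY = np.array([0.0, 0.0, -9.80665])  # ~zero in microgravity

def world_acceleration(specific_force_body: np.ndarray,
                       R_body_to_world: np.ndarray,
                       gravity: np.ndarray = EARTH_GRAVITY) -> np.ndarray:
    # Accelerometers report specific force f = a - g in the body frame;
    # rotate to the world frame and add the gravity model back to get
    # the true linear acceleration used by the SLAM propagation step.
    return R_body_to_world @ specific_force_body + gravity
```

A stationary IMU on Earth reads roughly (0, 0, +9.81) m/s², which this model correctly maps to zero acceleration; in orbit the same hard-coded gravity vector would inject a spurious ~9.81 m/s² bias, which is the failure mode the session addresses.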

Speakers

Chief Technology Officer, Argyle.build
Jun 1

11:30 AM - 11:55 AM

Description

Digital Twins, The Metaverse, Game Engines, 3D Maps and Computer Vision are all merging to make a vision of "Sim City" where the city is our real city. When will this happen? How will it happen? What will it mean for AR and VR? What will we use it for? Why will web3 & crypto be important?

Matt will explore what comes after the AR Cloud is built and we can finally connect The Metaverse to the Real World, showing technically ground-breaking, world-first demos from his new startup, which is making all this possible.

Speakers

CEO, Stealth
Jun 1

12:00 PM - 12:25 PM

Description

Volumetric video technology captures full-body, dynamic human performance in four dimensions. An array of 100+ cameras points inward at a living subject (a person, animal, or group of people) and records its movement from every possible angle. Processed and compressed video data from each camera becomes a single 3D file – a digital twin of the exact performance that transpired on stage – for use on virtual platforms. Finished volcap assets are small enough to stream on mobile devices yet deliver the visual detail of 100+ cameras, making them a go-to solution for bringing humans into the Metaverse.

The volumetric video market is expected to grow from $1.5B USD in 2021 to $4.9B USD by 2026 as holographic imaging becomes increasingly crucial for the development of compelling, human-centric immersive content and Metaverse creators strive to solve the “uncanny valley” problem.

The session dives into the latest and greatest applications of volcap in augmented reality across multiple sectors – including fashion, entertainment, AR marketing and branding, enterprise training, and more…

We’ll examine the ground-breaking potential this technology holds for augmented and mixed reality as well as some of the challenges that may face this burgeoning industry.

Speakers

Stage Hand, Departure Lounge Inc.
General Manager, Metastage