09:00 AM - 09:25 AM
In this session we will explore how dynamic aberration correction can increase apparent resolution, eliminate color fringing and pupil-swim effects, and enlarge the eye-box in the highest-end displays, such as the Oculus Quest, Varjo VR-1 and HP Reverb G2.
In striving for high resolution, a wide field of view, and a large eye-box, VR/AR head-mounted display makers face challenges that cannot be overcome by hardware design alone. Even the latest and greatest devices retain common flaws that spoil the user experience: blur and color fringing outside of a small “sweet spot,” picture-quality degradation and geometry distortion at wide gaze angles, and a tiny eye-box.
To achieve realistic picture quality and a natural visual experience, the rendering pipeline has to include advanced image pre-processing beyond the standard geometry warp and channel scaling. Almalence created the Digital Lens, a computational solution that combines a precise characterization of the HMD's optical properties with a dynamic aberration correction technique that adjusts on the fly to eye-tracking data.
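The abstract stops short of implementation detail, but the core idea, pre-distorting each color channel with a warp that depends on the current pupil position, can be sketched in a few lines. The following is a conceptual illustration only, assuming a hypothetical calibration object and a simple radial-polynomial distortion model; it is not Almalence's actual algorithm or API.

```python
# Conceptual sketch only: not Almalence's actual algorithm or API. It assumes a
# hypothetical `calibration` object holding measured per-channel distortion
# coefficients of the headset optics, indexed by pupil position.
import numpy as np
from scipy.ndimage import map_coordinates

def radial_warp(channel, k, center):
    """Resample one color channel with a simple radial-polynomial distortion."""
    h, w = channel.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    dx, dy = xs - center[0], ys - center[1]
    r2 = (dx * dx + dy * dy) / float(max(h, w)) ** 2    # normalized radius^2
    scale = 1.0 + k[0] * r2 + k[1] * r2 * r2             # per-channel polynomial
    return map_coordinates(channel,
                           [center[1] + dy * scale, center[0] + dx * scale],
                           order=1, mode='nearest')

def correct_frame(frame_rgb, pupil_xy, calibration):
    """Pre-distort each color channel for the current pupil position."""
    # coeffs_for() stands in for interpolating the optical characterization
    # at this pupil position (hypothetical helper).
    coeffs = calibration.coeffs_for(pupil_xy)             # e.g. {'r': (k1, k2), ...}
    out = np.empty_like(frame_rgb)
    for i, ch in enumerate('rgb'):
        out[..., i] = radial_warp(frame_rgb[..., i], coeffs[ch], pupil_xy)
    return out
```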
09:30 AM - 09:55 AM
Miniaturization is the trend toward manufacturing ever-smaller mechanical, optical and electronic products, medical devices, and other high-value parts. The trend remains strong, with year-over-year growth in many markets. One of the limiting factors is the inability of traditional manufacturing methods such as injection molding and CNC machining to produce ever-smaller parts effectively and economically.
Additive Manufacturing (AM), or 3D printing, has been around for over 30 years. For a long time only a few technologies were available, and applications were generally limited to prototyping. Past advances in AM fell short of meeting the needs of small parts: printing them at the resolution, accuracy, precision and speed that would make them a viable option for end-use production parts. That has all changed. Additive manufacturing and miniaturization are now converging in a very meaningful and impactful way.
The growth of the AR/VR market and its pace of innovation are opening up applications that were not even imagined a decade ago. With this come challenges for manufacturing to scale at the same pace. Many leading AR/VR technology companies have started using micro 3D printing to produce micro-precision components as an alternative to traditional fabrication methods, finding substantial time and cost savings.
Learn how micro-precision 3D printing not only enables companies in this competitive space to address development challenges that current microfabrication methods cannot, but also lets them push the limits of miniaturization beyond boundaries once thought impossible for 3D printing.
10:00 AM - 10:25 AM
10:40 AM - 11:05 AM
Smart glasses in a normal form factor will significantly change our everyday lives. tooz envisioned this more than a decade ago when the tooz journey began in the corporate research labs of ZEISS Germany. Starting with the first curved waveguide, tooz developed several generations of optical engines and provided a continuous stream of patented inventions to the smart glasses industry. Five years ago, Deutsche Telekom/T-Mobile joined the journey as a 50% shareholder and enabled the development of full smart glasses solutions based on tooz's waveguides. At AWE 2022, tooz will launch its next breakthrough innovation on its mission to lead this market: tooz ESSNZ Berlin, the first market-ready smart glasses reference design with vision correction, which will change how consumers interact daily with data, media and ecosystem interfaces. The underlying tooz technology is highly customizable, scalable, and marketable – not in the future, but already today.
11:10 AM - 11:35 AM
Against the backdrop of the Metaverse, how does the real world fit in, what do open, interoperable, and ethical look like, and what is the role of the Web? We explore some potential guiding principles for the real-world Spatial Web and early steps toward realization through standards activities and the Open Spatial Computing Platform (OSCP). Along the way, we highlight areas that present strong opportunities for academic research, standards, and open-source contributions.
11:35 AM - 12:00 PM
The Open AR Cloud is working to democratize the AR Cloud with infrastructure based on open and interoperable technology, and we are building city-scale AR testbeds that are being experienced in cities around the world. These are real-world use cases that combine the digital with the physical: rich experiences that are synchronous, persistent, and geospatially tied to a specific location. Content in situ lets the user explore the world, connect with others, and share an experience.
We will discuss new types of content activation based on proximity, gaze, voice, sensor data, and algorithmic spatial ads. Partners will present use cases such as wayfinding and NFT exhibits, as well as case studies showing how the technology is being used to build more diverse, equitable, and inclusive real-world communities and to raise awareness of critical issues such as climate change and public health.
01:30 PM - 02:00 PM
We are rapidly entering the world of augmented intelligent reality, where experiences are built at the intersection of the real and digital worlds. In this talk we will share some amazing success stories: discover how Passio is using Unity to build game-changing experiences by combining AR with AI that runs on user devices to transform the fitness, healthcare, home-remodeling and other industries. Join us and be inspired to create the world of your dreams using the next wave of AI and AR technologies.
02:05 PM - 02:30 PM
As part of a DoD project, Deloitte recently built a successful 5G infrastructure, along with a multitude of technologies on top of it, delivering measured value to one of the largest military branches in the US. In this talk we'll discuss AR, edge compute, and the impact 5G is actively having across dozens of military bases today, as well as where we see the future of XR going beyond the current 'Metaverse' buzz.
02:35 PM - 03:00 PM
Ray tracing techniques have dramatically improved graphics rendering quality in recent years. In the coming few years, mobile chipsets will also support hardware-accelerated ray tracing, bringing more visually believable virtual environments with realistic lighting and shadowing effects. It will become a major technique in mobile gaming and in augmented and virtual reality devices.
OPPO, in collaboration with partners, began developing its ray tracing technology in early 2018 and initially adopted a hybrid rendering method to gradually introduce ray tracing to existing mobile devices. This talk will introduce the short history of mobile ray tracing, forecast its trends on mobile devices, and explore potential applications.
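For readers unfamiliar with the term, "hybrid rendering" generally means rasterizing the bulk of the frame and spending a limited per-frame ray budget only where traced rays are most visible, such as shadows and reflections. The toy sketch below is illustrative only, not OPPO's pipeline; all data, names and thresholds are made up for demonstration.

```python
# Toy illustration of hybrid-rendering budgeting (not OPPO's implementation):
# every pixel keeps a cheap rasterized base shade, and a limited ray budget is
# spent only on the most reflective pixels.
import numpy as np

H, W = 240, 320
rng = np.random.default_rng(0)
base_color = rng.uniform(0.2, 0.8, (H, W))     # stand-in for rasterized shading
reflectivity = rng.uniform(0.0, 1.0, (H, W))   # stand-in for material specularity
RAY_BUDGET = 5000                              # rays affordable this frame

def trace_reflection(y, x):
    """Stub for a traced reflection; a real renderer would intersect the scene."""
    return 1.0                                 # pretend the ray hit a bright reflector

# Spend the ray budget on the most reflective pixels; every other pixel keeps
# the cheap rasterized approximation (e.g., an environment-probe lookup).
order = np.argsort(reflectivity, axis=None)[::-1][:RAY_BUDGET]
ys, xs = np.unravel_index(order, (H, W))

traced = np.array([trace_reflection(y, x) for y, x in zip(ys, xs)])
frame = base_color.copy()
frame[ys, xs] = (1 - reflectivity[ys, xs]) * base_color[ys, xs] \
                + reflectivity[ys, xs] * traced

print(f"Ray traced {len(ys)} of {H * W} pixels ({100 * len(ys) / (H * W):.1f}%)")
```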
03:05 PM - 03:30 PM
The prize of functional, lightweight, all-day wearable smartglasses remains as elusive as ever. The missing linchpin component is the optical combiner, which must reconcile the conflicting requirements of see-through quality, display performance, light weight, efficiency, prescription compatibility, and aesthetics. In this talk I’ll describe META’s approach to AR optical combiners based on a free-space architecture enabled by holographic optical elements. Combined with our ophthalmic-compatible ARfusion lens casting technology, the result is a monolithic combiner that provides extraordinary optical efficiency, ophthalmic-grade see-through clarity, and the prescription and aesthetic properties that opticians, consumers, and regulators expect from eyewear. I’ll present META’s One Stop Shop approach to individualization, which circumvents many of the concerns around eyebox size and manufacturability.