11:05 - 11:30
Dispelix develops and delivers lightweight, high-performance see-through waveguide combiners that serve as transparent displays in extended reality (XR) devices. Our full-color near-eye displays encompass all dimensions of XR comfort (social, wearable, and visual alike) in a simple eyeglass-lens form. A rich and pleasant XR experience calls for a seamless merger of display and light engine technologies. In her presentation, Dispelix Vice President Pia Harju discusses how advanced design contributes to XR comfort and, at the same time, helps draw the full potential from the light engine and display. We will showcase how built-in mechanical and optical compatibility fuses the aesthetic and functional aspects of design, paving the way for a mind-altering XR eyewear experience.
10:05 - 10:30
Extended reality (XR) is a rapidly growing industry and the next logical step in the evolution of wearable technology. However, before XR can become truly mainstream, several significant challenges still need to be addressed. These include a large field of view, see-through and distortion-free optics, and a lightweight, compact form factor that allows extended wear, to name just a few.
Virtual retinal display (VRD) technology is a promising approach to AR that could help overcome these and several other fundamental challenges (such as the vergence-accommodation conflict). VRDs create images directly on the retina, which allows for a much more compact and lightweight device. To achieve an optimal VRD user experience, one must account for eye movements so that the projected image constantly stays in the center of vision. This can be achieved by combining VRD with precise retinal (blood-vessel-based) eye tracking (RET), providing a more immersive and natural AR experience.
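As a purely illustrative sketch (not the speakers' implementation), the gaze-compensation step described above could look like the following TypeScript, where the GazeSample type and the degreesPerPixel scale are hypothetical:

    // Hypothetical sketch: re-centering a VRD image on the fovea from an eye-tracker reading.
    // GazeSample and degreesPerPixel are illustrative assumptions, not a real product API.
    interface GazeSample {
      yawDeg: number;   // horizontal eye rotation from straight ahead, in degrees
      pitchDeg: number; // vertical eye rotation, in degrees
    }

    // Shift the projected raster by the same angle the eye has rotated,
    // so the image stays centered in the wearer's vision.
    function projectionOffsetPx(gaze: GazeSample, degreesPerPixel: number): { x: number; y: number } {
      return {
        x: gaze.yawDeg / degreesPerPixel,
        y: gaze.pitchDeg / degreesPerPixel,
      };
    }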
In this talk, we will discuss the potential of VRD and RET to break the barriers of AR. We’ll start by identifying the challenges that the AR industry is currently facing, then explain how VRD coupled with RET can help to address them. We will also mention some of the unique potential applications of VRD with RET in AR. We believe that VRD combined with RET has the potential to revolutionize the AR industry. By overcoming the challenges of size, comfort, and several others, VRD and RET could make AR wearables that are truly convenient and practical for all-day use. This would open up a whole new world of possibilities for AR and could help make AR a part of our everyday lives.
16:35 - 17:00
Join us at the conference for an exciting session on WebGPU, the cutting-edge technology revolutionizing high-performance in-browser rendering for Metaverses.
In this session, we will explore how WebGPU makes high-quality graphics accessible to all, eliminating the need for separate app installations on user devices. As an evolution of WebGL, WebGPU offers enhanced XR capabilities, enabling developers to create immersive and visually stunning WebXR experiences. Moreover, you'll have the opportunity to witness a live demonstration showcasing the real-time capabilities of both WebGPU and WebGL on a captivating real-world project.
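For context on the API the session builds on, a minimal, illustrative WebGPU setup in TypeScript might look like the sketch below; the canvas id and overall structure are assumptions for illustration, not material from the demo (WebGPU type declarations come from the @webgpu/types package):

    // Minimal WebGPU bootstrap: acquire a device, configure a canvas, and clear it once.
    async function initWebGPU(): Promise<void> {
      if (!navigator.gpu) {
        throw new Error("WebGPU is not supported in this browser");
      }
      const adapter = await navigator.gpu.requestAdapter();
      if (!adapter) {
        throw new Error("No suitable GPU adapter found");
      }
      const device = await adapter.requestDevice();

      // "xr-canvas" is a hypothetical element id used only for this sketch.
      const canvas = document.getElementById("xr-canvas") as HTMLCanvasElement;
      const context = canvas.getContext("webgpu") as GPUCanvasContext;
      context.configure({ device, format: navigator.gpu.getPreferredCanvasFormat() });

      // A render pass that simply clears the canvas; a real scene would bind
      // pipelines, vertex buffers, and draw calls here.
      const encoder = device.createCommandEncoder();
      const pass = encoder.beginRenderPass({
        colorAttachments: [{
          view: context.getCurrentTexture().createView(),
          clearValue: { r: 0, g: 0, b: 0, a: 1 },
          loadOp: "clear",
          storeOp: "store",
        }],
      });
      pass.end();
      device.queue.submit([encoder.finish()]);
    }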
15:30 - 16:00
The productivity metaverse is in the trough of Gartner’s hype cycle, and that means it is actually taking off. All around the globe, industries and governments are eagerly pushing XR projects and realizing their first benefits. We will share research conducted jointly by Ernst & Young and Nokia that highlights concrete commercial and productivity impacts of XR today. And then, because we are dreamers and innovators, we’ll provide some insights on the road to 6G: what can 6G bring that is already in planning, and how can the XR community ensure 6G addresses its needs and hopes while, by the same token, taking full advantage of the power of a sensing network?
13:30 - 13:55
Solving the vergence-accommodation conflict, the mismatch between perceived (vergence) distance and focal distance in stereoscopic 3D displays, is a critical hurdle for augmented reality (AR). Overcoming it would smooth interactions with virtual content, blurring the line between the simulated and the real, yet the industry has yet to settle on an approach.
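As a back-of-the-envelope illustration (not part of the IDTechEx analysis), the conflict can be quantified in diopters from the vergence distance d_v and the display focal distance d_f:

    % Illustrative only: mismatch between vergence and accommodation demand.
    % d_v = distance of the rendered content (vergence), d_f = focal-plane distance, in meters.
    \[
      \text{VAC mismatch} \;=\; \left| \frac{1}{d_v} - \frac{1}{d_f} \right| \quad \text{(diopters)}
    \]
    % Example: content rendered at d_v = 0.5 m on a fixed focal plane at d_f = 2 m gives
    % |1/0.5 - 1/2| = 1.5 D of conflict, which is what the approaches surveyed below aim to remove.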
In this presentation, IDTechEx outlines the range of display and optical systems proposed to solve the vergence-accommodation conflict, weighing up technological suitability and market forces to suggest likely candidates for wide deployment. Technologies including retinal projection, holographic and light field displays, and focus-tunable lenses are detailed and benchmarked on social acceptability, manufacturing feasibility, and other factors. Alongside an assessment of industry forces, this leads to the presentation of adoption roadmaps, plotting a path for integration into immersive consumer AR devices.
12:30 - 12:55
Join Yacine Achiakh, Founder of Wisear, as he delves into the crucial role of neural interfaces in driving the widespread adoption of augmented reality (AR) and demonstrates their benefits live on stage. In this captivating keynote, Yacine will explore the evolution of human-computer interfaces, from keyboards and mice to touchscreens, and highlight the need for a new generation of interfaces to propel AR into the mainstream. Discover how current controllers fall short in delivering seamless and immersive interactions, and why alternatives like voice and hand tracking have their limitations.
Yacine will unveil the game-changing potential of neural interfaces, which enable touchless and voiceless control through facial muscle, eye, and brain activity. Witness live how Wisear is at the forefront of building neural-interface-powered products that revolutionize the way we interact with AR and VR devices, paving the way for ubiquitous adoption in consumer and enterprise applications. Don't miss this enlightening presentation that will shape the future of human-computer interactions in XR.
17:35 - 18:00
This talk will provide an overview of existing Web3 technologies and a realistic view of their current capabilities in supporting XR applications. Web3, powered by blockchain technology, is evolving rapidly, and numerous upcoming features benefit augmented and virtual reality use cases. It is valuable for XR developers and enterprises to be aware of developments in the Web3 space and to explore synergies with decentralized protocols.
13:00 - 13:25
Testing the optical performance of components and the image quality of a completed XR headset is an important but little-known part of the product development and mass production cycle; in fact, today's headsets would likely not exist without it.
In this presentation, we will give an overview of the various steps in the production process where optical testing comes into play and discuss new developments such as active alignment, where real-time test data is used to assemble XR modules for the best possible image quality.
Finally, we describe a new generation of test equipment built around custom high-end optics tailored to XR, integrating both "big picture" and "small picture" image-quality test capabilities in one instrument and bringing test technology to the next level to support tomorrow's high-resolution XR headsets.
17:05 - 17:30
This talk delves into the immersive world of Augmented Reality (AR) content creation and its profound impact on spatial storytelling across diverse industries. We will explore the untapped potential of AR and discuss how intuitive authoring tools and generative AI can be used to craft captivating AR experiences.
The talk begins by exploring why spatial storytelling matters and how it revolutionises the way we engage with the world around us. Attendees will gain insights into the unique power of AR to seamlessly blend digital elements into the real world, creating interactive and emotionally resonant narratives that transcend traditional media formats.
We will then delve into the practical aspects of AR content creation. Attendees will learn to harness essential creative and user-friendly authoring tools, empowering them to build immersive AR experiences with ease. The integration of generative AI tools will be showcased, demonstrating how AI-driven content creation enhances storytelling possibilities and audience engagement.
Beyond content creation, the talk will touch on the commercial side of AR. Attendees will gain valuable insights into strategies for commercialising AR content and effectively targeting specific audiences. Spatial storytelling can be leveraged to create next-generation entertainment, interactive educational experiences, and enhanced customer interactions, making it an indispensable tool for anyone seeking to stand out in a competitive market.
16:05 - 16:30
The first full-time adopters of smart eyewear will be people who already wear eyewear on a daily basis, and virtually everyone in this category requires prescription lenses. Therefore, the first generation of smart eyewear that is worn all day, every day, will have integrated prescription optics. Currently, the AR ecosystem lacks a proper solution for prescription. Mass customization of optics without compromising quality is key to solving this problem and will be covered extensively in this talk.
We will outline what the path to the first AR glasses for full-time use will look like and the role that 3D printing will play. We will give an overview of the technical aspects of using 3D printing for personalized optics, compare different 3D-printing-based technologies, and look at different optical designs for AR glasses, including the increasingly popular so-called push-pull designs, which we will cover from both a technical and a user-experience perspective.
At the end of this talk, you will understand how 3D printing can be used to manufacture nanometer-smooth optical lenses in a way that scales to mass production. You will be up to date on some of the latest advancements in AR optics and have a new perspective on what the first successful smart glasses will look like.