09:00 AM - 09:25 AM
As immersive technologies take on greater roles in our training, education, and healthcare, excluding people with disabilities carries more legal and economic risk than ever before. We're trapped in a vicious cycle of exclusion that threatens to deepen inequity, limit XR's potential, and hurt your bottom line. But we can break free!
XR Access' Dylan Fox and Danielle Montour will share why accessibility is vital to the success of your application. You'll learn about the fundamental principles of codesign and modularity, the surprising link between accessibility and artificial intelligence, and best practices for addressing specific disabilities. Finally, you'll learn how XR Access has been working to end the cycle of exclusion.
09:30 AM - 09:55 AM
The technology of virtual worlds is constantly evolving, and one of the most significant changes in recent years has been the rise of spatial chat, which lets participants communicate in a three-dimensional virtual space. But spatial chat is more than just a tool for enhancing immersion. In this talk, we will explore how spatial chat is changing the way participants communicate and collaborate. We will discuss its benefits, including its ability to facilitate teamwork, improve experiences, and foster social connections, as well as the potential for future innovation in this area.
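To make the core mechanic concrete: spatial chat typically attenuates and pans each participant's voice based on where they stand relative to the listener. Below is a minimal, platform-agnostic sketch of that idea; the constants and function names are illustrative assumptions, not any particular spatial chat implementation.

```typescript
// Minimal sketch of distance-based voice attenuation and panning for spatial chat.
// All names and constants are illustrative, not a specific platform's API.

interface Vec3 { x: number; y: number; z: number; }

const REF_DISTANCE = 1;   // distance (m) at which gain is 1.0
const MAX_DISTANCE = 30;  // beyond this, the voice is effectively muted
const ROLLOFF = 1;        // inverse-distance rolloff factor

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Inverse-distance gain, clamped so nearby speakers are not over-amplified.
function voiceGain(listener: Vec3, speaker: Vec3): number {
  const d = Math.max(distance(listener, speaker), REF_DISTANCE);
  if (d > MAX_DISTANCE) return 0;
  return REF_DISTANCE / (REF_DISTANCE + ROLLOFF * (d - REF_DISTANCE));
}

// Simple left/right pan from the speaker's horizontal offset: -1 (left) .. +1 (right).
// Ignores listener orientation for brevity.
function voicePan(listener: Vec3, speaker: Vec3): number {
  const dx = speaker.x - listener.x;
  const d = Math.max(distance(listener, speaker), REF_DISTANCE);
  return Math.max(-1, Math.min(1, dx / d));
}

// Example: a speaker 5 m away and slightly to the right.
console.log(voiceGain({ x: 0, y: 0, z: 0 }, { x: 2, y: 0, z: 4.5 }));
console.log(voicePan({ x: 0, y: 0, z: 0 }, { x: 2, y: 0, z: 4.5 }));
```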
10:00 AM - 10:25 AM
Shared AR is a foundational service for our AR future and opens up new social opportunities today. Learn more about how Niantic can enable you to create shared experiences with 8th Wall (WebAR) and Lightship ARDK (Unity).
10:30 AM - 10:55 AM
When AR and VR are used to train people in advance or assist them on the fly, users are typically given information about the current task step (cues), either immediately before or while doing it. Many tasks, in the real world and in AR and VR, also use precues about what to do after the current step. We have been exploring how precueing multiple steps in advance in AR and VR can influence performance, for better or for worse. Our work addresses tasks ranging from simple path following to manipulating physical and virtual objects with one or both hands. I will present some of our results on the effectiveness of different numbers and types of precues, and explain their implications for task training and guidance.
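For illustration only, here is a hypothetical sketch of how a guidance system might surface the current step's cue together with a configurable number of precues for upcoming steps. The data shapes and names are assumptions for the example, not the speaker's actual implementation.

```typescript
// Hypothetical sketch of cue vs. precue selection for step-by-step guidance.
// The data shapes and names are assumptions, not the speaker's implementation.

interface TaskStep {
  id: string;
  instruction: string;
}

interface GuidanceFrame {
  cue: TaskStep;        // the step the user should perform now
  precues: TaskStep[];  // upcoming steps shown in advance
}

// Return the current cue plus up to `precueCount` precues for the steps after it.
function guidanceForStep(steps: TaskStep[], currentIndex: number, precueCount: number): GuidanceFrame {
  const cue = steps[currentIndex];
  const precues = steps.slice(currentIndex + 1, currentIndex + 1 + precueCount);
  return { cue, precues };
}

const steps: TaskStep[] = [
  { id: "s1", instruction: "Walk to waypoint A" },
  { id: "s2", instruction: "Pick up the virtual block" },
  { id: "s3", instruction: "Place the block on the shelf" },
  { id: "s4", instruction: "Return to start" },
];

// Show the current step plus two precues.
console.log(guidanceForStep(steps, 0, 2));
```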
11:00 AM - 11:25 AM
Luisa Paes from ARVORE Immersive Experiences offers an overview of how the studio approaches narrative in its award-winning XR projects. We'll take a broad look at where to start and what to know about the field, including practical solutions from their games Yuki (both VR and MR) and Pixel Ripped. If you want to know more about the link between narrative design and XR, look no further!
11:30 AM - 11:55 AM
Immersive technology is rapidly evolving, and AR/VR developers must collectively update our design standards. When innovative XR developers create new interactive functionality, those features become adopted across the industry as XR Design Standards. As technology for hand-tracking improves, we will continue to see an evolution of gesture controls and tools.
This session will offer a deep dive into some of the best interactive features found in XR games and utility experiences. From the "lasso grab" in Half-Life: Alyx to the amazing suite of gesture controls in Hand Physics Lab, we will explore best practices for building and maintaining modular tools for designing interactive XR experiences.
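As a rough illustration of one such feature, a distance grab in the spirit of the "lasso grab" can be modeled as picking the object closest to the hand's pointing ray within a cone, then pulling it toward the hand on a flick. The sketch below shows only the target-selection step; the thresholds and names are assumptions, not Valve's implementation.

```typescript
// Rough sketch of distance-grab target selection: pick the object closest to the
// hand's pointing ray within a cone. Thresholds and names are illustrative only.

interface Vec3 { x: number; y: number; z: number; }
interface Grabbable { name: string; position: Vec3; }

const MAX_GRAB_DISTANCE = 5;                 // meters
const MAX_CONE_ANGLE = 15 * Math.PI / 180;   // radians off the pointing ray

function sub(a: Vec3, b: Vec3): Vec3 { return { x: a.x - b.x, y: a.y - b.y, z: a.z - b.z }; }
function len(v: Vec3): number { return Math.hypot(v.x, v.y, v.z); }
function dot(a: Vec3, b: Vec3): number { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Return the grabbable closest to the ray (by angle), or null if nothing qualifies.
function pickGrabTarget(handPos: Vec3, handDir: Vec3, objects: Grabbable[]): Grabbable | null {
  let best: Grabbable | null = null;
  let bestAngle = MAX_CONE_ANGLE;
  for (const obj of objects) {
    const toObj = sub(obj.position, handPos);
    const dist = len(toObj);
    if (dist === 0 || dist > MAX_GRAB_DISTANCE) continue;
    const angle = Math.acos(dot(toObj, handDir) / (dist * len(handDir)));
    if (angle < bestAngle) {
      bestAngle = angle;
      best = obj;
    }
  }
  return best;
}

// Example: hand at origin pointing down +Z, two candidate objects.
const target = pickGrabTarget(
  { x: 0, y: 0, z: 0 },
  { x: 0, y: 0, z: 1 },
  [
    { name: "mug", position: { x: 0.2, y: 0, z: 3 } },
    { name: "crate", position: { x: 2, y: 0, z: 2 } },
  ],
);
console.log(target?.name); // "mug" — the crate falls outside the grab cone
```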
01:05 PM - 01:30 PM
In this talk we'll explore the technology stack required for creating generative XR experiences. We'll break down the components of that stack, including the XR platform, programming languages, libraries, and frameworks, explain how they work together to create immersive and interactive experiences, and focus on selecting the right combination for your specific generative XR project. We'll also dive into implementing generative techniques such as procedural generation and artificial intelligence to create unique and engaging experiences, and discuss the potential benefits, challenges, and current limitations of generative AI in XR development. By the end of the talk, listeners will be equipped with the knowledge and tools to start exploring the possibilities of generative XR, and will understand both its current limitations and its future potential.
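To ground the procedural-generation piece, here is a minimal, engine-agnostic sketch of seeded procedural placement, where the same seed deterministically reproduces the same layout; this is useful for shared or replayable XR scenes. The structure and names are illustrative assumptions rather than any specific framework's API.

```typescript
// Minimal, engine-agnostic sketch of seeded procedural placement: the same seed
// always yields the same layout. Names and structure are illustrative only.

interface Placement { kind: string; x: number; z: number; scale: number; }

// Small deterministic PRNG (mulberry32) so results are reproducible from a seed.
function mulberry32(seed: number): () => number {
  return () => {
    seed |= 0; seed = (seed + 0x6D2B79F5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Scatter `count` props of the given kinds across a square area of side `size` meters.
function generateLayout(seed: number, count: number, size: number, kinds: string[]): Placement[] {
  const rand = mulberry32(seed);
  const placements: Placement[] = [];
  for (let i = 0; i < count; i++) {
    placements.push({
      kind: kinds[Math.floor(rand() * kinds.length)],
      x: (rand() - 0.5) * size,
      z: (rand() - 0.5) * size,
      scale: 0.5 + rand(),
    });
  }
  return placements;
}

// Example: the same seed reproduces the same scene layout on every client.
console.log(generateLayout(42, 5, 10, ["tree", "rock", "crystal"]));
```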
01:35 PM - 02:00 PM
The electric power industry has many opportunities to enhance employee learning with XR, from training and interactive procedure walkthroughs to data visualization. However, the details of the immersive interface can impact users’ ability to gain knowledge in both positive and negative ways. This talk will use the current challenges facing the electric power industry, and ongoing research into addressing those challenges with XR, as a starting point for a larger discussion about the potential of XR to both enhance and hinder users’ knowledge acquisition and situational awareness.
02:05 PM - 02:30 PM
Learn how to make player feedback fun and informative for your entire team with this one simple trick! Join Ashley “ashleyriott” Blake, Chief Operating Officer of Andromeda Entertainment, as she shares simple tips for tagging player feedback about your IP and translating it into quantifiable priorities for your production teams, using examples from made-for-VR titles. She'll cover best practices for parsing feedback, including how to demystify emotional responses by breaking them down into measurable priorities and actionable data. Use data to give your players a voice and paint a clear picture for your team.
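As one illustration of turning tagged feedback into quantifiable priorities, the sketch below counts tagged feedback entries, weights them by severity, and ranks the resulting themes. The tags, weights, and names are hypothetical, not Andromeda Entertainment's actual process.

```typescript
// Illustrative sketch of turning tagged player feedback into ranked priorities.
// Tags, severity weights, and names are hypothetical, not Andromeda's actual process.

interface FeedbackEntry {
  text: string;
  tags: string[];                       // e.g. "comfort", "audio", "onboarding"
  severity: "minor" | "major" | "blocker";
}

const SEVERITY_WEIGHT: Record<FeedbackEntry["severity"], number> = {
  minor: 1,
  major: 3,
  blocker: 5,
};

// Sum severity-weighted mentions per tag and return tags sorted by total score.
function rankPriorities(entries: FeedbackEntry[]): Array<{ tag: string; score: number }> {
  const scores = new Map<string, number>();
  for (const entry of entries) {
    for (const tag of entry.tags) {
      scores.set(tag, (scores.get(tag) ?? 0) + SEVERITY_WEIGHT[entry.severity]);
    }
  }
  return [...scores.entries()]
    .map(([tag, score]) => ({ tag, score }))
    .sort((a, b) => b.score - a.score);
}

const feedback: FeedbackEntry[] = [
  { text: "Turning made me queasy", tags: ["comfort"], severity: "blocker" },
  { text: "Couldn't find the tutorial", tags: ["onboarding"], severity: "major" },
  { text: "Music loops too fast", tags: ["audio"], severity: "minor" },
  { text: "Snap turning helped a lot", tags: ["comfort"], severity: "minor" },
];

console.log(rankPriorities(feedback)); // comfort first, then onboarding, then audio
```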
02:35 PM - 03:30 PM
"Shortcuts for Indie AR Devs" conference panel will be led by the "Wand Duel AR" team, who will share their knowledge on using Metahumans and open source tools to make AR development easier and create interactive experiences. The team will discuss the challenges and opportunities in AR development and provide guidance on effectively using these resources.