Discreet and Tactile Interactions in AR

May 31

02:55 PM - 03:20 PM

Description

Doublepoint builds a gesture-based touch interface to enable efficient, robust, and intuitive AR interaction. In this talk, they will showcase how these solutions can work as standalone input modalities or be combined with eye tracking and hand tracking to supercharge them.

User input is one of the many challenges standing in the way of the mass adoption of AR. How will the everyday person interact in augmented reality? What is the role of touch in that interaction? At Doublepoint, we research different ways for people to interact in AR and develop technologies to detect these interactions efficiently and reliably.

Currently, there are many input methods for AR that use the headset's built-in sensors, such as hand tracking, eye tracking, and voice input. However, if we want AR to be as transformative as the graphical user interface or the capacitive touchscreen, we need to put deliberate thought into building the ideal input paradigm and the hardware it requires, which may not sit in the headset itself.

In this talk:

• We'll demonstrate how a machine learning algorithm running on existing smartwatches can already significantly improve AR interaction.
• We'll show how it can be combined with the headset's eye tracking and hand tracking sensors to improve interactions even further.
• Lastly, we'll preview some of our upcoming custom hardware dedicated to sensing advanced micro-gestures in a small, convenient form factor.

Speakers

Co-Founder and CTO, Doublepoint

Related Sessions

May 31

11:00 AM - 11:25 AM

Description

Get ready for a look into the world of spatial computing with AWS! Join us as we dive into how AWS is transforming the way 3D models are brought to life in the cloud. We'll be showcasing the latest spatial computing services on AWS that enable you to build, deliver, and manage your 3D workloads with ease and efficiency.

But that's not all: in an on-stage demonstration, you'll see how we paired a Magic Leap 2 with a Boston Dynamics Spot robot to showcase how AWS's cutting-edge technology can help users visualize live robot telemetry and control the robot in even the most challenging and remote environments.

This session and the one that follows are must-attends for professionals interested in exploring the full potential of spatial computing on AWS. Join us for a captivating and informative presentation that is sure to inspire and inform!

Speakers

Sr. Strategic Architect, AWS
Senior Manager, Spatial Computing, AWS
May 31

11:30 AM - 11:55 AM

Description

Digital Twins connect physical systems with virtual representations and models, allowing for visual representations, integration of sensor data, and predictive capabilities for how assets or processes will behave in the future. Globally, organizations are grappling with the acceleration of remote operations and an increasingly prevalent skills gap. Forward-looking companies are addressing this problem by equipping their in-person, hybrid, and off-site teams with mixed reality (MR) solutions that enhance productivity, especially when integrated with Digital Twins. In this session, learn how Sphere and AWS are working together to develop a digital workspace that enables professionals to communicate across languages, distances, dimensions, and time (with predictive capabilities). By partnering on initiatives that include TwinMaker and Lookout for Vision, as well as cloud rendering powered by AWS and its partners, Sphere's cutting-edge solution is pioneering the future of collaboration, expert assistance, and workflow guidance using MR.

Speakers

Global Business Dev. & GTM Leader, AWS
CEO, Sphere (by holo|one)
May 31

12:00 PM - 12:25 PM

Description

Since 2018, Zheng Qin has been developing a cutting-edge, wide-FoV AR optics system called Mixed Waveguide, whose most advanced Crossfire solution delivers a 120-degree FoV and is interchangeable between AR and VR. It's even slimmer than most VR optics (including Pancake solutions), so it could be the ultimate optical solution for AR and VR hybrid glasses. Zheng will walk you through the reasons why Crossfire outperforms the competing Pancake + VST (video see-through) solution. Additionally, Zheng will introduce the whole family of Mixed Waveguide solutions, which has been adopted by many key clients around the world.

Speakers

Founder, Ant Reality