Agenda

Jun 1

02:30 PM - 02:55 PM

Description

Since 2005, Forklift University has been at the forefront of training people on Powered Industrial Trucks (PITs), providing both OSHA-compliant training and OSHA certification.

The brain retains more information from real-life experiences, and the millennial generation requires training that captivates and teaches in a more visual manner. Forklift University therefore needed a solution captivating enough for Gen Z while remaining user-friendly and adaptable for everyone from Gen Y to the boomers.

Also, when it comes to PIT training, safety is of the utmost importance, followed by cost. They needed a solution that would let their users learn the dos and don’ts of forklift driving in a reasonably realistic warehouse setting.

Travancore Analytics (TA) has been a key player in the extended reality domain. When Forklift University approached TA with the problem, TA proposed a combination of the Metaverse and VR to meet these needs. TA created a realistic warehouse and a sit-down forklift in 3D software and incorporated them into a complex yet user-friendly application built with Unity. The application works in conjunction with the HTC VIVE Pro 2, a VR head-mounted device, allowing the user to drive a virtual forklift in a virtual environment and undertake training curated with OSHA guidelines in mind. With training modules of varying degrees of difficulty and functionality, Forklift University can give its users close-to-real-life forklift training in complete safety. With VR technology, multiple PIT training applications can be created across environments that replicate real-life challenges, incorporating features that make for more in-depth training software.

Speakers

President, Forklift University
VP - Engineering, Travancore Analytics
May 31

01:55 PM - 02:20 PM

Description

Construction is a complex and messy process, combining millions of individual components - all needing to be positioned in exactly the right place with mm-level accuracy. Yet the construction process has largely remained unchanged for thousands of years. A group of people design a building, listening to every wish and elaborate desire coming from the owner, and then they hand the design over to thousands of other people to actually build it, without any understanding of how the design fits the real world. It’s kind of like building a massive jigsaw puzzle where thousands of people are each responsible for one piece or another, and no one really knows how they all fit together. This waterfall process leads to building things up only to tear them down immediately afterward and build them right back again - just moved over by 1 ft - something the construction industry spends over $280B per year doing. This is simply not sustainable for the industry, for the stakeholders, and most importantly - for the planet. With nearly 40% of the world’s carbon emissions contributed by the construction industry, transformation is desperately needed.
And that’s exactly what Trimble is working to do. As a leader in high-accuracy positioning technologies, Trimble has a long-standing history of bringing precision to the construction industry - helping to fit all those puzzle pieces together. But we see the opportunity to do more. Since 1997, when we filed our first XR patent, Trimble has been transforming the way the world works by connecting the physical and digital worlds. Now we’re working to change this archaic narrative by empowering everyone in construction to visualize, interpret, and act on the design through augmented and mixed reality technologies in the field. From catching design mistakes by viewing 3D models on any worker’s iPad, to being more efficient by controlling the most precise total station with nothing more than your gaze, we are improving communication and collaboration around design intent, enabling more efficient and sustainable projects. Follow us on this journey as we outline how extended reality technologies are revolutionizing the way the construction industry operates today, tomorrow, and for decades to come.

Speakers

VDC Manager, Canadian Turner Construction Company
Product Manager, Trimble
May 31

02:55 PM - 03:20 PM

Description

Doublepoint builds a gesture-based touch interface to enable efficient, robust, and intuitive AR interaction. In this talk, they will showcase how these solutions can work as standalone input modalities or be combined to supercharge eye tracking and hand tracking.

User input is one of the many challenges standing in the way of the mass adoption of AR. How will the everyday person interact in Augmented Reality? What's the role of touch in interaction? At Doublepoint we research different ways for people to interact in AR and develop the best technologies to detect these interactions efficiently and reliably.

Currently, there are many input methods for AR that use the headset’s built-in sensors, such as hand tracking, eye tracking, and voice input. However, if we want AR to be as transformative as the graphical user interface or the capacitive touchscreen, we need to put deliberate thought into building the ideal input paradigm and the hardware it needs - which might not be in the headset itself.

In this talk:

• We’ll demonstrate how a machine learning algorithm on existing smartwatches can already significantly improve AR interaction.
• We’ll show how it can be combined with eye tracking and hand tracking sensors in the headset to improve interactions even more.
• Lastly, we'll show some of our future custom hardware dedicated to sensing advanced micro-gestures in a small and convenient form factor.

Speakers

Co-Founder and CTO, Doublepoint
May 31

11:30 AM - 11:55 AM

Description

Digital Twins connect physical systems with virtual representations and models, allowing for visual representations, integration of sensor data, and predictive capabilities for how assets or processes will behave in the future. Globally, organizations are grappling with the acceleration of remote operations and an increasingly prevalent skills gap. Forward-looking companies are addressing this problem by equipping their in-person, hybrid, and off-site teams with mixed reality (MR) solutions that enhance productivity, especially when integrated with Digital Twins. In this session, learn how Sphere and AWS are working together to develop a digital workspace that enables professionals to communicate across languages, distances, dimensions, and time (with predictive capabilities). By partnering on initiatives that include TwinMaker and Lookout for Vision, as well as cloud rendering powered by AWS and its partners, Sphere’s cutting-edge solution is pioneering the future of collaboration, expert assistance, and workflow guidance using MR.

Speakers

Global Business Dev. & GTM Leader, AWS
CEO, Sphere (by holo|one)
Jun 2

09:00 AM - 09:25 AM

Description

Companies both large and small are looking at VR with AR passthrough (passthrough MR). The availability of smaller and smaller optics, particularly systems using pancake optics, seems to have accelerated this trend. Everything that works well in passthrough MR is a major problem to solve in optical MR, and vice versa. This presentation will discuss the optical, display, and human-factor issues of optical MR compared to VR with passthrough MR.

Speakers

President, KGonTech
Jun 2

02:35 PM - 03:30 PM

Description

According to leading executives of XR companies, what are the key considerations when designing haptics experiences and products? What insights and lessons do you need to know to save you from reinventing the wheel? What are the brightest minds in XR predicting the future of haptics will look like?

Speakers

Business Development Manager, bHaptics
CEO / Co-Founder, Sensoryx
Strategic Partnerships, TITAN Haptics
Vice President, Sales, HaptX
Strategic Partnership Manager, Contact CI
May 31

05:35 PM - 06:00 PM

Description

Behind the world’s first XR creator tool for responsive content, SyncReality is premiering its latest offering at AWE Santa Clara. The Beta release of the SyncReality Suite will feature the main tool for automated placement of virtual assets in the real world, a room scanner, a parametric asset tool, and a simulator.

The SyncReality Suite is a revolutionary XR spatial content creation tool - enabling any XR content to automatically adapt to the end user’s space - designed to unlock new possibilities for businesses, brands, and developers that want to provide enhanced user experiences across vCommerce, education, gaming, and more, engaging and exciting their audiences.

Following the launch of Alpha in February 2023, SyncReality Beta boasts cutting-edge features including:

• Optimized product interface for easy navigation and creation
• Streamlined workflow management for cross-team development
• Improved user experience for seamless interactions
• Parametric Asset Bundles, keeping aesthetics intact in a broad variety of spaces
• Simulator to verify XR content before exporting

SyncReality’s unique technology makes it easier than ever for developers to create seamless experiences that engage and excite users. With SyncReality, you can create immersive worlds that allow users to interact with a character, world, brand, or product in ways never before possible. Imagine:

• Enjoying a concert in your living room by your favorite musician or
• Learning a new skill at home while sitting in a classroom environment or
• Playing an interactive escape game that transforms your space into a vast rainforest!

Speakers

Founder / CEO, SyncReality
Jun 1

01:30 PM - 01:55 PM

Description

XR cannot scale until...

Luis Ramirez, Mawari’s Founder and CEO, will deliver a comprehensive presentation covering the delivery bottlenecks and technological advancements needed for XR delivery to truly scale. He will also showcase case studies from various sectors, including transportation, education, and entertainment, to discuss whether the ubiquitous XR Cloud and an ever-persistent digital twin of the world are currently achievable. The talk is organized to provide a clear and concise overview of the topic.

Speakers

Founder and CEO, Mawari
May 31

12:00 PM - 12:25 PM

Description

Since 2018, Zheng Qin has been developing a cutting-edge, wide-FoV AR optics system called Mixed Waveguide, whose most advanced Crossfire solution offers a 120-degree FoV and is interchangeable between AR and VR. It’s even slimmer than most VR optics (including pancake solutions), so it could be the ultimate optical solution for AR/VR hybrid glasses. Zheng will walk you through the reasons why Crossfire outperforms the competing pancake + VST (video see-through) solution. Additionally, Zheng will introduce the whole family of Mixed Waveguide solutions, which has been adopted by many key clients around the world.

Speakers

Founder, Ant Reality
Jun 1

04:35 PM - 05:00 PM

Description

In this presentation, we introduce the innovative PinTILT™ structure, which combines the best of traditional waveguides and birdbath structures, incorporating both pupil expansion and pupil forming functionalities.

Speakers

CTO, LetinAR
Jun 2

09:30 AM - 09:55 AM

Description

Almalence presents a set of ISP techniques enabling high picture quality beyond optical hardware limitations in VR/AR head-mounted displays by dynamically compensating for the optical system deficiencies specific to the optical path at a given eye position.

Optical hardware design constraints inherent to head-mounted displays compromise image quality and the visual experience in VR/AR. Even the latest and greatest devices retain common flaws that spoil the user experience: poor image detail, blur and color fringing outside of the small “sweet spot,” geometry distortion when moving the gaze direction, and low picture resolution of the outside world captured via pass-through cameras.

In the session, we will explore why dynamic optical aberration correction, distortion correction, and super-resolution algorithms are indispensable to achieving high visual quality in head-mounted displays. We will also see examples of the picture quality improvement these computational techniques produce on the highest-end VR displays from Pico, Varjo, and HP.

Speakers

CEO, Almalence
Jun 2

11:10 AM - 11:35 AM

Description

Join us for a presentation of a new metaverse social platform where people can gather, communicate, and express themselves digitally.

Speakers

CEO, Seerslab
Corporate Development Manager, Seerslab
Jun 2

10:00 AM - 10:25 AM

Description

Craig will present an outline of how Contact CI has come to define Multi-Force Ergonomic (MFE) Haptics, sharing our approach and the story of building Maestro EP, while providing examples and evidence of why MFE haptics are important and impactful in generating an immersive sense of touch for a range of VR/AR users.

Speakers

CEO, Contact CI
Jun 2

10:40 AM - 11:05 AM

Description

The DPT pixel is a single microLED emitter whose emission wavelength can be tuned all the way from red to blue and beyond, including white light. This removes the need for ‘RGB’ sub-pixels made from different material systems, and no further colour conversion/filtering or complex stacking architectures are required. DPT enables colour uniformity, eliminates complex fabrication processes, simplifies system design to deliver high-performance microLED displays and optical solutions, offers the most advanced approach to full-colour microLED displays, and brings these advantages to the whole display industry.

Speakers

CEO, Porotech
Jun 2

02:05 PM - 02:30 PM

Description

In this session, we will explore the design aspects of BenVision, an accessible AR experience utilizing a Technology-Centered Design approach. We will discuss the integration of Camera Vision and AI in AR and the role of soundscapes in shaping emotions within XR environments. Our speakers will share their expertise in developing an immersive and accessible experience for vision-impaired individuals.

Using BenVision as a case study, we will investigate how the project enhances the world's experiential appreciation for visually impaired individuals through sound. We will cover our approach to effectively conveying object presence using sound in AR glasses and address ambience, musical soundscapes, and spatial audio features. Additionally, we will discuss our methodology for categorizing object characteristics based on sound qualities.

The session will touch on design principles from both UX and audio perspectives, emphasizing best practices for creating accessible XR experiences that promote functionality and emotional connections. Attendees will gain insights into designing for technology and humanity in the ever-evolving landscape of emerging technologies like XR.

Speakers

Design Technologist, Scholar, DesignSingapore Council
Audio Designer | Team BenVision, Velan Studios
Jun 2

01:05 PM - 02:00 PM

Description

Join a timely and compelling conversation between product leaders across the field of XR wearables. Nexus Studios hosts a roundtable between innovators at Qualcomm, T-Mobile, and Meta, to explore both present and future trends in HMD AR applications. Spanning both consumer and enterprise solutions, from personal computing, to remote work, to multiuser gaming, hear how professionals are thinking about, preparing for, and executing on the upcoming era of HMD-based augmented reality. What projects, products, and processes will inspire you to see the world anew?

Speakers

Sr. Director XR, Qualcomm
Head of Immersive, Nexus Studios
XR Lead, T-Mobile
Founder, Th3 Third Door
Jun 1

03:35 PM - 04:30 PM

Description

As American cities struggle to build housing, improve transit, and otherwise convince a skeptical public that change is good - and necessary - how can AR help win over the critics? This panel brings together a startup (inCitu) and a platform (Snap) that engage the public at massive scale by offering passersby a glimpse of new projects in their actual context. They’ll be joined by a city official to discuss the potential of AR to deliver services, fast-track development, and reimagine our relationship with the built environment at large.

Speakers

Urban Tech Fellow, Cornell Tech
Former Director, Dept. of Buildings, DC Government
Public Policy Manager, Snap Inc. / Snapchat
Founder & CEO, inCitu
May 31

11:00 AM - 11:25 AM

Description

Get ready for a look into the world of spatial computing with AWS! Join us as we dive into how AWS is transforming the way 3D models are brought to life in the cloud. We'll be showcasing the latest spatial computing services on AWS that enable you to build, deliver, and manage your 3D workloads with ease and efficiency.

But that's not all - in an on-stage demonstration, you'll see how we paired a Magic Leap 2 with a Boston Dynamics Spot robot to showcase how AWS's cutting-edge technology can help users visualize live robot telemetry and control the robot in even the most challenging and remote environments.

This session and the one following it are a must-attend for professionals interested in exploring the full potential of spatial computing on AWS. Join us for a captivating and informative presentation that is sure to inspire and inform!

Speakers

Sr. Strategic Architect, AWS
Senior Manager, Spatial Computing, AWS
May 31

01:25 PM - 01:50 PM

Description

Recently, there has been, and continues to be, a flurry of activity around AR and the Metaverse. How these domains intersect and unfold over time is still very much an open question. What is clear, however, is that the “on-ramp,” or gateway, into the Metaverse starts with the ability to perceive the physical and digital worlds simultaneously. Many technologies and devices are needed to enable true immersion, and first and foremost is the ability to overlay the digital domain onto the physical space. In this talk we will discuss these aspects and delve deeply into the near-to-eye display technologies that allow users to coexist in the physical and digital domains.

Speakers

Director, Strategic Marketing, STMicroelectronics
Jun 1

03:00 PM - 03:25 PM

Description

After more than a decade of searching for mixed reality smart glasses capable of enhancing his low vision, legally blind Chris McNally walked into the Paris offices of Lynx with his blind cane and sighted guide, and met founder Stan Larroque.

After putting on the Lynx R1, Chris’s world was transformed: he could see tables, chairs, and other obstacles in the room, and the expressions on people’s faces during conversations - all impossible moments before. Was it magic? No, just the right combination of technology capabilities to meet the needs of Chris’s low vision.

However, this is just the tip of the iceberg: advancements in smart glasses hardware, software, and MR/AR/XR technologies, along with advancing AI and more, will create incredible opportunities for people with low vision.

Speakers

Co-Founder, iMcNally
CEO, LYNX