Use Cases for the Real World Metaverse (AR Cloud)

Jun 3

11:35 AM - 12:00 PM

Description

The Open AR Cloud is working to democratize the AR Cloud with infrastructure based on open and interoperable technology. We are building city-scale AR testbeds that are being experienced in cities around the world. These are real-world use cases that combine the digital with the physical: rich experiences that are synchronous, persistent, and geospatially tied to a specific location. Content in situ allows the user to explore the world, connect with others, and have a shared experience.

We will discuss new types of content activation based on proximity, gaze, voice, sensor data, and algorithmic spatial ads. Partners will present use cases such as wayfinding and NFT exhibits, as well as case studies that demonstrate how the technology is being used to build more diverse, equitable, and inclusive real-world communities that raise awareness of critical issues like climate change and public health.

Speakers

CEO, XR Masters
Director of Spatial Experiences, UXXR Design
Founder, XR Masters
CEO, Creative Director, Novaby

Related Sessions

Jun 1

11:00 AM - 11:25 AM

Description

Whether for hands-free mobile displays or in-situ data overlay, head-mounted augmented reality offers much to improve productivity and reduce human error in space. Unfortunately, existing solutions for tracking and holographic overlay alignment tend to rely on, or at least assume, Earth gravity. Nothing inherent to a microgravity environment makes AR tracking impossible, but several factors need to be taken into account. First, high-frequency camera pose estimation uses SLAM, which relies on data from IMU sensors that, by default, accommodate the acceleration of gravity in their base measurements. Second, most holographic alignment strategies assume a consistent down direction. This session will explore strategies to mitigate these limitations.
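To illustrate the IMU issue the session describes, here is a minimal sketch (not from the session itself; all names and conventions are hypothetical) of the standard gravity-compensation step in an accelerometer pipeline: the expected 1 g reaction is rotated into the sensor frame using the device's orientation and subtracted from the raw reading. In microgravity there is no such reaction, so the compensation must be disabled or the estimator integrates a phantom acceleration.

```python
G = 9.81  # standard Earth gravity (m/s^2) assumed by many IMU pipelines

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z),
    using v' = v + 2*u x (u x v + w*v) with u = (x, y, z)."""
    w, ux, uy, uz = q
    # t = u x v
    tx = uy * v[2] - uz * v[1]
    ty = uz * v[0] - ux * v[2]
    tz = ux * v[1] - uy * v[0]
    # a = t + w*v
    ax, ay, az = tx + w * v[0], ty + w * v[1], tz + w * v[2]
    # s = u x a
    sx = uy * az - uz * ay
    sy = uz * ax - ux * az
    sz = ux * ay - uy * ax
    return (v[0] + 2 * sx, v[1] + 2 * sy, v[2] + 2 * sz)

def linear_acceleration(accel_raw, orientation_q, gravity=G):
    """Remove the gravity reaction from a raw accelerometer sample.

    accel_raw: (x, y, z) specific force measured in the sensor frame.
    orientation_q: unit quaternion rotating world frame -> sensor frame.
    gravity: set to 0.0 in microgravity, where no reaction is present.
    """
    # A resting sensor reports a +1 g reaction opposite to gravity
    # (world +z is "up" in this convention).
    g_world = (0.0, 0.0, gravity)
    gx, gy, gz = quat_rotate(orientation_q, g_world)
    return (accel_raw[0] - gx, accel_raw[1] - gy, accel_raw[2] - gz)
```

A level device at rest with identity orientation reads roughly (0, 0, 9.81) on its accelerometer; after compensation the linear acceleration is near zero, as a SLAM front-end expects. Running the same pipeline in orbit with `gravity=G` would instead inject a constant 9.81 m/s^2 error into every pose update, which is the failure mode the session addresses.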

Speakers

Chief Technology Officer, Argyle.build
Jun 1

11:30 AM - 11:55 AM

Description

Digital Twins, The Metaverse, Game Engines, 3D Maps, and Computer Vision are all merging into a vision of "Sim City" where the city is our real city. When will this happen? How will it happen? What will it mean for AR and VR? What will we use it for? Why will web3 and crypto be important?

Matt will explore what comes after the AR Cloud is built and we can finally connect The Metaverse to the Real World, showing technically ground-breaking, world-first demos of his new startup, which is making all this possible.

Speakers

CEO, Stealth
Jun 1

12:00 PM - 12:25 PM

Description

Volumetric video technology captures full-body, dynamic human performance in four dimensions. An array of 100+ cameras points inward at a living subject (a person, an animal, a group of people) and records its movement from every possible angle. Processed and compressed video data from each camera becomes a single 3D file, a digital twin of the exact performance that transpired on stage, for use on virtual platforms. Finished volcap assets are small enough to stream on mobile devices but deliver the visual detail of 100+ cameras, making them a go-to solution for bringing humans into the Metaverse.

The volumetric video market is expected to grow from $1.5B USD in 2021 to $4.9B USD by 2026 as holographic imaging becomes increasingly crucial for the development of compelling, human-centric immersive content and Metaverse creators strive to solve the “uncanny valley” problem.

The session dives into the latest and greatest applications of volcap in augmented reality across multiple sectors, including fashion, entertainment, AR marketing and branding, enterprise training, and more.

We’ll examine the ground-breaking potential this technology holds for augmented and mixed reality as well as some of the challenges that may face this burgeoning industry.

Speakers

Stage Hand, Departure Lounge Inc.
General Manager, Metastage