13:15 - 14:15
Invite Only: Founders and investors are invited to a networking luncheon in a reserved area of the Expo Hall. Come raise a glass, celebrate your achievements and make new friends with others who are building the future of XR.
12:30 - 13:20
Who will own love in the Metaverse? How will we make love in the Metaverse? Will anyone want to?
This panel explores how life in the Metaverse will impact users' perception of self, their emotional and mental wellbeing, and their interpersonal relationships. How do we serve physical needs in a world that by definition lacks all physicality?
What interactions - and how many degrees of suspension of disbelief - are possible? What kind of sensory input is required to achieve sensual immersion? What will be the role of avatars, haptic technologies, motion tracking, and spatial audio, to facilitate emotional connectedness in this environment? How can creators make sure users don't end up in sex dungeons, unless they really want to? What about true love, romance, feelings, and intimacy? Can virtual love thrive in the real world?
This panel puts the metaverse hype in perspective, and imagines how the convergence of physical and digital could enhance people's affective bandwidth in digital environments to create a more sustainable and inclusive experience.
10:00 - 10:50
Hear how the values of security, openness, education as well as the success of gaming impact XR design from a Nordic perspective.
15:35 - 16:25
Why is it so important to keep the AR community alive, well and prosperous? Considering how we will experience and interact online in the future, and envisioning Web 3.0 as an XR-based Metaverse, we don't want our future built by one or two companies, but rather by a global community of conscious individuals!
11:30 - 11:55
What exactly should we do about all that Digital Fashion/WEB3/Metaverse buzz? Most brands don't understand the value but want to do something because of the FOMO. We need to dispel the fog and explain, in plain language, 1) how these technologies can be applied and 2) how to go beyond simply launching a project and connect it to predictable results.
Join us as Dmitry Kornilov addresses:
- analyzing significant changes in the consumer persona
- mapping audience behavior onto technical opportunities
- finding viable Digital Fashion and Metaverse use cases, from virtual try-on to in-game integrations
- integrating Digital Fashion and Metaverse projects into a cross-platform communication ecosystem, from online to offline
- analyzing Digital Fashion and Metaverse projects and connecting them to revenue growth
20:00 - 23:00
Join us at the FIL Rooftop for the official AWE Afterparty!
Open to all AWE participants with an AWE badge.
Come enjoy drinks and snacks in this gorgeous setting while mingling with friends and colleagues to make memorable, AWE-some connections!
10:05 - 10:30
Shopping is the killer feature in any reality. The Spatial Commerce team at Shopify aims to prepare merchants for a wild future where real and virtual worlds merge, delivering the best of IRL and URL shopping. Russ Maschmeyer, Team Lead, and Eric Florenzano, Staff Engineer, are exploring what's possible and useful for merchants in this new landscape. They'll share cutting-edge prototypes as well as tools and capabilities merchants can leverage today.
14:10 - 14:35
Many Industry and Construction 4.0 use cases, such as maintenance, production-line planning, or defect detection, require displaying complex Digital Twins in AR at very large scale and updating them over time. We will show how the AR Cloud platform developed in the EU Research and Innovation project ARtwin can meet these requirements, whatever the AR device's capabilities, by moving spatial computing and rendering into the cloud or to the edge over 5G connectivity.
14:10 - 14:35
For adolescent and young adult (AYA) cancer patients, support group sessions have been proven to increase survival chances and alleviate depression. But for a number of good reasons, very few patients are comfortable attending in-person therapy sessions. So the Yale School of Medicine teamed up with Foretell Reality (A Glimpse Group Company) to develop a VR application to solve this dilemma. Join us to find out how.
16:05 - 16:30
Established remote-assistance solutions allow remote experts to set annotations and highlights directly within the on-site user’s field of view. However, there is no shared realm of experience where both users can interact as if they were physically standing next to each other.
This talk presents an ongoing research project that aims to solve this problem by capturing and streaming the on-site user's real 3D environment (including people) to create a shared sense of space. The remote expert, using a VR HMD, can move freely within the live 3D reconstruction, while the on-site user sees them through an AR HMD. The result is an intuitive and natural collaboration experience that solves the shortcomings of current solutions.
15:55 - 16:20
João Blümel, the Metaverse Mentalist, promises to read your mind at his one-of-a-kind show.
Come witness the world’s first show to fuse VR, AR and Mind-Reading into a single performance.
10:35 - 11:00
From smart factories to digital supply chains and connected work – Industry 4.0 processes rely on Intelligent Technologies to run better. Together, TeamViewer and SAP are driving innovation, helping enterprises accelerate their digital transformation in the industrial space and build resilient supply chains. AR solutions close the loop where manual work remains inevitable, supplying frontline workers with digital tools and real-time data to get the job done efficiently in use cases like logistics, assembly, quality assurance, inspections, maintenance, and field service. Learn how you can optimize processes along the entire value chain by putting the human factor at the center of digitization and relying on seamlessly integrated systems and new technology.
13:55 - 14:20
There is an enormous buzz around the metaverse, with tales of fully virtual spaces mimicking real life and extending it via games or online collaboration. Taking a closer look, though, the metaverse does not seem to offer much for businesses in the industrial space. It has so far left out the people who are already solving real-world problems across industries and processes. TeamViewer strongly believes in a continuous convergence between the physical world and technology in general. So it is high time to leap into the "industrial metaverse" and redefine frontline work in the 21st century – with the help of AR solutions and the power of AI.
10:00 - 10:25
Everyone will wear AR glasses on their faces by 2030. Just centimeters from our optic nerve, computers have never been closer to our brains. This changes both content creation and consumption.
These immersive technologies could create a more empathetic, educated and empowering future. But they could also promote inequality and fake news.
Yusuf Omar, Co-CEO of SEEN, has been wearing cameras on his face for over a decade. His team has overlaid journalism stories on cities from Boston to Cape Town. Everyone loves a good story, especially if they can be a part of that story. And that’s what immersive storytelling does. On your face.
17:35 - 18:00
Doug Engelbart expressed the idea of augmenting human intellect, not just for individuals, but as a collaborative experience. We have been working on Croquet with the goal of creating a foundation that would enable such an augmentation of our ability to create, explore, understand and solve problems collaboratively. We define the augmented conversation as follows:
1 - A discussion within a group of users that is extraordinarily enhanced with the kinds of tools and capabilities that are only available with a computer.
2 - A computer AI is a full participant in this conversation. It allows us to jointly discuss and explore complex systems, data sets, and simulations as naturally and easily as we engage today.
3 - There is a guarantee of shared truth. The simulation I see must be exactly the same as what you see, and anything I do to affect the shared simulation must also be accurately shared. If this is not the case, there is no guarantee that what you see and understand matches what I see, so you cannot trust that communication channel.
The virtual objects that will soon populate our world must be as responsive and alive as physical objects. There won't be a physical reality and a separate virtual or augmented reality; soon, there will be only a seamless, multi-user reality.
15:10 - 15:35
Join as leaders in extended reality (XR) for the U.S. Department of Veterans Affairs talk candidly about how immersive technology is transforming the way healthcare is delivered and experienced. This transformation has been enabled and amplified through collaboration – both within the organization and with the broader XR community.
11:05 - 11:30
The AR Cloud will become the most important communication medium of our time, providing connections and context between people, places and connected machines. We will discuss available SDKs and cover the essential elements, components, and tools required to develop public and private spatial computing applications. We'll walk through real-world examples created with the YOUAR Cloud SDK to illustrate how to create, use and leverage AR Cloud infrastructure at your facility today.
10:05 - 10:30
Join Kasper Tiri as he opens the floor for a discussion with developers about interoperability. Is it essential for the future of immersive technology? This is an opportunity to chat with others in the industry who are passionate about this topic.
14:40 - 15:05
Learn how specialized Virtual Reality NeuroTherapy (VRNT) is used in in-patient, out-patient and telehealth care for spinal cord injury, brain injury and stroke recovery, enhancing neurological engagement for improved range of motion, pain management and quality of life.
14:40 - 15:05
The modern warfighter has limited situational awareness and performs under a high cognitive load while operating autonomous systems and executing missions. Currently, operators are heads-down and eyes-down, losing the ability to fully know what is happening in the environment around them.
The use of AI-generated 3D models and AR pre-visualization tools can eliminate mission repetition and reduce the communication needed to convey commander's intent, all while allowing for full situational awareness. The warfighter will also see a reduction in the expertise required for unmanned-system flight control, enabling them to "launch and forget" until notified of pertinent activity.
XR tools enable seamless command and control of unmanned systems while keeping the operator safe.