04:40 PM - 05:05 PM
More than a decade ago, the industry started to signal interest in enabling core XR technologies. Around 6-8 years into that phase, the first generation of XR devices was introduced to the market with varying degrees of success. Just two years ago, we started to see an explosion in user interest in XR-related products, in part due to the pandemic influencing user behavior, but also thanks to companies trail-blazing in products and services. The question we continue to ask ourselves is: what devices and user experiences should be offered to end-users in the next 3-5 years? Goertek is proud of its legacy and its continued investment in the XR category, and we look forward to using this session to engage with the XR community.
01:55 PM - 02:25 PM
Companies are often intimidated by the challenges of scaling VR training. Join PIXO and its panel to discuss how enterprises have overcome these challenges, including:
• Getting buy-in from IT and other stakeholders
• Compliance and security
• Hardware acquisition
• Unique use cases
• How to blend VR with traditional training methods
• LMS Integration
• Identifying relevant content and deploying it remotely
• Cost/ROI
09:00 AM - 09:55 AM
A panel discussion with XR platform leaders to compare and contrast which form factors and scenarios will entice mainstream consumers to engage in XR. We'll evaluate VR versus AR in terms of what consumers will actually enjoy and see value in. Do we really need 360 VR, or is location-triggered AR the better use case?
12:10 PM - 12:35 PM
Speakers from AbbVie and Tipping Point Media (TPM) will discuss the case study of AbbVie's Virtual Reality Suite of resources, including its newest addition: an overview of presbyopia, the age-related loss of the eye's ability to focus on near objects. These VR experiences have been designed to support the launch of AbbVie's newest products and indications by engaging healthcare providers (HCPs) and creating disease awareness through experiential learning. The newest eye-care-focused expansion to this training tool leverages VR's unique visual immersion capabilities to simulate the impact of an eye disease on a patient from a first-person perspective.
During this time, speakers will detail how the use of innovative technologies can provide physicians and their staff with a deep understanding of various disease states through immersion. The session is meant to give audiences a first-hand account of the huge strides that can be made when companies invest in creativity and innovation and build towards the future.
11:10 AM - 11:35 AM
At its best, technology can be a powerful force for good in the world.
New cutting-edge computer vision and code-scanning techniques are helping CPG brands reimagine their packaging and product offerings to help visually impaired consumers locate and access product information through their phones, both in-store and at home.
Increasingly, brands and businesses are looking to embrace the accessibility and inclusivity agenda for all their customers: providing access to relevant and augmented information for their total customer base, whether they are sighted, blind, or partially sighted.
The trick is to deliver this in a way that works operationally across the design, production, and manufacturing chain while offering a single, simple scanning solution for all end users. Discover a new approach to solving this problem.
11:10 AM - 11:35 AM
Join us for a discussion of the security (and seamier) side of the Metaverse and how we can establish trust and keep safety top of mind as we develop the infrastructure and regulations around immersive and emerging technologies.
09:00 AM - 09:25 AM
The ability of field medics and field surgeons to get immediate consultation from an expert surgeon may be the critical difference in the survival and longer-term recovery of a wounded warfighter. Being able to do this in an augmented-to-virtual reality space would raise this collaboration experience to an even higher level. This 3D collaborative space would provide a spatial calibration between the physical and virtual realities, allowing the novice surgeon to work on the physical body in augmented reality alongside an expert surgeon's avatar, who works with a virtual reproduction of the patient's injury in virtual reality.
The Virtual Augmented Distributed Engagement Reality (VADER) is an AR-VR Remote Telementoring/Telemedicine application that was implemented by SOLUTE for a medical organization in San Diego and is part of their operations center showcase offerings highlighting the latest leading-edge medical technologies.
02:00 PM - 02:25 PM
Immersive mediums are playing an increasingly important role in architecture by facilitating design reviews, communicating complex building data, and guiding both off-site manufacturing and on-site construction. Join the conversation with SHoP Architects, the design firm behind projects such as the new YouTube and Uber headquarters in California, and Assembly OSM, a venture-backed architectural manufacturing startup.
Adam Chernick from SHoP and Christopher Morse from AOSM are excited to be co-speaking with Thiasa Yamamura of Sony Electronics about their Spatial Reality Display (SRD) system.
SHoP and AOSM have been at the forefront of pushing the use of immersive tools within the Architecture, Engineering and Construction (AEC) industry. They have worked closely with Sony to push what is possible with immersive technology, including the SRD. They will show some of the applied research efforts currently underway at their companies and speak to their collaboration with Sony.
01:35 PM - 02:00 PM
In today's world, technology is ubiquitous. COVID-19 has taught us that we can live and operate in a virtual world. We as a society rely on technology more than ever, from doing our jobs to running our homes to education. The future will bring more immersive technologies into everyday use for younger and younger audiences. This talk is meant to spark concern and curiosity among technical and non-technical audiences about children's safety and well-being in XR ecosystems, aka the Metaverse.
We'll explore hypothetical scenarios and ask difficult questions to ensure we're protecting the next generation to the best of our ability. Malicious use of these technologies can lead to psychological, physical, reputational, social, and economic harm and put children at risk. In this session, XRSI founder Kavya Pearlman and advisory board member Kai Frazier explore the potential opportunities and risks in XR systems, how to mitigate them, and how to better protect children and young people as we move fast into the Metaverse.
The session will approach the topic from multiple directions: an introduction to the Metaverse; the challenges, concerns, and constraints specific to child safety and where they overlap; and the types of threats children experience today and may experience in the future. This includes discussion of privacy and trust in the context of child safety and well-being. Finally, it will frame how the industry can respond to these challenges: what research and development are needed to move from research prototypes and early demonstrators to a secure, reliable, and trustworthy Metaverse that can play a more significant role in everyday life.
11:10 AM - 11:35 AM
In the backdrop of the Metaverse, how does the real world fit in, what does open, interoperable, and ethical look like, and what is the role of the Web? We explore some potential guiding principles for the real-world Spatial Web and early steps toward realization through standards activities and the Open Spatial Computing Platform (OSCP). Along the way, we highlight areas that present strong opportunities for academic research, standards, and open source contributions.
10:05 AM - 10:55 AM
Though many companies investing billions into metaverse technology are competitors, they share a common interest in creating a shared digital universe. To enable an open future where assets are able to move freely and work consistently across the entire digital universe, the metaverse needs standards that combat fragmentation and preserve openness.
Hundreds of industry leaders are working to enable the transition from the development of 3D assets to metaverse assets and explore the roles open standards and open source software will play in the creation of portable, interoperable metaverse scenes and objects.
Join leaders from The Khronos Group, Academy Software Foundation, and the World Wide Web Consortium as they discuss how companies, creators, and standards developers can work together to make sure the tools required for an open metaverse are not just available, but ubiquitous and robust.
02:00 PM - 02:25 PM
Discover how Red 6 developed the first multi-player video game in the sky, leading to the creation of the most advanced training platform to date.
Red 6's technology leverages an environment where a persistent, ubiquitous layer of connected data and metadata interacts seamlessly with the real world. Decisions made within the metaverse now, for the first time, have live, physical consequences for the trainee, resulting in a more immersive experience.
Daniel Robinson, Founder and CEO of Red 6, will share how this technology is advancing the readiness of our nation’s armed forces and how Red 6’s technology is the genesis of the Military Metaverse.
10:00 AM - 10:25 AM
The Metaverse is coming and it will impact all our lives before the current decade is out. As an industry pioneer who began working in AR and VR over thirty years ago, I have a broad perspective on where the field has been and where it's going. And right now, I am both excited and concerned. Excited because the momentum has finally hit critical mass. Concerned because of the very real prospect that large corporations could bring the destabilizing features of social media into the Metaverse. To address this, I have been calling for sensible regulation of metaverse platforms to help ensure the technology rolls out in a positive way. This talk will address this important issue by answering three key questions: (1) What will the metaverse really be like when widely deployed? (2) What are the most significant risks consumers will face? and (3) What are the most helpful and sensible forms of regulation to avoid the metaverse becoming a destabilizing force like social media?
03:40 PM - 04:30 PM
Where are we in the course of spatial computing’s lifecycle? Where are AR and VR having an impact today… and where aren’t they? What's the fate of Apple's market entrance? And will Meta's big metaverse bet pay off? We’ll tackle these and other burning questions with a multidisciplinary panel of industry thought leaders, innovators, and heavy hitters.
03:40 PM - 05:10 PM
Join us as startups pitch to our panel of judges for a chance to be named AWE's "Startup to Watch" for 2022. The winner will be announced at the Auggie Awards ceremony.
Judges for the Start-Up Pitch competition are Dave Haynes, Tipatat Chennavasin, Pearly Chen, and Kristina Serafim.
04:35 PM - 05:05 PM
Guests Benny Arbel and Helen Freeman will cover how education is the only area of our lives that the digital revolution has not significantly improved. The experience gap between children's education and consumer life is massive and still growing. They will discuss how Inception, along with leading educational publishers like Oxford University Press, is overcoming those barriers, and why XR is key to ushering in the long-awaited digitization of children's education.
02:30 PM - 03:20 PM
As designers and developers build AR and VR hardware, applications, and content, how can they make sure these are inclusively designed so that everyone - including those who identify as people with disabilities - can access, enjoy, and benefit from them? How can AR and VR designers and developers leverage inclusive design and development practices to consider the many aspects of human diversity, such as ability, language, culture, gender, and age? What are some of the broader benefits of designing more accessible AR and VR, such as improved usability for people with temporary impairments or people in different environments and use cases? In this session, panelists will explore these and other topics related to XR accessibility and share a variety of practical community resources for XR accessibility design and development.
01:25 PM - 01:50 PM
As companies scramble to define what "hybrid work" looks like for their workforce, the first real use cases of photorealistic-quality, real-time holograms are helping organizations bridge a vital engagement gap in team collaboration. This 3D technology allows participants to interact in ways previously possible only in person, and it enables content sharing of both physical objects and 3D files, so users can interact with the presenter and the content simultaneously without being forced to focus on one experience to the detriment of the other.
This session will describe the first real world examples of how this innovative technology is being used by companies today. You will hear about a leading medical device manufacturer using 3D holograms in training scenarios for surgeons, as well as McLaren Racing, which is using it as a powerful tool for their design engineers, drivers and crews - rather than flying a technician to the racing team or explaining procedures through flat images, they can now immediately show an engine component from every angle, convey sizing, and instruct on assembly and usage as if they were in person – all while saving countless hours in travel time.
04:05 PM - 04:30 PM
The joke used to be that "On the Internet, nobody knows you're a dog." In the metaverse, nobody knows who you are.
How do we evolve to the next stage of identity? What does it mean to be human? What do our names mean? How much do we want to look like ourselves, versus look like avatars? Today, we face a new paradigm of online identity. It presents many questions on how we develop for this world.
These questions are meant to engage the audience in an examination of not only what it means to be human, but also how we evolve to that next stage of identity.
02:05 PM - 03:00 PM
Using Resolve's VR software, Meta's data center design and construction teams use Quest 2 for cross-functional, collaborative feedback sessions on the General Contractor's Master Construction Model. This enables data center end users, like facilities operations, culinary, and security teams, to provide valuable feedback early in the design and construction process. This type of virtual collaboration has saved Meta substantial time and money by identifying issues before construction and optimizing for a safer, more efficient, and more sustainable final data center build.