12:30 PM - 05:00 PM
12:30-1:00pm PDT: Introduction to the AWE Immersive Arts Symposium, Presented in the Museum of Other Realities with Kaleidoscope;
1:00-2:00pm PDT: The Once and Future Artistic Legacies of Virtual Realities;
2:30-3:30pm PDT: VR Sculpting Best Practices Panel;
4:00-5:00pm PDT: How to Fund your XR project with Kaleidoscope
To attend these sessions, read the details below and these instructions (https://www.notion.so/A-Guide-to-Attending-Events-in-the-MOR-2410f5de72054b339531fa4d15ec36d5), and enter 'awe1' as the room code.
To access The AWE Immersive Arts Symposium, download the Museum of Other Realities app on Oculus, Steam, or Viveport for $19.99. Once inside, enter the room name (5/26: awe1, 5/27: awe2, 5/28: awe3) either on the flatscreen monitor or in VR. On the flatscreen, click the hamburger menu to bring up the interface (if it isn't visible already).
Enter the given room name in the "Online Room" field and hit the "Connect" button; you might have to disconnect and reconnect for it to work. In VR, the interface mostly matches the flatscreen version, so you can follow the same steps to connect to the room. Once that's done, you should be in the same room as other attendees and presenters!
01:00 PM - 01:20 PM
How do new products & services triumph in crowded marketplaces? What can we learn from neuroscience to help inform design choices?
This talk will be a brief tour of the science and psychology behind making unforgettable experiences. It will also be a second-screen experience, where audiences are invited to contribute, share their views, interact, and ultimately affect the outcome of the talk in real time.
01:00 PM - 01:20 PM
From black and white to color, and desktop to mobile, our electronic devices are visual mediums that have rapidly evolved and advanced over the past century. New device technologies continue to add clarity and mobility, enhancing our viewing experience to be more convenient and closer to our reality. Today, more than ever, these devices have become indispensable information vehicles that are ingrained into every aspect of our daily lives.
The average person is now so acclimated to having these thousands of digital interactions each day that they’re hungry for a more immersive experience. They’re ready for the next medium that will seamlessly enhance their lives and bridge the gap between real life and technology (without a headset). They’re ready for light fields.
In this talk, David will discuss light fields as the next-generation medium. He will explain how this emerging technology provides users with a fully interactive, lifelike viewing experience by rendering images and videos with 3D depth and complex, realistic light effects such as sparkles, texture, and highlights. He will also discuss how this type of technology will change the game for businesses by empowering them to provide end-users with more immersive and engaging experiences across industries including auto, retail, medical, and education.
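For readers new to the term, light fields are commonly described with the two-plane parameterization of Levoy and Hanrahan: each light ray is indexed by its intersections (u, v) and (s, t) with two parallel reference planes, and rendering a novel viewpoint amounts to resampling this 4D function along the rays through the new camera (a standard formulation, not necessarily the one David will present):
\[
L : (u, v, s, t) \mapsto \text{radiance}, \qquad
I(x, y) = L\big(u(x, y),\, v(x, y),\, s(x, y),\, t(x, y)\big)
\]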
01:00 PM - 01:20 PM
As the media landscape continues to segment, technology is pushing the boundaries on how media companies are connecting brands and audiences. At the center of this digital evolution, ViacomCBS is driving authentic brand extensions that reach fans through emerging products and experiences. Join Tim Adams, VP, Emerging Products Group at ViacomCBS, to learn how the company is doubling down on innovation to extend brand narratives through the use of spatial computing, augmented reality, and voice interactivity.
01:00 PM - 02:00 PM
Transfer your consciousness, via a virtual avatar, into ENGAGE, where you will join Chris Madsen, Steven Sato, and fellow AWE peers for an immersive, interactive journey to experience first-hand how virtual reality is being used for collaboration, training, education, communication, events, and more! Space is limited.
ENGAGE is an education and corporate training platform in virtual reality. It empowers educators and companies to host meetings, presentations, classes and events with people across the world. Using the platform, virtual reality training and experiences can be created in minutes. The tools are very easy to use and require no technical expertise. You can choose to host your virtual reality sessions live, or record and save them for others to experience later. A wide variety of effective and immersive virtual experiences can be created with an extensive library of virtual objects, effects and virtual locations available on the platform.
Space is limited to 50 people for each one-hour session, held daily from May 26-29 at 1pm PDT.
Details to sign up for this virtual event are on the ENGAGE event calendar.
IMPORTANT: The AWE conference password is 'vr'
Day 1: https://app.engagevr.io/events/V8jqZ/share
The livestream can be found at: https://engagevr.io/awe-conference/
01:20 PM - 01:40 PM
The NBA has been at the forefront of creating innovative digital fan experiences. Scott Stanchak, the league’s Vice President, Emerging Technology, will detail how the NBA is using augmented reality, mixed reality and virtual reality to bring fans from around the world closer to the game. Stanchak will also discuss how the NBA is using a similar innovation process to enhance the live game-viewing experience.
01:20 PM - 01:40 PM
This talk will present a few use cases from Dr. Jayaram’s work at Intel, at start-ups, and in academia to discuss some of the technology and business aspects of bringing VR to two different audiences: 1) end-consumers in sports and concerts, and 2) enterprises doing engineering design and manufacturing. What are the challenges in delivering VR-based experiences and solutions at scale? What does it mean to weave them into the “normal” way of doing things to arrive at a new normal where VR is a mundane, predictable, and regular part of entertainment and work?
01:40 PM - 02:00 PM
This year, NFL football fans were able to enjoy a new AR activation on-site in Miami, the host city of Super Bowl LIV. Utilizing the Unity platform, Bose created a beacon-based fan experience as part of the 10-day celebration leading up to the game, in partnership with the NFL and Trigger Digital. Fans were asked to don a pair of Bose headphones and approach a set of “players’ lockers”. Activated by a beacon, the lockers came to life with an audio experience offering content related to the relevant NFL player; for example, a fan may have heard a previously conducted interview between the player and a reporter.
In this presentation, Michael Ludden, Global Head of Enablement & Principal AR Advocate at Bose, Tony Parisi, Head of AR/VR Ad Innovation at Unity Technologies, and Jason Yim, CEO at Trigger Digital, will explain how this activation was developed and why an exclusively audio-first approach to augmented reality with tailored audio content can be a new, often-overlooked way of engaging consumers. The speakers will demonstrate what is possible with audio AR, including how it can provide a sense of motion and direction.
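As a rough illustration of the beacon-triggered audio pattern described above, here is a hypothetical sketch in Swift using Apple's Core Location and AVFoundation; it is not the Unity/Bose AR implementation the team actually built, and the beacon UUID and clip names are placeholders.
```swift
import CoreLocation
import AVFoundation

// Hypothetical sketch: play a locker's audio clip when the visitor
// stands next to its beacon. UUID, major/minor scheme, and file names
// are placeholders, not values from the actual activation.
// (Info.plist needs NSLocationWhenInUseUsageDescription.)
final class LockerAudioTrigger: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()
    private var player: AVAudioPlayer?
    // One shared UUID for all "player locker" beacons; major/minor pick the clip.
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!)

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
        locationManager.startRangingBeacons(satisfying: constraint)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying beaconConstraint: CLBeaconIdentityConstraint) {
        // Trigger only when the visitor is right at a locker and nothing is playing.
        guard let nearest = beacons.first(where: { $0.proximity == .immediate }),
              player?.isPlaying != true else { return }
        playClip(named: "locker-\(nearest.major)-\(nearest.minor)")
    }

    private func playClip(named name: String) {
        guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.play()
    }
}
```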
01:40 PM - 02:00 PM
3D face scanning is the first step in cloning a person and bringing them into a virtual environment with high fidelity. High-quality 3D face scanning has typically been done with specialized and complicated photogrammetry systems. The emergence of low-cost, high-quality mobile depth sensors, such as those in the Apple iPhone X, helps to fuel the advance of 3D scanning and brings scanning capabilities to the hands of millions of users. The talk will discuss different types of mobile depth sensors, including structured light, ToF, and active stereo, and compare their pros and cons for close-range 3D face scanning. The talk will also present the unique challenges of 3D face scanning and discuss its applications in fields such as medical, dental, eyewear, entertainment, and 3D printing.
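To make the sensor discussion concrete, the following is a minimal sketch (an assumed setup, not the speaker's pipeline) of reading depth frames and the fitted face mesh from the iPhone's TrueDepth camera, the structured-light sensor mentioned above, via Apple's ARKit.
```swift
import ARKit

// Minimal sketch: capture per-frame depth and face geometry from the
// TrueDepth camera using ARKit face tracking.
// (Info.plist needs NSCameraUsageDescription.)
final class FaceScanCapture: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking is only supported on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Raw depth from the front-facing sensor, delivered frame by frame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let depth = frame.capturedDepthData {
            let depthMap = depth.depthDataMap // CVPixelBuffer of per-pixel depth
            _ = depthMap // e.g. hand off to a fusion/meshing step
        }
    }

    // ARKit's fitted face mesh, a coarse but convenient starting point for a scan.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let vertices = face.geometry.vertices // ~1,220 3D points in face space
            _ = vertices.count
        }
    }
}
```
A production scan would fuse many such depth frames into a single high-resolution mesh; the fitted ARKit face geometry shown here is only a low-detail template.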